
Robert Epstein: Inside Big Tech’s Manipulation Machine and How to Stop It



April 7, 2022
“They hold in their hands the power to change thinking and behavior on a massive scale, the power—in close elections anyway—to pick the winner, in country after country after country.”
I sit down with Dr. Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology (AIBRT). He studied at Harvard University under B. F. Skinner, previously was editor-in-chief of "Psychology Today," and founded the Cambridge Center for Behavioral Studies. Today, he is perhaps best known for leading studies looking at how tech giants influence human behavior, and conducting extensive monitoring projects of bias in Google products and beyond. According to his team’s research, Google shifted at least 6 million votes in the 2020 elections.
“All these so-called free services, these services are not free. You pay for them with your freedom,” he says.
In this episode, he breaks down his team's latest findings, detailed in his report “Google’s Triple Threat,” and major ways in which Big Tech companies secretly manipulate their users without leaving any paper trail. With Congress unable to come to any consensus on how to deal with Big Tech, he says he’s found another way to force Big Tech to stop.
Jan Jekielek: Dr. Robert Epstein, such a pleasure to have you back on American Thought Leaders.
Dr. Epstein: Well, thanks. Nice to be back.
Mr. Jekielek: You've been studying for almost a decade now how Big Tech is able to effectively manipulate people, in ways that people aren't really aware of. And the last time you were on the show, I think it was in November of 2020, you had told me, "We have all this data and I can't talk about it just yet, because we haven't analyzed it," because you're so rigorous in the work that you do. So, where are things at right now?
Dr. Epstein: Well, we've made more progress in the last, I'd say, year and a half than we have in the previous eight years combined. So when we last met, we were talking about the 2020 presidential election. We recruited field agents mainly in swing counties, in four swing states. And we had, with their permission, installed special software on their computers, which allowed us to look over their shoulders as they were doing anything that was election-related on their computers. We had 1,735 of them, all registered voters, balanced between conservatives, liberals and moderates. We preserved more than 1.5 million ephemeral experiences on Google, Bing, Yahoo, YouTube, Facebook, the Google homepage and more.
So what started as a very tiny project about six years ago, in the 2016 presidential election, where we had 95 field agents in 24 states, has grown into something much, much more sophisticated. We've made a lot more discoveries, we've got lots more numbers, and they're all terrible, in the sense that they're telling us over and over and over again that we are pawns. We are being manipulated in ways that we cannot see, in ways we cannot counteract, and using methods that don't leave a paper trail for authorities to trace. So yes, I've been saying that for years, but now we have so much more data, so much more information, so much more knowledge about how that works.
Mr. Jekielek: You mentioned ephemeral experiences, so briefly if you can, explain what it actually means. And that's actually, as I understand it, the language that Google itself uses.
Dr. Epstein: They're brief experiences that we have online. In fact, most of the experiences that affect us online are ephemeral: a newsfeed is flashed before our eyes, or some search results, or some search suggestions, or all kinds of things. A sequence of YouTube videos, a suggestion for which video to watch next. They affect us, they disappear, they're stored nowhere and they're gone.
It's the ideal form of manipulation. People have no idea they're being manipulated, number one, and number two, authorities can't go back in time to see what people were being shown. In other words, how they were being manipulated, so as it happens, the experiments I've been doing for a very long time, now almost a decade, that's what they're all about.
They're all about ephemeral experiences and trying to understand them, trying to name them and trying to quantify them to figure out the power that each kind of ephemeral experience has to change thinking and behavior and votes.
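To make the idea of "preserving" an ephemeral experience concrete, here is a minimal sketch, in Python, of what one archived record might look like. The field names and structure are illustrative assumptions, not the actual schema used by the monitoring software.

```python
# Hypothetical sketch of one preserved "ephemeral experience" record.
# Field names and structure are illustrative assumptions only.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class EphemeralExperience:
    agent_id: str                 # anonymized field-agent identifier
    platform: str                 # e.g. "google_search", "youtube", "facebook_feed"
    captured_at: str              # UTC timestamp of the capture
    context: str                  # search term, video being watched, etc.
    items_shown: list = field(default_factory=list)  # ordered content displayed

record = EphemeralExperience(
    agent_id="agent-0042",
    platform="google_search",
    captured_at=datetime.now(timezone.utc).isoformat(),
    context="candidate A policy positions",
    items_shown=["https://example.com/story-1", "https://example.org/story-2"],
)

# Archiving the record turns a fleeting screen of content into a permanent,
# auditable artifact that can later be checked for bias.
print(json.dumps(asdict(record), indent=2))
```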
Mr. Jekielek: So you've been working in the field of psychology for many decades actually, and I just love to learn the trajectory of your career, and also how you came to end up focusing on this exactly.
Dr. Epstein: Well, I got my doctorate under B. F. Skinner at Harvard. I was actually his last doctoral student. And Skinner was kind of obsessed (I was going to say interested, but that's too weak) with issues of control. Are we controlled? Is there such a thing as free will? Could control be benign? He felt, in fact, that control could be benign, and therefore that behavioral scientists had a responsibility to help engineer humanity somehow: to engineer society so that people are as happy as they can be, as productive as they can be and as creative as they can be. And people generally didn't like his concept, because we all feel that we're really not being controlled. Skinner's reply to that was, "Well yes, you are, you just don't know it."
So it's interesting that that's in my background, but I didn't study control per se for a very, very long time. It wasn't until 2012, so that's decades after I got my doctorate, that I suddenly got interested in control again. That's because my website got hacked. I ended up getting a bunch of email alerts from Google, I think eight or nine or 10 of them, saying, "Your website's been hacked and we're going to prevent people from going there and you better fix it."
I was baffled. Why was I getting these alerts from Google? Who made Google the sheriff of the internet? I've been a coder, a programmer, since I was a teenager, so I was also puzzled about some of the tech involved in their ban. Somehow they were blocking me also through Apple's Safari; that made no sense, that's a different company. They were also blocking me somehow through Firefox, which was a browser created by a nonprofit organization called Mozilla. That made no sense.
There were all kinds of aspects of this that really got me looking at Google with a critical eye, which I had never done before, so that's early 2012. By the end of 2012, I was noticing a growing literature on search results, this was quite interesting. This was coming out of the field of marketing. Marketers, of course, want to help companies sell things.
And marketers were finding out that if you can just get up one more notch in Google search results, that could make the difference between the success and failure of your company. It could increase sales by 30 percent or more, literally just from going up one notch. And if you're on the second page of search results, you're done, that's it. You've got to be on that first page.
So there was a lot of new information and it caught my eye, because I was wondering, "Well, if people are so trusting of high ranking search results, could search results be used to change people's opinions?" "Could they even be used maybe to change their votes, their voting preferences?" And so early 2013, I began a series of experiments that just hasn't stopped till this day. And these are randomized, controlled, counterbalanced, double blind experiments, so they adhere to the very highest standards of scientific research.
And I thought, "Let's see if I can change people's opinions." I'll randomly assign people to two groups. In one group, the search results are going to favor one candidate and in the other, the search results are going to favor the opponent, the opposing candidate. What does that mean? That means that if someone clicks on a high ranking search result, they're going to get to a webpage that makes that candidate look really good and might make the other candidate look really bad. And in these experiments, I used real search results, real webpages, webpages we got from online, the search results we got from Google. And people were randomly assigned to these different groups and I thought, "I could shift people's voting preferences in this way by two percent or three percent." Very first experiment we ever ran on this, we shifted voting preferences by more than 40 percent. So I thought, "That's impossible, that can't be." We repeated the experiment with another group, got a shift of more than 60 percent. So I realized, "Wait a minute, maybe I've stumbled on something here."
Mr. Jekielek: You're just talking about this so casually, but so, okay. But because we're going to be talking about this a lot, what does it actually mean to shift people's opinion by, let's say the median or the middle of that, 50 percent? What does that mean exactly?
Dr. Epstein: It means that if I'm starting out with 100 people, and we're always using people who are undecided, because those are the people who can be influenced. So if I have a hundred undecided people and I say, "Okay, who are you going to vote for right now?" it's almost like they're doing mental coin tosses, so I end up with 50 people voting for one candidate and 50 voting for the other. That's how I'm always starting.
And in fact, I will ask people some questions before the manipulation part of the experiment. I'll ask them, "How much do you like each candidate?" "How much do you trust each candidate?" And then, "Okay, who would you vote for if you had to vote today?" How do we find people who are undecided? Very simple. We use participants from the United States and the elections we use are always somewhere else. They're usually from Australia.
And we ask specifically, before the manipulation, "How familiar are you with the politician named..." whoever it may be, Tony Abbott, and we get those numbers. If people are very familiar with one candidate or the other, we throw them out. We want undecided participants, so we start that way. And when I say there's a 40 percent shift in the direction of either candidate, that's important because, remember, this is a random assignment. We could be putting them in the pro-candidate-A group or the pro-candidate-B group. So a 40 percent shift means that I take 40 percent of the 50, in that case it would be 20, and I move them over to the other group. So now I've only got 30 left over here, and over here I've got 70. I've taken a 50/50 split and turned it into a 30/70 split. I now have a win margin of 40 percent. So a 40 percent shift means I mix things up in such a way that I get a win margin that corresponds to that percentage.
These are obviously huge, huge numbers. Can this really be done? Oh yeah, because I've now done scores, and scores and scores of experiments. Yes, this is real.
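As a quick check of the arithmetic described above, here is a minimal worked example in Python, assuming the same setup of 100 undecided participants starting at a 50/50 split.

```python
# Worked example of the "shift" and "win margin" arithmetic described above,
# for 100 undecided participants who start at a 50/50 split.
def apply_shift(group_a, group_b, shift_percent):
    """Move shift_percent of group_b's voters over to group_a."""
    moved = group_b * shift_percent / 100
    return group_a + moved, group_b - moved

a, b = apply_shift(50, 50, 40)                  # a 40 percent shift
print(f"split: {a:.0f}/{b:.0f}")                # split: 70/30
print(f"win margin: {a - b:.0f} percent")       # win margin: 40 percent
```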
Mr. Jekielek: Because typically, you think it's the statistical significance. There are all sorts of tests designed to figure out if a very small shift is actually... These things are off-the-charts significant, basically. Yeah.
Dr. Epstein: They're so off the charts that really, you don't even need to do statistics. We do them anyway, but you don't need to. But you're quite right, usually in the behavioral and social sciences, we're looking at small effects. The effects that we have found in identifying these new sources of influence that the internet has made possible are among the largest effects ever discovered in the behavioral and social sciences in more than a hundred years, and people in my field have been looking for new kinds of influence that whole time. These are crazy big effects. So that's one thing that makes them scary, but that's not all. They're scary because people are generally unaware that they're being influenced.
They're crazy because all these experiences that people have in these experiments, they're all ephemeral, so there's no paper trail. And most of all, they're scary because they're controlled almost exclusively by four American tech companies. Now put that all together, now you've got something that's frightening, because you have sources of influence controlled by really a handful of executives who are not accountable to any public, not the American public, not any public anywhere, they're only accountable to their shareholders.
And yet, they hold in their hands the power to change thinking and behavior on a massive scale, the power, in close elections anyway, to pick the winner in country after country after country. We calculated at one point that as of 2015, which was quite a while ago, upwards of 25 percent of the national elections in the world were being determined by Google's search algorithm. This is because no one uses any other search engine; everyone uses Google's.
Mr. Jekielek: We don't know if someone is actually putting their fingers on the scale or not. But I remember you telling me, you actually are more concerned if they're not in a way and I thought that was always a very fascinating answer. I want to get you to reprise that because it's counterintuitive in a way.
Dr. Epstein: Sure.
Mr. Jekielek: Yes.
Dr. Epstein: There are various ways in which content could be constructed so that it shifts opinions, shifts beliefs, shifts purchases, shifts votes. It could be that an executive of the company says to his or her minions, "Do this, make this happen." The second way, which is what I call the Marius Milner effect, is that one single software engineer at a company can simply tinker with some parameters.
And is there precedent for that? Oh yeah. In fact, a software engineer named Marius Milner at Google, apparently according to Google, created this massive program in which Google Street View vehicles were driving around 30 different countries for more than three years, not only photographing our homes and our businesses, but also vacuuming up WiFi data. And according to Google, that whole project was the never-authorized invention of one software engineer named Marius Milner.
Did they then fire him? Oh, no. Marius Milner is still at Google and he's a hero at Google. So the point is, there could be rogue employees, rogue programmers who can do it, or maybe they're just not paying attention. In fact, let's call this, as I do in a new article I have coming out, algorithmic neglect. Okay, say they're just neglecting it. Let's say there's an election coming up in Fiji (I used to live in the Fiji Islands with my wife, and more than 90 percent of searches in Fiji are done on Google). So let's say they don't care about Fiji at Google, so they're not paying any attention to the election. Guess what? Their algorithm is still going to favor one candidate over another. Why? Well, that's how it's built, that's what it's supposed to do. It's supposed to put one kind of webpage ahead of another; it's judging webpages and it's judging which ones are better.
And when it does that, it not only puts one dog food ahead of another, or one guitar brand ahead of another, it puts one candidate ahead of another. If you're not paying attention to it, it still does this; that's what it does. To put it another way, there's no equal-time rule built into Google's search algorithm. We wouldn't want it to have an equal-time rule; we want it to tell us what's best and we want it to put that near the top of the search results.
The reason why, to me, that's the scariest possibility is because that means that a computer program is picking the winners of elections, or is picking what dog food we buy or is picking what we should think. That's very scary, because computer programs are really, really, really stupid. So if computer programs are determining who runs the world, who runs many countries around the world, that can't be good for humanity. They're just not smart enough to make good decisions for us.
Mr. Jekielek: Just to belabor this a little bit longer, people just say, "Well, that's just the natural consequence of the algorithm. That's fine." I think that would probably be a common way to think about it.
Dr. Epstein: Well, people don't know how algorithms work. They don't even know what algorithms are. So it could be people are just going to be indifferent about them. But there's another piece in this puzzle that I think would disturb a lot of people and that is the human element. We know from a variety of research that algorithms are built in a way that incorporates the biases of the programmers.
Now, 96 percent of donations from Google and other tech companies in Silicon Valley go to one political party. It happens to be the party I like, which is the Democratic Party. But the point is, there's a lot of political bias in these companies. And we know from very good research that when programmers have bias, that gets programmed into the algorithms, so the algorithms end up having bias. Google admits to adjusting its search algorithm more than 3,000 times a year; that's roughly 10 times a day that human beings are making adjustments, changing it.
Now again, presumably they're changing it in ways that reflect either their personal bias, or the bias of their supervisor, or the bias of the CEO of the company. So again, the human element. The algorithms at Google and other companies, to make them run smoothly, also check lists: blacklists and whitelists. Our main concern here is the blacklists.
In 2016, I published a quite lengthy investigative piece in “U.S. News & World Report” called The New Censorship. It was about nine of Google's blacklists. Now, I had never seen any of Google's blacklists, and Google never admitted to having blacklists, but I knew they existed because I'm a programmer. And one of the simplest ways to make an adjustment in what an algorithm is doing is to have your algorithm check a blacklist before it displays any results to anyone.
Mr. Jekielek: You kind of predicted these things existed, because you saw that certain things just wouldn't appear in search for example, right?
Dr. Epstein: Oh, exactly.
Mr. Jekielek: Yes.
Dr. Epstein: Or things that were appearing suddenly no longer appear. Or a company like Google, or Twitter, or Facebook announces that certain kinds of points of view are really not acceptable, so suddenly they're banned. Well, what do you do then? All you do is add some phrases and some words to your blacklist; that's how you do it. You don't have to reprogram anything, it's so simple.
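As a rough illustration of the mechanism being described, here is a minimal Python sketch of an algorithm consulting a blacklist before displaying results. The phrases and results are invented; the point is only that adding a phrase to the list changes what users see without reprogramming anything else.

```python
# Minimal sketch: filter ranked results against a blacklist of phrases before
# they are shown. The entries and results below are invented for illustration.
BLACKLIST = {"banned phrase", "forbidden topic"}

def filter_results(ranked_results):
    """Drop any result whose title contains a blacklisted phrase."""
    return [
        r for r in ranked_results
        if not any(phrase in r["title"].lower() for phrase in BLACKLIST)
    ]

results = [
    {"title": "Ordinary news story", "url": "https://example.com/a"},
    {"title": "Report on a forbidden topic", "url": "https://example.com/b"},
]
print(filter_results(results))   # the second result never reaches the user
```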
But yes, I was writing about these things pretty early on. In 2019, I testified before a Senate committee in Washington. And just before I testified, a representative of Google, a vice president from Google, testified. He was under oath, and a United States senator asked him, "Does Google have blacklists?" And this man replied, "No, Senator, we do not."
Speaker 1: Has Google ever blacklisted or attempted to blacklist a company, group, individual or outlet from its advertising partners, or its search results for political reasons?
Speaker 2: No, ma'am. We don't use blacklists, whitelists to influence our search results or...
Speaker 1: For what reason does Google blacklist a company?
Speaker 2: As I said, per your previous question, we do not utilize blacklists or whitelists in our search results to favor political outcomes, that's not...
Dr. Epstein: A few weeks later, a fellow named Zach Vorhies, who had been a senior software engineer at Google for eight-and-a-half years, walked out of Google and took property with him. This was the first time that a whistleblower had actually walked out with property. He walked out with more than 950 pages of documents and a two-minute video. Among those documents were three documents labeled blacklists. This was just a few weeks after that hearing. This man was lying under oath to Congress, which is a felony.
Mr. Jekielek: What is the strongest effect that you've been able to observe in your own manipulations and what method was used to create that effect and how did it really work, so people can get a picture?
Dr. Epstein: The strongest effect that we've discovered so far is abbreviated OME, which stands for the opinion matching effect. A paper on this was accepted for presentation at a scientific conference, so at the end of April 2022, we'll be presenting this at a conference. It's so simple. You are, let's say, “The Washington Post”, or you might even be Tinder, and you tell people, "Hey, an election is coming up and we're going to help you make a decision about who to vote for," because a lot of people have trouble. Some people have very strong political views. They know who they're voting for.
But close to an election, in fact, news organizations and sometimes some strange companies like Tinder say, "We'll help you make up your mind. So come to our website, go to this website, click here, and we'll give you a quiz. And based on how you answer the questions about your opinions on immigration, your opinions on nuclear weapons, your opinions on Russia, whatever it may be, we'll tell you which candidate is the best match for you." That's why this is called opinion matching.
We have found that when we, in experiments, give people a quiz and then tell them how good a match they are for one candidate or another (say you match Candidate A by 85 percent and the other one by only 23 percent), and then we say, "Okay, now who are you going to vote for, or how much do you like this candidate or trust this candidate," all the numbers shift in the direction of whichever candidate we said you match. They all shift. And the shifts we get are between 70 and 90 percent, which are the biggest numbers we've ever gotten consistently in any kind of experiment we've run in, now, almost a decade. But there's something else about opinion matching, about OME, that is crazy.
Normally, when we run experiments like this, some people, at least a few people, are suspicious and think that maybe there's some bias in the content they're being shown. Sometimes it's four percent or five percent, sometimes it's 25 percent; it depends. But in OME, nobody suspects that there's any kind of bias in the content that we're showing them. How could they see bias? They don't have an opportunity to see bias, even though we're not paying any attention whatsoever to their answers. We don't even look at their answers. But not a single person who has ever participated in our OME experiments has said anything about bias. Now, are there really opinion matching websites out there that ignore your answers? Oh, yeah. So that's the other part of that research project.
We have been looking at dozens and dozens of opinion matching websites. We've actually created algorithms that do this, so we can actually take a quiz many, many, many times, just typing in random numbers and then checking to see what results we're getting. So we found websites that help you decide which political party you should join.
On one website, you almost always will be told, "You're a perfect match for Republicans. Join the Republican Party," and then on another website, you're told almost always, "You're a perfect match for Democrats, for the Democratic Party. Sign up here." We're typing in random answers. So opinion matching is a fantastic way to manipulate people because you can shift people very, very, very dramatically and they have no clue. They do not suspect any kind of bias or manipulation.
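To illustrate the probing approach described here, below is a minimal Python sketch: submit random answers to a quiz many times and tally the recommendations. The simulated_biased_site function is a toy stand-in, not a real quiz site or API; a site whose recommendation never varies across random answer sets is plausibly ignoring the answers.

```python
# Sketch of probing an opinion-matching quiz with random answers.
# simulated_biased_site is a toy stand-in for a real quiz site.
import random
from collections import Counter

def simulated_biased_site(answers):
    # Toy "site" that ignores the answers entirely and always recommends the
    # same party, the behavior described for some real matching sites.
    return "Party X"

def probe(site, num_trials=1000, num_questions=20):
    """Submit random answer sets and tally which party the site recommends."""
    tally = Counter()
    for _ in range(num_trials):
        answers = [random.randint(1, 5) for _ in range(num_questions)]
        tally[site(answers)] += 1
    return tally

print(probe(simulated_biased_site))
# Counter({'Party X': 1000}): the recommendation is independent of the answers.
```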
Mr. Jekielek: Let's jump to the monitoring now. Broadly speaking, from that data set from 2020, what is it that you found?
Dr. Epstein: In 2020, at first we had about 700 field agents who were monitoring the presidential election, and they were located in swing counties in three swing states. We found, as we have found in the past, a very strong liberal bias, but not on Bing or Yahoo. Bing and Yahoo had a little bit of a conservative slant, but they don't affect very many votes because hardly anyone uses them. On YouTube, 93 percent of the videos being recommended to people by Google's Up Next algorithm came from strongly liberal news sources. Now, that's going to all of our field agents. And in fact, there was more of a liberal bias in those recommendations on YouTube going to conservatives than going to liberals.
Think of it as a kind of arrogance. Now, you could say, "Well, maybe there's just a lot more liberal content out there. Maybe that's why we're seeing what appears to be bias. Maybe it's not really bias, maybe it's just telling you about the content that's available." I don't think so. We've looked into that. There's no way you could get to 93 percent on YouTube just from the availability of content. That's impossible. So a lot of stuff is happening on Google. There was so much happening and we were getting so much data.
At that point, I think we had about a half a million ephemeral experiences preserved, and we did something we had never done in the past, because this is a larger-scale project, done more rigorously than our previous projects. We decided we were going to go public. So among other things, we contacted various papers, “The Washington Post” and other places. We weren't seeing much interest. So we contacted the “New York Post.” A journalist there took a very strong interest, and she took all our content. She started writing a fabulous piece about how the tech companies are rigging our elections. And she read me some of the piece and I thought it was great, frankly.
The next step, this was Friday, October 30th, was that her editor had to get comment from Google on some of the factual content. This is normal. This is perfectly normal. The piece was supposed to come out the next morning, but later that night, really two things happened. Number one, “The New York Post” killed the piece. Wow, I couldn't believe it.
But then, I looked up their traffic sources, and about 40 some odd percent of their traffic was coming from Google. So okay, I get it. You can't really attack Google at least on this kind of a scale without risking your business. So okay, they killed the piece. That's number one. And number two, Google turned off its manipulations in the presidential election. Some of the bias we had been seeing disappeared the next day. So those last couple of days before the election, which was on November 3rd, Google appeared to turn off their manipulations, and we thought, "That's interesting."
So along the way here, I contacted someone I knew in Senator Cruz's office. On November 5th, two days after the election, three U.S. senators sent a very, very strong, threatening letter to the CEO of Google summarizing my preliminary findings in the presidential election. And then a miracle occurred, because at this point we had more than 1,000 field agents located throughout Georgia, and we were very, very carefully monitoring the content coming from the tech companies prior to the Senate runoff elections in Georgia, which were in January of 2021.
And lo and behold, there was the usual bias on Bing and Yahoo and Facebook. Wherever we looked, the usual bias was there, except all the bias was gone from Google. I mean, gone. I mean, literally zeros every single day when we're looking at bias in their search results, zeros, and no Go Vote reminders. And that's true for our liberal field agents, our conservative ones and our moderate ones. Not a single Go Vote reminder.
To put this another way, we, with the help of some senators, got the biggest vote manipulator in history to back down and stay away. The lesson there being: this is the solution to the way in which, at the moment, these companies, some more than others, interfere in our democracy and interfere in our lives and interfere with our children. This is the solution, which is permanent, large-scale monitoring, 24 hours a day, in all 50 states, doing to them what they do to us and our kids. If we do it to them, if we monitor, we capture, we archive, and we expose, then they will stay out of our lives. They'll still make a fortune, they'll still make a lot of money, but they will give us back our free and fair elections.
Mr. Jekielek: So you just said the biggest vote manipulator in history. That's a very strong statement. So explain to me the evidence and which papers can people look up to see the data that qualifies that.
Dr. Epstein: It is a strong statement. And so, I don't make strong statements like that unless I can back them up, unless I think there's evidence. And in this case, when it comes to Google and votes, the evidence is overwhelming. First of all, we have our monitoring data, and in these last couple elections alone, we preserved 1.5 million ephemeral experiences. We have a lot of data that we've analyzed now in great detail. We spent more than a year on the analysis, but there's more.
We have, for example, a video that leaked from Google called The Selfish Ledger, in which Google employees are talking about the power the company has to impose "company values" (that phrase is in the film) on all of humankind, to literally "re-sequence behavior" (also a phrase right from the video), re-sequencing behavior affecting all of humankind and spreading company values.
Speaker 3: As gene sequencing yields a comprehensive map of human biology, researchers are increasingly able to target parts of the sequence and modify them in order to achieve a desired result. As patterns begin to emerge in the behavioral sequences, they too may be targeted. The ledger could be given a focus, shifting it from a system which not only tracks our behavior, but offers direction towards a desired result.
Dr. Epstein: We've got a leak of a PowerPoint presentation from Google called The Good Censor, in which they're talking about the fact that they have to suppress some content and boost other content, that in effect they have no choice. They have to censor content, but they're good censors. What content are they censoring? Well, as it happens, because of the people who work there, they tend to be censoring content that has a certain political leaning, namely conservative content. Now, I'm not a conservative, so I could say, "Keep going. Yeah, absolutely. I love it." But I don't love it, because I don't want a private company that's not accountable to the public deciding what billions of people can see and cannot see. The problem there is you don't know what they don't show.
And then, we've got the whistleblowers. So we've had about a dozen in the past two years and they're coming out of Google, they're walking out of Google sometimes with documents saying over and over and over again, "There's crazy political bias at this company and they are imposing it on people around the country and around the world." The whistleblowers are telling this to us. They're saying it. A couple of them have testified before various congressional committees. So we've got documents now too. We've got leaked emails. The evidence is overwhelming that Silicon Valley companies and Google, above all, are messing with our lives and messing with our elections.
Interesting footnote: country by country, they don't necessarily lean left. They do whatever suits their needs as a company. In Cuba, for example, the government is obviously left, it's a leftist government, but it's very opposed to companies like Google, so Google supports the right. Google has worked with the government of Mainland China off and on for a number of years, helping Mainland China control its population. That, I wouldn't say, is in the spirit of democracy. Google does what it needs to do, country by country by country. But in the United States of America, they definitely have impacted elections on a very large scale, and they definitely lean in one direction politically.
Mr. Jekielek: And just in terms of the impact on the elections, I just want to see if I understand this correctly. You're seeing certain types of what you believe to be manipulations based on this over-the-shoulder recording of ephemeral experiences. At the same time, you're comparing that to the experiments that you've run on the impact that those same types of manipulations can have on people's actual voting behavior. And then, that is how you determine that they're actually having this massive impact. Do I understand that right?
Dr. Epstein: That's exactly right. In fact, we can estimate a range of votes that can be shifted with each technique. So for example, we didn't have much data in 2016, but based on the data that we did collect, and based on the level of bias that we observed on Google, which was absent on Bing and Yahoo, we calculated that Google's search engine shifted somewhere between 2.6 and 10.4 million votes to Hillary Clinton, whom I supported, over a period of months before the actual election.
In the 2018 elections, the midterm elections, we were able to calculate how many votes were being shifted. It was in the millions spread across hundreds of elections. In 2020, again, we were able to do calculations showing that at least 6 million votes were shifted by Google to Joe Biden and other Democrats. These are estimates, but they're estimates based on almost a decade of very rigorous controlled experiments. Add to that the leaks, add to that the whistleblowers, and we get a pretty solid picture, I think, of what's occurring.
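For readers who want a feel for how an estimate like this is put together, here is a deliberately simplified sketch: combine an experimentally measured shift rate for undecided voters with an estimate of how many undecided voters were exposed to biased content. Every number below is a placeholder, and this is not the actual model or data behind the figures quoted above.

```python
# Simplified, purely illustrative back-of-the-envelope estimate.
# All inputs are placeholders, not actual study values.
def estimate_votes_shifted(exposed_undecided, shift_rate_low, shift_rate_high):
    """Return a (low, high) range of votes shifted among exposed undecided voters."""
    return exposed_undecided * shift_rate_low, exposed_undecided * shift_rate_high

low, high = estimate_votes_shifted(
    exposed_undecided=20_000_000,  # hypothetical count of exposed undecided voters
    shift_rate_low=0.20,           # hypothetical lower-bound shift from experiments
    shift_rate_high=0.40,          # hypothetical upper-bound shift from experiments
)
print(f"estimated votes shifted: {low:,.0f} to {high:,.0f}")
# estimated votes shifted: 4,000,000 to 8,000,000 (with these made-up inputs)
```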
I've been working also with attorneys general from various states and with members of Congress. We are getting more aggressive in going after Google and other companies, but the EU is way ahead of us, and they've been collecting their own sorts of evidence now for a long time. Since, I believe, 2017, they have fined Google over €10 billion. And by the way, euros are worth a lot more than dollars. So that's a lot of fines.
I was invited just a few weeks ago to talk to members of the European parliament. I've spoken to them before. They're moving on these issues and they could make a big difference. In our country, it's not clear that our Congress is going to do very much to protect us from manipulation by these companies. The EU, I think, could make a huge difference. They can take very dramatic actions to protect Europeans at first, but that would have a ripple effect around the world.
Mr. Jekielek: This particular work obviously is incredibly politically charged. I noticed that you keep mentioning your own political persuasion. Well, actually, quickly, why do you keep mentioning it?
Dr. Epstein: I keep mentioning my own political persuasion because I'm cursed. I have a curse. I'm cursed by the fact that my findings tend to help conservatives, which is not what I'm trying to do. And as a result though, because that is the case, a lot of my own peers, even members of my family, are mad. They think I'm a Trump supporter which I'm positively not, and it is a kind of curse. It's awful. I keep trying to cast off the curse. So over and over again, far too many times, I keep saying I'm not a conservative. Does it take care of the curse? No.
Mr. Jekielek: So with some of this politically charged work, for example the finding that your estimate is 6 million votes were shifted in 2020 across all these different candidates in the presidential election, have you subjected that to peer review, those types of findings? And if so, what do those peers say when they see it?
Dr. Epstein: Well, some of the findings from our 2020 and 2021 monitoring projects, those have gone through a level of peer review for a scientific presentation, and that is a level of peer review. So we have presented that work to colleagues, again, in a context of scientific meetings. And that will soon be submitted for publication in a peer-reviewed journal. That's a process that takes time, and that will most certainly be published either toward the end of 2022 or perhaps early 2023. And it'll be there. It'll be on the record and it will have gone through peer review. I hope it helps when people are skeptical about what I'm doing. I hope it will help, but it may not. People are not thinking very clearly these days.
Partisan thinking dominates everything. It dominates media far more than it should. It dominates our Congress, which, for the most part, can't get anything done. It even dominates how people view science. All I can do is what I know how to do, which is very, very good science, and that's what I'm doing. Without the work that my team and I are doing, people are just blind. They're oblivious. They have no idea how new technologies are impacting their lives. They can see it at the superficial level.
"My teenage daughter seems to spend 200 hours a week on her devices, and there aren't even 200 hours in a week." I've had a lot of people say things like that to me. They can see what's happening superficially, but they don't understand the impact that these companies are having on people's minds, and nobody really understands how these companies are impacting our children, especially young children. So that's become my latest obsession: trying to figure that out.
Mr. Jekielek: Well, it almost seems, you know, to go back to the beginning of the interview a little bit, it's almost like they've kind of taken a page out of B. F. Skinner's playbook, so to speak, right? And perhaps imagine themselves sort of trying to positively engineer human thinking towards the right way that it should be.
There is this famous quote from Mark Twain, I never remember it exactly, but basically, “It's a lot easier to dupe people than to show them or convince them that they have been duped.” I just think there's kind of an intrinsic opposition in a lot of people's minds to even accepting that everything we've just talked about could be true because as you said before, I'm the master of my own fate, how dare you suggest that? Somehow I've been manipulated to make my decision, right?
Dr. Epstein: Well, the problem is actually darker than what you're suggesting, because there's very good research showing that when you use a method of manipulation that people cannot see, people end up believing wholeheartedly that they have made up their own minds. So it's very hard to change people's minds when in fact people believe that they've made up their own minds. You can even go back and show them what you did. You can show them that they were randomly assigned to one group and that they made up their minds according to how they were supposed to, according to the bias that we had control over. You can show them all that, and it doesn't help very much. People are not clear thinkers as a rule.
Fortunately, I don't need people to understand all these intricacies. I don't need everyone getting scared. In fact, that would be scary. I don't want everyone being scared. What we have to do is take concrete measures to get these companies off our back. Now, we can do that without convincing everyone that these companies are doing some bad things or they have the power to do bad things. We don't have to convince everyone of that.
So now and then, the government sets up programs to protect people, even though people don't necessarily understand the issues. It was our government, particularly one surgeon general, who stepped up and said cigarettes are dangerous, they're bad for our health. I don't think the American public believed that, but the government started a program, a long-term program of education and of changing various kinds of laws regarding smoking in confined areas and so on and so forth. They stepped up because there were sound medical reasons for doing so.
Well, the same is true here. There are sound reasons for wanting to get these companies out of our elections, because if we don't, it makes democracy into a kind of joke, or at least an illusion, because it means that in close elections, election after election after election, these companies are choosing the winner. But the public doesn't necessarily have to understand this or understand the details; just our leaders, or some subset of our leaders, have to understand, or some very wealthy philanthropists have to understand what's going on and say, "You know what? Epstein, set up that monitoring system. Let's get these companies out of our elections." The cool thing about monitoring is that if you can get these companies to back down, the monitoring system detects it. And if these companies come back in and start fiddling around again, the monitoring system detects it. I'll give you one quick example.
Suppose on election day in 2016, Mark Zuckerberg had decided to send out go vote reminders just to Democrats. Zuckerberg is a Democrat and supports Democrats; the vast majority of donations from Facebook, or Facebook Meta, in fact go to Democrats. So Mark Zuckerberg on election day decides to send out go vote reminders just to Democrats. First of all, could he have done that? Of course. What impact would it have had on votes? Well, we know from Facebook's own published data, in a study that was done with former colleagues of mine at the University of California, San Diego, which is right near where we're sitting, that that would've given Hillary Clinton, that day, approximately 450,000 additional votes. Now, did Zuckerberg do that? I don't believe he did, but we weren't monitoring that.
Now, fast forward to let's say the upcoming midterm elections in 2022 or how about the presidential election 2024, we are going to be monitoring. Okay, if Zuckerberg sends out go vote reminders and he does it in a targeted way, he's sending it only to one particular group, we would detect that within minutes, and we would report that to the media, to the Federal Election Commission, to members of Congress, all hell would break loose. And I guarantee you that within an hour or two, he would take down those vote reminders. Now, meanwhile, he would've definitely shifted a lot of votes. That's the problem.
But then he would've been charged, probably, with a crime. What's the crime? Well, he's actually breaking the laws that have to do with campaign donations. He's breaking the laws. Because even though it didn't cost him one dime to send out those targeted vote reminders, it doesn't matter. What he's just done is make a huge in-kind donation to a political campaign without declaring it, and that's against the law.
If you have very large-scale, extremely well-designed monitoring systems running, and you're capturing a massive amount of data showing, look at this, we have 10,000 field agents and 3,000 are Democrats, 3,000 are Republicans, 3,000 are moderates or independents. And look at this, it's only the Democrats, only liberals, who are getting his reminders. Okay, that could be presented to a court of law, in which case he could be in serious trouble. Not just him, but his colleagues at the company. Not just them, but the company itself could be in deep trouble. If we're documenting more and more precisely what it is they're doing, and how they're affecting people, and how they're affecting elections, and how they're affecting children, it's very possible, I would say likely, that our leaders will pass laws.
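A minimal sketch of the detection logic being described: with field agents balanced across political groups, compare how often each group is shown a targeted message. The counts below are invented; a large gap between groups is the kind of pattern a monitoring network could flag and document.

```python
# Sketch: compare "go vote" reminder rates across balanced groups of field agents.
# The counts are invented for illustration.
def reminder_rates(observations):
    """observations maps group -> (agents shown the reminder, total agents)."""
    return {group: shown / total for group, (shown, total) in observations.items()}

rates = reminder_rates({
    "liberal":      (2850, 3000),
    "conservative": (  40, 3000),
    "moderate":     (  55, 3000),
})
for group, rate in rates.items():
    print(f"{group:>12}: {rate:.1%}")
# A ~95% vs. ~1-2% gap between groups is exactly the kind of targeted treatment
# that could be captured, archived, and reported.
```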
Now, is this necessarily going to happen in the federal government? No, not necessarily. I mean, they're dysfunctional. But could it happen in states? Positively. California has a data protection law. It's the only state so far that has a strong data protection law, but it's a law just like the data protection law in Europe. So states could go after these companies, again, based on a massive amount of data being collected.
Think of the alternative. The alternative is if there is no monitoring, then these billions upon billions of ephemeral experiences that are affecting us are gone. They disappear forever. Every single decade, thousands of elections occurring and millions of children being impacted. And if you don't monitor, you'll never understand what happened.
I've had parents telling me that they're very concerned about what's happening with their kids, how their kids are behaving. They don't get it, and they think maybe it's because of social media, but they don't really understand what's happening on social media. "You know, so I took away my daughter's cell phone, I took away her iPad, and all that did was create chaos in the house. That didn't solve any problem. And I don't think she was really off social media anyway. I think she was getting on ..." In other words, people are going nuts because they don't really understand, they don't know what's happening. Monitoring systems will tell us; they'll show us.
Mr. Jekielek: You mentioned this opinion matching effect earlier, which you said was the strongest effect. Briefly, could you kind of summarize some of the strong effects that you've found across these different platforms as well so people can get a picture of what is happening?
Dr. Epstein: Well, over the years, we've been very successful in building simulations of different online platforms. The first one we built was a simulation of the Google search engine; our platform was called Kadoodle. We have a superb YouTube simulator, we have a Twitter simulator, we have a simulator of Amazon's personal assistant. So we have a bunch of these and we're building more. These allow us to do various kinds of experiments and to see whether activities on these platforms could indeed shift thinking and behavior. We're trying to figure out how the manipulation works, but most importantly, we're trying to quantify it. So we're trying to figure out: how many votes could we shift? How far could we shift opinions? How many more people could we convince to get off their sofas and vote who otherwise would stay home?
With kids, we haven't really started that work yet, and we're not even sure what the questions are at this point. But when we begin that work, we're going to be generating, I'm sure over time, hundreds of different questions, and we're going to find ways to understand the potential for manipulation and then to quantify it.
So we've found all kinds of manipulations, and we have been able, over time, to quantify them and understand them. The first one was SEME, the search engine manipulation effect. That was 2013, but we're still working on it. We're still learning about it almost a decade later. The SEME-
Mr. Jekielek: And that one is about changing the ranking positions, right? If I understand it correctly.
Dr. Epstein: Well, it involves filtering and ordering. So filtering means you're picking certain webpages and certain search results and not others. And then there's the ordering and you're putting them in a certain order. So SEME, we found, can easily shift 20 percent or more of undecided voters. In some demographic groups, the number can be much higher. The highest number we've ever found was 80 percent, and that was moderate Republicans, and that was in a nationwide study that we did in the United States. We did that in India. Again, we found some groups that we could influence by over 60 percent.
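As a toy illustration of the two levers just named, filtering and ordering, here is a minimal Python sketch in which a single bias term reorders results. The pages, scores, and bias weight are invented; real search ranking is vastly more complex.

```python
# Toy sketch of SEME's two levers: filtering (what makes the list) and ordering
# (what appears first). The pages, scores and bias term are invented.
def rank(results, bias_toward=None, bias_weight=0.0):
    """Sort results by relevance plus an optional bonus for one candidate."""
    def score(r):
        bonus = bias_weight if r["favors"] == bias_toward else 0.0
        return r["relevance"] + bonus
    return sorted(results, key=score, reverse=True)

pages = [
    {"url": "example.com/p1", "relevance": 0.90, "favors": "Candidate B"},
    {"url": "example.com/p2", "relevance": 0.85, "favors": "Candidate A"},
    {"url": "example.com/p3", "relevance": 0.80, "favors": "Candidate A"},
]
print([p["url"] for p in rank(pages)])
# ['example.com/p1', 'example.com/p2', 'example.com/p3']  (neutral order)
print([p["url"] for p in rank(pages, bias_toward="Candidate A", bias_weight=0.15)])
# ['example.com/p2', 'example.com/p3', 'example.com/p1']  (pro-A pages rise)
```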
Search suggestions, that's another source of influence, a new kind of influence, so we named it the search suggestion effect, or SSE. We found that search suggestions are very powerful. This is mainly because of the ability a search engine has to suppress certain kinds of suggestions and to allow other suggestions to appear when it suits its needs.
For example, let's say it's Clinton versus Trump, and let's say I support Clinton. So the first thing I would do with my search suggestions is make sure when someone is typing a search term, that they never get a negative search suggestion for Clinton. So you're screening out, very, very rapidly, you're screening out negative terms and all you're left with are some neutral or some positive terms, but you don't do that for her opponent.
So for Trump, you just let everything fly up, no matter how negative the terminology is. As it happens, in June of 2016, a news organization called SourceFed released a video that was quite well produced. And basically, it said, "Look, we've been looking at search suggestions on Google, Bing and Yahoo. And on Google, it's just about impossible to get any negative search suggestions for Hillary Clinton, but you can get them for Trump." You can get "Trump is an idiot" popping up as you start to type a search term. "Trump is a horse in a hospital," that one pops up. Okay, but for Hillary, no.
On Bing and Yahoo, if you start to type in "Hillary is..." you'll get "Hillary is the devil," "Hillary is ill," "Hillary is sick," "Hillary..." You'll get eight or 10 highly, highly negative suggestions. Now, if you go over to one of Google's tools that tells you what people are actually searching for, and you look up Hillary Clinton, you're going to find that they're searching for negative things. But Google's search engine, so it appeared, at least in the summer of 2016, was suppressing negatives.
So we've done controlled experiments on SSE, the search suggestion effect, and we have found that just by either suppressing or not suppressing negative search suggestions, we could turn a 50/50 split among undecided voters into a 90/10 split, with no one having the slightest idea that they're being manipulated.
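Here is a minimal sketch, in Python, of the kind of differential suppression described for SSE: negative autocomplete suggestions are filtered out for a protected candidate but left alone for the opponent. The word list, names, and suggestions are placeholders; real autocomplete is far more sophisticated.

```python
# Sketch of differential suppression of negative autocomplete suggestions.
# Word list, names and raw suggestions are placeholders.
NEGATIVE_WORDS = {"liar", "scandal", "crook", "sick"}

def suggestions_for(candidate, raw_suggestions, protected=None):
    """Return up to four suggestions, filtering negatives for the protected candidate."""
    if candidate == protected:
        cleaned = [s for s in raw_suggestions
                   if not any(w in s.lower() for w in NEGATIVE_WORDS)]
        return cleaned[:4]
    return raw_suggestions[:4]

raw = ["candidate a liar", "candidate a scandal",
       "candidate a policy", "candidate a speech"]
print(suggestions_for("Candidate A", raw, protected="Candidate A"))
# ['candidate a policy', 'candidate a speech']  (negatives suppressed)
print(suggestions_for("Candidate A", raw, protected="Candidate B"))
# all four raw suggestions, negatives included
```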
More recently, we've been looking at these so-called answer bot effects. This applies both to the answer boxes that Google now shows pretty frequently above search results. And you know, if you show people an answer box, number one, they don't spend as much time looking at search results and they don't click on very many search results. Very often, they just accept what's in the answer box as being true.
But we've shown that when you put an answer box above search results, it increases the shift that you get from biased search results. And if you show people perfectly unbiased results, but you give them an answer box that favors one candidate, we've produced shifts of about 38 percent just from the answer box where all the search results themselves are unbiased. And then we realized, wait, there's something else out there in the world now that's just like that, where just someone asks a question and they get one answer and that's the personal assistant.
So we call these IPAs, intelligent personal assistants, and that's assistants like Apple Siri, which just so happens, gets all of its answers from Google. Like Amazon's Alexa. It's another one of these devices like, well, the Google Assistant, which comes on Android phones, and answers your questions. And with personal assistants, you ask a question and you don't get a list of 10,000 search results, you just get an answer. It might be wrong, it might be right.
We have been conducting experiments now with personal assistants, and they talk. In fact, they sound just like Amazon's Alexa does. And we have been able to produce shifts of 40 percent or more based on just one question-and-answer interaction. If we let people ask multiple questions, six questions or more, we're getting shifts of 65 percent or higher.
The downside of that, by the way, is that if you let people ask multiple questions and you're showing them biased answers over and over and over again, they start to detect the bias, but it doesn't have much impact on the shift that you're getting in their opinions. And that's something we've found pretty consistently now for, again, almost 10 years. Merely being able to see the bias doesn't necessarily protect you from the bias.
Mr. Jekielek: Now, I have to ask this, have you done any work to assess whether the effect of this is cumulative? Because of course, people are being affected by multiple of these types of interactions with their devices.
Dr. Epstein: So far, we've only done one kind of experiment in which we repeat a manipulation. And sure enough, if you repeat a manipulation, the numbers go up. Now, they go up kind of more and more slowly as one would expect, because you're reaching some sort of asymptote, some sort of maximum, but we are planning actually a few months from now, that's on our schedule, to approach that issue very aggressively. Because as it happens, almost all, if not all of the tech companies in Silicon Valley have the same political bias. So it has crossed our mind. I mean, it's a natural thing. What if people were seeing the same kind of bias, not just over and over again on one platform, but across platforms?
And in a way I regret having focused so much on Google, although I think it is the biggest player here, because I think that what's really happening is that there is a cumulative effect of not just political bias, but of values; literally a cumulative effect of being exposed to certain kinds of values over and over and over again on one tech platform after another. And I think that the people who are most vulnerable to being impacted by that kind of process are children.
And so that's why it's so important to me ... As a dad, it's so important to me to finally begin to explore that. I thought about this years ago, but I just couldn't figure out how we would do that. How are we going to look at kids? How are we going to look over their shoulders as they're on their little mobile devices or their little iPads? How are we going to do that? We finally figured it out and we finally got funding for it too. So that's going to happen this year. That line of research is going to happen. We're going to be looking at multiple sources of influence and we're going to be looking at kids.
Mr. Jekielek: Well, so this is exactly what I wanted to touch on here. And when it comes to being undecided on things, I think a lot of kids are actually quite undecided. Probably a much larger portion of younger people, they're just trying to figure out their identities, who they are, what they like, they're testing different things. These giant systems are there ready to provide answers in a particular direction based on their values as you suggest. Wow! That is really unprecedented in history. And so first of all, like how long has this been going on? I mean, that's something else that you've been ... Have you been able to assess these types of manipulations, when they would've started?
Dr. Epstein: I have a rough idea, and it's only rough, but the fact is, if you look at Google's history: Google was really the first company that turned into one of these mega giants that we have now. In 2004, I believe, they introduced search suggestions, or what they called autocomplete. The programmer who invented it, named Kevin Gibbs, I believe, thought it was a cool thing, and he said, "Wow! Look at this cool thing, it's going to give you an idea of what other people are searching for. It'll speed up your searches, et cetera, et cetera." But in 2008, they changed it. Originally, this was an opt-in feature. In 2008, it became standard and you couldn't even opt out of it. Okay, so I think something started to shift there. And then in 2010, they shifted from always having had 10 search suggestions on the list to having just four.
Now, as it happens, we discovered in 2016 with our SSE research that if you want to maximize your control over people's searches using search suggestions, the optimal number of suggestions to show them is...
You guessed it, that's right. It's four. So what that's telling me is that somewhere in that period, 2008, 2010, somewhere in there, they were shifting. I recently heard from a man named James Whittaker; he was the first high-level Google executive to leave the company. And when he left, he was being badgered: How could you leave? You're making a fortune and it's so cool. It's such a cool place. So he issued a little public statement, and the public statement said, "Well, yeah, I was there at Google almost from the very beginning and it was like the coolest place on the planet to be working. There was so much creativity and everything was fun. And we were just... It was wild," he said. He said, "And then one day it turned into a ruthless advertising company. It turned into something else, something very different."
He said, "And that's not what I signed up for. So I left." So there was a shift, there was a shift because what people were realizing was that they had all these cool tools, which initially they were just using to help people find information, but they realized they could make a lot of money. And then they realized that they could actually exert influence over people. In other words, they started discovering how they could, I guess the right term is repurpose, what they already had. And it's that repurposing that's led to this scary world that we live in now. Only a decade or so later, this is a scary world in large part because of the tech companies and these crazy powers that they have. And they know it, they know they have these powers. How do I know that?
Well, one of the whistleblowers is a fellow named Tristan Harris. He was featured, I think just last year, in a very popular Netflix film called The Social Dilemma. Tristan, whom I know, actually came out into the real world and said, "Guess what I was doing the last few years: I was working at Google on a team of hundreds of people, and we were just trying to figure out, every single day, how to manipulate a billion people around the world."
Mr. Jekielek: You mentioned Zach Vorhies earlier. I believe that one of the documents he took with him when he left Google was the one about machine learning fairness, if I recall correctly? So that's a kind of manipulation that is actually very, very transparent, unlike the things we were just talking about, but it can give us insight into how Google perceived its, I guess, ideological mission, if you will.
Dr. Epstein: It sure will. I don't know how carefully people have ever looked at those documents that Zach brought with him out of the facility, but there's some amazing stuff in there. There are multiple documents on machine learning fairness, or algorithmic fairness, they call it, where basically they're saying: look, most firefighters are male. When people type "firefighter" into Google and then they click on images, look at this, they're getting all these men who are firefighters. And that may be true. It may be factually true. That proportion might be accurate, but it's not fair. So we're going to take care of that. We are going to make our algorithms fair. So that means now, even though it's not technically true, people are going to see half male and half female.
In fact, now they throw in some transgender firefighters, and now they're going to balance them for race and ethnicity. They're creating a fake world because of the values that they hold. Now, not everyone holds those values. There was also a manual in there for their Twiddler system, the Twiddler manual. I'd never heard of it before. And right on the front cover, it says something like: Twiddler, a system for re-ranking search results. What? Now, over and over and over again, when some Google representative would criticize my work publicly, they would always say, "We never re-rank our results for political purposes." And I, in my head, would go: re-rank, what's that? I never accused them of re-ranking anything.
And here's a manual that's all about re-ranking. What's going on here? What's going on here is something amazing, which is that they actually have software that would allow them, as Zach told me, and this is an exact quote, to turn bias on or off, "like flipping a light switch." And that's what I'm pretty sure they did in Georgia in those Senate runoff elections in late 2020, early 2021. I think that's what they did. So they have the ability not just to shift people's thinking and behavior by showing biased content; they have the ability to turn the bias off. That's astonishing to me, because search engines are pretty complicated algorithms. But they figured out how to do it. I mean, there are some brilliant people there and they figured out how to do it. You know what? Let's take advantage of that. Let's monitor them. Let's do to them what they do to us: let's monitor them, let's expose what they do, and let's force them to flick that switch to the off position.
Mr. Jekielek: So how do you, and what would you suggest to people watching the show right now, avoid being influenced this way? Because clearly we're being influenced in all these myriad ways, and most people aren't ready to turn it all off. I probably, like you, spend the vast majority of my day online doing things.
Dr. Epstein: It's hard to answer that question. You're asking a very, very good question, and I can tell you what you can do fairly easily, but it's not going to ease all your concerns. What you can do fairly easily is just go to my website, which is at myprivacytips.com. That's really easy to remember, myprivacytips.com, and you'll get to an article of mine. And it begins: I have not received a targeted ad on my phone or my computers since 2014. And that's true. I started learning back in 2013, 2014, when I started doing this kind of research, how to use tech and keep my personal life to myself. I've learned how to protect my privacy pretty well. And that's why I don't get targeted ads. So everyone can learn that. There are some pretty easy steps there. It's not that complicated. Do you have to spend some money instead of just using all these so-called free services? These services are not free.
You pay for them with your freedom. So do you have to pay a fortune if you're giving up these so-called free services? No, you're paying maybe $10 to $15 a month maximum, and yes, you're signing up for some alternative services that give you fantastic kinds of results, that work beautifully. But you're pulling back from Google, you're pulling back from the companies that depend on surveillance for their income. Instead, give someone five bucks a month so they don't have to turn you into a product that they're going to sell. So yeah, you spend a little money, you sharpen up some skills, and you can definitely use tech and protect your privacy pretty well. So that part, I know how to help people with. And the research institute where I do my work has now set up a new organization, which I hope people will keep an eye out for, which is called Internet Watchdogs.
Internet Watchdogs is going to be a place where there's a big forum and there are lots of videos and lots of services. It's going to help people break free of Google. It's going to help people protect their privacy, and help people protect the privacy of their children and their family. People will be able to sign up. People can become members. And in fact, if you get involved early on, you can take on a position of real responsibility. I think the world needs this.
We're going to help, I hope over time, hundreds of thousands, maybe millions of people break free of this surveillance business model. I think it's very creepy. Now, there are larger questions here though. I mean, how do you protect democracy? What do I do about my kids who don't care? A lot of young people now just don't even care about privacy. They've never had it. So they just don't care about it. They don't understand that it used to exist, that you used to be able to finish a letter to someone and seal it and put it into a mailbox.
And unless you're a criminal and there's a court order against you, no one will see what's in that letter until it reaches the recipient; no more. If you use Gmail, all of your emails, the incoming emails, all the outgoing emails that you write, all the emails that you kind of write, and then you say, "No, I'm not going to send that, that's crazy. I'll get fired." Even those emails are preserved permanently by Google. They're analyzed by Google's algorithms. They're used to create digital models of you to predict your behavior. They're used to control you.
So young people don't even understand that. They've grown up in a world where there is no privacy. So that's a tough problem. I don't have any easy answer for that problem. That's a generational phenomenon that we have to face. No one intended that to happen, but yeah, we're raising kids now who don't know what privacy is, and I don't have a simple solution to that problem.
Mr. Jekielek: What about just the basic question: what search engine do you use? Obviously it's not Google, or maybe once in a while you flip on a VPN or incognito mode or something like that and you check Google, but what do you use mostly?
Dr. Epstein: Well, if you go to myprivacytips.com, you'll find out that I use a fairly new browser, which is called Brave. And within Brave, I use a search engine that's also called Brave. And it's pretty darn good. I'm very impressed with it. I don't use Gmail. I use an email system based in Switzerland called ProtonMail. So they're all very, very private; they're encrypted end-to-end. They're just like the old letters, because when I send out an email, it gets encrypted. No one can read it, not even the ProtonMail company. The only person who can read it is the recipient, you at the other end, just like...
Mr. Jekielek: Well, as long as I have a ProtonMail, though, it gets a little less secure if…
Dr. Epstein: As long as you have a ProtonMail, that's correct.
Mr. Jekielek: Incredibly valuable information. And count me in on being an early one to sign up for Internet Watchdogs. Any final thoughts before we finish?
Dr. Epstein: I'm hoping that people don't just dismiss this. I'm hoping people will take a little time and learn a little bit more, and I've been trying to make that easier and easier for people. So I testified before Congress in 2019. Very recently, just a few weeks ago, I took that congressional testimony and I updated it. And it's now in a pretty cool-looking booklet called “Google's Triple Threat To Democracy, Our Children And Our Minds.”
That is free for people to download; it's at Googlestriplethreat.com. So if you want to learn more, go there. If you want to learn even more and branch out to all the different kinds of studies that we're doing, go to mygoogleresearch.com. That is also a place where you can support our research, if you have the ability to do that. But I actually do want to make a particular kind of plea, a plea, we'll call it.
We are ready now to take the work that we have done to the next level. And by that, I mean, we know how to do the research. We know how to do the monitoring. We know how to do the monitoring on a large scale. We know how to do the analysis. We even have now an algorithm that will calculate the political bias in a webpage in a split second.
That's going to allow us, in the elections that are coming up, to look at content that's being generated by the tech companies and analyze it in real time, analyze it lightning fast, and find bias as it is occurring. So we've made a lot of progress. We need to go to the next level. The next level means setting up a permanent, large-scale, self-sustaining monitoring system in all 50 U.S. states. This is something which I think is required. It's not optional.
This is required for us. We must do this to protect our country, our democracy [and] our children. This must be done. It'll protect us from the Googles of today and the Googles of tomorrow, because monitoring is tech, and it can keep up with tech, with emerging technologies. Even decades from now, monitoring will protect us.
So here's my plea, my plea is that if you're in a position to help us go to the next level, please contact us. If you know someone who's in a position to help us build this big system in all 50 states, please contact us. You can reach me through our website, which is aibrt.org. You can reach our associate director, Michelle Volo, who is a fantastic human being, and you can reach her and she can get you involved with Internet Watchdogs. There's all kinds of ways to join us. And I hope very much that you will join us.
Mr. Jekielek: So I have to ask this. Many of our viewers have been watching how, let's call it the administrative state, for example, in the government has become a lot less accountable to the American people. There's a lot of that kind of perception. How is it that your system here, Internet Watchdogs and these other ones, the monitoring systems, how are they going to stay, I guess, pure to their mission?
Dr. Epstein: That's a great question. I actually have an answer for it, because we are like Google was at the beginning, and Google got twisted into an ugly, kind of greedy version of itself because of money, because it's a for-profit, because they found ways to get obscenely rich. Now, Internet Watchdogs, AIBRT, any project I've ever touched, really, is in the nonprofit world, is in a different world. There's no ownership. I don't own anything in these projects. No one does; you can't, by definition, because they're nonprofits. So we are the original Google. We are doing creative, crazy, wild stuff. We're protecting humanity, protecting the kids, protecting democracy, having a ball, having fun while we're doing it. I mean, my team loves coming in. They love working long hours. Some of them don't get paid at all and they still do it. So that's where we are. And I think that's going to remain the case.
The reason why good Google turned into something else, something twisted and demonic and distorted, is because of money, because it's a for-profit and people realized after a while that they could become millionaires and then billionaires. That's what's happened, and money makes people nuts. All the projects that I've set up my whole life have been nonprofits. So people do what we're doing because of passion, because of love, because of good intentions. And I think that will still be true of the research institute that I helped to found, and also of Internet Watchdogs. I think if we're lucky enough to be chatting again a decade from now, or even 20 years from now, and we're both going to look fabulous by the way, just as we do now, we're still going to look great.
I think in the future we're still going to be excited about what we're doing, because we're in this kind of contest, a kind of cat-and-mouse game, in which we are using good tech to fight bad tech, to protect kids, to protect humanity, to protect democracy, without the money factor to drive people crazy. So I think we're on a good track. We're making fabulous progress, and I think the future is going to be even better than what we're seeing now.
Mr. Jekielek: Well, Dr. Epstein, I wish you Godspeed and such a pleasure to have you on the show again.
Dr. Epstein: Thank you.
This interview has been edited for clarity and brevity.
Subscribe to the American Thought Leaders newsletter so you never miss an episode.
----------------------------------------
Source

https://www.theepochtimes.com/epochtv/dr-robert-epstein-inside-big-techs-manipulation-machine-and-how-to-stop-it-4388332
