Alternatives To Censorship: Interview With Matt Stoller By Matt Taibbi

Authored by Matt Taibbi via TK News,

Led by Chairman Frank Pallone, the House Energy and Commerce Committee Thursday held a five-hour interrogation of Silicon Valley CEOs entitled, “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation.”

As Glenn Greenwald wrote yesterday, the hearing was at once agonizingly boring and frightening to speech advocates, filled with scenes of members of Congress demanding that monopolist companies engage in draconian crackdowns.

Again, as Greenwald pointed out, one of the craziest exchanges involved Texas Democrat Lizzie Fletcher:

Fletcher brought up the State Department’s maintenance of a list of Foreign Terrorist Organizations. She praised the CEOs of Twitter, Facebook, and Google, saying that “by all accounts, your platforms do a better job with terrorist organizations, where that post is automatically removed with keywords or phrases and those are designated by the State Department.”

Then she went further, chiding the firms for not doing the same domestically, asking, “Would a federal standard for defining a domestic terror organization similar to [Foreign Terrorist Organizations] help your platforms better track and remove harmful content?”

At another point, Fletcher noted that material from the January 6th protests had been taken down (for TK interviews of several of the videographers affected, click here) and said, “I think we can all understand some of the reasons for this.” Then she complained about a lack of transparency, asking the executives, “Will you commit to sharing the removed content with Congress?” so that they can continue their “investigation” of the incident.

Questions like Fletcher’s suggest Congress wants to create a multi-tiered informational system, one in which “data transparency” means sharing content with Congress but not the public.

Worse, they’re seeking systems of “responsible” curation that might mean private companies like Google enforcing government-created lists of bannable domestic organizations, which is pretty much the opposite of what the First Amendment intended.

Under the system favored by Fletcher and others, these monopolistic firms would target speakers as well as speech, a major departure from our current legal framework, which focuses on speech connected to provable harm.

As detailed in an earlier article about NEC appointee Timothy Wu, these solutions presuppose that the media landscape will remain highly concentrated, the power of these firms just deployed in a direction more to the liking of House members like Fletcher, Pallone, Minnesota’s Angie Craig, and New York’s Alexandria Ocasio-Cortez, as well as Senators like Ed Markey of Massachusetts. Remember this quote from Markey: “The issue isn’t that the companies before us today are taking too many posts down. The issue is that they’re leaving too many dangerous posts up.”

These ideas are infected by the same fundamental reasoning error that drove the Hill’s previous drive for tech censorship in the Russian misinformation panic. Do countries like Russia (and Saudi Arabia, Israel, the United Arab Emirates, China, Venezuela, and others) promote division, misinformation, and the dreaded “societal discord” in the United States? Sure. Of course.

But the sum total of the divisive efforts of those other countries makes up at most a tiny fraction of the divisive content we ourselves produce in the United States, as an intentional component of our commercial media system, which uses advanced analytics and engagement strategies to get us upset with each other.

As Matt Stoller, Director of Research at the American Economic Liberties Project, puts it, describing how companies like Facebook make money:

It’s like if you were in a bar and there was a guy in the corner who was constantly egging people on to get into fights, and he got paid whenever somebody got into a fight? That’s the business model here.

As Stoller points out in a recent interview with Useful Idiots, the calls for Silicon Valley to crack down on “misinformation” and “extremism” are rooted in a basic misunderstanding of how these firms make money. Even as a cynical or draconian method for clamping down on speech, getting Facebook or Google to eliminate lists of taboo speakers wouldn’t work, because it wouldn’t change the core function of these companies: selling ads through surveillance-based herding of users into silos of sensational content.

These utility-like firms take in data from everything you do on the Web, whether you’re on their sites or not, and use that information to create a methodology that allows a vendor to buy the most effective possible ad, in the cheapest possible location. If Joe Schmo Motors wants to sell you a car, it can either pay premium prices to advertise in a place like Car and Driver, or it can go to Facebook and Google, who will match that car dealership to a list of men aged 55 and up who looked at an ad for a car in the last week, and target them at some other, cheaper site.
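To make the mechanics concrete, here is a minimal sketch of that matching logic in Python. All names and numbers are hypothetical, and real ad exchanges are vastly more complex, but the incentive structure is the same: build a profile from cross-site tracking, match it against a vendor’s targeting rule, and reach the target on the cheapest available inventory.

```python
# Illustrative sketch only; not any real ad platform's code.
# A "profile" is assembled from tracking across sites; the platform matches a
# vendor's targeting rule against profiles and buys the cheapest slot that reaches them.
from dataclasses import dataclass, field

@dataclass
class Profile:
    user_id: str
    age: int
    recent_topics: set = field(default_factory=set)  # inferred from browsing anywhere on the web

@dataclass
class AdSlot:
    site: str
    cost_per_impression: float

def matches(profile: Profile, min_age: int, topic: str) -> bool:
    """The vendor's rule: e.g., men 55 and up who viewed a car ad in the last week."""
    return profile.age >= min_age and topic in profile.recent_topics

def cheapest_reach(profiles, slots, min_age=55, topic="car_ad_view"):
    """Target every qualifying user at the single cheapest slot available."""
    slot = min(slots, key=lambda s: s.cost_per_impression)
    return [(p.user_id, slot.site) for p in profiles if matches(p, min_age, topic)]

profiles = [Profile("u1", 61, {"car_ad_view", "golf"}), Profile("u2", 34, {"car_ad_view"})]
slots = [AdSlot("premium-car-magazine.example", 12.00), AdSlot("cheap-game-site.example", 0.40)]
print(cheapest_reach(profiles, slots))  # [('u1', 'cheap-game-site.example')]
```

The point of the sketch is the last line: the premium publisher that attracted the reader never sees the ad dollars; the tracker that followed the reader to a cheaper page does.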

In this system, bogus news “content” has the same role as porn or cat videos — it’s a cheap method of sucking in a predictable group of users and keeping them engaged long enough to see an ad. The salient issue with conspiracy theories or content that inspires “societal discord” isn’t that they achieve a political end, it’s that they’re effective as attention-grabbing devices.

The companies’ use of these ad methods undermines factuality and journalism in multiple ways. One, as Stoller points out, is that the firms are literally “stealing” from legitimate news organizations. “What Google and Facebook are doing is they’re getting the proprietary subscriber and reader information from the New York Times and Wall Street Journal, and then they’re advertising to them on other properties.”

As he points out, if a company did this through physical means — breaking into offices, taking subscriber lists, and targeting the names for ads — “We would all be like, ‘Wow! That’s outrageous. That’s crazy. That’s stealing.’” But it’s what they do.

Secondly, the companies’ model depends upon keeping attention siloed. If users are regularly exposed to different points of view, if they develop healthy habits for weighing fact versus fiction, they will be tougher targets for engagement.

So the system of push notifications and surveillance-inspired news feeds stresses feeding users content that’s in the middle of the middle of their historical areas of interest: the more efficient the firms are in delivering content that aligns with your opinions, the better their chance at keeping you engaged.
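A toy version of that ranking incentive, with entirely hypothetical data, might look like the following: score each candidate post by how closely it overlaps the user’s interest history, so the most opinion-confirming content floats to the top of the feed.

```python
# Toy feed ranker, purely illustrative: posts that best match the user's
# existing interests rank first, maximizing the odds of continued engagement.

def interest_alignment(user_history: set, post_topics: set) -> float:
    """Fraction of a post's topics the user has already engaged with."""
    if not post_topics:
        return 0.0
    return len(user_history & post_topics) / len(post_topics)

def rank_feed(user_history: set, candidate_posts: list) -> list:
    """Sort posts so the most alignment-confirming content is shown first."""
    return sorted(candidate_posts,
                  key=lambda post: interest_alignment(user_history, post["topics"]),
                  reverse=True)

history = {"politics", "vaccines", "media_criticism"}
posts = [
    {"id": 1, "topics": {"cooking"}},
    {"id": 2, "topics": {"politics", "media_criticism"}},
    {"id": 3, "topics": {"vaccines", "politics"}},
]
print([p["id"] for p in rank_feed(history, posts)])  # [2, 3, 1]
```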

Rope people in, show them ads in spaces that in a vacuum are cheap but which Facebook or Google can sell at a premium because of the intel they have, and you can turn anything from QAnon to Pizzagate into cash machines.

After the January 6th riots, Stoller’s organization wrote a piece called, “How To Prevent the Next Social Media-Driven Attack On Democracy—and Avoid a Big Tech Censorship Regime” that said:

While the world is a better place without Donald Trump’s Twitter feed or Facebook page inciting his followers to violently overturn an election, keeping him or other arbitrarily chosen malignant actors off these platforms doesn’t change the incentive for Facebook or other social networks to continue pumping misinformation into users’ feeds to continue profiting off of ads.

In other words, until you deal with the underlying profit model, no amount of censoring will change a thing. Pallone hinted that he understood this a little on Thursday, when he asked Zuckerberg if it were true, as the Wall Street Journal reported last year, that in an analysis done in Germany, researchers found that “Facebook’s own engagement tools were tied to a significant rise in membership in extremist organizations.” But most of the questions went in the other direction.

“The question isn’t whether Alex Jones should have a platform,” Stoller explains. “The question is, should YouTube have recommended Alex Jones 15 billion times through its algorithms so that YouTube could make money selling ads?”

Below is an excerpted transcript from the Stoller interview at Useful Idiots, part of which is already up here. When the full video is released, I’ll update and include it.

Stoller is one of the leading experts on tech monopolies. He wrote the Simon and Schuster book Goliath: The 100-Year War Between Monopoly Power and Democracy, and is a former policy advisor to the Senate Budget Committee. His writing has appeared in the Washington Post, the New York Times, Fast Company, Foreign Policy, the Guardian, Vice, The American Conservative, and the Baffler, among others. Excerpts from his responses to questions from Katie Halper and me are below, edited for clarity:

Matt Taibbi: There’s a debate going on within the Democratic Party-aligned activist world about approaches to dealing with problems in the speech world. Could you summarize?

Matt Stoller: There are two sides. One bunch of people has been saying, “Hey, these firms are really powerful…” This is the anti-monopoly wing. Google and Facebook, let’s break them up, regulate them. They’re really powerful and big, and that’s scary. So, without getting in too deep, there’s the Antitrust subcommittee, that’s been saying, “Hey, these firms are really powerful, and they’re picking and choosing winners.” Usually, they talk about small businesses, but the issue with speech is the same thing.

Then there’s another side, which is, I think, noisier and has more of the MSNBC/CNN vibe. This is the disinformation/misinformation world. This is the Russiagate people, the “We don’t like that Trump can speak” type of people. What their argument is, effectively, is that firms haven’t sufficiently curated their platforms to present what they think is a legitimate form of public debate. They’re thinking, “Well, we need to figure out how to get them to filter differently, and organize discourse differently.”

Ideologically, they just accept the dominance of these firms, and they’re just saying, “What’s the best way for these firms to organize discourse?”

Taibbi: By conceding the inevitability of these firms, they’re saying they want to direct that power in a direction that they’d like better.

Stoller: That’s right. I mean, there’s a lot of different reasons for that. Some of them are neoliberal. A lot of the law professors are like, “Oh, this is just the way of technology, and this is more efficient.” Therefore, the question is, “How do you manage these large platforms?” They’re just inevitable.

Then there are people who are actually socialists who think, “Well, the internet killed newspapers. The internet does all of these things.” Also, there’s a bunch of them that never liked the commercial press in the first place. A lot of well-meaning people were like, “We never liked advertising models to begin with. We think everything should be like the BBC.”

So, those are the two groups that accept the inevitability thesis. It’s really deep-rooted in political philosophy. It’s not just a simple disagreement. Then there are people like us who are like, “No, no. Actually, technology is deployed according to law and regulation, and this specific regulatory model that we have, the business structures of these firms, the way they make money from advertising, those are specific policy choices, and we can make different ones if we want.”

Katie Halper: When you say socialist, some may identify as socialists, but there’s a general group of people who just believe, “We oppose hate speech and White supremacy,” and so we have to take these companies that are evil and give them moral authority and content moderation authority, which is an inherent contradiction/wishful thinking/inconsistent paradox.

In other words, you’re saying leftists, right? Leftists, not liberals, not neoliberals, but people who really would identify as left.

Stoller: Yes. There’s a part of the socialist world that’s like, “What we really want is egalitarianism in the form of a giant HR compliance department that tells everyone to be tolerant.” Right? Then there are most people who are like, “No. I just don’t like Wall Street and I want people to be equal and everyone should have a little bit of something,” and they both call themselves socialists.

Taibbi: You and the American Economic Liberties Project have said, there’s a reason why taking Trump off Twitter isn’t going to fix the problem, because you’re not fixing those incentives. Can you talk about what those incentives are, and why they cause the problems?

Stoller: Google and Facebook, they sell ads, right? They collect lots of information on you and they sell ads and ads are valuable for two reasons. One, you’re looking at them. Two, if they know who you are and they know information about you, then they can make the ad more valuable. A random ad space isn’t worth very much, if you’re showing it to some undefined person. An ad space you’re showing to a 55-year-old man who’s thinking of buying a luxury car, somebody will pay a lot for that ad space, if you know who that person is and you know that that person has actually been browsing luxury car websites and reading the Wall Street Journal about how best to liquidate their portfolio or something to buy a luxury item.

Google and Facebook want to sell that advertising particularly on their properties, where they get to keep 100% of the profits. If Google sells an ad on YouTube, they get to keep the money. Facebook sells an ad on Instagram or Facebook, they get to keep the money. So, their goal is to keep you using their sites and to collect information on you.

Taibbi: What methods do they use to keep you on the sites?

Stoller: They have all sorts of psychological tricks. Engagement is the way that they talk about it, but if you go and look for something on YouTube, they’re going to send you something that’s a little bit more extreme. It’s not necessarily just political. If you look at stuff like, “Here’s how to become a vegetarian,” they’ll say, “Well, what about becoming a vegan?” If you look at stuff that suggests you’re a little bit scared of whether this vaccine will work, if you search for, “I want to find safety data on this vaccine,” eventually, they’ll move you into serious anti-vax world.
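A crude way to picture that ratchet, purely as illustration and emphatically not YouTube’s actual recommender: treat related content as ordered along an intensity axis, and always recommend the item one rung past where the viewer currently is.

```python
# Purely illustrative; not YouTube's real algorithm. Content on one theme is
# ordered by "intensity," and each recommendation nudges the viewer one rung further.
ladder = ["meatless Mondays", "vegetarianism", "veganism", "raw-food activism"]

def next_recommendation(current: str) -> str:
    """Recommend the next, slightly more extreme item on the same theme."""
    i = ladder.index(current)
    return ladder[min(i + 1, len(ladder) - 1)]

watched = "vegetarianism"
for _ in range(3):
    watched = next_recommendation(watched)
    print(watched)  # veganism, then raw-food activism twice: the ladder tops out
```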

So, the question that we have to ask is whether you should block crazy people from saying things, or do something else… Alex Jones, for example, is a crazy person or an entertainer; he says things that I don’t particularly like or agree with. The question, though, isn’t whether Alex Jones should have a platform. We’ve always allowed people to go to the corner of the street and say whatever they want, or to write pamphlets, or whatever.

The question is, should YouTube have recommended Alex Jones 15 billion times through its algorithms so that YouTube could make money selling ads? That’s a different question than censorship.

Taibbi: Conversely, they’re not recommending other material that might discourage you from believing Alex Jones.

Stoller: Right. The other thing is, it’s not just that they want to create more inventory so they can sell ads. It’s also the kinds of ads that they’re selling. So, you can sell an ad based on trust. The New York Times or the Wall Street Journal — I hate using them as examples — they have an audience, and they built that audience by investing in content. Then they sell ads to that audience, and the advertiser knows where that advertising is going, and it’s based on trust.

The alternative model, which we have now, is simply based on clickbait. It’s just, “Generate as many impressions as possible, and then sell those impressions wherever the person is on the web.” That creates a kind of journalism that’s designed to get clicks, or not even journalism: you’re creating content just to get engagement, not to build trust.

So, what this business model does (we call it surveillance advertising) is put an infrastructure player, a communications player, in the position of manipulating you so that they can put engaging content in front of you. What that does is incentivize a low-trust form of content production. It kills trusted content producers, a.k.a. local newspapers, because you no longer need to put advertising in the Pittsburgh Post-Gazette, or whatever. You can just geotarget those people through Google and Facebook. You can get some Eastern European to falsify stories and people will click on that.

So, it kills legitimate newspapers and it creates an incentive for low-trust content: fraudulent content, defamatory content, whatever it is that will keep people engaged. It hits local newspapers and niche newspapers the hardest, so Black-owned newspapers and also newspapers having to do with hobbies. The actual issue is the niche audiences themselves, and the kind of low-trust content that we’re encouraging with our policy framework, versus what we used to do, which was encourage higher-trust forms of content.

Taibbi: How would you fix this problem, from a regulatory perspective?

Stoller: The House Antitrust Subcommittee had a report where they recommended what we call regulated competition. That would say, “Okay. You break up these platforms in ways that wouldn’t interfere with the efficiency of that particular network system.” So, Google and YouTube don’t need to be in the same company, you could separate them out.

There are ways that you’d have to handle the search engine. You couldn’t split Facebook into five Facebooks, because then you wouldn’t necessarily be able to talk to your friends and family, but you could separate Instagram and Facebook easily. You could force interoperability and then split up Facebook if you want to do that. So, you could separate those things out and then ban surveillance advertising for a starter.

Taibbi: What would that do to content if you ban surveillance advertising? And how would that work?

Stoller: It would force advertisers to go back to advertising to audiences. So, they would no longer be able to track you as an individual and say, “We know this is what you’re interested in.” They would go to what’s called contextual advertising, and they would say, “Okay. If you’re on a site that has to do with tennis, then we’ll advertise tennis rackets on that site because we assume that people are interested in tennis rackets.”

That’s called contextual advertising, versus the current system: you read an article about tennis in a tennis magazine and the platforms say, “Oh, that’s expensive to buy an ad there, so we’ll track you around the web and when you’re on Candy Crush, we’ll show you a tennis racket ad.” That’s the surveillance advertising model we have. That pulls all the power to Google and Facebook who are doing all the tracking, versus the contextual ad where the power is actually with the tennis racket site that has the relationship with the people interested in tennis.
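The contrast between the two models reduces to a short sketch (all names hypothetical): contextual targeting keys the ad to the page being read, while surveillance targeting keys it to the tracked individual, wherever they happen to be.

```python
# Hypothetical sketch contrasting the two targeting models.

def contextual_ad(page_topic: str) -> str:
    """Contextual: the ad follows the page's subject, not the person."""
    catalog = {"tennis": "tennis racket ad", "cooking": "cookware ad"}
    return catalog.get(page_topic, "generic ad")

def surveillance_ad(user_profile: dict, current_page: str) -> str:
    """Surveillance: the ad follows the tracked person to any (cheaper) page."""
    if "tennis" in user_profile["tracked_interests"]:
        return f"tennis racket ad shown on {current_page}"
    return f"generic ad shown on {current_page}"

print(contextual_ad("tennis"))  # keyed to the tennis site, which keeps the ad relationship
print(surveillance_ad({"tracked_interests": {"tennis"}}, "Candy Crush"))  # keyed to the person
```

The design difference is where the commercial power sits: in the contextual function, the tennis site’s relationship with its readers is the asset; in the surveillance function, the tracker’s profile of the individual is.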

Taibbi: So, the idea would be you would create a sort of a firewall between the utilitarian functions of a site like Facebook or Google, that provide a service where either you’re searching for something or you’re communicating with somebody, and they wouldn’t be allowed to take that data from that utility-like function to sell you an ad?

Stoller: That’s right. Germany is hearing a court case saying that you can’t combine data from Facebook, Instagram, and WhatsApp, and then from third parties, to create a super profile of someone and show them ads. They were saying that’s an antitrust violation. There’s a court hearing on that, but, more broadly, that’s what you have to do.

Ultimately, what we would want is subscription-based communication networks, with people paying for services. This is something that’s worked for thousands of years. I give you something of value, you give me money. It’s an honest way of doing business. If I don’t value it enough to give you money, then I won’t get it.

If people are like, “Oh, I don’t want to pay for Facebook, or I don’t want to pay for YouTube,” or whatever it is, that makes no sense. You’re already paying. You’re either paying with a Friday night you spend surfing YouTube, where they sell a bunch of ads and you give up your Friday night, or you pay with money, which is an honest transaction and, in the long run, a lot cheaper.

For more of this interview, check out UsefulIdiots.Substack.Com in the coming days. For Stoller’s writings on the subject, see here.
