August 12 · Issue #365
|
By now you’ve probably read enough about the disgraced financier Jeffrey Epstein, his death in a Manhattan jail, and the attendant conspiracy theories that consumed social networks over the weekend. President Trump led the charge, retweeting a conspiracy theory that sought to implicate former President Bill Clinton.

At the heart of the online fiasco is Twitter, which has come to largely program the political conversation and much of the press. Twitter is magnetic during huge breaking stories; news junkies flock to it for up-to-the-second information. But early on, there is often a vast discrepancy between the attention directed at the platform and the information available about the developing story. That gap is filled by speculation and, via its worst users, rumormongering and conspiracy theories.

On Saturday, Twitter’s trending algorithms hoovered up the worst of this detritus, curating, ranking and then placing it in the trending module on the right side of its website. Despite being a highly arbitrary and mostly "worthless metric," trending topics on Twitter are often interpreted as a vague signal of the importance of a given subject.

This hands-off approach to editorial intervention in the news cycle, coupled with algorithms that promote the most popular posts, is by now a familiar villain. It played a key role in, for example, the promotion of anti-vaccine zealots on Facebook and the growth of Alex Jones’ audience on YouTube. The Epstein case was already a conspiracy theorist’s dream before he apparently hanged himself in his jail cell; in the early hours after his death, when little information was available, Twitter was a perfect petri dish for proposing and amplifying outrageous conspiracy theories.

The first problem with "trending" is that it selects and highlights content with no eye toward accuracy, or quality. Automated trending systems are not equipped to make judgments; they can determine if things are being shared, but they cannot determine whether that content should be shared further. […] This is the other problem of "trending," conceptually: It’s eminently gameable, but the platforms that use the term never make the rules clear. "Trending" is given the imprimatur of authority — videos or topics handed down from on high, scientifically determined to have trended — when really it’s a cobbled-together list of content being obsessively shared or tweeted about by people who love Justin Bieber. Or Logan Paul. Or who believe in crisis actors.

Removing algorithmically generated modules of trending content would deny bad actors an easily gamed avenue for delivering hoaxes to platforms’ user bases. A more modest approach might be to build editorial teams that keep watch over trending hashtags and remove obvious hoaxes and conspiracy theories.
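The mechanics matter here. A trending system that ranks purely on volume and velocity is, almost by definition, blind to accuracy. Here is a minimal sketch of what such a ranker might look like — to be clear, the data shape, field names, and scoring below are hypothetical illustrations, not Twitter’s actual system:

```python
from collections import Counter

def trending_topics(recent_tweets, baseline_counts, top_n=10):
    """Rank hashtags purely by how far their volume exceeds a normal baseline.

    Hypothetical sketch: `recent_tweets` is an iterable of dicts with a
    "hashtags" list; `baseline_counts` maps hashtag -> typical hourly volume.
    Nothing in the score reflects accuracy or quality; a hoax that spikes
    fast ranks exactly like a legitimate breaking story.
    """
    counts = Counter(tag for tweet in recent_tweets for tag in tweet["hashtags"])
    scores = {
        tag: count / max(baseline_counts.get(tag, 1), 1)
        for tag, count in counts.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Twitter’s real pipeline is more elaborate and not public, but any score built only from sharing behavior has the property the quote above describes: it can tell that something is spreading, not whether it should.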
But what if that were … illegal? That’s the question I had after reading coverage of the White House’s vague but unsettling plan to have the Federal Communications Commission and Federal Trade Commission police censorship on social networks. Brian Fung saw a partial draft:

The draft order, a summary of which was obtained by CNN, calls for the FCC to develop new regulations clarifying how and when the law protects social media websites when they decide to remove or suppress content on their platforms. Although still in its early stages and subject to change, the Trump administration’s draft order also calls for the Federal Trade Commission to take those new policies into account when it investigates or files lawsuits against misbehaving companies. Politico first reported the existence of the draft. If put into effect, the order would reflect a significant escalation by President Trump in his frequent attacks against social media companies over an alleged but unproven systemic bias against conservatives by technology platforms. And it could lead to a significant reinterpretation of a law that, its authors have insisted, was meant to give tech companies broad freedom to handle content as they see fit.

Fung talks to experts who describe the plan variously as "horrible" and as one that "makes no sense." No one seems to think that the FCC or the FTC wants to do this work, or could do it, either practically or constitutionally. It’s just one more disturbing idea floated by the Trump administration that leaves us all wondering whether to take it seriously, literally, or not at all.

I believe you can’t have editorial neutrality without having Nazis and other purveyors of hate speech and abuse. I also believe that restricting platforms from moderating content beyond what is required by law would threaten their businesses — Nazis have a way of chasing away users and advertisers. It would be heartening if Twitter took this moment to retire trending topics and take other concrete steps to slow the spread of conspiracy theories. But the White House’s aggressive saber rattling would seem to make that less likely.
|
Democracy
|
White House questions tech giants on ways to predict shootings from social media
Tony Romm has a readout from Friday’s meeting between tech companies and the White House on its quest for a mass murder prevention technology that does not involve gun control or addressing white supremacy: Top officials in the Trump administration expressed interest in tools that might anticipate mass shootings or predict attackers by scanning social media posts, photos and videos during a meeting Friday with tech giants including Facebook, Google and Twitter. The technology could serve as an early-warning system for potential attacks, White House officials proposed at the brainstorming session, perhaps compiling information from across social sites to identify deadly incidents before they occur, according to three people familiar with the matter but not authorized to discuss a private gathering on the record.
|
How YouTube Radicalized Brazil
Max Fisher and Amanda Taub examine how YouTube has changed politics in Brazil: Teachers describe classrooms made unruly by students who quote from YouTube conspiracy videos or who, encouraged by right-wing YouTube stars, secretly record their instructors. Some parents look to "Dr. YouTube" for health advice but get dangerous misinformation instead, hampering the nation’s efforts to fight diseases like Zika. Viral videos have incited death threats against public health advocates. And in politics, a wave of right-wing YouTube stars ran for office alongside Mr. Bolsonaro, some winning by historic margins. Most still use the platform, governing the world’s fourth-largest democracy through internet-honed trolling and provocation.
|
The Global Machine Behind the Rise of Far-Right Nationalism
Jo Becker reports on how immigration led to a populist surge in Sweden, propelled by a disinformation campaign that is likely funded by the Russian government: Russian and Western entities that traffic in disinformation, including an Islamophobic think tank whose former chairman is now Mr. Trump’s national security adviser, have been crucial linkers to the Swedish sites, helping to spread their message to susceptible Swedes. At least six Swedish sites have received financial backing through advertising revenue from a Russian- and Ukrainian-owned auto-parts business based in Berlin, whose online sales network oddly contains buried digital links to a range of far-right and other socially divisive content.
|
How Facebook Is Changing to Deal With Scrutiny of Its Power
Mike Isaac reports that Facebook tried to buy Houseparty but abandoned the bid over antitrust concerns: But weeks into the discussions, Facebook’s corporate development team killed the talks with Houseparty, the people said. Houseparty’s executives were told that a deal would draw unwelcome federal government scrutiny to Facebook, they said. Houseparty was later purchased by Epic Games, the makers of the video game Fortnite.
|
Far-Right Accounts Retweeted by Trump Keep Getting Suspended by Twitter
Hoo boy. From Kevin Poulsen: Nearly 10 percent of the unverified accounts retweeted by President Trump since his inauguration are currently suspended from Twitter for various violations of the platform’s policies, ranging from hate speech to running fake sockpuppet accounts, a Daily Beast analysis has found. The most recent occurrence was last week, when Twitter tapped the brakes on "LYNNTHO06607841" just hours after Trump retweeted the account’s proclamation that "DEMOCRATS ARE THE TRUE ENEMIES OF AMERICA!" Before it was boarded up, the account’s timeline was thick with all-caps conspiracy theories targeting prominent Democrats, including a recent tweet claiming Bill and Hillary Clinton "torture and sacrifice children" to get at "a drug that can only be found inside the human skull."
|
Telegram Was Built for Democracy Activists. White Nationalists Love It.
April Glaser reports on how Telegram has become a popular refuge for white supremacists who get banned from more polite social networks: As social networks like Facebook and Twitter have cracked down more aggressively on hate speech over the past year or so, one of the less-discussed places where white supremacists, violent men’s groups, anti-PC agitators, and trolls of various stripes have flocked to is Telegram. At the same time, democracy activists in Hong Kong have relied on Telegram to coordinate protests against new restrictions from the Chinese government, illustrating that its intended purpose is alive and well. Also at the same time, the app has made a cameo in Puerto Rican politics, as the leak of nearly 900 pages of sexist, debasing, and homophobic messages from a Telegram group chat led to popular protests and the governor’s resignation, illustrating that, when they think no one is eavesdropping, politicians and their confidantes will still say anything. Telegram supports private messaging, private group chats, and audio calls—all of which are reliably encrypted—for its hundreds of millions of members. But its ability to create public pages and groups has provided a platform for fallen stars on the extreme right. "Instagram is going to ban me in minutes," wrote Laura Loomer, a popular right-wing social media figure known for her anti-Muslim and conspiracy-laden rhetoric, in May when Facebook gave Loomer and others a heads-up before suspending their accounts for violating its rules. "Sign up for my Telegram," she said. On Telegram, Loomer posts multiple times a day to more than 11,800 subscribers. (On Instagram, she had more than 115,000 followers.)
|
How the El Paso Killer Echoed the Incendiary Words of Conservative Media Stars
Jeremy W. Peters, Michael M. Grynbaum, Keith Collins, Rich Harris and Rumsey Taylor examine the link between right-wing TV hosts and the recent spate of mass murders: An extensive New York Times review of popular right-wing media platforms found hundreds of examples of language, ideas and ideologies that overlapped with the mass killer’s written statement — a shared vocabulary of intolerance that stokes fears centered on immigrants of color. The programs, on television and radio, reach an audience of millions. In the four years since Mr. Trump electrified Republican voters with slashing comments about Muslims and Mexicans, demonizing references to immigrants have become more widespread in the news media, the Times review found.
|
TikTok is fuelling India's deadly hate speech epidemic
Funny how this problem seems to recur across every social network at scale. Nilesh Christopher reports on TikTok’s hate speech problem, and links it to real-world violence: Videos found on TikTok include casteist-hate speech posted by users identifying themselves from high castes while celebrating and singing the praises of their communities. These quickly spill into threats of physical violence with members of some communities claiming dominance over other castes. "We must sever, not the fingers, but the heads of those who dare to lay their hands on us (our community)," one user says in a video, identifying himself as part of the Nadar community. Nadars have traditionally occupied a low status in the caste ladder but have risen through entrepreneurship. This video has been liked by thousands of users, and more than 89 videos have sprung up lip syncing to the speech.
|
Elsewhere
|
The techlash has come to Stanford.
Good April Glaser piece on the emerging trend of young people no longer wanting to work for Silicon Valley giants over ethical concerns: "Students don’t feel that [working at Facebook] has the same cachet," a San Francisco–based tech recruiter with 15 years of experience (who asked not to be named because Facebook is currently one of his clients) told me in an interview. "It doesn’t seem like the kind of name that students want to have on their résumé for their first go, and because they have optionality, there becomes very few reasons to go to Facebook, especially feeling like that brand is a little tarnished right now." After all, he added, students are getting very attractive compensation packages elsewhere from other multibillion-dollar tech firms that aren’t courting such negative headlines.
|
Ninja calls out Twitch after his dormant channel highlights porn
After the world’s most popular Fortnite player defected from Twitch to Mixer, Twitch began promoting other channels on his dormant page — including one that was just a porn broadcast. Ninja seemed very aggrieved about the situation in a video he posted to Twitter, and forgive me, but I find the whole thing hilarious.
|
YouTube’s Susan Wojcicki: 'Where's the line of free speech – are you removing voices that should be heard?'
Emine Saner interviews YouTube’s CEO about the past few months: For all her careful, frustratingly corporate answers, Wojcicki is in an almost impossible position. Aside from the gargantuan task of trying to sift through the never-ending torrent of content, she has to contend with the fact that removing far-right commentators’ videos turns them into free-speech martyrs. She also has to keep "creators", many of whom make a handsome living through the site, happy. I have no reason to disbelieve Wojcicki when she says "responsibility has been my number one priority". The question is whether it is a task beyond her – and whether Google will tolerate changes that result in lower profits.
|
YouTube moderators explain how the company lets its biggest stars break the rules
Elizabeth Dwoskin talks to YouTube moderators who say they don’t have adequate tools for removing bad content from the site: Creators who break YouTube’s rules face the consequence of having their channels or videos stripped of ads — or their content being removed entirely. But unlike at rivals like Facebook and Twitter, many YouTube moderators aren’t able to delete content themselves. Instead, they are limited to recommending whether a piece of content is safe to run ads, flagging it to higher-ups who make the ultimate decision.
|
Facebook in Talks to Take More Space at Manhattan's Hudson Yards
An office complex known for its extremely rich residents might be getting another one, Noah Buhayar and Natalie Wong report:
Facebook Inc. is in talks to take on a much bigger lease at Manhattan’s Hudson Yards, giving the social media company space across three buildings at the $25 billion mega-project, according to people with knowledge of the matter. The deal could involve some 1.5 million square feet (139,000 square meters), about 50% more than was previously reported, said the people, who asked not to be identified discussing the private negotiations.
|
Launches
|
Twitter’s Search feature for Direct Messages is underway
|
ByteDance Brings Google-Like Search to China -- With Censorship
Zheping Huang reports on a product we all ought to keep an eye on: China’s ByteDance Ltd. launched a search engine that bears a striking resemblance to Google’s clean, uncluttered home page but produces heavily sanitized results in keeping with one of the world’s most rigidly censored internet regimes. ByteDance, the creator of popular apps including viral short video service TikTok, presents the most serious threat yet to rival Baidu Inc. Following the 2010 departure of Google from the market amid government censorship, Baidu has enjoyed a near-monopoly in internet search in China. ByteDance has yet to display sponsored products or ads in its search feed, but the results heavily prioritize content from its own Toutiao news app.
|
Silent Messages, Slow Mode, Admin Titles and More
Imagine if Twitter had this feature that Telegram just launched: When an admin enables Slow Mode in a group, you will only be able to send one message per the interval they choose. A timer will show how long you have to wait before sending your next message.
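The mechanics are easy to picture. Here is a minimal, hypothetical sketch of how a per-user slow-mode check might work; the class name, storage, and API below are my own illustration, since Telegram hasn’t published its implementation:

```python
import time

class SlowMode:
    """Hypothetical sketch: allow each user one message per `interval` seconds."""

    def __init__(self, interval_seconds):
        self.interval = interval_seconds
        self.last_sent = {}  # user_id -> timestamp of last accepted message

    def try_send(self, user_id, now=None):
        """Return (allowed, seconds_remaining) for a message attempt."""
        now = time.time() if now is None else now
        last = self.last_sent.get(user_id)
        if last is not None and now - last < self.interval:
            # Rejected: the client can show the remaining wait on a countdown timer.
            return False, self.interval - (now - last)
        self.last_sent[user_id] = now
        return True, 0.0
```

A group with a 30-second interval would construct SlowMode(30) and call try_send for each incoming message, surfacing the returned wait time as the countdown Telegram describes.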
|
Takes
|
Why Aren’t We Talking About LinkedIn?
I don’t know, because it’s terrible? But John Herrman says the site’s career focus has a moderating influence on our worst impulses: "You talk on LinkedIn the same way you talk in the office," said Dan Roth, LinkedIn’s editor in chief. "There are certain boundaries around what is acceptable." Criticism of other users’ posts, he said, tends to be measured — "there’s a certain range in the voice," he said — and users will often make the platform’s numerous implicit norms explicit, when they feel it’s necessary. "If you read the comments," said Mr. Roth, "when someone goes out of bounds, you have other members saying, ‘Hey, bring this back.’" "This is something that your boss sees, your future boss, people you want to work with in the future," Mr. Roth said. "It’s as close to your permanent record as you can get."
|
Facebook let white supremacist dog whistle "invasion" thrive for years
Natalie Martinez argues that Facebook has promoted the spread of racist, anti-immigrant ideologies, which are at the heart of recent mass shootings: Even though the company has policies that seem to prohibit most if not all "invasion" content, Facebook still allows it to exist and spread on its platform. In March, Facebook claimed it did not consider a post pushing white supremacist claims about a "Muslim Invasion" in the U.K. to be a violation of its community standards. A year after leaked documents revealed that Facebook allowed praise for white nationalism and separatism on its platform after the August 2017 "Unite the Right" rally in Charlottesville, VA, the company implemented a so-called white nationalist ban. But the auditors Facebook had hired to oversee its goals of "advancing civil rights on our platform" criticized the ban as "overly narrow."
|
And finally ...
|
How TikTok Has Turned the School Bathroom Into a Studio
Brian Feldman reports on the emergence of high school bathrooms as an unlikely hub for content creation: But there are also a slew of videos in which students recontextualize the bathroom social space and play with its perception. Earlier this year, a meme comparing what happens in boys’ and girls’ locker rooms took off, the general gist of which was that girls complained about gym class in their locker room and boys turned theirs into a Mad Max state of anarchy. My high school bathroom was a dank and foreboding pit, so glad to see today’s teens are putting it to good use.
|
Talk to me
|
Send me tips, comments, questions, and bathroom TikToks: casey@theverge.com.
|