The topic at hand was Meta’s controversial Cross Check program, which gave special treatment to posts from certain powerful users: celebrities, journalists, government officials, and the like. For years this program operated in secret, and Meta even misled the board about its scope. When details of the program were leaked to The Wall Street Journal, it became clear that millions of people received that special treatment, meaning their posts were less likely to be taken down when reported by algorithms or other users for breaking rules against things like hate speech. The idea was to avoid mistakes in cases where errors would have more impact, or would embarrass Meta, because of the prominence of the speaker. Internal documents showed that Meta researchers had qualms about the project’s propriety.

Only after that exposure did Meta ask the board to take a look at the program and recommend what the company should do with it. The meeting I witnessed was part of that reckoning. The tone of the discussion led me to wonder whether the board would suggest that Meta shut down the program altogether, in the name of fairness. “The policies should be for all the people!” one board member cried out.

That didn’t happen. This week the social media world took a pause from lookie-looing the operatic content-moderation train wreck that Elon Musk is conducting at Twitter, as the Oversight Board finally delivered its Cross Check report, delayed by Meta’s foot-dragging in providing information. (Meta never did give the board a list identifying who got special permission to stave off a takedown, at least until someone took a closer look at the post.)

The conclusions were scathing. Meta claimed that the program’s purpose was to improve the quality of its content decisions, but the board determined that it was more about protecting the company’s business interests. Meta never set up processes to monitor the program and assess whether it was fulfilling its mission.
The lack of transparency to the outside world was appalling. Finally, all too often Meta failed to deliver the quick, personalized review that was the reason those posts were spared immediate takedowns. There were simply too many of those cases for Meta’s team to handle, and flagged posts frequently remained up for days before getting secondary consideration.

The prime example, featured in the original WSJ report, was a post from Brazilian soccer star Neymar, who shared a sexual image without its subject’s consent in September 2019. Because of the special treatment he got as part of the Cross Check elite, the image, a flagrant policy violation, garnered over 56 million views before it was finally removed. A program meant to reduce the impact of content-decision mistakes wound up boosting the impact of horrible content.

Yet the board didn’t recommend that Meta shut down Cross Check. Instead, it called for an overhaul. That is in no way an endorsement of the program but an admission of the devilish difficulty of content moderation. The subtext of the Oversight Board’s report was the hopelessness of believing it’s possible to get things right. Meta, like other platforms that give users voice, has long emphasized growth over caution, hosting huge volumes of content that would require huge expenditures to police. Meta does spend many millions on moderation, but it still makes millions of errors. Seriously cutting down on those mistakes would cost more than the company is willing to spend. The idea of Cross Check is to minimize the error rate on posts from the most important or prominent people: when a celebrity or statesman uses the platform to speak to millions, Meta doesn’t want to screw up.

The board’s biggest ask is that Cross Check concentrate on human rights content. The 23-member group concluded that, as it stands, the program is mainly a project set up to kiss the butts of powerful people and help Meta’s business.
The Oversight Board recommends changing it into something that makes sure content protecting human rights doesn’t get removed in error, and that posts from prominent accounts that harm those rights get taken down quicker. Board member and former Guardian editor in chief Alan Rusbridger explains the group’s reasoning thus: “We understand that, repeatedly, you were getting it in the neck from people who are not being handled well. You talk a good game about the kinds of people that you want to protect and enhance. But actually, this program is much more mundane. If you’re going to do it, you should do it better, with clear criteria.”

The board’s other recommendations aim to make Cross Check more transparent, efficient, and accountable. They lay out a long list of requirements and reporting systems. If Meta took all the board’s suggestions, it would have to set up an elaborate new infrastructure to improve the program and document its operation. Since the entire company is cutting head count, and fixing Cross Check isn’t going to bring new revenue, fight TikTok, or make the metaverse happen sooner, I have my doubts about how eagerly Meta will embrace all those suggestions. It has 90 days to respond to the report.

Whether or not Meta accepts the recommendations (I suspect it will adopt some but not all), the report adds another chapter to an experiment that may not have drastically changed Meta but has taken some steps to improve it. Rusbridger says this is a virtue in itself. “However cynical you want to be about Meta, they’re spending an awful lot of money and inviting an awful lot of pain,” he says of the Oversight Board. “I can’t see that it’s worth it for them, unless there is a sincere belief that we could help them become a better company.” Well, if you’re really cynical, you could say it’s an effort to stave off regulation.
But no matter how you view it, Meta’s effort to set up an independent board is suddenly looking pretty golden in contrast to what’s happening at another prominent platform for online speech. Musk could have outsourced the task to an independent body similar to Meta’s Oversight Board, which includes Nobel laureates, a former head of state, and distinguished journalists and human rights activists. Though Musk tweeted a couple of weeks ago that he’d like an advisory council of his own to help with content policy, nothing came of it. Instead he handed over data from inside Twitter to two Substack writers who often hit on right-wing talking points about “free speech.” The results so far are risible. The “Twitter Files” didn’t expose a corrupt system but showed what seemed like well-intentioned efforts not to screw things up before an election. In the process, writer Matt Taibbi’s tweetstorm sharing screenshots of internal emails wound up violating Twitter’s own policies against doxing, as he included personal email addresses of a US Congress member and company cofounder Jack Dorsey.

Compared to the sophisticated interplay between Meta and its board in the Cross Check investigation, the Twitter approach seems like something out of the Stone Age. The Oversight Board, which has ambitions to extend its work to platforms beyond Meta, sees an opportunity here. “I hereby ask Elon Musk to read this,” says Greene of the report his board just released. Good luck with that.

I don’t have to go far into the past for this one; it’s from my deep dive last month into the short but eventful history of the Oversight Board. Here’s the section where I sit in on one of the board’s discussions that led to its sweeping recommendations about Cross Check.

Fifteen people gathered around a set of tables arranged in a rectangle and set up with all the formality of a United Nations summit.
A team of translators was on hand so every member could speak their native language, and each participant got an iPod Touch through which to listen to the translations. Once the conversation got underway, it quickly became heated. Some members abandoned their home tongues and spoke in less polished English so the others could hear their urgency straight from their mouths.

The board members seemed to understand Meta’s argument that giving special treatment to well-known accounts could be expeditious: employees could more quickly assess whether an improper post was excusable for its “newsworthiness.” But the members zeroed in on the program’s utter lack of transparency. “It’s up to them to say why it should be private,” the cochair who was moderating the session remarked. The members discussed whether Meta should make public all the details of the program. One suggestion was that the privileged posters be labeled. After listening to all this back-and-forth, one member finally burst out with an objection to the entire concept of the program. “The policies should be for all people!” she exclaimed.

It was becoming clear that the problems with the Cross Check program were the same seemingly intractable problems of content moderation at scale. Meta is a private service; can it claim the right to favor certain customers? Of course not, because Meta is so entwined with the way people express themselves around the globe. At one point, a member cried out in frustration: “Is being on Facebook a basic human right?”

Zack asks, “Is Paxlovid worth taking?”

Thanks for the question, Zack. First let me share tasting notes for the Covid antiviral. Stout bouquet of a greenish puddle on the floor of a 1940s auto body shop in Oklahoma. Hints of rusty tractor bolts, battery acid, and Meadowlands Superfund runoff. Lingering aftertaste of a Porta Potty at an underfunded jam-band festival. But in answer to your question, and speaking as a recent user, I have to say … damned if I know.
When I came down with Covid last week, I of course consulted my own doctor, who turned out not to be a particular fan of the treatment. The upside was that it might spare me a couple of days of symptoms and perhaps cut the already low odds that my case would turn serious. The downside was an increased risk of a rebound infection, plus some possible side effects. He left the decision up to me. More informally, I also spoke to a physician friend, Robert Wachter, chair of UC San Francisco’s department of medicine, whose tweet threads on Covid have been essential reading since 2020. He shared statistics that backed up what my doctor said and added that taking Paxlovid would cut the chances that I would wind up with long Covid.

But I will tell you one thing I’m sure of: this country is delusional when it comes to Covid. The pandemic is anything but over. Anecdotally, I’m one of many people who have recently taken sick for the first time; this disease is clearly unavoidable. Thousands die every week, and we’re one variant away from a more harmful virus. Only a minority of Americans have taken booster shots. If you take sensible mask precautions, you’re an outlier and made to feel uncomfortable. Meanwhile our lawmakers can’t manage to pass a bill to continue spending on tests, vaccines, and especially research that could develop a more effective vaccine. It’s almost three years since this pandemic hit, and it seems very likely to me that three years from now it will still be killing people and disrupting our lives and our economy. Shame on us.

You can submit questions to mail@wired.com. Write ASK LEVY in the subject line.

Terrorists who took down several North Carolina power stations might have done it to stop a drag show.

Tony Fadell, key creator of the iPod and Nest smart home products, has now produced what he hopes is the iPod of crypto. I trekked to Paris (tough assignment!) to get the story.
After screwing up real estate prices in cities, Airbnb is making it tough for renters and home buyers in small-town America.

Have you noticed that everyone seems to be sick right now? (I have!) Here’s why.

That’s it from me until 2023. I have some vacation time to burn, which will take me to the end of the year. For the next two weeks, Plaintext will have some pinch hitters and then take a holiday break. Barring a belated Covid rebound, I’ll be bouncing back to the helm on January 6, kicking off a year that seems destined for plenty of discussion about Elon, chatbots, crypto, the metaverse, and maybe some new topics. Have a great season!

Elon Musk’s Twitter Is Making Meta Look Smart