The Jerusalem Post

Watchdog says Musk threatened legal action after hate speech report

 
Elon Musk, Chief Executive Officer of SpaceX and Tesla and owner of Twitter, looks on as he attends the Viva Technology conference dedicated to innovation and startups at the Porte de Versailles exhibition centre in Paris, France, June 16, 2023. (photo credit: Gonzalo Fuentes/Reuters)

A letter from a law firm representing Musk's X Corp. accused a watchdog of attempting to harm Twitter's ad revenue.

Elon Musk's X Corp. (formerly Twitter) threatened the Center for Countering Digital Hate (CCDH) with legal action after the organization published a report stating that X had failed to act on hate speech posted by Twitter Blue accounts, the CCDH said on Monday.

In June, the CCDH published a report showing that after it reported 100 tweets by Twitter Blue subscribers containing racist, homophobic, neo-Nazi, antisemitic, or conspiracy content, Twitter acted on only one of the tweets and suspended none of the accounts.

The reported tweets included posts such as "The Jewish Mafia wants to replace us all with brown people" and "Hitler was right."

On Monday, CCDH published a letter sent to the organization by a law firm representing X which accused the CCDH of making "a series of troubling and baseless claims that appear calculated to harm Twitter generally, and its digital advertising business specifically."


The attorney representing X wrote that CCDH "regularly posts articles making inflammatory, outrageous, and false or misleading assertions about Twitter and its operations." The attorney additionally accused the organization of "failing to conduct (or even attempt) anything resembling the rigorous design process, analytical procedures, or peer review that a reasonable person would expect to accompany research product published by any reputable organization."

Meta's Threads app and Twitter logos are seen in this illustration taken July 4, 2023. (credit: REUTERS/DADO RUVIC/ILLUSTRATION)

Regarding the CCDH's report published in June, the attorney wrote that its claims about a failure to act on hate speech were "false, misleading, or both, and they are not supported by anything that could credibly be called research."

The attorney added that the report provides "no evidence" of differing treatment in moderation actions against Twitter Blue subscribers and non-subscribers.

"This article leaves no doubt that CCDH intends to harm Twitter’s business by driving advertisers away from the platform with incendiary claims," continued the letter. "Twitter takes its commitment to free speech, the enforcement of its rules and policies protecting users, and its strong relationships with its advertising partners all extremely seriously."


The attorney accused CCDH of being supported by "funding from X Corp.'s commercial competitors, as well as government entities and their affiliates" in order to conduct a "campaign to drive advertisers off Twitter by smearing the company and its owner." The attorney added that the firm is investigating whether it can launch legal action against CCDH under Section 43(a) of the Lanham Act, which deals with false claims in commercial advertising.

CCDH calls Twitter's claims 'ridiculous'

In response, a law firm representing CCDH sent X a letter of its own, calling X's letter "ridiculous."


"These allegations not only have no basis in fact (your letter states none), but they represent a disturbing effort to intimidate those who have the courage to advocate against incitement, hate speech and harmful content online, to conduct research and analysis regarding the drivers of such disinformation, and to publicly release the findings of that research, even when the findings may be critical of certain platforms," wrote the firm representing CCDH.

"Tellingly, after CCDH published [the report], Twitter did not spend its time and resources addressing the hate and disinformation that CCDH had identified, despite Twitter’s purported commitment to addressing hate speech on its platform. Instead, your clients decided to 'shoot the messenger' by attempting to intimidate CCDH."

The law firm noted that while CCDH did not review the millions of tweets posted to Twitter daily, it never claimed to have done so. It added that since Musk took control of Twitter, the company has "taken steps to curtail research on the platform," linking to an article about Twitter limiting the number of tweets users can read in an effort to cut down on data scraping.

The letter stressed that X's threat to launch legal action against CCDH under the Lanham Act is "bogus" as none of the examples cited in X's letter constitute the advertisement or commercial speech relevant under the Lanham Act.

"To the contrary, the statements you complain about constitute political, journalistic, and research work on matters of significant public concern, which obviously are not constrained by the Lanham Act in any way," wrote the law firm. "Moreover, as a nonprofit working to stop online hate, CCDH is obviously not in competition with Twitter, which makes your allegations of a Lanham Act injury even more fanciful."

The law firm additionally stressed that CCDH also publishes reports about other social media platforms, including Instagram, Facebook, and TikTok, all of which are competitors to Twitter.

"Simply put, there is no bona fide legal grievance here. Your effort to wield that threat anyway, on a law firm’s letterhead, is a transparent attempt to silence honest criticism. Obviously, such conduct could hardly be more inconsistent with the commitment to free speech purportedly held by Twitter’s current leadership."

The law firm added that X is "free to pursue litigation if they choose to do so," but should "be mindful of the risks involved in bringing frivolous claims to intimidate thoughtful critics and stifle legitimate commentary on issues of clear public interest."

The firm additionally warned that if X seeks litigation, it will seek immediate discovery (a legal process conducted pre-trial to obtain evidence from the other party) regarding "hate speech and misinformation on the Twitter platform; Twitter’s policies and practices relating to these issues; and Twitter’s advertising revenue."

"In that event, a court will determine for itself the truth of the statements in our client’s report in accordance with the time-tested rules of civil procedure and evidence."

"In the meantime, CCDH will not be bullied by your clients. It intends to continue its research and its reporting. And in line with its mission to protect online civil liberties and ensure accountability, CCDH intends to publish Twitter’s letter and this response. Our clients believe that open and transparent public discussion of these issues is essential, and they will not shirk from their responsibilities in the face of Twitter’s efforts to threaten and silence inconvenient facts or viewpoints with which your clients do not agree."

Musk has repeatedly come under fire for failing to moderate hate speech

This isn't the first time Twitter has landed in hot water under Musk over issues related to the moderation of hate speech.

In January, HateAid and the European Union of Jewish Students (EUJS) jointly filed a civil action against Twitter, accusing the platform of insufficiently moderating content, including material considered "sedition" under Germany's Criminal Code.

The lawsuit concerns six antisemitic and illegal comments that were reported by the two organizations but were not removed. The organizations asserted that the failure to remove the content in question contradicts Twitter's Rules and Policies, which form part of the terms of service for users.

As of mid-July, two of the tweets were still public, two were removed, and two were blocked for Twitter users in Germany.

EUJS President Avital Grinberg stated on July 11: "This shows once again that Twitter does not care about the amount of hate that we, young European Jews, are exposed to on their platform, and that even with a lawsuit, they are not willing to make the effort to remove such disgusting content. Twitter needs to take concerns and screams for improvement serious. As EUJS we commit to continue our hard work into making the online and offline space safe for all and are not leaving this issue behind us."

A report by CASM Technology and the Institute for Strategic Dialogue (ISD) found a "major and sustained spike" in antisemitic posts on Twitter since Musk took over the company in October 2022.

The report, which used a digital analysis technology called Beam and a hate speech detection methodology combining over twenty leading machine-learning models, found that the volume of English-language antisemitic tweets more than doubled following Musk's takeover of the company, with the weekly average of antisemitic tweets increasing from 6,204 before the acquisition to 12,762 after it.

In an interview with the BBC in April, Musk rejected claims that hate speech on the platform was rising, stating "Do you see a rise of hate speech? I don't."

A number of Twitter users who had been removed for posting hateful or abusive content had their accounts reinstated after Musk took the helm of Twitter, including Kanye West, neo-Nazi Daily Stormer founder Andrew Anglin, Andrew Tate, and far-right social media personality Anthime Gionet.

An investigation by the BBC in March found that hundreds of accounts allowed back on Twitter were spreading abuse or misinformation, including almost 200 accounts promoting hate and violence, accounts promoting LGBT-phobic content, and misinformation concerning elections and vaccines.

Additionally, in December, Twitter disbanded its Trust and Safety Council, a volunteer group meant to advise the platform on decisions which could affect the safety and wellbeing of Twitter users.
