Explained | The legal debate around a Texas law for social media platforms


The Texas Social Media Platforms Law (HB 20) states that companies cannot censor any post based on an individual's political views and requires them to submit semi-annual reports on content regulation on their platforms.

The story so far: Last week, internet advocacy groups NetChoice and the Computer & Communications Industry Association (CCIA), representing Facebook, Twitter and YouTube, filed an emergency petition with the US Supreme Court seeking to block a Texas law that prohibits platforms from censoring users based on their political views. A district court order blocking the law had been overturned by the Fifth US Circuit Court of Appeals on May 11 in a 2-1 verdict.

The Texas law further requires social media platforms to publish a semi-annual transparency report detailing how they regulate the distribution of content on their services. This would apply to all platforms with more than 50 million monthly active users.

Carl Szabo, general counsel for the advocacy groups, argued that the Texas law HB 20 is an "assault" on the Constitution's First Amendment, which guarantees free speech. In their emergency request, the advocacy groups pleaded for the Court to "allow the district court's careful reasoning to remain in effect while an orderly appeals process unfolds."

Texas Governor Greg Abbott had signed the law into effect in September 2021. About a fortnight later, NetChoice and the CCIA sued the State of Texas, alleging the law was "unconstitutional." In December 2021, a federal district court issued an injunction declaring the law "constitutionally tainted." The State moved the Fifth Circuit Court of Appeals against that order and secured a reversal of the injunction.

While signing the law, Governor Abbott said, "It is now law that conservative viewpoints in Texas cannot be banned on social media."

According to the plaintiffs, this left the service providers with a Hobson's choice: those who voluntarily filtered certain posts would become responsible for all posts transmitted on their service, so the only way out was to look away from "problem" posts altogether.

How is censorship relevant here?

The State of Texas has declared that every person has a fundamental interest in the free exchange of ideas and information, including the right to receive them. Social media platforms, it said, are essential forums for public debate.

The State argued that HB 20 would ensure that platforms do not censor a user or their expression based on their political views or geographic location. In other words, social media platforms would have to host content from both sides of the ideological spectrum without any editorial curation on their part.

NetChoice said the law would trample on the First Amendment by allowing the government to force platforms to host speech against their will. Social media platforms have their own ground rules that guide how they regulate content so that it does not cause social harm. For example, in 2021, Twitter permanently suspended former US President Donald Trump's account due to the "risk of further incitement of violence" after the January 6 attack on the US Capitol.

Mandating public disclosure of information

The other provision deals with public disclosures. The Texas law would require social media platforms to publicly disclose specific information about their business practices and how they handle content and data. This includes how they curate content; how they promote content, products and services on their platforms; their acceptable-use or community guidelines; and how they use algorithms to categorise and place content. In addition, the law requires platforms to publish a semi-annual report detailing all violations during a given period, the type of violation, its source and the action taken.

As for its implementation, platforms must act within 48 hours of an alleged content violation being reported. Additionally, if the author of a post challenges its removal, the platform has 14 days to review the decision and either explain to the user why the removal stands or reinstate the content.

NetChoice argued in court that the provision imposes burdensome disclosure and operational requirements. "This unconstitutionally requires platforms to provide notice every time they remove speech and provide a complaints and appeals process with short deadlines," it argued.

It pointed out that YouTube removed 9 billion videos from its platform in a single quarter in 2020, while Facebook removed more than 40 million posts related to bullying, harassment and hateful content over a similar period in 2021. HB 20 would require the platforms to institute a review-and-appeal process for each of these removals, making compliance unduly burdensome.

According to the plaintiffs, this would also discourage competition in the industry.

The State's case for free speech

Both sides hold distinct views on the role of social media in enabling free expression.

The Texas state attorney argued that asking someone to host another person’s speech “is often a perfectly legitimate thing for the government to do.” The lawyer added that it is the duty of the state to ensure that private interests do not restrict a critical channel of communication.

The State of Texas alleged that after achieving digital dominance, the platforms began to discriminate based on viewpoints. With this law, it said, it is trying to prevent a "discriminatory dystopia where big corporations punish speakers with idiosyncratic opinions".

Arguing that the platforms discriminated against American viewpoints in favor of foreign adversaries, the State said Facebook censored Americans who suggested the coronavirus originated in a Chinese lab in Wuhan, while China's Communist Party was allowed to post allegations that Americans had created the virus.

In addition, the State accused social media platforms of bias, particularly in favor of federal bureaucrats, alleging that the platforms failed to censor content when those bureaucrats openly advocated "off-label" drug use (using medications for conditions other than those listed on the label) to treat COVID-19 symptoms.

The State of Texas, citing an earlier judgment, said: "Our political system and cultural life rest upon the ideal that each person should decide for himself or herself the ideas and beliefs deserving of expression, consideration, and adherence." The brief added that speech cannot be restricted merely because it disturbs or arouses contempt. Therefore, a platform's own speech can become a subject of scrutiny when it seeks to shape another's speech, it said.

The advocacy groups' case for free speech

On the other hand, the plaintiffs countered this assertion by arguing that the freedom of speech also includes deciding what not to say. A platform's discretion over what it publishes and how it curates content therefore conveys a message about what it deems acceptable. Editorial discretion also ensures that users are not exposed to hate speech, misinformation, or irrelevant and less informative posts, according to the plaintiffs.

They argued that the law was an attempt to replace the platforms' editorial policies with those preferred by the government. "This is a content-, viewpoint- and speaker-based law that would eviscerate editorial discretion, impermissibly compel and chill speech, and mandate onerous disclosures on certain disfavored social media platforms," they argued.

Social media companies exercise editorial discretion by controlling who can access their platforms, what content is made visible, and how it is presented to users. This is done by taking a case-specific approach rather than treating all content equally. It also involves ranking content on users' "feeds" according to their individual preferences, while ensuring that it reflects accurate information and not misinformation.

According to CCIA and NetChoice, platforms should not be considered “mere conduits” (as the state of Texas proposes) without any responsibility for the content posted. “The filing confirms that when platforms failed to remove harmful content, their users and advertisers sought to hold the platforms accountable, including through boycotts,” they said.

The plaintiffs argued that such enforced viewpoint neutrality could also mean that a platform would have to carry pro-Nazi speech on an equal footing with Holocaust remembrance posts. Forcing a platform to deliver speech it finds unacceptable amounts to compelled speech, signaling to users that the platform finds the idea acceptable, they argued.
