Twitter should remove accounts belonging to Iran’s Supreme Leader Ayatollah Ali Khamenei for repeatedly violating the platform’s policies, human rights activist Masih Alinejad has said.
Ms. Alinejad, the founder of My Stealthy Freedom, a movement against Iran’s compulsory hijab law, posted a tweet on January 9 calling for Iran’s leader to be banned from the social media platform.
Now it’s time for @Twitter to remove the man who has banned 83 million Iranians from Twitter, bans US & European coronavirus vaccines and ordered the crackdown that killed 1,500 protesters. Remove @Khamenei_fa now
— Masih Alinejad 🏳️ (@AlinejadMasih) January 9, 2021
On the same day, Twitter removed a tweet posted from Khamenei’s English-language account which claimed that Covid vaccines from the UK and the U.S. were untrustworthy. Khamenei has multiple Twitter accounts in several languages, including Arabic, English, Hindi, Persian and Spanish.
The call to ban Khamenei follows the social media company’s decision on Jan. 8 to permanently bar outgoing U.S. President Donald Trump over tweets sent from his account in the wake of the pro-Trump riot at the U.S. Capitol last week. The tweets were found to breach the platform’s glorification-of-violence policy, prompting the company to close the account.
Alinejad told Kayhan Life that it was irrational for Twitter to bar Trump permanently while leaving Khamenei’s accounts active.
“It is a double standard and pure hypocrisy when Twitter suspends Trump while sparing Khamenei,” she said.
“Khamenei has been using Twitter’s platform to oppress homosexuals, Baha’is, and women protesting for their rights. He’s called for the destruction of Israel and has spread conspiracy theories. In spite of all of that, Khamenei’s account has not been removed. Does that mean Twitter doesn’t care about the human rights of Iranians?” she added.
Alinejad’s campaign to ban Iran’s Supreme Leader from the social network has drawn support across political lines. It has been backed by Russian chess grandmaster Garry Kasparov, the current chair of the New York-based Human Rights Foundation (HRF); CNN news anchor Jake Tapper; Fox News anchor Martha MacCallum; and Hillel Neuer, the executive director of United Nations Watch, a human rights charity based in Geneva, Switzerland.
Alinejad said the next step for the campaign was the launch of a petition which she hoped would be supported by international and Iranian activists.
Concerns about Khamenei’s tweets were raised in July by the pro-Israel activist Arsen Ostrovsky, who asked the platform why Khamenei’s tweets calling for the elimination of Israel were not considered to be in breach of its rules.
Responding to the query, Ylwa Pettersson, Twitter’s head of policy for the Nordic countries and Israel, told the Knesset’s Committee for Immigration, Absorption and Diaspora Affairs: “We have an approach toward leaders that says that direct interactions with fellow public figures, comments on political issues of the day, or foreign policy saber-rattling on military-economic issues are generally not in violation of our rules.”
While the closure of Twitter accounts does not violate the free-speech protections of the First Amendment to the U.S. Constitution, which covers only government censorship and not censorship by private companies such as Twitter, the move to ban Trump has been criticized by several European politicians, including Germany’s chancellor, Angela Merkel, who called the ban “problematic.” Steffen Seibert, Ms. Merkel’s spokesman, said that while social media platforms needed to ensure websites were not flooded with “hatred” and “lies,” freedom of opinion was a fundamental right.
Speaking to Kayhan Life, Professor Jeff Jarvis, an advocate of an internet run by and for its users rather than by governments or gatekeepers, said he welcomed Trump’s removal from Twitter but warned that censorship could inadvertently silence marginalized voices online.
“Leaving malign content up is not necessarily bad, as it allows people to recognize it and say, ‘that’s stupid’ or ‘that’s bad,’ so generally I would default to leaving this kind of content up and letting people decide,” said Jarvis, who is director of the Tow-Knight Center for Entrepreneurial Journalism and professor of journalism innovation at the City University of New York (CUNY).
“In a time of moral panic among media and around the net, I believe there is good in the internet, and if we try to control one thing for one reason, we turn off the tools that can be used by other people for other reasons,” Jarvis noted. “However, we don’t have the institutions right now to assure quality and authority: it could literally take one or two hundred years to develop them.”
Recommendations on how to manage freedom of speech on social media platforms were issued in June by the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, of which Jarvis is a member. The group — co-chaired by Susan Ness, a former U.S. Federal Communications Commission member — included government, tech and legal experts from the UK, Europe and North America.
Among its recommendations, the Group suggested that platforms establish accountability through transparency, and that each platform could be held accountable for different things.
“Any online platform should issue — ideally collaboratively with its users and with its community — a covenant with the public which has a human rights element and outlines why the platform exists, what it will do and what it will endeavor to do,” Jarvis said. “This covenant would hold the platform accountable, and what the Group recommended was that the company would also make its data available to researchers, so that these companies could be held accountable.”
Commenting on German legislation to moderate hate speech, Jarvis said a lack of understanding of social media platforms, and of their current ability to monitor comments online, was already stifling freedom of expression.
“This has happened in Germany with the Network Enforcement Act, known as NetzDG, a hate speech law where platforms must take down any hate speech that is manifestly illegal. This means they have to decide both whether it’s illegal — a decision which really should be made in a court — and whether it is manifestly illegal. If it is manifestly illegal and the platform does not take it down, they get fined huge amounts of money,” he said.
“So what happens? As the law requires companies to act within 24 hours — and there are suggestions of reducing the window to one hour — this leads to snap judgments, so platforms are going to play it safe and take down anything without context that seems odd. That creates a chill on free speech.”