LONDON — Missing penalties in a major international soccer final was bad enough for three Black players on England’s national team. Being subjected to a torrent of racial abuse on social media in the aftermath made it even worse.
Monkey emojis. Being told to go home. The N-word.
The even sadder part? Everyone knew it was coming.
“It’s stupid,” said Nedum Onuoha, a retired Black player who was in the top divisions of English and U.S. soccer for 16 years. “But are we surprised?”
It’s the latest form of racism: technology-fueled, visual, intrusive and around the clock, a haunting echo in the social media era of the 1980s-style monkey chants and banana-throwing.
And it is spiraling out of control on platforms where anonymity is the golden ticket for racists.
“Every time it happens, it knocks you back and floors you,” Onuoha told The Associated Press. “Just when you think everything is OK, it’s a reminder that it’s not. It’s a reminder of how some people actually see you.”
Racism is the predominant form of abuse on social media reported to Kick It Out, an anti-discrimination campaigner in soccer, according to statistics compiled over the past three seasons in English soccer.
A report last year from FIFA, the governing body of world soccer, showed that more than 50% of players competing in two international tournaments in 2021 — the Africa Cup of Nations and the European Championship — received some form of discriminatory abuse in more than 400,000 posts on social media. More than a third of the abuse was racist in nature.
The problem is, there’s barely any accountability and it’s so easy. Pull out your phone, find the handle of the player you want to abuse, and fire off a racist message.
Former Premier League striker Mark Bright, who is Black and regularly suffered racial abuse inside stadiums in the 1980s, was exchanging messages with friends on a WhatsApp group when those three Black players for England — Bukayo Saka, Marcus Rashford and Jadon Sancho — missed penalties in a shootout loss to Italy in the 2020 European Championship final.
“We all messaged each other and said, ‘Oh God, here we go.’ Because we know what’s around the corner,” Bright told the AP. “That’s what we expected and this is where, once again, you say, ‘What can be done about it?’”
By and large, the abuse hasn’t stopped Black players from using social media. It remains an essential marketing tool, creating the paradox of players promoting themselves on the same platforms where they are abused.
Kylian Mbappé, who has 104 million followers on Instagram and more than 12 million on Twitter, was subjected to racial abuse along with teammate Kingsley Coman, who is also Black, after France lost the 2022 World Cup final to Argentina.
Real Madrid winger Vinícius Júnior, who has repeatedly been the target of racial insults, is followed by 38 million people on Instagram and nearly 7 million on Twitter.
Saka, who has more than 1 million followers on Twitter, remains on social media despite the abuse that followed England’s Euro 2020 loss, and despite more just a few weeks ago, when a message posted on Twitter showed the Arsenal winger with his face doctored to look like a monkey, alongside the words: “This clown has cost us the league.” Minutes earlier, Saka had missed a penalty in an important Premier League game.
With social media continuing to fuel abuse, players and teams are coming up with ways to both raise awareness and reduce their exposure to offensive users.
GoBubble is a company that configures AI software to act as a filter to stop discriminatory comments from being seen by a social media user. It has customers from the Premier League down to the fourth division in English soccer, around Europe and in Australia.
“Yes, tech has caused the issue,” GoBubble founder Henry Platten told the AP, “but tech can actually solve the issue and this is what we are seeing as one of those pieces of the jigsaw.”
The company’s AI technology plugs into players’ accounts and scans for toxic and potentially harmful words, images and other content, which can be filtered out using a traffic-light system.
“This isn’t about censorship, about sportswashing, about creating that fuzzy world,” Platten said. “This is about protection, not just for the players and their families but also the wider fan community.”
Platten said some players who approached him had experienced mental health issues that affected their performances. Indeed, in January, Liverpool became the first Premier League club to hire a mental health consultant tasked with protecting young players from online trolling.
Governing bodies are reacting, too. During last year’s World Cup in Qatar, FIFA and players’ union FIFPRO ran a dedicated in-tournament moderation service that prevented racist abuse and other forms of hate speech from being seen online by players and their followers. The service will be offered again for the upcoming Women’s World Cup.
Soccer authorities in England, including the Premier League, led a four-day social media boycott in 2021 across Twitter, Facebook and Instagram in a protest against racist abuse. It ended up being adopted by many other sports in England, and by FIFA and UEFA, the governing body of European soccer.
Still, the abuse continues on the platforms, which have been accused of being too slow to block racist posts and remove offenders’ accounts, and of doing too little to verify users’ identities and stop banned users from simply registering new accounts.
“It needs to be regulated, you need to be accountable,” Bright said. “Everyone’s been complaining about this for a long time now. Some players have set up meetings with these social media companies. It seems to me that they’re not serious enough about it.”
So is there appetite for change within the big social media platforms?
“No one should have to experience racist abuse, and we don’t want it on our apps,” Meta, which owns Instagram and Facebook, said in a statement to the AP. “We take action whenever we find it and we’ve launched several ways to help protect people from having to see it in the first place.”
That includes “Hidden Words,” which filters offensive comments and direct messages and is on by default for creator accounts, and “Limits,” which hides comments and DMs from people who don’t follow you or only followed you recently, the statement said.
“We know no one thing will fix abusive behavior,” Meta said, “but we’re committed to continuing working closely with the football industry to help keep our apps a safe place for footballers and fans.”
Twitter responded with an automated reply of a poop emoji when the AP reached out for comment.
For GoBubble founder Platten, the platforms are trying to strike a balance between keeping a large user base for revenue and being seen to be tough on racism.
“There’s always going to be a position where they may move closer to solving the problem,” he said, “but are never going to go the full hog that we all want them to, in terms of really cracking down and solving it.”
Some teams and athletes are choosing alternative platforms to promote not just themselves but also more ethical behavior online.
These include Striver, a user-generated content platform backed by Roberto Carlos and Gilberto Silva, both World Cup winners with Brazil in 2002. And PixStory, a platform with nearly 1 million users that ranks them according to the integrity of their posts and aims to create “clean social” by prioritizing safety in a way big tech companies are not.
English club Arsenal, Italy’s Juventus and Paris Saint-Germain’s women’s team are collaborating with PixStory, whose founder, Appu Esthose Suresh, says teams and athletes are in a “Catch-22 situation.”
“They want to live in this space because it’s a way to reach out and interact with their fans, but there’s not enough safety,” Suresh told the AP. “There is an alternative way — and that’s change the business model.”
Ultimately, the biggest change will likely come through legislation. Last year, the European Union clinched an agreement on the Digital Services Act, which will force big tech companies to better protect European users from harmful online content or face billions of dollars in fines for noncompliance. In Britain, the government has proposed the Online Safety Bill, with potential fines amounting to 10% of a platform’s annual global turnover.
Meanwhile, the number of perpetrators of online racial abuse facing criminal charges has increased. In March, a man who abused England striker Ivan Toney was banned from every soccer stadium in Britain for three years in what police described as a “landmark ruling.”
Onuoha welcomed these developments but he’s still keeping his social media accounts on a private setting.
“There will be lots of good people who won’t be able to connect with me but it’s a consequence of not having enough trust and faith in enough good people being allowed to enter the account,” he said. “It’s the 1% who offset the entire experience.”