In the wake of escalating riots across Britain, social media platforms are facing urgent calls to address online violence and hate speech that may be stoking the unrest.
Regulator Ofcom has issued a stark warning to tech leaders, emphasizing their responsibility to eliminate content that could incite violence or hatred. In an open letter, the regulator highlighted the need for immediate action to protect users from harmful online material.
Deputy Prime Minister Angela Rayner has been vocal about the spread of “fake news” and hate speech on the internet.
During a visit to a hotel in Rotherham, which was recently targeted by far-right rioters, Rayner expressed her frustration with the misinformation circulating online.
When questioned about Elon Musk’s controversial remarks, including his criticism of the Prime Minister and his claim that the authorities are treating Muslims more favourably than others, Rayner stressed that social media platforms must take responsibility for curbing fake news and hate speech. She emphasized that violence, whether incited online or carried out on the streets, would be addressed firmly by the law.
Elon Musk’s recent comments have drawn significant backlash. He labeled the Prime Minister “two-tier Keir” and suggested that the government is unfairly favoring Muslims over white people in its handling of the riots.
Musk, who boasts nearly 193 million followers on X (formerly Twitter), has been highly active in discussing issues related to immigration, crime, and UK politics since the Southport stabbings last Monday.
His platform has been criticized for not removing racist comments and for allowing controversial figures like Tommy Robinson and Andrew Tate to remain active.
A false claim that a Muslim asylum seeker was responsible for the Southport attack originated on X and quickly went viral after being amplified by a Russia-linked fake news site.
Ofcom’s Call for Immediate Action
Ofcom’s letter also addressed the upcoming Online Safety Act, which will impose stricter regulations on tech giants, including significant fines for failing to control illegal content.
While these new rules are still months away from implementation, Ofcom urged video-sharing platforms to enhance their systems now to prevent the spread of harmful videos.
The regulator’s current rules, which pre-date the Online Safety Act, require certain platforms to protect users from incendiary content. However, these rules primarily apply to platforms like TikTok, Twitch, Snapchat, and Vimeo, and do not cover major players like X and Facebook.
Government’s Increased Surveillance
In a related development, it has been revealed that a secretive government unit, previously accused of spying on anti-lockdown activists during the pandemic, has been mobilized to monitor social media during the riots.
The Counter Disinformation Unit (now known as the National Security Online Information Team) is scrutinizing online activity, despite recent calls for an independent review of its operations.
Political Reactions and Criticisms
Labour MP Andrew Lewin has strongly condemned Elon Musk, accusing him of exacerbating divisions and inflaming hatred with his comments on the riots.
Lewin, who recently won his seat from former Defence Secretary Grant Shapps, criticized Musk’s inflammatory rhetoric and called for mainstream voices to counterbalance his controversial statements.
These developments highlight the ongoing tension between social media platforms, government regulation, and public discourse amidst the current crisis in the UK.