CEO of Substack says the Online Safety Act is bad for free speech – The Free Speech Union
The CEO of Substack, Chris Best, has criticised the Online Safety Act, becoming one of the latest prominent figures to warn of the threat the legislation poses to free speech.

Substack is an online publishing platform for writers and journalists — both professionals and novices — and has become an increasingly important forum for creative thought, debate, and discussion. Its presence in the UK continues to grow, particularly as a communications tool for cultural and political figures seeking to reach wider audiences.

As a result of the Online Safety Act, Substack is now required to comply with the rules set by Ofcom, the Act's regulator. Best has said that, in implementing these requirements, he has been deeply alarmed by what they entail.

In reflecting on the Act and its ever-expanding list of obligations, Chris has hit the nail on the head. He acknowledges that, amid growing and legitimate concerns about children’s welfare online, the Online Safety Act may appear — on the surface — to be a “commonsense response”. In reality, however, he argues that it is something far darker: “a system of mass political censorship unlike anywhere else in the western world.”

Complying with the Online Safety Act does not simply involve hiring additional moderators or applying symbolic warning labels. In practice, it requires companies — at significant cost — to design and operate systems that continuously classify and censor speech at scale. These systems are expected to pre-emptively restrict lawful expression, based on what Ofcom may deem unsuitable for children.

Human moderators and artificial intelligence systems will be deployed to scrutinise essays, satire, photography, and comments on social media posts, scanning for potential “risk triggers”. Inevitably, in attempting to predict regulatory risk, lawful and legitimate speech will be censored alongside genuinely illegal material.

Chris believes these regulations will have a particularly damaging impact on Substack. Once content is classified as potentially sensitive, age-rating requirements often follow. Substack’s readership — overwhelmingly adults accessing lawful material — may then be required to verify their age via third-party checks, including financial or facial verification.

Such measures can reasonably be seen as a burdensome inconvenience at best, and an invasion of privacy at worst. Unsurprisingly, this is likely to deter many readers — or potential readers — from engaging with the platform. This harms not only those seeking knowledge and ideas, but also writers whose livelihoods depend on their ability to publish and reach an audience.

As Chris has put it: “The result is that vast swathes of legitimate cultural discourse are swept up, bogged down, and discouraged.”

He contrasts this with Substack’s core values: encouragement of authorial independence and a firm commitment to freedom of the press — a cornerstone of any free society.

Chris has also spoken about the formative role reading played in his own childhood, something his parents actively encouraged. It exposed him, he says, to ideas beyond his immediate surroundings, giving him “a window into a world far larger than my own Canadian suburb”. He now encourages the same curiosity in his own children.

He recalls exploring the internet in the 1990s, where he encountered both bad ideas and transformative ones — a process he views as essential to developing independent thought. Exposure to different cultures and perspectives ultimately shaped his career as a technologist and the life he built in San Francisco.

While Chris recognises the serious and genuine harms that exist online, particularly for children, he believes the Online Safety Act is the wrong response. Governments, he argues, have a duty to act — but not by censoring speech. What the Act does, he warns, is force platforms like Substack to build systems that decide when satire becomes a threat, when photography is too graphic, or when content might later be deemed unacceptable by a regulator.

Chris is right to highlight the dangers of the Online Safety Act — especially within the wider context of the UK’s already expansive policing of speech. In 2023, 13,800 people were arrested for “offensive” online posts or messages, while 90 per cent of all crimes went unsolved.

Substack, Chris says, will comply with the law. However, he rightly warns that many smaller publishers and platforms will not have the financial or technical resources to do the same.

It is increasingly clear that the Online Safety Act is not what it claims to be. Though marketed as domestic legislation, it is now likely to become a frequent subject of litigation in US courts. As Chris has said: “It is not the most effective way to keep kids safe, and it hurts a free society.”

He offers a stark final warning: “If this model spreads, it won’t just block content for children. It will determine whether adults can read, write, and argue freely without first submitting to surveillance.”

Read more in The Telegraph.