The redrafted Bill has been presented to Parliament. It still contains the controversial category of legal-but-harmful content. The definition will now be set by the Government in secondary legislation, rather than deputised to social media companies. This leaves open the potential for defining views critical of the government as "harmful", as the US Department of Homeland Security did (Feb 7, 2022) in suggesting that American citizens sharing 'misinformation' may now be considered a domestic terrorism threat.
(1) the proliferation of false or misleading narratives, which sow discord or undermine public trust in U.S. government institutions;
In this case misinformation is defined as:
“Misinformation, disinformation, and malinformation make up what CISA defines as “information activities”. When this type of content is released by foreign actors, it can be referred to as foreign influence. Definitions for each are below.
– Misinformation is false, but not created or shared with the intention of causing harm.
– Disinformation is deliberately created to mislead, harm, or manipulate a person, social group, organization, or country.
– Malinformation is based on fact, but used out of context to mislead, harm, or manipulate.”
Sky News
Social media firms could be fined and company bosses jailed, under new government proposals to tackle online abuse.
Nadine Dorries:
"What we're saying is: deal with those harmful algorithms; deal with that online scam advertising; don't allow our children to have pornography just popping up on their phones and their screens; don't allow those algorithms to send our children to suicide chat rooms. Protect our kids. And we'll also give journalists and individuals robust protections for freedom of speech."
Beeban Kidron:
"We were promised a systems and process bill - one that actually took a swipe at the system of Silicon Valley, and one that would make our kids safe, right across the UK, wherever they were online. And this bill does not do that yet. If the act coming out of Parliament - rather than the bill going into Parliament - does not protect the next Molly Russell, then it's not fit for purpose, and that is the marker for all of us."
Daily Mirror
Whitehall Correspondent Mikey Smith explains whether Nadine Dorries' tech crackdown, five years in the making, is more censorship than sensible.
"Labour says the bill is too little, too late, and has allowed Russian disinformation campaigns to flourish in a wild-west internet. And some campaigners have warned that the bill amounts to a censor's charter - it doesn't include enough protections for freedom of speech. ... this is the contentious bit of the bill, which is the question of material that is legal but harmful. Companies are going to have to set out in their terms of service how they plan to deal with content that isn't actually illegal but could cause harm to their users. And that list of material hasn't been set out yet. That'll come later in secondary legislation... but that's likely to include things like self-harm images, content about eating disorders, depression and suicide; it's also been suggested racist abuse and disinformation could come under that banner of legal but harmful.
And while all this might seem quite sensible on the face of it, and while it does have an explicit carve-out for journalism and for content made by news websites, and while there will be a legal right to appeal if your content has been removed under these rules - it's ringing some alarm bells with campaigners who are worried that it could lead to - and I quote - "an Orwellian censorship machine". The Open Rights Group has warned that the bill will still require companies to remove content that, while unpleasant, is - let's face it - legal. And while Parliament will be asked to rubber-stamp the list of harmful material, it will effectively be ministers who decide what goes on it. And the Open Rights Group says this amounts to state-sanctioned censorship of legal content. And that list could be modified over the years in secondary legislation - it's not tied to this bill in itself."
"It may have taken five years to get this far, but it's not out of the door yet."
GBNews
Online Safety Bill: 'Plenty to be concerned about' says Legal and Policy Officer
Mark Johnson, Big Brother Watch.
"From first look at the bill it does appear to be the kind of censor's charter that we feared. The legal-but-harmful provision does remain. It's been redrafted, but there is a specific targeting of lawful expression. And the bill still deputises Silicon Valley to act as online speech police. So there's certainly plenty to be concerned about when it comes to upholding our rights to freedom of speech.
I think one of the most important things to do when looking at this bill is to think about the impression that it could give around the world - the way that we as a country lead as an example around the world - and whether there is anything in this bill that less democratic governments could look at and say "Well, they're doing that in the UK - why can't we do it too?" I think that's one of those provisions that I just worry about as the government moves in a more draconian direction and talks about jailing tech executives; less democratic countries around the world will say "hey - they're doing it, why can't we do it as well?"
Times Radio
Emily Taylor discusses the Online Safety Bill with Times Radio's Matt Chorley
Ian Russell
"I think it's really important that the sanctions behind the Online Safety Bill, behind the regulator OFCOM, are stringent enough to refocus the corporate culture of big tech. These are huge, global, powerful companies, and without a massive impetus they will continue to do what they've done for the last 18 or so ...To change corporate culture is difficult, so the big fines that are there are one measure, but these companies are very rich - there's nothing quite like the threat of a criminal sentence aimed at the senior managers of big companies to focus their minds, and help them make the changes to make their platforms safe - changes they should have made themselves when they had the opportunity."
Emily Taylor, Editor of the Chatham House Journal of Cyber Policy.
"Regulating speech is really difficult, and democracies have been rightly hesitant to bring forward regulation. This really can trace its life back to all of the electoral interference and the harms over social media during the 2016 era.
It's going to really boost the powers of the regulator OFCOM. The duty of care is not without its controversy, and a lot of this comes back to the fact that regulating speech is difficult: if you don't do it, you risk a sort of overly libertarian environment in which it's just the survival of the fittest ...the vulnerable users are not sufficiently protected; if you do do it, there is always the risk of over-regulation.
"...what about the threat of imprisonment of senior executives and turnover-based fines... what will this do to the corporate culture? The risk is that by regulating speech you inhibit speech... you create incentives to take down content even when it is in the margins. And the recent announcements by the government since February have been like a laundry list of criminalizing more and more different types of speech, so it's going to be very hard to keep track of, very hard to really understand what the boundaries are, because with speech, context is everything."
The Times:
"...we could end up creating a lot of new offenses but without affecting these tech firms - who actually probably don't take much notice of fines, nor really perhaps even the legal threats, because they'll make sure they're not personally liable. So is it that we end up essentially curbing free speech without actually addressing some of the real concerns that people have?"
Emily Taylor
I think there are two things there. First of all, yes, definitely, civil society voices are warning that rather than having a big laundry list of different types of speech to criminalize, how about legislation that tries to go after the fundamental advertising-based business models and the sort of incentives that those create?
I probably would challenge the assumption that the platforms are completely lawless ... speaking to colleagues who are implementing similar laws in Australia and so on, they're putting huge teams and resources into trying to comply… But spare a thought for the global platforms, in that they are trying to comply with every law in the world, and that's not easy when they're all different; even within the continent of Europe we're going to have the UK doing something slightly different to what the EU is doing.
NTD UK News
Online Safety Bill Reaches Parliament.
The Government’s long-awaited proposals for new internet safety laws have been formally introduced to Parliament after a number of major updates. The Online Safety Bill, which has been in progress for around five years, is intended to tackle a wide range of harmful online content. It will establish Ofcom as the regulator for the sector, with the power to fine companies or block access to sites that fail to comply with the new rules.
Legal-but-harmful content will now be defined by the government. It'll cover specific content such as exposure to self-harm, harassment and eating disorders. Social media users will also have the right to appeal if they feel their post has been taken down unfairly. News content will be completely exempt from regulation under the bill. Executives of companies that fail to cooperate with regulator OFCOM's information requests could face prosecution and jail within two months of the bill becoming law, much more swiftly than the two years stated in the bill when it was a draft. ... Culture Secretary Nadine Dorries said social media algorithms that decide what users see based on their online habits will also come under more scrutiny.
Nickie Aiken MP
MP for the Cities of London & Westminster
Nickie Aiken welcomes the Government's Online Safety Bill, which has begun its journey through Parliament. Nickie is delighted that the Government took on her calls to include fraudulent advertising in the Bill. If passed, the Bill will make social media companies responsible, and hold them to account, for taking a proactive approach to keeping their online platforms safe.
They misspelled safety;
c-e-n-s-o-r-s-h-i-p
noun
1. the condition of being protected from or unlikely to cause danger, risk, or injury.
"they should leave for their own safety"