Social media has changed the way we communicate with one another, and throughout the pandemic it has allowed many of us to continue to connect with our friends and families. As a regular social media user myself, I have also found it a great way of getting information out to constituents and letting them know what is going on in Parliament and at home in Bassetlaw.
But there is another side to social media. A couple of weeks ago we saw a widespread boycott of different platforms in response to online abuse. The same weekend, I posted an example of some of the abuse I have had directed at me as part of my job. Whilst the online world can seem very different to the rest of society, as a general rule I think most people would agree that if something isn’t ok offline then it isn’t ok online either.
The Government is publishing the draft Online Safety Bill to protect children online and tackle some of the worst abuses on social media, including racist hate crimes. At the same time, Ministers have added landmark new measures to the Bill to safeguard freedom of expression and democracy, ensuring necessary online protections do not lead to unnecessary censorship.
The draft Bill marks a milestone in the Government’s fight to make the internet safe. Even though we are now using the internet more than ever, over three quarters of UK adults are concerned about going online, and the proportion of parents who feel the benefits of their children being online outweigh the risks has fallen from 65 per cent in 2015 to 50 per cent in 2019.
The draft Bill contains a number of provisions. These include new additions to strengthen people’s rights to express themselves freely online, while protecting journalism and democratic political debate in the UK.
There are further provisions to tackle prolific online scams such as romance fraud, which have seen people manipulated into sending money to fake identities on dating apps.
Social media sites, websites, apps and other services hosting user-generated content or allowing people to talk to others online must remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material and suicide content.
In line with the Government’s response to the Online Harms White Paper, companies will have a duty of care towards their users, so that what is unacceptable offline will also be unacceptable online, with Ofcom able to fine those that fall short. They will need to consider the risks their sites may pose to the youngest and most vulnerable people and act to protect children from inappropriate content and harmful activity.
The final legislation, when introduced to Parliament, will contain provisions that require companies to report child sexual exploitation and abuse (CSEA) content identified on their services. This will ensure companies provide law enforcement with the high-quality information they need to safeguard victims and investigate offenders.