Britain's porn crackdown proves pricey: Ofcom fines company £800,000

Source: Daily Mail Online

Britain's porn crackdown has proved pricey for one company, which has been fined £800,000 for not having robust age checks in place.

An investigation found that Kick Online Entertainment SA failed to comply with age-check requirements between July 25 and December 29 last year.

Ofcom said the company had since introduced an age-check method that was 'capable of being highly effective'.

However, the regulator said it had also fined Kick £30,000 for failing to respond to its requests for information in an accurate, complete and timely way.

It added that it would impose a daily penalty of £200 on the company until it responded, up to a maximum of 60 days.

Suzanne Cater, director of enforcement at Ofcom, said: 'Having highly effective age checks on adult sites to protect children from pornographic content is non-negotiable.
'Any company that fails to meet this duty - or engage with us - can expect to face robust enforcement action, including significant fines.
'We continue to investigate other sites under the UK's age check rules and will take further action where necessary.'

Since July 25, the Online Safety Act has required the operators of online platforms to prevent children from viewing 'harmful content'.

That includes explicit content, like pornography, but also content that encourages self-harm or suicide, promotes dangerous challenges, shows serious violence, or incites hatred against people.

Platforms found to be in breach of the act could face a range of punishments, including fines of up to £18 million or 10 per cent of global turnover, whichever is greater.

In extreme cases, companies may be blocked from operating in the UK.

Porn providers have seven options for checking that their visitors are over 18 (one way a site might combine these is sketched after the list):

  • photo-ID matching
  • facial age estimation
  • mobile-network operator (MNO) age checks
  • credit card checks
  • email-based age estimation
  • digital identity services
  • open banking
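These are policy options rather than a single technical standard. Purely as an illustration (none of this code comes from Ofcom or any real provider, and every name in it is a hypothetical assumption), a site's age gate might try its configured methods in turn and grant access only on a clear pass:

```python
# Hypothetical sketch of an age-gate dispatcher; method names and logic
# are illustrative assumptions, not Ofcom-specified or any provider's API.
from typing import Callable, Optional

# Each checker returns a verified or estimated age, or None if it cannot decide.
AgeChecker = Callable[[dict], Optional[int]]

def credit_card_check(visitor: dict) -> Optional[int]:
    # In the UK, credit cards are only issued to adults, so a successful
    # card authorisation implies the holder is at least 18.
    return 18 if visitor.get("card_authorised") else None

def facial_age_estimation(visitor: dict) -> Optional[int]:
    # A real system would call a third-party estimation service here.
    return visitor.get("estimated_age")

def is_over_18(visitor: dict, checkers: list[AgeChecker]) -> bool:
    """Try each configured method in turn; grant access only on a clear pass."""
    for check in checkers:
        age = check(visitor)
        if age is not None:
            return age >= 18
    return False  # fail closed: no successful check means no access

if __name__ == "__main__":
    checkers = [credit_card_check, facial_age_estimation]
    print(is_over_18({"card_authorised": True}, checkers))  # True
    print(is_over_18({"estimated_age": 16}, checkers))      # False
    print(is_over_18({}, checkers))                         # False (fail closed)
```

The fail-closed default here is a design assumption, but it matches the thrust of the rules: a visitor whose age cannot be established is treated as a child rather than waved through.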

The laws were created in response to what many consider to be an alarming rise in young children accessing disturbing or harmful content online.

A study conducted last year by the charity Internet Matters found that seven in ten children aged nine to thirteen said they had been exposed to harmful content online.

Children in this age group reported encountering hate speech (13 per cent) and mis/disinformation (15 per cent), while one in ten had seen violent content or content that promotes violence.

Similarly, Ofcom research found that eight per cent of UK children aged eight to fourteen visited a porn site at least once a month.

Following the crackdown, Pornhub implemented a restriction on new UK users, starting this month.

Aylo, the Cyprus-based company which owns the pornography site, said that from February 2 it would block new British users who had not previously verified their age.

While the Online Safety Act rules are intended to make it harder for under-18s to see explicit material, Aylo claims they have 'diverted traffic to darker, unregulated corners of the internet'.

As a result, it says it has 'not achieved its goal of protecting minors'.

'We cannot continue to operate within a system that, in our view, fails to deliver on its promise of child safety, and has had the opposite impact,' Aylo's statement said.
'Despite the clear intent of the law to restrict minors' access to adult content... our experience strongly suggests that the OSA [Online Safety Act] has failed to achieve that objective.'

What is the Online Safety Act?

The Online Safety Act 2023 (the Act) is a new set of laws that protects children and adults online.

It puts a range of new duties on social media companies and search services, making them more responsible for their users' safety on their platforms.

The Act will give providers new duties to implement systems and processes that reduce the risk of their services being used for illegal activity, and to take down illegal content when it does appear.

The strongest protections in the Act have been designed for children.

Platforms will be required to prevent children from accessing harmful and age-inappropriate content and provide parents and children with clear and accessible ways to report problems online when they do arise.

The Act will also protect adult users, ensuring that major platforms will need to be more transparent about which kinds of potentially harmful content they allow and give people more control over the types of content they want to see.