MP: Current internet UGC laws are ludicrous
Select Committee chair explains recommendations
MP John Whittingdale has told TechRadar that laws that discourage companies from moderating their user-generated content are ludicrous.
The Conservative Member of Parliament chairs the Culture, Media and Sport Select Committee, which has reported on how UK internet companies should deal with content that the public posts on their websites.
Currently the likes of YouTube, owned by Google, and many of the major portal sites take what is known as a passive approach to moderation, where only posts that other users have complained about are checked.
This stems from an EC Directive suggesting that companies that are unaware of content cannot be held liable for it online.
Legal advice
Legal advice to many companies has therefore suggested that more proactive moderation, in which all or even some of the posted content is checked by the company, could mean they are deemed 'actively aware' of it and therefore liable.
"I'd be very, very surprised if any court would penalise a company for trying to moderate," Whittingdale told TechRadar.
"If the Directive is making companies think like this then it should jolly well be changed.
"We've essentially said that this stance is ludicrous in our report and I happen to know that the secretary of state is sympathetic to this area and accepts that something needs to be done."
Degrees of protection
Whittingdale accepts that smaller sites may be unable to moderate to the same extent as the major players, and that a site like YouTube will struggle to check every one of the thousands of videos uploaded through its system.
"We accept that there are going to be varying degrees of protection," added Whittingdale. "But if you look at somebody like MySpace they said they were proud of telling their customers the lengths that they go to to ensure that unsuitable content is taken down so quickly.
"Someone like YouTube probably can't check every submission and to an extent their current system works, but just because you can't check everything doesn't mean you shouldn't check anything."
Amazed at lack of prioritisation
One of the biggest problems for the Select Committee was the generally accepted 24-hour timeframe for taking down problem content, regardless of whether it was child abuse imagery or a racist comment.
"We were amazed that there was no prioritisation," added Whittingdale.
"The biggest recommendation we made was that the companies - which currently all have their own rules – was that they need to have some form of self-regulatory body which sets the standards on this and that polices the industry.
"Just because you can get around the rules on the internet, doesn't mean that you shouldn't have rules."