YouTube removed less than one percent of the 15 million hate videos flagged to it, figures given to MPs have revealed.
Statistics requested by the Home Affairs Select Committee show that the video giant deleted 25,145 of the 14,994,703 clips raised on its site as hateful and abusive between July and December last year, equating to just 0.17 percent.
The numbers were requested from YouTube as part of the parliamentary committee’s inquiry into hate crimes.
The revelation prompted accusations from the committee’s chair, Yvette Cooper, that YouTube and its parent company Google weren’t “taking any of this seriously enough”.
Ms Cooper said: “We have raised the issue of hateful and extremist content with YouTube time and time again, yet they’ve repeatedly failed to act. Even worse than just hosting these channels, YouTube’s money-making algorithms are actually promoting them, pushing more and more extremist content at people with every click.
“We know what can happen when hateful content is allowed to proliferate online and yet YouTube and other companies continue to profit from pushing this poison.
“It’s just not good enough. Other social media companies are at least trying to tackle the problem but YouTube and Google aren’t taking any of this seriously enough. They should be accountable for the damage they are doing and the hatred and extremism they are helping to spread.”
According to YouTube, the majority of the videos were flagged by artificial intelligence rather than by humans. However, it said these computer programmes struggled with the “complex” area of hate speech, and that the flags were often found to be inaccurate when reviewed by human moderators.
The select committee requested the figures after Marco Pancini, YouTube’s Director of Public Policy for Europe, the Middle East and Africa, appeared before MPs last month.
Following the release of the figures, the committee is still waiting for answers from Google on a number of other questions posed at the hearing.
Among them is why videos of the Christchurch shootings were still appearing on YouTube more than a month after the massacre. The company came under heavy criticism after copies of the live stream posted by the shooter on Facebook were uploaded to YouTube thousands of times on the day of the terror attack.
A letter from the committee said that one of the Christchurch videos had received more than 720,000 views.
MPs also want Google executives to explain why YouTube recommends videos of far-right figures such as Stephen Yaxley-Lennon (also known as Tommy Robinson), even to viewers who have never watched such content.