YouTube has shared some new stats on what it's calling 'Violative View Rate' (VVR), a measure of how often users are exposed to content that violates its policies.
Like Facebook, YouTube has come under scrutiny of late over its capacity to help fuel dangerous movements and conspiracy theories, with YouTube’s recommendation system sometimes taking people down concerning rabbit holes, which some have claimed can even lead to the radicalization of users.
At least some of this type of content violates YouTube’s rules, and according to its Violative View Rate stats, YouTube says that it is getting much better at limiting such impacts.
YouTube's VVR has declined significantly over time, with YouTube claiming that it's now able to detect 94% of all violative content through automated flagging, and that 75% of it is removed before it receives even 10 views.
As per YouTube:
“Our teams started tracking [VVR] back in 2017, and across the company it’s the primary metric used to measure our responsibility work. As we’ve expanded our investment in people and technology, we’ve seen the VVR fall. The most recent VVR is at 0.16-0.18% which means that out of every 10,000 views on YouTube, 16-18 come from violative content. This is down by over 70% when compared to the same quarter of 2017, in large part thanks to our investments in machine learning.”
Of course, there are a couple of provisos here.
For one, the content has to violate YouTube's rules to trigger this detection, which, in many cases, doesn't cover all of the stated concerns. Various problematic channels remain on YouTube, despite repeated complaints, so while they may still be contributing to broader concerns, they wouldn't show up in these figures.
There's also a question of scale. While, as YouTube notes, only 16 out of every 10,000 video views go to 'violative content', YouTube facilitates 'billions of views' every day. That means such content is still racking up more than 1.6 million views on the platform every day, and possibly far more – so while the overall trend is positive, and reflects the platform's efforts, the scale of the issue remains problematic in terms of overall views.
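The scale estimate above is easy to verify with some back-of-the-envelope arithmetic. The figures below are taken from the article itself: a VVR of 0.16% (16 per 10,000 views) and a conservative lower bound of 1 billion daily views, since YouTube only says 'billions'.

```python
# Back-of-the-envelope check of the daily violative-view estimate.
vvr = 16 / 10_000            # Violative View Rate: 16 per 10,000 views (0.16%)
daily_views = 1_000_000_000  # conservative lower bound on 'billions of views'

violative_views_per_day = vvr * daily_views
print(f"{violative_views_per_day:,.0f}")  # prints 1,600,000
```

At the upper end of YouTube's stated range (0.18%) and with several billion daily views, the real figure could be multiples of that.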
Can YouTube ever fix that, given its usage?
Well, probably not. YouTube's systems are clearly improving at detecting such material, which is a huge positive, but with so much content being uploaded to the platform in real-time, and with so many users, it's unlikely that YouTube will ever be able to eliminate violative content entirely.
That means it will still have an impact – and again, as long as YouTube continues to allow some concerning material to remain on its site, it retains the potential to cause significant harm, and to fuel dangerous movements, via its systems.
That's a somewhat harsh take on these numbers, which reflect YouTube's rising investment in detection tools that are delivering results. But the reality is that social platforms, particularly those that reach billions of users, will always have the capacity to help amplify concerning elements.
YouTube’s doing better on this front, but it’s a battle that will never end.
YouTube says that it will now include Violative View Rate data in its regular Community Guidelines Enforcement reports.