TNW Answers is a live Q&A platform where we invite interesting people in tech who are much smarter than us to answer questions from TNW readers and editors for an hour.

YouTube, which has more than a billion users who watch over a billion hours of content per day, shows us limited data on the videos uploaded to the site, including their number of views, likes, and dislikes — but the video-streaming site hides more detailed stats about each video, like how often it recommends a video to other people.

Guillaume Chaslot is working to change this. A computer programmer and ex-YouTube insider, Chaslot previously worked on recommendations at YouTube and is now the founder of AlgoTransparency, a project devoted to bringing more transparency to how people find videos on YouTube.

[Read: TikTok’s learned from YouTube’s mistakes and should be taken seriously, says researcher]

Earlier this week, Chaslot hosted a TNW Answers session where he explained the importance of comparing algorithms, YouTube’s responsibility in recommending videos, and limiting the spread of conspiracy theories about the coronavirus.

YouTube’s recommended videos appear in the “Up next” list on the right of the screen and they’ll also play automatically when you’ve got autoplay enabled. According to Chaslot, these are the videos you should be wary of.

Last year, Chaslot told TNW: “It isn’t inherently awful that YouTube uses AI to recommend videos for you, because if the AI is well tuned it can help you get what you want. This would be amazing. But the problem is that the AI isn’t built to help you get what you want — it’s built to get you addicted to YouTube. Recommendations were designed to waste your time.”

It doesn’t take many clicks and searches to find yourself in a YouTube rabbit hole with a sense of being ‘algorithmically guided and manipulated.’ “On YouTube, you have this feeling of ‘zooming in’ on a particular topic,” Chaslot said. “But you could have algorithms that ‘zoom out’ and make you discover new things, and it’s pretty easy to implement — I did it at Google. It’s just that those algorithms are not as efficient with watch time.”

YouTube’s business model relies heavily on ads and ‘watch time’ to generate revenue, it’s as simple as that. Chaslot argued that YouTube doesn’t prioritize the user’s interests, saying: “[YouTube] tries to understand what’s best for the advertisers and pretend that it’s also best for the users.” By asking users what they really want from the platform, the user experience would improve, says Chaslot.

Recommended radicalization, misinformation, and borderline content

Chaslot argued that highlighting YouTube’s algorithm incentives (i.e. watch time doesn’t equal quality) helps expose its negative effect on our society.

If YouTube were filled with only funny cat videos, how it generates its “Up next” videos wouldn’t be a cause for concern. But as people rely on YouTube for information and news consumption, Chaslot worries recommendations will push people further toward extremes, whether they’re seeking it out or not.

This worry also applies to platforms like TikTok. “The problem is not user-generated content: Wikipedia is using it. The problem is when algorithms decide who gets amplified, and who doesn’t,” Chaslot said. “TikTok has many potential issues, especially with censorship. Think about this: our kids are slowly learning that they shouldn’t criticize the Chinese government. Not because they get threatened, but because when they do, their posts don’t get traction. Meanwhile the Chinese government is pushing the narrative that the coronavirus comes from the US.”

Platform or publisher?

Facebook, Twitter, and YouTube have long had a simple answer for anyone who disapproved of what their users were up to — they’re platforms, not publishers. They claim to be merely tools that serve free expression, not publishers who take responsibility for the content they distribute.

“The legislation that says that YouTube is a platform is called CDA 230 and was voted in 1996,” Chaslot said. “At that time, AI didn’t exist. Recommendations didn’t exist at the time. Nowadays, YouTube recommends some videos billions of times and takes no responsibility for it — that’s a loophole. If you promote something 1 billion times, you should be responsible as a publisher.”

Last year, YouTube announced it was taking a ‘tougher stance’ toward videos with extremist content, which included limiting recommendations and features like comments and the ability to share the video. According to the platform, this step reduced views of these videos by 80% on average.

“Thanks to these changes, there was little fake news about the coronavirus in English, which is a radical change. A few years ago, the YouTube algorithm was promoting anti-vax conspiracies by the hundreds of millions. But there are still many issues,” Chaslot said. “In France, which is in full confinement, one guy who said the virus was ‘man made’ got nearly a million views in a few days. He was probably promoted millions of times by the algorithm. This guy doubled the total number of views of his channel, and quadrupled his followers or something in that order. So lying on YouTube is still a profitable business.”

Currently, Chaslot believes platforms like Facebook, YouTube, and Google deploy AI technology to exploit our human weaknesses. However, he’s more optimistic about the future: “AIs will help people reach their full potential.”

You can read Guillaume Chaslot’s full TNW Answers session here.

