News
YouTube's algorithm may be to blame for the platform's recent deletion of crypto-related content, but content creators are left to speculate.
A study released by the Center for Countering Digital Hate confirms that YouTube's algorithms consistently push eating disorder and self-harm content to teen girls.
YouTube is a quasi-public space containing all kinds of videos, from music clips, TV shows, and films to vernacular genres such as "how-to" tutorials, parodies, and compilations.
YouTube is more likely to serve problematic videos than useful ones, study (and common sense) finds. Shouldn't your next recommended YouTube video be a beauty tutorial, not a conspiracy theory?
YouTube's algorithm pushes right-wing, explicit videos regardless of user interest or age, study finds. For years, questions have been swirling around how YouTube's algorithm works.
New research indicates most users don't fall down dangerous YouTube rabbit holes—but the site can further strengthen hateful echo chambers.
With the launch of YouTube in 2005 and Netflix streaming in 2007, algorithms, not people, began directing our viewing. These recommendations can be fun.
YouTube's algorithm still amplifies violent videos, hateful content, and misinformation despite the company's efforts to limit the reach of such content, according to a study published this week.