Mozilla has launched a mini-site (in English) collecting 28 stories from YouTube users about the recommendation algorithm. Each recounts an incident in which YouTube's recommendation algorithm pushed disturbing videos — showing, for example, violence, racism, or content glorifying conspiracy theories — to users, adults and children alike, who would gladly have done without them.
YouTube: Mozilla wants to help Google improve video recommendation algorithm
"I started watching a boxing match, then street boxing matches, and eventually I landed on street-fighting videos, then accidents and urban violence… I came away with a horrible view of the world and feeling bad, without it being anything I had wanted," says one user.
Another user describes how the algorithm kept escalating: "I started looking for 'fail videos' where people fall or get hurt a little. YouTube then suggested a channel showing dashcam videos from cars. At first there were minor accidents, but later it evolved into cars exploding and falling from bridges — videos in which people clearly did not survive the accident. I felt a little sick, and I never searched for this type of content."
There is also the story of a professor scandalized by the number of conspiracy videos: "I was watching serious documentaries on Apollo 11. But YouTube's recommendation algorithm is now full of conspiracy-theory videos: September 11th, Hitler's escape, alien hunters, and anti-American propaganda." In all, the project site gathers 28 stories of YouTube surfacing LGBTphobic, racist, or sexual content.
70% of videos viewed on YouTube are served by the recommendation algorithm. But that algorithm is regularly criticized, as much for its addictive pull — one can spend hours watching recommended videos — as for its failures. The Mozilla mini-site is the showcase of a project called #YouTubeRegrets, whose goal is to find solutions.
"These stories show that the algorithm gives more weight to engagement — it shows content that keeps the user hooked, whether or not that content is dangerous," says Ashley Boyd, project manager at Mozilla. She adds: "We believe these testimonials fairly represent the larger problem with YouTube's algorithm: recommendations that can aggressively push bizarre or dangerous content. The fact that we cannot study these stories in greater depth — for lack of access to adequate data — reinforces the idea that the algorithm is opaque and out of control."
Mozilla's bet is that YouTube and Google can no longer perfect their algorithm by themselves: "We do not think this is a problem that can be solved internally. It is too serious and complex. YouTube must allow independent researchers to contribute to the search for a solution," says Ashley Boyd. The foundation would like to push the platform to open up its data to the research community and help researchers build simulation tools — in short, to give them more room to act rather than leaving them on the sidelines.
Read also: YouTube finally offers options to control video recommendations
What do you think? What is the strangest video recommendation YouTube has ever offered you? Share your feedback in the comments.
Source: The Next Web