YouTube will put disclaimers on state-funded broadcasts to fight propaganda

YouTube's latest strategy to fight the spread of misinformation involves putting a disclaimer on videos from certain news sources. The online video website announced it will start labeling videos posted by state-funded broadcasters to alert viewers that the content is funded, at least in part, by a government source. YouTube will begin labeling videos today, and the policy extends to outlets including the US's Public Broadcasting Service (PBS) and the Russian government broadcaster RT.

According to a report by The Wall Street Journal, PBS videos will now have the label "publicly funded American broadcaster," while RT will have this disclaimer: "RT is funded in whole or in part by the Russian government."

The new policy is YouTube's way of informing viewers about where the content they're watching comes from, information that is often obscured or that viewers never think to look up themselves. "The principle here is to provide more information to our users, and let our users make the judgment themselves, as opposed to us being in the business of providing any sort of editorial judgment on any of these things ourselves," YouTube Chief Product Officer Neal Mohan told the WSJ.

While providing more information about the sources of the news viewers watch on YouTube is helpful, Mohan's sentiment is at odds with another strategy currently in development: YouTube is reportedly considering surfacing "relevant videos from credible news sources" when conspiracy theory videos pop up about a specific topic. For now, YouTube will reserve editorial judgment, at least until it starts deciding which news sources count as credible on its platform. However, that strategy is still in the early stages of development, so we don't know whether it will become a reality anytime soon.

YouTube's decision to label all state-funded news videos comes after heavy criticism from the US government and others about big tech companies' involvement in the spread of misinformation. Facebook, Google, and others have had to answer questions about how Russian actors were able to easily spread misinformation regarding the 2016 election to millions of Americans.

The new policy also comes after YouTube has dealt with a number of controversies surrounding inappropriate content on its website. In just the past year, YouTube went through an "ad-pocalypse" after advertisers found out their ads were running over extremist videos; it had to address public outcry over distorted and inappropriate children's content on the site (some of which misused popular children's characters or involved the potential abuse of children themselves); and it had to set up new rules to police its biggest creators after Logan Paul uploaded a video featuring the dead body of a suicide victim.

In short, it was only a matter of time before news organizations on YouTube would have to deal with new rules made specifically for them. The new labeling policy will be helpful for some YouTube viewers as it will shed a bit more light on their favored news sources. It will also show Congress that YouTube is, at the very least, trying to inform its audience of possible misinformation and propaganda coming from government-backed sources.

But general conspiracy-theory videos are as big an issue on YouTube as government propaganda videos. The company has been tweaking its algorithm ever since conspiracy-theory videos about last year's Las Vegas shooting flooded search results immediately after the incident. However, most of the reported algorithm changes center on promoting more reputable sources rather than downgrading or hiding misleading ones.

YouTube is reportedly still working on changing its algorithm to serve more mainstream news results in news-related searches. But it's unlikely that algorithm tweaks alone will keep conspiracy theory videos from racking up millions of views as long as those misleading videos continue to pop up in viewers' "recommended" sections.

Until now, YouTube's algorithm for serving up content has never prioritized truthfulness; it has always been geared toward delivering the videos viewers are most likely to click on next. It's unclear (and likely will be for quite some time) whether the new changes will successfully steer users away from sensationalized and inaccurate conspiracy videos.

Source: Ars Technica
