 

Purveyor

(29,876 posts)
Wed Jan 28, 2015, 01:57 PM Jan 2015

Google: YouTube Is So Overloaded Staff Cannot Filter Content

BRUSSELS (AP) -- Internet giant Google said Wednesday that its video-sharing website YouTube is so inundated that staff cannot filter all terror-related content, complicating the struggle to halt the publication of terrorist propaganda and hostage videos.

Google Public Policy Manager Verity Harding said that about 300 hours of video material is being uploaded to YouTube every minute, making it virtually impossible for the company to filter all images.

Harding spoke at a European Parliament meeting of the ALDE liberal group on a counter-terrorism action plan.

She said that "to pre-screen those videos before they are uploaded would be like screening a phone call before it's made."

The European Union's counter-terror chief believes it's time to help companies contain the security risk by having experts from member states flag terror-related content.

more...

http://hosted.ap.org/dynamic/stories/E/EU_EUROPE_INTERNET_TERROR?SITE=AP&SECTION=HOME&TEMPLATE=DEFAULT&CTIME=2015-01-28-12-35-57
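For a sense of why "virtually impossible" is not hyperbole, here is the back-of-the-envelope arithmetic behind that 300-hours-per-minute figure; the reviewer count is a naive illustration, not a number from Google.

```python
# Scale of the problem, taken from the 300-hours-per-minute figure in the
# article. The staffing estimate is a crude lower bound for illustration.
UPLOAD_HOURS_PER_MINUTE = 300

footage_minutes_per_minute = UPLOAD_HOURS_PER_MINUTE * 60        # 18,000
realtime_reviewers_needed = footage_minutes_per_minute           # 1 reviewer watches 1 min/min
hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24       # 432,000

print(f"{footage_minutes_per_minute:,} minutes of footage arrive every minute")
print(f"~{realtime_reviewers_needed:,} people watching non-stop just to keep pace")
print(f"{hours_uploaded_per_day:,} hours of new video every day")
```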

9 replies
Google: YouTube Is So Overloaded Staff Cannot Filter Content (Original Post) Purveyor Jan 2015 OP
This is why they badly need efficient content algorithms to weed out repeat offending videos. chrisa Jan 2015 #1
I'm not 100% certain, but it would shock me if Gootube isn't already doing that.. tridim Jan 2015 #2
I think they are. You're right. chrisa Jan 2015 #5
The problem is that the technology is very slow. Xithras Jan 2015 #8
How about a DU-style jury system? (nt) Nye Bevan Jan 2015 #3
Ha ha. That's a good one! Comrade Grumpy Jan 2015 #4
A jury voted 4-3 to LEAVE IT. KamaAina Jan 2015 #6
yet they almost immediately pull down vids with copyrighted music wyldwolf Jan 2015 #7
that is easier to do with software.. snooper2 Jan 2015 #9

chrisa

(4,524 posts)
1. This is why they badly need efficient content algorithms to weed out repeat offending videos.
Wed Jan 28, 2015, 02:01 PM
Jan 2015

The technology is still young, but it's been used to try to track child porn videos and images. I think that, in the future, it will be possible to auto-delete videos based on their content. It's very high-level stuff right now, but the power of computers is growing exponentially.

For example, an algorithm that detects nudity could automatically delete offending videos from YouTube.
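Very roughly, such a filter would score each frame and act once the score crosses a threshold. Below is a minimal Python sketch of that idea using a crude skin-tone heuristic; the rule, threshold, and file path are illustrative assumptions, not anything YouTube actually runs, and the example mostly shows why naive rules misfire on beach footage and portraits.

```python
# Toy per-frame "nudity" heuristic: flag a frame when too many pixels fall
# in a rough RGB skin-tone range. Purely illustrative; real classifiers are
# far more sophisticated, and this rule fires on beaches and close-up faces.
from PIL import Image

def skin_pixel_ratio(frame_path: str) -> float:
    """Fraction of pixels that land in a crude RGB 'skin tone' range."""
    img = Image.open(frame_path).convert("RGB")
    pixels = list(img.getdata())

    def looks_like_skin(rgb):
        r, g, b = rgb
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

    return sum(1 for p in pixels if looks_like_skin(p)) / len(pixels)

def should_flag(frame_path: str, threshold: float = 0.4) -> bool:
    # The cutoff is arbitrary: lower it and innocent videos get deleted,
    # raise it and offending ones slip through.
    return skin_pixel_ratio(frame_path) > threshold
```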

tridim

(45,358 posts)
2. I'm not 100% certain, but it would shock me if Gootube isn't already doing that..
Wed Jan 28, 2015, 02:07 PM
Jan 2015

At least to pre-flag potential videos for further scrutiny.

We've been able to 'search by image' for a while now, same tech.
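The 'search by image' comparison is apt: both rest on perceptual hashes that change only slightly when the picture changes slightly, so a new upload can be pre-flagged when one of its frames lands near a known-bad hash. Here is a minimal sketch of one such hash (an 8x8 average hash); the distance cutoff and names in the usage comment are illustrative guesses, not Google's pipeline.

```python
# Perceptual "average hash": shrink a frame to 8x8 grayscale and set one bit
# per pixel that is brighter than the mean. Visually similar images produce
# hashes that differ in only a few bits.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical usage: a distance under ~10 of the 64 bits usually means
# "near duplicate," which could queue the frame for a human reviewer.
# if hamming_distance(average_hash("upload_frame.png"), known_bad_hash) < 10:
#     send_to_review_queue(...)
```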

chrisa

(4,524 posts)
5. I think they are. You're right.
Wed Jan 28, 2015, 02:52 PM
Jan 2015

In the future, I envision something that could analyze every frame of every video uploaded to YouTube and reject it if needed. That would be pretty insane.

Xithras

(16,191 posts)
8. The problem is that the technology is very slow.
Wed Jan 28, 2015, 03:16 PM
Jan 2015

First, Google does have a system in place to filter repeat offenders, but it's imperfect. When a video or audio file is uploaded, YouTube creates a fingerprint for the file and compares it against a list of banned fingerprints. If someone tries to upload a file that has previously been banned, it can automatically be blocked. The problem? You only need to adjust the video a bit to change its fingerprint and dodge the ban. Change the audio track, trim a few seconds off the video, and add in a subtitle or credit, and you've now altered the video enough to get it back into the system.
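To make that dodge concrete, here is a bare-bones sketch of an exact-match blocklist, using a cryptographic digest as the "fingerprint." YouTube's actual Content ID relies on far more robust perceptual and audio fingerprints, but the evasion works the same way: any edit that changes the underlying data pushes the fingerprint off the banned list. The function names and the choice of SHA-256 are assumptions for illustration.

```python
# Naive blocklist: fingerprint = SHA-256 of the raw file bytes. Trimming a
# few seconds, swapping the audio track, or burning in a subtitle changes
# the bytes, so the re-upload no longer matches any banned fingerprint.
import hashlib

def fingerprint(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_banned(path: str, banned_fingerprints: set[str]) -> bool:
    return fingerprint(path) in banned_fingerprints
```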

Automatic recognition of illegal videos works very poorly, and is generally not used by any major site. In the US, for example, the difference between a legal video of someone's daughter playing by the pool in a swimsuit and an illegal child porn video of that same child in the same swimsuit by the same pool is simply her position and demeanor. Computers are nowhere near having the capability to comprehend that kind of nuance. Similarly, with these ISIS videos, there's no way a computer can tell the difference between footage of a Nevada hunter walking across the desert with a rifle looking for an elk to shoot and footage of an ISIS jihadi walking across the desert with a rifle looking for an Iraqi soldier to shoot. Or the difference between a video of some terrorist blowing up a building with innocent people inside and a video shot by a U.S. soldier in Iraq showing his own combat experiences.

Until computers figure out context and nuance, filtering these kinds of videos will largely remain a manual job.

 

snooper2

(30,151 posts)
9. that is easier to do with software..
Wed Jan 28, 2015, 03:16 PM
Jan 2015

match 15 seconds of a Katy Perry song versus some random wacked-out fundie screaming "Death to the West, whack her head off!"
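That asymmetry is the whole story: for a copyrighted song the reference recording is already in hand, so the job is signal matching rather than understanding. Here is a rough sketch of the idea with a sliding correlation over raw audio samples; the file names, the mono 16-bit WAV assumption, and the scoring are illustrative, and production systems like Content ID use compact fingerprints rather than brute-force correlation.

```python
# Find where (if anywhere) a known 15-second clip occurs inside an upload's
# audio track by sliding the clip across the track and scoring the overlap.
import wave
import numpy as np

def load_mono_pcm16(path: str) -> np.ndarray:
    """Read a mono, 16-bit PCM WAV file into a float array (assumed format)."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return np.frombuffer(frames, dtype=np.int16).astype(np.float32)

def best_match(track: np.ndarray, clip: np.ndarray) -> tuple[int, float]:
    """Return (sample offset, rough similarity score) of the best alignment."""
    clip = (clip - clip.mean()) / (clip.std() + 1e-9)
    scores = np.correlate(track, clip, mode="valid")
    offset = int(np.argmax(scores))
    window = track[offset:offset + len(clip)]
    return offset, float(scores[offset] / (np.linalg.norm(window) + 1e-9))

# Hypothetical usage:
# track = load_mono_pcm16("uploaded_video_audio.wav")
# clip = load_mono_pcm16("katy_perry_reference_15s.wav")
# offset, score = best_match(track, clip)
# A high score at some offset means the copyrighted clip is present; no such
# reference signal exists for judging what a person in the video means.
```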

