General Discussion
Google: YouTube Is So Overloaded Staff Cannot Filter Content
BRUSSELS (AP) -- Internet giant Google said Wednesday that its video-sharing website YouTube is so inundated that staff cannot filter all terror-related content, complicating the struggle to halt the publication of terrorist propaganda and hostage videos.
Google Public Policy Manager Verity Harding said that about 300 hours of video are uploaded to YouTube every minute, making it virtually impossible for the company to filter all of the footage.
Harding spoke at a European Parliament meeting of the ALDE liberal group on a counter-terrorism action plan.
She said that "to pre-screen those videos before they are uploaded would be like screening a phone call before it's made."
The European Union's counter-terror chief believes it's time to help companies contain the security risk by having experts from member states flag terror-related content.
more...
http://hosted.ap.org/dynamic/stories/E/EU_EUROPE_INTERNET_TERROR?SITE=AP&SECTION=HOME&TEMPLATE=DEFAULT&CTIME=2015-01-28-12-35-57
chrisa
(4,524 posts)
The technology is still young, but it has already been used to try to track child-porn videos and images. I think that, in the future, it will be possible to auto-delete videos based on their content. It's very high-level stuff right now, but the power of computers is growing exponentially.
For example, an algorithm that detects nudity could automatically flag and delete offending videos from YouTube.
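A crude illustration of that idea, using the classic skin-tone-fraction heuristic (a toy stand-in, not anything YouTube actually runs; real nudity classifiers are learned models, and the thresholds here are illustrative assumptions):

```python
def skin_fraction(pixels):
    """Fraction of pixels matching a fixed RGB skin-tone rule.
    A naive heuristic for illustration only; modern classifiers
    are trained neural networks, not hand-set thresholds."""
    def is_skin(r, g, b):
        # A commonly cited rule-of-thumb for skin tones in RGB space.
        return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15
    hits = sum(is_skin(*p) for p in pixels)
    return hits / len(pixels)

# A mostly skin-toned frame trips the (hypothetical) flagging threshold.
frame = [(200, 120, 90)] * 8 + [(20, 20, 20)] * 2
flagged = skin_fraction(frame) > 0.5  # True for this frame
```

The obvious weakness, and part of why auto-deletion is hard, is that beach videos and medical content trip the same heuristic.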
tridim
(45,358 posts)
At least to pre-flag suspect videos for further scrutiny.
We've had 'search by image' for a while now; it's the same underlying technology.
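Search-by-image systems are generally built on perceptual hashing, which tolerates small changes to an image. A minimal sketch of one such scheme, an average hash over a tiny grayscale grid standing in for a downscaled frame (illustrative only; Google's actual matching is proprietary):

```python
def average_hash(pixels):
    """Perceptual 'average hash': one bit per pixel, set if the pixel
    is brighter than the image's mean. Small re-encoding noise rarely
    flips bits, so near-duplicates hash almost identically."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits; small distance means 'probably the same image'."""
    return sum(x != y for x, y in zip(a, b))

img       = [[10, 200], [30, 220]]   # original (2x2 grayscale toy image)
near_dup  = [[12, 198], [28, 225]]   # slightly re-encoded copy
different = [[200, 10], [220, 30]]   # unrelated image

# The near-duplicate matches exactly; the unrelated image is far away.
d_dup  = hamming(average_hash(img), average_hash(near_dup))    # 0
d_diff = hamming(average_hash(img), average_hash(different))   # 4
```

Real systems hash 64-bit grids of full video frames, but the match-by-distance idea is the same.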
chrisa
(4,524 posts)
In the future, I envision a system able to analyze every frame of every video uploaded to YouTube and reject it if necessary. That would be pretty insane.
Xithras
(16,191 posts)
First, Google does have a system in place to filter repeat offenders, but it's imperfect. When a video or audio file is uploaded, YouTube creates a fingerprint for the file and compares it against a list of banned fingerprints. If someone tries to upload a file that has previously been banned, it can be blocked automatically. The problem? You only need to adjust the video a bit to change its fingerprint and dodge the ban. Change the audio track, trim a few seconds off the video, add a subtitle or a credit, and you've altered the video enough to get it back into the system.
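The evasion described above is easy to see with a toy exact-hash fingerprint (illustrative only; YouTube's real matching system is proprietary and more robust than a plain hash, but the trim-a-few-seconds weakness is the same idea):

```python
import hashlib

def fingerprint(frames):
    """Toy fingerprint: SHA-256 over the concatenated frame data.
    Any change to the content yields a completely different digest."""
    h = hashlib.sha256()
    for frame in frames:
        h.update(frame)
    return h.hexdigest()

# Blocklist of previously banned uploads.
banned = {fingerprint([b"frame1", b"frame2", b"frame3"])}

# Re-uploading the identical file is caught...
caught = fingerprint([b"frame1", b"frame2", b"frame3"]) in banned   # True

# ...but trimming a single frame produces a new fingerprint that
# sails past the blocklist.
dodged = fingerprint([b"frame2", b"frame3"]) not in banned          # True
```

This is why production systems lean on perceptual fingerprints rather than exact hashes, and why even those can be defeated by aggressive edits.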
Automatic recognition of illegal videos works very poorly and is generally not used by any major site. In the US, for example, the difference between a legal video of someone's daughter playing by the pool in a swimsuit and an illegal child-porn video of that same child, in the same swimsuit, by the same pool is simply her position and demeanor. Computers are nowhere near able to comprehend that kind of nuance. Similarly, with these ISIS videos, there's no way a computer can tell the difference between footage of a Nevada hunter walking across the desert with a rifle looking for an elk to shoot and footage of an ISIS jihadi walking across the desert with a rifle looking for an Iraqi soldier to shoot. Or between a video of a terrorist blowing up a building with innocent people inside and a video shot by a U.S. soldier in Iraq documenting his own combat experiences.
Until computers figure out context and nuance, filtering these kinds of videos will largely remain a manual job.
snooper2
(30,151 posts)
There's a big difference between matching 15 seconds of a Katy Perry song and matching some random wacked-out fundie screaming "Death to the West, whack her head off!"