The Race to Stop Violent Videos With Deep Learning

Last week, a deeply disturbing video was streamed live on Facebook. It showed a Thai man murdering his eleven-month-old daughter before killing himself. All live. All on video. As soon as Facebook learned of the video, the company took it down, but by then thousands of people had already viewed it.

Social networks have since been scrambling to figure out how to control violent live videos. Although this recent video was not the first of its kind, it was one of the more shocking. Why some people post violent videos is unknown and unfathomable.

Regardless, companies like Facebook that try to present a PG image do not want videos like this one to be seen by anyone. The trick is figuring out how to ‘teach’ bots to find these videos before they go viral.

Enter Deep Learning

Deep learning is what most tech companies are currently working on (well, those that want to be the first to stop violent videos with technology). This branch of artificial intelligence can be traced back to the 1950s. It's complex, to say the least, but essentially deep learning mimics the way neurons interact in the brain by using layers of artificial neurons.

If it sounds like something out of Westworld, it kind of is. Images from various video files or clips are fed to a computer, and from those images the computer learns to recognize patterns. In the case of violent videos, those patterns might be blood, stabbing or hacking motions, or other violent scenes.
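To make that concrete, here is a minimal sketch, in Python with PyTorch, of how a computer might be trained to sort video frames into "violent" and "non-violent" categories. This is an illustration only, not any company's actual system; the folder layout and class labels are hypothetical.

```python
# Hypothetical sketch: train a small convolutional network to classify
# video frames as "violent" or "non-violent". Paths are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Frames are assumed to be sorted into data/frames/violent and
# data/frames/nonviolent (hypothetical directories).
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data/frames", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# A tiny stack of "artificial neurons": two convolutional layers followed
# by a linear layer that outputs a score for each of the two classes.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 56 * 56, 2),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few passes over the data, purely for illustration
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

A real system would be far larger and trained on millions of labeled frames, but the basic loop of showing images and nudging the network's weights is the same.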

How Tech Firms Teach Computers to Recognize Violence

To teach a computer what a violent act looks like, tech companies must first find suitable video footage to show it. Sometimes no such footage exists, and companies have to stage and film it themselves. There are a lot of hang-ups, though. One of the major hurdles is teaching a computer to spot something that a human would immediately recognize as violent but that has no obvious visual signature.
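Before getting to those hurdles, here is a rough illustration of how staged or collected footage might be broken into individual frames so humans can label them for training. This is a sketch, not any company's pipeline, and the file names are hypothetical.

```python
# Hypothetical sketch: sample frames from a video clip so they can be
# labeled (violent / non-violent) and used as training data.
import os
import cv2

def sample_frames(video_path, every_n_frames=30):
    """Yield every Nth frame of a video as an image array."""
    capture = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            yield frame
        index += 1
    capture.release()

# Save the sampled frames to disk for human labeling.
os.makedirs("frames", exist_ok=True)
for i, frame in enumerate(sample_frames("staged_footage.mp4")):  # placeholder file
    cv2.imwrite(os.path.join("frames", f"frame_{i:04d}.jpg"), frame)
```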

Psychological torture is one of those hurdles. A human can tell when someone is being tortured psychologically, but a computer cannot. There are other things a computer struggles to spot, such as a particularly chaotic and bloody scene: to a human it is obviously violent, but to a computer it just looks like visual noise.

The Real Challenge

The challenge in stopping violent videos from spreading across social media is that tech companies have to figure out how to teach a computer every kind of violent act. The other challenge is that people keep finding new ways to stage violence - ways deliberately designed to thwart a computer.

This means that tech companies have to constantly teach computers what to look for. That won't be easy, but the company that succeeds will gain a great deal of money and attention. So far, no tech company has cracked the code, but many are racing to do so.