When it comes to protecting users from violent videos, there is no perfect solution

(CNN)Almost 17 minutes of the massacre at a mosque in New Zealand was livestreamed on Facebook, purportedly showing victims being murdered by one of the shooters.

After New Zealand law enforcement alerted Facebook to the video, the company removed it, according to a Facebook spokesperson. The footage, however, soon appeared on YouTube under simple search terms -- where some estimate it remained for more than three hours before finally being removed.
For years, Facebook has been under scrutiny over privacy, censorship and the dissemination of fake news -- and now, too, over the role it plays in circulating graphic videos. To deal with these problems, progressive Democrats have suggested breaking up the tech company.
    Just last week, in a blog post, Sen. (and 2020 hopeful) Elizabeth Warren proposed using antitrust laws to break up companies like Amazon, Google and Facebook in order to, she believes, create competition in the tech marketplace.
    "More competition means more options for consumers and content creators, and more pressure on companies like Facebook to address the glaring problems with their businesses," wrote Warren.
    Warren also wrote that, if elected, her administration would push for legislation making "large tech platforms" like Amazon Marketplace and Google Search into highly regulated utilities.
    It's unclear what, if anything, Warren's proposal would do to prevent violent content from being shared on these platforms. In some ways, it might actually make the problem worse.
    Breaking up these companies, turning certain parts into utilities, or levying heavier restrictions would likely limit the money and resources companies like Facebook and YouTube currently throw at this problem.
    Daniel Castro -- vice president of the Information Technology and Innovation Foundation, a nonpartisan think tank in Washington -- told CNN Friday that breaking up Facebook would limit the resources and researchers the company could put toward vetting content. "A lot of this is new research and Facebook has one of the best research teams," Castro said, noting that Warren's legislation would leave "less concentration of top-tier researchers" to solve the problem.
    For example, over the last five years, Facebook has gone from spending $455 million on research and development to spending $2.8 billion.
    To detect things like graphic violence and pornography, companies like Facebook, YouTube and Twitter have to sort through vast amounts of data and images. For example, in order for a program to correctly identify an image of a rabbit, it needs to see a lot of images of rabbits. Doing that properly, multiplied across millions of different types of images, requires an immense amount of data, physical infrastructure and time.
    This gets exponentially harder when working in real time to immediately detect images contained in video, and becomes even more difficult when working with footage the system has never seen before.
    The New Zealand terror attack is especially tricky for these programs to identify as graphic violence because such events are so rare. (Not to mention that these videos vary widely in camera angle, image quality, the weapons shown, and so on.) In contrast, AI has continued to improve at detecting pornography, given its ubiquity online.
    (A point of contrast: During an interview with Fortune magazine, Facebook's chief technology officer said that the company's AI technology could not distinguish between marijuana and broccoli 100% of the time.)
    In 2017, 400 hours of video were uploaded to YouTube every minute. This quantity of data alone requires a huge amount of physical infrastructure to store, not to mention what's needed to comb through and analyze this data.
    The tech companies have learned that even with all the machine learning and artificial intelligence they've thrown at this problem, they still need humans. And that's expensive. According to The Verge, 15,000 content reviewers work to filter harmful content out of Facebook. Even at lower wages, that adds a cost burden that smaller companies may not be able to take on.
    All in all, big problems may require big companies.