
Cambridge Analytica Part IV: Regulating the Regulators

Story 4 in a 4-part series: how the internet powers need to be watched, and why regulation will save us from ourselves.

Brian Young
Mar 26, 2018

Over the last three days, UCOMM has been looking at how Cambridge Analytica (CA) used a Russian scientist to mine data on 50 million Americans. CA then used this data to create emotional profiles and helped Trump target the ads that won him the election. UCOMM has also looked at how this data breach will affect the future of Facebook and how it will impact unions' ability to communicate online. Today, in the final piece of the series, we will look at the steps that Facebook and other social media platforms have taken to regulate the content and access that publishers have.

When Facebook first started, the site was a civil libertarian's dream. There was almost no censorship of content. You could say what you wanted and do what you wanted, and the only people policing you were your friends. As more social networks popped up, this gave rise to "trolls." Places like Twitter and Reddit became inundated with people who could be as disgusting as they wanted, with no regulation whatsoever. With no regulation, companies like CA were able to exploit loopholes in the system. It took almost a year before Facebook even realized what CA was doing, and by then it was too late: CA had collected data on over 50 million people.

Over the last few years, calls have grown for more regulation of these sites. Facebook came under fire after the launch of live video, when people began filming and broadcasting murders. Twitter came under fire during the 2016 election for trolls, many on the alt-right, using the site to harass, intimidate, and threaten prominent Jews. YouTube faced a backlash from advertisers after the Google ad network began delivering ads for major companies on controversial videos, like those promoting ISIS. In the aftermath of these scandals, the companies began regulating themselves internally. Just today, UCOMM discovered that our ads were being served on Breitbart because the site hides under the umbrella of "Google Anonymous." That leaves many questions unanswered about how ads are placed, why, and who is really watching.

In 2017, Facebook announced that it was hiring thousands of new people to become content moderators. The surge in hiring came after evidence came to light that Russians had used Facebook ads to deliver fake news stories to about 10 million people in an effort to elect Trump.

“Censorship of opinion is becoming a common practice of publishers of content. As publishers choose to censor certain opinions, they are wittingly or unwittingly polarizing their audiences,” says Ray Lopez, a freelance digital marketing expert and chief digital strategist for UCOMM. “However, they are finding that the more loyal their audience, the higher the yield on ad revenue. Viewers who make up these audiences are willing to pay for the publisher's content and sign up for a log-in. These loyal audiences generate the most dollars for a publisher because of the quality of the data they can use to target the user. However, this leaves a vacuum open for publishers or applications that will not censor and that offer an open forum to collect data from users of differing opinions.” What Lopez is basically saying is that greed trumps free speech, and in the end only the wealthy are benefiting from the system. #unrigthesystem

Even with an army of internet moderators, how effective will moderation be, and who are these people? Law enforcement? Academics? Video gamers? Ex-Walmart greeters? The job has often been called the worst in tech, with high turnover rates, and some moderators even report suffering from PTSD and permanent disability. Little training is given, and employees often have just a few seconds to decide whether content stays or goes. In addition to the immense stress of the job, moderators must make those split-second decisions on important content. For example, is a video of a person getting shot by a cop exposing police brutality, or is it glorifying a violent act? While philosophers and civil libertarians could spend hours debating this, Facebook moderators must decide in a matter of seconds. There is no agreed-upon right approach; there is just the fact that this is the current system, and this is how content gets moderated.

For many, this censorship is worrisome. With only internal regulation, content moderators can begin censoring content based on personal views — imagine a Mark Janus type monitoring UCOMM's videos. With moderators handling hundreds of complaints per shift, it would be easy for this censorship to expand past inappropriate content to include content critical of Google or Facebook, or even political content. UCOMM has already dealt with some political censorship, and we have spoken about the process on radio shows.

It is also impossible to say that Facebook, Twitter, or Google will never have another data breach or never have their ad networks exploited again. With over a billion daily users, Facebook simply can't catch everything. Instead, the answer may be more government regulation. That may mean a slower process, but if it serves justice, then having more eyes on the content may be best for all.

Already, US senators are talking about hearings, and the British Parliament is planning to do the same. The Senate is also demanding that the companies not just send a lobbyist but instead send their CEOs. Senators Amy Klobuchar, John McCain, and Mark Warner have introduced a bill, the Honest Ads Act, which would hold Facebook ads to the same standards that govern TV and radio ads, including disclosure of who is paying for the ad.

Rather than censoring content, these simple fixes could be the answer to preventing future data breaches and meddling by foreign governments. The web has now been open to the public for roughly 25 years, and it may be time for the laws to catch up. It took our government decades to create antitrust laws against the oil and railroad companies. Trump won't lead that charge because he can't govern, but maybe a new House of Representatives, and a competent president in 2020, will help America save itself from itself.

Kris LaGrange directed this series and contributed to this piece.

