Cambridge Analytica Part III: What this means for the Union Movement
In Story 3 of this 4-part series, UCOMM looks at how Facebook's response will hurt labor speech and why that should concern you.
From 2014 to 2016, Cambridge Analytica teamed up with a Russian psychologist and intelligence officials to mine Facebook data and build psychological profiles of many Americans, profiles they then used to help get Trump elected.
In Part 3 of our 4-part series, we will look at how this could change the way Facebook and other social media websites manage the advertising end of their business and what this means for the union movement. Quite possibly, messaging in federal elections will never be the same again, exposing a great weakness in our fragile democratic voting system.
When the data collection happened, Facebook's privacy settings were still limited. CA even argues that what it did was completely legal and that Facebook allowed this access. Facebook, of course, to save face, is calling it a data breach and says it informed the company in 2015 that it believed the data had been obtained illegally. The legal questions, which are sure to be sorted out through Congressional hearings and possibly the Mueller investigation, are less relevant. The damage was done: Trump is in Washington and our nation is in trouble. The key issue, which should not be taken lightly, is that Facebook realizes it screwed up.
Changes have already been made to prevent something like this from happening again. Once Facebook found out what CA was doing, it began changing the access permissions granted to third-party apps. Apps can no longer pull information from your list of friends and are much more restricted in what data they can collect without a user handing it over. Facebook has also banned the app and its main players from the platform. It is too little, too late, but the change in how business is done is well deserved. In fact, the weekend after the CA story broke, Facebook sent out a network-wide poll asking whether users thought Facebook was good for the world.
Combined with the accusations that Facebook allowed the Russians to use its platform to meddle in the 2016 presidential election, the company is taking a deeper look at the content going up on its site. Here at UCOMM, we have already noticed this. Ads now take significantly longer to be approved as Facebook gives the content a second look. Again, too little, too late. UCOMM's current ads, one promoting teachers sticking with their union in the wake of a negative Janus decision by SCOTUS, another backing Suffolk AME's lobbying at the state capitol for more funding to protect abused and neglected children, should not feel the negative wrath of Trump fearmongering; but it's happening, and right now the only thing we can do is write about it and hope that readers understand the real issue here.
With increased scrutiny on advertising methods, a change could also come to how companies, politicians, and unions micro-target. In the push to protect people's personal information, Facebook could remove some of the market data that advertisers have relied on. Could we see a world where cookies, the little bits of information advertisers use to see what sites you visit, are no longer used to target ads? It is a very real possibility, especially on Facebook. Individuals can clear cookies in their browser at any time, but most don't, because honest advertising practices largely serve the needs of the online consumer and activist. UCOMM does not spam; we inform, educate, and motivate unionists to take action online. That's a good thing, and what Facebook could do is take it away from us. It's not fair.
This increased scrutiny can also lead to more censorship. Facebook is now extremely worried about the spread of fake news on its site after the Russians used fake news stories to help get Trump elected.
UCOMM has experienced unjust censorship by Google, and we even went on a few radio shows to discuss this issue. It’s frustrating when your legit content is not approved by a faceless, hipster bureaucrat.
Last year Facebook began giving users the ability to mark news as “fake” and report it to be removed. This should be a concern for everyone on Facebook as it lends itself to more censorship of political and union materials. Social media platforms were once the great equalizer for democracy that allowed everyone to become a citizen journalist, but now that free speech platform is set to be severely restricted.
Following these revelations, Facebook founder Mark Zuckerberg released this statement:
I want to share an update on the Cambridge Analytica situation -- including the steps we've already taken and our next steps to address this important issue.
We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it.
It is clear that in the weeks and months to come more changes will be made, changes that could vastly alter the way Facebook works. But with new regulation comes the question: who regulates the regulators?
Check back next week for Part IV: Who is Regulating the Regulators.
Kris LaGrange directed this series and contributed to this piece.
Comment on this story on our Instagram page now.