Like many who belong to the older generation, I am not technologically savvy. I do own an iPad and use it for email, browsing, online shopping, and digital subscriptions to newspapers. But I much prefer media the old-fashioned way — CBS for nightly news and “60 Minutes,” and a printed daily newspaper in my hands.
I know very little about social media and understand even less. I don’t know the difference between Twitter, Snapchat, and Instagram, and have never been on Facebook. However, I have been watching the news about the role Facebook played in the 2016 presidential election and 2014 North Carolina Senate race.
Reports that a political firm hired by the Trump campaign and the North Carolina Republican Party acquired access to private data on millions of Facebook users have sparked questions about how social media protects user information. The biggest issue seems to be the use of data mining, and the fact that Facebook did nothing to stop the private information of its users from being “mined.”
Data mining is the automated searching of large stores of data to discover trends. Sophisticated computer programs divide the data into segments and evaluate the data in each segment. Data mining requires large databases, which is why Facebook, with nearly 2 billion active users, is a big target for data miners. The mined data is then automatically analyzed to discover patterns. Those patterns are used to predict likely outcomes and to create information that can be used to affect those outcomes.
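For readers curious what that actually looks like, here is a minimal sketch in Python. The user records, segments, and search terms are all invented for illustration; a real data-mining operation works on databases with millions of rows and far more sophisticated statistics, but the three steps are the same: segment the data, find a pattern in each segment, and use the pattern to predict what to show someone next.

```python
from collections import Counter, defaultdict

# Hypothetical user records; a real data-mining job would pull millions of
# rows from a database rather than a short hard-coded list.
users = [
    {"age": 67, "searches": ["sedan reviews", "car safety ratings"]},
    {"age": 34, "searches": ["suv deals", "car loan rates"]},
    {"age": 71, "searches": ["sedan reviews", "retirement travel"]},
    {"age": 29, "searches": ["suv deals", "apartment rentals"]},
]

# Step 1: divide the data into segments (here, simple age brackets).
segments = defaultdict(list)
for user in users:
    bracket = "65+" if user["age"] >= 65 else "under 65"
    segments[bracket].append(user)

# Step 2: evaluate each segment to discover a pattern
# (here, the most common search term in the segment).
patterns = {}
for bracket, members in segments.items():
    counts = Counter(term for m in members for term in m["searches"])
    patterns[bracket] = counts.most_common(1)[0][0]

# Step 3: use the pattern to predict a likely outcome and act on it,
# e.g. pick which targeted ad a new user in that segment should see.
print(patterns)  # {'65+': 'sedan reviews', 'under 65': 'suv deals'}
```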
For example, if I decide to buy a new car and do online research about cars, my search history could be mined along with that of everyone else researching car purchases. Then, based on the analysis of the information gathered by the data miners, I would begin receiving targeted information such as car ads, reviews, and links to articles designed to influence my decision.
But data miners would not stop with my search history. They would mine other personal data about me, such as financial information, driving records, and credit history to try to influence my buying decision.
That doesn’t sound like a bad thing, but what if I began receiving false information from fake sources? What if I began receiving email notices from a source claiming to be Consumer Reports that the car I planned to buy was unreliable and unsafe for small children? If I believed the reports were real, it would change my decision about the car.
That is in essence what happened during the 2016 presidential campaign and election. A political data company named Cambridge Analytica mined the private information of about 80 million American Facebook users to gather data about voters and how to influence their choice of candidate. Their analysis led to a campaign targeting segments of the voting population in an effort to manipulate those voters.
Cambridge Analytica was largely funded by Robert Mercer, a wealthy Republican donor. Stephen K. Bannon, a former adviser to the president, was an early board member and gave the firm its name.
Aleksandr Kogan, a Russian-American psychology professor, built his own app and began harvesting data from Facebook for Cambridge Analytica in June 2014. The data included details on users’ identities, friend networks, and “likes.” The idea was to map personality traits, and use that information to target audiences with digital ads.
Users were asked to take a personality survey and download an app, which gathered private information from their profiles and those of their friends, activity that Facebook permitted at the time and has since banned. About 270,000 users took the survey, all of whom were told the data would be used only for academic purposes. But their consent opened a gateway to the information of their friends and family being harvested as well, which led to the data of about 50 million other Facebook users being mined in 2014. Facebook said no “sensitive pieces of information” such as financial passwords were taken, though information about a user’s identity and location was available.
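To see how 270,000 survey takers could expose tens of millions of people, here is a hedged sketch of that friend-network fan-out. The names, profiles, and friend lists below are made up, and the code does not use Facebook’s actual interface; it only illustrates how each consenting user’s friend list multiplies the reach of the harvest.

```python
# Hypothetical profile store: user id -> profile details and friend list.
# These names and this structure are invented for illustration only.
profiles = {
    "alice": {"likes": ["gardening"], "friends": ["bob", "carol", "dave"]},
    "bob":   {"likes": ["fishing"],   "friends": ["alice"]},
    "carol": {"likes": ["politics"],  "friends": ["alice", "dave"]},
    "dave":  {"likes": ["movies"],    "friends": ["alice", "carol"]},
    "erin":  {"likes": ["cooking"],   "friends": []},
}

def harvest(survey_takers):
    """Collect data for each consenting survey taker AND their friends."""
    collected = {}
    for user in survey_takers:
        collected[user] = profiles[user]["likes"]
        for friend in profiles[user]["friends"]:
            # The friends never took the survey, yet their data is swept up too.
            collected[friend] = profiles[friend]["likes"]
    return collected

# Only one person consents, but four people's data ends up harvested.
print(harvest(["alice"]))
```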
Facebook routinely allows researchers to have access to user data for academic purposes — and users consent to this access when they create a Facebook account. But Facebook prohibits this kind of data from being sold or transferred “to any ad network, data broker or other advertising or monetization-related service.” It says that was exactly what Dr. Kogan did by providing the information to a political consulting firm. He and the company are now banned from the site.
After first denying that they had mined data from Facebook, Cambridge Analytica officials have now admitted they did.
It has not been revealed how the data mined by Cambridge Analytica was shared with the Russian government, although various American intelligence agencies have said the data was used to create Russian propaganda that targeted voters with false information in an attempt to influence the presidential election. Evidence has also shown that Russians and other foreign citizens were in the U.S. working on various campaigns in the 2014 and 2016 elections, including that of N.C. Sen. Thom Tillis.
After learning about data mining, I no longer feel left out because I don’t “do Facebook.” That is one less thing I have to worry about. But I am slowly wading into technology: I recently acquired my first smartphone. I needed something for all those pictures of my granddaughter.
