Facebook’s rules allowed this to happen: at the time, third-party apps were permitted access to data about a Facebook user’s friends. The Mark Zuckerberg-run firm has since changed its policy to prevent developers from having such access.
Christopher Wylie, a former contractor at Cambridge Analytica, told The Guardian that the company had used the data to target American voters before Donald Trump won the 2016 election. He described Cambridge Analytica as a “full service propaganda machine”.
Cambridge Analytica has denied any wrongdoing and stated that its business practices are common among other companies. Kogan, for his part, insists he complied with the law at all times. He has also told CNN that he is willing to talk to the FBI and to testify before the US Congress about the work he performed for the company.
Facebook’s platform policy states that developers are not allowed to transfer any data they receive from Facebook (including anonymous or aggregate data).
The University of Cambridge made the following statement in a press release to Cambridge News:
We know that Dr Kogan founded his own company, Global Science Research (GSR), and that SCL/Cambridge Analytica was one of its clients. Cambridge academics often have their own business interests, but they must demonstrate to the university that such work is carried out in a personal capacity.
It is our understanding that the thisisyourdigitallife app was created by GSR. We have no reason, based on Dr Kogan’s assurances and the evidence we possess, to believe that he used the data and facilities of the university for his work with GSR.
Facebook’s share price plummeted on Wall Street a day after the Cambridge Analytica controversy hit. Could the incident have an impact on legitimate academic research, though?
The Implications
Social media data can be used to inform research in many fields, including psychology, technology, and business. Recent examples include:
Using Facebook to predict riots.
Examining the relationship between Facebook use and concerns about body image in teenage girls.
Investigating whether Facebook could reduce levels of stress responses.
Research also suggests that Facebook use can enhance or undermine psychosocial constructs related to well-being.
Research integrity matters greatly to researchers and to their employers. Whether or not the data was used for university purposes, a breach of trust by an academic reflects badly on the field. This has implications for the governance of research and for companies’ willingness to share data.
Mark Zuckerberg has yet to comment on the Cambridge Analytica scandal.
Universities, research organizations, and funders are responsible for the integrity of research. They uphold it by implementing strict, clear ethics procedures designed to protect study participants, including when social media data is used. Under commonly accepted research standards, harvesting users’ data without their permission is unethical.
Researchers who rely on social networks for their work, and who are routinely granted access to data from these sites for research purposes, could be significantly affected by the Cambridge Analytica scandal. Technology companies could become less willing to share their data with researchers. Facebook already closely guards its data, and researchers may find it even harder to gain access after what happened with Cambridge Analytica.
Data Analytics
Researchers are not the only ones who use profile information to better understand people’s behavioral patterns. Marketing organizations have profiled consumers since the 1970s: if they know their customers’ triggers, they can tailor their messages to increase sales. Digital marketing has made this easier. Users are tracked across the internet, their online activity is analyzed with data-analysis tools, and personalized recommendations are served to them. These methods are at the core of the business strategies of tech giants such as Amazon and Netflix, as sketched below.
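To give a flavor of the kind of analysis such recommendation engines rest on, here is a minimal, hypothetical sketch in Python using numpy. The ratings matrix, the item-based cosine-similarity approach, and the function names are illustrative assumptions, not a description of how Amazon or Netflix actually work.

```python
import numpy as np

# Rows are users, columns are items; 0 means "not rated" (entirely made-up data).
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

n_items = ratings.shape[1]
# Item-item similarity computed from the rating columns.
sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j])
                 for j in range(n_items)] for i in range(n_items)])

def recommend(user, top_n=1):
    """Rank the items a user has not rated by similarity-weighted scores."""
    seen = ratings[user] > 0
    scores = sim[:, seen] @ ratings[user, seen]  # weight similarities by the user's own ratings
    scores[seen] = -np.inf                       # never re-recommend items already rated
    ranked = np.argsort(scores)[::-1]
    return [i for i in ranked if np.isfinite(scores[i])][:top_n]

print(recommend(0))  # the unseen item most similar to what user 0 already liked
```

The point of the sketch is simply that a handful of observed preferences, combined with similarity measures over many users, is enough to generate tailored suggestions.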
Online behavior can predict mood, emotion, and personality. In my research on Intelligent Tutoring Systems, I use learners’ interactions with the software to profile personality types so that the system can automatically adapt its tutoring to someone’s preferred style. Machine learning can combine psychological theory with patterns discovered in data, such as Facebook “likes”, as in the sketch below.
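As a minimal sketch of that general technique, the following Python example fits a logistic regression that predicts a binary personality trait from which pages a user has “liked”. The data is synthetic, and the choice of scikit-learn, the trait, and the page weights are assumptions for illustration only; they do not reproduce any specific study or Cambridge Analytica’s models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "likes" matrix: rows are users, columns are pages (1 = liked).
n_users, n_pages = 500, 20
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Synthetic ground truth: pretend a handful of pages are weakly informative
# about a binary trait (say, high vs. low extraversion).
weights = np.zeros(n_pages)
weights[:3] = [1.5, -1.0, 0.8]
trait = (likes @ weights + rng.normal(0, 1, n_users)) > 0

X_train, X_test, y_train, y_test = train_test_split(likes, trait, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Per-user probabilities like these are what a personalization system
# could feed back into its choice of content or tutoring style.
print(model.predict_proba(X_test[:3]))
```

Even a model this simple illustrates how patterns in likes can be turned into individual-level predictions that then drive targeted content.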
Eli Pariser, CEO of viral content website Upworthy, has been arguing against personalization tools since 2011.