
Social Media in Higher Education

Chris Rowell

Part Three. Teaching and learning

11. Cambridge Analytica, Facebook, and Understanding Social Media Beyond the Screen

Zoetanya Sujon


#Beginner #Facebook #WhatsApp #CambridgeAnalytica

Introduction

One of the challenges many HE professionals face in classrooms oriented around learning about social media is that of teaching apparently digitally-savvy students who feel their intense familiarity with social media is the same as critical understanding. While some may of course be critically minded, or even possess a sophisticated understanding of algorithms, privacy, and the complex structures of social media, many do not. As such, guiding learners to move beyond their experience of the newsfeed, stream, or front page can be tremendously challenging as well as tremendously rewarding.

This chapter examines one approach for dealing with this sometimes difficult teaching context, providing a broad overview of the growing importance of critical perspectives on social media. Beginning with an outline of the rich variety of student experiences, this chapter contextualises some of the learning challenges I have encountered in my own classrooms while teaching social media; challenges which require an open classroom and a critical view of the idea of ‘digital natives’.

This chapter also presents Facebook, particularly its interactions with Cambridge Analytica, as an ideal case for tackling the complexities of social media and pushing users beyond the social experience. The aim of this section is to examine the importance of personal data as the core business model of Facebook, and most mainstream or corporate social media.

Finally, this chapter includes three key exercises that can be used in classrooms to help learners to understand how Facebook works, and some of what the Cambridge Analytica case reveals about social media. In sum, the purpose of this chapter is to examine some of the best ways to bring critical thinking into the experience of social media, providing a mix of theory and practical tasks which can help learners understand concepts of personal data collection and ‘surveillance capitalism’ in relation to their own social media use. The Cambridge Analytica case is particularly important and effective for engaging learners’ critical understanding of social media and moving their perspective beyond the screen.

Students, Screens, and Social Media

Many HE professionals regard young adults as ‘digital natives’ who come into the classroom with an innate understanding of new technologies and digital skills (e.g. Prensky 2001, 2012). Many also draw on ideas from the ‘Visitor and Resident’ model in order to understand the skills young people develop as a consequence of their engagement with social and digital tools (e.g. White and Le Cornu 2011). Based on what I see in my classrooms, many young people do bring an intense familiarity with social media based on everyday use, yet this familiarity is uneven across each cohort of students. Many among them have limited experience and understanding, perhaps as a result of a personal or parental rejection of social media in their lives.

This uneven familiarity and understanding is important for two reasons. First, as educators, it is vital to challenge assumptions about young people as ‘digital natives’, and instead establish a classroom environment that is as open for those with relatively little knowledge as it is for those who are very experienced. In this sense, understanding use based on levels of experience and critical understanding, rather than on levels of expertise, is an important starting point. Growing evidence suggests that the concept of ‘digital natives’ overstates many young people’s digital knowledge and skills, so an open classroom can make all the difference for those coming from the more unfamiliar end of the spectrum (e.g. Das and Beckett 2009; Helsper and Eynon 2009). Second, as I will argue throughout this chapter, Facebook, like other social media platforms, does considerable work to keep its users on the newsfeed rather than on the almost infinite options beyond the ‘front page’ of user experience. In many ways, most people’s experiences of social media resemble those of automobile drivers — we may know how to drive the car, but that does not mean we know what is under the hood, how roads are built, how traffic is regulated, or anything about the trade fluctuations and power dynamics of the petrol and oil industries.

A broader understanding of the social media landscape does not necessarily require advanced technical skills, and can introduce users to a more critical understanding of what they are interacting with. Given the scope, breadth, and power of social media, this critical approach is urgent, and is the first step towards breaking down the common misunderstanding that social media constitutes nothing more than a newsfeed. Facebook’s role in reinforcing this misunderstanding is clearly apparent in the Cambridge Analytica case.

Cambridge Analytica: ‘It’s a Feature Not a Bug’

In early 2018, revelations were widely published about the misuse of 87 million people’s personal data, gathered in 2014 by Cambridge academic Aleksander Kogan through a Facebook app personality quiz called ‘thisisyourdigitalife’ (Madrigal 2018; Tufekci 2017, 2018). Using the Facebook platform, the app not only collected personal data from approximately 300,000 Amazon Mechanical Turk workers paid $1 or $2 to complete the quiz, but also personal information from each of those workers’ friends’ accounts. Meanwhile, Facebook CEO Mark Zuckerberg maintains that this data was shared with Cambridge Analytica — a political campaigning and marketing company — in breach of Facebook’s terms and conditions. Indeed, Facebook did suspend Kogan’s app and demanded certification from Cambridge Analytica (CA) that they had deleted all data that had been collected. While CA apparently did provide certification, the data has since been linked with both Donald Trump’s presidential campaign in 2016, and with various Brexit leave campaigns (Chen 2018; Tufekci 2018; Zuckerberg as cited in Thompson 2018; Osborne and Parkinson 2018; Greenfield 2018).
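To give learners a concrete sense of this scale, a quick back-of-the-envelope calculation works well in class: roughly 300,000 quiz-takers yielded data on some 87 million people, which implies that each participant exposed, on average, a few hundred friends’ profiles. A minimal Python sketch of that arithmetic, using only the figures reported in the coverage cited above:

```python
# Back-of-the-envelope arithmetic for the Cambridge Analytica case,
# using the figures reported in the coverage cited above.
quiz_takers = 300_000          # paid Mechanical Turk workers who installed the app
people_affected = 87_000_000   # Facebook's estimate of affected accounts

# Average number of additional profiles exposed per quiz-taker
friends_per_taker = (people_affected - quiz_takers) / quiz_takers
print(f"Each quiz-taker exposed roughly {friends_per_taker:.0f} friends' profiles")
# -> roughly 289 friends' profiles per participant
```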

The case involves use of Facebook’s open social graph, a service which has allowed app developers and third parties to access certain kinds of user information via the Facebook platform. Drawing from this case, there are two significant implications which are essential for broadly understanding social media, and Facebook in particular. The first is that this case is not a scandal or data breach — instead, it reveals the inner workings and logic of social media platforms. Second, these platforms work very hard to keep users on the ‘news feed’, obscuring the big business of personal data collection behind the ‘social’ purpose of these platforms, as explained below.
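For groups with some programming experience, it can also help to show what ‘third-party access via the platform’ looks like from a developer’s side. The sketch below is a simplified illustration of an app requesting profile fields from Facebook’s Graph API once a user has granted it an access token; the version number, field list, and permissions are assumptions for teaching purposes, and friends’ data of the kind Kogan collected was only reachable under the since-retired Graph API v1.0, so this should be read as a schematic rather than a working recipe. It assumes the third-party `requests` library is installed.

```python
# Illustrative only: how a third-party app queries the Graph API once a user
# has granted it an access token. Field and permission availability has
# changed substantially since 2014; this is a schematic, not current practice.
import requests

GRAPH_URL = "https://graph.facebook.com/v2.0/me"  # version chosen for illustration

def fetch_profile(access_token: str) -> dict:
    """Request basic profile fields the user has consented to share."""
    params = {
        "fields": "id,name,likes",   # hypothetical field list for illustration
        "access_token": access_token,
    }
    response = requests.get(GRAPH_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json()

# Under Graph API v1.0 (retired in 2015), an app could also reach friends'
# data through permissions such as friends_likes -- the mechanism that made
# the 'thisisyourdigitalife' collection possible.
```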

Leading thinkers on digital media and politics argue that the ‘data misuse’ of the CA case ‘is a feature, not a bug’, not just of Facebook, but also of social media platforms more broadly (Zuckerman 2018; Tufekci 2017, 2018). Referring to Facebook as a ‘surveillance machine’ or what Zuboff terms ‘surveillance capitalism’ (2015), Tufekci (2018) argues:

Facebook makes money, in other words, by profiling us and then selling our attention to advertisers, political actors and others. These are Facebook’s true customers, whom it works hard to please.

The widespread ‘misuse’ of personal data can also be observed in Facebook’s ‘shadow profiles’ — profiles made up of data gathered from Facebook users’ contact lists and web browsing behaviours — which have long been understood to be the basis for the ‘people you may know’ algorithm. In addition, WhatsApp co-founder Jan Koum has resigned from WhatsApp and the Facebook board of directors over privacy concerns (White 2018; Solon 2018). Indeed, in the risk assessment section of its 2018 first quarter report, Facebook warned shareholders that: ‘We anticipate that we will discover and announce additional incidents of misuse of user data or other undesirable activity by third parties’ (Levy 2018; Facebook Inc. 2018). Many others have pointed to Facebook’s problematic data collection procedures, ranging, for example, from the 2014 mood manipulation study (Meyer 2014; Kramer et al. 2014) to the targeting of ‘Jew haters’ in advertising and political campaigns, highlighting the profitability of big data (Tufekci 2017; cf. Zuboff 2015; Van Dijck 2013; Morozov 2012).
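Because ‘shadow profiles’ can feel abstract, a small worked example helps. The sketch below is purely conceptual and is not Facebook’s actual algorithm or data model; it simply shows how contact lists uploaded by several users can be merged to infer the existence, name variants, and likely social circle of someone who never created an account.

```python
# Conceptual illustration of a 'shadow profile': NOT Facebook's actual
# algorithm or data model, just a demonstration of how uploaded contact
# lists can reveal a person who never signed up.
from collections import defaultdict

# Hypothetical contact lists uploaded by three registered users.
uploaded_contacts = {
    "alice": [{"name": "Dana", "phone": "+44 7700 900001"}],
    "bob":   [{"name": "Dana R.", "phone": "+44 7700 900001"}],
    "carol": [{"name": "D. Reyes", "phone": "+44 7700 900001"}],
}

# Key the merged records by phone number: the same non-user surfaces in
# several address books, so her name variants and the people who know her
# can be inferred without her ever consenting.
shadow_profiles = defaultdict(lambda: {"names": set(), "known_by": set()})
for uploader, contacts in uploaded_contacts.items():
    for contact in contacts:
        record = shadow_profiles[contact["phone"]]
        record["names"].add(contact["name"])
        record["known_by"].add(uploader)

for phone, record in shadow_profiles.items():
    print(phone, sorted(record["names"]), "known by", sorted(record["known_by"]))
```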

What is particularly important about this is that Facebook, like other social platforms, presents itself as primarily a social network, giving users ‘the power to share and to make the world more open and connected’ (Facebook’s mission statement, 2004–2018). As such, Facebook claims its primary aim is to connect people and more recently ‘to bring the world closer together’ (Facebook mission statement 2018), rather than the monetisation of social interactions and personal data collection. For all of these reasons, today’s students must work to critically understand social media as more than just social networks and tools for interpersonal communication, regardless of whether they use social media or not.

Personal Data Exercises

Although the CA events clearly illustrate the practice of monetising personal data collection on Facebook, many students may struggle to understand how this applies to them and their data. Drawing from Lave and Wenger’s (1990) notion of situated and participatory learning, the following exercises have been developed to illustrate the scope, scale, and applicability not only of personal data collection, but also of the use of the data and the lack of user rights on social media.

Exercise 1: Review Facebook’s ‘Statement of Rights and Responsibilities’ or ‘Terms of Service’

Many of today’s social media users, including students at many levels, admit to having never read the terms and conditions or end-user licence agreements of the sites, platforms, apps, or devices they use. As such, one of the first helpful exercises for introducing users to social media beyond the newsfeed is to ask them to review the terms and conditions which most people click through without reading or considering what their rights may be. Although the CA case has prompted the development of new Facebook terms (introduced as of 19 April 2018), there are still a number of consistencies with prior versions. These can be found by clicking on ‘Terms’, usually located at the bottom right of the Facebook.com home page (the terms can be accessed whether or not you are logged in; see Fig. 11.1 below).

Fig. 11.1 Zoetanya Sujon, Screenshot highlighting Facebook’s ‘Terms’ (2018), CC BY 4.0

The link pathways are different depending on whether you are logged in or not. First, it is important to review the terms which outline your rights and responsibilities as a Facebook user. Many find the terms and conditions off-putting, particularly as the small print and formal language can be difficult to read and understand — these are points which have been addressed in the April 2019 updates. However, despite the challenges of reading the terms, many report that this exercise is eye-opening and worth the effort. While there are many things to focus on during this kind of session, I ask students to review the statement of rights and responsibilities, their privacy settings, and the data policy — identifying what stands out to them and what they think is important — which often leads to lively discussion.

One part of the terms and conditions which I draw attention to falls about halfway through Facebook’s ‘Terms of Service’, under section ‘3.3 The permissions you give us’. In this section, Facebook outlines the kinds of permissions they need in order to let you use Facebook (2018, my emphasis):

Permission to use content that you create and share: You own the content that you create and share on Facebook and the other Facebook Products you use, and nothing in these Terms takes away the rights that you have to your own content. You are free to share your content with anyone else, wherever you want. To provide our services though, we need you to give us some legal permissions to use that content. Specifically, when you share, post or upload content that is covered by intellectual property rights (e.g. photos or videos) on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free and worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content (consistent with your privacy and application settings).

The highlighted part of the above terms was almost identical in prior versions of Facebook’s terms and conditions, apart from moving the placement of this statement from the first section of the ‘Statement of Rights and Responsibilities’ (‘2. Sharing your content and information’, 31 January 2018, https://www.facebook.com/legal/terms) to considerably lower on the page, as well as some minor re-ordering of the words within the statement.

Although users and non-users identify community standards, safety, data policy, and privacy settings as important, ‘the permissions you give us’ is demonstrative of some of the double logic at work, as well as of Facebook’s total control of personal content and data (i.e. ‘You own the content that you create and share… [and] you grant us a non-exclusive, transferable, sub-licensable, royalty-free and worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content’). Participants are also strongly advised to review and reflect on Facebook’s ‘Data Policy’.

Exercise 2: Activity log

The purpose of this short exercise is to encourage learners with a Facebook account to explore Facebook’s record of all of their activity, from the present moment back to the day they signed up for an account. The activity log can be found by clicking on the ▼ icon in the top right-hand corner of any Facebook page (see Fig. 11.2 below).

Once users have found their activity log, they will see a long list of every like, comment, share, video watched, and other kinds of behaviour. While it is valuable to ask participants to review their activity log at their leisure, it is worth pointing out the filters located on the left-hand side of the activity log, which organize Facebook activity according to particular kinds of behaviour. These categories include:

  • Posts

  • Posts you’re tagged in

  • Other people’s posts to your timeline

  • Hidden from your timeline

  • Photos and videos

  • Likes and reactions

  • Comments

  • Profile

  • Added friends

  • Life events

  • Songs you’ve listened to

  • Articles you’ve read

  • Films and TV

  • Games

  • Books

  • Products you’ve wanted

  • Notes

  • Videos you’ve watched

  • Following

  • Groups

  • Events

  • Polls

  • Search history

  • Saved

  • Your places

  • Security and login information

  • Apps

Fig. 11.2 Zoetanya Sujon, Locating the activity log on your Facebook account (2018), CC BY 4.0

Each filter provides a complete list of Facebook activity related to that category. Clicking on ‘articles you’ve read’ or ‘videos you’ve watched’, for example, reveals a complete list of articles or videos you’ve clicked on, or even hovered over without clicking. While it is possible to clear some of these activity filters (e.g. ‘search history’ or ‘videos you’ve watched’) with a single click, it is more difficult to edit or delete other activities on your timeline (see Fig. 11.3).
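A more hands-on variation of this exercise is to ask learners to request a copy of their data (‘Download Your Information’ in Facebook’s settings) and count how many items fall into each category. The archive’s folder and file layout changes over time, so the paths and JSON keys in the sketch below are assumptions for illustration only; the point is simply that the activity log is machine-readable and usually very long.

```python
# Tally items per category in a (hypothetical) Facebook data export.
# The real 'Download Your Information' archive layout changes over time;
# the folder names and JSON structure assumed below are illustrative.
import json
from pathlib import Path

EXPORT_DIR = Path("facebook-export")  # wherever the unzipped archive lives

def count_items(json_file: Path) -> int:
    """Count entries in a JSON file, whether it holds a list or a dict of lists."""
    with json_file.open(encoding="utf-8") as fh:
        data = json.load(fh)
    if isinstance(data, list):
        return len(data)
    # Many export files wrap their records in a dict, e.g. {"reactions": [...]}
    return sum(len(v) for v in data.values() if isinstance(v, list))

totals: dict[str, int] = {}
for json_file in EXPORT_DIR.rglob("*.json"):
    category = json_file.parent.name          # e.g. 'likes_and_reactions'
    totals[category] = totals.get(category, 0) + count_items(json_file)

for category, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{category:40s} {total:6d}")
```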

On the right side of each event recorded in your activity log, you will see an indication of who your action is visible to, including ‘public’, ‘friends’, ‘only me’, or ‘custom’ (see the icons circled in Fig. 11.3, and a full list of visibility options in Fig. 11.4).

Fig. 11.3 Zoetanya Sujon, One event from a Facebook activity log, highlighting visibility settings (2018), CC BY 4.0

Fig. 11.4 Zoetanya Sujon, Screenshot of Facebook’s options for who can see your activity on Facebook, from ‘What audiences can I choose from when I share?’ (2018), CC BY 4.0

Based on the visibility settings, it is possible to identify who can see what activity, although these settings are determined by the original poster, which means they cannot be changed for any content you have not created. The event shown in the activity log in Figure 11.3 is public, meaning it is visible to everyone on the Internet, as indicated by the globe icon. The other editing option, indicated by the pen icon (shown in the red circle in the figure), allows you to unlike, unfollow, delete, or hide a comment or shared post.
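For learners who want to reason about these audience settings more precisely, they can be expressed as a simple access check. The sketch below is a simplified model assuming only the four audience types discussed above (it ignores finer-grained options such as sharing with specific lists), and is not Facebook’s actual implementation.

```python
# Simplified model of post visibility: NOT Facebook's implementation,
# just the logic implied by the 'public' / 'friends' / 'only me' / 'custom'
# options discussed above.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    audience: str                      # 'public', 'friends', 'only_me' or 'custom'
    custom_allowed: set[str] = field(default_factory=set)

def can_view(post: Post, viewer: str, friends_of_author: set[str]) -> bool:
    """Return True if `viewer` may see `post` under its audience setting."""
    if viewer == post.author:
        return True
    if post.audience == "public":
        return True
    if post.audience == "friends":
        return viewer in friends_of_author
    if post.audience == "custom":
        return viewer in post.custom_allowed
    return False                       # 'only_me' and anything unrecognised

# Example: a public post is visible to anyone, including non-friends.
post = Post(author="zoe", audience="public")
print(can_view(post, viewer="stranger", friends_of_author={"sam", "ada"}))  # True
```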

Conclusion

In conclusion, this short essay has provided an overview of key challenges many HE professionals face in today’s classrooms, including widespread assumptions regarding young people’s levels of experience with, and critical understanding of, social technologies. In order to illustrate the importance of a critical understanding, this chapter focuses on the recent events surrounding the misuse of 87 million Facebook users’ personal data by Cambridge Analytica, briefly explaining what happened and what the implications are. Following this, several key exercises intended to demonstrate the wealth of personal data collected and held by Facebook are introduced. The purpose of this chapter is to provide a framework for teachers and HE professionals to use in classrooms, in part or as a whole, to contribute to learners’ data literacy, not only of Facebook, but of social media more broadly. #DeleteFacebook may be a temporary solution to personal data infringements, but, as many leading thinkers have argued, Cambridge Analytica is not a breach or scandal; rather, it capitalizes on social media’s organizing logic — the monetisation of personal data.

Other Suggested Exercises:

1. Facebook company directory

In order to get a sense of the size and scope of Facebook, review Facebook’s key pages and products through its company directory. Note any pages that stand out or are interesting to you. What do these product teams and pages tell you about Facebook as a company? https://newsroom.fb.com/pages-directory/

2. Ad preferences

Based on the information you give Facebook (e.g. name, age, marital status, parental status, where you work, go to school, etc.), your browsing behaviour on Facebook (e.g. likes, groups, friends, messages, clicks, etc.), and your online behaviour on other apps and sites shared through the Facebook platform and cookies, Facebook builds an overview of your interests and habits. These interests and habits are organized into categories which advertisers buy access to. These categories are called ‘ad preferences’. You can view your own ‘ad preferences’ here: https://www.facebook.com/ads/preferences (you must be logged in to see this feature and for the link to work).
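As a classroom illustration of how such categories might be derived, the toy sketch below maps hypothetical page likes onto advertiser-facing interest labels. The mapping and the labels are invented for teaching purposes; Facebook’s real inference draws on far more signals and is not public.

```python
# Toy illustration of deriving 'ad preference' style interest categories
# from page likes. The mapping is invented for teaching purposes and bears
# no relation to Facebook's actual (non-public) inference pipeline.

# Hypothetical mapping from liked pages to advertiser-facing interests.
PAGE_TO_INTERESTS = {
    "Lonely Planet": {"Travel"},
    "Ryanair": {"Travel", "Budget airlines"},
    "NME": {"Music"},
    "MoneySavingExpert": {"Personal finance"},
}

def infer_interests(liked_pages: list[str]) -> set[str]:
    """Union of interest labels triggered by a user's page likes."""
    interests: set[str] = set()
    for page in liked_pages:
        interests |= PAGE_TO_INTERESTS.get(page, set())
    return interests

print(infer_interests(["Ryanair", "NME"]))   # {'Travel', 'Budget airlines', 'Music'}
```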

3. Platform and apps

With the 2007 launch of Facebook log-in (now Facebook Connect) — a service allowing users to log in to external websites using their Facebook username and password — Facebook transformed from a single website into a platform. Anne Helmond (2015), a leading thinker in platform studies, defines the move as ‘the extension of social media platforms into the rest of the web and their drive to make external web data “platform ready”’. The Facebook platform enabled personal user data to be collected and shared across services and accounts, just as it was between Aleksander Kogan’s personality quiz app, 87 million users, and Cambridge Analytica. Check your own app and platform settings under ‘Apps, websites, and games’: https://www.facebook.com/settings?tab=applications (you must be logged in to see this feature and for the link to work).
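For technically inclined groups, it is also worth sketching what ‘logging in with Facebook’ actually does behind the button. The snippet below builds the first step of a standard OAuth 2.0 authorisation request of the kind Facebook Login uses; the app ID, redirect URI, scopes, and dialog version are placeholders for illustration rather than a guaranteed current endpoint.

```python
# Sketch of the first step of an OAuth 2.0 'Login with Facebook' flow:
# the external site redirects the user to Facebook's authorisation dialog.
# App ID, redirect URI and scopes are placeholders for illustration.
from urllib.parse import urlencode

APP_ID = "YOUR_APP_ID"                       # issued when an app is registered
REDIRECT_URI = "https://example.com/fb-callback"
SCOPES = "public_profile,email"              # permissions the site asks for

def build_login_url(state: str) -> str:
    """Return the URL the external site sends the user to."""
    params = {
        "client_id": APP_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": SCOPES,
        "response_type": "code",             # ask for an authorisation code
        "state": state,                      # anti-CSRF token chosen by the site
    }
    return "https://www.facebook.com/v2.0/dialog/oauth?" + urlencode(params)

# After the user approves, Facebook redirects back with a code that the site
# exchanges for an access token -- which is also what lets the site read
# whatever profile data the granted scopes cover.
print(build_login_url(state="abc123"))
```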

References

Chen, Adrian (2018). ‘Cambridge Analytica and our lives inside the surveillance machine’, The New Yorker, March 21, https://www.newyorker.com/tech/elements/cambridge-analytica-and-our-lives-inside-the-surveillance-machine

Das, R., and Beckett, C. (2009). ‘Digital natives: A myth?’, A POLIS Paper. November 24, http://www.lse.ac.uk/media@lse/Polis/Files/digitalnatives.pdf

Facebook Inc. (2018). ‘Quarterly report pursuant to section 13 or 15d of the securities exchange act of 1934’, United States Securities and Exchange Commission, https://www.sec.gov/Archives/edgar/data/1326801/000132680118000032/fb-03312018x10q.htm

Facebook Inc. (2019). ‘Terms of service’, https://www.facebook.com/legal/terms/update

Helsper, E., and Eynon, R. (2009). ‘Digital natives: Where is the evidence?’, British Educational Research Journal, pp. 1–18, http://eprints.lse.ac.uk/27739/1/Digital_natives_%28LSERO%29.pdf

Kramer, Adam D. I., Guillory, Jamie E. and Hancock, Jeffrey T. (2014). ‘Experimental evidence of massive-scale emotional contagion through social networks’, PNAS, 111: 24, pp. 8788–90, http://www.pnas.org/content/111/24/8788.abstract?tab=author-info

Lave, J., and Wenger, E. (1990). Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.

Levy, Ari (2018). ‘Facebook warns investors that more Cambridge Analyticas are likely’, CNBC, April 26, https://www.cnbc.com/2018/04/26/facebook-warns-investors-that-more-cambridge-analyticas-are-likely.html

Madrigal, Alexis C. (2018). ‘What took Facebook so long?’, The Atlantic, March 18, https://www.theatlantic.com/technology/archive/2018/03/facebook-cambridge-analytica/555866/

Meyer, Robinson (2014). ‘Everything we know about Facebook’s mood manipulation study’, The Atlantic, June 28, http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

Morozov, Evgeny (2012). The Net Delusion: How Not to Liberate the World. London: Penguin

Osborne, Hilary and Parkinson, Hannah Jane (2018). ‘Cambridge Analytica scandal: The biggest revelations so far’, The Guardian, March 22, https://www.theguardian.com/uk-news/2018/mar/22/cambridge-analytica-scandal-the-biggest-revelations-so-far

Prensky, M. (2012). From Digital Natives to Digital Wisdom: Hopeful Essays for 21st Century Learning. Thousand Oaks, CA: Corwin.

Prensky, M. (2001). ‘Digital natives, digital immigrants’, On the Horizon, 9: 5, https://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf

Solon, Olivia (2018). ‘WhatsApp CEO Jan Koum quits over privacy disagreements with Facebook’, The Guardian, April 30, https://www.theguardian.com/technology/2018/apr/30/jan-koum-whatsapp-co-founder-quits-facebook

Thompson, Nicholas (2018). ‘Mark Zuckerberg speaks to Wired about Facebook’s privacy problem’, Wired, March 21, https://www.wired.com/story/mark-zuckerberg-talks-to-wired-about-facebooks-privacy-problem/

Tufekci, Zeynep (2018). ‘Facebook’s surveillance machine’, The New York Times, March 19, https://www.nytimes.com/2018/03/19/opinion/facebook-cambridge-analytica.html

Tufekci, Zeynep (2017). ‘Facebook’s ad scandal isn’t a “fail”, it’s a feature’, The New York Times, September 23, https://www.nytimes.com/2017/09/23/opinion/sunday/facebook-ad-scandal.html

Van Dijck, José (2013). The Culture of Connectivity: A Critical History of Social Media. Oxford: Oxford University Press.

White, Jeremy B. (2018). ‘WhatsApp co-founder Jan Koum leaving parent company Facebook amid privacy scandal’, The Independent, May 1, https://www.independent.co.uk/news/business/jan-koum-whatsapp-leaving-cofounder-facebook-privacy-a8330256.html

White, D. S., and Le Cornu, A. (2011). ‘Visitors and residents: A new typology for online engagement’. First Monday, 16: 9, http://firstmonday.org/article/view/3171/3049

Zuboff, Shoshana (2015). ‘Big other: Surveillance capitalism and the prospects of an information civilization’, Journal of Information Technology, 30, pp. 75–89, https://doi.org/10.1057/jit.2015.5

Zuckerberg, Mark (2018). ‘I want to share an update on the Cambridge Analytica situation… ’, Facebook, March 21, https://www.facebook.com/zuck/posts/10104712037900071

Zuckerman, Ethan (2018). ‘This is so much bigger than Facebook’, The Atlantic, March 23, https://www.theatlantic.com/technology/archive/2018/03/data-misuse-bigger-than-facebook/556310/

List of Illustrations

Fig. 11.1 Zoetanya Sujon, Screenshot highlighting Facebook’s ‘Terms’ (2018), CC BY 4.0. http://books.openedition.org/obp/docannexe/image/8646/img-1.jpg

Fig. 11.2 Zoetanya Sujon, Locating the activity log on your Facebook account (2018), CC BY 4.0. http://books.openedition.org/obp/docannexe/image/8646/img-2.jpg

Fig. 11.3 Zoetanya Sujon, One event from a Facebook activity log, highlighting visibility settings (2018), CC BY 4.0. http://books.openedition.org/obp/docannexe/image/8646/img-3.jpg

Fig. 11.4 Zoetanya Sujon, Screenshot of Facebook’s options for who can see your activity on Facebook, from ‘What audiences can I choose from when I share?’ (2018), CC BY 4.0. http://books.openedition.org/obp/docannexe/image/8646/img-4.jpg

CC-BY-4.0

Only the text may be used under the CC BY 4.0 licence. Unless otherwise indicated, all other elements (illustrations, imported additional files) are ‘All rights reserved’.
