
SOCIAL MEDIA PART 2: DIGITAL LITERACY

By Brandon Golob (Ph.D., J.D.)

Part One: Overview

College-aged students are the most frequent users of social media (Pew Research Center, 2019). Although they are likely aware of the personal consequences of misusing social media, they often have a limited understanding of the laws surrounding their social media use. From privacy rights to defamation to copyright infringement, social media has a robust legal landscape that its users must regularly navigate (Basha, 2012; Stewart, 2017). Failure to do so can have drastic consequences: social media postings have caused people to disseminate misinformation, lose their jobs, and even face criminal charges. Before exploring the far-ranging ramifications of social media, it is useful to begin this chapter the same way I begin my “Social Media & The Law” course every year:

Analysis Activity 1

What is social media? Although it is something we are all familiar with, conceptualizing and defining it can be challenging. You don’t need to google it or attempt to offer a sophisticated scholarly definition. We’re interested in what you think of when you hear the term “social media.” Perhaps you would like to define it by giving examples.

Although this may appear to be a straightforward, definitional exercise, thousands of my students have answered this question differently. This mirrors the challenges scholars have faced in reaching a single conceptualization of social media (Fox & McEwan, 2020, p. 373). In fact, throughout the history of social media research, definitions have continually evolved. Communication scholars Carr and Hayes sought to synthesize some of these shifting definitions and offer an overarching conceptualization: “Social media are Internet-based channels that allow users to opportunistically interact and selectively self-present, either in real-time or asynchronously, with both broad and narrow audiences who derive value from user-generated content and the perception of interaction with others” (2015, p. 50). Given that the field of social media comprises new and emerging technologies, it makes sense that conceptualization is a moving target for scholars. For example, Carr and Hayes’ definition is fairly comprehensive, but we would be remiss if we did not acknowledge that it was introduced over half a decade ago:

Analysis Activity 2

Carr and Hayes (p. 53) offered what they called a “categorization of today’s social media.” However, their “today” likely feels quite different from our “today.” Carefully review their “contemporary examples” below and address the following:

  • Which sites, platforms, services, etc. are still active? Which ones are no longer active?

  • For those that are still active, how have they evolved between 2015 and today? Your analysis can be based on your own interactions with these media, but you should also visit the Wayback Machine, a digital archive of the internet that allows you to see how sites looked in the past.

  • For those that are no longer active, what do you think led to them becoming defunct? Are there contemporary sites, platforms, services, etc. that have replaced them in some way or another (e.g., TikTok instead of Vine)?

  • Speaking of TikTok, what sites, platforms, services, etc. would you add to this contemporary table of social media?

  • What do these evolutions suggest about our ever-changing relationships with social media?

Part Two: Social Media & The First Amendment

Even if we were able to settle on a definition of social media, there would still be disagreement about the content that pervades the platforms. One of the most ubiquitous contemporary conversations around social media is how the digital age has impacted free speech. In short, questions at the nexus of free speech and social media have spurred countless legal, policy, and sociopolitical debates that will continue well into the future (Hooker, 2019; Lee & Scott-Baumann, 2020; O’Connor & Schmidt, 2021).

 

The First Amendment to the U.S. Constitution protects free speech (among other rights) from government interference. Thus, private social media companies are not bound by the First Amendment (or any other part of the U.S. Constitution and its Amendments). However, that has not stopped debates about whether these platforms should be subject to further regulation and who should ultimately bear responsibility for content moderation. To learn more about some of the complexities at the intersection of social media, free speech, and content moderation, complete the following exercise:

Analysis Activity 3

Watch this episode from Patriot Act with Hasan Minhaj. As you do, address the following:

  • What is the difference between a “platform” and a “publisher”? Why does this distinction matter?

  • Is there a way to moderate content fairly?

  • Should platforms moderate content themselves? Should there be an independent body that makes content moderation decisions? Is there some other potential solution?

  • What’s at stake in all this?

Since the release of this episode, Facebook has created an Oversight Board – a quasi-judicial body that oversees content moderation decisions by Facebook and Instagram. According to the board’s website, “the purpose of the board is to promote free expression by making principled, independent decisions regarding content on Facebook and Instagram and by issuing recommendations on the relevant Facebook company content policy” (Oversight Board, The Purpose of the Board section). The board started reviewing cases in 2021 and has released 15 decisions to date. Arguably the most well-known case on the board’s docket involves Facebook’s indefinite suspension of former U.S. President Donald Trump. This suspension was in response to content posted by President Trump during the attack on the U.S. Capitol on January 6, 2021. Read about the content posted and the board’s decision here.

 

Facebook is not the only platform to regulate former President Trump’s content. In fact, Trump’s presence on Twitter (@realDonaldTrump) was even more prominent and thus subject to additional scrutiny. Approximately six months into his presidency, he tweeted, “My use of social media is not Presidential – it’s MODERN DAY PRESIDENTIAL. Make America Great Again!” (Trump, 2017). Over the course of his time in office, he used his Twitter account in unprecedented and controversial ways. Then, as he reached the end of his term, Twitter officially laid his account to rest:

Analysis Activity 4

On January 8, 2021, Twitter permanently suspended President Trump. The Associated Press wrote an obituary for his account, @realDonaldTrump. As the article makes clear, Trump’s use of the platform evolved over time. While reading the article, address the following:

  • What were some of the different ways Trump used the platform?

  • How are these ways similar to, and different from, the ways other people/politicians use the platform?

  • Did Twitter make the ‘right’ decision in banning Trump? Should they have done so earlier? If so, at what point?

  • What complications arise from banning Trump?

  • Lastly, keep in mind that, unlike Facebook, Twitter does not have an oversight board reviewing its decision. This prompts us to think more critically about Analysis Activity 3’s final set of questions: Should platforms moderate content themselves? Should there be an independent body that makes content moderation decisions? Is there some other potential solution?

Part Three: Misinformation

As is evident from the previous section, debates about content moderation intensified during Donald Trump’s presidency. However, focusing too closely on Trump’s tumultuous relationship with Twitter risks missing the forest for the trees. Put differently, there is an issue at stake here that is larger than any one person (even when that person holds the highest political office): misinformation on social media. Before we dive deeply into this topic, it is crucial to clear up a few definitions:

Research Activity 1

The term “fake news” has become popular during our digital age. To better understand what this term encompasses, explore this Case Studies in Disinformation module created as part of UCI Open. As you do, pay close attention to how “fake news” consists of three distinct concepts: (1) misinformation; (2) disinformation; and (3) propaganda.

As you discovered through this module, “misinformation is information whose inaccuracy is unintentional” whereas “disinformation is information that is deliberately false or misleading” (Jack, 2017, pp. 2-3). Both run rampant across social media platforms. Although analyses of how social media amplifies misinformation and disinformation preceded the COVID-19 pandemic, there has been a spike in research on social media use specifically during the pandemic (Benis et al., 2021; Drouin et al., 2020; Gottlieb & Dyer, 2020; Nabity-Grover et al., 2020; Wiederhold, 2020; Zhong et al., 2021). Unfortunately, social media has been a vehicle for spreading misinformation and disinformation about the disease itself, the vaccines, and public health protocols (Apuke & Omar, 2021; Gottlieb & Dyer, 2020; Hartley & Vu, 2020; Naeem et al., 2020; O’Connor & Murphy, 2020; Wiederhold, 2020). Investigating how people navigate this digital landscape to make life-or-death decisions during an unprecedented pandemic reveals the necessity of digital literacy.

 

Thus, this chapter should be used as a tool for increasing people’s digital literacy. We are at a particularly crucial moment for educating students about social media and the potential ramifications of misuse. Previous pedagogical studies have revealed that social media is an effective tool for learning about fake news and misinformation (Bonnet & Rosenbaum, 2020; McGrew et al., 2017). Moreover, the recent surge in reliance on digital technologies during COVID-19 has ushered in a new era for examining the effects of misinformation and disinformation. For example, I surveyed my undergraduate students (n = 315) at the University of California, Irvine about their use of social media during 2021. A majority (66%) reported an increase in their personal use of social media during the pandemic, and just over half (52%) expressed a desire for social media to be integrated into their coursework. In sum, it is a particularly apt time to be teaching our students about the power of social media.

Recommended Resources: There are many resources for improving digital literacy. Consider the following when learning in the classroom and beyond:

References

Apuke, O. D., & Omar, B. (2021). Fake news and COVID-19: Modelling the predictors of fake news sharing among social media users. Telematics and Informatics, 56, 101475.

 

Basha, H. (2012). Social media and the law. Share This Too: More Social Media Solutions for PR Professionals, 149-158.

 

Benis, A., Khodos, A., Ran, S., Levner, E., & Ashkenazi, S. (2021). Social media engagement and influenza vaccination during the COVID-19 pandemic: Cross-sectional survey study. Journal of Medical Internet Research, 23(3), e25977.

 

Bonnet, J. L., & Rosenbaum, J. E. (2020). “Fake news,” misinformation, and political bias: Teaching news literacy in the 21st century. Communication Teacher, 34(2), 103-108.

 

Carr, C. T., & Hayes, R. A. (2015). Social media: Defining, developing, and divining. Atlantic Journal of Communication, 23(1), 46-65.

 

Drouin, M., McDaniel, B. T., Pater, J., & Toscos, T. (2020). How parents and their children used social media and technology at the beginning of the COVID-19 pandemic and associations with anxiety. Cyberpsychology, Behavior, and Social Networking, 23(11), 727-736.

 

Fox, J., & McEwan, B. (2020). Social media. In M. B. Oliver, A. A. Raney, & J. Bryant (Eds.), Media effects: Advances in theory and research (4th ed., pp. 373-388). Routledge.

 

Gottlieb, M., & Dyer, S. (2020). Information and disinformation: Social media in the COVID-19 crisis. Academic Emergency Medicine, 27(7), 640-641.

 

Hartley, K., & Vu, M. K. (2020). Fighting fake news in the COVID-19 era: policy insights from an equilibrium model. Policy Sciences, 53(4), 735-758.

 

Hooker, M. P. (2019). Censorship, Free Speech & Facebook: Applying the First Amendment to Social Media Platforms via the Public Function Exception. Wash. JL Tech. & Arts, 15, 36.

 

Jack, C. (2017). Lexicon of lies: Terms for problematic information. Data & Society, 3(22), 1094-1096.

 

Lee, Y., & Scott-Baumann, A. (2020). Digital ecology of free speech: Authenticity, identity, and self-censorship.

 

McGrew, S., Ortega, T., Breakstone, J., & Wineburg, S. (2017). The challenge that’s bigger than fake news: Civic reasoning in a social media environment. American Educator, 41(3), 4.

 

Nabity-Grover, T., Cheung, C. M., & Thatcher, J. B. (2020). Inside out and outside in: How the COVID-19 pandemic affects self-disclosure on social media. International Journal of Information Management, 55, 102188.

 

Naeem, S. B., Bhatti, R., & Khan, A. (2020). An exploration of how fake news is taking over social media and putting public health at risk. Health Information & Libraries Journal.

 

O’Connor, C., & Murphy, M. (2020). Going viral: Doctors must tackle fake news in the covid-19 pandemic. BMJ, 369.

 

O’Connor, K. W., & Schmidt, G. B. (2021). Free speech and social media in academia. In Media and law: Between free speech and censorship. Emerald Publishing Limited.

 

Oversight Board. (2021, March). The Purpose of the Board. https://oversightboard.com/

 

Pew Research Center. (2019, June 12). Social media fact sheet. https://www.pewresearch.org/internet/fact-sheet/social-media/

 

Stewart, D. (Ed.). (2017). Social media and the law: A guidebook for communication students and professionals. Taylor & Francis.

 

Wiederhold, B. K. (2020). Social media use during social distancing.

 

Zhong, B., Huang, Y., & Liu, Q. (2021). Mental health toll from the coronavirus: Social media usage reveals Wuhan residents’ depression and secondary trauma in the COVID-19 outbreak. Computers in Human Behavior, 114, 106524.
