For all the good that social media platforms provide – connection, innovation, a panoply of perspectives – there is also a darker side, evident in online harassment, deepfake video and audio manipulations, and the widespread circulation of disinformation and conspiracy theories. What responsibility should social media companies take for the content hosted on their sites? In their January 2021 Harvard Business Review essay, Michael A. Cusumano, Annabelle Gawer, and David B. Yoffie argue that it is time for social media companies to self-regulate their platforms to secure “their long-term survival and success.”
- What other industries, according to the authors, successfully self-regulated instead of waiting for the government to impose regulations? The authors argue that social media companies should follow a similar path to self-regulation. What do you think? Are these industries good analogies to social media platforms? Why or why not? Who do you think should have the power to establish regulations on social media companies: the social media platforms, the users of social media platforms, or state and federal governments? Why?
- The authors use transitions and connecting devices throughout their essay to guide readers through their argument. Read paragraph 2 (“In the past, Twitter and Facebook…”) and note the transition words and connecting devices the authors use in this paragraph. Select two transitions in this paragraph. How would you categorize these two transitions, using the categories described in Chapter 8? How do these two transitions help the reader understand the relationship among the ideas in this paragraph? Then, imagine the paragraph without any transitions. What would be lost for the reader?
- What is one major reason why social media companies might resist self-regulation, according to the authors? Look closely at paragraph 13 (“Second, we find that firms . . .”), where the authors introduce one of the naysayer arguments about profits. How do the authors respond to this naysayer argument? What evidence do they give to back up their response?
- Critiques of Section 230, the law that categorizes social media companies as platforms, not “publisher[s],” have been growing among both Republicans and Democrats, especially in response to the circulation of fake news and disinformation. However, others contend that curtailing the immunity Section 230 offers social media companies is misguided. Read Billy Easley’s essay, which argues that Section 230 protects the free speech of marginalized communities. How do you think Easley might respond to Cusumano, Gawer, and Yoffie’s argument about the need for social media companies to self-regulate? What’s your take? Should Section 230 be revised? If so, how?
Participants on social media platforms should regulate extremism from other participants.
Posted by: Robert Hambly | 11/15/2021 at 04:07 PM
I agree that tech companies should self-regulate their digital platforms to secure their long-term survival and promote positive impacts on society. Nowadays, advances in Internet technology have made it easier than ever to upload and obtain information online. As a result, we live in an age of information overload and abundance, since people can access information from anywhere at any time. Digital platforms can be double-edged swords that either help society develop or put social civilization at risk. On one hand, they provide many convenient ways for people to work, study, and find entertainment. On the other hand, they also offer effective channels for harmful perspectives and fake news to influence society. Amplified by the Internet, these perspectives and fake news can trigger social unrest, such as the U.S. Capitol debacle on January 6, 2021. Carole Cadwalladr’s article “Google, Democracy, and the Truth about Internet Search” likewise emphasizes the importance of self-regulation for these tech companies, pointing out that Google’s search algorithms contain biases (636). Cadwalladr shows how Google’s search engine attempts to predict its users’ questions and offers leading, often negative, suggestions while users are still typing incomplete phrases. For instance, when a user types “are Jews,” Google suggests questions like “are Jews evil” and “are Jews white.” These predictions may push users toward the perspectives Google surfaces. In addition, some top results on Google are fake news or contain preconceptions, which distort users’ understanding. Without action to stop the spread of these misconceptions, credulous readers can easily be tricked or provoked into negative emotions. Therefore, self-regulation by tech companies is key to stopping the spread of harmful perspectives and fake news on digital platforms and to reducing their negative impacts on society.
Works Cited
Cadwalladr, Carole. “Google, Democracy, and the Truth about Internet Search.” They Say/I Say: With Readings, edited by Gerald Graff et al., 5th ed., W. W. Norton, 2021, pp. 624-641.
Posted by: Penny Feng | 12/06/2021 at 03:40 PM