
01/25/2021

Comments


Robert Hambly

Participants on social media platforms should regulate extremism from other participants.

Penny Feng

I agree that tech companies should self-regulate their digital platforms to secure their long-term survival and promote positive impacts on society. Today, thanks to developments in Internet technology, uploading and obtaining online information is easier than ever. The result is an age of information overload and abundance, since people can access information from anywhere at any time. Digital platforms can be double-edged swords that either advance society or put social civilization at risk. On one hand, they provide convenient ways for people to work, study, and relax. On the other hand, they also offer effective channels for negative perspectives and fake news to influence society. Amplified by the Internet, such perspectives and fake news can become triggers for social unrest, as in the U.S. Capitol debacle on January 6, 2021. One article likewise emphasizes the importance of self-regulation for these tech companies. In "Google, Democracy, and the Truth about Internet Search," Carole Cadwalladr points out that Google's search algorithms contain biases (636). Cadwalladr shows how the Google search engine attempts to predict its users' questions and offers negatively leading choices while users are still typing incomplete sentences. For instance, when a user types "are Jews," Google suggests questions such as "are Jews evil" and "are Jews white." These predictions may push users toward such perspectives. In addition, some of the top results shown on Google are fake news or contain preconceptions, which shape users' understanding. Without action to stop the spread of these misconceptions, gullible people can easily be tricked or have negative emotions stirred up in them.
Therefore, self-regulation by tech companies is key to stopping the spread of negative perspectives and fake news on digital platforms and to reducing their negative impacts on society.

Works Cited
Cadwalladr, Carole. "Google, Democracy, and the Truth about Internet Search." They Say/I Say: With Readings, edited by Gerald Graff et al., 5th ed., W. W. Norton, 2012, pp. 624-641.


About They Say / I Blog

  • New readings posted monthly, on the same issues that are covered in “They Say / I Say” with Readings—and with a space where readers can comment, and join the conversation.
