Could a generative AI chatbot have a real relationship with a person? Allison Pugh, a sociologist, argues no: “there is no human relationship when one half of the encounter is a machine,” she writes. In her argument, Pugh describes how care-focused generative AI platforms, such as those being used in educational, medical, and therapy settings, damage human relationships by making people feel both invisible and disconnected from each other.
Allison Pugh, "When AI Automates Relationships," TIME, 14 August 2024.
- Pugh introduces the term “connective labor” in her argument. In your own words, define this term. Then name three jobs Pugh mentions that rely on this kind of labor.
- According to Pugh, how has generative AI worsened the working conditions of people in care-focused jobs? Describe one example Pugh gives that illustrates this issue.
- Identify one place in Pugh’s argument where she uses metacommentary to clarify and emphasize her claims. Evaluate the effectiveness of this metacommentary: how does it help you as a reader follow and understand her points?
- Watch this ad, which Google released during the 2024 Summer Olympics. What do you think Pugh’s reaction would be to the ad? Why? Use a short quote from her essay to explain Pugh’s perspective and support your answer.
- Pugh references a “depersonalization crisis,” an issue addressed in the 2023 Surgeon General’s report, “Our Epidemic of Loneliness and Isolation.” Read the letter from the Surgeon General, Dr. Vivek Murthy, on pages 4-5 of the report. How does loneliness impact both individuals and society, according to Murthy? Do you think generative AI can be used ethically to address loneliness and social disconnection? Why or why not?