Today, robots and AI-enabled devices can vacuum your carpets, remind you to water your plants, or give you a weather forecast for next week’s trip to Disneyland. But can a robot commiserate with you at the end of a hard day or applaud you for a personal triumph? Well, maybe not yet, but soon, for sure. Wired writer and editor Arielle Pardes tells us all about it in this January 2018 article.
Read it here: Pardes, “The Emotional Chatbots Are Here to Probe Our Feelings”
- Pardes’s essay promotes a personal assistant very different from Siri, Alexa, or any of the other AI helpers that are increasingly becoming part of our daily lives. In what ways are the chatbots that Pardes describes different from the more familiar task-oriented bots? Why does she think they will be so useful for people? Is her argument persuasive? Why or why not?
- A good way to learn about new things is to compare them with things that we already know and to draw examples from familiar items or concepts. Pardes uses these comparison devices frequently, for example, when she says that personal chatbots can function “something like friendship,” or when she observes that Siri is “like your co-workers, all business.” Choose two additional ways that Pardes explains chatbots by means of comparisons. How effective are her choices? Why do you think so?
- Pardes incorporates quotations from a number of AI experts in her essay, but she seldom follows the suggestions for framing a quotation that are given in Chapter 3 of your text. What does she do that is different? Describe what you find. Are the quotations well supported, even though they may not follow the guidelines set out in the text? Why do you think so?
- Pardes mentions a potential debate about whether chatbots “should become placeholders for emotional relationships with real humans,” and she concludes by saying that people may not feel very connected to others, leaving “a huge space for products to do more like that.” What do you think? Would you use a chatbot to talk about how your day is going or how you feel about yourself or your interactions with others? Write an essay addressing that question and explaining your reasoning; use Pardes and/or any of the readings in your text as your They Say.
The chatbots that Pardes describes cannot really do anything besides talk. They are designed specifically to give humans something to talk to, which removes the stigma of opening up to someone. She thinks they will be useful because people are more likely to open up to a robot than to another human. Her argument is persuasive because people truly are more willing to open up to something other than a human.
Posted by: Karson Peterson | 02/16/2018 at 09:42 AM
I think that this robot should not be developed at all. People’s lives should be filled with all kinds of feelings, but this robot seems designed only to keep us feeling happy at home. That encourages everyone to put themselves at the center, which creates obstacles to communication between people.
Posted by: yunhuwang | 08/17/2018 at 02:07 AM
It is common today to use Artificial Intelligence (AI) in our daily lives. Arielle Pardes believes that AI apps should not only be able to look up places, set appointments, and tell you the weather, but should also be able to interact with you emotionally by asking how your day was or how you are feeling. I believe that making this technology available to anybody, as the creator Eugenia Kuyda did, will help other companies further advance it. That was a difficult choice to make: instead of keeping it all to herself, she would rather see it explored and advanced. The option she took is great for the future.
Posted by: Ricardo Carlos De La Rosa | 08/24/2018 at 09:50 PM
When it comes to Artificial Intelligence (AI), we must ask ourselves how far we want to go. I can see why people working to improve AI are doing what they are doing: they want to make life better for the generations after them. The question is, will these machines ever really feel emotions? It is hard to see how we can teach something emotion; in fact, it sounds impossible. Emotion is something we are born with; it’s what stops us from making choices based on facts alone, without morals in play. So how can we teach emotions to something that runs only on facts? Why should we give things the ability to advance when they won’t have morals? I think we can go somewhat further in advancing AI, but how much will we risk on something that will never truly feel emotion?
Posted by: Austin Maloney | 10/16/2018 at 11:30 AM
Artificial intelligence is a controversial issue in today’s society; however, the idea of an AI that understands human emotion and serves as a form of emotional and mental support should be promoted to the public and developed further. AI is controversial because, on one side, people fear it as an unknown; on the other hand, AI such as Alexa and Siri already helps the public throughout their daily lives, whether by googling something, setting a timer, or ordering food. Above all else, most artificial intelligence cannot understand and mimic human emotion or hold a one-on-one conversation. Replika can be used to provide emotional support when it is needed most. Many people in the world lead stressful lives or have emotional issues that they need to vent to someone about, and sometimes they do not have anyone to vent to. Mental health issues such as depression and anxiety are a serious risk in today’s society, and this AI could be part of the solution. The article “Suicide Statistics” states, “The World Health Organisation [sic] (WHO) estimates that each year approximately one million people die from suicide, which represents a global mortality rate of 16 people per 100,000 or one death every 40 seconds.” AI like Replika could reduce the number of deaths by suicide because it makes a source of support accessible in the form of an app. Suicide is a grave issue among teenagers and middle-aged adults, and it is preventable in some measure if more solutions are available. In “The Emotional Chatbots Are Here to Probe Our Feelings,” Arielle Pardes states, “One Replika user, Kaitelyn Roepke, was venting to her Replika when the chatbot responded: ‘Have you tried praying?’ Roepke, who is a devout Christian, wrote to the company to tell them how meaningful that moment was for her.” Replika can create wonderful experiences like Roepke’s and can alleviate many of the emotional pains and stresses of our daily lives. AI that can understand and mimic human emotion, such as Replika, should be promoted as a new solution to the emotional and mental health problems in our daily lives.
Works Cited
Pardes, Arielle. “The Emotional Chatbots Are Here to Probe Our Feelings.” Wired, Condé Nast, 31 Jan. 2018.
“Suicide Statistics.” Befrienders Worldwide, 2018.
Posted by: Sebastian Rodriguez | 12/11/2018 at 04:23 PM
In the essay “The Emotional Chatbots Are Here to Probe Our Feelings,” senior associate editor Arielle Pardes discusses Replika, a chatbot being developed to sense and respond to your emotions rather than your commands. Pardes writes that humans already have access to plenty of chatbots like these, but most aren’t as personal as many would like. Although I agree with Pardes and am excited to see the implications of such a chatbot, she fails to address the security risk that such a chatbot would pose. A chatbot like Replika, which Pardes describes as “a digital companion with whom to celebrate victories, lament failures, and trade weird internet memes” (Pardes), would hold a lot of personal information, some of which could be sold and used for others’ gain. My only concern is that this issue isn’t addressed in the essay, which is not to say that the developers are unconcerned with such things.
Posted by: Nathan Mehta | 08/12/2019 at 09:45 PM