



Samie Conyers

Although the voices of these devices default to feminine, that does not undermine women. The voice is a standard “robot” voice that you hear everywhere. In movies, shows, and our personal technology, these voices are automatically assumed to be those of a robot. Also, females typically don’t sound like that. The voice sounds robotic and fake, and is not normally comparable to your average woman’s.

brooke natalie wood

Women’s servitude has been an outstanding issue dating back to the first civilization of Mesopotamia. Throughout history women have been regarded as intellectually and physically inferior to men, creating a pressing need for social reorder. Through modernization and progressive movements, this suppression has slowed and even reversed in certain ways. Yet certain aspects of society have failed to keep pace with recent progress in gender equality. Siri and other technological services that aim to aid a person’s everyday life fail to conform to new gendered expectations. These services promote the idea that a female’s function is to serve others, an idea expressed through the femininity mimicked in their voices. Not only does this slight the progress women have made to equalize the genders, it allows dated ideals of a woman’s role to return.

The Unfavorite Lit Group

The prominence of voice assistants allows Americans to delegate seemingly rudimentary tasks, such as checking the weather and scheduling appointments, to computers. The intention of tech companies was to make the lives of Americans easier. However, there is question as to what the true motives of these companies are. All of the voice assistants are female (Siri, Alexa, Cortana, etc.), which sparks debate over whether the intentions of Apple, Google, and Amazon are pure. Female voices are much more alluring than male ones, for reasons that remain largely unknown. Using female voices for Alexa, Cortana, and Siri allows Amazon, Google, and Apple to sell more units. But it also resurfaces stereotypes against women that were supposedly abolished back in the 20th century. Voice assistants are like virtual maids that serve to address all aspects of virtual life. They remain in the house to complete the simplest of tasks. As with the housewives of the 1950s, it is unheard of for a male to take over any of these tasks. To companies, a female voice is simply a marketing strategy. The high-level executives are not concerned with the patronization of women; they only care about what will make the most money. And, in this twisted society that is America today, only a female voice is acceptable for a virtual home assistant.


While the idea of sexism is one of importance in today’s society, using female bots as the base of an argument against sexist actions is hardly credible. While these bots do have feminine voices, they are in no way objectifying women as a whole, nor are they reversing the positive change made in recent centuries. If they were voiced by a man, would these bots be symbols of male servitude? No, because the purpose of these bots is solely to provide service, whether it be forecasting the weather, making a phone call, or answering a question. The author’s own statement, “If you can’t have a real woman who will cater to your every whim, buy a virtual one,” contradicts her argument by suggesting that the only significance of such robots is to provide what a real woman does, which is a sexist comment in itself, more so than any robot the author referenced.

Sophia Waninger

In this article, it is brought to the public’s attention that the popular craze of virtual robots is shaping the way people’s lives develop. Numerous companies and organizations have advanced their technological approach in order to obtain more revenue, even if it means setting negative examples and stereotypes in order to please their audience. After centuries of women being treated unequally in comparison to men, it is not beneficial to the widely supported women’s movement to give a feminine stereotype to a robot that acts as a prop in daily life. The worst part is that most of the robots’ voices simulate a woman’s voice by default; even Apple’s Siri allows people to command a female to do certain things for their own benefit. How was this problem stressed to the public? Once the movie “Her” came out in theaters, people realized how horrifying a person’s control over a simple robot can be, and how it can influence the lifestyles of adults and of children growing up in this era. The situations the main character gets himself into are highly grotesque and involve a fictional robot called “Samantha.” This particular instance shows that although these robots are helpful for daily use, such as playing music, they have been psychologically and emotionally dangerous to human beings who have thought of doing things other than simply asking for music. All in all, these female robots are beneficial to society, but the negative outlooks they carry are shaping society’s true feelings.

Molly Barkley, Mike Smith, Jacob Strauss, Elizabeth Galvich

Although sexism is a prevalent as well as relevant issue in the present day, the claim that the default robotic voice, such as that of the Siri or Alexa programs, is sexist is a bit of a stretch. The gender of the default voice can easily be changed in the settings of one’s device. Siri could be a British male, a South African female, or even Irish of either gender. Siri is simply a virtual assistant, lacking human emotion or the complexity of true artificial intelligence. Either way, Siri or even Alexa as a program would have been assigned a default gender. Upon being questioned, Siri does not identify with any gender; it is simply a pitch of voice, with no pronouns assigned. The labeling is meant for convenience, to understand which tone of voice one will receive. If Siri had been male by default, similar accusations would have been made by women: that it is inherently sexist, suggesting that men are superior.


In the article it comes to light that even in today’s society robots are going to shape the way women are supposed to act, look, and be perceived. Siri and many other robotic programs default to a woman’s voice. This makes women in today’s society seem inferior to men in the way they speak. The psychological factors can and will shape the way men think they can speak down to a female, and also set a standard that will not be met, considering women are humans and the robot is just a form of artificial intelligence. Parramore speaks on the issue through the movie “Her,” which tells the story of a man growing frustrated because “Samantha,” the robot, does not have a body. This is not acceptable, as it sets standards that will not be met by women.

Brycen Roberts

In her article, Lynn Stuart Parramore argues that companies such as Apple and Amazon are using female voices as a way for males to control women. I strongly disagree with this article in a variety of ways. For one thing, you can change the voice of Siri on the iPhone so that it is a male voice; you aren’t forced to use the default female voice. In the case of Alexa, she is a virtual assistant, not a real person. Alexa doesn’t have emotions or feelings; she is a robot meant to help with daily tasks, and I do not believe she represents the housewife stereotype that Lynn argues. As for the idea that men want Siri and Alexa to have a physical body so that they can get off on their sexual fantasies, that sounds completely ridiculous and is definitely not true of a majority of us. These voice assistants are just that, assistants who are here to provide us with information, not to be sex robots. In conclusion, I don’t think that Siri and Alexa are made to serve males; rather, the default robot voice is there to help anyone with rudimentary tasks and to provide information in a quick and efficient manner.


I think that whoever has the idea that the people making these Siris and Alexas are trying to degrade women has a very ignorant mind. The fact is that everyone has become sensitive nowadays and just looks for something really small in order to make an argument against it. They may all have women’s voices because women’s voices are genuinely more appealing than men’s voices. Maybe they even chose a woman’s voice just because they chose it. On Siri, you can change the voice to a man’s if you want to.


Although I agree that the seemingly insignificant fact that today’s technological assistants have female voices may actually be very important, this does not make it intentional. I feel it is very unlikely that companies are purposely trying to undermine females. When Apple set the precedent for voice assistants with Siri, other companies most likely did not think much of making theirs in the same likeness, which was female. Also, Siri does not have to be Siri. Apple devices give you the ability to change the gender and accent of your voice assistant. Large companies like Apple and Amazon also would not want to upset a large portion of their consumers intentionally, because that would only decrease sales. Though I understand why some might be offended by the use of female voices, it should not be taken so seriously unless the female voice is being abused for bad intentions.


Although it may seem that women are being sexualized through the voices of Siri and Alexa, this is not the case. In the sex industry there are dolls to pleasure both men and women, but Siri and Alexa are not even a part of the sex industry. You are overanalyzing this if you think that just because Siri and Alexa have female voices, it automatically means that men are encouraging the idea that women are subservient to men.


I completely agree that sexism is an issue that is very prevalent today, but I don’t know if I agree that these female technological assistants are reinforcing sexist attitudes. Apple allows its users to change the gender and accent of Siri. While Amazon’s Alexa does not provide the same option, it has the same use as Apple’s Siri: it assists you in searching the web, finding articles, calling someone, etc. These robots are exactly that: robots. They aren’t human, they don’t feel real emotion, and they do not have a physical body. I think the topic Lynn Stuart Parramore should be concerned about is the conspiracy that these robots are listening in on your conversations 24/7, not that they’re reinforcing sexist attitudes.

Aminah Muhammad

What Parramore means when she says “it’s a man-made culture on steroids” is that men created the idea of objectifying women, and the culture of having female voices in our technology is sexist because it is more well liked overall by men. Parramore’s argument is that the assistant is a more robotic version of a woman, so she doesn’t argue or nag or want anything; she is just there to please people. She uses evidence like the movie Her, in which a man falls in love with a digital woman but is frustrated by the fact that she doesn’t have a body. Parramore’s frustration with this idea is that people will hyper-sexualize the bodies of the virtual voices. Not only does she refer to the movie Her, but she also cites other feminists, like Simone de Beauvoir, who hold the view that “men are the ones in charge, women are there to support.”

Jackson Ruh

In the article “Female digital assistants like Alexa and Siri remain popular in Silicon Valley. It is time to boycott them,” the author, Lynn Parramore, argues that Siri, Amazon Alexa, and other voice-assistant technologies reinforce sexist ideas about women. Her purpose in writing this article is to show the negative effects of these technology systems and, overall, to call for boycotting and getting rid of these assistants. She argues that these big companies need to get rid of these machines and come up with a more intelligent design. In the article Parramore uses specific logos and pathos to back up her argument. The author proceeds to say that these technology assistants project a negative, sexist view of women.
I disagree with Parramore’s view that Alexa, Siri, and other technology robots should be boycotted and eventually disappear. Although I concede that some may find the negative image they portray of women offensive, I do not believe we should get rid of them. By focusing on digital assistants and the negative image some believe they bring to women, she overlooks the deeper problem of sexism in our society, which is not due to these machines. In the first paragraph Parramore states that these “female servitudes” get a negative image for reasons such as “You don’t have to pay her. She does not have any rights. In fact, she is sub-human” (Parramore 2019). I disagree with this argument because I believe that women do have rights in our society. Finally, I believe that these digital assistants are needed in our society today because of how our world has come to depend on technology. Technology is not the cause of the problem of sexism in today’s culture; people need to look deeper and see that the problem lies with people.

Christina DiMaria

In this article, Lynn Parramore argues that by making most of the voices for artificial intelligence female, Big Tech is undermining the hard-won rights women have received in regards to equal pay and treatment. Parramore begins her argument by discussing the various types of female-voiced artificial intelligence, such as Siri, Alexa, and the newly released “Erica,” a service that helps pay your bills, and how blatantly sexist they are. She further discusses examples of this, such as “subservience in language” present in most forms of artificial intelligence, ads for Amazon Alexa that show large sensual lips, and chatbots that have sensual women as avatars. What effect does this have on women? Parramore explains that this associates femininity with certain tasks and behaviors that secure women’s place as secondary. This makes it more apparent that the role for women should be to assist and support while the man is left in charge, according to Parramore. Parramore emphasizes the impact of this, saying that women who grow up in environments where sexist beliefs are prevalent tend to earn less, and those who are objectified experience lost wages, financial consequences, and narrowed opportunities. Parramore proposes one solution: boycott the companies and request new voices for artificial intelligence.
I disagree with Parramore’s view that the many female voices of artificial intelligence undermine the rights women have received regarding equal pay and treatment. Although I do agree that the sensual avatars are a bit unnecessary, I do not think the female voices of artificial intelligence instill a second-nature instinct to see women as submissive and secondary. Firstly, on most forms of artificial intelligence, you can adjust the voice to be female or male. Secondly, I cannot seem to find any relationship between female-voiced artificial intelligence and negative impacts on women. From what I can tell, as the years progress, which comes with the creation of artificial intelligence, there have been more advancements and conversation on feminism and equal opportunity for women. I cannot seem to find any facts or evidence to support the claim of this negative impact on women either. Although I do agree that there is a huge economic and psychological impact on women when we encourage a sexist environment, I do not agree that this is being done through artificial intelligence, and I also do not think there would be any need to boycott the companies that create it.

Shani J

I personally have never thought of Alexa’s and Siri’s creation as sexist. Yes, they both have the sound of a female, but it’s a robotic sound. They do not sound like human beings and could not replace women as a whole. I do, on the other hand, see that it could teach the younger generation that if you tell a woman to do something, she will willingly do it for you. I think this thought would only stick if a child didn’t have any other human contact. The article also mentions the newly popular chatbot, which I hadn’t heard of until I read this article. Now, this one is a bit different because it has a 3-D avatar, a nationality, and a sexy sound. This reflects a different persona of a woman, but in robot form. I agree with the writer’s opinion on this one, simply because it does seem sexist due to there being that type of visual. Sex sells, and if it’s being marketed that way, then I can see the outrage. All in all, whether people boycott or not, people in this generation like quick accessibility, and Alexa and Siri provide just that. It is no different from internet surfing, only Alexa and Siri will talk to you.

Elizabeth Holloway

When Parramore says that it is a “man-made culture on steroids,” she is referring to the fact that men have been dissatisfied in a society where women have gradually been given more opportunities and advancements in the workplace over the past several decades. From the man’s perspective, this has led to less attention being given to men and a neglect of their needs. As a way to counter that, these men have now manufactured their own subservient she-bot who will be there to assist and serve them 24/7. Parramore explains that it is dangerous to link women to a subhuman form, since this will inevitably lead to those views being cast onto real women. Children interact with Alexa and Siri more and more each day, which means this is becoming an ingrained and subconscious practice from a very young age.

Teague H

Parramore's article tackles the subject of virtual assistants having default female voices. This never really occurred to me as being sexist; in addition, you can go into the settings and change the voice to any gender and any number of accents you may choose. Still, I do understand Parramore's point when she makes the connection between the female voice and serving, or the completion of demanded tasks. She goes on to talk about the repercussions these feminized virtual assistants could cause, leading young women to grow up with this idea of subservient behavior after generations of women have already fought for equality in a world divided by gender. I think one of the best ways to fix this problem would be to simply give the option of both male and female voice settings upon setup of the device; that way there would be no clear bias toward one gender or another.

Alex Davis

This article by Lynn Stuart Parramore is about how tech companies, in their race to create the best applications for artificial intelligence, have fallen back into creating stereotypical gender roles when it comes to the new AI assistants. While this may seem like a small thing, the implications can actually be far-reaching because AI assistants are becoming more prevalent and will eventually become a standard part of our lives. However, considering the current events ravaging America and the world today, this does seem like something that may not be first on the list of things needing to be fixed. Initially, the tech companies seemed to be creating stereotypically perky, nice, young women’s voices for the AI assistants, but they have since changed that to allow for male voices and different accents. I think it is important to give consumers options so that they can decide who they want as their "assistant". The tech companies are driven by the bottom line, and if consumers demand this optionality, they will get it. In the article, Parramore acts like digital relationships are only for men, but they can also be for women. I think she reaches somewhat when considering the impacts of having only a "Siri" or "Erica" instead of a "Bob" or "Michael". I could see an argument that says these assistants' sex, gender, accent, etc. should be constantly changing to expose people to how the real world is. But companies are more likely to do what gets them the most money, and that is likely the nice, young, perky woman. It's important to understand that tech companies' motivation is selling phones, not social equality.

Karla Ortega

I agree with the argument Lynn Stuart presents, to a certain extent. This article forced me to view virtual assistants in a different light; before reading it, I never thought about how sexism could be involved in promoting a female’s voice as the one that listens to demands. I always believed it just made sense, since women usually are the more nurturing and listening type. Once I reflected on this, though, I realized I was being sexist as well. Lynn Stuart makes a great argument, as she presents it in a satirical manner. She begins her article by stating, “Summon her with a tap or a word. Yell at her if you want… She doesn’t have any rights.” This is how women have been viewed for centuries, and in using these words, she shows how degrading these digital assistants are to women. Although she makes the valid point that these sexist attitudes can be dangerous to kids in the long run, I would have to disagree. Siri and Alexa have the feature to change the gender of the voice, and even the language, which many people put to use. This means that the voice of a woman is not the only one that can be bossed around. Additionally, these assistants are used solely to provide help with small tasks that some might find tedious. She then brings up how this can be damaging to boys, who might fall in love with the assistants’ voices and begin to want sexual relations. Although this is a chilling concept, I would have to say that there are already other sites online that allow boys to partake in a similar manner, and they are easily accessible. Ultimately, although the concept of these assistants can seem to support how women have been viewed for centuries, it does not invalidate the fact that the progress women have made in society is huge. From filling up the workplace to taking control of their bodies, women have definitely come a long way.


About They Say / I Blog

  • New readings posted monthly, on the same issues that are covered in “They Say / I Say” with Readings—and with a space where readers can comment, and join the conversation.
