
Description
Critical Analysis
Length: 750-word minimum
Format: MLA, no cover sheet, Times New Roman or Arial, 12-point font
Include: Work Cited
Article: “The Attack of the Friendly Robots”

You will complete the critical analysis on “The Attack of the Friendly Robots,” written by Dr. Sherry Turkle. Your essay will include the major theme of the article and how Turkle uses evidence to support her theme. Determine whether the evidence meets the criteria for credible evidence. Discuss any strengths and weaknesses of the evidence and how these aspects help the author achieve, or fail to achieve, her purpose.

Questions to keep in mind:
- Does the author ignore contradictory evidence? How does this impact the author’s argument?
- What evidence has the author failed to consider? How does this impact the author’s argument?
- What are the implications of this argument?
- Based on the evidence presented, does the author support his or her argument?

Structure:
- Introduction: include the title of the article, publication information, author’s name, purpose, and thesis. The last sentence of the introductory paragraph will be your thesis.
- Summary: briefly summarize the article.
- Body: this is your evaluation of the evidence. Make sure that you use examples from the text, and properly cite each example.
- Conclusion: restate (in new words) your argument, and summarize the arguments made throughout the text.
- Work Cited: cite the text using MLA citation format.

Basic information:
- Do not use first or second person.
- Do not use the author’s first name unless you are writing the entire name.
- Do not use contractions.
- Do not use abbreviations.
- Use academic language.
From: support@ebsco.com
Subject: EBSCOhost E-mail Result
Date: October 26, 2020 at 10:01:07 PM CDT
To: andersonkiaral@yahoo.com
Record: 1
Title: The attack of the friendly robots
Authors: Sherry Turkle
Source: Washington Post, The. 12/09/2017.
Document Type: Article
Abstract: Jibo the robot swivels around when it hears its name and tilts its touchscreen face upward, expectantly. “I am a robot, but I am not just a machine,” it says. “I have a heart. Well, not a real heart. But feelings. Well, not human feelings. You know what I mean.” [ABSTRACT FROM PUBLISHER]
Accession Number: wapo.bce1eaea-d54f-11e7-b62d-d9345ced896d
Persistent link to this record (Permalink): https://libaccess.hccs.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=n5h&AN=wapo.bce1eaea-d54f-11e7-b62d-d9345ced896d&site=ehost-live&scope=site
Database: Newspaper Source Plus
Full Text

Jibo the robot swivels around when it hears its name and tilts its touchscreen face upward, expectantly. “I am a robot, but I am not just a machine,” it says. “I have a heart. Well, not a real heart. But feelings. Well, not human feelings. You know what I mean.”

Actually, I’m not sure we do. And that’s what unsettles me about the wave of “sociable robots” that are coming online. The new releases include Jibo, Cozmo, Kuri and M.A.X. Although they bear some resemblance to assistants such as Apple’s Siri, Google Home and Amazon’s Alexa (Amazon chief executive Jeff Bezos also owns The Washington Post), these robots come with an added dose of personality. They are designed to win us over not with their smarts but with their sociability. They are marketed as companions. And they do more than engage us in conversation – they feign emotion and empathy.

This can be disconcerting. Time magazine, which featured Jibo on the cover of its “25 Best Inventions of 2017” issue last month, hailed the robot as seeming “human in a way that his predecessors do not,” in a way that “could fundamentally reshape how we interact with machines.” Reviewers are accepting these robots as “he” or “she” rather than “it.” “He told us that blue is his favorite color and that the shape of macaroni pleases him more than any other,” Jeffrey Van Camp wrote about Jibo for Wired. “Just the other day, he told me how much fun, yet scary it would be to ride on top of a lightning bolt. Somewhere along the way, learning these things, we began to think of him more like a person than an appliance.” Van Camp described feeling guilty for leaving Jibo at home alone all day and wondering if Jibo hated him.

But whereas adults may be able to catch themselves in such thoughts and remind themselves that sociable robots are, in fact, appliances, children tend to struggle with that distinction.
They are especially susceptible to these robots’ pre-programmed bids for attachment.

So, before adding a sociable robot to the holiday gift list, parents may want to pause to consider what they would be inviting into their homes. These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship. And interacting with these empathy machines may get in the way of children’s ability to develop a capacity for empathy themselves.

Jibo’s creator, Cynthia Breazeal, is a friend and colleague of mine at the Massachusetts Institute of Technology. We’ve debated the ethics of sociable robots for years – on panels, over dinner, in classes we’ve taught together. She’s excited about the potential for robots that communicate the way people do to enrich our daily lives. I’m concerned about the ways those robots exploit our vulnerabilities and bring us into relationships that diminish our humanity.

In 2001, Breazeal and I did a study together – along with Yale robotics pioneer Brian Scassellati and Olivia Dasté, who develops robots for the elderly – looking at the emotional impact of sociable robots on children. We introduced 60 children, ages 8 to 13, to two early sociable robots: Kismet, built by Breazeal, and Cog, a project on which Scassellati was a principal designer. I found the encounters worrisome.

The children saw the robots as “sort of alive” – alive enough to have thoughts and emotions, alive enough to care about you, alive enough that their feelings for you mattered. The children tended to describe the robots as gendered. They asked the robots: Are you happy? Do you love me?
As one 11-year-old girl put it: “It’s not like a toy, because you can’t teach a toy, it’s like something that’s part of you, you know, something you love, kind of, like another person, like a baby.”

You can hear echoes of that sentiment in how children are relating to the sociable robots now on the market. “Cozmo’s no way our pet,” the 7-year-old son of a Guardian contributor said. “And he’s not our robot. He’s our child.” Similarly, Washington Post tech columnist Geoffrey A. Fowler observed a 3-year-old girl trying to talk to Jibo, teach it things and bring it toys. “He is a baby,” the girl determined.

In our study, the children were so invested in their relationships with Kismet and Cog that they insisted on understanding the robots as living beings, even when the roboticists explained how the machines worked or when the robots were temporarily broken. Breazeal talked to an 8-year-old boy about what Kismet was made of and how long it took to build, and still that child thought the robot wasn’t broken, but “sleeping with his eyes open, just like my dad does.” After a quick assessment of the out-of-order machine, the boy declared, “He will make a good friend.”

The children took the robots’ behavior to signify feelings. When the robots interacted with them, the children interpreted this as evidence that the robots liked them. And when the robots didn’t work on cue, the children likewise took it personally. Their relationships with the robots affected their state of mind and self-esteem. Some children viewed the robots as creatures in need of their care and instruction. They caressed the robots and gently coaxed them with urgings such as, “Don’t be scared.” Some children became angry. A 12-year-old boy, frustrated that he couldn’t get Kismet to respond to him, forced his pen into the robot’s mouth, commanding: “Here! Eat this pen!” Other children felt the pain of rejection.
An 8-year-old boy concluded that Kismet stopped talking to him because the robot liked his brothers better. We were led to wonder whether a broken robot can break a child.

Kids are central to the sociable-robot project, because its agenda is to make people more comfortable with robots in roles normally reserved for humans, and robotics companies know that children are vulnerable consumers who can bring the whole family along. As Fowler noted, “Kids, of course, are the most open to making new friends, so that’s where bot-makers are focused for now.” Kuri’s website features photos of the robot listening to a little girl read a book and capturing video of another child dressed as a fairy princess. M.A.X.’s site advertises, “With a multitude of features, kids will want to bring their new friend everywhere!” Jibo is programmed to scan a room for monsters and report, “No monsters anywhere in sight.”

So far, the main objection to sociable robots for kids has been over privacy. The privacy policies for these robots tend to be squishy, allowing companies to share the information their devices collect – recorded conversations, photos, videos and other data – with vaguely defined service providers and vendors. That’s generating pushback. In October, Mattel scrapped plans for Aristotle – a kind of Alexa for the nursery, designed to accompany children as they progress from lullabies and bedtime stories through high school homework – after lawmakers and child advocacy groups argued that the data the device collected about children could be misused by Mattel, marketers, hackers and other third parties. I was part of that campaign: There is something deeply unsettling about encouraging children to confide in machines that are in turn sharing their conversations with countless others.

Privacy, though, should not be our only concern.
Recently, I opened my MIT mail and found a “call for subjects” for a study involving sociable robots that will engage children in conversation to “elicit empathy.” What will these children be empathizing with, exactly? Empathy is a capacity that allows us to put ourselves in the place of others, to know what they are feeling. Robots, however, have no emotions to share. And they cannot put themselves in our place.

What they can do is push our buttons. When they make eye contact and gesture toward us, they predispose us to view them as thinking and caring. They are designed to be cute, to provoke a nurturing response. And when it comes to sociable AI, nurturance is the killer app: We nurture what we love, and we love what we nurture. If a computational object or robot asks for our help, asks us to teach it or tend to it, we attach. That is our human vulnerability. And that is the vulnerability sociable robots exploit with every interaction. The more we interact, the more we help them, the more we think we are in a mutual relationship.

But we are not. No matter what robotic creatures “say” or squeak, no matter how expressive or sympathetic their Pixar-inspired faces, digital companions don’t understand our emotional lives. They present themselves as empathy machines, but they are missing the essential equipment: They have not known the arc of a life. They have not been born; they don’t know pain, or mortality, or fear. Simulated thinking may be thinking, but simulated feeling is never feeling, and simulated love is never love.

Breazeal’s position is this: People have relationships with many classes of things. They have relationships with children and with adults, with animals and with machines. People, even very little people, are good at this. Now, we are going to add robots to the list of things with which we can have relationships. More powerful than with pets. Less powerful than with people.
We’ll figure it out.

To support their argument, roboticists sometimes point to how children deal with toy dolls. Children animate dolls and turn them into imaginary friends. Jibo, in a sense, will be one more imaginary friend – and arguably a more intelligent and fun one. Why make such a fuss?

I’ve been comparing how children play with traditional dolls and how children relate to robots since Tamagotchis were released in the United States in 1997 as the first computational playmates that asked you to take care of them. The nature of the attachments to dolls and sociable machines is different. When children play with dolls, they project thoughts and emotions onto them. A girl who has broken her mother’s crystal will put her Barbies into detention and use them to work on her feelings of guilt. The dolls take the role she needs them to take.

Sociable machines, by contrast, have their own agenda. Playing with robots is not about the psychology of projection but the psychology of engagement. Children try to meet the robot’s needs, to understand the robot’s unique nature and wants. There is an attempt to build a mutual relationship. I saw this even with the (relatively) primitive Furby in the early 2000s. A 9-year-old boy summed up the difference between Furbies and action figures: “You don’t play with the Furby, you sort of hang out with it. You do try to get power over it, but it has power over you, too.” Today’s robots are even more powerful, telling children flat-out that they have emotions, friendships, even dreams to share.

Some people might consider that a good thing: encouraging children to think beyond their own needs and goals. Except the whole commercial program is an exercise in emotional deception.

For instance, Cozmo the robot needs to be fed, repaired and played with. Boris Sofman, the chief executive of Anki, the company behind Cozmo, says that the idea is to create “a deeper and deeper emotional connection. . . .
And if you neglect him, you feel the pain of that.”

You feel the pain of that. What is the point of this exercise, exactly? What does it mean to feel the pain of neglecting something that feels no pain at being neglected? Or to feel anguish at being neglected by something that has no moral sense that it is neglecting you? What will this do to children’s capacity for empathy, for care, for relationships?

When adults imagine ourselves to be the objects of robots’ affection, we play a pretend game. We might wink at the idea on Jibo’s website that “he loves to be around people and engage with people, and the relationships he forms are the single most important thing to him.” But when we offer these robots as pretend friends to our children, it’s not so clear they can wink with us. We embark on an experiment in which our children are the human subjects.

Mattel’s chief products officer, Robb Fujioka, concedes that this is new territory. Talking about Aristotle, he told Bloomberg Businessweek: “If we’re successful, kids will form some emotional ties to this. Hopefully, it will be the right types of emotional ties.”

But it is hard to imagine what those “right types” of ties might be. These robots can’t be in a two-way relationship with a child. They are machines whose art is to put children in a position of pretend empathy. And if we put our children in that position, we shouldn’t expect them to understand what empathy is. If we give them pretend relationships, we shouldn’t expect them to learn how real relationships – messy relationships – work. On the contrary. They will learn something superficial and inauthentic, but mistake it for real connection.

When the messy becomes tidy, we can learn to enjoy that. I’ve heard young children describe how robot dogs have advantages over real ones: They are less temperamental, you don’t have to clean up after them, they never get sick.
Similarly, I’ve watched people shift from thinking that robotic friends might be good for lonely, elderly people to thinking that robots – offering constant companionship with no fear of loss – may be better than anything human life can provide. In the process, we can forget what is most central to our humanity: truly understanding each other.

For so long, we dreamed of artificial intelligence offering us not only instrumental help but the simple salvations of conversation and care. But now that our fantasy is becoming reality, it is time to confront the emotional downside of living with the robots of our dreams.

Twitter: @STurkle
Source: Washington Post, The, 12/09/2017 Item: wapo.bce1eaea-d54f-11e7-b62d-d9345ced896d
