'Roomba is like your pet and Alexa is like your friend': How kids view home smart devices

Six-year-old Brennan Kelly loves peppering his family’s Alexa device with questions. “What’s the strongest Pokemon?” “What noise does a frog make?” and “Can you play ‘Watermelon Sugar’ by Harry Styles?” are a few of his most recent inquiries.

Home smart devices, like Alexa, have boomed in popularity in the last decade. It's now possible to have a semi-intelligent machine in every room, which means that more and more kids like Brennan are growing up with smart devices. But do children actually believe that smart devices are capable of human thoughts and feelings?

Developmental psychologists from Duke and Cornell Universities found that young children selectively attribute human mental abilities to certain smart devices but not to others. These findings, published in the journal Developmental Psychology in April, had kids consider two common gadgets: Alexa and Roomba.

Tess Flanagan, a former graduate student at Cornell University who co-authored the study, says that kids are able to compare the different abilities of Alexa and Roomba and will then treat each device accordingly.

“Now we're seeing with kids that they have this sophisticated understanding, where they are able to think, ‘OK, a Roomba it moves autonomously, but it doesn't have feelings or mental abilities,’” said Flanagan. “What separates Alexa and Roomba is that they think the Alexa has mental abilities. So children will say that the Alexa can think, that it can choose to speak. They also think that the Alexa has emotions, too.”

The chatty cylinder and the hungry disc

The researchers specifically selected Alexa and Roomba because they were interested in interactive technologies that children are familiar with from their everyday lives. Alexa is an Amazon smart speaker that comes in a variety of shapes, from a squat orb to a Pringles-can cylinder. It possesses a human-like voice that can respond to simple questions or requests. Roomba is an elongated disc-shaped vacuum that moves autonomously through its environment. Both machines, importantly, lack any distinctively human features, like a face or body.

While the devices don’t necessarily look like people, Tamar Kushnir, Duke psychology and neuroscience professor and co-author on the study, says that the machines’ unique abilities mirror specific human behaviors.

“One of the interesting contrasts between Alexa and Roomba is that Alexa talks and communicates,” said Kushnir. “And a Roomba, it moves, and it reacts to the environment. It's very physical, but it doesn't communicate. So this particular contrast interested us because these two interactive technologies were giving different signals to kids about what their capabilities were.”

In the study, more than 100 children aged 4 to 11 watched short videos of Roomba and Alexa in action. Kids then answered questions on the devices’ potential for feelings, thoughts, and emotions. Overwhelmingly, younger kids deemed Alexa smarter than the Roomba. The chattering speaker seemed to possess a human mind capable of complex thoughts, compared to the silent Roomba. Kids also concluded that neither device seemed to experience strong physical sensations like hunger, pain, or ticklishness.

A child whispers secrets to her family's Alexa. Courtesy Of Tess Flanagan / Cornell University

“[Roomba] reacts, it senses, it maybe even wants things but it doesn't really have higher-level cognition,” Kushnir explained. “Interestingly, children thought that [Alexa], which has no body, no physical manifestation, actually could think and could feel emotion, and knew the difference between right and wrong. …So the real difference between the two was basically: a Roomba is like your pet, and Alexa is like your friend.”

Older kids in the study, however, did not view Alexa as their intelligent friend so readily. It seems that children’s belief in Alexa’s intelligence fades with age. Flanagan said that with more experience using the devices, older kids may be able to better gauge Alexa’s limitations. If, for example, a child asked Alexa a creative question and the machine could not adequately respond, then the child may start to question whether Alexa is actually engaging them in an intelligent conversation.

Across all age ranges, kids did agree on one thing: It's wrong to hit or yell at either smart device. The researchers are still trying to parse out exactly why children are motivated to treat the machines with kindness. One possibility is that kids consider the device an expensive piece of property that could break if mistreated. In line with this thinking, a 10-year-old from the study commented that “the microphone sensors might break if you yell too loudly.” A second theory is that this response could be morally motivated: Just like it's wrong to hit a person, it's wrong to hit a device. When asked if it's OK to harm the devices, another 10-year-old said no, explaining that “the robot will actually feel really sad.”

Sci-fi robots as research inspiration

The moral treatment of technology was what originally inspired Flanagan to pursue this research. As a big sci-fi fan, she was always drawn to shows like Westworld, which explored the psychology behind human-robot interactions. Flanagan found that while Hollywood was quick to analyze how adults treat robots, most media never examined how children would communicate with sentient machines.

“I was watching [Westworld] and seeing all these horrific things that people were doing to these technologies and what this very sad, dystopian future looks like to us,” said Flanagan. “And what I was thinking was like, ‘Well, what do kids think? What would kids actually do in this situation?’”

Flanagan felt relieved to see that kids are kind to devices, especially, as she notes, since interactive technologies do not seem to be going anywhere soon.

Professor Alexa — teaching soon at a classroom near you?

During the COVID-19 pandemic lockdown, many kids turned to interactive technologies for both education and socialization. Flanagan and Kushnir are now considering the educational potential these devices could hold. The pair are currently designing studies to assess whether smart devices make a valuable contribution in the classroom.

“Especially where technologies are being used in classrooms to facilitate children's learning along with human teachers, are they useful?” asked Kushnir. “In what ways can we make them more useful?”

Advanced forms of artificial intelligence, like ChatGPT and other chatbots, have received a lot of press in recent months, especially due to their controversial use in generating school assignments. Further studies on smart devices could shed light on whether educators can effectively incorporate devices and AI into the curriculum.

Amita Kelly, Brennan’s mother, said that smart device usage in education may be a double-edged sword. She is teaching Brennan to verify what Alexa tells him with a quick Google search and to not always trust what their smart device says.

“It is really important to me to teach our son to be able to navigate this new world,” Kelly said. “In terms of education, I think there are good sides to AI and ChatGPT and ways that it can make your life easier, just like any technology. But I think if you are not able to navigate them…that it can be really challenging.”