When University of Washington researchers accidentally silenced children quacking at a cartoon duck, they repurposed the response to gain insights into kids’ interactions with voice-activated electronics. (UW image from the game Cookie Monster’s Challenge)

To win the bonus game in Cookie Monster’s Challenge, a player must say “quack” to a small, white cartoon duck as it paces across the tablet screen, holding a wing to its ear.

However, when University of Washington researchers gave the game and tablets to their study participants — a group of 14 preschoolers — they inadvertently introduced a glitch that caused the devices to not always hear the children.

The kids were left quacking in vain.

But the UW scientists collected some useful observations about young children interacting with voice-activated electronics. And their recently published research, which essentially examines kids trying to make themselves understood by a toy, raises some larger questions about the importance of refining human interactions with digital devices to make sure that they work for everyone.

Close to 40 million U.S. homes have an Amazon Echo or Google Home personal assistant, and that number could grow to more than half of America’s households by 2022.

“The way touch screens totally transformed what young kids can do with technology and use technology is relevant here,” said Alexis Hiniker, an assistant professor at the UW Information School and director of the User Empowerment Lab. “It just opens a lot of possibilities for their participation.”

Alexis Hiniker, an assistant professor at the UW Information School and director of the User Empowerment Lab. (Photo courtesy of Alexis Hiniker)

Little kids don’t talk like adults; they sometimes stumble over pronunciations or use incorrect words. Parents and teachers will pick out the correct words and zero in on the muddled bits to help the children repeat their phrases more clearly.

Through this accidental experiment, the scientists, who used the tablets to record the kids, found that the preschoolers used various strategies to try to make themselves understood, repeating their “quacks” and tweaking the tone and pronunciation of the words, just as they would when speaking to another person. The tablets captured 107 interactions between the kids and the device.

“The kids were so ready to get this failed communication back on track,” said Hiniker, who was the senior author of the study. “Every time the app re-prompted them, they would always try again.”

The takeaway, Hiniker concluded, was that engineers need to facilitate more nuanced responses from voice-activated devices. Instead of the terse “Sorry, I didn’t get that” currently stated by personal digital assistants like Alexa or Siri, the devices could coach the speaker, repeating the words that were clear and prompting the user to say certain words again.
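That coaching idea can be sketched in a few lines. This is a hypothetical illustration, not code from the study: given the words a recognizer expected and the words it actually caught, the device echoes back what was clear and asks the speaker to retry only the unclear parts, rather than issuing a flat failure message.

```python
def coaching_prompt(expected: list[str], heard: list[str]) -> str:
    """Build a re-prompt that names the words heard clearly and asks
    the speaker to try only the missing ones again (illustrative sketch)."""
    heard_set = {w.lower() for w in heard}
    clear = [w for w in expected if w.lower() in heard_set]
    unclear = [w for w in expected if w.lower() not in heard_set]
    if not unclear:
        return "Got it!"
    if clear:
        # Echo the recognized words, then prompt for the muddled ones.
        return (f"I heard {' '.join(clear)} -- "
                f"can you say {' '.join(unclear)} again?")
    # Nothing was recognized; ask for the whole phrase once more.
    return f"Can you try saying {' '.join(expected)} one more time?"


print(coaching_prompt(["feed", "the", "duck"], ["feed", "the"]))
# prints "I heard feed the -- can you say duck again?"
```

A real assistant would work from recognizer confidence scores rather than exact word matches, but the shape of the response — confirm what landed, prompt for what didn’t — is the coaching behavior Hiniker describes.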

Refining these responses could have benefits beyond the Sesame Street set. The Washington Post recently partnered with researchers to test voice-operated personal assistants and found that a variety of foreign accents routinely befuddled the devices.

“For many across the country, the wave of the future has a bias problem,” the article stated, “and it’s leaving them behind.”

These tech creations clearly have broad ramifications — a truth that engineers and designers might not fully appreciate in the early days of a project. It’s a consideration that’s currently front-and-center as Mark Zuckerberg and others at Facebook wrestle with the reality that a platform for connecting college students became a weapon for foreign powers to derail America’s democratic process.

For her part, Hiniker created a UW course called “Designing for Evil” that was first offered last spring to computer science and informatics majors. It delved into ethical and philosophical thought and examined tech applications through that lens.

A potential fix for communication between children and voice-activated devices. (Image from Alexis Hiniker’s study, which was published in the proceedings of the 17th Interaction Design and Children Conference, held in June in Trondheim, Norway.)

Hiniker, who is a former Microsoft engineer, would love to see the sort of moral questions and discussions that her students tackled infused more widely in computer science curricula. Tech leaders and engineers need to be vigilant in identifying and responding to unintended or problematic consequences of their creations, she said, and to pivot quickly when issues arise.

“Technology is so pervasive and transformative for people at scale,” Hiniker said. The work “requires an incredible amount of thought, and having the humility to know that we’re never getting it all right.”

Other authors on Hiniker’s study were Yi Cheng, who graduated from the UW Information School; Kate Yen, who graduated from the UW Department of Sociology; Yeqi Chen, a senior in the UW Department of Electrical Engineering; and Sijin Chen, a master’s student in the UW Department of Human Centered Design & Engineering.

The study was published in the proceedings of the 17th Interaction Design and Children Conference, held in June in Trondheim, Norway. The research received financial support from Sesame Workshop.
