Nvidia CEO Jensen Huang and Susan Gaither tickle a robotic hand at Nvidia’s robotics research lab in Seattle. (Nvidia Photo)

When Nvidia CEO Jensen Huang interacted with a sensitive robotic hand at today’s open house for his company’s robotics research lab in Seattle, it was love at first touch.

“It almost feels like a pet!” Huang said as he tickled the hand’s fingers, causing them to retreat gently.

“It’s surprisingly therapeutic,” he told the crowd around him. “Can I have one?”

The robotic hand, which is programmed to avoid poking humans when they come too close, was just one of the machines on display at the 13,000-square-foot lab in Seattle’s University District.

Nvidia is based in California’s Silicon Valley and has nearly 200 employees working at an engineering center in Redmond, Wash. But when the chipmaker laid plans to open a lab focusing on research in robotics and artificial intelligence, it set up shop in the same building that houses the University of Washington’s CoMotion Lab. It also put Dieter Fox, a longtime computer science professor at UW, in charge of the operation as senior director of robotics research.

Huang said Seattle was the natural choice.

“Because of the University of Washington, because of Microsoft, because of Amazon, this has become one of the great hubs of computer science,” he said. “And so it made sense that we thought about this area.”

UW’s tradition of collaboration in computer science was also a selling point.

“Everybody was working with everybody else,” Huang said. “This is very unnatural, frankly, in most universities. They tend to be very isolated. … The collaboration, I felt, was the perfect culture for creating a robotics platform.”

Nvidia conducts a lot of research elsewhere, focusing on applications ranging from virtual-reality systems to self-driving cars to autonomous drones to medical imaging. But Fox said the Seattle lab will be the core facility for basic research in robotics.

“This lab is focusing on the next generation of interactive manipulators,” Fox said.

Nvidia moved into the lab in November, and since then operations have been ramping up toward the goal of having 50 roboticists working on site. At least 20 of those roboticists will be Nvidia employees, and the rest will be visiting academic researchers and students, Fox said.

Today the lab’s tables were strewn with boxes of Cheez-It crackers and Domino sugar, plus cans of Spam and Campbell’s tomato soup. Those ingredients didn’t go into the dishes served at the open house (which included pizza and coconut shrimp, and were quite tasty, by the way). Instead, the boxes and cans are the ingredients for the lab’s deep-learning and computer vision experiments.

Fox and his teammates decided that a kitchen would be the best place to begin testing their robots’ capabilities for image recognition, object manipulation and interaction with humans. So they went down to the Ikea store in Renton, Wash., and bought all the equipment needed for a working kitchen. The testing ground is even outfitted with a sink and an oven.

“We want to ultimately get a robot that can cook a meal with you, or that you can just talk to it and tell the robot what you want to do,” Fox said. “‘Get me the sugar box,’ and you tell the robot it’s in the third drawer from the left … and the robot will be able to do that.”

Teaching robots how to recognize a box of sugar or a can of soup isn’t as easy as it sounds. Nvidia’s system takes advantage of machine learning, training on synthetically rendered pictures of kitchen items shown in a wide variety of randomized poses and lighting conditions.

“We can train only on synthetic data, and get results that work in the real world,” said Stan Birchfield, a principal research scientist at Nvidia.
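
What Birchfield is describing is often called domain randomization. As a rough sketch of the idea (this is illustrative Python, not Nvidia’s pipeline), the snippet below draws random object poses and lighting for each synthetic image; the item names, the parameter ranges and the render_kitchen_item() renderer are all assumptions made up for the example.

    import random

    GROCERY_ITEMS = ["cheez_it_box", "sugar_box", "spam_can", "tomato_soup_can"]

    def random_scene_params():
        # Draw a random object pose and lighting setup for one synthetic image.
        return {
            "item": random.choice(GROCERY_ITEMS),
            "position_m": [random.uniform(-0.5, 0.5) for _ in range(3)],
            "rotation_deg": [random.uniform(0.0, 360.0) for _ in range(3)],
            "light_intensity": random.uniform(0.2, 2.0),
            "light_color_rgb": [random.uniform(0.7, 1.0) for _ in range(3)],
        }

    def generate_dataset(n_images, render_kitchen_item):
        # render_kitchen_item is a hypothetical renderer (e.g. a simulator)
        # that turns scene parameters into an RGB image; it stands in for
        # whatever rendering tool a real pipeline would use.
        dataset = []
        for _ in range(n_images):
            params = random_scene_params()
            image = render_kitchen_item(params)
            dataset.append((image, params["item"]))
        return dataset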

The kitchen manipulator robot being developed at the Seattle lab uses the Nvidia Jetson platform for navigation, and performs real-time inference for processing and manipulation on Nvidia Titan GPUs. The robotic perception system was trained using the cuDNN-accelerated PyTorch deep-learning framework.
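
To give a sense of what a cuDNN-accelerated PyTorch setup looks like in practice, here is a minimal, hypothetical training step (again, not Nvidia’s code) that fits a small off-the-shelf classifier to synthetic images like those sketched above. When the tensors are moved to a CUDA GPU, PyTorch routes the convolutions through cuDNN automatically.

    import torch
    import torch.nn as nn
    from torchvision import models

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Small off-the-shelf backbone with one output per grocery item.
    model = models.resnet18(num_classes=4).to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    def train_step(images, labels):
        # images: float tensor of shape (N, 3, H, W); labels: int class ids.
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
        return loss.item()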

All of the results from the lab’s experiments will be published openly, with the aim of creating an operating system of sorts that could be used in a wide range of robots.

Eventually, Nvidia aims to create software packages or specialized hardware that would come pre-programmed to handle kitchen tasks and could expand that built-in knowledge by watching what human cooks do. Fox said 3-D computer models of kitchen environments — or any home environments, for that matter — could become standard applications that are handed over whenever someone moves into a new home or remodels an old one.

“You buy your kitchen plus the model, and let’s say you have your robot already,” Fox said. “Upload it to the robot, and there’s your robot knowing about the kitchen.”

But the fruits of the Seattle lab’s studies won’t be found only in the kitchens of the future. Someday, personal robots could take on delicate tasks such as giving people with disabilities a clean shave, assisting elderly people who might otherwise have to move out of their homes, or performing jobs that human workers would prefer not to do. Nvidia’s robotics research could help make it so.

“There are so many interesting things that we could spin off in our pursuit of a general AI robot. For example, it’s very likely that in the near future you’ll have ‘exo-vehicles’ around you, whether it’s an exoskeleton or an exo-something that helps people who are disabled, or helps us be stronger than we are,” Huang said.

“It’s very likely that in the future, manufacturing robots don’t have to be programmed — that somehow they learn by either watching us, or they learn by imitation, or learn in general from what its goals are,” he added.

So did Huang’s tour of Seattle’s lab, and his encounter with the robotic hand, spark any ideas for new Nvidia products? “I have many ideas for new products,” he said with a sly smile. “I will tell you about them at GTC.”

For what it’s worth, Nvidia’s GPU Technology Conference, also known as GTC, is 10 weeks away.
