BluHaptics has received a $747,197 grant from the National Science Foundation to work on a virtual-reality robotic control system that could transform underwater operations as much as drones have transformed aerial operations.
The project, which includes a subcontract to the University of Washington’s Applied Physics Laboratory, will use 3-D data fusion and machine learning to develop safer, more intuitive ways to pilot remotely operated vehicles, or ROVs. Such vehicles can capture imagery and manipulate objects miles beneath the sea surface.
“Our technology will make subsea and underwater operations safer,” BluHaptics’ chief technology officer, Fredrik Ryden, said today in a blog posting announcing the NSF’s Phase II Small Business Innovation Research grant. “Divers can be replaced in hazardous situations by telerobots with improved control based on our products. The rate of untoward incidents, and their severity, will be mitigated for a large range of subsea activities.”
Seattle-based BluHaptics is a commercial venture that was spun out from UW’s Department of Electrical Engineering and the Applied Physics Laboratory in 2013. Its tools of the trade include haptic devices that provide force-feedback control of teleoperated robots, and immersive 3-D interfaces that put a remote operator into a virtual-reality environment.
In an email to GeekWire, BluHaptics CEO Don Pickering said his company was working on an innovative robotic approach to underwater cutting and welding. “This tech will save lives and help [with the] ability to do precision cutting and welding in adverse conditions (like oil spills),” Pickering said.
Although BluHaptics is focusing on subsea telerobotics for its first commercial products, the company’s technologies can be extended to aerial and ground-based applications such as excavation, construction and hazardous-site cleanup. The newly announced grant follows up on a Phase I SBIR grant, and is meant to support research lasting into 2018.