SEATTLE, Washington, September 2, 2015. BluHaptics has completed the first phase of a Small Business Innovation Research (SBIR) grant from the National Science Foundation (NSF) titled "Collaborative Subsea Manipulation Interface," focused on advancing the capabilities of subsea ROVs. The following summary was posted on Research.gov on September 2 and covers the Project Outcomes (as defined by NSF).
As humans endeavor to deploy and maintain ever more complex underwater hardware, conduct subsea scientific sampling and exploration, and develop natural resources in ever more hostile, deep, and remote locations, the dependence on remotely operated vehicles (ROVs) is increasing. Significant challenges persist in ROV operations, resulting in high costs and unacceptable downtime, as well as accidents and safety risks. This project explored the use of advanced visualization and force-feedback tools to enable collaborative control of a manipulator in an underwater environment. Applications for the technology include operation of ROVs commonly used offshore for oil and gas exploration and production, environmental cleanup, telecommunications, science, and defense. By exploring the use of virtual-reality displays and haptic tools for force feedback, this project demonstrated multiple ways that situational awareness can be enhanced and 3D information can be leveraged to conduct efficient and safe operations with subsea manipulators aboard ROVs.
The intellectual merit of this Phase I SBIR project was the development of several foundational innovations that separately enhance operator capabilities (and may become individual products) and that together deliver immense value to the customer. These foundational technologies span four areas: (1) pilot interface/control-room software; (2) sensor fusion and processing; (3) task assistance and workflow management; and (4) manipulator and vehicle control. Combined into a single software platform and integrated with a robotic system, these technologies enable performance, predictability, and safety far beyond what is currently possible for purely automated or purely manual robotic systems performing manipulation work. The primary contribution was the development of an interface that uses a model-based representation of the underwater environment and of the machinery (the ROV) being used to conduct underwater work. By updating this model in real time, the system can generate an augmented-reality view that reflects the ROV's actual surroundings. This enables a collaborative approach to command and control in which ROV pilots communicate seamlessly with each other and with the robotic system.
This project has several types of broader impact. The first pertains to the advancement of engineering technology: successful implementation of the proposed visualization, sensor fusion, control methods, haptic virtual fixtures, and other technical tasks will serve as an excellent demonstration of these technologies and will advance the field of telerobotic control. The most important impacts of this work, however, are societal. Our technology will make subsea and underwater operations safer: divers can be replaced in hazardous situations by telerobots with improved control based on our technology, and the rate and severity of untoward incidents will be reduced across a large range of subsea activities. This includes enhanced capability for environmental remediation, equipment repair and maintenance, and scientific exploration of underwater environments. Adoption of this technology will also provide economic benefits to the US, including employment opportunities and contributions to the national technological infrastructure.
909 NE Boat St,
Seattle, WA 98105
+1 (206) 747-8277