The Mobile & Marine Robotics Research Centre (MMRRC) at the University of Limerick in Ireland created a generic solution for underwater development. During research cruises in February and June of 2005 with the RV Celtic Explorer, MMRRC researchers identified weak points of the overall system integration to address with a virtual underwater lab (VUL). The VUL is a mixed hardware/software tool designed to overcome identified problems; make overall system integration easier; and provide a framework for researchers to develop, implement, and test advanced control algorithms in a combined real-world/simulated environment.

As part of the current research project, the MMRRC acquired state-of-the-art survey equipment, including a RESON SeaBat 7125 high-resolution multibeam sonar; an IXSEA photonic inertial navigation system (PHINS); an RDI Doppler velocity log (DVL); a MicroBath depth sensor; six TriTech digital precision altimeters for obstacle avoidance; National Instruments’ NI CompactRIO; and NI Compact Vision System.

Platform Requirements

The remotely operated vehicle (ROV) conducts sampling, data acquisition, and high-resolution acoustic and video surveys in deep oceans. In an underwater environment, an ROV pilot’s field of view is limited to onboard cameras.
Because of the complexity of the underwater environment, the vehicle control system had to avoid obstacles and compensate for external disturbances such as sea currents and the drag effects of the umbilical cable. The physical shape and actuator configuration of the vehicle (the number, position, and orientation of thrusters and control surfaces) imposed constraints that limited control actions. If the control system were not effectively designed, tracking errors would lead to unnecessary survey mission delays. In addition, nonoptimal control allocation among the actuators would lead to inefficient use of available energy resources.

A typical seabed survey mission requires integration of the existing ship equipment (GPS, USBL) with the ROV onboard components (multibeam sonar, inertial navigation system, DVL, depth sensor, sound velocity probe, and vision system). To obtain the highest-quality navigation data, it is necessary to reconfigure some components for the best performance in real time, depending on the stage of the mission.

Expensive equipment is deployed very close to the seabed, and the ROV pilot bears the responsibility of controlling the vehicle and preventing any damage or loss of equipment. Developing a set of tools that helps a pilot with moderate skills easily perform complex tasks is critical to potential commercial applications. The pilot must be able to bring the vehicle to the initial position, activate the corresponding aiding tool (algorithm), and monitor the execution of the task.

Environment Simulation

Because the pilot’s field of view is limited to onboard cameras, a side view of the vehicle and its underwater environment enhances steering. This side view was made possible by preparing a virtual reality (VR) underwater scene with 3D models of the vehicle, ship, and seabed in advance, and by forwarding signals from the sensors to the VR scene in real time. To use the VUL in a simulated environment, the real-world components from the ship and the ROV were replaced with hardware/software simulators.

All software (except the sonar simulator) was implemented in NI LabVIEW, and all simulators were synchronized with real time. Full six-degrees-of-freedom vessel dynamic models were implemented, including thruster DC-motor dynamics with nonlinearities such as saturation, slew-rate limiting, friction, and nonlinear propeller load. The different components of the ship and ROV simulators ran as parallel loops executed at different speeds, depending on the dynamics of each component.
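As a rough illustration of the thruster nonlinearities listed above, the sketch below models a single thruster channel in Python (the VUL itself was implemented in LabVIEW): a first-order DC-motor lag combined with saturation, a slew-rate limiter, and a quadratic propeller load. All names and constants here are illustrative assumptions, not values from the MMRRC system.

```python
def limit(value, lo, hi):
    """Clamp a value to [lo, hi] (saturation)."""
    return max(lo, min(hi, value))

class ThrusterModel:
    """Illustrative single-thruster channel: lag + saturation + slew limit."""

    def __init__(self, max_rpm=1500.0, max_slew=500.0, tau=0.2, dt=0.01):
        self.max_rpm = max_rpm    # saturation bound on propeller speed (rpm)
        self.max_slew = max_slew  # max change in commanded rpm per second
        self.tau = tau            # DC-motor time constant (first-order lag)
        self.dt = dt              # simulation step, synchronized to real time
        self.cmd = 0.0            # rate-limited command
        self.rpm = 0.0            # actual propeller speed

    def step(self, demand):
        # Saturation: the demand cannot exceed the physical rpm limits.
        demand = limit(demand, -self.max_rpm, self.max_rpm)
        # Slew-rate limiter: the command may only change so fast per step.
        max_step = self.max_slew * self.dt
        self.cmd += limit(demand - self.cmd, -max_step, max_step)
        # First-order DC-motor lag toward the rate-limited command.
        self.rpm += (self.cmd - self.rpm) * self.dt / self.tau
        # Nonlinear propeller load: thrust proportional to rpm * |rpm|.
        thrust_coeff = 1e-4
        return thrust_coeff * self.rpm * abs(self.rpm)
```

In the real system each such loop would run at its own rate; here a fixed `dt` stands in for one timed loop iteration.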

The control system used a hybrid control approach, which combined the advantages of a top-down, traditional (hierarchical) artificial intelligence approach and a bottom-up, behavior-based approach. Two high-level task executors — waypoint tracking and obstacle avoidance — were implemented. Each of these task executors competed to take control of actuators. Each task executor had its own control cluster, which consisted of hand control unit (HCU) components to simulate a virtual joystick, and settings for low-level controllers.
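The competition between task executors can be sketched as a simple priority-based arbiter: each executor either bids a control cluster (its virtual-joystick demand) or abstains, and the highest-priority bidder wins the actuators. The behaviors, thresholds, and names below are illustrative assumptions, not the VUL implementation.

```python
from dataclasses import dataclass

@dataclass
class ControlCluster:
    """Virtual-joystick demand plus an identifier for its task executor."""
    surge: float
    yaw: float
    source: str

def waypoint_tracking(active):
    # Bids a steady demand toward the next waypoint while the task is active.
    if not active:
        return None
    return ControlCluster(surge=0.5, yaw=0.1, source="waypoint")

def obstacle_avoidance(obstacle_range, safe_range=2.0):
    # Bids for control only when an obstacle enters the safe range.
    if obstacle_range >= safe_range:
        return None
    return ControlCluster(surge=-0.3, yaw=0.8, source="avoid")

def arbitrate(obstacle_range):
    """Grant actuator control to the highest-priority bidding executor."""
    executors = (
        lambda: obstacle_avoidance(obstacle_range),  # higher priority
        lambda: waypoint_tracking(True),             # default behavior
    )
    for executor in executors:
        cluster = executor()
        if cluster is not None:
            return cluster
```

With no obstacle nearby, the waypoint executor keeps control; once an obstacle closes inside the safe range, the avoidance executor takes over, which is the bottom-up, behavior-based half of the hybrid scheme.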

The VUL incorporated a CompactRIO system connected by Ethernet to a compact vision system and camera.
Two planners were implemented in the VUL — the Mission Planner and Tracking Waypoints Planner. The main task of the Mission Planner was decomposition of the mission into a set of tasks. The control execution layer supervised the activity of the lower-level reactive layer and assessed the situation. Based on external conditions or in-state calculation, the Mission Planner decided which state needed to be executed next.
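The Mission Planner's state-machine structure can be sketched as a table of state handlers, each of which returns the next state based on external conditions or in-state calculation. The states, transitions, and context fields below are illustrative assumptions, not the actual mission decomposition.

```python
def descend(ctx):
    # Dive until the target survey depth is reached.
    return "TrackingWaypoints" if ctx["depth"] >= ctx["target_depth"] else "Descend"

def tracking_waypoints(ctx):
    # Follow waypoints until none remain, then surface.
    return "Surface" if ctx["waypoints_left"] == 0 else "TrackingWaypoints"

def surface(ctx):
    return "Done"

STATES = {
    "Descend": descend,
    "TrackingWaypoints": tracking_waypoints,
    "Surface": surface,
}

def run_mission(ctx, max_steps=100):
    """Execute state handlers until Done; return the visited-state trace."""
    state, trace = "Descend", []
    for _ in range(max_steps):
        if state == "Done":
            break
        trace.append(state)
        state = STATES[state](ctx)
        # Simulated world progress, for illustration only.
        ctx["depth"] = min(ctx["depth"] + 10, ctx["target_depth"])
        if state == "TrackingWaypoints" and ctx["waypoints_left"] > 0:
            ctx["waypoints_left"] -= 1
    return trace
```

The dictionary-of-handlers pattern mirrors how a supervisory layer can decide, on every iteration, which state to execute next.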

Actual waypoint guidance was performed inside the Tracking Waypoints state. The main task of the Tracking Waypoints Planner, a state machine whose parent is the Tracking Waypoints Super-State, was the decomposition of the waypoint guidance algorithm into a set of tasks. Depending on the predetermined tracking mode, the corresponding operation mode of the Auto-Heave low-level controller (depth or altitude) was permanently activated in the Tracking Control Cluster during the waypoint guidance. The Navigation PC ran a real-time, side-scan/multibeam sonar simulator.

Control Allocation

A hybrid control allocation approach was implemented inside the control allocation Express VI (Virtual Instrument). In the fault-free case, optimal control allocation was guaranteed for all possible command inputs. In faulty situations, the fault diagnosis part of the system immediately detected and isolated any thruster fault using fault-detection units and delivered the fault information in the form of a fault indicator vector. The fault accommodation part of the system used this vector to accommodate the fault and, if necessary, switch off the faulty thruster.

At the same time, control reallocation was performed by redistributing the control energy among the remaining operable thrusters. Slider HT and VT saturation bounds controlled the degree of usage for each thruster. Depending on the state of the thrusters (healthy, partial fault, or total fault), the ROV pilot or fault accommodation system determined the position of these sliders.
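The redistribution idea can be sketched as a weighted pseudoinverse allocation, in which a health vector (derived from the fault indicator vector) scales each thruster's contribution so demanded forces are reallocated among the operable thrusters. The four-thruster geometry, the matrix `B`, and all names below are illustrative assumptions, not the MMRRC configuration.

```python
import numpy as np

# Thruster configuration matrix B: rows = (surge force, yaw moment),
# columns = four horizontal thrusters at a lever arm of 0.5 m.
B = np.array([[1.0, 1.0,  1.0,  1.0],
              [0.5, 0.5, -0.5, -0.5]])

def allocate(tau, health):
    """Map demanded generalized forces tau to thruster commands.

    health[i] in [0, 1]: 1 = healthy, 0 = total fault (switched off).
    A weighted pseudoinverse redistributes the control energy among
    the remaining operable thrusters."""
    W = np.diag(health)
    Bw = B @ W                       # faulty thrusters drop out of the map
    u = W @ np.linalg.pinv(Bw) @ tau # zero command for switched-off units
    return u
```

When one thruster is switched off, its column effectively vanishes and the pseudoinverse spreads the demand over the remaining units while still reproducing the requested force and moment, provided the demand stays inside the reduced attainable set.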

Visualization of the thruster velocity saturation bounds, implemented as part of the fault diagnosis and accommodation system (FDAS), provided insight into the constraints imposed by a particular thruster configuration. During missions, the ROV pilot or control law generated command inputs that stretched out over the command space in different directions. The thruster configuration determined the position of the saturation bounds inside the command space, and any fault in a thruster changed the shape of the attainable command set. Using different indicators and visualization tools, the FDAS informed the ROV pilot or main controller of the position of the actual command inputs relative to the attainable command set. With this information, even an inexperienced ROV pilot could detect when thruster velocity saturation occurred and correct the command inputs so that they became attainable.
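The attainability idea can be illustrated with a minimal sketch: a command is attainable if some thruster vector inside the saturation bounds reproduces it, and an unattainable command can be scaled back along its own direction until it fits. The two-thruster geometry, bounds, and names here are assumptions for illustration only.

```python
import numpy as np

# Two horizontal thrusters; rows = (surge force, yaw moment), lever arm 0.5 m.
B = np.array([[1.0,  1.0],
              [0.5, -0.5]])

def is_attainable(tau, u_max):
    """True if the pseudoinverse allocation stays inside saturation bounds."""
    u = np.linalg.pinv(B) @ tau
    # Small tolerance guards against floating-point round-off at the bound.
    return bool(np.all(np.abs(u) <= u_max + 1e-9))

def clip_to_attainable(tau, u_max):
    """Scale an unattainable command down along its direction until it fits,
    preserving the direction of the pilot's demand."""
    u = np.linalg.pinv(B) @ tau
    worst = np.max(np.abs(u) / u_max)
    return tau if worst <= 1.0 else tau / worst
```

Scaling the whole command vector, rather than clipping each thruster independently, keeps the direction of motion the pilot asked for, which is the point of showing the attainable set rather than just per-thruster limits.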

The MMRRC is successfully developing the VUL generic hardware/software tool for integration of survey equipment with existing ROV and ship resources. The signal-level compatibility between the simulated and real-world environments provides the opportunity for engineers to use rapid control prototyping and hardware-in-the-loop development techniques in system design.

This article was written by Edin Omerdic, Mobile & Marine Robotics Research Centre, ECE Department, at the University of Limerick in Ireland.



This article first appeared in the December 2007 issue of Imaging Technology Magazine.
