Vision Positioning System (VPS)
On orbit, most space vehicles use an on-board Inertial Measurement Unit (IMU) to determine attitude and have their position determined from the ground via radar. Such full 6-DOF state estimation is necessary to control the spacecraft. The simulation vehicles developed here at the Neutral Buoyancy Research Facility (NBRF) use an off-the-shelf IMU to fully determine their attitude. However, unlike other Earth-based vehicles, they cannot use the Global Positioning System (GPS) because the associated GHz-band signals cannot penetrate building walls. Radio-based positioning systems are also infeasible in the underwater environment. Our goal with this research is to develop an accurate local positioning system that provides position data equivalent to or better than GPS could.
Sonar is the most popular positioning technology employed by today's autonomous undersea vehicles. Currently under development at the NBRF is the 3-Dimensional Acoustic Positioning System (3DAPS), which uses sonar signals to find the position of a body in the water. Implementing this system has presented several challenges. One is eliminating noise sources other than the sonar emitters, such as vehicle electronics and actuators, which can operate at approximately the same frequencies. Another is wave reflection from the sides and bottom of the tank, which leads to constructive and destructive interference. A third is obstacles in the water, such as divers, mock-ups, and other vehicles, which block the waves traveling from emitters to receivers. Further information is available on the 3DAPS project page.
Two Secondary Camera And Maneuvering Platform (SCAMP) vehicles will be the primary beneficiaries of an accurate positioning system, allowing a transition from teleoperation with attitude control to full 6-DOF autonomous flight. These vehicles are approximately spherical, as seen in Figure 2. The outside of the vehicle is matte black, whereas the walls of the neutral buoyancy tank are pale-colored, also evident in Figure 2. This highly structured environment gives rise to a new way of "seeing" the vehicle. By taking a control picture with a camera in a fixed position without the vehicle in the frame (Figure 1) and comparing it with the same camera view containing the vehicle (Figure 2), we can isolate the vehicle, seeing it as perfectly black on a perfectly white background (Figure 3). Thus far, this simple, computationally efficient technique has yielded a good first approximation to the 2-dimensional position of the vehicle in the camera's coordinate frame.
In the fully-deployed VPS configuration, cameras will be mounted in eight positions on the walls of the tank, one pair approximately every 90 degrees. Of each pair, one will be near the surface and one closer to the bottom of the tank. They will be aimed at different angles and have different focal lengths to provide maximal tank volume coverage. At each time step, two unobstructed camera views are necessary to produce two 2-dimensional positions in intersecting planes. From these, the vehicle's 3-dimensional position can be triangulated in the inertial reference frame. The vehicle's state estimation update rate is 60 Hz. Given the real-time processing requirements and the computational overhead of computer vision, all data processing software is written in C.
This technology is not without its own set of challenges, foremost of which is obstruction of the vehicle from the camera by divers and structures in the water. However, given a sufficiently accurate vehicle dynamics model, we can incorporate knowledge of past vehicle state and thrust commands into our state estimation algorithms to minimize error. VPS will provide a fast, simple method of determining underwater position. Once complete, VPS will facilitate a myriad of research topics, including autonomous path planning, astronaut following/observation, and formation flying of multiple neutral buoyancy vehicles.
Lead Engineer: Kate McBryan