Saturday, February 21, 2015

Data Depiction and Presentation of the Protector Unmanned Surface Vehicle


Unmanned marine systems fall into two categories: unmanned surface vehicles (USVs) and unmanned underwater vehicles (UUVs). The methods of data depiction and presentation differ vastly between the two. For underwater systems, high-bandwidth communication is often difficult, so missions are usually preprogrammed and the systems operate with an increased level of autonomy. This makes the mission control aspect of these systems relatively simple. For systems such as the Bluefin-21, the control station and data processing system is a Windows-based laptop loaded with the company's proprietary software (Bluefin Inc., 2015). For surface-based systems such as the Protector USV, mission control is more similar to that of an unmanned aerial system. When operating on the surface, weather, sea conditions, ship traffic, and precision targeting must all be interpreted in real time. Having to remotely interpret the USV's immediate environment requires complex depiction and presentation strategies. Studying the data depiction and presentation techniques of current systems, and identifying their shortfalls, will help engineers develop new methods for unmanned systems in general.

Data Depiction and Presentation for the Protector  

The Protector USV is an advanced surface warfare platform that can be utilized for anti-terror/force protection missions, port security, and intelligence, surveillance, and reconnaissance. The system consists of a 30 ft rigid-hull inflatable boat, an interchangeable mission payload, and a two-person command and control station that can be placed either on a larger ship or on shore (Rafael LTD., 2010). In terms of data depiction and presentation, the command and control station is broken down into two separate functions: a vessel control station and a payload control station. The mission payload for this system is very similar to that of a UAS; therefore, for the purpose of this study, understanding the vessel control station takes precedence over the payload control. Controlling a surface vessel is very difficult when no proprioceptive feedback is given to the operator. To compensate for this lack of feedback, the Protector has been designed with a higher degree of autonomy than most other unmanned systems. The system features cameras, sensors, and algorithms that pick optimal sea paths and account for weather, waves, and currents. It also has dynamic stabilization as well as an autonomous sense-and-avoid capability. All of these systems are overlaid and presented to the vessel operator in a control station that replicates the ship's actual controls. Alongside the replica controls, there is a display with an overhead map as well as a view from the safety camera, which is mounted roughly where the skipper would actually sit or stand aboard the vessel (IWeapons.com, 2007). The operator can control the system in either manual or semi-autonomous mode: during docking and precision maneuvers, manual control is ideal, while at high speeds in the open ocean, semi-autonomous control is better suited to the mission.

Data Depiction and Perception Challenges  

One of the largest challenges for any surface-based marine system is the lack of situational awareness and proprioceptive feedback. Compared to a manned surface vessel, the unmanned skipper receives all feedback in the form of quantified numbers and sensor readings rather than physical cues such as wind in the hair, boat movement, and engine sound. Not being able to feel the waves under the boat, compounded with poor visibility of surface conditions during night operations or in dense fog, makes it harder to get the most out of the USV's performance. The absence of these physical cues makes piloting an unmanned surface vessel a complex task, but there are a few methods to mitigate this challenge. One is adding a high degree of automation so that the computer offloads some environment-based decisions from the skipper, such as sea path selection and stabilization. This helps to a degree during simple maneuvers, but for precision maneuvers or extreme weather and sea conditions, this solution does not provide the best outcome (Gilat, 2012). Another tool is training. Much as pilots train to fly on flight instruments alone, conditioning USV skippers to effectively turn numerical data into a cognitive picture of the remote environment is essential (Gilat, 2012). To aid in this process, I propose including proprioceptive feedback in the command and control station via full-motion simulation.
    

Alternative Data Depiction and Presentation Strategies

In aviation, pilots training to fly by instruments alone first train on simple computer-based simulators. Once a pilot gains the ability to cognitively process numerical and instrument data into spatial awareness, they graduate to a full-motion simulator. This helps them become physically aware of the simulated environment and physically sense the aircraft's performance. The real-time motion simulation helps the pilot manage control inputs as well as judge aircraft health and performance. If engineers can create the same full-motion feedback for USV skippers, I feel it would help bridge the gap from numerical sensor data to real-life spatial and environmental understanding. Beyond simulating the motion of the vessel, providing simulated wind speed and direction, as well as engine sound, would also help the skipper gain a clear understanding of the environment and the USV's performance. The key to this system would be real-time feedback. With even a few seconds of delay between the USV's actual movement and the corresponding simulated movement of the command and control station, the control inputs needed to correct for environmental influences would arrive too late to be effective. Another difficulty is the size and complexity of a full-motion command and control station; this system would be better suited to a land-based control facility than a ship-based one. If proper training and full-motion feedback were designed into the Protector, an unmanned skipper could manually control the vessel through more complex and precise maneuvers, and operate in more severe sea and weather conditions.
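To make the real-time requirement concrete, here is a minimal sketch in Python of how a motion-platform driver might refuse to present stale cues. The interface names (`telemetry`, `platform`) and the latency threshold are hypothetical illustrations, not part of any actual Protector design:

```python
import time

# Hypothetical threshold: motion cues older than this are worse than no cues,
# because the skipper would be correcting for waves that have already passed.
MAX_CUE_AGE_S = 0.25  # illustrative value only

def update_motion_platform(telemetry, platform):
    """Drive the full-motion station from USV attitude telemetry.

    telemetry: object with .timestamp (epoch seconds), .roll, .pitch, .heave
    platform:  object with .command(roll, pitch, heave) and .freeze()
    (both interfaces are assumptions for this sketch)
    """
    age = time.time() - telemetry.timestamp
    if age > MAX_CUE_AGE_S:
        # Stale data: hold the platform level rather than cue motion
        # that no longer matches the vessel's actual state.
        platform.freeze()
        return False
    platform.command(telemetry.roll, telemetry.pitch, telemetry.heave)
    return True
```

The design choice here is that no motion cue is safer than a late one; a platform rolling to match a wave that has already passed would actively mislead the skipper.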

Increasing the complexity and simulated feedback of a USV skipper's control station is a decision that needs to be made based on mission limitations and mission profiles. For a steady-state harbor protection mission that requires precise maneuvering around ships, docks, and the shoreline, the full-motion command and control station may increase the Protector's effectiveness and allow more complex manually controlled missions. For expeditionary missions, a mobile form factor and a highly automated mission control method may be a more effective solution. Either way, as a study in design, engineering a full-motion command and control station for something like the Protector USV could also provide valuable insight for land-based unmanned systems, or even rovers that operate on the surface of Mars. The most difficult aspect of this type of integration is the need for real-time data connections, which could prove difficult over large distances. An emerging methodology involving laser-based communication could be the key to decreasing data latency between vehicle and control station.


References


Bluefin Inc. (2015). Operator Software » Bluefin Robotics. Retrieved February 21, 2015, from http://www.bluefinrobotics.com/technology/operator-software/

Gilat, E. (2012, August 5). The Human Aspect of Unmanned Surface Vehicles. Retrieved February 21, 2015, from http://defense-update.com/20120805_human_aspects_of_usv.html#.VOYIWfnF9AU

IWeapons.com. (2007, March 29). Protector. Retrieved February 21, 2015, from http://web.archive.org/web/20070329024450/http://www.israeli-weapons.com/weapons/naval/protector/Protector.html

Rafael LTD. (2010). PROTECTOR: Unmanned Naval Patrol Vehicle. Retrieved February 21, 2015, from http://www.rafael.co.il/Marketing/288-1037-en/Marketing.aspx



Sunday, February 8, 2015

Data Format, Protocols, and Storage Methods of Mars Curiosity Rover

Data transfer between unmanned systems and ground control stations is a challenge for any unmanned system. Data transfer between an unmanned system and a ground control station that are planets apart becomes even more of a challenge given the limitations of our current technology. Understanding the methods behind the Mars Curiosity rover's communications system will help us understand data management, treatment, and movement in any unmanned system. Additionally, understanding the onboard sensors and how they interact with the data transfer architecture, in terms of power requirements, data storage requirements, and data treatment methods, will help us analyze the current state of technology. From a detailed study of the current situation, proposed future changes could streamline the entire data transfer process from the sensor to the final product.

Data Format, Protocol, and Storage

The Curiosity rover relies on a network of satellites that allows it to communicate from the surface of Mars with the ground control station (GCS) on Earth at usable data rates. Unlike with smaller unmanned systems, direct communication between the ground station and the vehicle is impractical because of the power requirements and bandwidth limitations that exist when transmitting data between Earth and Mars. The rover and the GCS can be as far as 400,000,000 km apart, and to transmit data over that distance while using a feasible amount of power, the bandwidth peaks at 800 b/s using the onboard high-gain antenna, which utilizes X-band frequencies (Gordon, 2012). This type of communication can transfer telemetry and navigational commands to the rover, but sending raw imagery or data from one of its many sensors would be extremely impractical. For large data sets, the Curiosity rover utilizes the Mars Reconnaissance Orbiter (MRO) to relay its data back to Earth. The MRO orbits only 275 km above the surface of Mars, which drastically decreases the power required to transmit large data files off the surface (Taylor, 2006). The orbiter is designed not only to relay and amplify signals back to Earth; it is also an instrument of science loaded with its own sensors, which are used in conjunction with the rover. The final piece of the communications architecture is the GCS. There are three large ground stations: one in Goldstone, California; one in Madrid, Spain; and one in Canberra, Australia. These GCSs are on a much larger scale than any terrestrial unmanned system's GCS; each features one 70 m parabolic antenna and at least two 34 m parabolic antennas. By building huge, power-hungry antennas on Earth, the size and power required by the orbiter and rover to transmit and receive data can be decreased (Gordon, 2012).
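A rough free-space path-loss comparison shows why the short rover-to-orbiter hop is so much cheaper in power than the direct link to Earth. The sketch below assumes typical frequencies of 8.4 GHz for the X-band direct link and 400 MHz for the UHF relay link; these frequency values are assumptions, not figures taken from the cited sources:

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Direct link: rover to Earth at worst-case separation, X-band (8.4 GHz assumed)
direct = free_space_path_loss_db(4.0e11, 8.4e9)

# Relay hop: rover to MRO at 275 km altitude, UHF (400 MHz assumed)
relay = free_space_path_loss_db(2.75e5, 4.0e8)

print(f"Direct to Earth: {direct:.0f} dB")          # ~283 dB
print(f"Rover to MRO:    {relay:.0f} dB")           # ~133 dB
print(f"Difference:      {direct - relay:.0f} dB")  # ~150 dB
```

A difference of roughly 150 dB means the relay hop needs on the order of fifteen orders of magnitude less radiated power to deliver the same received signal strength, which is why routing bulk data through the orbiter is practical.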

[Figure: Mars-Earth communications architecture]

Onboard Sensors

In terms of sensors, Curiosity carries many high-tech instruments with a range of purposes (NASA, n.d.). In an effort to study data transfer techniques, analyzing the sensors with the highest data transfer requirements will help in understanding how to streamline the process in the future. Some of the highest data-producing sensors on the rover are the mast cameras: two 2-megapixel cameras that not only capture imagery but can be used as a stereo pair to conduct 3D mapping of the rover's immediate surroundings. Anyone with a background in photography might think 2 MP cameras are not the best choice for a 2.5-billion-dollar space probe, but this is the first step of the data treatment method utilized by the rover's engineers: by choosing a lower-megapixel camera, the data transfer requirement is decreased before compression or other forms of data treatment are even applied. The mast cameras use a 1600 x 1200 pixel interline sensor (Cangeloso, 2012), and each camera can store up to 5,500 full-size images in its own memory, which acts as a buffer. This capture resolution translates to roughly a 1.4 MB image file in raw form (Gordon, 2012). This is where the MRO and UHF radio communication come in. The rover's UHF radio provides much higher bandwidth, up to 256 kb/s, using a 12-watt transceiver. If the rover had to send data straight back to Earth through its high-gain antenna via X-band, the transfer rate would top out around 800 b/s and require 15 watts of power, meaning the same images would take much longer to transmit and would keep the 15-watt transceiver active and drawing energy for a much longer period (Taylor, 2006). The power and time constraints placed on the entire system when the MRO is not used would severely limit the rover's mission success.
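Plugging the figures above into a quick back-of-the-envelope script makes the tradeoff concrete. The 1.4 MB image size, the 256 kb/s and 800 b/s rates, and the 12 W and 15 W transmitter powers come from the cited sources; the rest is arithmetic:

```python
IMAGE_BITS = 1.4e6 * 8  # ~1.4 MB raw mast camera image, in bits

def link_cost(rate_bps, tx_power_w):
    """Return (transfer time in seconds, transmit energy in joules) for one image."""
    seconds = IMAGE_BITS / rate_bps
    return seconds, seconds * tx_power_w

uhf_time, uhf_energy = link_cost(256e3, 12)  # UHF relay via MRO
x_time, x_energy = link_cost(800, 15)        # direct X-band to Earth

print(f"UHF via MRO:   {uhf_time:,.0f} s, {uhf_energy:,.0f} J")  # ~44 s, ~525 J
print(f"X-band direct: {x_time:,.0f} s, {x_energy:,.0f} J")      # ~14,000 s (~3.9 h), ~210,000 J
```

Per image, the direct link takes roughly 320 times longer and about 400 times more transmit energy, which is why bulk science data goes through the orbiter.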

  

Alternative Data Treatment Strategy

Much has changed in image compression and data storage since 2004, when Curiosity and the MRO were designed. Starting with the sensor, I would recommend a higher-resolution camera pair combined with lossy image compression. This would have two effects: it would increase the native resolution and quality of the imagery coming from the mast cameras, while the lossy compression would reduce the data back to a manageable size that could easily be relayed to Earth (Chin, 2013). The second change would be increasing the onboard storage capacity from about 8 GB to 100 GB per sensor. Along with the increased storage, I would implement a server-based tagging method that adds metadata to the imagery and allows scientists to query the onboard server and pull full-resolution images as needed. Not having to send back all of the imagery at full resolution would free up bandwidth for the simultaneous transfer of low-resolution "current view" imagery and the full-resolution stills that scientists have queried for further investigation.
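A minimal sketch of the proposed query-based scheme might look like the following. The record fields, class names, and downlink behavior are illustrative assumptions for this proposal, not an actual MSL interface:

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    image_id: str
    sol: int                 # Martian day the image was captured
    camera: str              # e.g. "mast_left" or "mast_right"
    tags: set = field(default_factory=set)  # e.g. {"stratified_rock"}
    full_res_path: str = ""  # stays in onboard storage until queried

class OnboardImageServer:
    """Keeps full-resolution frames onboard; downlinks only metadata + thumbnails."""

    def __init__(self):
        self.catalog: list[ImageRecord] = []

    def capture(self, record: ImageRecord):
        # Only a heavily compressed thumbnail plus this metadata record
        # is queued for downlink; the full frame waits onboard.
        self.catalog.append(record)

    def query(self, **criteria):
        """Scientists query by metadata, e.g. query(sol=410, camera='mast_left')."""
        return [r for r in self.catalog
                if all(getattr(r, k) == v for k, v in criteria.items())]
```

Only the frames a scientist actually requests would then be scheduled for full-resolution downlink through the MRO.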


Both data storage and processing power are increasing exponentially. The physics of data transfer, and the power required to accomplish it, are not advancing nearly as fast, which is why alternatives and advances in data transfer should focus on the digital aspects rather than the electromagnetic ones. Long-distance laser data transfer may be a long-term alternative to the slower-moving portions of the electromagnetic spectrum we currently utilize, but with today's rapid increase in processing power, the near-term solution may lie in compression and processing advances.

References

Cangeloso, S. (2012, August 9). Why does the $2.5 billion Curiosity use a 2-megapixel camera? ExtremeTech. Retrieved February 8, 2015, from http://www.extremetech.com/extreme/134239-why-does-the-2-5-billion-curiosity-use-a-2-megapixel-camera

Chin, M. (2013, December 18). New data compression method reduces big-data bottleneck; outperforms, enhances JPEG. Retrieved February 8, 2015, from http://newsroom.ucla.edu/releases/ucla-research-team-invents-new-249693

Gordon, S. (2012, January 1). Talking to Martians: Communications with Mars Curiosity Rover. Retrieved February 8, 2015, from https://sandilands.info/sgordon/communications-with-mars-curiosity

Makovsky, A. (2009). Mars Science Laboratory Telecommunications System Design. DESCANSO Design and Performance Summary Series, (Article 14). Retrieved February 8, 2015, from http://descanso.jpl.nasa.gov/DPSummary/Descanso14_MSL_Telecom.pdf

NASA. (n.d.). Mars Science Laboratory; Curiosity Rover. Retrieved February 8, 2015, from http://mars.nasa.gov/msl/

Taylor, J. (2006). Mars Reconnaissance Orbiter Telecommunications. DESCANSO Design and Performance Summary Series, (Article 12). Retrieved February 8, 2015, from http://descanso.jpl.nasa.gov/DPSummary/MRO_092106.pdf





Sunday, February 1, 2015

Comparing Sensor Placement: Storm Racing Drone vs. the DJI Inspire 1


The world of commercial, ready-to-fly unmanned aerial systems (UASs) is growing every day. Advances in commercial UAS technology have allowed many subcategories of UASs to form, differentiated by the intended purpose of the system. Two prevalent categories are first-person view (FPV) racing systems and aerial high-definition (HD) video capture systems. The two share many attributes, but one of the biggest differences between them is sensor placement and integration. FPV racers tend to have forward-looking cameras fixed to the airframe, while UASs designed for aerial photography tend to have more robust, center-mounted cameras on controllable gimbals. Understanding the reasoning behind the sensor placement on these two categories of UASs provides insight into unmanned sensor placement and integration as a whole.
    

Sensor Placement and Details of the Storm Racing Drone



The FPV racing industry differs from that of many other commercial UASs. Often the parts, components, and sensors are sold individually, with the intent that each consumer will purchase and assemble a unique system. The Storm Racing Drone is one of the few exceptions, offering a ready-to-fly (RTF) system out of the box. Whether an FPV racer is RTF or user-assembled, one thing tends to be constant: the sensor placement and integration. The Storm, for example, has a forward-mounted camera with a 110-degree field of view at 768x494 resolution; the sensor weighs only 71 grams and costs only 30 dollars (Helipal.com, 2015). Keeping the sensors light and cheap is essential in FPV racing because of the high probability of crashing on the racecourse. Another attribute unique to the FPV racer category is the need for zero-lag video sent back to the operator. Because flight is controlled by the operator through a first-person viewing device, any lag in the video could result in delayed control inputs, and any delay in control input on a heavily wooded racecourse could result in costly accidents and the inability to operate the system properly (Helipal.com, 2015).
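A quick calculation shows why lag is so punishing at race speeds. The speed and latency figures below are illustrative assumptions for common video links, not Storm specifications:

```python
def blind_distance_m(speed_mps, latency_s):
    """Distance the drone covers before the pilot sees its current position."""
    return speed_mps * latency_s

race_speed = 20.0  # m/s, a plausible FPV racing speed (assumed)

for label, latency in [("analog FPV (~40 ms)", 0.040),
                       ("digital HD link (~150 ms)", 0.150),
                       ("laggy WiFi video (~500 ms)", 0.500)]:
    print(f"{label}: drone is {blind_distance_m(race_speed, latency):.1f} m "
          f"ahead of what the pilot sees")
```

At 20 m/s, even a 150 ms digital link leaves the drone a full 3 m ahead of what the pilot is seeing, which on a tree-lined course is the difference between a gate and a trunk; hence the preference for cheap, low-latency analog cameras.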

Sensor Placement and Details of the DJI Inspire 1



The DJI Inspire 1 is one of the latest and greatest cinematic tools available to the public. This high-end UAS provides stabilized, high-definition stills and video to anybody willing to pay the price. Sensor placement and integration on this system are focused on providing the best field of view and the smoothest, highest-quality captures possible. The integration into the operator control system is also designed around optimizing video capture rather than flight performance, unlike the Storm Racing Drone mentioned above. The Inspire 1 allows one operator to handle the flight controls while another controls just the camera. Having a center-mounted, gyro-stabilized, gimbaled payload as well as a separate camera control capability makes this UAS perfectly suited to cinematography. The Inspire 1 is large and relatively slow compared to the Storm Racing Drone, but unlike an FPV racer, it uses GPS autopilot and indoor positioning sensors to stabilize the airframe while the camera pans and tilts as required by the camera operator (DJI Inc., 2014). Another aspect important to stable, high-quality image capture is the Inspire 1's ability to give the camera an unobstructed 360-degree field of view while in a hover. This is achieved by a landing gear and rotor system that actually rises out of the field of view upon takeoff. Clearing the sensor's field of view demonstrates not only the importance of sensor placement, but also the importance of sensor integration at every step of engineering (DJI Inc., 2014).
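At its simplest, the gimbal's job can be sketched as subtracting the airframe's attitude from the camera operator's desired world-frame pointing angles. The function below is a simplified illustration of that idea (ignoring roll coupling and mount offsets), not DJI's actual control law:

```python
def gimbal_joint_angles(desired_pan_deg, desired_tilt_deg,
                        aircraft_yaw_deg, aircraft_pitch_deg):
    """Joint angles that hold the camera on a world-frame pan/tilt target.

    Whatever the airframe does, the gimbal rotates by the opposite amount,
    so the shot stays framed while the pilot flies independently.
    """
    pan_joint = (desired_pan_deg - aircraft_yaw_deg) % 360.0
    tilt_joint = desired_tilt_deg - aircraft_pitch_deg
    return pan_joint, tilt_joint

# Example: the camera operator wants the lens locked due east (90 deg) and
# 10 deg down while the pilot yaws to a 250-deg heading in a 5-deg nose-up climb.
print(gimbal_joint_angles(90.0, -10.0, 250.0, 5.0))  # -> (200.0, -15.0)
```

This decoupling is what lets the two-operator setup work: the pilot's inputs never disturb the framing, because the gimbal continuously cancels them out.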

Both the Storm Racing Drone and the DJI Inspire 1 provide clear examples of how companies engineer UASs around sensors and missions, rather than simply strapping sensors onto a well-designed flying machine. The Storm's front-mounted, zero-lag camera gives an operator a real-time feed for navigating difficult racecourse environments, while the Inspire 1 provides a stable, obstruction-free platform for capturing the world around it in uncompromised clarity. A UAS's mission and sensors should always be considered before engineering and building the system. Sensor limitations and constraints give engineers insight into the overall capabilities and design details of the system as a whole.

    



References
Helipal.com Corporation (Ed.). (2015). Storm Racing Drone (RTF / Type-A). Retrieved January 27, 2015, from http://www.helipal.com/storm-racing-drone-rtf-type-a.html


DJI Inc. (2014). Inspire User Manual V1.0. Retrieved January 27, 2015, from http://download.dji-innovations.com/downloads/inspire_1/en/Inspire_1_User_Manual_v1.0_en.pdf