Saturday, June 27, 2015

Unmanned System Data Protocol and Format (UNSY 605, 4.5)


The RQ-4 Global Hawk is a high-altitude, long-endurance (HALE) unmanned aerial system (UAS) currently operated by the U.S. Air Force. The aircraft has a range of 12,300 NM, a service ceiling of 60,000 ft, and cruises at 310 knots (U.S. Air Force, 2014). The Global Hawk's greatest strength is arguably its 34-hour endurance, which makes it ideal for providing persistent intelligence, surveillance, and reconnaissance (ISR). While the RQ-4 has had only limited success in replacing the Lockheed U-2 for operational users, it has contributed significantly to disaster relief efforts in Japan, Haiti, and the Philippines.

RQ-4 Sensor Capabilities
The Block 20 RQ-4 can carry up to 3,000 lb of mission payloads. The standard configuration is a Raytheon Integrated Sensor Suite (ISS) comprising a Hughes Integrated Synthetic Aperture Radar (HISAR), a 0.4-0.8 µm electro-optical (EO) sensor, and a 3.6-5.0 µm infrared (IR) sensor (Kable Intelligence Ltd, 2015). HISAR is capable of operating in the following modes (Bayma, 1996):
  • Wide Area Search: 24 m resolution over a 60° sector.
  • Strip Map: 6 m resolution over a continuous 37 km path.
  • Spot Image: 1.8 m resolution over a 4.8 × 2.8 km area.
  • Ground Moving Target Indicator (GMTI): Position/velocity reporting within 45° of broadside.
The HISAR transmitter operates in X-band with 600 MHz of bandwidth. Block 40 RQ-4s have recently been delivered with the Multi-Platform Radar Technology Insertion Program (MP-RTIP) payload, which enables new capabilities to be fielded quickly across different vehicles by using common hardware/software interfaces (Pultrich, 2010). The ISS EO/IR sensors share an optical path through a 10 in reflecting telescope that can generate 2 km² images geo-rectified to within 20 m. The ISS requires up to 3.5 kW of power (Kable Intelligence Ltd, 2015), likely supplied by an engine-driven 11.2 kW generator (Cessna, 2013).
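As a sanity check on those numbers, the standard SAR relation ΔR = c/2B ties the 600 MHz bandwidth to a best-case slant-range resolution. The short sketch below is my own illustration, not something from the cited sources:

```python
# Theoretical SAR slant-range resolution from transmit bandwidth.
# Standard relation: delta_R = c / (2 * B). Illustrative only; the
# published mode resolutions above fold in processing and geometry.

C = 299_792_458.0  # speed of light, m/s

def slant_range_resolution(bandwidth_hz: float) -> float:
    """Best-case slant-range resolution in meters for a given bandwidth."""
    return C / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    b = 600e6  # HISAR bandwidth cited above (600 MHz)
    print(f"Theoretical slant-range resolution: {slant_range_resolution(b):.2f} m")
    # ~0.25 m -- finer than the published 1.8 m spot figure, as expected,
    # since operational numbers include windowing and multi-look processing.
```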

Data Format, Protocols, and Storage
The primary method of transferring sensor data to a ground-based processing, exploitation, and dissemination (PED) function is a Ku-band or UHF satellite data link (Kable Intelligence Ltd, 2015). The HISAR contains a dual-channel receiver that passes raw signals to an analog-to-digital converter. The digitized raw data is held in a high-capacity buffer until the onboard processor finishes generating an image, which is then downlinked for PED (Bayma, 1996). One strength of this architecture is that only the finished product is transmitted from the aircraft, which requires far less bandwidth than the raw data. A key feature of HISAR is its ability to operate in GMTI mode while simultaneously performing wide area search or strip mapping; in that case, the processed images are overlaid with GMTI position/velocity information. In the RQ-4, only high-resolution still frames from the ISS EO/IR sensors are downlinked; the MQ-4C Triton variant flown by the U.S. Navy, however, can also downlink low-resolution video.
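To illustrate how much bandwidth onboard processing saves, the sketch below compares a raw-sample downlink with a finished-image downlink. Every number in it is an assumption I chose for illustration; none are published RQ-4 figures:

```python
# Rough comparison of downlinking raw SAR samples vs. a processed image.
# All figures below are assumptions chosen for illustration only.

RAW_SAMPLE_RATE_HZ = 1.2e9   # assume complex sampling near the 600 MHz bandwidth
BITS_PER_SAMPLE = 2 * 8      # assume 8-bit I and Q components
COLLECT_SECONDS = 10.0       # assume a 10 s spot collection

raw_bits = RAW_SAMPLE_RATE_HZ * BITS_PER_SAMPLE * COLLECT_SECONDS

# Assume a finished spot image: 4.8 km x 2.8 km at 1.8 m resolution, 8-bit pixels.
pixels = (4800 / 1.8) * (2800 / 1.8)
image_bits = pixels * 8

print(f"Raw samples:      {raw_bits / 8 / 1e9:,.1f} GB")
print(f"Finished image:   {image_bits / 8 / 1e6:,.1f} MB")
print(f"Reduction factor: {raw_bits / image_bits:,.0f}x")
```

Under these assumptions the finished image is thousands of times smaller than the raw samples, which is why only the product is sent over the satellite link.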

Recommendations
The HALE ISR mission is ideally suited to a UAS. When comparing the performance of the RQ-4 with the U-2, the unmanned platform has a clear advantage in persistence, and no pilot's life is risked by enemy fire or by ditching over inhospitable terrain. My primary recommendation is to continue improving the RQ-4's reliability and sensor capability, while employing open architectures to facilitate future enhancements, so that it can seamlessly fill the role of the U-2. Northrop Grumman is currently developing a Universal Payload Adaptor that would enable the RQ-4 to “carry the Senior Year Electro-Optical Reconnaissance System-2B/C and the Optical Bar Camera” (Malenic, 2015), which are currently the only internationally recognized systems for aerial treaty enforcement.

References

Cessna Aircraft Company. (2013). Citation X: Specification & Description. Retrieved from http://cessna.txtav.com/~/media/Files/citation/x/xsd.ashx
Note: The Citation X has the same basic power plant model as the RQ-4.

Kable Intelligence Ltd. (2015). RQ-4A/B Global Hawk HALE Reconnaissance UAV, United States of America. Air Force Technology. Retrieved from http://www.airforce-technology.com/projects/rq4-global-hawk-uav/

Malenic, M. (2015, April 29). Northrop Grumman to test U-2 sensors on Global Hawk. Jane’s 360. Retrieved from http://www.janes.com/article/51076/northrop-grumman-to-test-u-2-sensors-on-global-hawk

Pultrich, G. (2010, August 16). Next generation of Global Hawks ready to roll. Flight Global. Retrieved from http://www.flightglobal.com/news/articles/next-generation-of-global-hawks-ready-to-roll-346116/

U.S. Air Force. (2014, October 27). RQ-4 Global Hawk Fact Sheet. Retrieved from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx




Sunday, June 14, 2015

UAS Primary Sensor Placement (UNSY 605, 3.4)


     Where sensors are placed on an aircraft is as important a design consideration as which sensors to install. Incorrectly placed sensors can feed erroneous or misleading data to avionics or cockpit displays. Valid data streams are critically important on unmanned aerial systems (UAS), since they are the sole source of information for remote pilot or autopilot decision making. Below, a vehicle is selected and its sensor placement analyzed for two UAS mission examples.

Aerial Photography UAS
     The first step in selecting a platform for a particular mission should be defining the requirements. The following characteristics, in descending order of importance, were used to select a UAS for aerial photography flying less than 400 feet Above Ground Level (AGL).
  • Camera stabilization and control
  • Resolution
  • Vehicle station-keeping
     The FreeFly ALTA was selected since it offers the best capability in those areas. Camera stabilization is maintained by the MoVI three-axis gimbal system. With the optional MIMIC controller, a second operator can independently control the pan, tilt, and roll of the main sensor (FreeFly, 2015). FreeFly has taken advantage of adaptive autopilot controllers now available in high-end sUAS by allowing the three-axis gimbal to be mounted below or above the vehicle. This adds an entirely new perspective, allowing filmmakers to be more creative and providing a new angle for structural inspections. High-resolution stills or video can be captured by a wide range of professional digital cameras thanks to the ALTA's 15 lb payload capacity and modular mounting system. Sensors such as the Red Epic Dragon can generate 19-megapixel images at 100 frames per second (fps) (Red Inc., 2015). High-frequency vibration isolators are installed at the sensor gimbal/airframe interface to reduce interference from the six rotors (FreeFly, 2015). This is crucial to maintaining smooth video when the sensor's frame rate approaches a harmonic of the propulsion system, as the sketch below illustrates. The autopilot draws on avionics-grade rate sensors that, combined with the hexacopter configuration, provide extremely precise control. Adding the MIMIC controller would also improve station keeping by dividing the piloting and sensor-slewing workload between two people (Lavars, 2015).
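To make the harmonic-interference point concrete, the sketch below checks whether rotor blade-pass harmonics land near the camera frame rate. The RPM values and blade count are placeholders I picked for illustration, not ALTA specifications:

```python
# Check whether rotor blade-pass harmonics fall near the camera frame rate.
# Rotor RPM values and blade count are placeholders, not ALTA specs.

FRAME_RATE_FPS = 100.0   # e.g., the Epic Dragon capture rate mentioned above
BLADES_PER_ROTOR = 2
TOLERANCE_HZ = 2.0       # how close counts as "near a harmonic"

def blade_pass_hz(rpm: float, blades: int = BLADES_PER_ROTOR) -> float:
    """Blade-pass frequency: revolutions per second times blade count."""
    return rpm / 60.0 * blades

for rpm in (2400, 3000, 3600, 4500):
    bp = blade_pass_hz(rpm)
    # Distance from the nearest integer multiple of the frame rate.
    offset = min(bp % FRAME_RATE_FPS, FRAME_RATE_FPS - bp % FRAME_RATE_FPS)
    flag = "RISK: near harmonic" if offset < TOLERANCE_HZ else "ok"
    print(f"{rpm:5d} RPM -> blade pass {bp:6.1f} Hz  ({flag})")
```

In this toy example, 3,000 RPM puts the blade-pass frequency exactly on the 100 fps frame rate, which is the beating condition the isolators are there to suppress.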
Racing UAS
     Assuming all aircraft were similar in racing capability, the following sensor-related characteristics were used to select a UAS capable of competing on a First Person View (FPV) racing circuit.
  • Field of view
  • Refresh rate/link latency
  • Picture and telemetry presentation
     The STORM Racing Drone offers a good compromise among the items above in an affordable package (HeliPal, n.d.). The main charge-coupled device (CCD) sensor has a 110° field of view, which provides the pilot with sufficient obstacle awareness. The main sensor is also positioned relatively close to the center of gravity. Other racing UAS models have the camera slung beneath the body (similar to those for aerial photography), which would likely exaggerate perceived pitch and roll rates due to the long moment arm. The STORM's 3.2 GHz, 250 mW transmitter advertises a consistent video feed at a range of 1 mile, a claim user reviews appeared to corroborate. The vehicle also includes a visor-type display for the pilot, which is preferable to a screen since it is not subject to sun glare. Depending on the model of visor, battery time remaining and signal strength can also be displayed.
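A rough glass-to-glass latency budget shows why refresh rate and link latency rank so highly for racing. The component values below are generic assumptions, not measured STORM figures:

```python
# End-to-end FPV latency budget. All component values are generic
# assumptions for illustration, not measured STORM Racing Drone figures.

budget_ms = {
    "camera exposure/readout": 17,    # ~1 frame at 60 Hz CCD output
    "video transmitter":        2,    # analog modulation adds little delay
    "propagation (1 mile)":     0.01, # radio travel time is negligible
    "receiver + visor display": 15,
}

total = sum(budget_ms.values())
print(f"Estimated glass-to-glass latency: {total:.0f} ms")

# At an assumed 25 m/s (~56 mph) race speed, latency converts to
# distance flown before the pilot sees it:
speed_mps = 25.0
print(f"Distance covered before pilot sees it: {speed_mps * total / 1000:.1f} m")
```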

References

FreeFly Systems. (2015). Redefining Movement. Retrieved from http://freeflysystems.com/products/2015/alta/

HeliPal.com. (n.d.). STORM Racing Drone (RTF / Type-A). Retrieved from http://www.helipal.com/storm-racing-drone-rtf-type-a.html

Lavars, N. (2015, April 15). High-end Freefly Alta drone flips aerial photography on its head. Gizmag. Retrieved from http://www.gizmag.com/freefly-alta-drone-photography-nab/37026/

Red Inc. (2015). Epic Dragon Tech Specs. Retrieved from http://www.red.com/products/epic-dragon#tech-specs

Saturday, June 13, 2015

Unmanned Underwater Vehicles in Search and Rescue (UNSY 605, 2.4)

          Search and rescue (SAR) is a challenging mission that broadly covers locating and recovering distressed persons in all environments. Unmanned maritime systems have the potential to multiply the searching, communications, and networking capabilities of a rescue force. The highly dynamic rescue phase itself, especially in rough seas, urban or complex terrain, or inclement weather, is beyond the capabilities of current or forecast unmanned systems, so this research project focuses on a platform that complements the search phase and supports the rescue.

Bluefin-21 and the Search for Malaysia Airlines Flight MH370
          In April 2014, a Bluefin-21 autonomous underwater vehicle (AUV) was employed to search a city-sized area of the Indian Ocean that potentially held the wreckage of MH370, a Boeing 777 with 239 people on board (Kaye, 2014). Built by Bluefin Robotics, the torpedo-sized AUV has a depth rating of 4,500 m and an endurance of 25 hours, making it ideal for wide area searches (Bluefin, 2015). Another strength of the Bluefin-21 is its field-replaceable payload sections.
Proprioceptive Sensors
  • Inertial Navigation System (INS): The Bluefin's INS drifts less than 0.1% of distance traveled, which is critical when navigating underwater (Bluefin, 2015); a worked example of what that drift means follows this list.
  • Global Positioning System (GPS): A GPS receiver is installed to determine an initial fix and can be used to update the INS when the vehicle surfaces (Bluefin, 2015).
  • Ultra-Short Baseline (USBL) Tracking: USBL is a relatively short-range acoustic tracking system that allows the Bluefin to home on the recovery vessel (AML, 2015).
  • Stress, Fault, and Leak Sensors: The Bluefin-21 is equipped with numerous internal fault sensors. For operation at extreme depths, stress and leak detectors alert the AUV to impending structural failure, triggering an emergency ascent (Bluefin, 2015).
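Here is the worked drift example mentioned above. The survey speed and leg length are assumptions for illustration; only the 0.1% drift fraction comes from the Bluefin specification:

```python
# Dead-reckoning position error from the cited INS drift spec
# (0.1% of distance traveled). Speed and leg length are assumed.

DRIFT_FRACTION = 0.001     # 0.1% of distance traveled (Bluefin, 2015)
SPEED_KT = 3.0             # assumed survey speed
LEG_KM = 20.0              # assumed search leg before a GPS or USBL fix

speed_kmh = SPEED_KT * 1.852
hours = LEG_KM / speed_kmh
error_m = LEG_KM * 1000 * DRIFT_FRACTION

print(f"A {LEG_KM:.0f} km leg at {SPEED_KT:.0f} kt takes {hours:.1f} h")
print(f"Accumulated INS error: ~{error_m:.0f} m without DVL/GPS aiding")
```

Even this small error budget (about 20 m over a 20 km leg) explains why the exteroceptive aids below matter so much at depth.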
Exteroceptive Sensors
  • Doppler Velocity Log (DVL): The DVL enables the INS to be updated with high-frequency water- and bottom-referenced velocity, reducing drift. These instruments are limited to roughly 500 m of range and are not practical for deep-water operations on their own (Teledyne, 2013).
  • Sound Velocity Sensor (SVS): The SVS increases the accuracy of instruments such as DVLs, echo sounders, or anything else that relies on the speed of sound in water for timing calculations (AML, 2015).
  • EdgeTech 2200-M 120/410 kHz Side Scan Sonar: The Bluefin-21 can be outfitted with numerous sensors; for the MH370 search, the EdgeTech 2200-M was chosen. This sonar has a depth rating of 6,000 m and a resolution of 25 cm (EdgeTech, 2015), which is suitable for long-range searches for large objects (see the coverage estimate after this list).
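The coverage estimate mentioned above: combining an assumed swath width and survey speed with the vehicle's endurance bounds how quickly a search box can be cleared. All figures are my assumptions, not published MH370 search parameters:

```python
# Area coverage estimate for a side-scan search. Swath width, speed,
# and search-box size are assumptions, not published MH370 figures.

SWATH_KM = 1.0           # assumed total swath (both sides) at 120 kHz
SPEED_KT = 3.0           # assumed survey speed
SORTIE_HOURS = 20.0      # usable search time within the 25 h endurance
SEARCH_AREA_KM2 = 600.0  # assumed "city-sized" box

speed_kmh = SPEED_KT * 1.852
coverage_per_sortie = SWATH_KM * speed_kmh * SORTIE_HOURS
sorties = SEARCH_AREA_KM2 / coverage_per_sortie

print(f"Coverage per sortie: {coverage_per_sortie:.0f} km^2")
print(f"Sorties to clear {SEARCH_AREA_KM2:.0f} km^2: {sorties:.1f}")
```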
Answers to Key Research Questions
  • What is one modification you would make to the existing system to make it more successful in maritime search and rescue operations?
  • I would develop and install a sensor that could fulfill a function similar to look-up/shoot-up on a fighter radar. This would allow the unmanned system to remain in the relatively constant subsurface environment, avoid inclement weather, and conduct wide area scans for life rafts or floating wreckage.
  • How can maritime unmanned systems be used in conjunction with UAS to enhance their effectiveness?
  • The two systems could either fuse or cue their respective sensors using a datalink. For example, if an MQ-4C Triton identified a point of interest with one of its sensors, it could direct the unmanned maritime system (UMS) to shift sensors to that point, and vice versa. If a UUV were in use, it would periodically have to surface to join the network and receive updates.
  • What advantages do unmanned maritime systems have over their manned counterparts?
  • The most obvious advantage is that it keeps additional people from entering a potentially hazardous area. Especially in the case of a UUV, the vehicle can be made much smaller (or payload capacity increased) by eliminating a crew compartment and life support systems.
  • Are there sensor suites that are more effective on unmanned systems?
  • I would propose that the sensors themselves are not more effective, but that the unmanned system allows them to get closer to the target at a lower cost, which improves their performance. For example, suppose we install the same sonar imaging capability of the EdgeTech 2200 on a manned system. If it were a surface vessel, slant range would increase, decreasing resolution. If it were a submarine capable of Bluefin-21 depths, the vehicle would have to be equipped with all the systems needed to support life at 14,000 ft.

References

AML Oceanographic. (2015). USBL / SBL / LBL (Acoustic Positioning). Retrieved from http://www.amloceanographic.com/Technical-Demo/USBL-SBL-LBL_2

Bluefin Robotics. (2015). Bluefin-21 Summary. Retrieved from http://www.bluefinrobotics.com/products/bluefin-21/

EdgeTech. (2015). EdgeTech 2200 Modular Sonar System. Retrieved from http://www.str-subsea.com/sales/edgetech-2200-modular-sonar-system

Kaye, B. (2014, April 18). Drone Risks Damage at Record Depth in Search for Malaysian Plane. Reuters Business. Retrieved from http://www.reuters.com/article/2014/04/18/us-malaysia-airlines-idUSBREA3A06W20140418

Teledyne Technologies Inc. (2013). Workhorse Navigator Doppler Velocity Log. Retrieved from http://www.rdinstruments.com/navigator.aspx

Saturday, June 6, 2015

Autonomous Ground Navigation (UNSY 605, 1.5)

        For this introductory post I reviewed a paper on autonomous navigation using augmented visual localization. In this case the proprioceptive sensor was an odometer and the exteroceptive sensor was an electro-optical camera. The authors were tasked with designing localization algorithms and control laws for a driverless taxi in an urban environment. Two main factors complicate this task. First, the “urban canyon” effect limits the number of Global Positioning System (GPS) satellites in view, reducing position accuracy. Second, the complex street layout requires high-frequency position updates, which GPS does not provide.

Visual localization can serve as the primary navigation solution when the user has high-fidelity images of the operational area. Reference frames are pre-generated, uploaded to vehicles, and then compared to frames from an onboard camera. The largest source of error stems from the use of global scaling factors. These are part of the image-comparison algorithms and essentially state that, for a given commanded speed, prominent references in the field of view should grow larger at a certain rate. The authors proposed a local scaling factor determined by an odometer, which provides the distance traveled along the actual trajectory. An extended Kalman filter was also used to generate commands after comparing estimated and actual locations. A benefit of the odometer and filter addition is that the system can still estimate location, albeit with some drift, if the camera becomes temporarily obscured or washed out by the sun.
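The fusion idea can be boiled down to a one-dimensional sketch: predict position from odometer increments, then correct with intermittent visual fixes. This is my simplification of the paper's extended Kalman filter, and the noise values are made up:

```python
# 1-D sketch of odometer-aided visual localization: predict position from
# odometer increments, correct with noisy visual fixes. A simplification
# of the paper's EKF; all noise values are made up for illustration.

import random

Q = 0.05  # odometer process-noise variance (assumed)
R = 1.0   # visual-fix measurement-noise variance (assumed)

x_est, p = 0.0, 1.0          # state estimate and its variance
x_true = 0.0

for step in range(50):
    odo = 1.0                            # commanded 1 m per step
    x_true += odo + random.gauss(0, Q ** 0.5)

    # Predict: propagate the estimate with the odometer increment.
    x_est += odo
    p += Q

    # Correct: a visual fix arrives every 5th step (the camera may be
    # obscured or washed out in between, exactly the failure mode above).
    if step % 5 == 0:
        z = x_true + random.gauss(0, R ** 0.5)
        k = p / (p + R)                  # Kalman gain
        x_est += k * (z - x_est)
        p *= (1 - k)

print(f"true={x_true:.2f} m, est={x_est:.2f} m, sigma={p ** 0.5:.2f} m")
```

Between fixes the variance grows (drift), and each visual correction pulls it back down, which is the behavior the authors report.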

The paper presented experimental results showing good correlation between expected and actual trajectories, and predicted that this approach would be versatile in that many types of additional sensor data could be used to augment visual localization. Future work was aimed at increasing accuracy at night using headlights, and at employing a second aft-facing camera to combat prolonged washout when traveling into a low sun. I would propose adapting the system to work with a self-calibrating infrared imaging system to eliminate the lighting issues, along with an edge-detection algorithm that would correlate reference frames regardless of whether the camera was in black-hot or white-hot mode.
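As a sketch of that proposal, the snippet below matches frames on Canny edge maps, which are identical for black-hot and white-hot imagery because gradient magnitude ignores polarity. It assumes OpenCV and NumPy are available, and the file names are hypothetical:

```python
# Sketch of polarity-invariant IR frame matching via edge detection.
# Assumes OpenCV (cv2) and NumPy; the file names are hypothetical.

import cv2
import numpy as np

def edge_map(gray: np.ndarray) -> np.ndarray:
    """Canny edges; inverting the image (white-hot vs. black-hot)
    yields the same gradient magnitudes, hence the same edges."""
    return cv2.Canny(gray, 50, 150)

ref = cv2.imread("reference_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
live = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)         # hypothetical file

# Correlate edge maps instead of raw intensities so the match is
# unaffected by the camera's black-hot/white-hot setting.
score = cv2.matchTemplate(edge_map(live), edge_map(ref), cv2.TM_CCOEFF_NORMED)
_, best, _, loc = cv2.minMaxLoc(score)
print(f"best match {best:.2f} at {loc}")
```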

Reference

Karam, N., Hadj-Abdelkader, H., Deymier, C., & Ramadasan, D. (2010, October). Improved Visual Localization and Navigation Using Proprioceptive Sensors. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan.