Saturday, December 26, 2015

Weather Avoidance Sensor Systems for Unmanned Aerial Systems


Weather Avoidance Sensor Systems for Unmanned Aerial Systems
Geoffrey T. Barnes
Embry-Riddle Aeronautical University

Summary
Loss of weather situation awareness has been a causal factor in Unmanned Aerial System (UAS) accidents and incidents; however, real-time weather sensing, perception, and processing are severely lacking in current flight software and Control Stations (CS). While UAS flown primarily via line-of-sight data links can avoid adverse weather observed by the operator and/or visual observers, UAS that fly beyond visual line-of-sight (BVLOS) may enter areas with little or no weather reporting available. Without an effective on-board, real-time weather sensing system, the pilot may lose weather situation awareness, exposing the vehicle to numerous meteorological hazards, including airframe icing, wind shear, lightning strikes, and hail. Weather radar and lightning detectors, currently fielded on manned aircraft, were reviewed for UAS applicability, and an optical-flow image processing technique is proposed. Size, weight, and data link bandwidth demands were investigated through a literature review to identify a low-cost solution requiring minimal implementation effort. Image processing is recommended because it can be applied to the full range of UAS sizes and, in some cases, can use hardware that is already in place for operating the vehicle.

Problem Statement
Possibly the most significant threat to UAS is airframe icing: super-cooled water droplets or precipitation freezing and accumulating on the vehicle (USAF, 1997). Many UAS are designed for long endurance, typically featuring lightly loaded, high-aspect-ratio wings and low power-to-weight ratios. These characteristics foster ice accretion and make recovery from ice-induced low-lift, high-drag conditions difficult. Icing can be encountered whenever visible moisture is present (clouds, fog, rain), typically when the ambient temperature is between 10° and -10° C.


Thunderstorms are a significant threat, especially to light aircraft and UAS (USAF, 1997). They can occur any time moist, unstable air is combined with a source of lifting such as mountains or converging air masses. Thunderstorms contain updrafts and downdrafts approaching 5,000 feet per minute, which can exceed vehicle performance or structural limits as the aircraft attempts to maintain control. Severe icing can be encountered, as well as hail, which can be ejected from the storm as far as 20 NM. Lightning is especially hazardous, with the ability to cause physical damage, ignite fuel, overload electrical systems, and damage avionics. The U.S. Air Force Weather for Aircrew handbook (1997) recommends laterally avoiding severe thunderstorms by 20 NM.

A study conducted by Douglas Pearson of the National Weather Service in 2002 identified two leading causal factors for fatal manned-aircraft mishaps attributed to low visibility and ceilings. First, non-instrument-rated pilots continuing into Instrument Meteorological Conditions (IMC) often led to loss of control from spatial disorientation or Controlled Flight Into Terrain (CFIT). Second, poor decision-making when reaching Decision Altitude/Minimum Descent Altitude on instrument approaches resulted in CFIT or loss of control just prior to or after touchdown. This is an area where highly automated UAS may improve safety by executing autopilot approaches with a suite of sensors such as laser/radar altimeters, infrared cameras, and specialized homing equipment. Although sufficient data does not yet exist to be certain, it can be expected that enroute or on-objective weather accidents will take the place of low-visibility CFIT for UAS. UAS Control Stations (CS) have been identified by numerous studies (Tvaryanas, Thompson, & Constable, 2005; Thompson, 2005; Williams, 2004) as lacking proper supply or presentation of flight-critical information. While human factors have become a special interest item in CS design, weather situation awareness has yet to be addressed on a large scale.

Significance of the Problem
Meteorological phenomena present a significant hazard to air vehicles, and the highly dynamic nature of weather makes tactical forecasting and decision-making difficult. In the case of manned aircraft, all pilots are provided basic training in weather forecasting and observation. Even so, nearly 30% of fatal aviation accidents over a five-year period were the result of weather, which highlights the complex problem of anticipating and avoiding unsafe conditions (Pearson, 2002). Of these fatal accidents, 63% were caused by low visibility (likely leading to CFIT), 18% by convective or non-convective precipitation, and another 18% by turbulence and wind shear. Unmanned Aerial Systems are often advertised as being capable of increasing safety; however, satisfactory reduction in mishap rates will not be achieved until weather avoidance sensing systems and procedures are developed. A review of Department of Defense UAS accidents (Williams, 2004) identified 9% of U.S. Army Pioneer losses as directly caused by weather. Williams was unable to access more detailed data for other service branches and UAS, but he was able to conclude that human factors contributed to at least a third of all UAS accidents that resulted in total loss of the vehicle. The most troubled platform was the MQ-1 Predator, with human factors implicated in 67% of its accidents. Of those accidents, 67% were the result of “Unsafe Acts,” comprising “skill-based errors, decision errors, perceptual errors, and violations.” Lack of weather situation awareness can precipitate any of those four categories. For example, poor weather situation awareness could lead the crew to select a route that flies the UAS through a thunderstorm, where it is destroyed by wind shear and lightning strikes. A second example is severe airframe ice accumulation, resulting in an uncontrolled descent and possible crash, caused by the crew’s failure to recognize moisture-laden clouds building in their airspace. This paper investigates the sensing portion of the problem.

Alternative Actions
Two systems currently fielded on manned aircraft, airborne weather radar and lightning detectors, provide a foundation for developing a UAS weather sense-and-avoid capability. Recognizing that these systems may not be suitable for the wide range of sizes and shapes of unmanned aircraft, the suitability of small electro-optical (EO) or infrared (IR) sensors combined with advanced image processing is investigated as well.

Weather Radar
Weather radar provides a wealth of information that allows the crew to avoid areas that are currently unsafe and to anticipate areas that will become unsafe in the near future. The radar generates a display based on the amount of energy reflected by airborne precipitation. Returns are color-coded according to strength, so crews can identify both active storm cells and adjacent high-moisture areas that can develop into storms. Pulse-Doppler radars augment the basic functions by measuring the Doppler shift in the reflected energy, which can be used for wind shear and turbulence detection. A typical airborne weather radar operates in X-band, has a 30° beam width, and a maximum unambiguous range of 300 NM (Melvin & Scheer, 2013). Unambiguous range is the maximum distance a radar’s transmission can travel to the target and echo back within the receiving period of a given cycle. While weather radars provide a high volume of useful information, integration is challenging. The smallest radar package available is the Bendix/King RDR 2000, which weighs only 10 lb but requires a 12 x 10 in installation envelope and a 28 VDC/3 A power supply (Bendix/King, 2015). This size, weight, and power requirement prevents installation on vehicles much smaller than the MQ-1C, which, with a 56 ft wingspan and 3,600 lb maximum takeoff weight, is currently the smallest UAS carrying comparable radar equipment (General Atomics, 2015).
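As a point of reference, the relationship between pulse repetition frequency (PRF) and unambiguous range can be illustrated with a short calculation. The sketch below is illustrative only; the PRF value is an assumption chosen to reproduce the 300 NM figure cited above, not a published RDR 2000 parameter.

# Minimal sketch: maximum unambiguous range of a pulsed radar.
# R_ua = c / (2 * PRF) -- the echo must return before the next pulse is sent.
# The PRF below is an assumed value, not a documented radar setting.

C = 299_792_458.0          # speed of light, m/s
M_PER_NM = 1852.0          # meters per nautical mile

def unambiguous_range_nm(prf_hz: float) -> float:
    """Return the maximum unambiguous range in nautical miles for a given PRF."""
    return C / (2.0 * prf_hz) / M_PER_NM

if __name__ == "__main__":
    prf = 270.0  # Hz (assumed); low PRFs are typical of long-range weather modes
    print(f"PRF = {prf:.0f} Hz -> unambiguous range = {unambiguous_range_nm(prf):.0f} NM")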

Because forecasting is such a complex problem, a human still needs to interpret the radar data, which means it must be passed through a data link to the CS, leaving less bandwidth available for the primary mission. The radar set also needs to be regularly manipulated in a closed loop based on human interpretation of the data, such as adjusting gain or antenna tilt angle (Baur, 2012), making it difficult to integrate effectively with a fully autonomous vehicle.

Lightning Detectors
Lightning detectors for aircraft receive sferics, the electromagnetic impulses from lightning discharges that propagate through the atmosphere. Within the radio spectrum, cloud-to-ground lightning is characterized by a broadband impulse on the order of 1 kHz (Siingh et al., 2012). Beyond a distance of approximately 10 miles, where the sound and luminosity can no longer be sensed, sferics become the only evidence of lightning discharges. Aircraft lightning detectors receive these impulses and provide an indication to the pilot, usually consisting of range and bearing to the strike. Successive strikes are recorded and displayed to build a rough picture of active storm cells. While these devices provide less information than radar, they have the advantage of 360° coverage. They are well suited to light general-aviation-size aircraft due to small installation volume and weight, low power requirements, and an order-of-magnitude lower cost than radar. Models such as the Avidyne TWX670 provide a full-color display of trend data to a range of 200 NM (Avidyne, 2015).

The lightning detector market has steadily declined in recent years in favor of data links such as the XM WX satellite weather service, which provides Next Generation Radar (NEXRAD) imagery and lightning strikes, or the subscription-free weather information provided via Flight Information Service-Broadcast (FIS-B) (Anglisano, 2012) to aircraft with a Universal Access Terminal (UAT). UAT is a civil aviation data link architecture operating on 978 MHz, established under the Federal Aviation Administration’s Next Generation Air Transportation System modernization initiative. While data links may provide significantly more capability (weather, traffic, onboard internet), they rely on a source of high-fidelity data such as NEXRAD, which is only available in the U.S. In this respect, lightning detectors maintain an advantage as a self-contained onboard source of real-time data. The simplicity of the output data (only range and bearing) would require very little data link bandwidth to generate a display in the CS. Additionally, integration with an autonomous system would be relatively straightforward: logic can be added that keeps the vehicle a preprogrammed distance from detected strikes, as sketched below.
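To illustrate how simple that autonomy interface could be, the following sketch shows one possible stand-off check. It assumes strikes are stored as range/bearing pairs, converts them to flat-earth offsets, and rejects planned waypoints inside an avoidance radius; all function names and parameters are hypothetical, not taken from any fielded system.

# Hedged sketch: reject planned waypoints that fall too close to recorded
# lightning strikes. Strikes are stored as (range_nm, bearing_deg) relative to
# the vehicle, as a lightning detector would report them.
import math

AVOID_NM = 20.0  # preprogrammed stand-off distance (USAF guidance for severe storms)

def to_xy(range_nm, bearing_deg):
    """Convert a range/bearing pair to flat-earth x (east) / y (north) offsets in NM."""
    b = math.radians(bearing_deg)
    return range_nm * math.sin(b), range_nm * math.cos(b)

def waypoint_is_safe(wp_range_nm, wp_bearing_deg, strikes, avoid_nm=AVOID_NM):
    """Return True if the waypoint is at least avoid_nm from every recorded strike."""
    wx, wy = to_xy(wp_range_nm, wp_bearing_deg)
    for s_range, s_bearing in strikes:
        sx, sy = to_xy(s_range, s_bearing)
        if math.hypot(wx - sx, wy - sy) < avoid_nm:
            return False
    return True

# Example: two strikes recorded roughly off the nose. A waypoint well off to
# the side stays clear, while one near the storm's bearing is rejected.
strikes = [(45.0, 10.0), (50.0, 355.0)]
print(waypoint_is_safe(20.0, 90.0, strikes))  # True  (> 20 NM from both strikes)
print(waypoint_is_safe(40.0, 8.0, strikes))   # False (inside the stand-off ring)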

Image Processing
Significant work has been accomplished using EO/IR sensors and image processing algorithms to give UAS a sense-and-avoid capability. These passive systems are still in the development stage because of the challenge of reliably detecting pixel- or sub-pixel-sized objects across a wide range of lighting and contrast scenarios (Delves, 2012); however, the technology is robust enough for cloud detection. As discussed earlier, cloud development is a key indicator of atmospheric stability and can help build a local near-term forecast. Keeping the vehicle clear of clouds significantly reduces the threat of airframe icing and assists in maintaining line-of-sight to a visual mission objective.

In the first step, an image is captured by a fixed, forward-facing camera, like those commonly found on medium to large UAS for takeoff and landing, or by a dedicated weather camera. This capability can also be integrated with sUAS operating on constrained power, processing, and weight budgets. Work by Vidya Murali at Clemson University (2011) demonstrates that machine vision can operate on images as small as 32x24 pixels within a 30° field of view (FOV) and still provide sufficient data for basic obstacle avoidance. This is easily accomplished by miniature camera chips such as those offered for the Raspberry Pi, with 640x480 video resolution in an 8.5 x 11.3 mm package requiring only a 5 V supply (Adafruit, 2015). Down-sampling the image toward the 32x24 threshold requires less processing power, reducing the computing requirements for the sUAS. With the image captured, the second step is to reduce jitter, the high-frequency changes in the scene caused by aircraft or sensor movement. To keep onboard processor requirements low, a method such as Image Projection Correlation (IPC) can be used. IPC sums gray-scale pixel values across each column and row of the focal array, and then compares the cross-correlation peak between adjacent frames to determine displacement (Delves, 2012). A simplified sketch of this projection-correlation step is shown below.
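The sketch below illustrates the projection-correlation idea described above in a few lines of Python. It is a minimal, assumption-laden version (whole-pixel shifts, wrap-around correlation over a small lag window) rather than the exact IPC formulation in Delves (2012).

# Hedged sketch of Image Projection Correlation (IPC) as described above:
# collapse each gray-scale frame into row-sum and column-sum profiles, then
# find the shift that best aligns the profiles of adjacent frames. The result
# approximates the inter-frame jitter (dx, dy) in whole pixels.
import numpy as np

def projection_shift(prev: np.ndarray, curr: np.ndarray, max_shift: int = 8):
    """Estimate (dx, dy) pixel displacement between two gray-scale frames."""
    def best_lag(a, b):
        # Correlate two 1-D projections over a limited lag window (wrap-around
        # shift used for simplicity) and return the lag with the highest score.
        lags = list(range(-max_shift, max_shift + 1))
        scores = [float(np.dot(np.roll(a, lag), b)) for lag in lags]
        return lags[int(np.argmax(scores))]

    col_prev, col_curr = prev.sum(axis=0), curr.sum(axis=0)  # column sums -> x profile
    row_prev, row_curr = prev.sum(axis=1), curr.sum(axis=1)  # row sums    -> y profile
    return best_lag(col_prev, col_curr), best_lag(row_prev, row_curr)

# Example: shift a synthetic 24x32 frame by (+3, -2) pixels and recover the offset.
rng = np.random.default_rng(0)
frame = rng.random((24, 32))
moved = np.roll(np.roll(frame, 3, axis=1), -2, axis=0)
print(projection_shift(frame, moved))  # -> (3, -2)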

With a stable stream of image frames, the third step is to process them and determine whether any clouds are present. Numerous video tracking algorithms are available, varying in the amount of preprocessing and the number of computational steps (Davies, 2012). Preprocessing filters out data that is not useful to the main function, which reduces the amount of data the higher-order algorithms need to process. In the case of edge detection, the intensity of each pixel is of interest, so a gray-scale filter can discard color information. Convolution masks tailored to the particular application (blur, edge detection, contrast, etc.) are then applied, so that the least amount of data is forwarded for tracking or change detection. In the interest of reducing computations for sUAS, a differential gradient (DG) edge detector should be used, which evaluates the gradient (first derivative) of pixel intensity in each row (x) and column (y) and then calculates the local edge value as the root-mean-square of the x and y gradients. Computational complexity can be reduced further by instead summing the absolute values of the x and y gradients, which Abdou and Pratt (1979) demonstrated is sufficient for most edge detection requirements.

The resulting line image can then be passed to a higher-level processing stage employing the optical flow technique. The local intensity, as a function of x and y location in the image and time, is approximated as a first-order Taylor series expansion, and a flow field is estimated by comparing successive frames to determine the differential values (Davies, 2012). A common problem with the optical flow technique is that velocities appear to vanish when an object’s edges are parallel to the velocity. This is initially mitigated by the irregular shape of clouds, and further aided by bounding areas of similar velocity and averaging. The technique provides velocities relative to a fixed camera, which is enough to determine whether obstacles (in this case clouds) are present. Absolute velocities can be determined by incorporating aircraft state parameters from the navigation system and integrating to estimate cloud location (a simple sketch of these edge-detection and flow-estimation steps is shown below).

While on station, the UAS will likely be performing some type of holding maneuver that results in a 360° heading change, whether a circular orbit over a point of interest or a raster mapping pattern over a wide area. Throughout the 360° sweep, cloud position, velocity, and size trend data could be gathered and downlinked to the CS for display, allowing more efficient vehicle positioning when attempting to maintain visual custody of the mission objective. This image processing technique works with both EO and IR sensor inputs and can use the existing fixed nose cameras on many large UAS. Processing could be accomplished with compact modules such as the Sundance EVP6472-941, which is geared toward high-intensity processing and is currently used in signals intelligence and communications analysis. It consists of two multicore Digital Signal Processors (DSP), which could simultaneously accept EO and IR sources. The main Xilinx Virtex-5 Field Programmable Gate Array (FPGA) would run the cloud detection algorithms and output commands over built-in Gigabit Ethernet or RS-232 serial communication. The Sundance EVP6472-941 package occupies less than 24 cubic inches, weighs less than 1 lb, and costs approximately $6,000 (Holland, 2010).
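The following sketch is one possible, simplified realization of the two steps above. The edge map uses the |gx| + |gy| approximation noted by Abdou and Pratt (1979), and the flow estimate solves the brightness-constancy equation by least squares over small blocks, which is one simple way to bound areas of similar velocity and average them; block size and the least-squares formulation are assumptions, not the exact algorithms from Davies (2012).

# Hedged sketch of the preprocessing and optical-flow steps described above.
import numpy as np

def edge_map(gray: np.ndarray) -> np.ndarray:
    """Differential-gradient edge strength: sum of absolute x and y gradients."""
    gy, gx = np.gradient(gray.astype(float))
    return np.abs(gx) + np.abs(gy)

def block_flow(prev: np.ndarray, curr: np.ndarray, block: int = 8):
    """Least-squares optical flow (u, v) for each block of a stabilized frame pair."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    iy, ix = np.gradient(prev)          # spatial intensity gradients
    it = curr - prev                    # temporal intensity difference
    rows, cols = prev.shape
    flow = np.zeros((rows // block, cols // block, 2))
    for r in range(0, rows - block + 1, block):
        for c in range(0, cols - block + 1, block):
            # Brightness constancy: Ix*u + Iy*v = -It, solved over the block.
            A = np.stack([ix[r:r+block, c:c+block].ravel(),
                          iy[r:r+block, c:c+block].ravel()], axis=1)
            b = -it[r:r+block, c:c+block].ravel()
            uv, *_ = np.linalg.lstsq(A, b, rcond=None)
            flow[r // block, c // block] = uv
    return flow

# Usage idea: feed two stabilized, down-sampled gray-scale frames; blocks whose
# flow differs strongly from the vehicle's own motion are candidate cloud regions.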
The image processing weather avoidance system output could be easily adjusted to accommodate return link bandwidth limitations or operator requirements, varying between simple range/bearing to potential storm cells and cloud coverage overlays for moving maps.

Recommendations
A fusion technique for weather avoidance is recommended, combining the strengths of lightning detectors and image processing. Both technologies can be integrated with a wide size range of vehicles due to their relatively small installed volume and weight. Their concise outputs would also reduce interface complexity when used with a fully autonomous system, and could be combined with geo-fencing to enable the vehicle to avoid dangerous meteorological events. In an example scenario, a UAS is on station in an area prone to afternoon thunderstorms. As the day progresses, locations of growing cloud coverage are downlinked to the CS, allowing the crew or autopilot to more efficiently move the vehicle to remain in visual contact with the mission objective. The vehicle has already noted an ambient temperature within a range conducive to airframe icing, and downlinks a potential-storm-activity warning with an approximate (sub-cardinal) direction. For a fully autonomous system, this warning may prompt a person managing a fleet of UAS to research weather forecasts or at least monitor this particular vehicle more closely. With this added weather situation awareness, the crew or autopilot is able to maneuver to an area that allows mission continuation without endangering the vehicle. In this situation, the crew directs the UAS to an area upwind of the storm to avoid hail or ice-inducing precipitation that will likely be found under the “anvil,” while still keeping the vehicle, and the line-of-sight to the objective, clear of clouds. An autonomous vehicle could be programmed with logic that makes a similar decision using the wind estimates typically generated by coupled inertial/GPS navigation systems. Onboard lightning sensing equipment would provide a final indication of storm maturity, causing the crew to terminate the mission or hold at a safe distance until the storm dissipates. Overall, this selection of sensors and fused approach has strong potential to reduce weather-related UAS mishaps with readily available technology and should be pursued. Additionally, UAS operator certification programs should incorporate basic meteorology education similar to that of their manned counterparts, so that the weather system data can be correctly interpreted. A minimal sketch of how these sensor inputs might be fused into advisory levels is shown below.
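The sketch below is one hypothetical way to express the fused decision logic from the scenario above. All thresholds, state names, and advisory levels are assumptions chosen for illustration: growing cloud coverage from the imaging chain, an icing-conducive temperature band, and lightning detections each escalate the advisory passed to the crew or autopilot.

# Hedged sketch of the recommended sensor fusion, using assumed thresholds.
# Inputs: cloud trend from image processing, outside air temperature, and
# lightning strike reports. Output: an advisory level for the CS or autopilot.
from dataclasses import dataclass

@dataclass
class WeatherState:
    cloud_coverage_growing: bool      # from optical-flow cloud trend data
    ambient_temp_c: float             # from the air data system
    lightning_strikes_20nm: int       # from the onboard lightning detector

def advisory(state: WeatherState) -> str:
    icing_band = -10.0 <= state.ambient_temp_c <= 10.0   # icing-conducive range
    if state.lightning_strikes_20nm > 0:
        return "TERMINATE_OR_HOLD"      # mature storm: leave or hold at a safe distance
    if state.cloud_coverage_growing and icing_band:
        return "REPOSITION_UPWIND"      # building clouds plus icing temps: move the orbit
    if state.cloud_coverage_growing:
        return "MONITOR"                # growing clouds only: watch the trend
    return "CONTINUE"

print(advisory(WeatherState(True, 4.0, 0)))   # -> REPOSITION_UPWIND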

References
Abdou, I. E., & Pratt, W. K. (1979). Quantitative design and evaluation of enhancement/thresholding edge detectors. Proceedings of the IEEE, 67(5), 753-763.

Adafruit. (2015). Spy Camera for Raspberry Pi. Retrieved from http://www.adafruit.com/products/1937

Anglisano, L. (2012, April). Lightning detectors: Still worth having. Consumer Aviation. Retrieved from http://connection.ebscohost.com/c/articles/78149092/lightning-detectors-still-worth-having

Avidyne Corporation. (2015). TWX670 Tactical Weather Detection System. Retrieved from http://www.avidyne.com/products/twx670/index.asp

Baur, C. (2012, April 1). Weather Radar: Navigating the Storm. Rotor & Wing. Retrieved from http://www.aviationtoday.com/rw/commercial/ems/76076.html#.VZbYamCj7V1


Davies, E. R. (2012). Computer and machine vision: Theory, algorithms, practicalities (4th ed.). Waltham, MA: Academic Press. Retrieved from http://www.sciencedirect.com

Delves, P. (2012). Sense and Avoid in UAS : Research and Applications (2nd Edition). Hoboken, NJ, USA: John Wiley & Sons. Retrieved from http://www.ebrary.com

General Atomics. (2015). MQ-1C Gray Eagle: Armed Persistence. Retrieved from http://www.gaasi.com/Websites/gaasi/images/products/aircraft_systems/pdf/Gray_Eagle021915.pdf

Holland, C. (2010, August 17). Multicore developer platforms uses TI DSPs and Xilinx FPGA. Embedded. Retrieved from http://www.embedded.com/electronics-products/electronic-product-reviews/embedded-tools/4206229/-Multicore-developer-platforms-uses-TI-DSPs-and-Xilinx-FPGA

Melvin, W. L., & Scheer, J. A. (2013). Principles of modern radar: Radar applications. Raleigh, NC: Institution of Engineering and Technology.

Murali, V. N. (2011). Low-resolution vision for autonomous mobile robots (Order No. 3469540). Available from ProQuest Dissertations & Theses Global. (893426415). Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/docview/893426415?accountid=27203

Pearson, D. (2002). VFR flight not recommended: A study of weather-related fatal aviation accidents. NOAA/NWS Technical Attachment, Southern Region Headquarters, Dallas, TX, TA, 18.

Schetzen, M. (2006). Airborne doppler radar: Applications, theory, and philosophy. Reston, Va: American Institute of Aeronautics and Astronautics.

Siingh, D., Singh, R. P., Singh, A. K., Kumar, S., Kulkarni, M. N., & Singh, A. K. (2012). Discharges in the stratosphere and mesosphere. Space Science Reviews, 169(1-4), 73.

Thompson, W. (2005). U.S. Military Unmanned Aerial Vehicle Mishaps: Assessment of the Role of Human Factors Analysis and Classification System (HWS-PE-BR-TR-2005-0001). Brooks City Base, TX: 311th Human Systems Wing.

Tvaryanas, A.P.; Thompson, W.T.; Constable, S.H. (2005) The U.S. Military Unmanned Aerial Vehicle (UAV) Experience: Evidence-Based Human Systems Integration Lessons Learned. In Strategies to Maintain Combat Readiness during Extended Deployments – A Human Systems Approach (pp. 5-1 – 5-24). Meeting Proceedings RTO-MP-HFM-124, Paper 5. Neuilly-sur-Seine, France: RTO.

United States Air Force (USAF). (1997). AFH 11-203 Volume 1: Weather for Aircrew. Wright-Patterson AFB, OH: United States Air Force.


Williams, K. W. (2004). A summary of unmanned aircraft accident/incident data: Human factors implications. Washington, DC: Federal Aviation Administration, Office of Aerospace Medicine.

Saturday, July 18, 2015

Sense and Avoid Sensor Selection (UNSY 605, 7.4)

Sense and Avoid Sensor Selection

Developing a sense-and-avoid (SAA) capability is challenging for small unmanned aircraft systems (sUAS). Typical onboard systems for detecting intruding aircraft include non-cooperative sensors such as radar and cooperative systems like the Traffic Collision Avoidance System (TCAS). These options are currently not suitable for sUAS because of their relatively large size, weight, power consumption, and cost. The sUAS SAA trade space does offer some advantages, however: short ranges and low speeds enable vision-based systems built from low-cost electro-optical (EO) or infrared (IR) sensors.

Sensor Recommendation
The proposed sensor system would be based on research into the optical flow sensed by insects and used for navigation (Barnhart, Shappee, & Marshall, 2011). A house fly, for example, uses thousands of simple receptor cells that detect changes in contrast. Those signals are sent to a section of the brain containing elementary motion detectors, which build a relative motion map within the fly’s field of view. This system was recreated in a laboratory setup by Ruffier and Franceschini in 2008 using a tethered remote-controlled helicopter testbed (referred to as OCTAVE2), with the intent of avoiding terrain and close-in obstacles. The technique could also be adapted to provide an SAA capability between non-cooperating vehicles operating in the sUAS realm (low speed, less than 400 ft above ground level). The basic sensing element consisted of a pair of P-type/intrinsic/N-type (PIN) photodiodes, which produce a current in response to received light (First Sensor AG, 2015). The PIN diodes were installed behind a 5 mm diameter lens and mounted to a 400 µm-thick circuit board; the complete sensing element measures 2.7 x 3.0 cm and weighs 4.3 g. While a power requirement for this particular arrangement was not provided, PIN diodes are available in a variety of supply voltages (3.3-9 V) that would be compatible with sUAS. Signals were routed to a microprocessor running optical flow analysis, which generated commands for the autopilot (a simple single-channel sketch of this optic-flow measurement is shown below). The system was found suitable for directing the OCTAVE2 over steep terrain to a safe landing with only two “eyes,” and could be further enhanced with a larger number of smaller sensing elements, increasing resolution and field of regard. Depending on the mission of the sUAS, the SAA system could also serve as a subsystem of the overall vehicle navigation system rather than an addition, further reducing size, weight, and power requirements. This SAA technique has the strong advantages of light weight, small size, low power consumption, and no need for additional sensing inputs (i.e., GPS, INS, pitot-static) to guide the vehicle. One disadvantage is its relatively short range, which reduces the time available to alter the flight path. This will likely not be a problem for highly maneuverable quad-copters, but it may make the insect-inspired system less desirable for fixed-wing sUAS.
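The sketch below illustrates a single "time of travel" optic-flow unit built from two photoreceptor signals, in the spirit of the setup described above: angular speed is estimated as the inter-receptor angle divided by the time delay between the two signals. The inter-receptor angle and sample rate are assumed values for illustration, not parameters from Ruffier and Franceschini (2008).

# Hedged sketch of a two-photoreceptor optic-flow measurement.
import numpy as np

INTER_RECEPTOR_ANGLE_DEG = 3.0   # assumed angular separation of the two diodes
SAMPLE_RATE_HZ = 1000.0          # assumed photodiode sampling rate

def optic_flow_deg_per_s(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Estimate angular speed from the delay between two photoreceptor signals."""
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    corr = np.correlate(b, a, mode="full")          # lag of signal B relative to A
    lag_samples = int(np.argmax(corr)) - (len(a) - 1)
    if lag_samples <= 0:
        return 0.0                                  # no forward-moving contrast detected
    delay_s = lag_samples / SAMPLE_RATE_HZ
    return INTER_RECEPTOR_ANGLE_DEG / delay_s

# Example: the same contrast edge crosses receptor B 20 ms after receptor A,
# giving roughly 3 deg / 0.020 s = 150 deg/s of optic flow.
t = np.arange(0, 0.5, 1.0 / SAMPLE_RATE_HZ)
edge = np.exp(-((t - 0.10) / 0.01) ** 2)              # contrast pulse seen by A
print(optic_flow_deg_per_s(edge, np.roll(edge, 20)))  # ~150 deg/s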

References

Barnhart, R. K., Shappee, E., & Marshall, D. M. (2011). Introduction to Unmanned Aircraft Systems. London, GBR: CRC Press. Retrieved from http://www.ebrary.com

First Sensor AG. (2015). PIN photodiodes. Retrieved from http://www.first-sensor.com/en/products/optical-sensors/detectors/pin-photodiodes/

Ruffier, F., & Franceschini, N. (2008). Aerial robot piloted in steep relief by optic flow sensors. In Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 1266-1273). doi:10.1109/IROS.2008.4651089

Friday, July 10, 2015

Control Station Analysis (UNSY 605, 6.5)

Control Station Analysis

Unmanned systems require thoughtfully designed control stations that provide the operator with sufficient situation awareness and intuitive controls. Unlike a manned system, they cannot rely on the human ability to freely sense the environment in a variety of ways, in all directions, and to adapt to non-standard conditions. The following research activity analyzes the control station for an unmanned ground vehicle (UGV) and provides recommendations for improvement.

iRobot 510 PackBot
The iRobot 510 PackBot is a versatile UGV currently marketed for military, law enforcement, and disaster response use. Its typical mission sets include building and route clearance, explosive ordnance disposal (EOD), and hazardous material detection and handling. The baseline vehicle has tracks and a modular payload frame that allows it to be quickly tailored to specific mission tasks or environments. The control station, referred to as the Operator Control Unit (OCU), is of ruggedized, all-weather construction and focused on portability (iRobot, 2012). It can control the vehicle through a thin fiber-optic tether or a radio link. The OCU features a 15.1-inch, 1024x768 pixel Liquid Crystal Display (LCD) screen. Video from the Electro-Optical (EO) or Infrared (IR) sensors takes up most of the display. An active 3-D model of the PackBot is provided to visualize the robot’s “pose,” and gauges are provided for critical parameters such as battery power, fiber remaining, and signal strength. Still frames can be captured from the video feed, annotated, and exported using standard networking interfaces, enabling rapid dissemination of intelligence. The UGV can be controlled using the keyboard and dual joysticks on the OCU, or with a video-game-style controller (iRobot, 2012).

Figure 1. 510 PackBot with control station. Adapted from iRobot 510 PackBot Multi-Mission Robot and Army-Guide.com. Copyright 2015 iRobot and ATEN 2008.

An improved PackBot interface has been fielded by the Army Research Laboratory, which increases OCU portability so that it is more compatible with fast-moving infantry units as part of the Future Combat Systems initiative. The improved OCU condenses the control station into a chest-mounted tablet computer, again with dual joysticks or a hand controller (Pomranky, Kovach, & Winslow, 2012). In order to provide more screen area for video display, the 3-D model has been removed, although the critical parameter gauges remain.

Recommendation
Field testing of the improved OCU revealed mainly ergonomic problems with the display. For example, the tablet was difficult to hold while using a hand controller. The screen was also found to be unusable in bright sunlight due to glare, and at night because the screen glow gave away the troops’ position. I would recommend integrating a display similar to L-3 Communications’ Tactical ROVER system (2013), which uses a clip-on helmet-mounted unit to display aircraft targeting pod video to ground parties. The display could be augmented with small critical-parameter tape gauges and an artificial horizon to give the operator relative angle data. Bluetooth camera and vehicle controllers could be installed on an M-16 hand guard, enabling essential vehicle control inputs while the operator is still holding a weapon.

References

iRobot. (2012). iRobot 510 PackBot Specifications. Retrieved from http://www.irobotweb.com/~/media/Files/Robots/Defense/PackBot/iRobot-510-PackBot-Specs.pdf?la=en

iRobot. (2012). IRobot PackBot Accessories Manual. Retrieved from http://www.manualslib.com/products/Irobot-Packbot-2938203.html

Pomranky, R. A., Kovach, J. B., & Winslow, C. H. (2012). General overview of the use of SUGV and centralized controllers type I and II during command, control, communications, computers, intelligence, surveillance, and reconnaissance on the move, event 08 (C4ISR OTM E08). Aberdeen Proving Ground, MD: Army Research Laboratory, Human Research and Engineering Directorate. Retrieved from http://www.dtic.mil/docs/citations/ADA512533


L-3 Communications. (2013). Tactical ROVER-P Data Sheet. Retrieved from http://www2.l-3com.com/csw/ProductsAndServices/DataSheets/Tactical_ROVER-P_Sales-Sheet.pdf

Saturday, June 27, 2015

Unmanned System Data Protocol and Format (UNSY 605, 4.5)


The RQ-4 Global Hawk is a high-altitude, long-endurance (HALE) unmanned aerial system (UAS) currently operated by the U.S. Air Force. The aircraft has a range of 12,300 NM, a service ceiling of 60,000 ft, and cruises at 310 knots (U.S. Air Force, 2014). The Global Hawk’s greatest strength is arguably its 34-hour endurance, which makes it ideal for providing persistent intelligence, surveillance, and reconnaissance (ISR). While the RQ-4 has had limited success with operational users when trying to replace the Lockheed U-2’s capabilities, it has contributed significantly to disaster relief efforts in Japan, Haiti, and the Philippines.

RQ-4 Sensor Capabilities
The Block 20 RQ-4 can carry up to 3,000 lb of mission payloads. The standard configuration is a Raytheon Integrated Sensor Suite (ISS) comprising a Hughes Integrated Synthetic Aperture Radar (HISAR), a 0.4-0.8 µm Electro-Optical (EO) sensor, and a 3.6-5.0 µm Infrared (IR) sensor (Kable Intelligence Ltd, 2015). HISAR is capable of operating in the following modes (Bayma, 1996):
  • Wide Area Search: 24 m resolution over a 60° sector.
  • Strip Map: 6 m resolution over a continuous 37 km path.
  • Spot Image: 1.8 m resolution over a 4.8 x 2.8 km area.
  • Ground Moving Target Indicator (GMTI): Position/velocity within 45° of broadside.
The HISAR transmitter operates in X-band with a 600 MHz bandwidth. Block 40 RQ-4s have recently been delivered with a Multi-Platform Radar Technology Insertion Program payload, which enables new capabilities to be quickly fielded across different vehicles by using common hardware/software interfaces (Pultrich, 2010). The ISS EO/IR sensors share an optical path through a 10 in reflecting telescope that can generate 2 km² images geo-rectified to within 20 m. The ISS requires up to 3.5 kW of power (Kable Intelligence Ltd, 2015), likely supplied by an engine-driven 11.2 kW generator (Cessna, 2013).
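For context, the stated 600 MHz transmit bandwidth can be related to a theoretical slant-range resolution limit (ΔR = c / 2B). The calculation below is illustrative only; the operational mode resolutions listed above also reflect waveform, processing, and geometry trade-offs not captured by this simple bound.

# Illustrative calculation only: theoretical slant-range resolution implied by
# a 600 MHz radar bandwidth (delta_R = c / (2 * B)).
C = 299_792_458.0      # speed of light, m/s
B = 600e6              # stated HISAR bandwidth, Hz

delta_r = C / (2 * B)
print(f"Theoretical range resolution: {delta_r:.2f} m")   # ~0.25 m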

Data Format, Protocols, and Storage
The primary method of transferring sensor data to a ground-based processing, exploitation, and dissemination (PED) function is a Ku-band or UHF satellite data link (Kable Intelligence Ltd, 2015). The HISAR contains a dual-channel receiver that passes raw signals to an analog-to-digital converter. The digitized raw data is held in a high-capacity buffer until the onboard processor finishes generating an image, which is then downlinked for PED (Bayma, 1996). One strength of this architecture is that only the finished product is transmitted from the aircraft, which requires much less bandwidth than the raw data. A key feature of HISAR is its ability to operate in GMTI mode simultaneously while performing wide-area or strip mapping; in this case, the processed images are overlaid with GMTI position/velocity information. In the RQ-4, only high-resolution still frames from the ISS EO/IR sensors are downlinked; however, the MQ-4 Triton variant flown by the U.S. Navy can downlink low-resolution video.

Recommendations
The HALE ISR mission is well suited to a UAS. When comparing the performance of the RQ-4 with the U-2, the unmanned platform clearly has an advantage in terms of persistence. Additionally, a pilot’s life is not risked by enemy fire or by having to ditch over inhospitable environments. My primary recommendation is to continue developing RQ-4 reliability and sensor capability, while employing open architectures to facilitate future enhancements, so that it can seamlessly fill the role of the U-2. Work is currently underway by Northrop Grumman to build a Universal Payload Adaptor that would enable the RQ-4 to “carry the Senior Year Electro-Optical Reconnaissance System-2B/C and the Optical Bar Camera” (Malenic, 2015), which are currently the only internationally recognized systems for aerial treaty enforcement.

References

Cessna Aircraft Company. (2013). Citation X: Specification & Description. Retrieved from http://cessna.txtav.com/~/media/Files/citation/x/xsd.ashx
Note: The Citation X has the same basic power plant model as the RQ-4.

Kable Intelligence Ltd. (2015). RQ-4A/B Global Hawk HALE Reconnaissance UAV, United States of America. Air Force Technology. Retrieved from http://www.airforce-technology.com/projects/rq4-global-hawk-uav/

Malenic, M. (2015, April 29). Northrop Grumman to test U-2 sensors on Global Hawk. Jane’s 360. Retrieved from http://www.janes.com/article/51076/northrop-grumman-to-test-u-2-sensors-on-global-hawk

Pultrich, G. (2010, August 16). Next generation of Global Hawks ready to roll. Flight Global. Retrieved from http://www.flightglobal.com/news/articles/next-generation-of-global-hawks-ready-to-roll-346116/

U.S. Air Force. (2014, October 27). RQ-4 Global Hawk Fact Sheet. Retrieved from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx




Sunday, June 14, 2015

UAS Primary Sensor Placement (UNSY 605, 3.4)


     Where sensors are placed on an aircraft is as important a design consideration as which sensors to install. Incorrectly placed sensors can provide erroneous or misleading data to avionics or cockpit displays. Valid data streams are of critical importance on Unmanned Aerial Systems (UAS), since they provide the sole source of information for remote pilot or autopilot decision-making. In the following examples, a vehicle is selected and its sensor placement analyzed for two UAS missions.

Aerial Photography UAS
     The first step in selecting a platform for a particular mission should be defining the requirements. The following characteristics, in descending order of importance, were used to select a UAS for aerial photography flying less than 400 feet Above Ground Level (AGL).
  • Camera stabilization and control
  • Resolution
  • Vehicle station-keeping
     The FreeFly ALTA was selected because it offers the best capability in those areas. Camera stabilization is maintained by the MoVI 3-axis gimbal system, and with the optional MIMIC controller a second operator can independently control the pan, tilt, and yaw of the main sensor (FreeFly, 2015). FreeFly has taken advantage of the adaptive autopilot controllers now available in high-end sUAS by allowing the 3-axis gimbal to be positioned below or above the vehicle. This adds an entirely new perspective, which allows filmmakers to be more creative and provides a new angle for structural inspections. High-resolution stills or video can be captured by a wide range of professional digital cameras thanks to the ALTA’s 15-pound payload capacity and modular mounting system; sensors such as the Red Epic Dragon can generate 19-megapixel images at 100 frames per second (fps) (Red Inc., 2015). High-frequency vibration isolators are installed at the sensor gimbal/airframe interface to eliminate interference from the six rotors (FreeFly, 2015). This is crucial to maintaining smooth video when the sensor’s frame rate approaches a harmonic frequency of the propulsion system (a simple check of this condition is sketched below). The autopilot draws on avionics-grade rate sensors, which combine with the hexacopter configuration to provide extremely precise control. Adding the MIMIC controller would further increase station-keeping ability by dividing piloting and sensor-slewing workload between two people (Lavars, 2015).
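To make the harmonic concern concrete, the short sketch below flags camera frame rates that sit close to a harmonic of the rotor blade-pass frequency, where residual vibration can beat or alias into the video. The rotor RPM, blade count, and margin are assumed values chosen for illustration, not ALTA specifications.

# Hedged sketch: flag frame rates near a blade-pass-frequency harmonic.

def blade_pass_hz(rpm: float, blades: int) -> float:
    """Blade-pass frequency in Hz for a rotor at the given RPM."""
    return rpm / 60.0 * blades

def aliasing_risk(frame_rate_hz: float, rpm: float, blades: int,
                  margin_hz: float = 2.0, harmonics: int = 4) -> bool:
    """True if any blade-pass harmonic falls within margin_hz of the frame rate."""
    bpf = blade_pass_hz(rpm, blades)
    return any(abs(frame_rate_hz - n * bpf) < margin_hz
               for n in range(1, harmonics + 1))

print(aliasing_risk(100.0, rpm=3000.0, blades=2))  # blade-pass 100 Hz -> True
print(aliasing_risk(100.0, rpm=4200.0, blades=2))  # blade-pass 140 Hz -> False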
Racing UAS
     Assuming all aircraft were similar in racing capability, the following sensor-related characteristics were used to select a UAS capable of competing on a First Person View (FPV) racing circuit.
  • Field of view
  • Refresh rate/link latency
  • Picture and telemetry presentation
     The STORM Racing Drone offers a good compromise among the items above in an affordable package (HeliPal, n.d.). The main charge-coupled device (CCD) sensor has a 110° field of view, which provides the pilot with sufficient obstacle awareness. The main sensor is also positioned relatively close to the center of gravity. Other models of racing UAS have the camera slung beneath the body (similar to those for aerial photography), which would likely exaggerate pitch and roll rates to the pilot due to the long moment arm. The STORM’s 3.2 GHz, 250 mW transmitter advertises a consistent video feed at a range of 1 mile, which user reviews appear to confirm. This vehicle also includes a visor-type display for the pilot; these are preferable to screens because they are not subject to sun glare, and depending on the model of visor, battery time remaining and signal strength can also be displayed.

References

FreeFly Systems. (2015). Redefining Movement. Retrieved from http://freeflysystems.com/products/2015/alta/

HeliPal.com. (n.d.). STORM Racing Drone (RTF / Type-A). Retrieved from http://www.helipal.com/storm-racing-drone-rtf-type-a.html

Lavars, N. (2015, April 15). High-end Freefly Alta drone flips aerial photography on its head. Gizmag. Retrieved from http://www.gizmag.com/freefly-alta-drone-photography-nab/37026/


Red Inc. (2015). Epic Dragon Tech Specs. Retrieved from http://www.red.com/products/epic-dragon#tech-specs

Saturday, June 13, 2015

Unmanned Underwater Vehicles in Search and Rescue (UNSY 605, 2.4)

          Search and Rescue (SAR) is a challenging mission that broadly covers locating and recovering distressed persons in all environments. Unmanned maritime systems currently have the potential to multiply the searching, communications, and networking capabilities of the rescue force. The highly dynamic nature of the actual rescue phase, especially in rough seas, urban/complex terrain, or inclement weather is beyond the capabilities of current or forecasted unmanned systems, so this research project will focus on a platform that will complement the search phase and support the rescue.

Bluefin-21 and the Search for Malaysian Airlines Flight MH370
          In April 2014, a Bluefin-21 Autonomous Underwater Vehicle (AUV) was employed to search a city-sized area of the Indian Ocean that potentially held the wreckage of MH370, a Boeing 777 with 239 people on board (Kaye, 2014). Built by Bluefin Robotics, the torpedo-sized AUV has a depth rating of 4,500 m and an endurance of 25 hours, making it ideal for wide-area searches (Bluefin, 2015). Another strength of the Bluefin-21 is its field-replaceable payload sections.
Proprioceptive Sensors
  • Inertial Navigation System (INS): The Bluefin’s INS drifts less than 0.1% of distance traveled per hour, which is critical when navigating underwater (Bluefin, 2015); a rough drift estimate is sketched after these sensor lists.
  • Global Positioning System (GPS): A GPS is installed to determine an initial fix and can be used to update the INS when the vehicle surfaces (Bluefin, 2015).
  • Ultra-Short Baseline (USBL) Tracking: USBL is a relatively short range tracking system that allows the Bluefin to home to the recovery vessel (AML, 2015).
  • Stress, Fault, and Leak Sensors: The Bluefin 21 is equipped with numerous internal fault sensors. For operating at extreme depths, stress and leak detectors alert the AUV of impending structural failure, triggering an emergency ascent (Bluefin, 2015).
Exteroceptive Sensors
  • Doppler Velocity Log (DVL): DVL enables the INS to be updated with high frequency water and bottom-referenced velocity, reducing drift. These instruments are limited to around 500m and not practical for deep water operations (TRD, 2013).
  • Sound Velocity Sensor (SVS): SVS increases the accuracy of instruments such as DVLs, echo sounders, or anything else that relies on the speed of sound in water for timing calculations (AML, 2015).
  • EdgeTech 2200-M 120/410 kHz Side Scan Sonar: The Bluefin-21 can be outfitted with numerous sensors. For the MH370 search, the EdgeTech 2200-M was chosen. This sonar has a depth rating of 6000m with a resolution of 25cm (EdgeTech, 2015), which is suitable for long range searches for large objects.
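As an illustration of why the drift specification and periodic GPS fixes matter, the sketch below estimates worst-case position uncertainty between surface fixes, interpreting the cited figure as 0.1% of distance traveled. The survey speed and surfacing interval are assumed values for illustration, not Bluefin-21 mission parameters.

# Hedged sketch: worst-case INS position drift accumulated between GPS fixes.

DRIFT_FRACTION = 0.001      # 0.1% of distance traveled (figure cited above)

def drift_m(speed_kts: float, hours_between_fixes: float) -> float:
    """Estimated position drift (meters) accumulated between GPS surface fixes."""
    distance_m = speed_kts * 1852.0 * hours_between_fixes
    return DRIFT_FRACTION * distance_m

print(f"{drift_m(3.0, 8.0):.0f} m drift after 8 h at 3 kt")    # ~44 m
print(f"{drift_m(3.0, 20.0):.0f} m drift after 20 h at 3 kt")  # ~111 m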
Answers to Key Research Questions
  • What is one modification you would make to the existing system to make it more successful in maritime search and rescue operations?
  • I would develop and install a sensor that could fulfill a function similar to look-up/shoot-up on a fighter radar. This would allow the unmanned system to remain in the relatively constant subsurface environment, avoid inclement weather, and conduct wide area scans for life rafts or floating wreckage.
  • How can maritime unmanned systems be used in conjunction with UAS to enhance their effectiveness?
  • The two systems could either fuse or cue their respective sensors using a datalink. For example, if an MQ-4 Triton identified a point of interest with one of its sensors, it could direct the UMS to shift sensors to that point, and vice versa. If a UUV were in use, it would periodically have to surface to join the network and receive updates.
  • What advantages do unmanned maritime systems have over their manned counterparts?
  • The most obvious advantage is that it keeps additional people from entering a potentially hazardous area. Especially in the case of a UUV, the vehicle can be made much smaller (or payload capacity increased) by eliminating a crew compartment and life support systems.
  • Are there sensor suites that are more effective on unmanned systems?
  • I would propose that the sensors themselves are not more effective, but that the unmanned system allows them to get closer to the target at lower cost, which improves their performance. For example, consider installing the same sonar imaging capability as the EdgeTech 2200 on a manned system. First, it could be a surface vessel, which would increase slant range and decrease resolution. Second, it could be installed on a submarine capable of Bluefin-21 depths; however, the vehicle would then have to be equipped with all the systems needed to support life at 14,000-foot depths.

References

Kaye, B. (2014, April 18). Drone Risks Damage at Record Depth in Search for Malaysian Plane. Reuters Business. Retrieved from http://www.reuters.com/article/2014/04/18/us-malaysia-airlines-idUSBREA3A06W20140418

Bluefin Robotics. (2015). Bluefin-21 Summary. Retrieved from http://www.bluefinrobotics.com/products/bluefin-21/

AML Oceanographic. (2015). USBL / SBL / LBL (Acoustic Positioning). Retrieved from http://www.amloceanographic.com/Technical-Demo/USBL-SBL-LBL_2

Teledyne Technologies Inc. (2013). Workhorse Navigator Doppler Velocity Log. Retrieved from http://www.rdinstruments.com/navigator.aspx


EdgeTech. (2015). EdgeTech 2200 Modular Sonar System. Retrieved from http://www.str-subsea.com/sales/edgetech-2200-modular-sonar-system

Saturday, June 6, 2015

Autonomous Ground Navigation (UNSY 605, 1.5)

        For this introductory post I reviewed a paper on autonomous navigation using augmented visual localization. In this case, the proprioceptive sensor was an odometer and the exteroceptive sensor was an electro-optical camera. The authors were tasked with designing localization algorithms and control laws for a driverless taxi in an urban environment, which is challenging for two main reasons. First, the “urban canyon” effect reduces Global Positioning System (GPS) accuracy by limiting the number of satellites in view. Second, the complex street layout requires high-frequency position updates, which GPS does not provide.

Visual localization can be the primary navigation solution when the user has high-fidelity images of the operational area. Reference frames are pre-generated, uploaded to vehicles, and then compared to frames from an onboard camera. The largest source of error stems from the use of global scaling factors, which are part of the image comparison algorithms and basically state that, for a given commanded speed, prominent references in the field of view should grow larger at a certain rate. The authors proposed a local scaling factor determined by an odometer, which provides the distance traveled along the actual trajectory. An extended Kalman filter was also used for command generation after comparing estimated and actual location. A benefit of the odometer-and-filter addition is that the system can still estimate location if the camera becomes temporarily obscured or washed out by the sun, although with some drift (a simplified, one-dimensional sketch of this predict/update structure is shown below).
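The sketch below is a minimal, one-dimensional illustration of the fusion idea: odometer-based dead reckoning propagates a position estimate, and a Kalman update corrects it whenever a visual localization fix is available, simply coasting when the camera is washed out. All noise values are assumed, and this is not the authors' multi-dimensional extended Kalman filter implementation, only the same predict/update structure.

# Hedged 1-D sketch of odometry dead reckoning corrected by visual fixes.

Q_ODOM = 0.05   # assumed odometry noise variance added per step (m^2)
R_VIS = 0.50    # assumed visual-fix measurement noise variance (m^2)

def predict(x, p, odom_delta):
    """Propagate the position estimate with the odometer-measured displacement."""
    return x + odom_delta, p + Q_ODOM

def update(x, p, visual_fix):
    """Correct the estimate with a visual localization fix, if one is available."""
    k = p / (p + R_VIS)                 # Kalman gain
    return x + k * (visual_fix - x), (1.0 - k) * p

x, p = 0.0, 1.0
for step in range(10):
    x, p = predict(x, p, odom_delta=1.0)     # nominal 1 m of travel per step
    if step % 3 == 2:                        # visual fix available only every third step
        x, p = update(x, p, visual_fix=step + 1.0)
    print(f"step {step}: x = {x:.2f} m, variance = {p:.3f}")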

The paper provided experimental results that showed good correlation between expected and actual trajectories, and predicted that this kind of approach would be versatile in that many types of additional sensor data could be used to augment visual localization. Future work was aimed at increasing accuracy at night with headlights and employing a second aft-facing camera to combat prolonged washout when traveling into a low sun. I would propose adjusting the system to work with a self-calibrating infrared imaging system to eliminate the lighting issues, and an edge detection algorithm that would correlate reference frames regardless of whether the camera was in a black or white hot mode.

Reference

Karam, N., Hadj-Abdelkader, H., Deymier, C., & Ramadasan, D. (2010, October). Improved visual localization and navigation using proprioceptive sensors. Paper presented at the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan.