Weather Avoidance Sensor Systems for Unmanned Aerial Systems
Geoffrey T. Barnes
Embry-Riddle Aeronautical University

Summary
Weather situation awareness has been a causal factor in Unmanned Aerial System (UAS) accidents and incidents; however, real-time weather sensing, perception, and processing are severely lacking in current flight software and Control Stations (CS). While UAS flown primarily via line-of-sight data links can avoid adverse weather seen by the operator and/or visual observers, UAS that fly beyond visual line-of-sight (BVLOS) may enter areas with little or no weather reporting available. In the absence of an effective onboard, real-time weather sensing system, the pilot may lose weather situation awareness, exposing the vehicle to numerous meteorological hazards, including airframe icing, wind shear, lightning strikes, and hail. Weather radar and lightning detectors, currently fielded on manned aircraft, were reviewed for UAS applicability, and an optical-flow image processing technique is proposed. Size, weight, and data link bandwidth demands were investigated through a literature review to determine a low-cost solution requiring minimal implementation effort. Image processing is recommended because it can be applied to the full range of UAS sizes and, in some cases, can utilize hardware already in place for operating the vehicle.

Problem Statement
Possibly the most significant meteorological threat to UAS is airframe icing, in which super-cooled water droplets or precipitation freeze and accumulate on the vehicle (USAF, 1997). Many UAS are designed for long endurance, typically featuring lightly loaded, high-aspect-ratio wings and low power-to-weight ratios. These characteristics foster ice accretion and make recovery from ice-induced low-lift, high-drag conditions difficult. Icing can be encountered whenever visible moisture is present (clouds, fog, rain), typically when the ambient temperature is between +10 °C and −10 °C.
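The icing criteria above reduce to a simple onboard check. As a minimal sketch (the function name and inputs are illustrative, not from any fielded system), a flight computer could flag icing risk whenever visible moisture is detected with the outside air temperature in the +10 °C to −10 °C band:

```python
def icing_risk(oat_c: float, visible_moisture: bool) -> bool:
    """Flag potential airframe icing per the AFH 11-203 rule of thumb:
    visible moisture (cloud, fog, rain) with ambient temperature
    between +10 deg C and -10 deg C."""
    return visible_moisture and -10.0 <= oat_c <= 10.0
```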


Thunderstorms are a significant threat, especially to light aircraft and UAS (USAF, 1997). They can occur any time there is moist, unstable air and a source of lifting such as mountains or converging air masses. Thunderstorms contain updrafts and downdrafts approaching 5,000 feet per minute, which can exceed vehicle performance or structural limits as the aircraft attempts to maintain control. Severe icing can be encountered, as well as hail, which can be ejected as far as 20 NM from the storm. Lightning is especially hazardous, with the ability to cause physical damage, ignite fuel, overload electrical systems, and damage avionics. The U.S. Air Force Weather for Aircrew Handbook (1997) recommends laterally avoiding severe thunderstorms by 20 NM. A study conducted by Douglas Pearson of the National Weather Service in 2002 identified two leading causal factors for fatal manned-aircraft mishaps due to low visibility and ceilings. First, non-instrument-rated pilots continuing into Instrument Meteorological Conditions (IMC) often led to loss of control from spatial disorientation or Controlled Flight Into Terrain (CFIT). Second, poor decision-making upon reaching Decision Altitude/Minimum Descent Altitude on instrument approaches resulted in CFIT or loss of control just prior to or after touchdown. This is an area where highly automated UAS may improve safety by executing autopilot approaches with a suite of sensors such as laser/radar altimeters, infrared cameras, and specialized homing equipment. Although sufficient data do not yet exist to be certain, it can be expected that en-route or on-objective weather accidents will take the place of low-visibility CFIT for UAS. UAS Control Stations (CS) have been identified by numerous studies (Tvaryanas, Thompson, & Constable, 2005; Thompson, 2005; Williams, 2004) as lacking proper supply or presentation of flight-critical information.
While human factors have become a special interest item in CS design, weather situation awareness has yet to be addressed on a large scale.

Significance of the Problem
Meteorological phenomena present a significant hazard to air vehicles. The highly dynamic nature of weather makes tactical forecasting and decision-making difficult. In the case of manned aircraft, all pilots are provided basic training in weather forecasting and observation. Nearly 30% of fatal aviation accidents over a five-year period were the result of weather, which highlights the complex problem of anticipating and avoiding unsafe conditions (Pearson, 2002). Of these fatal accidents, 63% were caused by low visibility (likely leading to CFIT), 18% by convective or non-convective precipitation, and another 18% by turbulence and wind shear. Unmanned Aerial Systems are often advertised as capable of increasing safety; however, a satisfactory reduction in mishap rates will not be achieved until weather avoidance sensing systems and procedures are developed. A review of Department of Defense UAS accidents (Williams, 2004) specifically identified 9% of U.S. Army Pioneer losses as directly caused by weather. Williams was unable to access more detailed data for other service branches and UAS; however, he concluded that human factors caused at least a third of all UAS accidents that resulted in total loss of the vehicle. The most troubled UAS was the MQ-1 Predator, with a 67% human-factors rate. Of those accidents, 67% were the result of “Unsafe Acts,” a category comprising “skill-based errors, decision errors, perceptual errors, and violations.” Lack of weather situation awareness can precipitate any of those four categories. For example, poor weather situation awareness could result in the crew incorrectly deciding on a route that flies the UAS through a thunderstorm, after which it is destroyed by wind shear and lightning strikes. A second example is severe airframe ice accumulation, caused by the crew’s failure to perceive moisture-laden clouds building in their airspace, resulting in an uncontrolled descent and possible crash.
This paper investigates the sensing portion of the problem.

Alternative Actions
Two systems currently fielded on manned aircraft, airborne weather radar and lightning detectors, provide a foundation for developing a UAS weather sense-and-avoid capability. Recognizing that these systems may not suit the wide range of sizes and shapes of unmanned aircraft, small electro-optical (EO) or infrared (IR) sensors combined with advanced image processing are investigated as well.

Weather Radar
Weather radar provides a wealth of information that allows the crew to avoid areas that are currently unsafe and anticipate areas that will become unsafe in the near future. Weather radar generates a display based on the amount of energy reflected by airborne precipitation. Returns are color-coded according to strength, so crews can identify both active storm cells and adjacent high-moisture areas that can develop into storms. Pulse-Doppler radars augment the basic functions by measuring the Doppler shift in reflected energy, which can be used for wind shear and turbulence detection. A typical airborne weather radar operates in X-band, has a 30° beam width, and a maximum unambiguous range of 300 NM (Melvin & Scheer, 2013). Unambiguous range is the maximum distance a radar’s transmission can travel to the target and echo back within the receiving period for a given cycle. While weather radars provide a high volume of useful information, integration is challenging. The smallest radar package available is the Bendix/King RDR 2000, which weighs only 10 lb but requires a 12 × 10 in volume and a 28 VDC / 3 A power supply (Bendix/King, 2015). These size, weight, and power requirements prevent installation on vehicles much smaller than the MQ-1C, which, with a 56 ft wingspan and 3,600 lb maximum takeoff weight, is currently the smallest UAS carrying comparable radar equipment (General Atomics, 2015).
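The relationship between pulse repetition frequency (PRF) and unambiguous range follows directly from that definition: the round-trip travel to the target (2R) must fit within one pulse repetition interval, so R = c / (2 · PRF). A short sketch (constant and function names are illustrative):

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum
M_PER_NM = 1852.0          # meters per nautical mile

def unambiguous_range_nm(prf_hz: float) -> float:
    """Maximum unambiguous range: the echo from the 2R round trip
    must arrive before the next pulse is transmitted."""
    return C_M_PER_S / (2.0 * prf_hz) / M_PER_NM

# The 300 NM unambiguous range quoted above corresponds to a PRF
# of roughly 270 Hz.
```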

Given the complex problem of forecasting, humans need to interpret the radar data, which means it must be passed over a data link to the CS, leaving less bandwidth available for the primary mission. The radar set must also be regularly manipulated as a closed-loop function of human interpretation of the data, such as adjusting gain or antenna tilt angle (Baur, 2012), making it difficult to integrate effectively with a fully autonomous vehicle.

Lightning Detectors
Lightning detectors for aircraft receive sferics, the electromagnetic emissions from lightning that propagate through the atmosphere. Cloud-to-ground lightning is characterized, within the radio spectrum, by a broadband impulse on the order of 1 kHz (Siingh et al., 2012). Beyond a distance of approximately 10 miles, where sound and luminosity can no longer be sensed, sferics become the only evidence of lightning discharges. Aircraft lightning detectors receive these impulses and provide an indication to the pilot, usually consisting of range and bearing to the strike. Successive strikes are recorded and displayed to build a rough picture of active storm cells. While these devices provide less information than radar, they have the advantage of 360° coverage. They are well suited to light, general-aviation-size aircraft due to their small installation volume and weight, low power requirements, and an order-of-magnitude lower cost than radar. Models such as the Avidyne TWX670 provide a full-color display of trend data to a range of 200 NM (Avidyne, 2015).

The lightning detector market has steadily declined in recent years in favor of data links such as the XM WX satellite weather service, which provides Next Generation Radar (NEXRAD) imagery and lightning strikes, or the subscription-free weather information provided via Flight Information Service-Broadcast (FIS-B) (Anglisano, 2012) to aircraft with a Universal Access Terminal (UAT). UAT is a civil aviation data link architecture operating on 978 MHz, established under the Federal Aviation Administration’s Next Generation Air Transportation System modernization initiative. While data links may provide significantly more capability (weather, traffic, onboard internet), they rely on a source of high-fidelity data such as NEXRAD, which is only available in the U.S. In this respect, lightning detectors maintain an advantage as a self-contained onboard source of real-time data. The simplicity of the output data (only range and bearing) would require very little data link bandwidth to generate a display in the CS. Additionally, integration with an autonomous system would be relatively easy: logic can be added that keeps the vehicle a preprogrammed distance from areas with strikes.
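The avoidance logic just described can be sketched directly from the detector's range/bearing output. Assuming strikes are reported as (range, bearing) pairs relative to the aircraft (the function name is illustrative; the 20 NM default mirrors the AFH 11-203 standoff recommendation):

```python
import math

def flag_route_conflicts(route_points, strikes, standoff_nm=20.0):
    """Return route points (x east, y north, in NM relative to the
    aircraft) lying within the standoff distance of any recorded
    lightning strike, given strikes as (range_nm, bearing_deg)."""
    # Convert each polar strike report to Cartesian coordinates.
    strike_xy = [(r * math.sin(math.radians(b)), r * math.cos(math.radians(b)))
                 for r, b in strikes]
    return [(px, py) for px, py in route_points
            if any(math.hypot(px - sx, py - sy) < standoff_nm
                   for sx, sy in strike_xy)]
```

An autopilot or CS display could then reroute around, or simply highlight, the flagged points.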

Image Processing
Significant work has been accomplished using EO/IR sensors and image processing algorithms to give UAS a sense-and-avoid capability. These passive systems are still in the development stage due to the challenge of reliably detecting pixel- or sub-pixel-sized objects across a wide range of lighting and contrast scenarios (Delves, 2012); however, the technology is robust enough for cloud detection. As discussed earlier, cloud development is a key indicator of atmospheric stability and can help build a local near-term forecast. Keeping the vehicle clear of clouds significantly reduces the threat of airframe icing and assists in maintaining line-of-sight to a visual mission objective. The first step is capturing an image with a fixed, forward-facing camera, like those commonly found on medium to large UAS for takeoff and landing, or with a dedicated weather camera. This capability can be integrated even with small UAS (sUAS) operating on a constrained power, processing, and weight budget. Work by Vidya Murali at Clemson University (2011) demonstrates that machine vision can operate on images as small as 32 × 24 pixels within a 30° field of view (FOV) and still provide sufficient data for basic obstacle avoidance. This is easily accomplished by miniature camera modules such as those offered for the Raspberry Pi, with 640 × 480 video resolution in an 8.5 × 11.3 mm package requiring only a 5 V supply (Adafruit, 2015). Down-sampling the image toward the 32 × 24 threshold requires less processing power, reducing the computing requirements for the sUAS. With the image captured, the second step is to reduce jitter, the high-frequency changes in the scene caused by aircraft or sensor movement. To reduce onboard processor requirements, a method such as Image Projection Correlation (IPC) should be used. IPC sums gray-scale pixel values across each column and row of the focal array, and then compares the cross-correlation peak between adjacent frames to determine displacement (Delves, 2012).
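IPC as described can be implemented in a few lines. The sketch below (assuming NumPy and circular shifts for simplicity; real imagery would need border handling) collapses each frame into row and column projections and takes the cross-correlation peak as the inter-frame displacement:

```python
import numpy as np

def projection_shift(frame_a, frame_b, max_shift=8):
    """Image Projection Correlation: estimate the (dx, dy) that
    realigns frame_b with frame_a. Each gray-scale frame is collapsed
    into column-sum and row-sum profiles; the cross-correlation peak
    between adjacent frames' profiles gives the displacement."""
    def best_shift(p, q):
        p = p - p.mean()  # remove DC offset so the peak tracks scene structure
        q = q - q.mean()
        shifts = range(-max_shift, max_shift + 1)
        scores = [np.sum(p * np.roll(q, s)) for s in shifts]
        return shifts[int(np.argmax(scores))]

    dx = best_shift(frame_a.sum(axis=0), frame_b.sum(axis=0))  # column sums
    dy = best_shift(frame_a.sum(axis=1), frame_b.sum(axis=1))  # row sums
    return dx, dy
```

Rolling frame_b by dx columns and dy rows then re-aligns it with frame_a, which is the stabilization the later optical-flow stage depends on.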

With a stable stream of image frames, the third step is to process them and determine whether any clouds are present. Numerous video tracking algorithms are available, varying in the quantity of preprocessing and computational steps (Davies, 2012). Preprocessing filters out data that is not useful to the main function, which reduces the amount of data the higher-order algorithms need to process. In the case of edge detection, only the intensity of each pixel is of interest, so a gray-scale filter can discard color information. Convolution masks tailored to the particular application (blur, edge detection, contrast, etc.) are then applied, so that the least amount of data is forwarded for tracking or change detection. In the interest of reducing computations for sUAS, Differential Gradient (DG) edge detection should be used, which evaluates the gradient (first derivative) of pixel intensity in each row (x) and column (y), and then calculates the local edge value as the root-mean-square of the x and y gradients. Computational complexity can be reduced further by instead summing the absolute values of the x and y gradients, which Abdou and Pratt (1979) demonstrated is sufficient for most edge detection requirements. The resulting line image can then be passed to a higher-level processing stage, which employs the optical flow technique. The local intensity, as a function of x and y location in the image and time, is approximated as a first-order Taylor series expansion; a flow field is estimated by comparing successive frames to determine the differential values (Davies, 2012). A common problem with optical flow is that velocities appear to vanish when an object’s edges are parallel to the velocity. This is initially mitigated by the irregular shape of clouds, and further aided by bounding areas of similar velocity and averaging.
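The DG edge detector with the Abdou and Pratt absolute-value approximation is compact enough to show directly. A minimal NumPy sketch (the threshold value and function name are illustrative):

```python
import numpy as np

def dg_edge_map(gray, threshold=0.2):
    """Differential Gradient edge detection: central-difference first
    derivatives of pixel intensity along x (rows) and y (columns),
    with |gx| + |gy| substituted for the root-mean-square magnitude
    per Abdou and Pratt (1979)."""
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = (gray[:, 2:] - gray[:, :-2]) / 2.0  # d(intensity)/dx
    gy[1:-1, :] = (gray[2:, :] - gray[:-2, :]) / 2.0  # d(intensity)/dy
    return (np.abs(gx) + np.abs(gy)) > threshold
```

The resulting boolean line image is what would be handed to the optical-flow stage.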
The technique provides velocities relative to a fixed camera, which is enough to determine whether obstacles (in this case clouds) are present. Absolute velocities can be determined by receiving aircraft state parameters from the navigation system, and can be integrated to estimate cloud location. While on station, the UAS will likely be performing some type of holding maneuver that results in a 360° heading change, whether a circular orbit over a point of interest or a raster mapping pattern over a wide area. Throughout the 360° sweep, cloud position, velocity, and size trend data could be gathered and downlinked to the CS for display. This allows more efficient vehicle positioning when attempting to maintain visual custody of the mission objective. The image processing technique works with both EO and IR sensor inputs, and can utilize the existing fixed nose cameras on many large UAS. Image processing could be accomplished using compact processing modules such as the Sundance EVP6472-941, which is geared toward high-intensity processing and currently used in signals intelligence and communications analysis. It consists of two multicore Digital Signal Processors (DSPs), which could simultaneously accept EO and IR sources. The main Xilinx Virtex-5 Field Programmable Gate Array (FPGA) would run the cloud detection algorithms and output commands using built-in Gigabit Ethernet or RS-232 serial communication. The Sundance EVP6472-941 package occupies less than 24 in³, weighs less than 1 lb, and costs approximately $6,000 (Holland, 2010). The image processing weather avoidance system output could easily be adjusted to accommodate return-link bandwidth limitations or operator requirements, varying between simple range/bearing to potential storm cells and cloud-coverage overlays for moving maps.
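Converting a detected cloud's image position into the range/bearing style of output described above is straightforward for the azimuth component. A sketch under the 32-pixel, 30° FOV assumptions from Murali's work (the linear small-angle mapping and function name are illustrative):

```python
def pixel_to_bearing(col, image_width=32, fov_deg=30.0, heading_deg=0.0):
    """Map a pixel column to an absolute bearing in degrees: a linear
    small-angle mapping from pixel center across the horizontal FOV,
    offset from the aircraft heading."""
    offset = (col + 0.5) / image_width - 0.5  # -0.5 .. +0.5 across the FOV
    return (heading_deg + offset * fov_deg) % 360.0
```

Repeating this over a 360° orbit, with the heading fed in from the navigation system, would build the cloud bearing picture for the CS downlink.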

Recommendations
A fusion technique for weather avoidance is recommended, combining the strengths of lightning detectors and image processing. Both technologies can be integrated with a wide size range of vehicles due to their relatively small installed volume and weight. Their concise outputs would also reduce interface complexity when used with a fully autonomous system, and could be combined with geo-fencing to enable the vehicle to avoid dangerous meteorological events. In an example scenario, a UAS is on station in an area prone to afternoon thunderstorms. As the day progresses, locations of growing cloud coverage are downlinked to the CS, allowing the crew or autopilot to more efficiently move the vehicle to remain in visual contact with the mission objective. The vehicle has already noted an ambient temperature within the range conducive to airframe icing, and downlinks a potential-storm-activity warning with an approximate (sub-cardinal) direction. For a fully autonomous system, this warning might prompt a person managing a fleet of UAS to research weather forecasts, or at least monitor this particular vehicle more closely. With this added weather situation awareness, the crew or autopilot is able to maneuver to an area that allows mission continuation without endangering the vehicle. In this situation, the crew directs the UAS to an area upwind of the storm to avoid hail or ice-inducing precipitation that will likely be found under the “anvil,” while still keeping the vehicle and the line-of-sight to the objective clear of clouds. An autonomous vehicle could be programmed with logic that makes a similar decision using the wind estimates typically generated by coupled inertial/GPS navigation systems. Onboard lightning sensing equipment would trigger a final indication of storm maturity, causing the crew to terminate the mission or hold at a safe distance until the storm dissipates.
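The upwind-positioning decision in this scenario could be automated with the coupled inertial/GPS wind estimate. A minimal sketch (meteorological convention: wind direction is the direction the wind blows from; the function name and 20 NM standoff are illustrative):

```python
import math

def upwind_hold_point(storm_x_nm, storm_y_nm, wind_from_deg, standoff_nm=20.0):
    """Place a holding point a standoff distance upwind of a storm
    cell, keeping the vehicle clear of hail and precipitation carried
    downwind under the anvil. Coordinates: x east, y north, in NM."""
    rad = math.radians(wind_from_deg)
    ux, uy = math.sin(rad), math.cos(rad)  # unit vector pointing into the wind
    return storm_x_nm + standoff_nm * ux, storm_y_nm + standoff_nm * uy
```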
Overall, this selection of sensors and fused approach has strong potential to reduce weather-related UAS mishaps with readily available technology, and should be pursued. Additionally, UAS operator certification programs should incorporate basic meteorology education similar to that of their manned counterparts, so that the weather system data can be correctly interpreted.

References
Abdou, I. E., & Pratt, W. K. (1979). Quantitative design and evaluation of enhancement/thresholding edge detectors. Proceedings of the IEEE, 67(5), 753-763.

Adafruit. (2015). Spy Camera for Raspberry Pi. Retrieved from http://www.adafruit.com/products/1937

Anglisano, L. (2012, April). Lightning detectors: Still worth having. Consumer Aviation. Retrieved from http://connection.ebscohost.com/c/articles/78149092/lightning-detectors-still-worth-having

Avidyne Corporation. (2015). TWX670 Tactical Weather Detection System. Retrieved from http://www.avidyne.com/products/twx670/index.asp

Baur, C. (2012, April 1). Weather Radar: Navigating the Storm. Rotor & Wing. Retrieved from http://www.aviationtoday.com/rw/commercial/ems/76076.html#.VZbYamCj7V1

Davies, E. R. (2012). Computer and machine vision: Theory, algorithms, practicalities (4th ed.). Waltham, MA: Academic Press. Retrieved from http://www.sciencedirect.com

Delves, P. (2012). Sense and Avoid in UAS : Research and Applications (2nd Edition). Hoboken, NJ, USA: John Wiley & Sons. Retrieved from http://www.ebrary.com

General Atomics. (2015). MQ-1C Gray Eagle: Armed Persistence. Retrieved from http://www.gaasi.com/Websites/gaasi/images/products/aircraft_systems/pdf/Gray_Eagle021915.pdf

Holland, C. (2010, August 17). Multicore developer platforms uses TI DSPs and Xilinx FPGA. Embedded. Retrieved from http://www.embedded.com/electronics-products/electronic-product-reviews/embedded-tools/4206229/-Multicore-developer-platforms-uses-TI-DSPs-and-Xilinx-FPGA

Melvin, W. L., & Scheer, J. A. (2013). Principles of modern radar: Radar applications. Raleigh, NC: Institution of Engineering and Technology.

Murali, V. N. (2011). Low-resolution vision for autonomous mobile robots (Order No. 3469540). Available from ProQuest Dissertations & Theses Global. (893426415). Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/docview/893426415?accountid=27203

Pearson, D. (2002). VFR flight not recommended: A study of weather-related fatal aviation accidents. NOAA/NWS Technical Attachment, Southern Region Headquarters, Dallas, TX, TA, 18.

Schetzen, M. (2006). Airborne doppler radar: Applications, theory, and philosophy. Reston, Va: American Institute of Aeronautics and Astronautics.

Siingh, D., Singh, R. P., Singh, A. K., Kumar, S., Kulkarni, M. N., & Singh, A. K. (2012). Discharges in the stratosphere and mesosphere. Space Science Reviews, 169(1-4), 73.

Thompson, W. (2005). U.S. Military Unmanned Aerial Vehicle Mishaps: Assessment of the Role of Human Factors Analysis and Classification System (HWS-PE-BR-TR-2005-0001). Brooks City Base, TX: 311th Human Systems Wing.

Tvaryanas, A. P., Thompson, W. T., & Constable, S. H. (2005). The U.S. military unmanned aerial vehicle (UAV) experience: Evidence-based human systems integration lessons learned. In Strategies to Maintain Combat Readiness during Extended Deployments – A Human Systems Approach (pp. 5-1 – 5-24). Meeting Proceedings RTO-MP-HFM-124, Paper 5. Neuilly-sur-Seine, France: RTO.

United States Air Force (USAF). (1997). AFH 11-203 Volume 1: Weather for Aircrew. Wright-Patterson AFB, OH: United States Air Force.

Williams, K. W. (2004). A summary of unmanned aircraft accident/incident data: Human factors implications. Washington, DC: Office of Aerospace Medicine, Federal Aviation Administration.