Saturday, July 18, 2015

Sense and Avoid Sensor Selection (UNSY 605, 7.4)

Sense and Avoid Sensor Selection

Developing sense-and-avoid (SAA) capabilities is challenging for small unmanned aircraft systems (sUAS). Typical onboard systems for detecting intruding aircraft include non-cooperative sensors such as radar and cooperative systems like the Traffic Collision Avoidance System (TCAS). These options are currently unsuitable for sUAS due to their relatively large size, weight, power consumption, and cost. The sUAS SAA trade-space does offer benefits, however: short ranges and low speeds enable vision-based systems built from low-cost electro-optical (EO) or infrared (IR) sensors.

Sensor Recommendation
The proposed sensor system would be based on research into the optical flow sensed by insects and used for navigation (Barnhart, Shappee, & Marshall, 2011). A house fly, for example, uses thousands of simple receptor cells that detect changes in contrast. Those signals are sent to a section of the brain containing elementary motion detectors, which build a relative motion map within the fly’s field of view.

This system was recreated in a lab setup by Ruffier and Franceschini (2008) using a tethered remote-controlled helicopter testbed (referred to as OCTAVE2), with the intent of avoiding terrain and close-in obstacles. The technique could also be adapted to provide an SAA capability between non-cooperating vehicles operating in the sUAS realm (low speed, less than 400 ft above ground level). The basic sensing element consisted of a pair of P-type, intrinsic semiconductor, N-type (PIN) diodes, which conduct current upon reaching a certain threshold of received light (First Sensor AG, 2015). The PIN diodes were installed behind a 5 mm diameter lens and mounted to a 400 um-thick circuit board. The complete sensing element measures 2.7 x 3.0 cm and weighs 4.3 g. While a power requirement for this particular arrangement was not provided for the OCTAVE2, PIN diodes are available in a variety of supply voltages that would be compatible with sUAS (3.3-9 V). Signals were routed to a microprocessor for optical flow analysis, which generated commands for the autopilot.

The system was found suitable for directing the OCTAVE2 over steep terrain to a safe landing with only two “eyes,” and could be further enhanced with a larger number of smaller sensing elements, increasing resolution and field of regard. Depending on the mission of the sUAS, the SAA system could also be a subsystem of the overall vehicle navigation system, rather than an addition, further reducing size, weight, and power requirements.
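The delay-and-correlate scheme behind such elementary motion detectors is commonly modeled as a Hassenstein-Reichardt correlator. The short Python sketch below is an illustration of that general model, not the OCTAVE2 implementation: it pairs two photoreceptor signals (as a pair of PIN photodiodes would produce), delays each channel, and correlates it with the undelayed opposite channel so that the sign of the output indicates the direction of perceived motion.

```python
import numpy as np

def reichardt_emd(left, right, delay=1):
    """Hassenstein-Reichardt elementary motion detector (illustrative sketch).

    left, right: 1-D arrays of contrast signals from two adjacent
    photoreceptors (e.g., a pair of PIN photodiodes behind a lens).
    delay: temporal delay in samples applied to each channel.

    Returns a signed output whose sign indicates perceived motion
    direction (positive = left-to-right) and whose magnitude grows
    with the strength of the correlated motion signal.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    # Delay one channel, correlate with the undelayed opposite channel,
    # then subtract the mirror-image correlation to get a signed output.
    d_left = np.concatenate([np.zeros(delay), left[:-delay]])
    d_right = np.concatenate([np.zeros(delay), right[:-delay]])
    return d_left * right - d_right * left

# A contrast edge sweeping left-to-right across the two receptors:
t = np.arange(50)
left_sig = (t >= 10).astype(float)   # edge reaches left receptor first
right_sig = (t >= 13).astype(float)  # ...then the right receptor
out = reichardt_emd(left_sig, right_sig, delay=2)
print(out.sum())  # net positive -> left-to-right motion detected
```

Tiling many such detector pairs across the field of view yields the relative motion map described above; regions of strong optical flow correspond to nearby terrain or obstacles.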
This SAA technique has the strong advantages of being lightweight and small, requiring little power, and needing no additional sensing input (i.e., GPS, INS, pitot-static) to guide the vehicle. One disadvantage is its relatively short range, which reduces the time available to alter the flight path. This will likely not be a problem for highly maneuverable quad-copters, but may make the insect-inspired system less desirable for fixed-wing sUAS.
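To make the short-range disadvantage concrete, a rough time-to-react estimate is simply detection range divided by closure speed. The numbers below are hypothetical, chosen only to illustrate the scale of the problem; they are not figures from the cited research.

```python
def time_to_react(detection_range_m, closure_speed_mps):
    """Seconds available to alter the flight path after detection."""
    return detection_range_m / closure_speed_mps

# Two sUAS closing head-on at 15 m/s each (30 m/s total closure) with a
# hypothetical 30 m vision-based detection range leave only ~1 second
# to maneuver -- manageable for an agile quad-copter, but tight for a
# fixed-wing aircraft with a wider turn radius.
print(time_to_react(30.0, 30.0))  # 1.0
```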

References

Barnhart, R. K., Shappee, E., & Marshall, D. M. (2011). Introduction to Unmanned Aircraft Systems. London, GBR: CRC Press. Retrieved from http://www.ebrary.com

First Sensor AG. (2015). PIN photodiodes. Retrieved from http://www.first-sensor.com/en/products/optical-sensors/detectors/pin-photodiodes/

Ruffier, F., & Franceschini, N. (2008). Aerial robot piloted in steep relief by optic flow sensors. In 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 1266-1273). doi:10.1109/IROS.2008.4651089

Friday, July 10, 2015

Control Station Analysis (UNSY 605, 6.5)

Control Station Analysis

Unmanned systems require thoughtfully designed control stations that provide the operator with sufficient situation awareness and intuitive controls. Unlike a manned system, an unmanned system cannot rely on the human ability to freely sense the environment in a variety of ways, in all directions, and adapt to non-standard conditions. The following research activity analyzes the control station for an unmanned ground vehicle (UGV) and provides recommendations for improvement.

iRobot 510 PackBot
The iRobot 510 PackBot is a versatile UGV currently marketed to military, law enforcement, and disaster response customers. Its typical mission sets include building and route clearance, explosive ordnance disposal (EOD), and hazardous material detection and handling. The baseline vehicle has tracks and a modular payload frame that allows it to be quickly tailored to specific mission tasks or environments. The control station, referred to as the Operator Control Unit (OCU), is of ruggedized, all-weather construction and focused on portability (iRobot, 2012). It is capable of controlling the vehicle over a thin fiber-optic tether or a radio link. The OCU features a 15.1 inch, 1024x768 pixel liquid crystal display (LCD) screen. Video from the electro-optical (EO) or infrared (IR) sensors takes up most of the display. An active 3-D model of the PackBot is provided to visualize the robot’s “pose.” Gauges are provided for critical parameters such as battery power, fiber remaining, and signal strength. Still frames can be captured from the video feed, annotated, and exported using standard networking interfaces, enabling rapid dissemination of intelligence. The UGV can be controlled using the keyboard and dual joysticks on the OCU, or a video-game-style controller (iRobot, 2012).

Figure 1. 510 PackBot with control station. Adapted from iRobot 510 PackBot Multi-Mission Robot and Army-Guide.com. Copyright 2015 by iRobot and 2008 by ATEN.

An improved PackBot interface has been fielded by the Army Research Laboratory, which increases OCU portability so that it is more compatible with fast-moving infantry units as part of the Future Combat Systems initiative. The improved OCU condenses the control station into a chest-mounted tablet computer, again with dual joysticks or a hand controller (Pomranky, Kovach, & Winslow, 2012). To provide more screen area for video display, the 3-D model has been removed; however, the critical parameter gauges remain.

Recommendation
Field testing of the improved OCU revealed mainly ergonomic problems with the display. For example, the tablet was difficult to hold while using a hand controller. The screen was also found to be unusable in bright sunlight due to glare, and at night because the screen glow gave away the troops’ position. I would recommend integrating a display similar to L-3 Communications’ Tactical ROVER system (2013), which uses a clip-on helmet-mounted unit to display aircraft targeting pod video to ground parties. The display could be augmented with small critical-parameter tape gauges and an artificial horizon to give the operator relative angle data. Bluetooth camera and vehicle controls could be installed on an M-16 hand guard, enabling essential vehicle control inputs while the operator still holds a weapon.

References

iRobot. (2012). iRobot 510 PackBot specifications. Retrieved from http://www.irobotweb.com/~/media/Files/Robots/Defense/PackBot/iRobot-510-PackBot-Specs.pdf?la=en

iRobot. (2012). iRobot PackBot accessories manual. Retrieved from http://www.manualslib.com/products/Irobot-Packbot-2938203.html

Pomranky, R. A., Kovach, J. B., & Winslow, C. H. (2012). General overview of the use of SUGV and centralized controllers type I and II during command, control, communications, computers, intelligence, surveillance, and reconnaissance on the move, event 08 (C4ISR OTM E08). Army Research Laboratory, Human Research and Engineering Directorate. Retrieved from http://www.dtic.mil/docs/citations/ADA512533


L-3 Communications. (2013). Tactical ROVER-P Data Sheet. Retrieved from http://www2.l-3com.com/csw/ProductsAndServices/DataSheets/Tactical_ROVER-P_Sales-Sheet.pdf