Saturday, May 7, 2016

eBumper Obstacle Avoidance and Precision Operation



Introduction

Panoptes is Greek for "all-seeing." The company was founded in 2014 as a spin-off of Aurora Flight Sciences and is dedicated to providing sense-and-avoid systems for small unmanned vehicles. The eBumper is a commercially available, sonar-based system for small unmanned vehicles intended for slow speeds and indoor maneuvering. Bumping into objects is one of the major difficulties of such flight, and the eBumper helps the operator avoid obstacles; it also protects against inaccurate control inputs and light wind. In unrestricted spaces, flight with the eBumper requires no adjustment. The operator can switch from manual to precision mode, which moderates control inputs for more accurate flight (Panoptes, n.d.). The eBumper constantly checks its surroundings for objects in the flight path; if one is sensed, it reacts to lower the possibility of an accident, and as the vehicle moves away from the object, control is returned to the operator.

Mode of Operation

Four sonar sensors provide the aircraft with data about its surroundings to the front, left, right, and top. Once the eBumper detects an object in range, it stops the vehicle from flying closer, holding it at a cautious distance. The eBumper also works at higher speeds, where performance mode is focused on fast flight; the eBumper executes all stopping action, and the maximum protected speed is about 100 inches per second (Panoptes, n.d.). From the remote control, the operator can select between two modes: precision and performance. Precision mode keeps the vehicle at safer distances and controls how far away it begins a collision-avoidance maneuver. Performance mode suits higher speeds in open spaces, while precision mode suits low speeds in cluttered surroundings. At the press of a button, the vehicle takes off automatically and hovers six feet off the ground (Panoptes, n.d.); the top sensor allows the system to control the vehicle during liftoff. Adjusting the vehicle's sensitivity to operator inputs gives more accurate control of position and speed, which is convenient for indoor flight in constricted areas. When objects are near on the right and left, the self-centering function positions the vehicle between them, equalizing the distance to the obstacle on each side.
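A minimal sketch can make the override behavior described above concrete. Everything here, including the function names, the 5 ft stop distance, and the command dictionary, is an illustrative assumption rather than Panoptes' actual firmware interface; only the 15 ft detection range comes from the published specifications.

```python
# Sketch of eBumper-style override logic. Thresholds and the API shape are
# assumptions for illustration; only DETECTION_RANGE_FT is a published figure.

STOP_DISTANCE_FT = 5.0       # assumed "cautious range" for precision mode
DETECTION_RANGE_FT = 15.0    # from the published spec sheet

def avoidance_command(pilot_cmd, sonar_ft):
    """Return the command actually sent to the flight controller.

    pilot_cmd: dict of axis -> commanded velocity (positive = toward sensor)
    sonar_ft:  dict of axis -> measured range in feet
               ('forward', 'left', 'right', 'up')
    """
    out = dict(pilot_cmd)
    for axis, dist in sonar_ft.items():
        if dist <= STOP_DISTANCE_FT and out.get(axis, 0.0) > 0.0:
            # Obstacle inside the safety bubble: cancel motion toward it,
            # but leave motion on other axes under the operator's control.
            out[axis] = 0.0
    return out

cmd = avoidance_command({'forward': 1.0, 'left': 0.5},
                        {'forward': 3.0, 'left': 12.0,
                         'right': 14.0, 'up': 15.0})
# Forward motion is cancelled; the lateral input passes through unchanged.
```

As in the description above, control on the blocked axis is suppressed only while the obstacle is inside the safety range; all other inputs pass straight through to the operator.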

Characteristics and Specifications

eBumper characteristics are configured through a graphical user interface (GUI) offered for Windows PC and Android, so the system can be adapted to the user's specific requirements. The eBumper ships with four sonar sensors and an integrated circuit board pre-mounted in an ABS thermoplastic polymer case, along with installation hardware, a manual, and a lanyard for the remote control. The eBumper is designed as a modification kit that fastens to the unmanned aerial vehicle; it takes about 20 minutes to install. In the future, new sensor capabilities will be delivered as software upgrades rather than expensive hardware.

Beyond avoiding collisions, there is much more that can be done with information about the aircraft's environment. The tractor beam feature smoothly moves the vehicle toward or away from an item of interest, providing a true focal effect for filming. Relative positioning maintains a constant position with respect to an item of interest, whether it is static or moving. Obstacle avoidance for waypoint flying autonomously avoids obstacles while flying waypoints on an assignment. Training mode allows the operator to progressively increase control inputs while becoming more familiar with flying. Users can make the most of their eBumper by customizing its settings and writing apps for it: a software development kit (SDK) supports creating new applications, supplying files to access the information detected by the eBumper and offering an interface to hand instructions back to the vehicle (Panoptes, n.d.). Finally, unmanned aircraft have a propensity to lose their signal link, and a vehicle that drops its link can no longer sense and avoid consistently enough for safe operation; to support safe integration of these vehicles, onboard sensors must provide a last level of safety.
The eBumper has a minor effect on vehicle endurance: it adds 80 grams to the UAV, and powering the echolocation sensors cuts flight time by two to three minutes, depending on weather conditions (Dronelife.com, 2016).
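Since Panoptes offers an SDK but its API is not public, the following is a purely hypothetical sketch of how a range-alert application built on such a kit might look; every class and method name here is invented for illustration.

```python
# Hypothetical eBumper-style SDK usage: an application registers a callback
# on the range-data stream and reacts to close obstacles. All identifiers
# here are assumptions; the real SDK's names will differ.

class EBumperClient:
    """Stand-in for an SDK client that exposes the sensed range data."""

    def __init__(self):
        self._handlers = []

    def on_range(self, handler):
        # Register a callback invoked for each (axis, range_ft) reading.
        self._handlers.append(handler)

    def feed(self, axis, range_ft):
        # Stand-in for the real data stream delivering a sensor reading.
        for handler in self._handlers:
            handler(axis, range_ft)

alerts = []
client = EBumperClient()
# Alert whenever any sensor reports an obstacle closer than 5 ft.
client.on_range(lambda axis, r: alerts.append(axis) if r < 5.0 else None)
client.feed('forward', 3.2)
client.feed('left', 12.0)
```

The same callback interface could equally hand instructions back to the vehicle, which is the two-way pattern the SDK description above implies.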

Technical Specifications



MSRP: $499

Detection Range: 15 ft

Maximum Collision Protection Speed: 5 kts (8.5 ft/s)

Field of View of Sensors: 40-degree cone

Sensed Directions: Forward, left, right, and up

Reduction in Flight Time with eBumper: 2-3 minutes

Net Weight Increase: 130 grams

Blind Spots Between Sensors: 50 degrees

Supported Aircraft Types: DJI Phantom 2 product line, 3D Robotics Iris+

Required Interfaces: NAZA or Pixhawk flight controller; full I/O support is maintained with the eBumper installed

Figure 1. Technical specifications (Dronelife.com, 2016).

 

Conclusion

The Panoptes eBumper is a solution to the technological roadblock of sense and avoid in small unmanned aerial vehicles. The underlying technology, echolocation, has long been in use in other applications: like whales and bats, the eBumper uses echolocation to sense objects during operation (Panoptesuav.com, 2016).
 

 
 
References
DIY Drones. (2016). Sense and Avoid "eBumper" available for IRIS+. Retrieved from http://diydrones.com/profiles/blogs/sense-and-avoid-ebumper-available-for-iris

Dronelife.com. (2016). Drone Flying Made Easy with the Panoptes eBumper. Retrieved from http://dronelife.com/2014/10/31/making-flying-easy-panoptes-e-bumper/

Panoptes. (n.d.). Meet eBumper: Explore with Confidence. Retrieved from http://www.panoptesuav.com/ebumper/

 

 

Saturday, April 30, 2016

Data Presentation on Maritime Universal Ground Control Systems


The Common Unmanned Surface Vessel (CUSV) is an unmanned surface vehicle built by Textron Advanced Systems. The CUSV pairs an unmanned marine craft with a command-and-control station as part of a range of mission and payload combinations. Its modular design, built from off-the-shelf parts, makes adding, upgrading, and swapping components easy. The boat is operated from the Universal Command and Control System (UCCS) and is equipped with the Harris SeaLancet long-range data link for the exchange of information. The system is compatible with NATO STANAG 4586 as well as the Joint Architecture for Unmanned Systems conventions. The UCCS is a maritime variant of the Universal Ground Control Station (UGCS), which is used by several armed forces to operate unmanned aircraft systems (UAS) (Textron Inc., 2014). The UGCS and UCCS interoperate well with other unmanned systems, being reconfigurable and reprogrammable for command-and-control compatibility, and are effective for the simultaneous operation of several unmanned ground, water, and air vehicles.

Maritime Control Station Hardware Software and User Interfaces

The UGCS configuration integrates an intuitive network interface with programs that allow operators to control processes and equipment through a computer graphical user interface (GUI). The UCCS relies on common user interface techniques and data presentation methods to interact with the operator, and is furnished with a mouse, keyboard, and joystick to simplify operator input (Textron Inc., 2015). Graphical data is shown across a number of display monitors that varies with the total of unmanned systems being controlled. Screen choices include status data, geographic navigation, and instrument- or task-adapted displays. Information derived from instrument returns, such as the discovery of a mine by the sonar, is transferred to the station and overlaid onto the topographical display for situational awareness. The UCCS connects with the boat through the SeaLancet RT-1944/U, a network-based link capable of transferring data at rates up to 54 megabits per second (Mbps) to preserve data quality. The link extends robust computer networking from maritime networks to air and ground systems, sustaining the payload information exchange between the control station and the boat. The SeaLancet can reach 150 miles with unobstructed line of sight, and the span can be stretched beyond line of sight with the help of link relays (Reliable System Services Corporation, n.d.). The platform uses the link to transmit sensor, video, and navigation data simultaneously. The CUSV is capable of autonomous operation and combines autonomy with human-in-the-loop maneuvers. It is equipped with software that uses past and current environmental information to evaluate mine dangers, improve mine-clearing tactics, and propose strategies, techniques, and procedures (National Research Council, 2000).
The Mine Warfare Environmental Decision Aid Library produces mine-countermeasure task plans that can be uploaded to the vehicle and performed autonomously or with varying degrees of operator involvement (Textron Systems, 2014).
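The quoted 54 Mbps link rate allows a quick back-of-envelope check on how long a payload product takes to cross the SeaLancet link. The payload size below is an assumed example, not a published figure, and real throughput will be lower once protocol overhead is included.

```python
# Back-of-envelope transfer time over a SeaLancet-class link at the quoted
# 54 Mbps (megabits per second). Payload sizes here are assumed examples.

LINK_RATE_MBPS = 54.0

def transfer_seconds(payload_megabytes, rate_mbps=LINK_RATE_MBPS):
    # 1 megabyte = 8 megabits; divide total bits by the link rate.
    return payload_megabytes * 8.0 / rate_mbps

# e.g. an assumed 270 MB side-scan sonar tile takes 40 s at the raw rate.
t = transfer_seconds(270.0)
```

Even at the raw link rate, large sonar products take tens of seconds to move, which is why detections are summarized and overlaid on the operator's display rather than streamed raw.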

Ground Control Station Enhancements

The Universal Command and Control System was designed to comply with military interoperability criteria that require control stations to be compatible with other platforms, which in turn demands a relatively simple data presentation method. The UCCS could be enhanced by integrating multiple input modalities, transferring and gathering data to and from the operator through several perceptual channels. For example, a voice user interface could support operator command and control of the vehicle. Haptic feedback, including vibrotactile cues, could be implemented for control and for supporting instruments and systems; vibrotactile techniques can aid manual navigation and obstacle avoidance, or assist during disembarking maneuvers. Virtual reality displays could support safety and greater spatial awareness by widening the field of view and producing a three-dimensional picture of the scene.

The Universal Command and Control System is an adjustable GCS for unmanned systems. Studies suggest that implementing these data presentation methods can contribute to enhanced system operations, and adding such technologies to the UCCS can eventually increase the potential of both the CUSV and the UCCS.

Figure 1. Universal Command and Control Station. Textron Corporation. (2015). Retrieved from http://www.textronsystems.com/products/unmanned/universal_gcs
 

References

National Research Council (2000, March 6). Oceanography and Mine Warfare. Retrieved from http://www.nap.edu/openbook.php?record_id=9773&page=32

 
Reliable System Services Corporation. (n.d.). SeaLancet RT-1944/U. Retrieved from http://www.rsscorp.org/?page_id=112

 

Textron Inc. (2014). Universal Capabilities for the Next-Generation Battlespace. Retrieved from


 

Textron Systems. (2015). Unmanned Systems. Retrieved from http://www.textronsystems.com/capabilities/unmanned-systems

 

Sunday, April 17, 2016

Unmanned System Data Protocol and Format


 

Spy'Ranger Reconnaissance Small Unmanned Aerial Vehicle

Introduction
The way in which information is processed, transferred, and stored by unmanned systems is essential to their operational capabilities. With the increasing amount of data produced by sensors, developers need to make sure that data gathering and processing protocols function as expected, delivering a consistent experience for users. The Spy'Ranger, the newest reconnaissance and surveillance platform, is a tactical unmanned aerial vehicle resulting from the need for battlefield anti-drone measures; the platform has sustained a significant level of interest as its competition has gradually expanded. The Spy'Ranger is the newest design of small unmanned aerial system (sUAS) with an electro-optical/infrared imaging payload suited to transferring high-definition electro-optical and infrared images in real time (Thales Group, n.d.). Thales Group has kept most of the technical data private and not readily available to the public.

Unmanned System Data Format, Protocols, and Storage Methods
The platform presents excellent levels of durability and comes with the Spy'C command-and-control suite. Configured to function in severe conditions, the platform collects and transfers precise beyond-line-of-sight image information to task forces and battle-group units. It can exchange information with the multi-sensor image analysis and distribution system (MINDS/SAIM), an open software and hardware architecture, scalable and adaptable to stand-alone or networked components of air operations centers and fighting troops, producing images that can be used in operational command, control, communications, computers, and intelligence platforms for mission preparation and multi-sensor operations. Advanced processing capabilities and a tactical data link make the platform well suited to any kind of wide-area or point surveillance, night or day. The Spy'Ranger detects and geographically locates objectives in real time to support operations by units in combat; its laser pointer and HD imagery reveal the precise position of a target, and geographic information is embedded so the details can be used immediately.
The platform collects and transfers images for immediate interpretation in support of current operations, or for later use to sustain joint mission planning or damage assessment. The data link communications are protected and encrypted to maintain service readiness in congested environments and to avoid interruptions. The Spy'Ranger is intended to offer unique image detail, with a high-definition angular view that is typically 2.5 times wider than its competitors', from a newly in-house-developed triple-sensor stabilized gimbal ball (Thales Group, 2016). Imagery is stabilized both electronically and mechanically through the performance of the gimbal. High-definition imagery is processed on board for immediate transmission through a high-capacity data link, and is also stored on board for post-landing processing. The data link solution is based on a dual architecture: a high-speed imagery link in Ku-band and a high-integrity command-and-control link in C-band. Both data links include ground and airborne equipment with their complete antenna solution.
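The dual-link architecture described above can be sketched as a simple router that sends imagery over the high-speed Ku-band path and command-and-control traffic over the high-integrity C-band path. Class and field names are illustrative assumptions, not Thales' implementation.

```python
# Sketch of a dual-band datalink dispatcher: high-rate imagery rides the
# Ku-band link, C2 rides the high-integrity C-band link. All names here
# are illustrative assumptions.
from collections import deque

class DualDatalink:
    def __init__(self):
        self.ku_band = deque()  # high-speed imagery queue
        self.c_band = deque()   # high-integrity command-and-control queue

    def send(self, message):
        if message.get('kind') == 'imagery':
            self.ku_band.append(message)
        else:
            # Telemetry and C2 traffic take the high-integrity link.
            self.c_band.append(message)

link = DualDatalink()
link.send({'kind': 'imagery', 'frame': 1})
link.send({'kind': 'c2', 'cmd': 'orbit'})
```

Separating the two traffic classes onto different bands means a congested imagery stream cannot starve the command channel, which matches the high-integrity role the C-band link plays above.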

Capability Improvement
Regarding data protocol enhancements for the Thales Spy'Ranger, the system could benefit from possible spectrum repurposing and reallocation for broadband purposes, supporting broadband from a global perspective, as well as from the implementation of a cloud architecture to reduce the need for onboard data storage and make data available to multiple users. Unmanned-vehicle cloud services delivered through widgets could be integrated with data distribution services such as the Ozone Widget Framework, Hadoop, and Accumulo. The cloud and unmanned-vehicle communities should combine efforts to build such a system from open-source modules.
 
References

Thales Group. (2016). The future of frontline surveillance and intelligence. Retrieved from https://www.thalesgroup.com/en/worldwide/defence/mini-uav-system-reconnaissance-roles

Thales Group. (n.d.). Milipol Paris, 18th November 2015: Thales unveils Spy'Ranger, the most advanced reconnaissance/surveillance mini-drone in the world. Retrieved from https://www.thalesgroup.com/sites/default/files/asset/document/pr_thales_unveils_spyranger_the_most_advanced_reconnaissane_surveillance_mini-drone_in_the_world.pdf

 

Saturday, April 9, 2016

Importance of Sensor Locations in UAVs


Many users of unmanned aerial vehicles have tailored their vehicles for filming and photography, but unmanned aerial systems are also used for racing, a new growth area for the sports market. Most camera UAVs today are quadcopters because they are remarkably stable and maneuverable platforms. A number of cameras are available for unmanned aerial systems, across different budgets and configurations; some models are ready to fly with photographic suites integrated. These cameras produce high-quality HD video, stills, and aerial photography below 400 feet above ground level.

Aerial Photography and Filming

One of the systems preferred by professional photographers and videographers is the DJI Inspire 1 RAW. (RAW is a format that keeps the image information recorded by the sensor when taking a picture (Lim, 2015).) It is also equipped with advanced software-driven features that simplify the planning and production of a video shoot. One of the Inspire 1's main characteristics is its selection of onboard sensors, such as the ultrasonic and optical-flow sensors, which combined can automatically stabilize the vehicle indoors. I chose this model because it is professional grade, with a higher payload capacity for heavier equipment and adaptability for its payloads. The airframe is high-quality carbon fiber with automation built in. The accessories may be exchanged for an assortment of camera and video options depending on need, and this adaptability of different sensors and cameras enables flexibility.

DJI Inspire 1 RAW

The DJI Inspire 1 RAW is equipped with an innovative high-definition Zenmuse X5R camera on a three-axis gimbal. The vehicle features a folding arm system: the arms rise up to give the camera a clear 360-degree view. The three-axis gimbal keeps images steady and the camera aimed in the same direction as the vehicle tilts and turns, repositioning independently of the vehicle. The Zenmuse takes 16-megapixel stills with options for time-lapse and burst modes. The Inspire 1 RAW makes it easy for one or two people to create very high quality 4K HD and Micro Four Thirds (MFT) video; MFT cameras have a sensor size similar to 16mm film (B&H Photo and Electronics Corporation, 2016), and the format is used for productions that require small, portable, cheaper cameras. Four interchangeable lenses, ranging from 12mm to 17mm, allow longer filming times and better depth of field while preserving the selective focus that distinguishes its output from small-sensor video from up to 2 km away (Olympus Imaging Corporation, 2016). Furthermore, the camera supports bursts of three, five, or seven frames and time-lapse at stepped intervals of between three seconds and one minute, and it rotates through 360 degrees so it can remain focused on the subject (B&H Photo and Electronics Corporation, 2016).

First Person View Racing

First Person View (FPV) racing is becoming a popular sport throughout the world. FPV racing involves placing cameras on the vehicle and using them to race a circuit. Each vehicle feeds video to its pilot, who controls it at high speed among eight to ten pilots flying simultaneously. These conditions place high demands on the wireless transmission system.

VAJRA80 Professional Racing FPV

The racing unmanned aerial system chosen as a First Person View (FPV) platform is the Speedwolf VAJRA80 high-definition FPV. The VAJRA80 is a professional vehicle with wireless functionality, developed with modularity in mind; its control system meets various flight tasks and is user friendly. The VAJRA80 is equipped with GPS and compass units that help the controller hold position and land safely, plus one-click return, automatic takeoff and landing, communication-loss protection, capacity for a 1.1 kg payload, and 15-30 minutes of flight time depending on payload (Drone Racers, 2016). One of its main characteristics is a carbon fiber frame, making it strong and light; given the speed of racing models, it is important that they be designed to endure crashes. Other features include the separation of vibrating elements, such as motors and propellers, from vibration-sensitive devices: the recording camera, flight controller, FPV camera, video transmitter, and radio receiver are all sensitive to vibration, which can degrade flight. This is especially important for the front camera, where the operator needs to respond and make course adjustments immediately to stay in the race rather than crash.

 
References

B&H Photo and Electronics Corporation. (2016). DJI Inspire 1 RAW Quadcopter with Zenmuse X5R 4K Camera and 3-Axis Gimbal. Retrieved from http://www.bhphotovideo.com/c/product/1186059REG/dji_inspire_1_raw_quadcopter.html/prm/alsVwDtl

Drone Racers. (2016). VAJRA80 Professional and Racing FPV. Retrieved from http://droneracers.us/category/uas/page/3/

Lim, R. (2015). 10 Reasons Why You Should Be Shooting RAW. Retrieved from http://photographyconcentrate.com/10-reasons-why-you-should-be-shooting-raw

Olympus Imaging Corporation. (2016). Benefits of Micro Four Thirds. Retrieved from http://www.four-thirds.org/en/microft/#SlideFrame_6

 

 
 

                        
DJI Inspire 1 RAW
 
 

Friday, April 1, 2016

Bluefin 21 Exteroceptive and Proprioceptive Sensors in Search and Rescue


Soon after the disappearance of Malaysia Airlines Flight 370, authorities approved an underwater search for the wreckage with an unmanned submersible, charting the bottom of the sea in a meticulous procedure. With the black boxes no longer transmitting a signal, the Bluefin-21 autonomous underwater vehicle was called to action to generate a detailed, three-dimensional chart of the sea bottom 4,500 meters down using its side-scan sonar. This type of unmanned platform is designated an autonomous underwater vehicle because of the degree of autonomy required by the command, control, and communication challenges of operating underwater.

Exteroceptive Sensors Bluefin 21 Search and Rescue

The Bluefin-21 is an adaptable underwater vehicle configurable with different sensors; it operates particular exteroceptive and proprioceptive sensors that endure sea operations. The conditions it operates in demand inertial and ultrasonic sensors whose data are analyzed after the task is completed; Bluefin has integrated more than seventy sensors. Exteroceptive sensors include vision systems, in particular the EdgeTech 2200-M 120/410 kHz side-scan sonar, an efficient device for searching the sea bottom. Accurate images of the area are delivered with Full Spectrum® CHIRP signal processing, while Dynamic Aperture and Dynamically Focused Arrays increase long-range resolution through better signal-to-noise ratios. Multi-Pulse technology puts up to four pulses in the water at once, allowing a fourfold increase in survey speed, or a marked increase of pings on the target for greater feature recognition. The calibrated wideband digital frequency-modulated sonar provides quantifiable, low-noise, high-resolution side-scan images: it transmits linearly swept frequency-modulated pulses centered at two separate frequencies, and the longer-duration, wide-bandwidth pulse yields higher-resolution images and, because more energy is projected into the water, a better signal-to-noise ratio and extended range (Subsea Technology Rentals, 2016). Synthetic Aperture Sonar (SAS) coherently combines consecutive pings along a known track to increase azimuth resolution; SAS produces images out to hundreds of meters at down to centimeter-scale resolution, making it an appropriate technique for finding small items and imaging wreckage (Hanson, 2011). Multibeam echo sounders obtain water-depth information across a search area, establish minimum sea depths over significant objects such as wrecks, and detect objects in general.
The multibeam echo sounder transmits acoustic pulses in a fan beneath the vehicle, estimating and recording the time it takes each sound signal to travel from the transducer to the bottom of the sea and back to the receiver (NOAA Office of Coast Survey, 2016).
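The multibeam principle above reduces to a one-line calculation: halve the round-trip travel time and multiply by the speed of sound. The 1,500 m/s figure below is a nominal value for seawater; real surveys substitute a measured sound-velocity profile.

```python
# Depth from the two-way travel time of an acoustic pulse, as described
# above. 1,500 m/s is a nominal sound speed in seawater; surveys use a
# measured sound-velocity profile instead of a constant.

SOUND_SPEED_MPS = 1500.0

def depth_from_echo(two_way_travel_s, sound_speed=SOUND_SPEED_MPS):
    # The pulse travels down and back, so halve the round-trip time.
    return sound_speed * two_way_travel_s / 2.0

# A 6-second round trip corresponds to 4,500 m, the depth quoted for the
# MH370 search area.
d = depth_from_echo(6.0)
```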

Proprioceptive Sensors Bluefin 21 Search and Rescue

The proprioceptive safety-system sensors detect leaks and faults, conduct drop-weight checks, and track transponders, strobe antennas, RDF, and Iridium. The navigational sensors include the Inertial Navigation System (INS), a navigation technique in which measurements from gyroscopes and accelerometers are used to track the position and path of a vehicle relative to a referenced initial position, heading, and speed. Inertial measurement units (IMUs) contain three orthogonal accelerometers and three orthogonal gyroscopes, measuring the angular rate of displacement of an object and the linear rate of change of its velocity (Woodman, 2007). The Ultra-Short Baseline (USBL) sensor is a marine positioning system that uses a hull-mounted transceiver to sense the distance and direction to an object using sound signals; USBL systems also comprise other components, including attitude sensors for precise determination of vessel roll, pitch, and heading, used for correction functions (Sonardyne, 2015). A transmissometer measures the attenuation of light as it travels through water, used for determining the turbidity of the ocean (WetLabs, 2016).
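The INS principle described above can be illustrated with a minimal one-axis dead-reckoning loop: integrate accelerometer samples once for velocity and again for position relative to a known start. A real INS fuses three gyroscopes and three accelerometers and corrects for drift; this sketch shows only the integration step.

```python
# Minimal one-axis dead reckoning: integrate accelerometer output twice to
# track position from a known initial state. Real INS processing is 3-D,
# fuses gyroscope data, and must correct the drift this naive loop accumulates.

def dead_reckon(accels_mps2, dt, v0=0.0, x0=0.0):
    """Integrate acceleration samples (m/s^2) taken every dt seconds.

    Returns (position_m, velocity_mps) after the last sample.
    """
    v, x = v0, x0
    for a in accels_mps2:
        v += a * dt   # velocity from acceleration
        x += v * dt   # position from velocity
    return x, v

# Accelerate at 1 m/s^2 for two seconds, then coast for two seconds.
x, v = dead_reckon([1.0, 1.0, 0.0, 0.0], dt=1.0)
```

Because every sample's error is integrated twice, small accelerometer biases grow quadratically in position, which is why the Bluefin-21 pairs its INS with USBL acoustic fixes.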

Research Questions

The implemented Global Positioning System, Ultra-Short Baseline, and Inertial Navigation System sensors are specifically intended for the sea environment, easing the restrictions on radio signal transmission and navigation through seawater. A change that would make the vehicle more effective in marine search-and-rescue operations is better battery technology, such as high-energy-density batteries with deep cycle life, to prolong the operating range and sustain the sensors and propulsion systems for longer operating times over greater search areas. Unmanned aircraft systems could be employed in conjunction with the Bluefin-21 to report locations of interest, such as drifting debris or oil slicks, to the vessel that operates the underwater vehicle via acoustic link. Operating an unmanned aerial vehicle and the unmanned underwater vehicle from the same ship would significantly reduce expenses, centralize management roles, and enable interaction among operations. Unmanned maritime vehicles offer advantages in personnel safety and budget over their manned counterparts.

 References
Bluefin Robotics. (2016). Sensor Integration. Retrieved from http://www.bluefinrobotics.com/technology/sensor-integration/

Hanson, R. (2011). Introduction to Synthetic Aperture Sonar. Retrieved from http://cdn.intechopen.com/pdfs-wm/18868.pdf

NOAA Office of Coast Survey. (2016). Multibeam Echo Sounders. Retrieved from http://www.nauticalcharts.noaa.gov/hsd/multibeam.html

Subsea Technology Rentals. (2016). EdgeTech 2200 Modular Sonar System. Retrieved from http://www.str-subsea.com/sales/edgetech-2200-modular-sonar-system

Wet Labs. (2016). C-Star Transmissometer. Retrieved from http://wetlabs.com/cstar

Woodman, O. J. (2007). An introduction to inertial navigation. Retrieved from https://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-696.pdf

Thursday, March 24, 2016

General Atomics Aeronautical Industries Lynx Radar


Lynx Multi-mode Radar

The General Atomics Aeronautical Systems Lynx multi-mode radar provides current situational awareness, gathering data that is turned into intelligence to make a significant difference in the battle zone, whether by deterring enemies or by defending civilians against terrorists and insurgents. The radar produces very clear, high-resolution images of pictographic quality through fog, rain, dust, and smoke. Designed to meet the demands of the remotely piloted aircraft environment, the Lynx radar occupies little weight, space, and power while providing accurate air-to-ground targeting precision and wide-area search capability. Lynx includes synthetic aperture radar, ground and dismount moving target indicator, and powerful maritime wide-area search modes. Lynx's search modes deliver the wide-area coverage for a combined sensor suite, allowing cross-cue to a limited-field-of-view electro-optical/infrared sensor (General Atomics Aeronautical Systems, 2016).

Lynx combines two spotlight and two stripmap synthetic aperture radar modes. Spotlight mode generates high-resolution images of an identified spot, while stripmap mode maps a narrow band of terrain, stitching several synthetic aperture radar images together into one large picture. Using synthetic aperture radar images, slight variations in the landscape can be identified by comparing two pictures obtained at separate times: algorithms such as Coherent Change Detection, Amplitude Change Detection, and Automated Man-Made Object Detection can quickly highlight the changes between the first and second picture, offering a powerful imagery evaluation tool (General Atomics Aeronautical Systems, 2016).
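The amplitude change detection idea above can be sketched as a pixel-wise comparison of two co-registered amplitude images. This pure-Python stand-in is illustrative only; production coherent change detection operates on complex SAR data with coherence estimates, not simple differencing.

```python
# Toy amplitude change detection: difference two co-registered SAR amplitude
# images and threshold the result. Real CCD works on complex imagery and
# estimates coherence; this only illustrates the comparison step.

def amplitude_changes(img_a, img_b, threshold):
    """Return (row, col) pixels whose amplitude changed by more than threshold."""
    changed = []
    for r, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                changed.append((r, c))
    return changed

first_pass  = [[10, 10], [10, 10]]
second_pass = [[10, 40], [10, 10]]  # a bright return appears at (0, 1)
hits = amplitude_changes(first_pass, second_pass, threshold=20)
```

Thresholding the difference flags exactly the pixels where something appeared or vanished between passes, which is the evaluation device the text describes.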

Ground/Dismount Moving Target Indicator mode offers a rapid and simple process for detecting moving vehicles. While Ground Moving Target Indicator remains a vital asset, Dismount Moving Target Indicator denotes a real paradigm change: it allows operators to detect slow-moving vehicles and a person moving at about one mile per hour. Maritime Wide-Area Search mode detects vessels and boat movement in a range of sea conditions, and it fuses Automatic Identification System data for target association and correlation. Maritime Wide-Area Search excels in assignments such as littoral reconnaissance, drug interdiction, long-range surveillance, small-object detection, and search-and-rescue operations (General Atomics Aeronautical Systems, 2016).

Features
• High-resolution imagery
• Long range, up to 80 km
• High reliability, enclosed chassis
• Low weight and volume
• Real-time detection of vehicular movement
• Automatic cross-cue to EO/IR
• Available as a Commercial-Off-The-Shelf sensor
• Designed for use in RPA systems and manned aircraft


References
General Atomics Aeronautical Systems. (2016). Lynx Multi-mode Radar. Retrieved from http://www.ga-asi.com/lynx-multi-mode-radar