UAV Mission Control Applications

Target Location and Sensor Fusion through Calculated and Measured Image Differencing (2003)-

This paper describes the power tool approach to automating systems that contain a cognitive element (a human in the loop), specifically as applied to UAV real-time image analysis and mission control applications.

UAV Sensor Network Attenuation Prediction (SNAP) Experiment (2004)-

This paper describes an experiment in using real-time perspective view calculations over metrically accurate terrain to estimate communication reliability between ground stations and UAV platforms. The system provides warnings when UAVs fly into communication dead zones and serves as a pre-flight planning tool for avoiding blackouts.
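
The core of such a prediction is a terrain line-of-sight test between a ground station and the UAV. The sketch below is a minimal illustration of that idea only, not the SNAP implementation: it assumes a regular digital elevation model held in a NumPy array, and the function and parameter names (`has_line_of_sight`, `dem`, `cell_size`) are hypothetical.

```python
import numpy as np

def has_line_of_sight(dem, cell_size, ground_xy, ground_alt,
                      uav_xy, uav_alt, samples=200):
    """Check whether terrain blocks the straight path between a ground
    station and a UAV, using a regular digital elevation model (DEM).

    dem        : 2D array of terrain heights in meters (rows = y, cols = x)
    cell_size  : DEM grid spacing in meters
    ground_xy  : (x, y) of the ground station antenna in meters
    ground_alt : antenna height above the datum in meters
    uav_xy     : (x, y) of the UAV in meters
    uav_alt    : UAV altitude above the datum in meters
    """
    # Sample points along the straight-line ray between the two antennas.
    xs = np.linspace(ground_xy[0], uav_xy[0], samples)
    ys = np.linspace(ground_xy[1], uav_xy[1], samples)
    zs = np.linspace(ground_alt, uav_alt, samples)

    # Terrain height under each sample point (nearest-cell lookup).
    cols = np.clip((xs / cell_size).astype(int), 0, dem.shape[1] - 1)
    rows = np.clip((ys / cell_size).astype(int), 0, dem.shape[0] - 1)
    terrain = dem[rows, cols]

    # The link is clear only if the ray stays above the terrain everywhere
    # between the endpoints (endpoints excluded to avoid self-blockage).
    return bool(np.all(zs[1:-1] > terrain[1:-1]))
```

Running this test against each waypoint of a planned route would give the kind of pre-flight blackout map described above, while running it against the live telemetry position would give the in-flight dead-zone warning.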

UAV Target Mensuration Experiment Using Synthetic Images from High Resolution Terrain Databases at Camp Roberts (2005)-

This presentation describes an experiment in using calculated real-time perspective views generated by the PVNT system to accurately mensurate target locations from live UAV imagery.
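
One common way to mensurate a target from a single frame, once the camera pose is known in terrain coordinates, is to cast the target pixel's viewing ray into the elevation model and take the first intersection. The sketch below illustrates that idea only; it is not the PVNT algorithm, it assumes a simple pinhole camera convention, and all names (`pixel_to_ground`, `dem`, `cam_rot`) are hypothetical.

```python
import numpy as np

def pixel_to_ground(cam_pos, cam_rot, focal_px, principal_pt, pixel,
                    dem, cell_size, max_range=10000.0, step=5.0):
    """March a pixel's viewing ray from the camera into a terrain model and
    return the first intersection point, i.e. the mensurated target location.

    cam_pos      : camera position (x, y, z) in terrain coordinates, meters
    cam_rot      : 3x3 rotation from camera frame to terrain frame
    focal_px     : focal length in pixels
    principal_pt : (cx, cy) principal point in pixels
    pixel        : (u, v) pixel coordinate of the target in the live image
    dem          : 2D terrain height array (rows = y, cols = x), meters
    cell_size    : DEM grid spacing in meters
    """
    # Ray direction in the camera frame (pinhole model, camera looks along +z),
    # then rotated into terrain coordinates.
    u, v = pixel
    cx, cy = principal_pt
    d_cam = np.array([u - cx, v - cy, focal_px], dtype=float)
    d_world = cam_rot @ (d_cam / np.linalg.norm(d_cam))

    # Step along the ray until it drops below the terrain surface.
    for r in np.arange(0.0, max_range, step):
        p = cam_pos + r * d_world
        col = int(np.clip(p[0] / cell_size, 0, dem.shape[1] - 1))
        row = int(np.clip(p[1] / cell_size, 0, dem.shape[0] - 1))
        if p[2] <= dem[row, col]:
            return p  # first point at or below the terrain: target location
    return None  # ray left the terrain model without intersecting
```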

Multi-Eye Input Experiments for UAV Image Navigation and Control (2010)-

This paper describes our initial "fly like a bird" experiments, in which the cognitive element in UAV mission control applications is presented with imagery calculated from memory and live field imagery separately in the left and right eye. Such a system allows the human visual processing capability to compare and register the calculated imagery with the live imagery and to extract both the UAV camera location and potential terrain feature locations using only visual sensors. The promise of this technology is that heavy and expensive gimbals and orientation hardware could be eliminated.
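
If features in the live image can be matched to terrain points whose 3D locations are known from the calculated (memory) view, the UAV camera position and orientation can be recovered from imagery alone, which is what would allow the gimbal and orientation hardware to be eliminated. The sketch below shows one standard way to perform that recovery with a perspective-n-point solve in OpenCV; it is an assumed illustration rather than the method from the paper, and the function and parameter names are hypothetical.

```python
import numpy as np
import cv2

def camera_pose_from_matches(terrain_points_3d, image_points_2d,
                             focal_px, image_size):
    """Recover UAV camera pose from correspondences between features in the
    live image and their known 3D terrain locations (taken from the
    calculated, memory-based view).

    terrain_points_3d : Nx3 array of matched terrain feature locations, meters
    image_points_2d   : Nx2 array of the same features in the live image, pixels
    focal_px          : focal length in pixels
    image_size        : (width, height) of the live image in pixels
    """
    # Simple pinhole intrinsics with the principal point at the image center.
    w, h = image_size
    K = np.array([[focal_px, 0.0,      w / 2.0],
                  [0.0,      focal_px, h / 2.0],
                  [0.0,      0.0,      1.0]])

    # Perspective-n-point solve (needs at least 4 non-degenerate matches).
    ok, rvec, tvec = cv2.solvePnP(
        terrain_points_3d.astype(np.float32),
        image_points_2d.astype(np.float32),
        K, None)
    if not ok:
        return None

    R, _ = cv2.Rodrigues(rvec)            # rotation: terrain frame -> camera frame
    cam_position = (-R.T @ tvec).ravel()  # camera center in terrain coordinates
    return cam_position, R
```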
