Near-Infrared Aerial Imagery and Oblique Photography


In order to establish a sensor setup for aerial photography in the visible (VIS) and near-infrared (NIR) spectrum, the UAV/S platform is equipped with a compact digital camera which has been modified by removing the internal infrared cut filter (hot mirror) and is now capable of capturing radiation from approximately 330 nm to 1100 nm. The effectively captured spectrum can be restricted by applying optical filters that absorb specific ranges of the incoming radiation (e.g. VIS only, NIR only). As a further option, the camera can be equipped with a self-built colour filter that absorbs only the visible red radiation, enabling colour infrared (CIR) photography which combines the blue, green and NIR bands.

UAS Colour Infrared Image (CIR) of a farm site near Münster.
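To make the channel layout concrete, the following minimal sketch (with assumed file names and JPEG band order, not the group's actual pipeline) shows how a frame from the red-blocked camera can be interpreted: the sensor's red channel records NIR, so the stored image is already a false-colour CIR composite in which vigorous vegetation appears red.

```python
# Minimal sketch: interpret a frame from the red-blocked camera as a CIR composite.
# File names and the 3-channel band order are assumptions for illustration.
import numpy as np
import imageio.v3 as iio

frame = iio.imread("cir_frame.jpg").astype(np.float32)
nir, green, blue = frame[..., 0], frame[..., 1], frame[..., 2]  # red channel holds NIR

def stretch(band):
    """Simple 2-98 percentile contrast stretch for display purposes."""
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo + 1e-6), 0, 1)

# Classic CIR display: NIR -> red, green -> green, blue -> blue
cir_display = np.dstack([stretch(nir), stretch(green), stretch(blue)])
iio.imwrite("cir_display.png", (255 * cir_display).astype(np.uint8))
```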

The enhanced spectral range qualifies the camera for a variety of monitoring and observation tasks, particularly ecological vegetation analysis. The images obtained from aerial surveys are therefore rectified using ground control points measured beforehand. Owing to the high spatial resolution, much information can be extracted not only from spectral characteristics but also from others such as shape and texture.
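As an illustration of the rectification step, the sketch below attaches ground control points to a UAS image and warps it with GDAL's Python bindings. File names, coordinates and the chosen EPSG code are placeholders, not the group's actual processing chain.

```python
# Hypothetical sketch: attach ground control points (GCPs) to a UAS image
# and rectify it into a projected coordinate system with GDAL.
from osgeo import gdal, osr

srs = osr.SpatialReference()
srs.ImportFromEPSG(25832)  # ETRS89 / UTM 32N, plausible for the Münster area

# GCP(map_x, map_y, elevation, pixel_column, pixel_row) -- example values only
gcps = [
    gdal.GCP(404500.0, 5757800.0, 0.0, 120.5, 980.2),
    gdal.GCP(404950.0, 5757810.0, 0.0, 3890.1, 1002.7),
    gdal.GCP(404940.0, 5757400.0, 0.0, 3870.8, 2855.3),
    gdal.GCP(404510.0, 5757390.0, 0.0, 130.9, 2830.6),
]

# Step 1: write the GCPs into an intermediate dataset
tmp = gdal.Translate("cir_gcps.tif", "cir_raw.tif",
                     GCPs=gcps, outputSRS=srs.ExportToWkt())

# Step 2: warp (rectify) the image based on the attached GCPs
gdal.Warp("cir_rectified.tif", tmp, dstSRS="EPSG:25832", resampleAlg="bilinear")
```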

Another scientific aspect is the geoinformatic exploitation of UAS oblique/3D aerial imagery to support building damage classification, environmental assessments or archaeological surveys.

Detection, monitoring and analysis of dynamic patterns in environmental phenomena


An important task for future work is to establish an analytic workflow that integrates gathered UAS data for the recognition of dynamic patterns in environmental phenomena such as biodiversity, pest infestation, vegetation structure or even routing in dense human crowds, based on optical or thermal aerial imagery and video data. UAS sensors such as very high resolution cameras and thermal scanners are therefore being tested and used for geoinformatic and environmental scientific purposes.

UAS-based map showing potential location points of invasive Acacia trees in a Brazilian Savanna ecosystem (Lehmann et al. 2017; see publications).
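A minimal sketch of how such candidate locations might be derived from a CIR orthomosaic is given below; band order, file names and the threshold are illustrative assumptions, not the method used by Lehmann et al. (2017).

```python
# Sketch under stated assumptions: flag candidate vegetation objects (e.g. individual
# tree crowns) by thresholding a simple NIR-based index and extracting blob centroids.
import numpy as np
import rasterio
from scipy import ndimage

with rasterio.open("cir_orthomosaic.tif") as src:
    nir = src.read(1).astype(np.float32)   # assumed band order: NIR, green, blue
    blue = src.read(3).astype(np.float32)
    transform = src.transform

# NDVI-like index for a red-blocked camera, using visible blue as the "red" proxy
index = (nir - blue) / (nir + blue + 1e-6)

mask = index > 0.4                          # illustrative threshold
labels, n = ndimage.label(mask)
centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))

# Convert pixel centroids (row, col) to map coordinates
points = [transform * (col, row) for row, col in centroids]
print(f"{n} candidate objects detected")
```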

Free and Open Source Software (FOSS) based image analysis procedures – a transparent geoinformatic approach towards efficient UAS applications


The ifgicopter group puts considerable effort into the development of transparent, open-source-based UAS image analysis workflows, mainly by connecting OpenDroneMap (ODM), Python and QGIS functionality via Docker/container technologies to establish low-cost and efficient tools that process UAS data from acquisition to final geoscientific results. This geoinformatic initiative is also part of new interdisciplinary teaching concepts, in which GI students program customized tools for their colleagues in related geosciences who utilize UAS for ecological field mapping. Often the preliminary outcomes serve as starting points for BSc or MSc theses. UAS as sensor platforms are particularly suited to support reproducible research: since researchers create their own data, they can package and share it along with their analysis methods and results without any copyright issues, thus fostering the reuse and further development of their research output.
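As a sketch of such a container-based workflow, the following Python snippet drives the opendronemap/odm Docker image to turn a folder of UAS images into an orthophoto that can then be loaded into QGIS. Dataset paths and the project name are placeholders, and ODM options may vary between versions.

```python
# Sketch: drive an OpenDroneMap (ODM) container from Python so that raw UAS
# images are processed into an orthophoto for further analysis in QGIS.
import subprocess
from pathlib import Path

datasets = Path("/data/uas_datasets")   # placeholder; contains <project>/images/*.JPG
project = "farm_site_muenster"          # hypothetical dataset name

cmd = [
    "docker", "run", "--rm",
    "-v", f"{datasets}:/datasets",
    "opendronemap/odm",
    "--project-path", "/datasets", project,
    "--orthophoto-resolution", "5",     # ground resolution in cm/pixel
]
subprocess.run(cmd, check=True)

# The resulting orthophoto can subsequently be styled or post-processed in QGIS,
# e.g. via the qgis_process command-line tool or PyQGIS.
```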

Real-Time Data Processing


The platform is enhanced with a standalone computing board that manages the interaction with the actual sensors (a Sensirion SHT75 temperature/humidity sensor). Based on this approach we can easily integrate new sensors into the multi-sensor platform, as it is completely independent of the UAS-specific electronics. Hence the developed computing board can be deployed on both UAV/S systems.
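The following sketch illustrates the kind of acquisition loop such a board could run; read_sht75() is a placeholder for the low-level sensor readout, and the serial port, baud rate and message format are assumptions rather than the actual firmware.

```python
# Sketch of a possible acquisition loop on the sensor board: poll the SHT75,
# format a plain-text record and push it over the wireless downlink.
import time
import serial  # pyserial

def read_sht75():
    """Placeholder for the low-level SHT75 readout (Sensirion 2-wire bus).
    Returns fixed values here so the sketch runs without hardware."""
    return 21.5, 48.0  # temperature in degC, relative humidity in %

downlink = serial.Serial("/dev/ttyAMA0", 57600)  # port and baud rate are placeholders

while True:
    temperature_c, humidity_pct = read_sht75()
    record = f"SHT75,{time.time():.3f},{temperature_c:.2f},{humidity_pct:.2f}\n"
    downlink.write(record.encode("ascii"))
    time.sleep(1.0)  # 1 Hz sampling, arbitrary choice
```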

Using an independent sensor board with a separate wireless downlink led to problems when processing the data: observations of different phenomena (e.g. GPS position, temperature) arrive at the ground station with different timestamps. To overcome this issue we developed a software framework capable of synchronizing multiple data streams and subsequently transforming the raw data into higher-level protocols such as Observations & Measurements. Thus we are able to integrate the gathered data into several services of the Sensor Web. For a description of the framework see SensorPlatformFramework.
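The core of the synchronization step can be sketched as follows: each temperature reading is paired with a GPS position interpolated between the two nearest fixes before being encoded as an observation. Data structures and field names are illustrative, not the framework's API.

```python
# Minimal sketch of time-based stream synchronization: pair each temperature
# reading with a linearly interpolated GPS position.
from bisect import bisect_left

def interpolate_position(gps_track, t):
    """gps_track: list of (timestamp, lat, lon) sorted by timestamp."""
    times = [p[0] for p in gps_track]
    i = bisect_left(times, t)
    if i == 0:
        return gps_track[0][1:]
    if i == len(gps_track):
        return gps_track[-1][1:]
    (t0, lat0, lon0), (t1, lat1, lon1) = gps_track[i - 1], gps_track[i]
    w = (t - t0) / (t1 - t0)
    return lat0 + w * (lat1 - lat0), lon0 + w * (lon1 - lon0)

def synchronize(temperature_stream, gps_track):
    """Yield (timestamp, temperature, lat, lon) tuples ready to be encoded,
    e.g. as OGC Observations & Measurements."""
    for t, temp in temperature_stream:
        lat, lon = interpolate_position(gps_track, t)
        yield t, temp, lat, lon
```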