The MAPPS 2014 Summer Conference in Coeur d’Alene, Idaho, included a number of sessions focused on the timely topic of unmanned aerial systems (UAS). The information shared included everything from potential applications of UAS data, evaluation criteria for UAS data, different types of UAS sensors and platforms, and the process to go through to legally fly a UAS in the United States today.

One presentation focused on the processing of UAS data to obtain useful information. The speaker, Apostolos Mamatas, head of sensor systems at Altavian, discussed the origin of the user-friendly software that automates a great deal of UAS data processing and why using automated software is not appropriate for every application.

[Photo caption: Classical photogrammetric and UAS-specific workflows share much in common.]

Mamatas explained that several technological trends had a direct impact on the development of the automated UAS processing software in use today. Computer vision (CV) research took off during the 1990s amid growing interest in analyzing and understanding images and other high-dimensional data from the real world. In conjunction with CV, the dramatic miniaturization and portability of digital imaging devices fueled consumer demand for inexpensive camera-equipped products such as gaming systems and smartphones. It can be argued that CV research drove much of this trend.

Within CV, advances in dense feature reconstruction using structure from motion (SFM), a mathematical approach that finds corresponding points across multiple photos and locates the cameras in space to reconstruct three-dimensional information, provided a less expensive alternative to traditional photogrammetry. SFM relies on self-calibration and a large number of estimated parameters to produce 3D information without the use of expensive laser scanners. Some photogrammetrists are skeptical that the self-calibration methods used by SFM are as reliable and accurate as traditional methods. The ideal situation would be to combine aspects of both approaches to get the best results possible.
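The two-view core of SFM can be illustrated with off-the-shelf tools. The sketch below is a minimal example, not production software: it assumes OpenCV, two overlapping frames (the filenames and the camera matrix K are hypothetical), detects features, matches them robustly, recovers the relative camera pose, and triangulates a sparse set of 3D points. Full SFM pipelines extend this to many images and refine everything with bundle adjustment.

```python
# Minimal two-view structure-from-motion sketch using OpenCV.
# Assumes two overlapping frames and an approximate camera matrix K;
# real pipelines refine K via self-calibration and bundle adjustment.
import cv2
import numpy as np

img1 = cv2.imread("frame1.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical filenames
img2 = cv2.imread("frame2.jpg", cv2.IMREAD_GRAYSCALE)

# 1. Feature detection and description (SIFT keypoints).
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# 2. Robust matching: nearest-neighbor matches filtered by Lowe's ratio test.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Approximate intrinsics (focal length and principal point are assumptions).
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 480.0],
              [0.0, 0.0, 1.0]])

# 3. Recover relative camera pose from the essential matrix (RANSAC rejects outliers).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 4. Triangulate matched points into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T  # homogeneous -> Euclidean

print(f"Reconstructed {len(points3d)} sparse 3D points")
```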

“CV practitioners were not focused on creating geodetic world-frame 3D data to the standards of surveyors,” said Mamatas. “They focused on taking out the need for human intervention to obtain some measure of reality and adequate solutions rather than ultimate accuracy. CV was developed in computer science departments, not photogrammetry departments, so there is a disconnect in expectations.”

Part of the ongoing debate between traditional photogrammetrists and UAS proponents stems from the different intended uses for the geospatial data being collected. Some applications require data that meets ASPRS mapping standards, others require daily revisits without high accuracy, and some require bare-earth elevation data while others need radiometric data for plant classification. In the same way that a large-format digital camera isn’t the answer to everyone’s problem, a UAS is not the best choice for every situation, and automated data processing may not provide the desired results in either case.

“At Altavian, we design and manufacture UAS and incorporate best-of-breed sensor systems in small form factors, which are easily deployable and meet the needs of our customers,” said Mamatas. “This means we might use a DSLR camera on one and a machine-vision camera on the next. We know how important it is for the payload capabilities to match the intended use.”

[Photo caption: Photo-derived point clouds have high density, but automated algorithms can miss narrow linear features.]

There are at least nine major UAS all-in-one packages and four or five open-source libraries and frameworks already available. Software providers are addressing different capabilities for different audiences, but in general the UAS workflow is built to be as quick and inexpensive as possible. The generic processing steps include feature detection, robust feature matching, bundle adjustment, point cloud densification, DSM generation, and orthomosaic generation. Depending on the desired outcome, manual intervention can improve the feature detection; the process becomes semi-automated and takes longer, but it yields better results.
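Of those steps, bundle adjustment is the one that most directly determines geometric accuracy, so it is worth seeing in miniature. The sketch below is an illustrative toy, not any vendor's pipeline: it uses SciPy to jointly refine camera poses and 3D points against synthetic observations by minimizing reprojection error, the same principle production packages apply at far larger scale with sparse solvers and self-calibrating camera models.

```python
# Toy bundle adjustment: jointly refine camera poses and 3D points by
# minimizing reprojection error. Synthetic data stands in for real
# matched features; the focal length and scene geometry are assumptions.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
n_cams, n_pts, f = 3, 40, 1000.0  # cameras, points, focal length (assumed)

# Ground-truth scene: random points in front of cameras spaced along x.
pts_true = rng.uniform([-5, -5, 10], [5, 5, 20], (n_pts, 3))
rvecs_true = rng.normal(0, 0.05, (n_cams, 3))          # small rotations
tvecs_true = np.column_stack([np.arange(n_cams) * 2.0,
                              np.zeros(n_cams), np.zeros(n_cams)])

def project(rvec, tvec, pts):
    """Pinhole projection of 3D points into one camera."""
    cam = Rotation.from_rotvec(rvec).apply(pts) + tvec
    return f * cam[:, :2] / cam[:, 2:3]

# Observations: every camera sees every point, plus pixel noise.
obs = np.array([project(r, t, pts_true)
                for r, t in zip(rvecs_true, tvecs_true)])
obs += rng.normal(0, 0.5, obs.shape)

def residuals(params):
    """Stack reprojection errors over all cameras and points."""
    rvecs = params[:n_cams * 3].reshape(n_cams, 3)
    tvecs = params[n_cams * 3:n_cams * 6].reshape(n_cams, 3)
    pts = params[n_cams * 6:].reshape(n_pts, 3)
    err = [project(r, t, pts) - o for r, t, o in zip(rvecs, tvecs, obs)]
    return np.concatenate(err).ravel()

# Start from perturbed estimates and refine everything jointly.
x0 = np.concatenate([rvecs_true.ravel() + rng.normal(0, 0.01, n_cams * 3),
                     tvecs_true.ravel() + rng.normal(0, 0.1, n_cams * 3),
                     pts_true.ravel() + rng.normal(0, 0.2, n_pts * 3)])
result = least_squares(residuals, x0)
print(f"RMS reprojection error: {np.sqrt(np.mean(result.fun**2)):.3f} px")
```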

Mamatas referred to the “Big Green Button Fallacy,” the reality that no single automated processing method for UAS data will obtain exactly the same results as processing data acquired with a metric-grade aerial camera; however, the results can come close when semi-automated processing techniques are introduced. “Software improvements are being released weekly,” concluded Mamatas. “The capabilities are changing so rapidly, it is hard to predict where we’ll be a year from now.”