The cloud is an increasingly popular platform for data management. Ben Vander Jagt, co-founder of PixElement, is one of many entrepreneurs who recognize serious potential in cloud computing. He founded his company around 2.5 years ago, and it specializes in Web-based photogrammetry software. The initial goal of the Columbus, Ohio-based business was to speed up the processing of imagery. “We said it’s very slow. You have to buy a really expensive computer to process it quickly and a lot of people don’t have those resources,” Vander Jagt says. “So we thought, if we can make it fast, we’re going to be successful because it’s not really fast.”
That is how he started the company, but he quickly learned that the problem he thought he was solving wasn’t as pressing as two others: the storage and dissemination of data. He points out that if, for example, a photogrammetrist collects 500 images, they’re dealing with gigabytes’ worth of data. When it comes time to deliver the data to a customer, they often put it on a flash drive or hard drive and mail it out.
“That’s how they do things still,” Vander Jagt says. “What we found was that we could not only be a place where they could store their data and come back to it whenever they want, wherever they want, but they could also share it with the end customer and the end customer can interact with it and view it directly.”
This concept of 2D and 3D imagery being managed with a Web-based platform was the focus of a presentation Vander Jagt led at the 2017 ASPRS Imaging and Geospatial Technology Forum (IGTF) in Baltimore, “Cloud Photogrammetry: Challenges and Opportunities.” GeoDataPoint recently interviewed him and two other photogrammetric software specialists on what the age of cloud computing means for photogrammetric data management.
How it Works
Vander Jagt says PixElement requires users to log into its website and upload their images. Once the automated processing of the imagery is complete, they receive an email message letting them know the results are ready. They can send the data to others, share it, annotate it or save it. The PixElement Web interface also includes tools, such as ground control point editing, for refining the accuracy of the data.
If the user is a surveyor, Vander Jagt says, there is a chance they will want to download the data locally and work with it further. They might want to classify the point clouds, or they might want to drape other data they have on top of the orthoimage. So sometimes they download the data directly; other times, they simply share the data with their end customer.
“The storage is free right now,” Vander Jagt says. “We don’t charge based off of features. … It’s only based on the amount of data going in. So people that use 100 gigabytes of data pay more than those who use 5 gigabytes of data.”
He says customers often have a lot of work they can’t keep up with and want to push the processing step elsewhere so they can free up their in-house resources to do other things.
“If you’re processing imagery, it’s very intensive and it can basically take up all of your RAM on your workstation and make it very small, and essentially you’re unable to use it while you’re processing data,” he says. “So they can’t do anything else because all of their bandwidth is taken up while it’s processing. So now you’ve got to go out and buy another workstation. We haven’t even talked about what you do once the data’s processed. How do you get it to the end customer? So that’s the other part of it.”
Vander Jagt’s basic point here, which his IGTF presentation outline highlights, is that photogrammetry has relied heavily on desktop-based software solutions in the past. Cloud computing offers an alternative to the often costly investments in licenses and hardware that surveyors and geospatial professionals have to make.
Konrad Wenzel, CEO and founder of nFrames GmbH, a specialist in photogrammetry software and workflow consulting in Stuttgart, Germany, says his clients fit into the tradition of using desktop PCs and workstations to begin production. As they scale to a few hundred images per week, workstations are used in combination with RAID storage units. When scaling to a few thousand large-frame aerial images per week, on-premise server infrastructures with SAN storage are typically utilized. Customers have also implemented cluster environments with a throughput of a few thousand large-frame images with more than 200 megapixels each per day, he says.
From what Wenzel can tell, cloud infrastructure, in a local and global manner, is becoming more important for nFrames clients with large projects. “Cloud computing is the future for photogrammetric data production, storage, backup and client delivery,” he says. “Particularly the lower cost on hardware maintenance, but also the elasticity and scalability are key requirements for the rapidly growing market. Besides the amount of applications, resolution and dataset sizes also increase. Furthermore, the capturing frequency becomes higher with the requirement of short or instant worldwide data delivery. These requirements can only be met with private or public cloud offerings.”
According to Philippe Simard, president of SimActive Inc., a photogrammetric software specialist based in Montreal, Canada, the ability to scale processing power as needed is the biggest benefit that the cloud brings to photogrammetry. The other key impact he recognizes is that it allows users to instantly add or remove virtual machines, hence adapting their processing power to production demand. A growing number of SimActive users are subscribing to services like Amazon Web Services (AWS) or Microsoft Azure to eliminate the management of physical PCs.
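The elasticity Simard describes, adding or removing virtual machines to match production demand, can be sketched as a simple scaling rule. The sketch below is purely illustrative: the job sizes, per-VM throughput and fleet cap are my assumptions, not SimActive or AWS figures.

```python
def vms_needed(queued_images: int, images_per_vm_per_hour: int,
               deadline_hours: float, max_vms: int = 20) -> int:
    """Decide how many cloud VMs to run so the queued imagery finishes
    within the deadline. All capacity figures are hypothetical."""
    if queued_images == 0:
        return 0  # nothing queued: scale the fleet to zero
    capacity_per_vm = int(images_per_vm_per_hour * deadline_hours)
    # Ceiling division: round up so the last partial batch still gets a VM.
    needed = -(-queued_images // capacity_per_vm)
    return min(needed, max_vms)  # respect a budget/quota cap

# 5,000 images queued, 200 images/hour per VM, 8-hour deadline:
# vms_needed(5000, 200, 8) -> 4 VMs
```

A real deployment would then call the provider’s API to match the fleet to this count (for AWS, boto3’s EC2 `run_instances` and `terminate_instances`); the point is that the count can change hour to hour, which physical workstations cannot.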
From a pro standpoint, Vander Jagt says cloud computing makes it possible to access data anywhere. “You don’t have to worry about storing it, mailing it, any of that. You can visit it anywhere in the world, you can share it with anyone in the world, you can edit projects in real time.”
That said, the key benefit of cloud photogrammetry, that it is Web-based, is also its key limitation: it assumes Web access. “The pro is that it’s Web-based and the con is that it’s Web-based,” Vander Jagt explains.
If a geospatial professional is out in the field somewhere where an Internet connection is not available, cloud software like PixElement, which is only accessible with a connection, cannot be used to upload, view, share or save imagery. In this scenario, the user would have to collect the imagery and return to their office or somewhere else with an Internet connection to kick off the processing phase. In this age, many consider that a hassle and would like to be able to process data onsite, however remote it may be.
“That’s something we’re working on because we want to make sure people are comfortable that they’ve got all the data they need to collect at the site,” Vander Jagt says. “We’re actually in the process of writing a simple desktop piece of software just for that application somewhere out in the field. … That’s definitely a drawback. There are still certain tools and capabilities that the Web browser or Web-based software might not have yet. One of those, for example, might be 3D stereo extraction where you’re actually viewing the images in full 3D with stereo glasses. To do that, you need to be alternating images on and off very quickly.”
Wenzel says there are three limitations to cloud usage at this point in time:
- Transfer speed: “The dataset volumes in daily gigabyte and terabyte throughput cannot yet be handled with common publicly-available Internet connection bandwidth. The permanent storage of input and output data is a requirement for data processing (process where the data is).”
- Costs: “Photogrammetry and the large data volumes of high-resolution images with high dynamic bandwidth require large amounts of storage, which represents a key cost when using uncompressed professional data. However, storage costs are constantly decreasing and thus reducing the entry barrier.”
- Privacy: “Particularly governments and related service providers are obligated to protect sensitive mapping data. Many governments worldwide, for example, do not allow the storage of aerial imagery outside the country borders. This requires technical measures as well as education of all involved parties.”
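Wenzel’s transfer-speed point is easy to quantify. With illustrative numbers (a one-terabyte daily output and a 100 Mbit/s uplink, both my assumptions rather than nFrames figures), the upload alone takes longer than the day that produced the data:

```python
def upload_hours(data_gb: float, uplink_mbps: float,
                 efficiency: float = 0.8) -> float:
    """Hours to push `data_gb` gigabytes over an `uplink_mbps` connection.
    `efficiency` discounts protocol overhead (an assumed factor)."""
    bits = data_gb * 8e9                      # decimal gigabytes -> bits
    effective_bps = uplink_mbps * 1e6 * efficiency
    return bits / effective_bps / 3600        # seconds -> hours

# 1 TB of imagery over a 100 Mbit/s line:
# upload_hours(1000, 100) -> about 27.8 hours
```

This is why Wenzel’s parenthetical, “process where the data is,” matters: until bandwidth catches up with dataset sizes, moving the computation to the data is often cheaper than moving the data to the computation.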
Simard agrees that cloud computing raises security concerns, at least in the form of commercial solutions as opposed to locally managed ones. Vander Jagt sees this too; he considers it one of the most common misconceptions surrounding cloud data management, because potential customers often assume they would simply be uploading imagery to the open Web for everyone to see.
“What I try and tell them is it’s a lot easier to pick a lock at someone’s office to get data than it is to not only break through Amazon’s security measures but then our own as well. It’s our duty to inform people otherwise,” Vander Jagt says. “Security we take very seriously. We want to provide software that is basically identical in features to something you’d traditionally install on your desktop. The features are the same, but we provide so much more data storage.”
Selecting a Cloud Software Provider
When searching for a cloud computing solution provider for photogrammetry, geospatial professionals should evaluate the actual performance of the computing node and the storage access speed; Wenzel says the figures in technical specifications are often not comparable. Costs for storage should also be considered, not just the costs for computing time. He says on-premise computing should be considered instead of cloud computing if: a) general storage and backup need to remain on premise, b) the current limitations of data transfer to the cloud are still a barrier, or c) the costs of cloud usage are higher in the long term.
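Wenzel’s point about long-term cost comes down to a break-even comparison between owning hardware and renting capacity. The sketch below shows the shape of that calculation; all the prices in it are hypothetical, not figures from nFrames or any provider.

```python
def breakeven_months(workstation_cost: float, monthly_maintenance: float,
                     cloud_monthly_cost: float) -> float:
    """Months after which owning hardware becomes cheaper than renting
    equivalent cloud capacity. All inputs are illustrative assumptions."""
    monthly_saving = cloud_monthly_cost - monthly_maintenance
    if monthly_saving <= 0:
        return float("inf")  # cloud stays cheaper indefinitely
    return workstation_cost / monthly_saving

# e.g. a $12,000 workstation with $100/month upkeep
# vs. $600/month of equivalent cloud time:
# breakeven_months(12000, 100, 600) -> 24.0 months
```

A shop with steady, year-round processing volume crosses that break-even point and may prefer on-premise hardware; one with bursty, seasonal demand may never cross it, which is the elasticity argument in cost terms.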
More generally speaking, Vander Jagt encourages geospatial professionals to scout and test before making a decision. It is important to see what different options are available, and not to settle on the first thing they see. In addition to looking into cost, he says trying out a data platform with different providers to see what is easiest to use is always a good idea.
The Future of Photogrammetry and the Cloud
While there are some very established companies in this space, Vander Jagt still considers the industry relatively immature. He thinks of Google Earth as laying the foundation for a lot of what his company does and says he is a big fan of open source data. He realizes that there are big open source communities in existence already and appreciates that they make it easier for developers like himself to access and exchange data without the need for some sort of software tool.
“This industry is growing really rapidly, but in my opinion it hasn’t reached its full potential. Going forward, obviously our focus is a cloud-based mapping solution, so we want a solution that works for your phone … We want, essentially, our platform to be sensor agnostic,” Vander Jagt says.
The cloud will enable the scaling that the strong market growth requires from the photogrammetry and mapping industry, Wenzel says. In addition to short-term availability and scaling to large data volumes, the instant worldwide availability of results for quick customer delivery is expected to play a key role. “For the industry itself, the lower entry risk of on-demand infrastructure will also enable small and medium-sized companies to enter large-scale data production, pushing innovation into existing and new verticals,” Wenzel says.