Point of Beginning

Technology Transitions

April 1, 2006
Transits were standard instruments for angle measurement before the rise of the theodolite. Photo courtesy of Sokkia.


Total stations, such as this Leica Geosystems System 1200 SmartStation, have remained popular with surveyors since the first models were introduced in the 1970s. Photo by Jennifer S. Hall.
Surveying instrumentation for the local surveyor in the United States was pretty much standard at the end of the first half of the 20th century: steel tape for horizontal distance measurement, open frame (steel circle) transits for angle measurement, and wye and dumpy levels for vertical distance measurement. Fifty-six years after the midpoint of the 20th century, a lot has changed: only a very small percentage of surveyors report carrying steel tapes into the field. Today, it isn't at all unusual to find younger field people who haven't even used a steel tape. And practically nobody uses wye levels or transits anymore.

Before the 1950s, a few other technologies were in use for specialized operations, mainly conducted by government agencies. These included glass circle optical theodolites, photogrammetric cameras and scanners, and even technology to produce orthophotos. In 1947, Dr. Erik Bergstrand, the Swedish physicist involved in measuring the speed of light, conceived of using the principles of that experiment to measure distances. It is unlikely that anyone was actually using electronic distance measurement technology before then, though the theoretical concept was not unknown among knowledgeable surveyors.
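
For readers curious about the principle Bergstrand put to work, the sketch below shows, in simplified form, how a phase-comparison electronic distance meter turns a modulated light signal into a distance. It is a minimal illustration only; the modulation frequency, phase reading and cycle count are assumed values, and real instruments resolve the whole-cycle ambiguity by measuring on several modulation frequencies.

```python
# Simplified phase-comparison EDM distance calculation.
# The modulation frequency, phase fraction and whole-cycle count below are
# illustrative assumptions, not any particular instrument's values.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def edm_distance(mod_freq_hz, whole_cycles, phase_fraction, refractive_index=1.000280):
    """Distance from one phase-comparison measurement.

    The modulated light travels to the reflector and back, so the round
    trip covers (whole_cycles + phase_fraction) modulation wavelengths.
    """
    wavelength = C / (refractive_index * mod_freq_hz)  # one modulation cycle, m
    round_trip = (whole_cycles + phase_fraction) * wavelength
    return round_trip / 2.0

# Example: 10 MHz modulation, 53 whole cycles plus 0.42 of a cycle
print(f"{edm_distance(10e6, 53, 0.42):.3f} m")  # roughly 800.5 m
```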

For long and complicated computations with this measuring technology, several human computers performed the laborious calculations in parallel. If their compared results weren't identical at specific check points in the process, the computers were instructed either to start over from the previous check point or, depending on how many parallel computers were used, to carry the majority's result forward.

The methods of calculation varied depending on the organization, its financial resources and the complexity of the computations. The adding machine existed by the turn of the 20th century; soon after came mechanical calculators that could multiply and divide. By the 1950s, electronic and electro-mechanical computers did exist, but they were probably used only for computing large surveying projects such as a national geodetic adjustment. For the most part, surveyors relied on tables of logarithms and hand calculations, particularly in the field.

This Gurley light mountain transit of 1915 was typical of the angle measurement instruments used in the first half of the 20th century. Photo courtesy of Museum of Surveying.

Marked Changes

From 1950 onward, and particularly through the 1970s, '80s and '90s, the profession underwent tumultuous change. Changes were wrought in the technology for horizontal and vertical distance measurement and for angle measurement. Perhaps most significant are the new technologies that have been introduced to facilitate surveying in today's world, including various forms of satellite-based positioning, terrestrial and airborne laser scanning, and digital photogrammetry. One cannot talk about surveying without including mapping activities, where technology changes have been just as relevant: the introduction of software-based geographic information systems (GIS), high-speed electronic plotters, and various other software-based cartographic products. The changes in field mapping from the 1950s through the 1980s centered on a migration from plane table to stadia mapping, and eventually to specialized optical instruments referred to as "self-reducing" stadia instruments.

The introduction of electronic computing, especially once it became available as the minicomputer and eventually the microcomputer, greatly affected how surveys were planned and executed. In the 1970s, the introduction of the electronic handheld calculator had an even larger impact. Microprocessor-based technology eventually became the norm, resulting in miniaturization of hardware and field computing power that exceeded the dreams of a typical survey office of the 1970s. The electronic data collector gained a strong foothold in the 1980s; today that computing power and those human interfaces are built into modern surveying instruments themselves. Finally, the profession has seen the addition of various forms of wireless communication, including mobile phones and satellite phones, to support satellite technologies, total stations, real-time links and connectivity to the Internet.

The first-generation EDMs were large and bulky machines, such as this one by maker AGA. Photo courtesy of Trimble.

Angle Measurement

Many surveyors associate the term "instrument" with angle measuring instruments. Heinrich Wild is emblematic of a generation of instrument designers who innovated to create a lighter, more compact and more accurate replacement for the transit. While the transit was a simple instrument, it demanded a much higher workload to use and operate than the optical theodolites these designers had in mind.

The most significant of the changes was the glass circle. Because light could be passed through the graduated circle and read through fixed microscopes built into the instrument, surveyors were able to read the circles more comfortably and to subdivide the graduations with higher precision. Verniers were eliminated and replaced by microscales and micrometers, which on the highest grade instruments approached least counts of 0.1 arc-second.

The mechanical construction of the horizontal and vertical axes of these instruments was also refined, as were the telescopes (more compact, higher power and with shorter focusing distances). With all these changes, the instrument shrank in size and weight and hugely improved in performance. From the 1950s through the 1980s, innovation continued in the form of automatic vertical circle indexing; gravity-referenced indexing enabled much higher accuracy in vertical angle measurement.

As cheaper, lower-power, higher-performance microprocessors proliferated in the 1970s and '80s, it was only natural that they would find a home in theodolites. True digital theodolites were able to display circle readings directly, without requiring users to squint through a magnifier to read the steel circle transit's vernier or the optical theodolite's reading microscope. Even more radical was the provision of a port on the instrument so that an external device could be attached and angle measurement data could be recorded quickly, easily and without the transcription errors of the past.

The Topcon "Guppy" gave way to a new era of angle plus distance measuring technology. Photo courtesy of Topcon.

Distance Measurement

No single technology introduction has sent as many shock waves through survey technology as the electronic distance meter (EDM). It was the first real use of electronics in the field. It also changed the surveyor's workload in a fundamental way; by comparison, the progressions from link chain to steel tape and from steel-circle transit to optical theodolite were only incremental.

The first EDMs were gargantuan things, intended only for long-line control measurements. They were phenomenally accurate, but they used vacuum tube technology and weighed hundreds of pounds. The first EDMs did not use lasers but rather tungsten light; infrared EDMs came much later, after the technology first progressed through conventional visible laser and microwave designs. In addition to the Swedish company AGA, which bought Dr. Bergstrand's EDM invention and was the forerunner of the Geodimeter and later Spectra Precision brands, some of the early players included Tellurometer (South Africa), Cubic (San Diego, California), Precision International (Tullahoma, Tennessee) and, of course, Hewlett-Packard. Zeiss, Wild and Kern soon joined in, and in the 1970s they were hotly pursued by Topcon, Lietz (Sokkia), Pentax and Nikon.

The early days of the EDM are filled with record-breaking distance measurements. "The record was a 120 km base line measured in Hawaii," recalls Karl Ramström, retired senior vice president of Trimble's Engineering & Construction Division. He joined AGA in 1969 as an application engineer, testing new products and training new customers. "The technology was so new and so complicated, it took a three-day training course to properly show new customers how to use the EDM," he says.

The waves of change in EDMs alone outnumber all the changes in surveying technology in the century preceding 1950. The first AGA EDMs had to be used at night to achieve ranges of more than 400 m, and it took a great deal of data collection and post-survey calculation to actually arrive at a distance. By the time EDMs were being offered to the local surveyor in the 1970s, they had become compact, although they still needed a heavy external battery. The distance was displayed directly on the instrument, and it was even possible to program it to make the atmospheric refraction corrections. Finally, EDMs became light enough to mount on angle measuring instruments, and the concept of making all measurements with a group of instruments mounted on a single tripod was born. At the same time, manufacturers were either adding onboard capability to enter the vertical angle to reduce measurements to horizontal and vertical distances, or creating plug-in handheld devices that added the keyboard, display and additional computing power to support those calculations. Some instruments even added the capability to sense vertical angles; in a few seconds, in daylight, they automatically produced the corrected horizontal distance for the line.
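
The reductions mentioned above are straightforward trigonometry, which is exactly what those plug-in devices and onboard programs automated. Here is a minimal sketch with illustrative values; the function name and the ppm figure are assumptions, not any manufacturer's firmware.

```python
import math

def reduce_slope_distance(slope_dist_m, zenith_deg, ppm_correction=0.0):
    """Reduce an EDM slope distance to horizontal and vertical components.

    ppm_correction is the atmospheric correction in parts per million
    (from tables or, later, onboard sensors); the zenith angle is
    measured down from straight up, so a level sight is 90 degrees.
    """
    corrected = slope_dist_m * (1.0 + ppm_correction * 1e-6)
    z = math.radians(zenith_deg)
    horizontal = corrected * math.sin(z)
    vertical = corrected * math.cos(z)  # positive above the instrument
    return horizontal, vertical

# Example: 1250.000 m slope distance at a zenith angle of 87.5 degrees, +12 ppm
h, v = reduce_slope_distance(1250.000, 87.5, ppm_correction=12)
print(f"horizontal {h:.3f} m, vertical {v:+.3f} m")
```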

(Above) Leica Geosystems introduced the first digital level, the NA2000, in 1990. Photo courtesy of Leica Geosystems.

Total Stations

In the 1970s, Zeiss, AGA and Wild all produced devices, called "tacheometers" or "tachymeters," that looked similar to today's total stations. Some had onboard data recording, but the significant thing about them is that they integrated the latest angle measuring and distance measuring technology into a single instrument. The Hewlett-Packard 3820, introduced in 1978, established the term "total station" as the generic name for this type of instrument. As radical as this technology was, the acquisition price of $30,000 proved to be a barrier. That was true until Topcon introduced its GTS-1, known as the "Guppy," in 1979. Dominic Auletto, current vice president of business development for Topcon, started working at Topcon as a sales coordinator that same year. "We started delivering the Guppy in 1979, though sales didn't take off until 1980," he says. "Pretty soon we couldn't keep up; the reception of the market, as shown through sales, skyrocketed and far exceeded the forecasts." This upswing in sales happened in spite of a popular customer objection: "What if the EDM fails and I have to send it in for repair? I'd lose my theodolite too!" Today's surveyors don't let such a thought cross their minds when purchasing a total station.

The GTS-1 was introduced at a price of $7,750. Even though there was a difference in functionality and performance between the Guppy and the HP 3820 (the Guppy was an EDM integrated with an optical theodolite; the HP 3820 had electronic angle measurement), surveyors recognized that much of the backbreaking work of surveying, especially taping, was now relegated to a bygone era.

The significant development of the total station was that the surveyor, when doing an operation such as measuring a traverse, had only to point once at a target to get the complete vector from instrument station to the observed point. While much of the early total station technology was quite elementary in terms of the onboard computations, it didn't take long for subsequent models to add the ability to manually enter vertical angles on the instrument's keyboard to obtain horizontal and vertical distances. Once horizontal angles could be keyed in, coordinates could be calculated, and eventually a variety of stakeout functions were supported.
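
To see why a single pointing is so powerful, consider the calculation the total station (or its attached data collector) performs: from the station coordinates, an oriented horizontal direction, a zenith angle and a slope distance, the coordinates of the observed point follow directly. The sketch below is an illustration under simple assumptions (no curvature, refraction or scale corrections), not any particular instrument's routine.

```python
import math

def observed_point(station_n, station_e, station_elev,
                   azimuth_deg, zenith_deg, slope_dist,
                   inst_height=0.0, target_height=0.0):
    """Coordinates of a sighted point from a single total station pointing.

    One pointing supplies the whole vector: an oriented horizontal
    direction (azimuth), a zenith angle and a slope distance.
    """
    z = math.radians(zenith_deg)
    az = math.radians(azimuth_deg)
    horiz = slope_dist * math.sin(z)
    vert = slope_dist * math.cos(z)
    north = station_n + horiz * math.cos(az)
    east = station_e + horiz * math.sin(az)
    elev = station_elev + inst_height + vert - target_height
    return north, east, elev

# Example: a target 842.115 m away on azimuth 45 degrees, zenith angle 91.25 degrees
n, e, el = observed_point(5000.000, 5000.000, 100.000, 45.0, 91.25, 842.115,
                          inst_height=1.550, target_height=1.800)
print(f"N {n:.3f}  E {e:.3f}  elev {el:.3f}")
```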

The total station got its next boost when the electronic digital theodolite was combined with the EDM and all total stations possessed functionality similar to that of the HP 3820. Now, with the press of a button, the surveyor could bypass the raw measurements and even the reductions to horizontal and vertical components and directly obtain the coordinates of the observed point or, in the case of setting out, the differences in horizontal angle and distance needed to reach the desired point.
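
Setting out is essentially the same computation run in reverse: given a design coordinate, the instrument reports how far to turn and how far to measure. A minimal sketch under the same simplifying assumptions as above; the names and sign conventions are illustrative.

```python
import math

def stakeout_deltas(station_n, station_e, current_azimuth_deg, design_n, design_e):
    """Angle to turn and horizontal distance to lay off to reach a design point.

    This is the inverse of the coordinate calculation above: azimuths are
    reckoned from north, and the turn is reported clockwise.
    """
    dn = design_n - station_n
    de = design_e - station_e
    distance = math.hypot(dn, de)
    azimuth = math.degrees(math.atan2(de, dn)) % 360.0
    turn = (azimuth - current_azimuth_deg) % 360.0
    return turn, distance

turn, dist = stakeout_deltas(5000.000, 5000.000, 30.0, 5100.000, 5150.000)
print(f"turn {turn:.4f} degrees clockwise, measure {dist:.3f} m")
```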

Total stations went through a sort of psychological sag (in the minds of users and manufacturers) when GPS grew in popularity in the late 1980s. But total station technology has not only survived, it has been rejuvenated through several add-on innovations. The first was to motorize the horizontal and vertical motions so that the instrument could "reverse face" automatically; from there it soon progressed to aligning itself to the correct azimuth for setting out. The next innovation was target recognition and acquisition. With this, the instrument could be pointed in the general vicinity of the target (later, it could search on its own) and then use its motors to point accurately at the target. Finally, with the addition of a wireless (usually radio, but sometimes infrared) interface, it has become what the profession refers to as "robotic." An operator at the instrument is no longer needed once it has been set up. The remote display and keypad mounted on the prism pole are used to control the instrument, initiate measurements and store data. The instrument can also be configured to "follow" the prism so that when the rod is stable over the next point to be measured, pressing the "read" key on the remote keyboard immediately triggers a measurement. The data can then be reviewed and, if it looks good, recorded before the prism pole is moved to the next point.

In addition to the string of motorized and robotic innovations, total station EDMs now offer the option of operating in "reflectorless" mode. This helps with safety, since highway features can be measured without shutting down lanes and endangering field crew members' lives. It also aids the completeness of information; sometimes it is impossible for a rodperson to get to the desired point, such as an overhead line or a church steeple. And it helps with speed, as the rodperson's "walk time" is no longer a factor.

(Left) This cutaway view of the 1994 Zeiss DINI level shows its interior optics and electronics. Photo courtesy of Trimble.

Levels

The automatic level was received very slowly in the United States until the 1970s, when, through advanced manufacturing processes, its price steadily dropped and surveyors started to trust the new-fangled technology. Next to the EDM, this technology probably had the most significant impact in terms of rapidly putting older instruments (in this case wye and dumpy levels) into permanent storage.

While the level hasn't gone through many visible changes externally, refinements to the compensator designs that keep the line of sight horizontal have improved its ruggedness and reliability, and accuracy has gotten better as well.

Automatic levels received another boost quite recently. It began when Leica Geosystems introduced its digital level, the NA2000, in 1990. In many respects the NA2000 was an automatic level except that the observer no longer had to look through the telescope to read the level rod. The instrument had to be aligned with the level rod manually, but then the readings were taken automatically. And if the appropriate software and memory were on board, the notes and calculations for the level loop were done automatically, too.
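
The level loop bookkeeping that a digital level automates is the familiar backsight/foresight reduction. A minimal sketch follows; the rod readings are made up for illustration.

```python
def run_level_line(start_elev, shots):
    """Reduce a line of differential levels -- the note-keeping a digital
    level automates.  'shots' is a list of (backsight, foresight) rod
    readings, in meters, between successive turning points.
    """
    elev = start_elev
    for backsight, foresight in shots:
        height_of_instrument = elev + backsight
        elev = height_of_instrument - foresight
    return elev

# A loop that closes back on its starting benchmark should misclose near zero.
shots = [(1.402, 0.988), (1.276, 1.731), (0.865, 0.824)]
closing_elev = run_level_line(100.000, shots)
print(f"misclosure {closing_elev - 100.000:+.3f} m")
```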

Many surveyors erroneously think that total stations, and even GPS, can produce results comparable with good quality leveling performed with an automatic or digital level. This is not so. However, the digital level has brought to precise vertical distance measurement the same kind of workflow advances that the EDM brought to horizontal distance measurement.

The U.S. Navy's TRANSIT satellite system was a precursor for the paradigm-shifting Global Positioning System.

Satellite-based Positioning

The Global Positioning System (GPS) was the most significant paradigm shifter among the field surveying technologies introduced for the surveyor. Some limited production work for applications such as seismic surveying was done with the U.S. Navy's TRANSIT satellite system even before GPS. But whereas TRANSIT produced survey-quality positions on the order of ±10 m using a method known as translocation, GPS offers carrier phase-based positioning accuracy on the order of centimeters.

TRANSIT was developed for the navigational needs of missile submarines; GPS was the next step, intended to support all military forces. Clever scientists quickly figured out how to use two receivers, one fixed, to do various kinds of differential positioning that far improved on the system's original civilian-accessible accuracy of ±100 m. "Standard differential" produced accuracies in the ±10-20 m range. By tracking the carrier phase, receivers could achieve centimeter- and even millimeter-level positions. These latter instruments quickly caught the fancy of surveyors.
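
A rough way to see why tracking the carrier makes centimeter-level work possible: the GPS carrier wavelengths are only a couple of decimeters, and a receiver can resolve a small fraction of a cycle. The frequencies below are the published GPS L1 and L2 values; the one-percent phase resolution figure is an illustrative assumption.

```python
C = 299_792_458.0  # speed of light, m/s

# Published GPS carrier frequencies.
carriers_hz = {"L1": 1575.42e6, "L2": 1227.60e6}

for name, freq in carriers_hz.items():
    wavelength_m = C / freq
    # Assume the receiver resolves about 1% of a carrier cycle (illustrative).
    print(f"{name}: wavelength {wavelength_m * 100:.1f} cm, "
          f"~{wavelength_m * 0.01 * 1000:.1f} mm phase resolution")
```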

Just as with the early EDMs, early centimeter-level GPS receivers were awkward and expensive. Macrometer (Macrometrics Inc.) and Texas Instruments started with receivers in the half-million-dollar range, though prices quickly plummeted, especially as new players such as Trimble and a joint venture between Wild and Magnavox entered the fray. The earliest applications of GPS, just as with EDMs, were long-line control surveys (tens to hundreds of miles long). Each receiver consumed a prodigious amount of power, and the data collection session for a single base line lasted four to eight hours. Then it took the "modern" AT computers of the day (the third generation of microcomputers) about the same amount of time to process the data.

Very quickly, GPS receivers became smaller (while adding more channels to track more satellites), lighter, faster to use and much less expensive. The most significant change for surveyors was that with GPS it was not necessary to worry about line of sight between the points being measured. But the technology added new obstacles: maintaining line of sight from the receiver to the satellites, understanding and accounting for the Earth's shape, and understanding the geoidal undulation. Short base line work was adequately handled using single-frequency (L1) receivers, but long base lines required dual-frequency receivers.
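
The geoidal undulation enters the workflow through one simple relationship: GPS delivers heights above the ellipsoid, while surveyors want heights above the geoid (roughly, sea level). A minimal sketch, with an assumed undulation value standing in for a geoid model lookup:

```python
def orthometric_height(ellipsoidal_height_m, geoid_undulation_m):
    """H = h - N: height above the geoid from the GPS-derived height
    above the ellipsoid (h) and the geoid undulation (N), which in
    practice comes from a geoid model.  The values below are illustrative."""
    return ellipsoidal_height_m - geoid_undulation_m

# Example: GPS reports h = 215.40 m where the geoid lies 29.85 m below the ellipsoid
print(f"{orthometric_height(215.40, -29.85):.2f} m above the geoid")  # 245.25 m
```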

The survey techniques also rapidly evolved from long continuous occupations, suited for long-line control work, to shorter occupations (about an hour, and eventually much less) that made the technology suitable for shorter lines, project traverses and project control. Kinematic surveying was introduced, allowing centimeter-level measurement of lines and areas; linear features such as the edges of streams and the centerlines of roads could now be mapped at the centimeter level. This capability also gave surveyors the ability to do topographic surveys with GPS.

The final step in the evolution of GPS was real-time kinematic (RTK) GPS surveying, introduced by Trimble in 1992, in which, through the addition of a wireless link, correction data could be transmitted from the base station to one or more rovers. Now each roving receiver, as long as the telemetry link was up, could determine centimeter-level positions in real time. The dawn of using GPS for setting out had arrived.
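
Real RTK operates on carrier-phase observables and resolves integer ambiguities on the fly, but the differential idea behind it can be caricatured in a few lines: the base station knows where it is, measures where GPS says it is, and broadcasts the difference for the rovers to apply. Everything below, names and numbers alike, is illustrative.

```python
def base_corrections(known_pos, measured_pos):
    """Difference between the base station's known and measured positions --
    a caricature of the correction data an RTK base broadcasts (real RTK
    transmits carrier-phase observables, not simple coordinate shifts)."""
    return tuple(k - m for k, m in zip(known_pos, measured_pos))

def apply_corrections(rover_measured, corrections):
    """The rover applies the base's corrections, canceling errors common to
    both receivers (satellite clocks, orbits, most of the atmosphere)."""
    return tuple(r + c for r, c in zip(rover_measured, corrections))

# Illustrative coordinates (northing, easting, height), in meters.
base_known     = (1000.000, 2000.000, 50.000)
base_measured  = (1000.812, 1999.341, 51.204)
rover_measured = (1420.799, 2310.654, 48.310)

corrected = apply_corrections(rover_measured, base_corrections(base_known, base_measured))
print(tuple(round(c, 3) for c in corrected))
```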

When GPS for surveying was introduced in 1983, the early manufacturers of receivers were hard-pressed to find many customers outside of government circles, not only because the equipment was expensive and required a lot of training to use but also because the software available to calculate the results wasn't very user-friendly. The results often related to the ellipsoid, and most ordinary surveyors really wanted results they could use in the "plane" surveying environment. Geodesy was coming at these surveyors at an accelerated pace, and they were back-pedaling furiously. Luckily, scientists and manufacturers did try to understand the needs of local surveyors and produced software that attempted to meet them. Ron Hatch, now director of navigation systems at NavCom Technology Inc., was a software engineer at Magnavox in the 1980s trying to understand how the arcane language of GPS results could best be presented to surveyors who had their own special language. "In the early 1980s I was led into an investigation of multipath by the Applied Physics Lab (APL) of Johns Hopkins University," he remembers. "I wrote a paper on carrier smoothing to get rid of multipath, and that led to a development contract between Magnavox and Texas Instruments to produce a suite of survey software for them." The work of Hatch and his professional colleagues worldwide led to spirited discussions and a continuous stream of improvements to the functionality and operational issues connected with using GPS for surveying.
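
The carrier smoothing Hatch mentions is now widely known as the Hatch filter: the noisy code pseudorange is blended with the much smoother epoch-to-epoch change in the carrier-phase range. The sketch below is a minimal illustration with made-up data; a real implementation must also detect carrier cycle slips and restart the filter when they occur.

```python
def carrier_smooth(code_ranges, carrier_ranges, max_window=100):
    """Carrier-smoothed pseudoranges in the spirit of the Hatch filter.

    Each epoch, the noisy code pseudorange is blended with the previous
    smoothed value propagated forward by the change in the carrier-phase
    range; the weight on the code measurement shrinks as the window grows.
    """
    smoothed = [code_ranges[0]]
    for k in range(1, len(code_ranges)):
        n = min(k + 1, max_window)
        predicted = smoothed[-1] + (carrier_ranges[k] - carrier_ranges[k - 1])
        smoothed.append(code_ranges[k] / n + predicted * (n - 1) / n)
    return smoothed

# A range changing by 0.5 m per epoch, with meter-level noise on the code.
carrier = [20000.0 + 0.5 * k for k in range(5)]
code    = [c + noise for c, noise in zip(carrier, [0.8, -1.1, 0.4, -0.6, 0.9])]
print([round(x, 2) for x in carrier_smooth(code, carrier)])
```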

Software and hardware developments have seen surveying technology "stretch" beyond its traditional boundaries. An exemplary application is construction measurement. Mostly GPS-based technology, although often aided by robotic or optical laser technology, is being used to control the cutting surfaces of earthmoving equipment in real time. Where such technology is used, an entire phase of the surveying operation in construction, the measuring and setting of control points, stakes and other information, has been almost totally eliminated. These changes have forced professional surveyors to look at new services to provide as this field-intensive work diminishes.

GPS hasn't been without its competitors. The Soviet Union developed and launched GLONASS (GLObal NAvigation Satellite System) along the same lines, and at roughly the same time, as GPS. In 1996, Ashtech introduced the first commercial survey receiver that could use the positioning technology of both GPS and GLONASS. However, GLONASS has not been all that well funded in the past, partly due to the breakup of the Soviet Union and partly due to the priorities of its current owner, the Russian Federation; as a result, the system has never had its full complement of 24 satellites in orbit. To surveyors who need true 24/7 positioning, such as those who work in real or artificial canyons that limit the antenna's view of the sky, every little bit of positioning assistance can theoretically help. This is the idea behind the combined receiver. Their potential advantages notwithstanding, combined GPS/GLONASS receivers have not been at the forefront of satellite positioning receiver sales, but they do have strong adherents who swear by them. And with recent launches, GLONASS is at its highest number of satellites (13), more than half the operational complement of 24. With promises by Russia to put up more, some with Indian help, observers have again become guardedly optimistic about the system.

The European Union's Galileo system launched its first test satellite last December and promises functionality similar to that of GPS and GLONASS. Most GPS manufacturers now describe their business as Global Navigation Satellite System (GNSS) technology to reflect the new opportunities afforded by all three systems; these offerings are sure to improve and enhance surveying activities.

Laser Scanners

During the latter half of the 20th century, probably the most prolific set of electro-optical developments has centered on the laser. Applications for this technology range from industrial cutting and welding to communications to non-contact measurement. Thus, it was inevitable that scientists and engineers began experimenting with ways to harness lasers to make integrated measurements that determine the three-dimensional coordinates of hundreds of thousands, and even millions, of points on objects and scenes of interest. The commercialization of laser scanners as a new technology for surveyors and mappers is well advanced now, though it has scarcely reached the end point of its potential development: the products have been available for barely 10 years.

In surveying, laser scanner technology has moved in two directions: airborne laser scanning and tripod-based scanning. In airborne scanning, the technology performs very successfully as a vehicle for preliminary surveys before more detailed photogrammetric work is done, and also in a newer role that puts aircraft and survey technology together for primary data collection in monitoring, security, natural resource mapping and change detection. In the tripod-mounted world, initial receptivity was high in industrial settings such as chemical, petroleum and paper plants, tank farms, and fields of piping and valves. However, the applications are constantly widening to include building and other structural (such as bridge) assessment, construction monitoring, and accurate measurement of existing facilities that are to be renovated or added to. Just as with GPS, laser scanning is likely to be a tool that expands the measurement professional's abilities, adding new business areas while not replacing any of the existing tools.

A Dynamic Pace of Change

This review of the last few decades should convince surveyors that the fields of surveying and mapping are not stagnant, at least with respect to the technology used to accomplish the work. Understanding the changes forced by the new technologies, and preparing for innovations that may only be on the distant horizon, are important to today's practice, considerations that were just not relevant in 1900.