This series of columns started with a discussion of basic factors to consider when setting up and operating a deformation monitoring network, including epochs, network adjustment versus deformation analysis, and why we need to think carefully about adjustment procedures (Adjustment and Analysis of Deformation Monitoring Networks, POB June 2016). I followed that with a look at network simulation and at balancing accuracy against redundancy (More on Deformation Monitoring Networks, POB August 2016). After simulation and testing, we arrive at a much more accurate and efficient design, one that will require less adjustment of physical sensors and monitored points. Now it’s time to connect the network.

Modern asset monitoring depends on multiple sensors communicating with each other via processing, adjustment and alerting software. In other words, most modern monitoring applications depend on monitoring networks. So each sensor in a network has to be able to communicate — that’s what makes it a network.

When I’m consulting on network design, I’m sometimes asked about the “best” way to implement network communications. And the answer is, of course, “It depends.” Several factors have to be considered:

  • How many sensors are there in your network? As a rule of thumb, more sensors in a network means more data is being produced, and large amounts of data can rule out some communication methods.
  • How far apart are your sensors? Physically wired networks are reliable and fast, but there are practical limits on the amount of wire that can be run between sensors.
  • How data-intensive is your network? The quantity of sensors is only one factor affecting the amount of data generated during monitoring. Sensor type matters too. Total stations, for example, will almost always generate less data than GNSS receivers. I’m starting to see laser scanners used in monitoring, and they can easily generate huge amounts of data. Measurement frequency is also a factor. I have set up networks that use just a couple of GNSS receivers to monitor bridge movement dynamically, and those receivers were set to log 50 positions per second. That’s a lot of data; see the rough estimate after this list.
  • Are sensors visible to each other? This matters if radio frequency (RF) communication is being considered.
  • What is cell service like in the area being monitored?
  • How critical is continuous monitoring? Can you afford some unpredictable downtime or are there factors, like safety, that require 24/7 monitoring?
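
To make the data question concrete, here is a back-of-the-envelope sketch in Python. The bytes-per-epoch figure is an assumption on my part; the real number depends on the message format your receivers produce (NMEA, RTCM, raw observables and so on), so substitute your own.

    # Rough data-volume estimate for a monitoring network.
    # BYTES_PER_EPOCH is an assumed average record size; the actual
    # figure depends on the message format the receivers output.
    BYTES_PER_EPOCH = 100
    SECONDS_PER_DAY = 86_400

    def daily_megabytes(receivers, rate_hz, bytes_per_epoch=BYTES_PER_EPOCH):
        """Approximate megabytes per day from `receivers` GNSS units
        each logging `rate_hz` epochs per second."""
        return receivers * rate_hz * bytes_per_epoch * SECONDS_PER_DAY / 1e6

    # Two receivers at 50 Hz, as in the bridge example above:
    print(f"{daily_megabytes(2, 50):,.0f} MB/day")  # roughly 864 MB/day

Nearly a gigabyte a day from just two sensors is the kind of number that rules out low-bandwidth communication methods early in the design process.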

With these factors in mind, let’s take a closer look at the three most common communication methods.

Physically Wired

Running physical wire between sensors, and to the computer or server running your processing solution, can be the cheapest and most reliable way to provide network communications. Physically wired networks also allow large amounts of data to be moved. But wired networks aren’t especially common, because many conditions rule them out.

For serial/nine-pin systems, for example, the maximum distance without substantial signal loss is about 300 feet between sensors. Ethernet is a bit better: twisted-pair runs are limited to about 100 meters (328 feet) per segment, though switches can extend that. And wires will usually need some sort of support infrastructure, like conduit, which can make installation costs prohibitive.

Still, physical wire is often exactly what’s needed. I worked on a project in Tennessee where very steep cliff faces were being monitored for safety reasons; the need for reliable monitoring during work hours ruled out use of the region’s cellular services. A hardwired network was installed and worked very satisfactorily.

And if you are on the cutting edge of monitoring, working with laser scanners, you’re probably going to need a hardwired network simply to manage the massive amounts of data produced.

Radio Frequency Communication

Because network operators can choose, set up and control the equipment needed, RF-based networks can be extremely reliable, and they can be cost-effective. Radio networks typically cost more to set up than cellular (and hardwired) networks, but there is no ongoing cellular plan to pay for.

There are some potential downsides. Radios used for network communications usually operate in the 900 MHz band, which is set aside for short-range, low-power use. This means you don’t have to worry about interference from most voice radios, but on some busy job sites there can be interference from other contractors, such as surveyors using RTK rovers.

Radio networks at these frequencies need line-of-sight between sensors. There’s some wiggle room on this, but if a building or a hill lies between two sensors, you’ll need a repeater to make that connection. A repeater receives the signal from a transmitter and retransmits it to another receiver, allowing the signal to cover a greater distance or overcome obstacles.
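
It’s worth knowing that “line-of-sight” in the radio sense means more than being able to see the other antenna: the link also needs clearance around the direct path, known as the first Fresnel zone, and a common rule of thumb is to keep at least 60 percent of that zone free of obstructions. A minimal sketch of the standard formula:

    import math

    def fresnel_radius_m(d1_m, d2_m, freq_hz):
        """First Fresnel zone radius (meters) at a point d1_m from
        one antenna and d2_m from the other."""
        wavelength = 299_792_458 / freq_hz  # speed of light / frequency
        return math.sqrt(wavelength * d1_m * d2_m / (d1_m + d2_m))

    # Clearance needed at the midpoint of a 1 km link at 900 MHz:
    print(f"{fresnel_radius_m(500, 500, 900e6):.1f} m")  # about 9.1 m

So on a kilometer-long 900 MHz link, an obstruction nine meters below the visual sightline at midpoint can still degrade the signal, which is why a site that “looks clear” may still need a repeater.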

There is a limit to how much data a radio link can carry. It’s hard to be precise, but I’ve noticed that five continuously operating GNSS receivers are about the limit for a single master radio. Fortunately, more “master radios” can be incorporated into a network as needed, and all it costs is money.
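
That “about five” observation is consistent with a simple capacity check. Both constants below are assumptions (usable throughput and per-receiver stream rates vary widely by radio model, message format and logging rate), but the arithmetic shows how to run the check for your own equipment:

    # Rough capacity check for a single master radio. Both constants
    # are assumptions; substitute your radio's rated throughput and
    # your receivers' actual stream rates.
    RADIO_THROUGHPUT_KBPS = 115   # assumed usable link rate
    PER_RECEIVER_KBPS = 20        # assumed GNSS data stream per unit

    def receivers_per_master(utilization=0.8):
        """Receivers one master radio can carry, keeping 20 percent
        headroom for retries and protocol overhead."""
        usable = RADIO_THROUGHPUT_KBPS * utilization
        return int(usable // PER_RECEIVER_KBPS)

    print(receivers_per_master())  # 4 with these assumptions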

Basically, even the weaknesses of radio can be addressed in a well-designed system. It makes sense that radio is one of the most commonly used communication tools in monitoring networks.

Cellular

If a reliable, high-speed cellular provider is available where the monitoring network will be deployed, then cellular modems are the right choice for most networks.

“Reliable” is a relative term; if you are monitoring for rockslides above a construction zone, it may be that no cellular provider is going to be reliable enough for your purposes. But if a “dropped call” is more likely to be merely inconvenient, then most services will work. In fact, I’ve noticed a substantial, across-the-board improvement in cellular uptime over the last couple of years.

“High-speed” is easier to define. Basically, 3G is not quite fast enough for some monitoring applications; 4G LTE networks usually are.

There is one more confounding factor I’ll mention, even though very few readers will encounter it. In my work setting up vertical alignment monitoring systems for super-tall construction sites, such as 432 Park Avenue in New York City, I’ve noticed that cellular connectivity tends to degrade on the upper stories, perhaps due to the lower placement of cellular infrastructure. So, if you happen to be working on the tallest building in your city or region, it’s a good idea to have a backup plan ready as you near project completion.

All of the above may become obsolete over the next decade or two as private Wi-Fi networks become more robust and cost-effective. This kind of connectivity is already being implemented in underground mining, and the only real downside is expense. So I expect it to become more widely adopted as setup costs come down.

But for now there are at least three good communication options to consider, and one or more of them will be right for your project.