IoT
April 10, 2019

When Every Millisecond Matters in IoT

One of the big promises of IoT is understanding the physical world around us and taking action based on insights and observations. Often, every millisecond counts, especially for use cases like earthquake monitoring. For these mission-critical scenarios, we need to build real-time networks that can keep up.

One of the big promises of the Internet of Things (IoT) is understanding the physical world around us and taking action based on insights and observations.

Over the last decade, we’ve gotten really good at the first part, using smart devices and sensors for monitoring and data collection. We have sensors everywhere, in consumer products, on the floor and embedded in manufacturing and industry, distributed across nature and remote areas of the world—always on and always streaming new readings as they happen. This has transformed our understanding of how we work and live because we have more up-to-the-second data and analysis than ever before.

The next area ripe for innovation is what we do with that data. We’re beginning to see more artificial intelligence (AI) and machine learning (ML) implementations in IoT that process massive data sets and derive insights from them. This has allowed businesses across a wide variety of verticals to increase efficiency, make better decisions and predict future performance more accurately. And alongside those leaps in processing and analysis, the underlying data infrastructure has matured as well: immense advances are being made in optimizing the machines and networks that deliver massive streams, or individual pieces, of data where they need to go.

That’s critical, since every second matters in the world of IoT. Indeed, every millisecond counts. Companies are taking advantage of this instantaneity to the benefit of all.

When Every Millisecond Matters

It’s a dreary, cozy Saturday morning and you’re enjoying a nice cup of coffee. You look outside as the rain runs down your window. What you don’t see is that hundreds of miles away, under the ocean floor, a rupturing fault is sending out its first wave, a quick-moving, non-harmful “P-wave.” It’s detected by a sensor managed by the United States Geological Survey (USGS). But the slower, more dangerous “S-waves” are next up.

However, the data has already been transmitted to the USGS where the location and size of the impending earthquake are determined. The data transmission beats the quake. And with that speed, apps like QuakeAlert from Early Warning Labs can calculate the time to shaking and intensity, and deliver an individual alert, providing precious seconds to keep subscribers safe.
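To make that concrete, here is a rough sketch of the warning-time math, assuming typical crustal wave speeds of about 6 km/s for P-waves and 3.5 km/s for S-waves and a made-up processing delay. The numbers are illustrative, not the actual model used by the USGS or Early Warning Labs.

```python
# Back-of-the-envelope sketch of the warning-time math described above.
# Wave speeds are rough crustal averages; the delay and distance are made up.
P_WAVE_SPEED_KM_S = 6.0   # fast, mostly harmless P-wave
S_WAVE_SPEED_KM_S = 3.5   # slower, damaging S-wave

def warning_time_seconds(epicentral_distance_km: float,
                         processing_delay_s: float = 2.0) -> float:
    """Seconds of warning a subscriber gets once the P-wave is detected near the source."""
    p_arrival = epicentral_distance_km / P_WAVE_SPEED_KM_S
    s_arrival = epicentral_distance_km / S_WAVE_SPEED_KM_S
    # The alert can go out shortly after the P-wave reaches a nearby sensor.
    return s_arrival - p_arrival - processing_delay_s

print(round(warning_time_seconds(300), 1))  # roughly 34 seconds of warning at 300 km
```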

Every millisecond counts when it could mean life or death. Earthquake alert systems are a real-life example of this already in action today.

Diving Into the Tech

Let’s look closer at what makes IoT use cases like this possible.

The Sensors

It starts with a network of seismic sensors, which detect the P-waves and provide a wealth of information that can be used to calculate the size and location of the damaging earthquake. The data is distributed in real time to every subscribed party: emergency response teams, infrastructure operators and everyday users who have the app installed.
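As a rough illustration, a single detection report can be a very small payload. The sketch below shows one possible shape for such a message; the field names and station ID are placeholders, not the actual USGS message format.

```python
# Illustrative shape of a single P-wave detection report; the field names are
# placeholders, not a real seismic-network schema.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class PWaveDetection:
    station_id: str        # which seismic sensor picked up the wave
    latitude: float
    longitude: float
    arrival_time: float    # Unix timestamp of the P-wave pick
    peak_amplitude: float  # used, with other picks, to estimate magnitude

detection = PWaveDetection("station-042", 34.17, -118.19, time.time(), 0.0043)
print(json.dumps(asdict(detection)))  # compact payload, ready to broadcast
```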

The Real-Time IoT Network

The next critical piece of technology is a real-time network: a super-fast, low-latency and reliable infrastructure that is optimized to broadcast small amounts of data to huge audiences of subscribers. This may include both the earthquake data itself and push notifications or alerts specified by the app developer. This is where every millisecond matters, so ensuring reliability at scale, even in unreliable environments, is mission critical. Whether you go with a hosted service or build it yourself, when selecting a real-time network you need to understand the underlying technology, real-time protocols and other indicators of scalability.
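To ground that in code, here is a minimal sketch of broadcasting one of those compact alert payloads over MQTT, a common IoT pub/sub protocol, using the open-source paho-mqtt client. The broker address, topic name and payload fields are placeholders, not a real deployment.

```python
# Minimal sketch: publish a small alert message that the broker fans out to
# every subscriber of the topic. Broker, topic and fields are illustrative.
import json
import time

import paho.mqtt.publish as publish  # open-source paho-mqtt client

BROKER_HOST = "broker.example.com"    # placeholder broker
ALERT_TOPIC = "quake/alerts/us-west"  # placeholder topic

def publish_alert(magnitude: float, lat: float, lon: float, origin_time: float) -> None:
    """Send one compact alert message to the broker for fan-out."""
    payload = json.dumps({
        "magnitude": magnitude,
        "epicenter": {"lat": lat, "lon": lon},
        "origin_time": origin_time,
    })
    # QoS 1 gives at-least-once delivery, a common trade-off for alerting.
    publish.single(ALERT_TOPIC, payload, qos=1, hostname=BROKER_HOST)

if __name__ == "__main__":
    publish_alert(6.2, 40.3, -124.9, time.time())
```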

The Application

Lastly, you need the application that connects your real-time IoT network to the deployed sensors, transmitting notifications and automating the response based on incoming data. The application is responsible for determining who needs to be alerted and where those alerts and notifications should be sent.
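A toy version of that routing decision might look like the sketch below: filter subscribers by distance from the epicenter and estimate how many seconds of warning each one would get. The subscriber list, felt radius and wave speed are illustrative assumptions.

```python
# Sketch of the "who and where" decision: alert only subscribers close enough
# to feel shaking, and tell them roughly how long they have.
import math

S_WAVE_SPEED_KM_S = 3.5
FELT_RADIUS_KM = 400.0  # assumed outer limit of noticeable shaking

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    rlat1, rlon1, rlat2, rlon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((rlat2 - rlat1) / 2) ** 2
         + math.cos(rlat1) * math.cos(rlat2) * math.sin((rlon2 - rlon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def select_recipients(epicenter, subscribers):
    """Return (subscriber_id, approx. seconds until shaking) for everyone worth alerting."""
    alerts = []
    for sub_id, lat, lon in subscribers:
        d = distance_km(epicenter[0], epicenter[1], lat, lon)
        if d <= FELT_RADIUS_KM:
            alerts.append((sub_id, d / S_WAVE_SPEED_KM_S))
    return alerts

subscribers = [("user-1", 37.77, -122.42), ("user-2", 45.52, -122.68)]
print(select_recipients((36.0, -121.3), subscribers))  # only user-1 gets an alert
```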

Edge Computing

Edge computing brings data processing as close to the source as possible (at the “edge”). Rather than sending data to be processed on external servers or at central data centers, which costs precious seconds and additional resources, the computation takes place on the device itself (the sensors, in this case) or within the network. From there, the processed data can be delivered to its destination sooner. Edge computing reduces the cascade of potential bandwidth bottlenecks and processes the data that matters, keeping it close to the source.
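As a simple illustration of that kind of on-device filtering, the sketch below runs a short-term/long-term average trigger so only samples that stand out from background noise would be sent upstream. The window sizes, threshold and synthetic data are illustrative, not a production seismic algorithm.

```python
# Sketch of edge filtering: a short-term/long-term average (STA/LTA) style
# trigger that only flags samples whose energy stands out from the background.
from collections import deque
import random

class EdgeTrigger:
    def __init__(self, short_window=50, long_window=1000, threshold=4.0):
        self.short = deque(maxlen=short_window)
        self.long = deque(maxlen=long_window)
        self.threshold = threshold

    def update(self, sample: float) -> bool:
        """Feed one ground-motion sample; return True when it looks like an event."""
        energy = sample * sample
        self.short.append(energy)
        self.long.append(energy)
        if len(self.long) < self.long.maxlen:
            return False  # still learning the background level
        sta = sum(self.short) / len(self.short)
        lta = sum(self.long) / len(self.long)
        return lta > 0 and sta / lta > self.threshold

if __name__ == "__main__":
    random.seed(0)
    trigger = EdgeTrigger()
    # Synthetic stream: 1,200 samples of noise followed by a burst of strong motion.
    samples = ([random.gauss(0, 1) for _ in range(1200)]
               + [random.gauss(0, 10) for _ in range(100)])
    first_hit = next((i for i, s in enumerate(samples) if trigger.update(s)), None)
    print("trigger fired at sample", first_hit)  # only then would data leave the device
```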

Looking Forward: Integrated Intelligence

The natural next move, as with every other industry, will be increased integration of cognitive services: machine learning and AI technologies that enhance the speed of the system and derive deeper insights from the data flowing through it.

Beyond data processing and analysis, cognitive services empower the devices themselves to communicate with one another more meaningfully. The promise is fully automated monitoring and response, free of human error. To be sure, relying solely on cognitive services from end to end carries risk. But it also brings a faster, more accurate and more robust way to build time-sensitive IoT applications.