With IoT now being deployed at scale, the management burden has increased significantly, with a magnifying effect on errors that occur across large IoT device estates. Efficient management of IoT deployments in the hyperscale era requires a toolset capable of supporting a variety of devices and connectivity standards. Nick Gyles, the chief product officer at Daizy, tells George Malim, the managing editor of IoT Now, that standardised, intuitive-to-use IoT management tools are essential for IoT success.
George Malim: Why is it important to ensure broad interoperability across the IoT architecture?
Nick Gyles: There are two aspects to this: one involves equipment and devices, and the other addresses the connectivity mechanism. For devices, we’re seeing commoditisation and maturing of hardware. Five years ago, IoT devices were typically proprietary, but now large manufacturers are creating highly sophisticated sensors at scale. While this commoditisation has lowered the cost of delivering IoT solutions, device manufacturers are interpreting IoT standards in different ways, right down to the way payloads are constructed. Although solution providers are keen to take advantage of the latest technology, they are often hamstrung by a legacy approach of ‘hardwiring’ their applications to a specific brand of device, and adding new device types can mean rewriting whole swathes of underlying code.
From a connectivity perspective, we’re seeing increasing demand for solutions that support multiple connectivity standards. LoRaWAN, for example, is great for large buildings or scenarios with high levels of sensor density, while NB-IoT and LTE-M offer flexibility for dispersed sensor deployments or for mobility use cases. Increasingly we’re seeing deployments that require both LoRaWAN and NB-IoT. Monitoring for damp and mould in social housing is a great example of a need to deliver a low-cost multi-connectivity solution across a diverse range of premises – from blocks of flats through to separate houses in more rural environments.
With multiple device types and connectivity technologies in use, solution providers are likely to end up with multiple management solutions to capture data. This is why interoperability is important and an open toolset like Daizy provides a much better alternative than segmented vertical stacks. Regardless of which device or network technology is used, data delivered through Daizy is standardised and ready to be consumed.
George Malim: Why is standardised service delivery critical for delivering IoT at scale?
Nick Gyles: By adopting a more horizontal approach, Daizy standardises sensing metrics such as temperature, electricity metering, positional information, air quality and many others. We have a standardised schema of how that data gets presented – we decode the manufacturer-specific data payloads and ensure the data is easily consumable via standard technologies that developers understand.
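To make the idea concrete, the decoding step Gyles describes can be sketched as a registry of per-manufacturer decoders that all emit the same schema. This is an illustrative sketch only, not Daizy’s actual implementation; the vendor names and byte layouts are invented:

```python
# Hypothetical sketch of payload normalisation: each (invented) vendor encodes
# temperature differently on the wire, but every decoder emits the same schema.

def decode_vendor_a(payload: bytes) -> dict:
    # Vendor A (hypothetical): signed 16-bit big-endian value in 0.1 °C units
    raw = int.from_bytes(payload[0:2], "big", signed=True)
    return {"temperature_c": raw / 10.0}

def decode_vendor_b(payload: bytes) -> dict:
    # Vendor B (hypothetical): unsigned 8-bit value with a -40 °C offset
    return {"temperature_c": payload[0] - 40.0}

DECODERS = {"vendor_a": decode_vendor_a, "vendor_b": decode_vendor_b}

def normalise(device_model: str, payload: bytes) -> dict:
    """Return data in one standard schema regardless of the sending device."""
    return DECODERS[device_model](payload)

print(normalise("vendor_a", bytes([0x00, 0xE6])))  # {'temperature_c': 23.0}
print(normalise("vendor_b", bytes([63])))          # {'temperature_c': 23.0}
```

The point of the pattern is that adding a new device type means adding one decoder entry, not rewriting the application that consumes the data.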
We’ve engaged with many organisations who have approached IoT with a single use case in mind. However, end-customers will ultimately be looking at multiple use cases across their estate, and the last thing they need is multiple platforms for data, devices, and connectivity. With a standardised approach to deploying and managing IoT projects, issues such as in-life maintenance requirements, monitoring, and the ability to add other capabilities across the IoT service stack can be streamlined and aligned with existing technology asset management.
IT departments are really good at managing servers and desktops but they don’t have the resources or tools to manage a broad range of low-cost IoT devices at massive volume. The advantage of a toolset like Daizy is that you can monitor energy usage, water levels, humidity or any other metric in the same centralised operating environment.
George Malim: In what ways will being prepared for AI and digital twins help to drive growth?
Nick Gyles: We place a lot of emphasis on contextualisation. You need to be confident that the device is where you think it is and the data provided is valid. Within Daizy, there are checks on location, checks that installation has been performed correctly, and checks on the device identity and various attributes. In smart bin deployments, for example, the depth of the bin is vital information because the system can’t know the fill level without knowing the bin’s dimensions and the position of the sensor at the point of installation.
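The smart bin example illustrates why installation-time context matters: a typical ultrasonic sensor at the rim reports only the distance to the waste surface, which is meaningless without the bin’s depth. A minimal sketch of that calculation, with the function name and parameters invented for illustration:

```python
# Hypothetical sketch: converting a rim-mounted distance sensor's reading into
# a fill percentage. Without the bin depth captured at installation, the raw
# reading cannot be interpreted.

def fill_level_percent(distance_cm: float, bin_depth_cm: float,
                       sensor_offset_cm: float = 0.0) -> float:
    """distance_cm:      sensor-to-surface distance reported by the device
    bin_depth_cm:     internal depth recorded at the point of installation
    sensor_offset_cm: gap between the sensor face and the bin rim"""
    empty_space = distance_cm - sensor_offset_cm
    level = 1.0 - (empty_space / bin_depth_cm)
    return round(max(0.0, min(1.0, level)) * 100, 1)

# The same 30 cm reading means very different things in different bins:
print(fill_level_percent(30, bin_depth_cm=120))  # 75.0 (% full)
print(fill_level_percent(30, bin_depth_cm=40))   # 25.0 (% full)
```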
This rich context, combined with the interoperable data pipeline we support, creates normalised data structures that make it straightforward to enable a digital twin. High-quality structured data ensures a really high-fidelity sensor model which is resilient to project changes such as faulty device swap-outs.
Our goal is to do all the non-differentiated activity so our partners have access to a foundational data environment that eliminates the huge inefficiencies in today’s IoT operations. Every piece of equipment in Daizy has an auditable asset history from original installation through to end-of-life. That completeness is essential to effective IoT project management and handling the immense scale of deployments.
Interview by George Malim, the managing editor at IoT Now