Friday, May 11, 2018
Computer networking giant Cisco has developed an ‘internet of things’ platform called Kinetic, to make it easier for companies to manage complex networks of sensors and the data they generate.
The service is most beneficial for companies which have thousands of different devices, and want a way to reduce the headache of setting the devices up, managing the policy of what data they want and where they want it, running appropriate software on the devices, and then actually moving the data.
The service is provided as a mixture of hardware (Cisco routers) and software running on cloud services.
You can immediately set the whole system to a default setting, where your sensors and other devices provide standard data to cloud hosted software, ready to be fed into your various software applications. You can then use this default setting as a starting point for configuring the system to do exactly what you want.
For the oil and gas industry, Cisco provides a starter kit, as a ‘default blueprint’ to get you going. It includes a default way to extract data, and cloud hosted applications which can receive the data on a daily, weekly or monthly basis.
The oil and gas industry uses many different ‘internet of things’ devices, including acoustic sensors, temperature sensors, fibre optics, gas leakage monitoring and equipment health monitoring.
Apart from oil and gas, the system is used in city management (including lighting and parking), manufacturing (monitoring machines, energy, inventory and deliveries), transport (including traffic lights and the ‘connected car’), and retail.
The service is offered following Cisco’s acquisition of Jasper, a company which makes a cloud based ‘internet of things’ service platform, in March 2016. The Jasper platform is used by over 16,000 companies.
Working with well data
One oil and gas industry customer uses the system to gather and work with well data.
It has thousands of wells worldwide, all fitted with different sensors, generating data in different formats, and with different ‘application programming interfaces’ (APIs). This meant that gathering and managing the data involved a great deal of manual work. It was very difficult to build a centralised system for understanding the whole ‘fleet’ of wells, and do any data analytics.
The company wanted to get visibility in near real time of what was happening with the wells, and wanted it all automated, so it would not require the data to be formatted by an engineering team before it could be used.
The oil company had four different business units who wanted to work with the well data and do different kinds of analysis on it.
They wanted statistics for individual wells, and aggregated information about multiple wells, and the ability to do big data analytics.
Cisco installed a new router on each of the customer’s sites, which connected upstream to the various sensors on the rigs, and downstream to the cloud hosted software. From there the data could be distributed to the software systems of the business units who wanted to work with it.
Setting up the system
When setting up your IoT system, you need to determine what data you want to receive from all of your sensors, and where this data should go.
This is called creating data ‘policies’.
For example, if you have 32 sensors on a well, you might want sensors 1-10 to send data to SAP hourly, and another sensor to send data only once a month.
A company might want its business partners to see some of the data but control exactly what they can see.
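As a rough sketch of the idea (the names and structure here are illustrative, not Cisco’s actual API), a set of policies like the ones above could be modelled as simple routing rules that map sensors to a destination and a reporting interval:

```python
from dataclasses import dataclass

@dataclass
class DataPolicy:
    sensors: range     # which sensor IDs this rule covers
    destination: str   # where matching data should be delivered
    interval: str      # how often readings are forwarded

# Hypothetical policies for the 32-sensor well described above:
# sensors 1-10 report to SAP hourly, sensor 11 reports monthly.
policies = [
    DataPolicy(sensors=range(1, 11), destination="SAP", interval="hourly"),
    DataPolicy(sensors=range(11, 12), destination="archive", interval="monthly"),
]

def route(sensor_id):
    """Return (destination, interval) for a sensor, or None if no policy matches."""
    for p in policies:
        if sensor_id in p.sensors:
            return (p.destination, p.interval)
    return None
```

Sensors with no matching policy simply generate no upstream traffic, which is the point of policy-driven data management: only the data someone has asked for gets moved.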
It is much easier to set up a system if your sensors are more standardised, generating data using the same data model, says Theresa Bui, Director of IoT Strategy, Cisco.
Various options are available for connectivity, including cellular, a new satellite communications option, and Narrow Band IoT (NB-IoT), which enables communications between devices over cellular communications bands.
Cisco has appointed a number of companies as ‘device partners’, whose devices are proven to work with Kinetic.
The Cisco Kinetic software has three core modules – gateway management, edge processing, and data control.
The ‘gateway management module’ is for adding new devices or ‘gateways’ to the system, making sure the network can extract the data it needs from the device in some kind of standard format, enabling fast set-up of a sensor (in minutes rather than days, the company says), and enabling remote management of the device.
The ‘Edge & Fog Processing Module (EFM)’ is for managing processing on the ‘edge’ or in the ‘fog’ – terms which basically mean doing processing near to the device itself.
This means that data only needs to be communicated to the cloud when there is something to actually say. An offshore drilling platform can generate 16 terabytes of data a month, Ms Bui says, and companies probably won’t want all of that data going to the cloud.
It also means that the system can be programmed to do something immediately when something happens (for example, shut a piece of equipment down if a shaft is rotating too fast), rather than send the data to Houston and wait for an instruction back.
In the jargon, the system is ‘pushing policy back to the device’.
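An edge rule like the overspeed example might look something like the following sketch (the threshold and function names are invented for illustration; they are not part of the Kinetic product). The key pattern is that the device acts locally and only forwards the exceptional event to the cloud, rather than streaming every reading:

```python
MAX_RPM = 3000  # hypothetical safety threshold for the shaft

def on_reading(rpm, shutdown, send_to_cloud):
    """Edge rule: act locally on a dangerous reading, forward only the exception.

    shutdown and send_to_cloud stand in for whatever local control
    and upstream communication hooks the gateway provides.
    """
    if rpm > MAX_RPM:
        shutdown()  # act immediately at the edge, no round trip to Houston
        send_to_cloud({"event": "overspeed", "rpm": rpm})
        return "shutdown"
    return "ok"     # normal readings stay local
```

Run against a month of readings, a rule like this would send the cloud only the handful of overspeed events, rather than the 16 terabytes a platform can generate.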
A third ‘data control module’ does the work of aggregating data from all of the devices, and can be used to set your policies for where the data goes, and who gets to see the data.
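The “who gets to see the data” side of data control can be pictured as per-consumer visibility rules. This is a minimal sketch under assumed names (the field lists and consumer labels are hypothetical), showing how a business partner might be given a restricted view of the same readings a business unit sees in full:

```python
# Hypothetical visibility rules: which fields each consumer may see.
visibility = {
    "business_unit": {"well_id", "pressure", "temperature", "flow_rate"},
    "partner": {"well_id", "flow_rate"},  # partners see only a subset
}

def view_for(consumer, reading):
    """Return only the fields of a reading the consumer is allowed to see."""
    allowed = visibility.get(consumer, set())
    return {k: v for k, v in reading.items() if k in allowed}
```

A consumer with no entry in the rules sees nothing, so access is deny-by-default.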
The software modules can run on Cisco’s networking infrastructure.
To keep the system secure, all the components of the system need to be secure. This can include making sure manufacturers build secure devices, making sure any applications running on the device are secure, making sure the movement of data is secure, and making sure stored data, such as in a cloud server, can’t be tampered with. It also includes establishing who is responsible for what.
The most sophisticated companies are very aware of all of the vendors that work in their IoT ‘value chain,’ and set minimum security requirements, so people can be held accountable, Ms Bui says.