The aptly named mimik Technology may just have come up with the final piece of the edge computing jigsaw: how to make it work right out there in the dark and dirty world of billions of IoT sensors and controllers, so that every business can make both good sense and very effective use of the tsunami of data coming its way.
One of the hot topics of IT infrastructure development right now is edge computing, and as the idea has started to gain real traction it has brought with it an interesting and important question that goes something like: ‘Yeah, yeah, it’s a great idea, but... um... how do you actually do it?’
The whole idea is to virtualise and distribute the data centre so that compute is located right with the sources of data, rather than trying to cope with the cost and latency of shifting petabytes (and, soon enough, zettabytes) of data from those sources out in the field back to a central repository.
Until recently, it has been far easier to envisage the software and applications infrastructure that would be needed than to imagine where the hardware horsepower would come from. But to talk of ‘virtualised data centres’ still suggests the wrong idea: that a local rack or two shifted out of the data centre might do the trick. I have heard several suggestions that the edge would mean businesses using cloud services effectively clustering their facilities around one or two remote cloud services providers to ensure the lowest possible latency.
But the signs of a real alternative solution have now emerged as mimik Technology comes out of skunkworks stealth mode. This Canadian start-up aims to bring the computational element of the workflow as close as possible, both physically and logically, to where the application demand and the sources of data are located. The ultimate implementation of this is to make it possible for the devices producing the raw data also to run the applications needed to process that data, communicate it to other applications and, quite possibly, manage and change the actions of the process the device is sensing or part of.
This, according to mimik’s founder and CEO Fay Arjomandi, is the first element of a three-part design model. The second element is making it possible for that device to find other devices and communicate with them. In this way, the collaborative environment required by the edge computing model can be built.
The third element is that, as that collaborative environment grows, the need for it to communicate up and down the ‘chain of command’ inherent in building an operational whole right out to the edge becomes an imperative. This means being able to operate the same applications and orchestrations from the individual edge devices right up into the cloud and back again.
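To make that three-part model a little more concrete, here is a minimal sketch in Python of how an edge node might behave: it runs the application on the device that produced the data, discovers its peers, and pushes results up towards the cloud. All of the names used here (EdgeNode, Registry, push_upstream) are hypothetical illustrations, not mimik's actual SDK.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


class Registry:
    """In-memory stand-in for ad hoc peer discovery."""

    def __init__(self) -> None:
        self.nodes: Dict[str, "EdgeNode"] = {}

    def register(self, node: "EdgeNode") -> None:
        self.nodes[node.node_id] = node

    def discover(self, exclude: str) -> List["EdgeNode"]:
        return [n for nid, n in self.nodes.items() if nid != exclude]


@dataclass
class EdgeNode:
    node_id: str
    registry: Registry
    push_upstream: Callable[[dict], None]  # next hop towards the cloud

    def __post_init__(self) -> None:
        # The device itself joins the compute fabric.
        self.registry.register(self)

    def process(self, raw_reading: float) -> dict:
        # Element 1: run the application on the device that produced the data.
        result = {"node": self.node_id, "value": raw_reading, "ok": raw_reading < 75.0}
        # Element 2: find peer devices and share the result with them.
        for peer in self.registry.discover(exclude=self.node_id):
            peer.receive(result)
        # Element 3: send the same result up the chain of command to the cloud.
        self.push_upstream(result)
        return result

    def receive(self, message: dict) -> None:
        print(f"{self.node_id} received from a peer: {message}")


if __name__ == "__main__":
    registry = Registry()
    cloud_inbox: list = []  # stand-in for a cloud endpoint
    a = EdgeNode("sensor-a", registry, cloud_inbox.append)
    b = EdgeNode("sensor-b", registry, cloud_inbox.append)
    a.process(68.2)
    print("cloud received:", cloud_inbox)
```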
Now the cloud is only half-way there
For now, the end-point of an application is the nearest remote cloud service (which in practice could be physically anywhere). So why not mimic the fabric of the cloud so that it can be logically continued right out to the physical end-points of the network: the devices themselves? If the environment then allows them to communicate and collaborate, they can combine to provide the resources that applications need from the micro-services at their disposal. Arjomandi explains:
“Devices need to provide information of their own resource to each other. And then the resources need to be able to reach out to each other as in the cloud. That’s what we call forming the fabric of cloud, which is about cluster formation in an ad hoc fashion. And in order to do that formation, we do it based on three scopes: Network, Proximity, and Account.”
‘Network’ means the obvious direct interconnection, be that devices in the same home attached to the same Wi-Fi, or in the same manufacturing facility attached to the same network. Here, there are a huge number of use cases where applications and devices need to communicate with each other.
‘Proximity’ refers to situations where a device needs to reach beyond its direct network to find the right resource, such as additional computation to process a specific workload, while ‘Account’ refers to devices or resources that belong to associated accounts for which access is authorised.
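As a rough illustration of those three scopes, the following Python sketch filters candidate peers by network, proximity, or account. The field names and the select_peers helper are assumptions made for the example; mimik's real discovery API is not shown here.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class NodeInfo:
    node_id: str
    network_id: str   # e.g. the Wi-Fi or plant network the device sits on
    zone: str         # coarse physical proximity, e.g. "building-3"
    account_id: str   # the account the device is registered under


def select_peers(me: NodeInfo, candidates: List[NodeInfo], scope: str) -> List[NodeInfo]:
    """Return the candidates that fall inside the requested scope."""
    keys = {
        "network": lambda n: n.network_id,
        "proximity": lambda n: n.zone,
        "account": lambda n: n.account_id,
    }
    if scope not in keys:
        raise ValueError(f"unknown scope: {scope}")
    key = keys[scope]
    return [n for n in candidates if n.node_id != me.node_id and key(n) == key(me)]


if __name__ == "__main__":
    me = NodeInfo("cam-1", "plant-net", "building-3", "acme")
    others = [
        NodeInfo("plc-7", "plant-net", "building-3", "acme"),
        NodeInfo("gw-2", "office-net", "building-1", "acme"),
    ]
    print([n.node_id for n in select_peers(me, others, "network")])  # ['plc-7']
    print([n.node_id for n in select_peers(me, others, "account")])  # ['plc-7', 'gw-2']
```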
This ability to mimic the cloud then leads to the next obvious stage: the ability to start orchestrating the collaborations possible right out at the edge, using what Arjomandi refers to as a ‘lite container’. This uses the same API semantics as Docker in order to ensure compatibility from within a cloud environment seamlessly out to the edge, allowing existing cloud applications and services to be readily extended as far out into the edge as required:
“If I were using an edge approach, which is going to be the next generation of applications, I still want to use Cloud appropriately. Now I have a lite container that I can instantiate as a serverless micro-service, either dynamically or as part of my application package. Your application can now send text-based messages through TCP and video-based through UDP using two separate sidecar components. On one side you can bring the workload to the device; on the other side, you can now decompose your application to a group of serverless micro-service components and sidecar components. And if you have to use Kubernetes you can now utilise those tools for your orchestration.”
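The split described in that quote, text over TCP and video over UDP, each handled by its own sidecar-style component, can be sketched in a few lines of standard-library Python. The hosts, ports and function names below are assumptions for illustration, not part of mimik's platform, and the example assumes local listeners are running on those ports.

```python
import socket


def send_text_tcp(host: str, port: int, message: str) -> None:
    """Text sidecar: reliable, ordered delivery over TCP."""
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(message.encode("utf-8"))


def send_frame_udp(host: str, port: int, frame: bytes) -> None:
    """Video sidecar: low-latency, best-effort delivery over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(frame, (host, port))


if __name__ == "__main__":
    # Both calls assume listeners are already running on these local ports.
    send_text_tcp("127.0.0.1", 9000, '{"event": "door_open"}')
    send_frame_udp("127.0.0.1", 9001, b"\x00" * 1024)  # placeholder frame bytes
```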
Getting to the end of end-to-end
In essence, this capability extends the reach of a fully orchestrated operations infrastructure right down into the heart of the edge and makes it an integral part of the whole environment. These serverless micro-services provide the additional granularity applications require, so they can reach into edge services such as IoT environments, right down to the individual sensor level, regardless of the operating system in use there.
It is perhaps no surprise, therefore, that Amazon’s AWS operation has noted the possibilities and formed a partnership with mimik, with one certain aim being to distribute AWS services and resources as far down to the edge as it is possible to go.
Research which Arjomandi attributes to Red Hat suggests why this is important. It says that so far only 20% of industry across the board has gone through any real digital transformation and moved to using cloud services. Yet the cloud market is expected to top $220 billion this year, so there is logically huge potential in the 80% that has yet to move. These are the businesses where the ability to reach, seamlessly, right down into the smallest levels of IoT granularity will make practical sense of the whole digital transformation storyline.
This makes it sound like a copy and paste of cloud services out to the edge, but Arjomandi is adamant that it is not that simple; that would still really just be regional data centre infrastructure:
“That cannot be done, it’s a different door. You have to look at the functionality that you have and say, ‘How can I image it to the edge?’ It’s about mimicking the cloud to the edge, but not copy/pasting it because you can’t. We give you the environment that runs your serverless micro-service on any device. And we give you the environment that can decompose your client application to micro-services, which means that now you reduce your application development time because now you can run that application, that same workload, as micro-services across every operating system.”
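A toy sketch of that decomposition idea follows: instead of shipping one monolithic client application, individual functions are registered as named micro-services and instantiated only when invoked. The registry decorator and the invoke call are hypothetical, chosen purely to illustrate the pattern rather than to reproduce mimik's environment.

```python
from typing import Callable, Dict

_services: Dict[str, Callable[[dict], dict]] = {}


def microservice(name: str):
    """Register a function as a named, independently invokable micro-service."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        _services[name] = fn
        return fn
    return register


@microservice("resize-image")
def resize_image(payload: dict) -> dict:
    return {"status": "resized", "width": payload["width"] // 2}


@microservice("score-reading")
def score_reading(payload: dict) -> dict:
    return {"status": "scored", "alert": payload["value"] > 90}


def invoke(name: str, payload: dict) -> dict:
    """Run only the service the caller asked for, not the whole application."""
    return _services[name](payload)


if __name__ == "__main__":
    print(invoke("score-reading", {"value": 97}))
```

Because each handler is just an ordinary function with no dependency on a particular operating system, the same pattern could, in principle, run on anything from a laptop to a single-board device, which is where the next point comes in.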
This opens up a new avenue for business managers who will, at this point, be thinking, ‘Here comes a whole new investment budget to plan for…’. The ability to run on any operating system, coupled with the fact that these micro-services only load the specific code called for a specific task rather than the whole application, means a whole range of old devices can become servers in the environment. And let’s face it, there is going to be a market for millions of them, if not billions.
So otherwise obsolete laptops and mobile phones can find a new, extended service life alongside Raspberry Pi-based devices and similar new serverlettes. It also allows whole new levels of data collaboration to be built in what Arjomandi calls “a true hyper-connected world”, such as data taken from an athlete at an event being fed into a sports game to allow a player to compete directly with the athlete, in real time.
It’s your data, you make money from it
One fascinating side issue of this hyper-connected world that mimik makes possible is a fundamental change in the economics of personal data. The common mantra about social media’s use of personal data is that users are the product being sold. That being the case, how might people feel if they were the ones making some money from the transactions? Arjomandi suggests:

“When a company gets your data, they decide, based on their overall data, how to improve their product, how to funnel to other markets, and how to expand their business. As the data producer, end users need to participate in that income. The only way to do it is when our data is in our control, managed by our device. And now I have a reverse API model where I provide the API and, instead of my data being pulled, I decide who to push the content to. I can broker revenue with the content or digital solution provider. Are we all going to be jobless because technology will take over? Well, let’s talk about it. Let’s make our data part of our basic income. It could even be part of the inheritance I leave after my death.”
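To make that ‘reverse API’ idea a little more tangible, here is a bare-bones Python sketch in which the data stays on the owner’s device and the owner decides which consumers receive which fields, pushing to them rather than letting them pull. The endpoint URL and the consent structure are invented for illustration only.

```python
import json
import urllib.request
from typing import Dict, List

# Data that never leaves the device unless the owner pushes it.
my_data = {"heart_rate": 62, "steps": 10412, "location": "hidden"}

# Owner-managed consent: which consumer may receive which fields.
consents: Dict[str, List[str]] = {
    "https://fitness-partner.example/ingest": ["heart_rate", "steps"],
}


def push_to_consumers() -> None:
    """Push only the consented fields to each approved consumer endpoint."""
    for endpoint, allowed_fields in consents.items():
        payload = {k: v for k, v in my_data.items() if k in allowed_fields}
        req = urllib.request.Request(
            endpoint,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=5) as resp:
            print(endpoint, resp.status)


if __name__ == "__main__":
    # The owner initiates the transfer, not the consumer; a real endpoint is
    # needed for this call to succeed.
    push_to_consumers()
```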
My take
Edge computing has been the next coming thing for a while now, and it has remained so for nearly as long because it has lacked the key components: a hardware implementation capable of being an integral part of the very edge, out there at the coal face, and a software implementation capable of providing the bridge between the physicality of what happens at the edge and the logic running the business process. I am of a mind to predict that this, or something very like it, will be that solution. It will be fascinating to watch what comes along to compete with mimik and how user businesses take it up and start using it. The next year could hear a lot of pennies dropping amongst the user community.