Scaling the edge

Connected Technology Solutions spoke to Julian Chesterfield, founder and CEO of Sunlight, about the advantages that edge computing offers industry, and how the company's offering has been developed with that challenge in mind.

When you talk about edge, what do you mean by the edge?

I think there are many different interpretations of edge, particularly when you talk about the telco edge, and many data centre providers claim to have edge capability because they have geographically positioned data centres. When we talk about the edge, it is the very far edge, the last hop closest to the end customer. For example, supporting an industrial IoT environment: the edge for us would be sitting on the factory floor, helping to provide advanced capability close to where the production in the factory is happening.

We are working with a customer opportunity now that is exactly that. They have several different production lines across the factory, and they need some infrastructure at each of those production lines. Another example would be the quick service retail environment, where restaurants or shops need infrastructure running on premises in the location.

Another one we would highlight, where we see a lot more focus now, is the military and defence space. That is a very ruggedised, disconnected mode, where the infrastructure is not communicating with a centralised cloud. It must be able to operate autonomously, and it must be mobile. It might be sitting in a backpack or on a drone.

What is the difference between edge and intelligent edge?

Some companies have been doing edge for many years. In retail you could argue that they have done edge for ages; they just do not call it edge. What is changing now is that they are trying to deploy more intelligent services, including things like AI-powered applications, or predictive analytics on an oil rig. These services are becoming much smarter, which means the type of infrastructure required is no longer little terminals: significant processing must happen on site.

Do you feel that the need for edge is being driven by the advances of machine learning and artificial intelligence analytics?

It has had a huge impact on having to deploy at the edge, in that it is often the underlying technology for a lot of these smart services that consumers and businesses now expect.

Initially everyone assumed that there were masses of bandwidth. 5G was here, we could transmit whatever we wanted, and sticking it in the cloud made sense. I think when you start looking at the quality of video that must be generated to drive a lot of these machine learning algorithms, the balance has shifted the other way. There are the latency aspects of having to transmit data over a long distance, but then also having to process and analyse it, and then getting the responses back to the edge where it matters. Even in the sensor area with factory automation where there are so many different types of sensors that are required to make sure that a factory can be fully automated and function at its peak output, the volume of data that is being generated is getting larger and larger.

What is driving that change is the requirement to be able to process data where it is generated. It is not necessarily all stored there; a lot of processing happens at the edge because you must be able to respond quickly. But a lot of that data then does get passed back into a more central cloud, where you feed it into a machine learning model and make that model more effective and more accurate. The actual logic processing needs to happen where the large volume of data is generated.
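The pattern described here can be sketched in a few lines: act on raw readings locally, and ship only a compact summary upstream for model retraining. This is a minimal illustrative sketch; the threshold and field names are assumptions, not part of any real deployment.

```python
# Hypothetical sketch: respond locally at the edge, send only a small
# summary back to the central cloud for model improvement.
from statistics import mean

THRESHOLD = 80.0  # assumed alert threshold for a sensor reading

def process_batch(readings):
    """Act on raw readings at the edge; return alerts and a compact summary."""
    alerts = [r for r in readings if r > THRESHOLD]   # immediate local response
    summary = {                                       # small upstream payload
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alerts": len(alerts),
    }
    return alerts, summary

alerts, summary = process_batch([72.1, 85.3, 79.9, 91.0])
```

The raw readings never leave the site; only the summary dictionary would be uplinked, which is what keeps the bandwidth cost manageable.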

Are there any sectors that are pushing the boundaries of intelligent edge?

There is a lot of post-COVID impact across retail that is important. Quick service retail did well during COVID in lots of cases, because people were still ordering food. What they have had to do is change, and put in smart kiosks that recognise people when they go through a drive-through. These things have had to happen at a very accelerated pace. The rest of the retail world has not fared so well over COVID.

Talk to me about the challenges, what do people normally come up against?

Energy is another space where you have intermittent, slow, or unstable connections, which, if you are trying to run your application remotely in the cloud, is going to cause business interruption. Lots of data is being generated in areas such as oil rigs. Here you have 20,000-30,000 sensors generating one to ten terabytes of data a day, and a satellite uplink that can handle 10 megabits per second. Even at the low end of that range, it is going to take you nine or ten days to upload one day of data, which is not feasible. It is a necessity to process the data where it is generated.
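The uplink arithmetic quoted here checks out, and is worth making explicit; a quick calculation for the low end of the range (one terabyte per day over a 10 Mbps link):

```python
# Worked version of the oil-rig uplink figures: 1 TB/day over 10 Mbps.
TERABYTE_BITS = 8e12          # 1 terabyte expressed in bits
UPLINK_BPS = 10e6             # 10 megabits per second
SECONDS_PER_DAY = 86_400

upload_seconds = TERABYTE_BITS / UPLINK_BPS
upload_days = upload_seconds / SECONDS_PER_DAY
print(round(upload_days, 1))  # just over nine days to move one day of data
```

At ten terabytes a day, the figure is ten times worse, which is why processing at the point of generation is the only workable option.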

Latency is another problem. If you are in a factory trying to spot defective goods going down a production line, you cannot round trip to the cloud to make sure that defective potato chips are picked off the conveyor belt. Privacy and compliance are another, especially for places such as hospital wards: being able to keep the data where it is generated, process it there, and then summarise it, anonymise it and pass it to your core facility helps in that regard.

How does edge help in harsh environments?

It is not a traditional data centre environment. We are not talking about standard off-the-shelf server systems that you would find from HPE or Dell. Clearly, the infrastructure being deployed at the edge quite often sits in an extreme environment and must be ruggedised. The requirements are quite similar across all these different verticals that we have seen.

Typically, they are air-cooled systems that need to run on lower-power embedded processors. We see a lot of embedded systems based on Intel Atom and Intel Xeon D processors and, increasingly, ARM processors. This scenario has started to drive more ARM adoption outside of the traditional mobile phone marketplace.

One of the other things to highlight is that the economics of the edge are quite different as well. With a conventional centralised data centre application, enterprises will have a significant budget to invest in a core set of infrastructure. With the edge, when you start pushing out logic to potentially hundreds or thousands of locations, the economics become important: if you are putting infrastructure in each of those locations, you need to operate within a relatively low budget, across a very distributed footprint. It is driving a new class of more affordable hardware based on embedded lower-power components, with a lot of wireless interfaces on these devices. Being able to communicate over wireless LAN or over a cellular link is common.

One of the models that we see a lot of interest in is a switchless model, where you can connect devices back-to-back so they can communicate without the need for external infrastructure. This simplifies the deployment and provides as much redundancy as necessary.

What do you and your company bring to the game, and where do you make a difference on the edge?

Sunlight technology was initially developed as part of a collaboration with ARM in Cambridge, which is where we are based. We were working on developing a next generation server architecture, which was initially a research project. The challenge we had was that we were trying to develop a system efficient enough to run on mobile device processors, the sort of processor you would find in a Samsung Galaxy phone, for example, but with a scale-out architecture.

The difference for us was that we started from the edge world and came into the data centre world as our technology developed. We came from the position of developing our platform with a clear discipline in mind: we needed to run in these very resource-constrained environments, with a limited number of cores, limited processing capacity and a very limited memory footprint, and drive all of them super efficiently. Our core technology is an extremely lightweight, high-performance hyperconverged infrastructure (HCI) solution for the edge. This means that we can build out a very robust, distributed platform. We can start from a single node and scale up, depending on how much redundancy you need at any of these edge locations, and we can run on extremely small-scale devices.

We can run all the way down to an ARM embedded mobile processor, something you would find on an Nvidia Jetson board, for example. We can scale that up into larger data centre infrastructure as well. Our focus is very much on being able to, from a centralised point, push out all the automation and application deployment logic that allows you to deploy services at the edge.

Which brings us back to this core ethos: edge infrastructure must be very robust, must be very rugged, and must be fully manageable in a distributed mode, but it also needs to be able to run autonomously.

Our software is designed to connect to infrastructure periodically, when connectivity is available, and to deploy and configure applications and services to run in any of those remote locations.

We have mentioned software defined quite a few times. Why is that important?

Software defined infrastructure is very much where the IT market has gone: everything needs to be automatable, and needs to be operated through APIs. What we are doing is providing a layer on top of the physical hardware to abstract the compute, networking and storage, and allow you to control all of that through APIs. It gives you enormous ability to automate all those things and makes it easy to deploy applications and manage them in the field.
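The essence of the API-driven approach described here is declarative: you state the infrastructure you want as data, and a control layer works out what must change. The sketch below illustrates the pattern only; the resource names and the apply() helper are invented for illustration and are not Sunlight's API.

```python
# Hypothetical sketch of declarative, API-driven infrastructure:
# desired state as data, reconciled against what is currently running.
desired = {
    "vm": {"name": "till-app", "cpus": 2, "memory_mb": 2048},
    "network": {"name": "store-lan", "vlan": 42},
    "storage": {"name": "till-data", "size_gb": 20, "replicas": 2},
}

def apply(current, desired):
    """Return only the resources that must change to reach the desired state."""
    changes = {}
    for kind, spec in desired.items():
        if current.get(kind) != spec:   # diff desired against what exists
            changes[kind] = spec
    return changes

# A site where the network already matches only needs the other two applied.
current = {"network": {"name": "store-lan", "vlan": 42}}
changes = apply(current, desired)
```

Because the same desired-state document can be applied to every site, this is also what makes pushing one configuration out to hundreds of locations tractable.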

The thing that really differentiates it, and the reason you would care about software defined infrastructure at the edge, is being able to leverage that abstraction. It adds a layer of independence between the choice of hardware platform you are going to use and the software and services you are deploying on top of it.

Software defined infrastructure really allows you to leverage best of breed. You can take commodity storage, commodity network interfaces and commodity compute, and use your software defined infrastructure layer to build out all the redundancy and fault tolerance. It is an important strategic decision, in that it allows you to define your services and applications so that they run independently of the actual infrastructure layer they are deployed on.

The biggest challenge I hear from manufacturing management is scaling. How do you address that?

Scaling is the biggest problem in IT. If you have an edge deployment of two or three sites, it is not that complicated to do. But if you are trying to deploy across 10,000 restaurants, that scaling problem is enormous. That is why it is so important to drive standard infrastructure, but also to have something like Sunlight that gives you that abstraction from the hardware layer, so that you can scale out efficiently across hundreds of sites.
