Why an open-standard infrastructure is crucial to meeting data demands at the edge

As the Internet of Things continues to take hold, a plethora of connected devices and sensors – along with smartphones and even smart appliances – is generating a deluge of data, driving demand for more compute power at the edge to process it all, often in real time.

Retailers are trying to improve the customer experience through artificial intelligence (AI) applications, augmented and virtual reality, kiosks, and more. Manufacturers are implementing Industry 4.0 applications such as AI and computer vision-based controls to improve quality and worker safety, along with AI-driven robotics to increase productivity. IoT technology is enabling smart cities, with sensors and GPS data driving traffic light timing to ease flow and support intelligent parking solutions.

All of these use cases, and many more, require significant, localized compute power, often in areas that are not exactly well suited to housing traditional IT equipment. Smart city applications, for example, usually require compute power housed in an enclosure that lives outdoors, attached to or inside a streetlight pole.

So what’s the best approach to dealing with all this data? One answer is to follow the lead of the hyperscale data center providers and build infrastructure on open hardware and software platforms.

Today, infrastructure at the world’s largest data centers is typically based on x86 computers and software with open APIs. There’s also a heavy dose of container technology: applications packaged with their dependencies so they run in any compatible environment (and can be moved to other underlying resources as necessary).
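To make the portability point concrete, here’s a minimal sketch using the Docker SDK for Python: the same packaged image can be pulled and run, unchanged, on a hyperscale server or an edge node, as long as a compatible container runtime is present. The image name and command are illustrative placeholders rather than references to any specific workload.

```python
# Minimal sketch: run the same containerized workload on any host with a
# compatible container runtime (a data center server or an edge node alike).
# Requires the Docker SDK for Python: pip install docker
import docker


def run_portable_workload(image: str = "python:3.11-slim") -> str:
    """Pull a packaged image and run a short command inside it."""
    client = docker.from_env()          # connect to the local container runtime
    client.images.pull(image)           # fetch the packaged application + dependencies
    output = client.containers.run(     # the image behaves identically wherever it runs
        image,
        command=["python", "-c", "print('hello from the edge')"],
        remove=True,                    # clean up the container after it exits
    )
    return output.decode()


if __name__ == "__main__":
    print(run_portable_workload())
```

Because the image carries its own dependencies, the same workflow applies whether the runtime sits in a hyperscale facility or in a streetlight-pole enclosure.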

Open platforms drive innovation

The beauty of using open APIs is that application developers can innovate on top of the hardware platform. The 5G networks that carriers are building out (and that will be crucial in supporting edge applications) offer a good example, says Michael McNerney, Vice President of Marketing and Network Security at server-maker Supermicro.

Cellular service providers have traditionally used proprietary hardware and software to support previous generations of their networks. That left them wholly dependent on the vendor that delivered the solution for any innovation.

But now we’re seeing developments like the Open Radio Access Network (O-RAN), which calls for the standardization and interoperability of RAN elements, enabling components from different vendors to work together on open hardware platforms, including x86-based systems. That drives innovation, McNerney says, while also driving costs down.

“x86-based servers have gotten faster and faster,” he says. “Now we can do some of those things [like O-RAN] with standard x86 hardware, versus expensive dedicated silicon, and open software.”

The same concept can be applied to the hardware and software required for virtually any edge application. On the hardware side, vendors, including Supermicro, take a building block approach to servers.

“If you’re requiring AI inference at the edge and want a short-depth chassis that can withstand extreme temperature variations, we can ship you a system with up to three GPU cards that can locally process images and telemetry, all built on a common x86 open architecture,” McNerney says. “On top of that hardware design, you’re free to choose whatever software you like, including containers.”
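As a rough sketch of what “locally process images” can look like in software (the framework, model, and file path below are illustrative assumptions, not details of Supermicro’s design), an edge inference service might do something like this on one of those GPU cards:

```python
# Illustrative sketch of local image inference on an edge GPU.
# Assumes PyTorch and torchvision are installed; the model choice (ResNet-18)
# and the image path are placeholders, not part of any vendor's design.
import torch
from torchvision import models
from PIL import Image

# Use the local GPU if one of the system's accelerator cards is available.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a pretrained classifier once at startup; edge services typically keep
# the model resident in GPU memory to minimize per-image latency.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).to(device).eval()
preprocess = weights.transforms()  # the resize/normalize pipeline the model expects


def classify(image_path: str) -> str:
    """Run one image through the model locally and return the top label."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0).to(device)
    with torch.no_grad():
        logits = model(batch)
    top_class = logits.argmax(dim=1).item()
    return weights.meta["categories"][top_class]


if __name__ == "__main__":
    print(classify("camera_frame.jpg"))  # placeholder path for a local camera frame
```

Keeping the model loaded on the local GPU is what lets each frame be classified on-site, without a round trip to a distant data center.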

That kind of innovation at the hardware level is helping drive the software applications we see today, and it will be crucial in handling edge requirements.

“The GPU compute we can throw at applications, the storage capacity and performance, reduced latency—all those underlying technical infrastructure improvements allow a new set of applications,” McNerney says. “We can continue to innovate at the infrastructure level, and the apps can take advantage of it.”

Visit us at supermicro.com/Cloud to learn more.
