Open Compute taps IOWN to help design distributed datacenters and a 'computing continuum'
Because AI won’t only run in Big Tech’s giant GPU garages, and won't tolerate slow connections
by Simon Sharwood · The Register

The Open Compute Project (OCP) wants to develop specs for distributed datacenters and has decided the all-optical Innovative Optical and Wireless Network (IOWN) stack can make them possible.
OCP calls its new effort the “AI Computing Continuum” and hopes it will deliver “a seamless computational infrastructure from centralized to edge deployments.”
It’s widely expected that AI workloads will increasingly move to the network edge, where lightweight models can perform inference close to users so latency won’t degrade the user experience.
But edge AI won’t stand alone, so as OCP points out in its announcement, centralized datacenters will need to connect with geographically distributed resources in regional data centers, colocation facilities, telcos, private datacenters, factories, and even individual offices.
Communication between those facilities will need to be fast, again to keep latency down. The distributed deployments OCP envisages also have the potential to generate huge amounts of data and network traffic.
Which is where IOWN comes in because, as The Register explained last year, the tech offers all-optical networks that can improve network transmission capacity by 125x and reduce network latency to 0.5 percent of current levels.
The main agenda item for the two organizations is creating “a roadmap for a multi-site, high-bandwidth, low-latency compute and network infrastructure.” The IOWN Global Forum gets the job of designing a communications architecture that uses its photonics-based optical communication technologies and is ready to meet the needs of users in major industries. OCP will design specs for open hardware that can connect to the kit IOWN devises, and will draw on its roots as a designer of hyperscale infrastructure to inform its efforts.
This alliance is a big win for IOWN, which has lofty goals and has won support from most major industry players, but is yet to reach a state in which it is easily deployable by hyperscalers, never mind mainstream users. OCP tech is very widely adopted, so the organization could increase interest in and adoption of IOWN.
The mere fact that OCP wants to work on distributed AI is also notable given the huge sums of money tech giants are spending on datacenters and equipment to fill them, as it suggests the open hardware design outfit is trying to cater to emerging priorities.
Neither org has said when it expects to deliver its part of this puzzle. ®