2023 has arrived, and once again it’s time to look ahead and predict the tech trends that will dominate the next 365 days. According to consultants and analysts, advances in storage will prevail in 2023, and we have already seen some of those trends take shape over the last few months. CXL is one of the emerging technologies that gained the most traction last year, and it is likely to get more real in the new year.
In 2023, CXL will let datacenters break free from the constraints of physical hardware and put their server architectures to fuller use. This will be particularly advantageous as new classes of cutting-edge AI and HPC workloads emerge.
For a new technology, CXL is already gathering a lot of steam, with Samsung, Micron, Intel and several other industry heavyweights pioneering early adoption through their respective offerings. But 2023 will see it become more usable and practical.
Understanding CXL
Before diving into all that action, let’s first wrap our heads around CXL. One of the greatest challenges of our time is data inflation. We are faced with an explosive amount of data, and the rich intelligence trapped inside it is too valuable to let go to waste; the insights that can be extracted from it are, for practical purposes, unquantifiable. But organizations have some strong headwinds to deal with first, the cost of memory scaling being a chief one.
If companies had bottomless budgets and could write a check for any amount every month, this would be a completely different conversation. But since neither budgets nor compute resources are infinite, they need a more actionable solution.
As datacenters get crushed under the weight of growing memory requirements, only a technology that provides limitless scaling can bring memory capacity and bandwidth up to speed with compute.
The Best of CXL
CXL is the David created to fight the Goliath of architectural rigidity in datacenters. Typically, proprietary interconnects are used to share resources inside servers, but those interconnects are limited in their compatibility: they only work with CPUs of the same make. To get around that, engineers designed a universal interconnect, and that interconnect is CXL.
CXL, or Compute Express Link, is predominantly a high-speed memory interconnect. It is cache-coherent and an open standard, meaning the host and attached devices can share memory while keeping a consistent view of it, and any vendor can build to the same specification.
Although memory expansion is its headline role, CXL’s versatility as an interface goes further. It connects the host processor to a multitude of devices, not just memory but compute as well, including hardware accelerators and I/O devices, and the list will eventually expand to cover any number of devices and peripherals.
Joined at the hip with PCIe, CXL offers high speed, high bandwidth, and ultra-low latency. Like PCI Express, a CXL link is point-to-point, though switching in the newer revisions already extends it toward one-to-many topologies.
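Because CXL rides on PCIe, a CXL device is discovered through ordinary PCIe enumeration, and on a Linux machine the kernel’s CXL subsystem then registers it on its own bus in sysfs. The snippet below is a minimal sketch of how software might check what the kernel sees; it assumes a CXL-enabled kernel and follows the upstream driver’s /sys/bus/cxl/devices layout, so treat the paths and naming as illustrative rather than definitive.

```c
/* Minimal sketch: list the CXL devices the Linux kernel has registered.
 * Assumes a kernel built with the CXL subsystem; the sysfs path below
 * follows the upstream driver's layout.
 */
#include <dirent.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *path = "/sys/bus/cxl/devices";
    DIR *dir = opendir(path);
    if (!dir) {
        perror("opendir(/sys/bus/cxl/devices)");  /* no CXL support, or no devices */
        return 1;
    }

    struct dirent *de;
    while ((de = readdir(dir)) != NULL) {
        if (de->d_name[0] == '.')
            continue;
        /* Memory expanders show up as mem0, mem1, ...; ports, root
         * objects and decoders appear alongside them. */
        if (strncmp(de->d_name, "mem", 3) == 0)
            printf("CXL memory device: %s\n", de->d_name);
        else
            printf("CXL object:        %s\n", de->d_name);
    }

    closedir(dir);
    return 0;
}
```

On a machine with no CXL hardware, or an older kernel, the directory simply will not exist and the program reports as much.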
Versions So Far
As of today, CXL is three iterations old. In its second iteration, CXL 2.0 added memory pooling, which allows memory attached to multiple devices to be brought together and carved up among hosts as needed. The real advantage of disaggregation via CXL is that stranded, unused memory can be located and put to work, minimizing resource wastage and optimizing cost.
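In practice, once CXL-attached memory is assigned to a host, Linux typically surfaces it as a CPU-less NUMA node, so applications can place data on it with ordinary NUMA APIs rather than anything CXL-specific. Below is a minimal sketch using libnuma; the node number is an assumption (a placeholder for whatever `numactl -H` reports on your system), not something the CXL spec dictates.

```c
/* Minimal sketch: bind an allocation to a CXL-backed NUMA node with libnuma.
 * Assumption: the CXL memory expander is exposed as CPU-less NUMA node 1;
 * verify with `numactl -H` before running. Build with: gcc cxl_bind.c -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "This system does not support the NUMA API\n");
        return 1;
    }

    int cxl_node = 1;                /* placeholder node id for the CXL expander */
    size_t len = 64UL << 20;         /* 64 MiB test buffer */

    /* Ask the kernel to satisfy this allocation from the chosen node. */
    void *buf = numa_alloc_onnode(len, cxl_node);
    if (!buf) {
        perror("numa_alloc_onnode");
        return 1;
    }

    memset(buf, 0, len);             /* touch the pages so they are actually faulted in */
    printf("Placed a 64 MiB buffer on NUMA node %d\n", cxl_node);

    numa_free(buf, len);
    return 0;
}
```

This is also the hook that memory-tiering software builds on: hot pages stay in direct-attached DRAM while colder pages are demoted to the CXL-backed node.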
CXL 3.0 ups the ante by enabling disaggregation of compute, storage, network and accelerators so that, like memory, they too can be liquified, pooled and allocated dynamically as required. It also adds peer-to-peer communication within the device hierarchy, which lets devices talk to each other directly instead of going through the host or over the network.
As new iterations are developed, CXL will advance disaggregation further, helping datacenter operators navigate the complexities of server architectures and allocate resources dynamically.
Companies That Have Signed on in 2022
Companies are already heralding CXL into the mainstream, and pretty much every hardware vendor at this point supports CXL in its newer products. In 2023, we will see more well-known names piloting CXL-based products and rolling them out to market. The enablement work will continue, and the ecosystem of CXL-supported applications will grow considerably this year.
Intel has been in the CXL game since the start and will continue to play its part in 2023. It recently contributed new CXL code for Linux 6.2, an update that introduces a number of features and enhancements focused on CXL 2.0.
Samsung is going all in too. After a CXL-based “memory-semantic SSD”, Samsung is now building custom AI chips that leverage a host of technologies, including processing-in-memory (PIM), processing-near-memory (PNM) and CXL, to reach new levels of performance and efficiency.
SK hynix is another name that comes up over and over in the context of CXL. Late last year, the DRAM and flash memory maker developed a DDR5-based CXL computational memory solution, becoming the industry’s first to introduce computational functions in CXL memory chips. It is due to demonstrate the solution at CES 2023 in Las Vegas in just a couple of days.
Micron is one of the loudest voices in the CXL conversation. A couple of years ago, Micron killed off production of its 3D XPoint non-volatile memory technology, stirring up speculation. But the move makes sense now that the company has voiced its support for CXL, making it obvious that Micron has a bigger plan than catering to just PMEM buyers.
If you are eyeing the new AMD EPYC Genoa or the upcoming Intel Xeon Scalable Sapphire Rapids, you will be glad to know that they both come with support for CXL.
IBM, Astera Labs and several other companies too have joined the race, and we will see the list continue to grow in 2023.
In Conclusion
CXL is a technology that everyone should care about, and not just because it is disruptive and brings a never-before-seen degree of pliability and composability to server architectures, important as those are. Its significance in the context of artificial intelligence is even more pressing. The materialization of CXL is critical because it has the potential to play a leading role in realizing the vision of AI and ML in modern-day IT, and companies recognize that. That is why we must be ready for CXL.
As for 2023 CXL predictions, we’ll see a continued push to bring more CXL-enabled technologies to market. In the coming days, CXL will continue to empower high-performance storage as it matures further and is deployed at scale. Suffice it to say that CXL will get more real in 2023.
If you’re as passionate as we are about CXL, then you’re in luck. At Gestalt IT, we have a ton of great content on CXL, like the one above, that explores its capabilities and potential. Be sure to check out our Utilizing Tech series, where we talk with industry experts about the evolution of CXL and cover the blow-by-blow details of what’s going on in that space.