
Efficiently Scaling AI Data Infrastructure with Ocient | Utilizing Tech 07×05

As the volume of data supporting AI applications grows ever larger, it’s critical to deliver scalable performance without overlooking power efficiency. This episode of Utilizing Tech, sponsored by Solidigm, brings Chris Gladwin, CEO and co-founder of Ocient, to talk about scalable and efficient data platforms for AI with Jeniece Wnorowski and Stephen Foskett. Ocient has developed a new data analytics stack focused on scalability and energy efficiency for ultra-large data analytics applications. At scale, applications need to incorporate trillions of data points, and it is not just desirable but necessary to do this without losing sight of energy consumption. Ocient leverages flash storage to reduce power consumption and increase performance, and it also moves data processing closer to the storage to reduce power consumption further. This type of integrated storage and compute would not be possible without flash, and it reflects the architecture of modern processors, which locate memory on-package with compute. Ocient is already popular in telco, e-commerce, and automotive, and the scale of data required by AI applications is similar, especially as concepts like retrieval-augmented generation are implemented. The conversation around data center, cloud, and AI energy usage is coming to the fore, and we must all address the environmental impact of everything we do.

Apple Podcasts | Spotify | Overcast | More Audio Links | UtilizingTech.com


Data Analytics’ Troubling Energy Demands, with Chris Gladwin of Ocient

A problem that has existed in computing for decades is that the cost of analyzing data grows disproportionately with its volume.

“If you want to analyze a million times the data, it’s going to cost you more than a million times the dollars,” says Chris Gladwin, CEO and co-founder of Ocient, a Chicago-based data analytics company known for its breakthroughs in energy efficiency.

This compounding cost acceleration is a persistent challenge that has left many organizations struggling to make full use of their data.

The Cost of Doing Analytics on Data Is No Small Sum

Artificial intelligence is setting the tech world alight, and data analytics has become the phrase du jour. Emerging AI superpowers are plowing money into building gigantic, power-hungry data centers to crunch through repositories of data with AI.

The cost of that is becoming increasingly clear. Analytics at this scale demands astronomical computing power. In the last few years, computing demand in enterprises has shot up tenfold.

“The ultra-large analytics workloads of today will become tomorrow’s normal, and eventually that’s what your phone will do,” he says.

It is a remarkable feat to extract huge amounts of business intelligence from raw, unstructured data. But it cannot be done without arrays of servers fitted with specialized chips.

Behind the scenes, these servers draw kilowatts of power to perform intensive calculations. That drives up carbon emissions, as a majority of data centers still run on fossil fuels.

“As an industry, we really have to be on the right trajectory of efficiency,” Gladwin urges. “Otherwise, we’re going to have some real problems powering this future.”

This episode of Utilizing Tech, presented by Solidigm, focuses on the importance of balancing scalable performance and energy efficiency in AI data infrastructure. Host Stephen Foskett and Jeniece Wnorowski, Data Center Product Marketing Manager at Solidigm, talk to Gladwin about the immense energy footprint of data centers and how it can be contained before it blows out of proportion.

Some Perspective

Data has grown an incomprehensible amount since the turn of this century. Research shows that approximately 402.74 million terabytes of data are generated every day – roughly 400 exabytes, or about 0.4 zettabytes.

What was measured in petabytes only a few years back has quickly burgeoned into exabytes and zettabytes.

“The next number that people are going to start to learn is quadrillion, which is a thousand trillion. Right now, we’re working on the first quadrillion-scale system. These things don’t deploy overnight.”

This has ignited an obsession with always-on, hyperscale, compute-intensive data analytics workloads.

The biggest pursuer right now is the telecommunications sector, which is pouring somewhere between $5 trillion and $10 trillion into 5G.

“Telcos are big networks and they’re going through a process of making the largest investment in human history – 5G.”

A large telecom company generates massive amounts of data and metadata every day, owing to the countless connections its network hosts. With 5G, that volume grows 30-fold.

“If they want to go back and analyze why the network is slow in a certain area, or where they should put their next cell tower, or for compliance reasons, they have to analyze this data, and they can’t analyze at that trillion scale.”

Other sectors, like automotive and financial services, are going through the same crisis. A big portion of the data generated every year gets thrown away for lack of cost-efficient and sustainable solutions.

“You need at least 500 cores to deliver that solution. Typically, in terms of data volume, the average query or the average machine learning function is going to look at hundreds of billions if not trillions of data elements, or rows in a spreadsheet would be one way to think of that.”
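
To make that scale concrete, here is a rough back-of-the-envelope sizing sketch in Python. The per-core scan rate is an illustrative assumption, not a figure from the episode:

```python
# Rough sizing arithmetic for the workloads Gladwin describes.
# The per-core scan rate below is an assumption for illustration only.

rows = 1_000_000_000_000   # one trillion data elements in a single query
cores = 500                # the minimum core count Gladwin cites
scan_rate = 100_000_000    # assumed rows scanned per second per core

rows_per_core = rows / cores          # 2 billion rows for each core
seconds = rows_per_core / scan_rate   # ~20 s with every core running flat out

print(f"{rows_per_core:,.0f} rows per core -> ~{seconds:.0f} s per query")
```

Even under a generous assumed scan rate, a single trillion-row query keeps all 500 cores busy for tens of seconds, which is why these workloads demand dedicated, large-scale infrastructure.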

These sectors make up Ocient’s target market.

A Bad Option Is No Option at All

For years, companies have addressed the situation with one of two available options – DRAM or hard disk drives.

Building machines with enough DRAM to support the growing volumes of data is an insanely expensive proposition.

The alternative lacks the required performance. “Performance is ultimately a physical phenomenon,” says Gladwin. “It’s how fast the read-write head settles into the track, and how fast the platter spins. That time hasn’t changed for decades, and on a Moore’s-law-adjusted basis, spinning discs keep getting slower and slower.”
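
Gladwin’s point can be checked with simple arithmetic. Below is a minimal back-of-the-envelope sketch in Python; the drive characteristics are typical published figures, assumed here for illustration rather than taken from the episode:

```python
# Why "performance is ultimately a physical phenomenon" for spinning discs:
# a random read must wait for the head to seek and the platter to rotate.
# All figures below are typical values assumed for illustration.

RPM = 7200                        # platter speed of a common nearline HDD
avg_rotation_s = (60 / RPM) / 2   # wait half a rotation on average: ~4.2 ms
avg_seek_s = 0.0085               # typical average seek time: 8.5 ms

hdd_latency_s = avg_seek_s + avg_rotation_s
hdd_iops = 1 / hdd_latency_s      # roughly 80 random reads per second

ssd_latency_s = 100e-6            # ~100 µs for an NVMe random read
ssd_iops = 1 / ssd_latency_s      # roughly 10,000 per second, per queue

print(f"HDD: {hdd_latency_s * 1e3:.1f} ms/read -> {hdd_iops:.0f} IOPS")
print(f"SSD: {ssd_latency_s * 1e6:.0f} µs/read -> {ssd_iops:.0f} IOPS")
print(f"Gap: ~{ssd_iops / hdd_iops:.0f}x per operation, before parallelism")
```

The mechanical delays are fixed by physics, while flash latency keeps improving and each SSD can serve many reads in parallel across its dies, which is how the per-dollar gap Gladwin describes next compounds into the thousands.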

Ocient has devised a novel software architecture that can perform complex analytics on large datasets at breakthrough cost efficiency. Targeted at the emerging crop of ultra-large analytics workloads, the solution can scale limitlessly without the usual cost escalations.

“We can do a 50 to 90% reduction in energy consumption,” says Gladwin, crediting Ocient’s unwavering focus and dedicated, diligent engineering work.

Ocient leverages solid-state drives to shrink the power and cost envelope.

“Solid-state drives today offer approximately 2,000 to 3,000 times the performance per dollar of spinning discs. That’s limited not by a physical phenomenon but an electrical one. It’s on a Moore’s law curve.”

Frontline companies like Solidigm have pumped money into SSD innovation consistently over the past decade. As a result, the technology is thriving in the big data analytics field today.

Making a Big Difference Takes Change

Ocient adopts an approach similar to Apple’s and NVIDIA’s modern processors, which locate memory on-package with compute.

“If you want to run a query or ML function, you cannot use the usual architecture. It takes a full day to move petabytes of data.”

The traditional architecture has a two-tier design that separates storage and compute. To make up for the lower bandwidth the separate tiers impose, it works only on small datasets.

“That’s fine if it’s a GB or even a TB of data. When you’re getting into the hundreds of terabytes, or the tens-of-petabytes scale, it is not going to work.”

Ocient’s architecture collapses that design. In this model, compute and storage are squeezed together into a single tier. Data no longer has to be hauled across the network from storage systems to compute servers; instead, it flows in and out through parallel PCIe lanes within the server, multiplying the available bandwidth.
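
To make the distinction concrete, here is a minimal sketch (not Ocient’s actual code) of where the filtering happens in each design; the row shape, helper names, and threshold are hypothetical:

```python
# Toy illustration of two-tier vs. collapsed analytics designs.
# Rows are plain dicts; the predicate keeps only slow network samples.
from typing import Iterable, Iterator

def two_tier_query(remote_rows: Iterable[dict]) -> list[dict]:
    """Traditional design: every row crosses the network to a separate
    compute tier, which then filters. Bytes moved ~ whole dataset."""
    fetched = list(remote_rows)  # full dataset hauled over the network
    return [r for r in fetched if r["latency_ms"] > 100]

def collapsed_query(local_rows: Iterable[dict]) -> Iterator[dict]:
    """Collapsed design: the predicate runs in the same server as the
    drives, over PCIe, and only matches leave. Bytes moved ~ result."""
    for r in local_rows:         # scan stays local to the NVMe drives
        if r["latency_ms"] > 100:
            yield r

rows = [{"tower": i % 7, "latency_ms": (i * 37) % 200} for i in range(10_000)]
assert list(collapsed_query(rows)) == two_tier_query(rows)
```

Both designs produce the same answer; the win is that the collapsed design only moves the matching rows off the node, so bandwidth scales with results rather than with raw data.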

“The problem we often have is that the rate at which data is being added to the system is greater than the rate at which other systems can ingest it. So you never catch up.”

With Ocient’s ultra-efficient solution, racks of equipment and hundreds of kilowatts of power draw can shrink to half a rack and one-tenth the energy footprint. This does two things at once – it reduces the physical footprint of deployments and, as a natural consequence, minimizes the cost blowout.

“We immediately come out with 50% to 90% improvement, in some cases demonstrating even a 98% reduction.”

Ocient prioritizes the low-hanging fruit that has big impacts on cost and energy savings before striving for more ambitious goals.

Gladwin urges customers to be circumspect about purchasing systems without investigating their energy footprints. When that awareness is baked into purchasing decisions, not only will customers enjoy breakthrough economics, but the whole industry will be compelled to turn its focus to long-term sustainability.

Ocient has joined forces with a number of industry heavyweights to up the game. Together with partners like Solidigm, it is defining measurements and metrics, and setting visions and goals, to push awareness deeper into the industry.

Be sure to catch the episode – Efficiently Scaling AI Data Infrastructure with Ocient – at the Utilizing Tech website, or on your favorite podcast platform. Head over to Ocient’s website to learn more about their solution. To learn about Solidigm’s sustainability efforts, be sure to browse the archives on Solidigm’s website.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series and President of the Tech Field Day Business Unit, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter and read more on the Gestalt IT website.

Jeniece Wnorowski is the Data Center Product Marketing Manager at Solidigm. You can connect with Jeniece on LinkedIn and learn more about Solidigm and their AI efforts on their dedicated AI landing page or watch their AI Field Day presentations from the recent event.

Chris Gladwin is the CEO and Co-founder of Ocient. You can connect with Chris on LinkedIn. Learn more about Ocient on their website.

Thank you for listening to Utilizing Tech with Season 7 focusing on AI Data Infrastructure. If you enjoyed this discussion, please subscribe in your favorite podcast application and consider leaving us a rating and a nice review on Apple Podcasts or Spotify. This podcast was brought to you by Solidigm and by Tech Field Day, now part of The Futurum Group. For show notes and more episodes, head to our dedicated Utilizing Tech Website or find us on X/Twitter and Mastodon at Utilizing Tech.

About the author

Sulagna Saha

Sulagna Saha is a writer at Gestalt IT where she covers all the latest in enterprise IT. She has written widely on a range of topics. On gestaltit.com she writes about the hottest technologies in cloud, AI, security, and more.

A writer by day and reader by night, Sulagna can be found busy with a book or browsing through a bookstore in her free time. She also likes cooking fancy things on leisurely weekends. Traveling and movies are other things high on her list of passions. Sulagna works out of the Gestalt IT office in Hudson, Ohio.
