Published October 18, 2024

Navigating the AI Compute Curve at the 2024 OCP Summit

Building a collaborative ecosystem for sustainable and scalable data center and software solutions

By Jenna Boller, VP of Marketing at Ocient

The 2024 Open Compute Project (OCP) Global Summit in San Jose this week had one clear and repeated theme – the AI era is fueling a surge in demand for energy that is not currently sustainable.

During the summit, I joined representatives from Solidigm, Supermicro, and Iceotope for a panel discussion moderated by Allyson Klein from TechArena: Navigating the AI Compute Curve: Adoption of Efficiently Scalable Data Center Solutions.

Kelley Mullick, Jenna Boller, Roger Corell, and Wendell Wenjen joined Allyson Klein from TechArena at the 2024 OCP Global Summit

Some highlights and takeaways from our panel:

  • Today, data centers use 2% of the world’s power. This is expected to reach 17% in the coming years.
  • In Ocient’s Beyond Big Data report launched last month, 31% of respondents said they would be willing to switch to a different data or AI solution if they could reduce energy consumption for compute-intensive workloads.
  • Innovation in liquid cooling enables almost 100% heat capture, with solutions under development to store the captured heat and transfer it to swimming pools and other nearby facilities.
  • Energy supply and demand may drive opportunities and challenges within edge computing, particularly for AI use cases.

In addition to the panel, the OCP Summit included several presentations on sustainability, data center energy efficiency, and the measurement of Scope 2 and Scope 3 emissions. Many sessions ended with a clear call to action to advance the impact of these working groups.

Here are three key themes I took away from the 2024 OCP Global Summit.

A call for transparency around the energy cost of AI and data applications

During our panel discussion and throughout the Q&A sessions I attended at the OCP Summit, there was considerable discussion of the need for software companies to transparently report the energy their compute-intensive applications consume. Without that transparency, buyers and users have no real way to optimize or govern their tech stack against energy requirements or goals.

To date, much of the discussion has centered on efficient data centers. However, by partnering closely with hardware and component manufacturers, Ocient has been able to demonstrate a 50-90% reduction in the cost, system footprint, and energy consumption of an always-on, compute-intensive data environment compared with legacy and cloud-based data platform software solutions.

To me, this makes it clear that software has a role to play in driving down the energy consumption and cost of compute-intensive data and AI applications. Transparency around the energy cost of software applications can lead to more efficient computing by ensuring that software is written and modernized to make the most of the underlying hardware and cloud instances.
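As a concrete illustration of what that transparency could look like in practice, here is a minimal sketch of per-workload energy reporting on a Linux host, assuming an Intel CPU that exposes RAPL counters through the powercap sysfs interface. The counter path, the permissions involved, and the example workload are illustrative assumptions, not part of any standard reporting framework.

    # Minimal sketch: measure the package energy a workload consumes, assuming
    # a Linux host with Intel RAPL exposed via the powercap sysfs interface.
    # Reading energy_uj typically requires root on recent kernels.
    import time
    from pathlib import Path

    RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")  # package-0 counter

    def read_energy_uj() -> int:
        """Read the cumulative package energy counter, in microjoules."""
        return int(RAPL.read_text())

    def measure(workload) -> None:
        """Run a workload and report wall time, energy, and average power."""
        e0, t0 = read_energy_uj(), time.monotonic()
        workload()
        e1, t1 = read_energy_uj(), time.monotonic()
        joules = (e1 - e0) / 1e6  # counter wraparound handling omitted for brevity
        print(f"{t1 - t0:.2f} s, {joules:.1f} J, avg {joules / (t1 - t0):.1f} W")

    if __name__ == "__main__":
        measure(lambda: sum(i * i for i in range(10_000_000)))

Even a simple report like this, aggregated per query or per job, would give buyers the visibility the panel called for.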

There is an urgent need to adopt a common framework for carbon measurement and reporting

In a session titled From Metrics to Impact: Implementing Effective Carbon Reduction Strategies, Andrea Desimone from Schneider Electric and Kellie Jensen from Meta presented challenges in measuring and reporting on carbon emissions along with their ideas to build effective decarbonization strategies.

“From Metrics to Impact: Implementing Effective Carbon Reduction Strategies” presented the goal of developing a standard approach to measuring the carbon impact of data centers.

By developing and aligning around a standard carbon metric for the various data center equipment and device components, organizations would have a way to rate the sustainability of the equipment they procure. Perhaps more importantly, they could weigh the carbon impact of their preferred solutions alongside price, performance, and other common purchase considerations.

Desimone and Jensen also introduced some of the complexities of measuring embodied carbon (emitted while manufacturing and transporting equipment) versus operational carbon (emitted through the energy the equipment consumes in use), including the challenge of accounting for both when making an informed decision.
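To make the distinction concrete, here is a hypothetical back-of-the-envelope comparison. Every figure below (grid carbon intensity, PUE, embodied carbon, power draw, lifetime) is an illustrative placeholder of my own, not a value from the session.

    # Hypothetical lifecycle carbon comparison: embodied + operational kgCO2e.
    # All constants below are illustrative assumptions, not measured values.
    GRID_INTENSITY = 0.4  # assumed grid carbon intensity, kgCO2e per kWh
    PUE = 1.3             # assumed data center power usage effectiveness

    def lifecycle_carbon(embodied_kg: float, avg_power_kw: float,
                         lifetime_years: float) -> float:
        """Total kgCO2e over a device's lifetime: embodied plus operational."""
        hours = lifetime_years * 365 * 24
        operational = avg_power_kw * hours * PUE * GRID_INTENSITY
        return embodied_kg + operational

    # A denser, more efficient node may cost more embodied carbon up front
    # yet still win decisively on lifetime emissions.
    legacy = lifecycle_carbon(embodied_kg=1500, avg_power_kw=0.8, lifetime_years=5)
    efficient = lifecycle_carbon(embodied_kg=2000, avg_power_kw=0.3, lifetime_years=5)
    print(f"legacy: {legacy:,.0f} kgCO2e, efficient: {efficient:,.0f} kgCO2e")

Without a standard covering both terms, two vendors can each claim the greener product simply by reporting whichever metric favors them – exactly the alignment problem the presenters described.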

In addition, their research uncovered challenges in driving alignment across the industry around a standard framework for carbon measurement and reporting.

Once a standard metric is defined, the next challenge is driving industry alignment and adoption.

Software and hardware vendors must co-design sustainable solutions

Ashish Jain from AMD shared that his company has set a “30×25” goal – a 30x improvement in energy efficiency by 2025. While the company is on track to meet that goal, part of his call to action was more co-design between hardware and software vendors, with tight collaboration across system components.

Ashish Jain from AMD presents his company’s call to action for continued impact.

He also called for standardized power monitoring and management capabilities across system components.

Across all the presentations, one thing was clear – collaboration is at the heart of OCP’s working groups. It was encouraging to see cross-sector representatives from utilities, tech giants, and startups leading projects and initiatives to drive adoption of more sustainable metrics and best practices.

Ocient delivers efficient compute-intensive data and AI workloads

While at the conference, I talked to Allyson Klein from TechArena and Jeniece Wnorowski from Solidigm about the ways in which Ocient is pioneering a new software architecture and services to drastically reduce the cost, system footprint, and energy consumption of always-on, compute-intensive data and AI solutions.

To hear more about Ocient’s innovation in the data warehousing technology space, check out the podcast here.

From its founding, Ocient optimized for efficient performance at scale in order to deliver the lowest cost and system footprint to customers. We partner closely with companies like Solidigm to leverage the latest high-performance, industry-standard hardware, and we architect our software layer to process data as efficiently as possible.

From our Compute Adjacent Storage Architecture® (CASA), which collapses the compute and storage layers into a single tier, to Zero Copy Reliability™, which keeps data reliable without maintaining full copies, Ocient’s breakthrough approach delivers a 50-90% reduction in system cost, footprint, and energy consumption for always-on, compute-intensive workloads.
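For readers unfamiliar with how data can be kept reliable without full copies, the sketch below illustrates the general idea with simple XOR parity. This is a toy example of the broad class of parity- and erasure-coded protection schemes; it is not Ocient’s actual implementation, and the function names are my own.

    # Toy illustration of parity-based protection: one small parity chunk
    # (not a full replica) is enough to rebuild any single lost data chunk.
    # This is NOT Ocient's implementation; it only shows the principle.
    def xor_parity(chunks: list[bytes]) -> bytes:
        """XOR equal-length chunks together to produce a parity chunk."""
        parity = bytearray(len(chunks[0]))
        for chunk in chunks:
            for i, b in enumerate(chunk):
                parity[i] ^= b
        return bytes(parity)

    def recover(surviving: list[bytes], parity: bytes) -> bytes:
        """Rebuild one lost chunk from the survivors plus the parity chunk."""
        return xor_parity(surviving + [parity])

    data = [b"abcd", b"efgh", b"ijkl"]        # three data chunks
    p = xor_parity(data)                      # 33% overhead vs. 100%+ for replicas
    assert recover([data[0], data[2]], p) == data[1]  # lose a chunk, rebuild it

Protecting data with parity rather than replicas is one way a storage layer can cut its footprint – and with it, energy – while preserving durability against a single failure.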

Whether you’re looking to reduce the cost and energy footprint of your existing data-intensive workloads or architecting a new AI application, please reach out if you’d like to explore the cost and energy savings you can realize by migrating to Ocient.