Highlights:

  • Due to security and privacy concerns, organizations across several industries want control over the AI models and data they use.
  • Equinix and its regional partners assess each customer’s AI infrastructure needs and translate them into precise specifications.

Digital infrastructure provider Equinix Inc. has partnered with Nvidia Corp. to introduce a private cloud service that lets businesses run their own Nvidia DGX supercomputing infrastructure for building custom generative artificial intelligence models.

The new service, which Equinix operates, combines Nvidia’s DGX systems and networking with Nvidia AI Enterprise software, a suite of tools for deploying AI applications.

Generative AI has a wide range of potential applications across industries. In healthcare, it is transforming patient care and drug discovery. In financial services, it analyzes enormous volumes of data to aid fraud detection and customer support. In retail, personalized content is improving consumer experiences. Despite the excitement, many businesses lack the expertise to build and manage sophisticated AI systems.

Due to security and privacy concerns, organizations across several industries also want control over the AI models and data they use. During an analyst briefing announcing the service, Charlie Boyle, Vice President of DGX systems at Nvidia, explained that Equinix partnered with Nvidia to ensure businesses can place AI close to their data, whether they are training new models or applying advanced techniques such as retrieval-augmented generation.

Boyle said, “Our customers have come to us over the years saying they need AI functionality to add to their business applications. Every day they don’t have that functionality is a day of lost opportunity. So, we want to make it very easy and flexible for them by working with Equinix to build a turnkey solution.”

Delivering a complete, proven design is central to this collaboration. For years, customers have asked for disaggregated solutions that give them freedom and choice in their infrastructure deployments. But, as the saying goes, be careful what you wish for.

With data center technologies fully disaggregated, clients now struggle to combine the right components to advance AI projects and scale for the future. That is precisely the problem the service-based partnership between Nvidia and Equinix addresses.

Here is how the service works: Equinix and its regional partners assess a customer’s AI infrastructure needs and translate them into precise specifications. A single invoice for all the acquired technology streamlines the process. Behind the scenes, Nvidia and Equinix configure Equinix’s global network of data centers and deploy the required systems.

Equinix deploys Nvidia infrastructure for its clients across 250 IBX data centers in 32 countries, ensuring fast, secure connectivity to cloud services and the internet while meeting stringent data protection and compliance requirements. Customers gain access to Nvidia’s AI software and concurrent support from Nvidia and Equinix specialists, which simplifies building AI applications, allows their solutions to scale globally, and removes the need for in-house expertise.

The private cloud service is built on DGX SuperPOD, Nvidia’s AI data center infrastructure platform. This means businesses do not have to worry about data center space, power, or infrastructure management when setting up their AI environments. According to Jon Lin, Executive Vice President and General Manager of Data Center Services at Equinix, “Equinix and Nvidia provide a new level of convenience” by handling the difficult work of system setup for customers.

Lin said, “It’s challenging for enterprises to deploy Nvidia DGX inside of their existing legacy data centers, which may not have enough power or cooling to support new workloads and capabilities. We’re hearing loud and clear from our enterprise customers that sustainability is at the forefront of everything they’re thinking about on their infrastructure.”

Given the operational demands of AI infrastructure, including higher electricity and cooling requirements, sustainability is a critical component of the new service. For instance, Equinix has committed to running all of its operations, including the data centers that house Nvidia’s AI systems, entirely on renewable energy.

The company also uses cooling equipment that consumes less water and energy, and customers receive detailed reports on sustainability metrics specific to their deployments to help them meet their own sustainability goals.

That last point is a compelling part of the collaboration’s value proposition. Sustainability is a top priority for business leaders and IT professionals, and so is AI, yet the infrastructure AI requires often works against sustainability goals because of its demands for cooling, data center space, and extensive infrastructure upgrades. With this service-led approach, customers can move forward with AI while delegating sustainability management to Equinix, which has years of experience in the field.

With its global availability, expert management, and commitment to sustainability, the service is well suited to companies that want to adopt AI without the burden of building and maintaining their own infrastructure.