Highlights:

  • According to the sources, the company plans to build a supercomputer powered by 100,000 Nvidia Corp. H100 graphics processing units to support its AI development efforts.
  • xAI launched in July of last year and unveiled Grok, a large language model capable of generating both text and code, three months later.

AI startup xAI Corp. has secured USD 6 billion in Series B funding to accelerate its product development and commercialization efforts.

The company, founded just last year by Elon Musk, disclosed on Sunday that the round drew more than half a dozen investors, including Valor Equity Partners, Vy Capital, Andreessen Horowitz, Sequoia Capital, Fidelity Management and Research, Prince Alwaleed Bin Talal, and Kingdom Holding. The investment lifts xAI’s valuation to USD 24 billion, up from its previous valuation of USD 18 billion.

The company said it will use the proceeds to bring its first commercial offerings to market and to build out advanced infrastructure. The announcement comes days after a report from The Information that xAI plans to build a supercomputer powered by 100,000 Nvidia Corp. H100 graphics processing units to support its AI development efforts.

In March, Nvidia succeeded the H100 with its latest flagship GPU, the Blackwell B200. At the peak of demand for the H100 last year, retail prices reportedly climbed to USD 40,000. Even if the Blackwell B200’s arrival cuts the H100’s price by two-thirds, the 100,000-GPU supercomputer xAI intends to build would still cost more than USD 1 billion in chips alone.
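The billion-dollar estimate can be checked with back-of-the-envelope arithmetic; the figures below are the article’s reported peak retail price and its assumed two-thirds price drop, nothing more.

```python
# Back-of-the-envelope check of the article's cluster-cost estimate.
# All figures come from the article; this is an approximation, not a quote.

H100_PEAK_PRICE_USD = 40_000   # reported peak retail price last year
PRICE_DROP_FRACTION = 2 / 3    # assumed drop after the Blackwell B200's debut
GPU_COUNT = 100_000            # size of the planned supercomputer

discounted_price = H100_PEAK_PRICE_USD * (1 - PRICE_DROP_FRACTION)
cluster_cost = discounted_price * GPU_COUNT

print(f"Per-GPU price after the drop: ~${discounted_price:,.0f}")
print(f"GPU cost for the cluster: ~${cluster_cost / 1e9:.2f} billion")
```

Even at roughly a third of the peak price per chip, the GPU bill alone lands around USD 1.3 billion, consistent with the report’s “exceeding USD 1 billion” figure.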

Sources cited by The Information said the company aims to have the system running by fall. xAI could partner with Oracle Corp. on the project: building AI training infrastructure at that scale requires more than GPUs, including hardware such as storage systems, which Oracle could supply.

Three months after its July launch, xAI unveiled Grok, a large language model capable of generating both text and code. The company subsequently open-sourced the model and introduced a more advanced version, Grok 1.5. In internal testing, Grok 1.5 correctly solved 74.1% of the problems in HumanEval, a widely used benchmark for evaluating neural networks’ code-generation abilities.
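HumanEval scores a model by whether its generated code actually passes each problem’s unit tests. The sketch below illustrates that pass@1 scoring loop with two toy tasks and hard-coded “model output” standing in for real generations; the actual benchmark uses 164 handwritten programming problems and a sandboxed runner.

```python
# Minimal sketch of HumanEval-style pass@1 scoring: a candidate solution
# counts as a pass only if it runs the task's unit tests without error.
# The candidate code below is a hard-coded stand-in for model output.

def run_candidate(candidate_src: str, test_src: str) -> bool:
    """Execute candidate code, then its tests; True if everything passes."""
    namespace: dict = {}
    try:
        exec(candidate_src, namespace)   # define the candidate function
        exec(test_src, namespace)        # run the unit tests against it
        return True
    except Exception:
        return False

# Two toy tasks standing in for HumanEval problems: (candidate, tests).
tasks = [
    ("def add(a, b):\n    return a + b", "assert add(2, 3) == 5"),
    ("def is_even(n):\n    return n % 2", "assert is_even(4) is True"),  # buggy
]

passed = sum(run_candidate(code, tests) for code, tests in tasks)
pass_at_1 = passed / len(tasks)
print(f"pass@1 = {pass_at_1:.1%}")   # one of the two toy tasks passes
```

Grok 1.5’s reported 74.1% is this ratio computed over the full benchmark rather than two toy tasks.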

In its brief statement on the USD 6 billion round, xAI said it is working on “multiple exciting technology updates and products soon to be announced.” The company did not specify which products are in development, but in mid-April, when it unveiled a version of Grok 1.5 with image-processing capabilities, xAI said it aims to significantly improve “both our multimodal comprehension and generation capacities in the upcoming months.”

Multimodal AI models are neural networks that can process data types beyond text. Over the past year, xAI rival OpenAI has released models capable of generating images, videos, and speech. Elon Musk’s startup may be weighing a similar path for its commercialization efforts.

Any new AI models xAI develops will presumably be accessible through commercial application programming interfaces (APIs). The company currently offers access to Grok 1.0, the first version of its flagship large language model, through an API available via an early access program.
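The article does not describe xAI’s API, so the sketch below only illustrates the general shape of a commercial LLM API call, assuming an OpenAI-style chat-completions HTTP interface, a common convention among model providers. The endpoint URL, model name, and response fields are all assumptions, not xAI’s documented interface.

```python
# Illustrative only: how a commercial LLM API is typically called over HTTP.
# The endpoint URL, model name, and response fields are ASSUMPTIONS modeled
# on the common OpenAI-style convention, not xAI's documented API.
import json
import urllib.request


def build_request(prompt: str, api_key: str,
                  base_url: str = "https://api.example.com/v1") -> urllib.request.Request:
    """Build the HTTP request; separate so it can be inspected without a network call."""
    payload = {
        "model": "grok-1",  # hypothetical model identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


def chat(prompt: str, api_key: str) -> str:
    """Send the request and pull the assistant's reply out of the response."""
    with urllib.request.urlopen(build_request(prompt, api_key)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Keeping request construction separate from the network call is a small design convenience: the payload and headers can be unit-tested offline.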

The company has also built a desktop application that helps developers with prompt engineering, the practice of refining prompts to improve the quality of an AI model’s output, in this case Grok’s. Part of xAI’s new funding could go toward additional tooling for software teams that integrate its AI models.
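The article does not describe how xAI’s application works; in general, though, prompt-engineering tooling automates the chore of generating and comparing prompt variants. A trivial, purely illustrative sketch of that idea:

```python
# Illustrative sketch of one thing prompt-engineering tools automate:
# expanding a template into prompt variants so outputs can be compared.
# The template and variant axes below are invented for illustration.
from itertools import product

TEMPLATE = "You are a {persona}. {instruction}\n\nText: {text}"

personas = ["concise technical writer", "patient teacher"]
instructions = ["Summarize in one sentence.", "List the three key points."]


def build_variants(text: str) -> list[str]:
    """Return one filled-in prompt per persona/instruction combination."""
    return [
        TEMPLATE.format(persona=p, instruction=i, text=text)
        for p, i in product(personas, instructions)
    ]


variants = build_variants("xAI raised USD 6 billion in Series B funding.")
print(len(variants))  # 4 variants: 2 personas x 2 instructions
```

Each variant would then be sent to the model and the outputs compared, the iterative loop such tools are built to streamline.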