Highlights:

  • The recently announced open-source Granite models offer another hint at IBM’s generative AI strategy, which focuses on algorithms that generate software code.
  • The company released some specifics about the Granite models’ coding performance, claiming they outperform rival models on several industry benchmarks, such as GSM8K, HumanEvalPlus, and HumanEvalPack.

IBM Corp. has updated its Watsonx platform with a new family of open-source Granite models, signaling the company’s intent to stay at the forefront of generative artificial intelligence development.

Additionally, it introduced InstructLab, a project designed to hasten the open-source community’s contributions to the advancement of generative AI.

Chief Executive Arvind Krishna said that open source will be a crucial part of IBM’s generative AI strategy. “We want to use the power of open source to do with AI what was successfully done with Linux and OpenShift,” he stated while addressing the company’s annual Think conference.

Mohamad Ali, senior vice president and chief operating officer of IBM Consulting, said the company has been working with AI businesses of all kinds, which is not surprising given its traditional enterprise bent. However, IBM is focusing on businesses rather than the consumers that most AI companies are striving to appeal to.

And those companies require assistance. According to him, 42% of clients have moved AI pilots into production, while 40% are still experimenting in the sandbox. He added that IBM is collaborating with clients on over 300 AI projects.

The recently announced open-source Granite models offer another hint at IBM’s generative AI strategy, which focuses on algorithms that generate software code. IBM claims they are exceptionally good at coding tasks, combining efficiency with the capacity to produce higher-quality code than many other large language models.

Granite LLMs excel in coding

According to IBM, the Granite LLMs will be offered in base and instruction-following versions, with sizes ranging from three billion to 34 billion parameters. They can handle a range of tasks, including generating code, resolving issues, explaining and documenting code, modernizing applications, managing repositories, and more. The company says the base models achieve state-of-the-art performance across coding tasks compared with other LLMs because they were trained on code from an astounding 116 programming languages.

The company released some specifics about the Granite models’ coding performance, claiming they outperform rival models on several industry benchmarks, such as GSM8K, HumanEvalPlus, and HumanEvalPack. Those benchmarks evaluate activities such as code synthesis, explanation, editing, translation, and fixing in popular programming languages, including Python, Rust, Go, JavaScript, C++, and Java.
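Benchmarks in the HumanEval family score a model by checking whether its completion of a function stub passes hidden unit tests; the share of tasks solved on the first try is reported as pass@1. A minimal sketch of that scoring loop follows — the task, the candidate completion, and the unit tests are all invented for illustration, not taken from the actual benchmark:

```python
# Sketch of HumanEval-style scoring: a task is a function stub, a model
# supplies a completion, and hidden unit tests decide pass or fail.
# Task, completion, and tests here are illustrative, not benchmark data.

task_prompt = '''
def running_max(nums):
    """Return a list where element i is the max of nums[0..i]."""
'''

# A completion such as a code LLM might emit for the stub above.
candidate_completion = '''
    result = []
    current = float("-inf")
    for n in nums:
        current = max(current, n)
        result.append(current)
    return result
'''

def passes_tests(prompt: str, completion: str) -> bool:
    """Execute stub + completion, then run the task's unit tests."""
    namespace = {}
    exec(prompt + completion, namespace)  # build the candidate function
    f = namespace["running_max"]
    try:
        assert f([3, 1, 4, 1, 5]) == [3, 3, 4, 4, 5]
        assert f([]) == []
        return True
    except AssertionError:
        return False

print(passes_tests(task_prompt, candidate_completion))  # prints True
```

Aggregating this boolean over every task in the suite yields the pass@1 figure that the Granite results are compared on.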

In fact, the company is so sure of Granite’s capabilities that it used the 20 billion-parameter base code model to train Watsonx Code Assistant for specific domains. That model also powers Watsonx Code Assistant for Z, an AI assistant meant to help rewrite mainframe applications written in COBOL.

Furthermore, IBM said that the 20 billion-parameter Granite base code model has been optimized to generate Structured Query Language queries from natural language instructions, enabling users to extract insights from their databases even without SQL expertise.
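IBM has not published the model’s prompt format, but the workflow it describes can be illustrated end to end: a natural-language request goes in, SQL comes out, and the query runs against the user’s database. In the sketch below the SQL string is a hand-written stand-in for what such a fine-tuned model might generate; it is not actual Granite output:

```python
import sqlite3

# Illustrative natural-language-to-SQL workflow. The query string is a
# hand-written stand-in for a model's response, not real Granite output.
request = "Show each region's total sales, highest first."

generated_sql = """
SELECT region, SUM(amount) AS total_sales
FROM sales
GROUP BY region
ORDER BY total_sales DESC;
"""

# A toy database standing in for the user's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("Americas", 300.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# Run the "generated" query exactly as a non-SQL user would consume it.
for region, total in conn.execute(generated_sql):
    print(region, total)
```

The appeal of the feature is that the user only writes the request string; the model supplies the `GROUP BY`/`ORDER BY` mechanics.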

According to a renowned analyst, IBM’s emphasis on generative AI code generation is not unexpected, given that a large number of its primary clients are enterprise chief information officers and IT executives, who are especially intrigued by generative AI’s potential for IT modernization.

“IBM perhaps foresees an opportunity to help those customers with modernization,” he stated. “While the move toward open source is important as it provides clients with transparency and customizability, whether IBM can monetize that opportunity is yet to be seen.”

According to Krishna, a major benefit of open-sourcing the Granite family is that the company can expand on the models’ initial capabilities by leveraging the support of a much wider community of developers, consumers, and other specialists.

“Open means more eyes on code, more minds on problems, and more hands-on solutions,” Krishna mentioned while explaining the business strategy. “For any technology to gain velocity and become ubiquitous, you’ve got to balance three things: competition, innovation, and safety, and open source is a great way to achieve all three.”

An analyst for a well-known media outlet said that IBM’s choice to fully commit to the open-source model with the Granite models will significantly affect the broader trend of AI code development. “With all of the coding experience and exposure IBM possesses, it has created some of the best coding LLMs so far, and you can see from its partner momentum that they’re going to be extremely popular,” he said.

InstructLab to catalyze open AI development

IBM said it is collaborating with its partner Red Hat Inc. on a new project named InstructLab, a methodology for the ongoing development of foundation generative AI models, in keeping with its open-source approach. The project invites community participation, with steady, incremental improvements enhancing the effectiveness, safety, and performance of AI models over time.

Open-source developers can leverage their own data to boost performance by tailoring Granite LLMs and other models for particular business areas using InstructLab’s tools and tutorials.
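InstructLab contributions take the form of taxonomy entries: small YAML files of seed question-and-answer examples from which the pipeline synthesizes additional training data. A hedged sketch of such a `qna.yaml` entry is shown below; the field names follow the project’s early taxonomy schema and may have changed, and the domain and examples are invented:

```yaml
# Illustrative InstructLab taxonomy entry (qna.yaml) for a hypothetical
# domain skill. Field names reflect the project's early schema; the
# examples are invented, not real contributions.
version: 2
task_description: Answer questions about our internal expense policy.
created_by: example-contributor
seed_examples:
  - question: What is the per-day meal allowance for domestic travel?
    answer: The domestic per-day meal allowance is 50 USD.
  - question: Do taxi receipts need itemization?
    answer: Yes, receipts over 25 USD must be itemized.
```

A handful of such seed examples, rather than a full fine-tuning dataset, is what lets open-source developers tailor Granite and other models to a business area.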

Meanwhile, Red Hat and IBM unveiled Red Hat Enterprise Linux AI, an enterprise-ready version of InstructLab, as a new product that will take advantage of these open-source AI contributions. According to the company, it supports the deployment of AI across hybrid cloud and on-premises information technology infrastructures by giving access to the complete suite of Granite models and the Red Hat Enterprise Linux platform.

New Watsonx assistants

In a separate Watsonx update, the company unveiled a new class of Watsonx assistants, along with new features in Watsonx Orchestrate to help clients build their own.

Among the new assistants are Watsonx Code Assistant for Enterprise Java Applications, due in October, and Watsonx Assistant for Z, launching in June, which will transform how users interact with IBM’s Z mainframes. Also in June, Watsonx Code Assistant for Z will gain new features that enable it to understand and document applications in natural language.

In the meantime, the RHEL platform and OpenShift AI will support new Nvidia Corp. graphics processing units, including the L4 and L40S, which will be available through the Watsonx platform. Additionally, IBM stated that teams can shorten the time it takes to deploy generative AI models in a compliant, secure manner by using deployable architectures in Watsonx.

Additionally, improvements are coming to the existing Watsonx.data platform, along with IBM Data Hub and Data Gate for Watsonx, both scheduled to launch in June. According to IBM, these will help businesses monitor, manage, and enhance their datasets for artificial intelligence.

Watsonx automation, integrations, and third-party models

Finally, IBM revealed several third-party connectors and models that are now accessible in Watsonx, along with a new set of automation capabilities to enable AI-powered predictive automation of IT infrastructures.

The infrastructure-as-code capabilities of HashiCorp Inc., which IBM is acquiring for an estimated USD 6.4 billion, will augment the IT automation tools. Central to these automation initiatives is a new product called IBM Concert, which serves as the hub of an organization’s IT operations and provides AI-powered insights from various apps and infrastructure platforms for performance optimization and troubleshooting.

“The release of IBM Concert will be a major step forward for customers running IBM’s and other systems,” a media house analyst said.

On the integration front, Watsonx.governance now works with Amazon SageMaker to improve AI governance on the Amazon Web Services cloud, and the Watsonx platform is becoming available on Microsoft Azure. Salesforce Inc.’s customer relationship management platform will offer the IBM Granite models to support that company’s Einstein assistants, while the Watsonx platform will also grant access to additional models, including Meta Platforms Inc.’s Llama 3 and Mistral’s Mistral Large.

Ali stated that IBM is also collaborating with other leaders in AI, including Google LLC and Anthropic PBC, though IBM did not acknowledge them by name in its announcements.