Highlights:

  • In addition to an expanded catalog of preconfigured neural networks, the latest iteration of watsonx.ai incorporates new AI development features.
  • The Granite series comprises two language models: Granite.13b.instruct and Granite.13b.chat.

Recently, IBM Corp. unveiled its latest collection of language models, the Granite series, which will soon be accessible as an integral component of its watsonx product suite.

The Granite series is being introduced in tandem with several other features. As per IBM, watsonx is set to receive a tool designed to simplify the process of generating artificial intelligence training datasets for companies. Another new feature addition will simplify adapting neural networks for new tasks.

Launched in May, watsonx is a suite of software products created to assist companies in constructing generative AI models. It also promises to simplify related tasks, such as ensuring neural networks meet safety standards.

The newly introduced Granite models by IBM will be accessible as a component of watsonx known as watsonx.ai. According to the company, this offering provides tools that facilitate the creation of custom neural networks. Watsonx.ai also features a set of prepackaged AI models, slated to receive enhancements with the addition of the Granite series later this quarter.

The Granite series comprises two language models: Granite.13b.instruct and Granite.13b.chat. IBM states these models can summarize documents, conduct “insight extraction,” and generate textual content. The models were developed using a training dataset of 2.4 terabytes, meticulously crafted by IBM’s engineers.

Both Granite models have 13 billion parameters, making them sufficiently compact to operate on a single Nvidia Corp. V100 graphics card. The V100 is notably more budget-friendly than the chipmaker’s flagship H100 graphics card. Consequently, the Granite series should be more accessible for companies to implement than larger language models, since the smaller models demand less advanced hardware.

“The initial Granite models are just the beginning: more are planned in other languages and further IBM-trained models are also in preparation,” Dinesh Nirmal, Senior Vice President of IBM Software, wrote in a recent blog post.

The Granite series is being introduced on watsonx.ai alongside two open-source AI models. The first one is Llama-2, a general-purpose large language model developed by Meta Platforms Inc. IBM is also incorporating StarCoder, a neural network specifically optimized for programming tasks, which ServiceNow Inc. and Hugging Face Inc. jointly released in May.

In addition to an expanded catalog of preconfigured neural networks, the latest iteration of watsonx.ai incorporates new AI development features.

Developing a custom AI model requires a substantial volume of training data. In many cases, compiling that data manually demands significant time and effort. One way companies streamline the workflow is by generating training data automatically with software.

Such automatically generated data, known as synthetic data, may not always match the accuracy of manually created records, but it is frequently suitable for AI training purposes.

According to IBM, watsonx.ai is set to receive an integrated synthetic data generation tool. To use it, companies upload a sample dataset, such as purchase logs. Watsonx.ai can then analyze those logs and produce synthetic records with comparable attributes.
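To make the idea concrete, here is a minimal sketch of the general technique such a tool relies on: fit simple per-column statistics from a sample of records, then draw new records with comparable attributes. All names and the toy purchase-log schema below are hypothetical illustrations, not IBM's actual watsonx.ai API.

```python
# Hypothetical sketch of synthetic data generation: learn simple
# per-column statistics from a small sample of purchase logs, then
# sample new records with comparable attributes.
import random
import statistics

sample_logs = [
    {"item": "laptop", "price": 999.0, "quantity": 1},
    {"item": "mouse", "price": 25.0, "quantity": 2},
    {"item": "laptop", "price": 1099.0, "quantity": 1},
    {"item": "keyboard", "price": 45.0, "quantity": 1},
]

def generate_synthetic(logs, n):
    # Categorical columns: resample values observed in the sample.
    items = [r["item"] for r in logs]
    quantities = [r["quantity"] for r in logs]
    # Numeric columns: fit a normal distribution to the sample.
    price_mu = statistics.mean(r["price"] for r in logs)
    price_sd = statistics.stdev(r["price"] for r in logs)
    return [
        {
            "item": random.choice(items),
            "price": round(random.gauss(price_mu, price_sd), 2),
            "quantity": random.choice(quantities),
        }
        for _ in range(n)
    ]

synthetic = generate_synthetic(sample_logs, 5)
```

Production tools model far richer structure (correlations between columns, rare categories, privacy constraints), but the core workflow is the same: analyze a sample, then sample from the fitted model.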

Modifying an AI model already trained for a new task typically involves retraining, which can demand substantial computational resources. To tackle this challenge, IBM is integrating watsonx.ai with a parameter tuning tool, enabling the optimization of a neural network for new tasks without the need for retraining.

With parameter tuning, developers optimize an AI model by creating a second neural network that plays a supporting role. The second neural network gives the AI model instructions on performing a given task. The AI can perform the task more effectively when those instructions are combined with users’ natural language prompts.
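The mechanism described above resembles what the research literature calls prompt tuning: the base model's weights stay frozen, and only a small set of learned "soft prompt" vectors is trained and prepended to the embedded user prompt. The sketch below illustrates that input construction with toy shapes; all names and dimensions are assumptions for illustration, not IBM's implementation.

```python
# Toy sketch of parameter (prompt) tuning: the base model is frozen;
# only a handful of soft-prompt vectors is trainable. They are
# prepended to the user's embedded prompt at every forward pass.
import numpy as np

rng = np.random.default_rng(0)
d_model, vocab_size, n_soft = 16, 100, 4

# Frozen base-model embedding table (never updated during tuning).
frozen_embeddings = rng.normal(size=(vocab_size, d_model))

# The only trainable parameters: a few "instruction" vectors.
soft_prompt = rng.normal(size=(n_soft, d_model)) * 0.01

def build_model_input(token_ids):
    # Combine learned instructions with the user's natural language prompt.
    user_embeds = frozen_embeddings[token_ids]
    return np.concatenate([soft_prompt, user_embeds], axis=0)

x = build_model_input(np.array([5, 17, 42]))
# 4 soft-prompt positions + 3 user tokens = 7 input positions
```

Because only the soft-prompt vectors receive gradient updates, adapting the model to a new task avoids the cost of retraining billions of frozen parameters.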

As part of the recent update, IBM is also enhancing watsonx.data. That’s a component of the watsonx product suite built to help companies manage their AI training datasets.

IBM is improving the tool by introducing a conversational interface. The interface will make it easier for users to visualize information contained in watsonx.data, refine it, and discover specific records. IBM is also introducing a vector database intended for holding embeddings, the mathematical structures AI models use to represent their internal knowledge.
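At its core, a vector database stores embedding vectors and retrieves the ones most similar to a query. The sketch below shows that retrieval step with cosine similarity; the class and data are hypothetical illustrations of the concept, not the database IBM is shipping.

```python
# Minimal sketch of a vector store: keep embedding vectors in memory
# and return the nearest ones to a query by cosine similarity.
import numpy as np

class TinyVectorStore:
    def __init__(self):
        self.ids, self.vecs = [], []

    def add(self, doc_id, vec):
        self.ids.append(doc_id)
        self.vecs.append(np.asarray(vec, dtype=float))

    def search(self, query, k=1):
        q = np.asarray(query, dtype=float)
        mat = np.stack(self.vecs)
        # Cosine similarity between the query and every stored vector.
        sims = mat @ q / (np.linalg.norm(mat, axis=1) * np.linalg.norm(q))
        top = np.argsort(-sims)[:k]
        return [(self.ids[i], float(sims[i])) for i in top]

store = TinyVectorStore()
store.add("doc-a", [1.0, 0.0])
store.add("doc-b", [0.0, 1.0])
result = store.search([0.9, 0.1], k=1)  # nearest neighbor is "doc-a"
```

Real vector databases add approximate-nearest-neighbor indexes and persistence so the same lookup scales to millions of embeddings.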