As more enterprise organizations look toward the so-called agentic future, one barrier may be how AI models are built. For enterprise AI developer AI21, the answer is clear: the industry needs to look to other model architectures to enable more efficient AI agents.
AI21 CEO Ari Goshen said in an interview with VentureBeat that the Transformer, the most popular model architecture, has limitations that would make a multi-agent ecosystem difficult.
“One trend I’m seeing is the rise of architectures that aren’t Transformers, and these alternative architectures will be more efficient,” Goshen said. “Transformers function by creating so many tokens that can get very expensive.”
AI21, which focuses on developing enterprise AI solutions, has made the case before that Transformers should be an option for model architecture, not the default. It is developing foundation models using its Jamba architecture, short for Joint Attention and Mamba. Jamba builds on the Mamba architecture developed by researchers at Princeton University and Carnegie Mellon University, which can offer faster inference times and longer context windows.
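To picture what a joint attention-and-Mamba stack looks like, here is a rough sketch in PyTorch. The class names, the linear stand-in for the Mamba mixer, and the one-attention-layer-in-four interleaving are all illustrative assumptions for this article, not AI21's published configuration.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """A standard pre-norm self-attention block (the Transformer half)."""
    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        h = self.norm(x)
        out, _ = self.attn(h, h, h)
        return x + out  # residual connection

class MambaBlock(nn.Module):
    """Stand-in for a Mamba layer. A real Mamba layer runs a selective
    state-space scan (sketched later in this piece); a linear mixer is
    used here only to keep the example self-contained."""
    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.mixer = nn.Linear(dim, dim)

    def forward(self, x):
        return x + self.mixer(self.norm(x))

class HybridStack(nn.Module):
    """Interleave mostly Mamba layers with an occasional attention layer,
    in the spirit of a joint attention-and-Mamba design. The 1-in-4
    ratio is an assumption made for illustration."""
    def __init__(self, dim: int, depth: int, attn_every: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            AttentionBlock(dim) if (i + 1) % attn_every == 0 else MambaBlock(dim)
            for i in range(depth)
        )

    def forward(self, x):  # x: (batch, seq_len, dim)
        for layer in self.layers:
            x = layer(x)
        return x
```

As a quick smoke test, `HybridStack(dim=64, depth=8)(torch.randn(2, 128, 64))` returns a tensor of the same shape as its input.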
Goshen said alternative architectures, like Mamba and Jamba, can often make agentic structures more efficient and, most importantly, affordable. For him, Mamba-based models have better memory performance, which would make agents, particularly agents that connect to other models, work better.
He attributes the reason AI agents are only now gaining popularity, and why most have not yet gone into production, to the industry's reliance on LLMs built with Transformers.
“The main reason agents are not in production mode yet is reliability or the lack of reliability,” Goshen said. “When you break down a transformer model, you know it’s very stochastic, so any errors will perpetuate.”
Enterprise agents are growing in popularity
AI agents emerged as one of the biggest trends in enterprise AI this year, with several companies launching agents and platforms that make them easier to build.
ServiceNow announced updates to its Now Assist AI platform, including a library of AI agents for customers. Salesforce has its own stable of agents called Agentforce, while Slack has begun allowing users to integrate agents from Salesforce, Cohere, Workday, Asana, Adobe and more.
Goshen believes that this trend will become even more popular with the right mix of models and model architectures.
“Some use cases that we see now, like question and answers from a chatbot, are basically glorified search,” he said. “I think real intelligence is in connecting and retrieving different information from sources.”
Goshen added that AI21 is in the process of developing offerings around AI agents.
Other architectures vying for attention
Goshen strongly supports alternative architectures like Mamba and AI21’s Jamba, mainly because he believes transformer models are too expensive and unwieldy to run.
Instead of the attention mechanism that forms the backbone of transformer models, Mamba uses a selective state-space design: it learns which inputs to prioritize as it processes a sequence, keeps memory usage bounded regardless of sequence length, and is built to exploit a GPU's parallel processing power.
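To make that memory claim concrete, here is a deliberately simplified sketch of a selective state-space recurrence in PyTorch. The class name, dimensions, and the sigmoid-gated decay are illustrative assumptions; real Mamba layers use a discretized, hardware-aware parallel scan rather than a Python loop. What it demonstrates is the key property: the recurrent state has a fixed size, so memory does not grow with sequence length.

```python
import torch
import torch.nn as nn

class TinySelectiveSSM(nn.Module):
    """Deliberately simplified selective state-space recurrence.
    The sigmoid-gated decay is a stand-in chosen only to keep the
    recurrence stable; Mamba's actual parameterization differs."""
    def __init__(self, dim: int, state_dim: int = 16):
        super().__init__()
        self.A = nn.Parameter(torch.randn(dim, state_dim))  # per-channel state decay
        self.to_B = nn.Linear(dim, state_dim)  # input-dependent B: the "selection"
        self.to_C = nn.Linear(dim, state_dim)  # input-dependent readout C

    def forward(self, x):  # x: (batch, seq_len, dim)
        batch, seq_len, dim = x.shape
        h = x.new_zeros(batch, dim, self.A.shape[1])  # fixed-size state:
        decay = torch.sigmoid(self.A)                 # memory stays flat as seq grows
        ys = []
        for t in range(seq_len):
            xt = x[:, t, :]                       # current token: (batch, dim)
            B = self.to_B(xt).unsqueeze(1)        # (batch, 1, state_dim)
            C = self.to_C(xt).unsqueeze(1)        # (batch, 1, state_dim)
            h = h * decay + B * xt.unsqueeze(-1)  # selective state update
            ys.append((h * C).sum(-1))            # read the state out: (batch, dim)
        return torch.stack(ys, dim=1)             # (batch, seq_len, dim)
```

By contrast, a Transformer decoding the same sequence keeps a key/value cache that grows with every token generated, which is the cost Goshen is pointing at.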
Mamba is growing in popularity. Other open-source and open-weight AI developers have begun releasing Mamba-based models in the past few months. Mistral released Codestral Mamba 7B in July, and in August the Technology Innovation Institute released its own Mamba-based model, Falcon Mamba 7B.
However, the transformer architecture has become the default, if not standard, choice when developing foundation models. OpenAI’s GPT is, of course, a transformer model—it’s literally in its name—but so are most other popular models.
Goshen said that, ultimately, enterprises want whichever approach is more reliable. But organizations must also be wary of flashy demos promising to solve many of their problems.
“We’re at the phase where charismatic demos are easy to do, but we’re closer to that than to the product phase,” Goshen said. “It’s okay to use enterprise AI for research, but it’s not yet at the point where enterprises can use it to inform decisions.”
FAQs
Q: What is the main limitation of Transformer models according to Ari Goshen?
A: According to Ari Goshen, Transformer models generate so many tokens that they can get very expensive, and their stochastic nature lets errors perpetuate, making a multi-agent ecosystem difficult.
Q: Why does Goshen believe Mamba and Jamba architectures are more efficient?
A: Goshen believes that Mamba and Jamba architectures can make agentic structures more efficient and affordable, with better memory performance for AI agents.
Q: What trend has emerged in enterprise AI relating to AI agents?
A: The trend of AI agents has grown in popularity in enterprise AI, with companies launching AI agents and platforms to facilitate agent development.
Credit: venturebeat.com