
AI Weekly News: Stay current without the noise


AI21 Labs’ updated hybrid SSM-Transformer model Jamba gets longest context window yet


August 22, 2024: AI21 Labs Unveils Enhanced Jamba LLMs with Longest Context - AI21 Labs introduced Jamba 1.5 Mini and Large, open hybrid SSM-Transformer models with 256,000-token context windows. The models handle long input sequences efficiently, delivering better quality and lower latency than competitors such as Llama 8B and 70B by pairing Transformer attention layers with Mamba's structured state-space (SSM) layers in a single architecture. The result is stronger long-context understanding, essential for enterprise AI tasks, at reduced cost.

KEEP UP WITH THE INNOVATIVE AI TECH TRANSFORMING BUSINESS

Datagrom keeps business leaders up to date on the latest AI innovations, automation advances,
policy shifts, and more, so they can make informed decisions about AI technology.