August 22, 2024: AI21 Labs Unveils Enhanced Jamba LLMs with Longest Context - AI21 Labs introduced Jamba 1.5 Mini and Large, open hybrid SSM-Transformer models with 256,000-token context windows. The models handle long input sequences efficiently, delivering better performance and lower latency than comparable competitors such as Llama 8B and 70B by combining Transformer layers with Mamba's structured state-space layers. The result is improved long-context understanding, essential for enterprise AI tasks, while reducing costs.
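A minimal sketch of how a long-context model like Jamba 1.5 Mini could be loaded and queried through the Hugging Face transformers library; the model ID, input file name, and hardware settings here are assumptions for illustration, not an official AI21 example.

```python
# Sketch: load an assumed Jamba 1.5 Mini checkpoint and run a long-context query.
# Assumptions: model ID "ai21labs/AI21-Jamba-1.5-Mini", a GPU with enough memory,
# and a local file "report.txt" standing in for a long enterprise document.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/AI21-Jamba-1.5-Mini"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit long contexts in memory
    device_map="auto",
)

# Long-context use case: feed a large document and ask for a summary.
long_document = open("report.txt").read()  # hypothetical long input
prompt = f"{long_document}\n\nSummarize the key findings above."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=200)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The hybrid design matters for exactly this pattern: the Mamba state-space layers keep memory use roughly linear in sequence length, so very long prompts remain practical where a pure-Transformer model of similar size would be slower and more memory-hungry.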