Today, we are proud to announce the availability of AI21 Labs’ Jamba 1.5 family of large language models (LLMs). These models represent a significant milestone in long-context natural language processing, delivering fast and efficient performance across diverse applications. The Jamba 1.5 family includes two models: the compact Jamba 1.5 Mini and its larger counterpart, Jamba 1.5 Large. Both models support a 256K token context window, structured JSON output, function calling, and the ability to digest document objects.
AI21 Labs is a pioneer in building foundation models and artificial intelligence (AI) systems for the enterprise. Through their strategic partnership, AI21 Labs and AWS are enabling customers across industries to build, deploy, and scale solutions that tackle complex problems and drive innovation. With AI21 Labs’ production-grade models and AWS’s dedicated services and robust infrastructure, customers can confidently use LLMs in a secure environment to shape how they process information, communicate, and learn.
The Jamba 1.5 models use a novel hybrid architecture that combines the transformer model architecture with Structured State Space model (SSM) technology.
This approach allows the Jamba 1.5 models to handle long context windows of up to 256K tokens while maintaining the high-performance characteristics of traditional transformer models. You can learn more about this hybrid SSM/transformer architecture in the whitepaper.
Two Jamba 1.5 models from AI21 are available in Amazon Bedrock today:
- Jamba 1.5 Large excels at complex reasoning tasks across a range of prompt lengths, delivering high-quality output on both short and long inputs.
- Jamba 1.5 Mini is optimized for low-latency processing of long prompts, enabling fast analysis of lengthy documents and data.
The key strengths of the Jamba 1.5 models include:
- Long context handling: With a 256K token context window, Jamba 1.5 models can support enterprise use cases such as lengthy document summarization and analysis, as well as agentic and retrieval-augmented workflows.
- Multilingual: Support for English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.
- Developer friendly: Native support for structured JSON output, function calling, and digesting document objects.
- Speed and efficiency: AI21 found that Jamba 1.5 models deliver up to 2.5 times faster inference on long contexts than other models of comparable size. For more details on performance results, visit…
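As a sketch of what the developer-friendly function calling support can look like with the Amazon Bedrock Converse API, the snippet below builds a `toolConfig` payload. The tool name `get_weather` and its schema are hypothetical examples, not part of the Jamba models or AWS documentation quoted here:

```python
# Build a toolConfig payload for the Bedrock Converse API.
# The get_weather tool and its schema are illustrative assumptions.
def build_tool_config():
    return {
        "tools": [
            {
                "toolSpec": {
                    "name": "get_weather",  # hypothetical tool name
                    "description": "Look up the current weather for a city.",
                    "inputSchema": {
                        "json": {
                            "type": "object",
                            "properties": {"city": {"type": "string"}},
                            "required": ["city"],
                        }
                    },
                }
            }
        ]
    }

tool_config = build_tool_config()
# Each tool is nested under a "toolSpec" key with a JSON schema describing its input.
print(tool_config["tools"][0]["toolSpec"]["name"])
```

You would pass this dictionary as the `toolConfig` parameter of a `converse` call, allowing the model to return a structured tool-use request instead of free-form text.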
To get started with the new Jamba 1.5 models, go to the Amazon Bedrock console, choose Model access on the bottom left pane, and request access to Jamba 1.5 Mini or Jamba 1.5 Large.
To test the Jamba 1.5 models in the Amazon Bedrock console, choose the Text or Chat playground in the left menu pane, and then select the model you want to use.
When you select a model, you will see a code example showing how to invoke it with the current example prompt.
You can choose from code examples in a range of programming languages, so you can start building applications in the language of your choice.
The following Python code example shows how to send a text message to the Jamba 1.5 models using the Amazon Bedrock Converse API for text generation.
import boto3
from botocore.exceptions import ClientError

# Create an Amazon Bedrock Runtime client
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

model_id = "ai21.jamba-1-5-large-v1:0"

user_message = "What are 3 fun facts about mambas?"
conversation = [{"role": "user", "content": [{"text": user_message}]}]

try:
    # Send the message to the model using the Converse API
    response = bedrock_runtime.converse(
        modelId=model_id,
        messages=conversation,
        inferenceConfig={"maxTokens": 256, "temperature": 0.7, "topP": 0.8},
    )
    # Extract the generated text from the response
    response_text = response["output"]["message"]["content"][0]["text"]
    print(response_text)
except (ClientError, Exception) as e:
    print(f"ERROR: Can't invoke '{model_id}'. Reason: {e}")
    exit(1)
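Beyond the generated text, a Converse API response also carries a stop reason and token usage counts. The helper below is a minimal sketch of extracting these fields, exercised here against a mocked response dictionary shaped like the Converse API output rather than a live Amazon Bedrock call:

```python
def summarize_response(response):
    """Extract the assistant text, stop reason, and token usage
    from a Converse API response dictionary."""
    return {
        "text": response["output"]["message"]["content"][0]["text"],
        "stop_reason": response.get("stopReason"),
        "total_tokens": response.get("usage", {}).get("totalTokens"),
    }

# Mocked response following the Converse API response shape
mock_response = {
    "output": {
        "message": {
            "role": "assistant",
            "content": [{"text": "Mambas are fast, venomous snakes."}],
        }
    },
    "stopReason": "end_turn",
    "usage": {"inputTokens": 12, "outputTokens": 9, "totalTokens": 21},
}

print(summarize_response(mock_response))
```

Tracking `totalTokens` this way is useful when working near the 256K token context window.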
The Jamba 1.5 models are well suited for use cases such as paired document analysis, compliance analysis, and question answering over long documents. They can compare information across multiple sources, check whether passages meet specific guidelines, and work through long or complex documents. For example code showing how to use the Jamba models, refer to the comprehensive guide at .
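The long-document question-answering pattern above can be sketched as follows: embed the full document text and the question in a single user message for the Converse API. The prompt wording, the `<document>` delimiters, and the 4-characters-per-token estimate are illustrative assumptions, not AI21's tokenizer or a prescribed format:

```python
CONTEXT_WINDOW_TOKENS = 256_000  # Jamba 1.5 context window
CHARS_PER_TOKEN = 4  # rough heuristic, not AI21's actual tokenizer

def build_document_qa_message(document_text, question):
    """Build a Converse API message that embeds a long document
    followed by a question about it."""
    prompt = (
        "Answer the question using only the document below.\n\n"
        f"<document>\n{document_text}\n</document>\n\n"
        f"Question: {question}"
    )
    # Sanity-check the prompt against the model's context window
    estimated_tokens = len(prompt) // CHARS_PER_TOKEN
    if estimated_tokens > CONTEXT_WINDOW_TOKENS:
        raise ValueError(
            f"Estimated {estimated_tokens} tokens exceeds the "
            f"{CONTEXT_WINDOW_TOKENS}-token context window."
        )
    return [{"role": "user", "content": [{"text": prompt}]}]

messages = build_document_qa_message(
    "Jamba 1.5 models support a 256K token context window.",
    "What context window do Jamba 1.5 models support?",
)
print(messages[0]["content"][0]["text"])
```

The returned list can be passed directly as the `messages` parameter of a `converse` call, as in the earlier example.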
AI21 Labs’ Jamba 1.5 family of models is available today in Amazon Bedrock in the US East (N. Virginia) AWS Region. Check the full Region list for future updates. To learn more, take a look at the product page.
Give the Jamba 1.5 models a try in Amazon Bedrock today, and send feedback to or via your usual AWS Support channels.
Visit our website to explore deep-dive technical content and discover how our Builder communities are using Amazon Bedrock in their solutions.
—