Introducing AI21 Labs Jamba 1.5 Large and Jamba 1.5 Mini on Azure AI Models-as-a-Service

August 22, 2024

This June, AI21 Labs' Jamba-Instruct launched first on Azure. Now, in collaboration with AI21, we are excited to announce two new open models, AI21 Jamba 1.5 Large and AI21 Jamba 1.5 Mini, in the Azure AI Model Catalog. These models are built on the Jamba architecture, which combines Mamba and Transformer layers to deliver both performance and efficiency on long-context processing tasks. You can get started with the client samples in the Azure AI Studio hub using LangChain, LiteLLM, plain web requests, or AI21's Azure client.

"We are excited to deepen our collaboration with Microsoft to bring cutting-edge innovations in the Jamba model family to Azure AI users," said Pankaj Dugar, SVP and GM of North America at AI21. "As an advanced hybrid SSM-Transformer model family, the Jamba open model family democratizes access to LLMs that deliver efficiency, low latency, high quality, and long-context processing. These models drive enterprise performance and are seamlessly integrated with the Azure AI platform."

The Azure AI Model Catalog features over 1,600 foundation models, offering versatility and ease of use. It includes contributions from industry leaders such as AI21 Labs, Cohere, NVIDIA, OpenAI, G42, and Mistral, providing comprehensive coverage for diverse needs. Together with partnerships and launches from leading AI vendors and the Phi-3 family from Microsoft Research, the catalog has expanded significantly, making it easier for customers to find and select the right model for their specific applications.

What are the features of the Jamba 1.5 models?

According to AI21 Labs, Jamba 1.5 Large and Jamba 1.5 Mini are the most powerful models yet built on the Jamba architecture.
These models use a hybrid Mamba-Transformer architecture that balances the tradeoff between speed, memory, and quality by interleaving Mamba (state-space) layers with Transformer attention layers. The result is a model family that handles long contexts with high efficiency and low latency. Jamba 1.5 Mini has 12 billion active parameters (52 billion total), while Jamba 1.5 Large has 94 billion active parameters (398 billion total). Both models support a 256K context window, allowing them to process up to 256,000 tokens at a time. This is a significant improvement over the standard context windows of most large language models, and it opens up new possibilities for generative AI applications that require long inputs, such as document summarization, text generation, and information extraction.

The models also ship with features that make them easy to use and integrate, including function calling, RAG optimizations, and JSON mode. These features let you perform complex tasks, such as querying external knowledge sources, composing multiple functions, or constraining the output format, with simple natural-language commands.

AI21 Labs highlights a variety of use cases for these models, including:

Financial services: loan term generation, customer service representative (grounded Q&A), investment research (grounded Q&A)
Healthcare / life sciences: digital health assistant, research assistant
Retail / CPG: product description generator, product FAQ generator, shopping assistant

Why use the Jamba model family on Azure?

Using the Jamba 1.5 model family on Azure lets organizations take full advantage of AI with safety, reliability, and security. The offering also lets developers use Azure AI Studio tools, such as Azure AI Content Safety to strengthen responsible AI practices, Azure AI Search, and prompt flow to evaluate LLM outputs by calculating quality metrics.
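As a sketch of how JSON mode might be requested over a chat-completions-style API: the function below only assembles the request body. The `response_format` field and message schema follow common chat-completions conventions and are assumptions here; check the "View Code" pane of your deployment for the exact shape.

```python
# Sketch: building a chat-completions payload that asks a Jamba 1.5
# deployment for JSON-formatted output. Field names are assumptions
# based on common chat-completions conventions.

def build_chat_payload(system_prompt: str, user_prompt: str,
                       json_mode: bool = False,
                       max_tokens: int = 1024) -> dict:
    """Assemble a request body for a chat-completions call."""
    payload = {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,
    }
    if json_mode:
        # JSON mode constrains the model to emit valid JSON.
        payload["response_format"] = {"type": "json_object"}
    return payload

payload = build_chat_payload(
    "Extract the loan terms as JSON.",
    "Principal: $250,000 at 6.5% APR over 30 years.",
    json_mode=True,
)
print(payload["response_format"])  # {'type': 'json_object'}
```

The same payload shape extends naturally to the grounded Q&A use cases above: the retrieved context goes into the system or user message, and JSON mode keeps the extraction machine-readable.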
Customers can call the API with a variety of clients, including prompt flow, OpenAI, LangChain, LiteLLM, curl from the CLI, Python web requests, and the AI21 Labs Azure client. Jamba 1.5 Large and Jamba 1.5 Mini are available as Models-as-a-Service (MaaS) on Azure AI, so they can be deployed as pay-as-you-go inference APIs without managing the underlying infrastructure. Developers can also build with confidence knowing their data is kept secure.

To use Jamba 1.5 Large and Jamba 1.5 Mini, open the Azure AI Studio model catalog, select a Jamba 1.5 model, and deploy it using the pay-as-you-go option.

How to use Jamba 1.5 Large and Jamba 1.5 Mini in Azure AI

To start building, head to the Azure AI Studio model catalog and pick up a Jamba 1.5 model; for getting-started documentation, visit this link. Deploying a Jamba 1.5 model takes only a few minutes, following these steps:

1. Get oriented: If you're new to Azure AI Studio, review the documentation first to understand the basics and set up your first project.
2. Access the model catalog: Open the model catalog in AI Studio.
3. Find the model: Use the filters to select the AI21 Labs collection, or click the "View models" button on the MaaS announcement card.
4. Select a model: Open a Jamba 1.5 model from the list.
5. Deploy the model: Click "Deploy" and select the pay-as-you-go (PAYG) deployment option.
6. Subscribe and get access: Subscribe to the offer to gain access to the model (usage fees apply), then proceed with deployment.
7. Explore the playground: After deployment, you are automatically redirected to the playground, where you can explore the model's capabilities.
8. Customize your settings: Adjust the context or inference parameters to tune the model's outputs to your needs.
9. Get programmatic access: Click the "View Code" button to obtain the endpoint, key, and code snippets for programmatic integration with the model.
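Once deployed, a plain Python web request is enough to call the endpoint. The snippet below is a minimal sketch using only the standard library; `ENDPOINT` and `API_KEY` are placeholders you copy from the "View Code" pane, and the `/chat/completions` path and response shape are assumptions based on the common MaaS chat-completions convention.

```python
import json
import urllib.request

# Placeholders: copy the real values from the "View Code" pane
# of your deployment in Azure AI Studio.
ENDPOINT = "https://<your-deployment>.<region>.models.ai.azure.com"
API_KEY = "<your-api-key>"

def build_request(prompt: str, max_tokens: int = 512) -> urllib.request.Request:
    """Build a key-authenticated POST request for the chat endpoint."""
    body = json.dumps({
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{ENDPOINT}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def chat(prompt: str) -> str:
    """Send the prompt and return the model's reply text."""
    with urllib.request.urlopen(build_request(prompt), timeout=60) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return data["choices"][0]["message"]["content"]

# Example (requires a live deployment):
# print(chat("Summarize this contract in three bullet points: ..."))
```

Any client that can issue a REST call with key-based authentication, such as curl, LangChain, or LiteLLM, follows the same pattern.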
Tools and integrations: Use the API with large language model (LLM) tools such as prompt flow, Semantic Kernel, LangChain, or any other tool that supports REST APIs with key-based authentication for inference.

Frequently Asked Questions (FAQ)

How much does it cost to use Jamba 1.5 Large or Jamba 1.5 Mini on Azure?

You are billed based on the number of prompt and completion tokens. You can review the pricing on the Marketplace offer details tab when deploying the model, and also on Azure Marketplace.

Jamba 1.5 Large: input tokens are $0.002 per 1,000; output tokens are $0.008 per 1,000.
Jamba 1.5 Mini: input tokens are $0.0002 per 1,000; output tokens are $0.0004 per 1,000.

Do I need GPU capacity in my Azure subscription to use the Jamba 1.5 models?

No, GPU capacity is not required. Jamba 1.5 Large and Jamba 1.5 Mini are offered as APIs through Models-as-a-Service.

Can I use Jamba 1.5 Large or Jamba 1.5 Mini in Azure Machine Learning studio?

Yes, Jamba 1.5 models are available in the model catalogs of both Azure AI Studio and Azure Machine Learning studio.

Jamba 1.5 Large and Jamba 1.5 Mini are listed on Azure Marketplace. Can I purchase and use the Jamba 1.5 models directly from Azure Marketplace?

Azure Marketplace is the commercial foundation for models built on or for Azure, and it handles the purchase and billing of the Jamba 1.5 models. However, model discovery happens in both Azure Marketplace and the Azure AI Model Catalog: you can search for and find Jamba 1.5 models in either place. If you find a Jamba 1.5 model in Azure Marketplace, you can subscribe to the offer and will then be redirected to the Azure AI Model Catalog in Azure AI Studio, where you complete the subscription and deploy the model.
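The per-1,000-token rates above make cost estimation a one-line calculation. A small sketch (the model keys here are illustrative labels, and the rates should be re-checked against the current Azure Marketplace listing):

```python
# Pay-as-you-go rates in USD per 1,000 tokens, as quoted above.
# Verify current pricing on Azure Marketplace before relying on this.
RATES = {
    "jamba-1.5-large": {"input": 0.002, "output": 0.008},
    "jamba-1.5-mini": {"input": 0.0002, "output": 0.0004},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost in USD of one inference call."""
    r = RATES[model]
    return (input_tokens / 1000) * r["input"] + (output_tokens / 1000) * r["output"]

# e.g. summarizing a 200K-token document into 2K tokens on Mini:
print(round(estimate_cost("jamba-1.5-mini", 200_000, 2_000), 4))  # 0.0408
```

This kind of estimate is especially relevant with a 256K context window, since a single long-context prompt can dominate the bill.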
If you find Jamba 1.5 models in the Azure AI Model Catalog, you can subscribe to and deploy them directly from the catalog without starting in Azure Marketplace; Azure Marketplace still tracks the underlying commerce flow.

Does billing for the Jamba 1.5 models through Azure Marketplace count against my Microsoft Azure Consumption Commitment (MACC)? Will my inference data be shared with AI21 Labs?

Are there rate limits for Jamba 1.5 Large and Jamba 1.5 Mini on Azure?

Yes, there are rate limits for the Jamba 1.5 Large and Jamba 1.5 Mini models on Azure. Each deployment has a rate limit of 400K tokens per minute and 1,000 API requests per minute. If you have further questions, contact Azure customer support.

Are the Jamba 1.5 models available in all regions?

A Jamba 1.5 Large or Jamba 1.5 Mini API endpoint can be created in an AI Studio project or Azure Machine Learning workspace in the following regions:

Jamba 1.5 Mini: East US 2, Sweden Central
Jamba 1.5 Large: East US, Sweden Central

To use a model from a prompt flow in a project or workspace in a different region, you can use the API endpoint and key manually as a connection in prompt flow. An API created in one of the regions listed above is accessible from all Azure regions.

Can I fine-tune Jamba 1.5 Large or Jamba 1.5 Mini on Azure?

Currently, fine-tuning these models through Azure AI Studio is not possible.

Is the MaaS offering available for all Azure subscription types?

Customers can use MaaS models with all Azure subscription types that have a valid payment method, except the Cloud Solution Provider (CSP) program. Free or trial Azure subscriptions are not supported.
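With per-deployment rate limits of 400K tokens and 1,000 requests per minute, production clients should retry throttled calls rather than fail. A minimal sketch of full-jitter exponential backoff; `RateLimitError` and `call_model` are hypothetical stand-ins for whatever your client raises and calls on an HTTP 429:

```python
import random
import time

class RateLimitError(Exception):
    """Hypothetical stand-in for an HTTP 429 'too many requests' error."""

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 30.0) -> float:
    """Full-jitter exponential backoff delay in seconds."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))

def call_with_retries(call_model, prompt: str, max_attempts: int = 5):
    """Call the model, sleeping with backoff after each rate-limit error."""
    for attempt in range(max_attempts):
        try:
            return call_model(prompt)
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            time.sleep(backoff_delay(attempt))

# Usage: call_with_retries(chat, "Summarize this filing."), where `chat`
# is any function that raises RateLimitError on a 429 response.
```

Full jitter spreads retries out randomly, which avoids synchronized retry bursts when many workers hit the per-minute limit at once.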