
Cloudera's New AI Accelerators to Speed Up Machine Learning Projects

These AMPs provide pre-built, industry-tested workflows that reduce development time, enhance AI integration, and maximize the value of data. They are open-source and flexible for deployment in any environment, accelerating AI adoption and innovation.

By Aanchal Ghatak

Cloudera, a player in hybrid data platforms for analytics and AI, has launched six new Accelerators for Machine Learning Projects (AMPs), designed to expedite the development and deployment of AI use cases for enterprises. The latest AMPs aim to reduce time-to-value for businesses by providing pre-built, industry-tested solutions that simplify complex machine learning workflows and accelerate AI adoption.


Making AI Accessible and Scalable

Cloudera's Accelerators for ML Projects (AMPs) are end-to-end machine learning projects that can be deployed directly from the Cloudera platform with a single click. The AMPs provide a practical solution to a common challenge in AI deployment: the long and resource-intensive process of developing AI models from scratch.

By encapsulating best practices and offering pre-built solutions, Cloudera is not only enhancing the accessibility of AI but also empowering enterprises to scale their AI initiatives more confidently and quickly.


The six new AMPs unveiled by Cloudera are tailored to address specific needs in the AI landscape:

  1. Fine-Tuning Studio: A comprehensive application that facilitates the fine-tuning and evaluation of Large Language Models (LLMs), enabling enterprises to refine AI models to suit their specific needs.

  2. RAG with Knowledge Graph: This AMP showcases how to power Retrieval Augmented Generation (RAG) applications using a knowledge graph, capturing relationships and context beyond what is typically achievable with vector stores alone.

  3. PromptBrew: An AI-powered tool that assists in creating high-performing prompts through an intuitive user interface, addressing one of the most significant pain points in generative AI projects—prompt engineering.

  4. Document Analysis with Cohere Command R and FAISS: This accelerator demonstrates Retrieval Augmented Generation with Cohere Command R as the LLM and FAISS as the vector store for efficient document analysis (a minimal retrieval sketch follows this list).

  5. Chat with Your Documents: Building upon the previous LLM Chatbot augmented with enterprise data, this AMP enhances chatbot responses using context derived from an internal knowledge base formed from user-uploaded documents.

Cloudera is positioning the AMPs particularly at companies in India, a fast-growing market for AI, where it expects the pre-built workflows to help businesses adopt AI more quickly and effectively.


The AMPs are open source, so organizations can deploy them at no cost and customize them to their own environments and requirements. Cloudera expects this openness to broaden AI adoption across enterprises of different sizes and maturity levels.

Mayank Baid, Regional Vice President India and South Asia at Cloudera, emphasizes that these new AMPs will be transformative for Indian businesses seeking to maximize the potential of their data. The pre-built, industry-tested solutions help reduce deployment time and complexity, enabling organizations to innovate faster and scale AI use cases more effectively.
