Exploring Generative AI Hub

Objective

After completing this lesson, you will be able to describe AI Foundation and the generative AI hub.

AI Foundation on SAP BTP

In this chapter, you will learn about SAP's AI Foundation and the tools and services that come with it to help you solve your business challenges. You will also learn about the generative AI hub, how to access it, and how to use it.

Our Business AI is embedded across the portfolio:

  • Joule: a copilot that truly understands your business.
  • Embedded AI capabilities: SAP Cloud ERP, SAP Supply Chain Management, SAP ERP Human Capital Management, Spend Management and SAP Business Network, SAP Customer Relationship Management, and SAP Business Technology Platform.
  • Customized AI: AI Foundation on SAP Business Technology Platform.
  • AI ecosystem partnerships and investments: Aleph Alpha, Anthropic, AWS, Cohere, Databricks, Google Cloud, IBM, Meta, Microsoft, Mistral AI, and NVIDIA.

SAP’s AI Foundation provides the platform to infuse Business AI across applications and processes, making it a vital resource for extending and innovating within and beyond SAP landscapes. It's the center of gravity for developers who want to orchestrate cutting-edge technology with business context for mission-critical scenarios. SAP uses this foundation to build its own applications and also offers it to customers and partners to bootstrap their own customized solutions, so that they benefit from the same technology integrations.

AI Foundation is a comprehensive set of services and tools for AI developers to develop, deploy, and manage powerful custom-built AI applications and AI-powered extensions on SAP Business Technology Platform (SAP BTP).

It offers ready-to-use, customizable, and business-data-grounded AI capabilities, supported by flexible access to all frontier AI models and specifically to generative AI foundation models. SAP embeds AI Foundation across its portfolio.

The following video describes AI Foundation on SAP BTP.

Generative AI Hub

Need for Generative AI Hub

The growing landscape of LLMs necessitates a generative AI hub that facilitates systematic, tool-supported model selection tailored to diverse use cases. Generative AI hub, part of SAP AI Core and SAP AI Launchpad on SAP Business Technology Platform, consolidates trusted access to LLMs and foundation models, grounding on business and context data, and LLM exploration into a single offering. This streamlines innovation, ensures compliance, and offers versatility, benefiting both SAP's internal needs and its broader ecosystem of partners and customers.

Generative AI Hub in SAP AI Core

The generative AI hub, within SAP AI Core, is central to boosting Business AI capabilities, acting as the main entry point for incorporating generative AI into AI workloads in both SAP AI Core and SAP AI Launchpad.

The following video shows how generative AI hub is constructed within SAP AI Core.

SAP leverages diverse models from multiple providers through the hub, reducing dependency on a single vendor. The generative AI hub also supports innovation by providing accessible tools like the AI playground. As a result, it enhances the accuracy and relevance of AI applications that use SAP's data assets.

Generative AI Hub

The generative AI hub supports the development, deployment, and management of AI solutions and extensions of SAP applications, with a focus on adaptability and AI lifecycle management.

Generative AI hub covers the AI lifecycle efforts end to end and enables customers to develop, deploy, and manage custom-built AI solutions and AI-powered extensions of SAP applications.

Access: Enables access to frontier AI models, out-of-the-box selection of compute resources, and orchestration modules (like content filtering, data masking, and more).

Exploration and Development: Provides a comprehensive toolset for building custom AI solutions and exploring models, including a prompt editor, prompt management, a prompt registry, libraries, SDKs, and a fine-tuning service. This also includes an AI playground, chat, and prompt management to explore different models, metadata, and parameter changes to find the best-fitting technology for your needs, all in a safe and secure environment for interacting with cutting-edge technology.
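The prompt-registry idea above can be illustrated with a small sketch. This is not the SAP SDK or prompt registry API; the variant names, template formats, and helper function are assumptions for illustration only.

```python
# Illustrative sketch (not the SAP prompt registry API): keeping prompt
# variants per model, so the same task can target different LLMs.
from string import Template

# Hypothetical prompt variants tuned for different LLMs (formats assumed)
PROMPT_VARIANTS = {
    "gpt-4o": Template("You are a helpful assistant. Summarize: $text"),
    "meta--llama3-70b-instruct": Template(
        "### Instruction:\nSummarize the following text.\n\n### Input:\n$text"
    ),
}

def build_prompt(model_name: str, text: str) -> str:
    """Resolve the registered variant for a model and fill in its parameters."""
    template = PROMPT_VARIANTS[model_name]
    return template.substitute(text=text)

print(build_prompt("gpt-4o", "SAP AI Core hosts foundation models."))
```

Registering variants this way keeps application code model-agnostic: switching the target LLM means selecting a different registered variant, not rewriting the prompt inline.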

Deployment and Delivery: Software deployment includes all the steps, processes, and activities that are required to make a software system or update available to its intended users. Today, most IT organizations and software developers deploy software updates, patches, and new applications with a combination of manual and automated processes.

The hub supports Bring Your Own Model (BYOM), as well as allocation of compute resources, (re)training, and serving template workflows. This facilitates efficient deployment and delivery of AI models and applications, ensuring they are operational and accessible to users.

Orchestration: AI orchestration refers to the process of coordinating and managing the deployment, integration, and interaction of various AI components within a system or workflow. This includes orchestrating the execution of multiple AI models, managing data flow, and optimizing the utilization of computational resources.

AI orchestration aims to streamline and automate the end-to-end life cycle of AI applications. It ensures the efficient collaboration of different AI models, services, and infrastructure components, leading to improved overall performance, scalability, and responsiveness of AI systems.

Coordinating and managing AI compute workflows and scheduling, content moderation, data masking, agent deployment, grounding capabilities, and inference engines to ensure seamless and efficient operation of AI systems.
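Conceptually, orchestration chains modules such as data masking and content filtering around the model call. The following toy pipeline illustrates that idea only; the module names, order, and masking rule are assumptions, not the SAP orchestration module API.

```python
# Minimal illustration of the orchestration idea: chaining data masking,
# content filtering, and inference into one configurable pipeline.
import re

def mask_data(text: str) -> str:
    """Replace email addresses before the prompt reaches the model (toy rule)."""
    return re.sub(r"\S+@\S+\.\S+", "[MASKED_EMAIL]", text)

def filter_content(text: str) -> str:
    """Reject prompts containing blocked terms (toy content filter)."""
    if "jailbreak" in text.lower():
        raise ValueError("prompt rejected by content filter")
    return text

def call_model(text: str) -> str:
    """Placeholder for the actual LLM inference call."""
    return f"model response to: {text}"

def orchestrate(prompt: str, modules=(mask_data, filter_content)) -> str:
    # Each configured module transforms (or rejects) the prompt in turn.
    for module in modules:
        prompt = module(prompt)
    return call_model(prompt)

print(orchestrate("Contact jane.doe@example.com about the report"))
```

Because the module list is configuration, adding or removing a step (for example, grounding) does not change the calling code.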

Governance: Implementing policies and procedures to manage the development, deployment, and operation of AI systems in compliance with regulatory and organizational standards. This includes logs for tracking and auditing purposes, metering, monitoring, multi-tenancy, CaaS flow, as well as roles and responsibilities.
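The metering and audit-logging aspect of governance can be sketched in a few lines. This is purely illustrative: the record fields, token estimate, and per-tenant tracking are assumptions, not SAP's metering implementation.

```python
# Toy sketch of metering and audit logging: each model call is counted
# and recorded per tenant for later tracking and auditing.
import logging
import time

logging.basicConfig(level=logging.INFO)
audit_log = []  # in a real system this would be durable, queryable storage

def metered_call(tenant: str, model: str, prompt: str) -> dict:
    record = {
        "tenant": tenant,                       # multi-tenancy: usage per tenant
        "model": model,
        "prompt_tokens": len(prompt.split()),   # crude token estimate (assumption)
        "timestamp": time.time(),
    }
    audit_log.append(record)
    logging.info("call by %s to %s", tenant, model)
    return record

metered_call("tenant-a", "gpt-4o", "Summarize this quarterly report")
```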

Adaptability (Adaptation): With custom AI models, it's crucial to adapt continuously, whether you need newer, better models, a different model architecture, or simply to exchange the underlying dataset.

Benefiting from easier interaction with LLMs for grounding, fine-tuning, custom AI models, and AI Agents, you can drive your AI adaptation at your pace without the need to move to other services. Adaptability allows you to:

  • Adapt easily by switching models when needed or changing the orchestration configuration on the fly.
  • Keep your prompts model-agnostic by registering prompt variants that call different LLMs, and add further orchestration modules as needed.
  • Benefit from configuration, lifecycle management, and exchange of custom models and content packages.
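The "switch models via configuration" point above can be shown with a tiny sketch: the model is resolved from configuration at call time, so exchanging it needs no code change. All names here are illustrative assumptions.

```python
# Sketch: adapting by editing configuration only — the calling code is
# unchanged when the underlying model is exchanged.
config = {"model": "gpt-4o", "temperature": 0.2}

def answer(question: str, cfg: dict) -> str:
    # The model is resolved from configuration at call time.
    return f"[{cfg['model']}] answer to: {question}"

print(answer("What is AI Foundation?", config))
config["model"] = "gemini-1.5-pro"   # adapt on the fly
print(answer("What is AI Foundation?", config))
```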

Trust & Security: Ensuring data privacy, data isolation, and robust security measures to protect AI systems and their data:

  • No automatic saving of prompts or data
  • Ensuring data masking and content filtering (prompt injection, jailbreak)
  • SOC 2, NIST, ISO certifications

Vector Engine

With our grounding capability, we've integrated an SAP-managed vector engine (powered by SAP HANA Cloud) to simplify retrieving the business documents relevant to a question or task and providing them as context for the LLM.
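Conceptually, grounding retrieves the most similar document to a question and prepends it as context. The toy sketch below uses bag-of-words vectors and cosine similarity in place of real embeddings and the HANA Cloud vector engine; it illustrates the retrieval idea only.

```python
# Toy sketch of vector-based grounding: embed documents, retrieve the most
# similar one, and pass it as context. Real systems use an embedding model
# and a vector store; bag-of-words here is a stand-in for illustration.
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

documents = [
    "Travel expenses must be approved by the cost center owner.",
    "Vacation requests are submitted through the HR portal.",
]

def ground(question: str) -> str:
    """Retrieve the best-matching document and build a grounded prompt."""
    best = max(documents, key=lambda d: cosine(embed(question), embed(d)))
    return f"Context: {best}\nQuestion: {question}"

print(ground("Who approves travel expenses?"))
```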

In summary, the generative AI hub offers a comprehensive suite of tools and services to integrate AI into your applications, letting you explore generative AI capabilities in a safe environment while ensuring trust, control, and seamless access to foundation models and business data.

Generative AI Hub Access

Before deep diving into using generative AI hub in applications, it's important to discover how to access generative AI hub in SAP AI Core and SAP AI Launchpad.

Here are the main steps to gain access to generative AI hub and integrate an LLM into an application:

  1. Set up an SAP BTP global account.

  2. Provision SAP AI Core from the SAP BTP cockpit:

    • Provisioning SAP AI Core generates a service key, which includes the URLs and credentials needed to access your SAP AI Core instance.
    • See the following documentation for this process: Initial setup for SAP AI Core
  3. Connect to SAP AI Core tools like SAP AI Launchpad:

    • Generative AI hub can be accessed through SAP AI Launchpad. To access SAP AI Launchpad, you need to connect it to SAP AI Core. You can also connect other tools like Postman and Python.
    • See the details of this in the following tutorial: Set Up Tools to Connect With and Operate SAP AI Core
  4. Create a deployment: Create a deployment, programmatically or via the SAP AI Launchpad, to instantiate a use-case-specific LLM configuration. This step involves:

    • Referencing a model provider-specific executable (for example, models provided via the Azure OpenAI service).
    • Configuring parameters like model name, model version, and so on.
    • SAP AI Core will provide a unique URL for each deployment that can be used to access the LLM.
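The access steps above can be sketched end to end: read the service key, derive the OAuth token endpoint, and derive the API base URL. The key layout below mirrors a typical SAP AI Core service key, but treat the field names and URLs as assumptions and verify against your own key.

```python
# Hedged sketch: deriving the token endpoint and API base URL from an
# SAP AI Core service key. Values below are made up for illustration.
import json

service_key = json.loads("""{
  "clientid": "my-client",
  "clientsecret": "my-secret",
  "url": "https://my-subaccount.authentication.eu10.hana.ondemand.com",
  "serviceurls": {"AI_API_URL": "https://api.ai.example.ml.hana.ondemand.com"}
}""")

token_endpoint = service_key["url"] + "/oauth/token"
api_base = service_key["serviceurls"]["AI_API_URL"] + "/v2"

# With `requests`, the client-credentials token call would look roughly like
# (not executed here; verify grant details against the documentation):
#   requests.post(token_endpoint, data={"grant_type": "client_credentials"},
#                 auth=(service_key["clientid"], service_key["clientsecret"]))

print(token_endpoint)
print(api_base)
```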

The details about deployment are explained in the following topic.

Create a Deployment

See the following video to learn how to deploy LLMs in generative AI hub.

Note

Each model version has a specified deprecation date. When a deployment uses a model version, it will stop working on that version's deprecation date.

To ensure continued functionality, choose one of the following model upgrade options:

  • Auto Upgrade: Configure or update your LLM to use the latest model version. When SAP AI Core supports a new model version, your existing deployments will automatically migrate to the latest version. This might imply behavior changes in the updated version.

  • Manual Upgrade: Update your LLM configuration with a replacement model version of your choice. This version will be used in your deployments, regardless of any updates to the models supported by SAP AI Core. Choosing this path, you need to keep track of the deprecation date and update accordingly; otherwise, after the deprecation date, the endpoint will not deliver a response.
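With the manual-upgrade option, the deprecation bookkeeping is on you. A small sketch of that check, with made-up deployment names and dates:

```python
# Sketch of manual-upgrade bookkeeping: warn when a pinned model version
# approaches its deprecation date. Deployments and dates are illustrative.
from datetime import date

deployments = {
    "dep-1": {"model": "gpt-4o", "deprecation": date(2026, 6, 30)},
    "dep-2": {"model": "gemini-1.5-pro", "deprecation": date(2025, 1, 31)},
}

def needs_upgrade(dep: dict, today: date, grace_days: int = 30) -> bool:
    """True when the deprecation date is within the grace window or past."""
    return (dep["deprecation"] - today).days <= grace_days

for name, dep in deployments.items():
    if needs_upgrade(dep, date(2025, 1, 15)):
        print(f"{name}: upgrade {dep['model']} before {dep['deprecation']}")
```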

Note

You need to deploy all the needed models for your scenario before you proceed with developing prompts. For example, in this learning journey you want to use models such as meta--llama3-70b-instruct, mistralai--mixtral-8x7b-instruct-v01, gpt-4o, and gemini-1.5-pro. You can deploy all these models using the steps described in the preceding video.
