Generative AI has broken out of research labs and is now transforming how business is done. SAP is embracing this trend at full speed with Joule, its generative AI copilot. In this blog series, I’ll provide a “super-fast hands-on” guide that shows you how to call the default models of SAP AI Core and extend them into practical AI agents for real-world business use, so you can understand how these agents work behind the scenes.
Notice
The Japanese version is available here.
Time Commitment
Each part is designed to be completed in 10–15 minutes. If you enjoyed this post, please give it kudos! Your support really motivates me. Also, if there’s anything you’d like to know more about, feel free to leave a comment!
Connecting to SAP AI Launchpad
In this chapter, we’ll connect SAP AI Launchpad to our SAP AI Core instance using the service key created for that instance, and then deploy an LLM to run a chat model. Creating a service key gives your local scripts a secure, OAuth‑based passport into SAP AI Core; without it, Python SDK calls have no way to authenticate or discover endpoints.
Open your AI Core instance in BTP Cockpit → Instances & Subscriptions. Go to the Service Keys tab and click Create. Give it any name. Download the generated JSON; you’ll need it later.
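The downloaded JSON should look roughly like the fragment below. Field names follow a typical SAP AI Core service key; values are abbreviated placeholders, so refer to your own file for the real ones:

```json
{
  "clientid": "sb-...",
  "clientsecret": "...",
  "url": "https://<your-subdomain>.authentication.<region>.hana.ondemand.com",
  "serviceurls": {
    "AI_API_URL": "https://api.ai.<region>.ml.hana.ondemand.com"
  }
}
```

`url` points to the OAuth server that issues tokens, while `serviceurls.AI_API_URL` is the AI Core API endpoint itself; we’ll need both later.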
NOTE – Resource Group
When calling SAP AI Core from Python, you must set an extra environment variable named RESOURCE_GROUP in addition to the values contained in the service key. In this tutorial, we’ll simply point it to the built‑in default group.
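As a minimal sketch, you could load the service key and populate the environment like this. The `AICORE_*` variable names are what recent versions of the Python SDK typically expect (check your SDK’s documentation), and `RESOURCE_GROUP` is the extra variable mentioned above; the helper name and dummy key are my own for illustration:

```python
import json
import os

def env_from_service_key(key: dict, resource_group: str = "default") -> dict:
    """Map an SAP AI Core service key to environment variables for the
    Python SDK (exact variable names may differ by SDK version)."""
    return {
        "AICORE_AUTH_URL": key["url"] + "/oauth/token",   # OAuth token endpoint
        "AICORE_CLIENT_ID": key["clientid"],
        "AICORE_CLIENT_SECRET": key["clientsecret"],
        "AICORE_BASE_URL": key["serviceurls"]["AI_API_URL"],
        "RESOURCE_GROUP": resource_group,                  # the extra variable from this note
    }

# Dummy service key for illustration; in practice use json.load(open("key.json")).
dummy_key = {
    "clientid": "sb-demo",
    "clientsecret": "secret",
    "url": "https://demo.authentication.eu10.hana.ondemand.com",
    "serviceurls": {"AI_API_URL": "https://api.ai.eu10.ml.hana.ondemand.com"},
}
os.environ.update(env_from_service_key(dummy_key))
print(os.environ["RESOURCE_GROUP"])  # -> default
```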
You can deploy an LLM (Large Language Model) to SAP AI Core directly using the service key you created earlier, but in this guide we’ll use SAP AI Launchpad to make the deployment easier. First, establish a secure connection from SAP AI Launchpad to SAP AI Core: from Instances & Subscriptions, open SAP AI Launchpad via Go to Application. Click Add (top‑right) → API Connection and upload the service key JSON, then select the default resource group in the side panel and save.
On the Configurations screen of SAP AI Launchpad, you can select the type of foundation model to use. Think of a configuration as a model profile that can be shared across multiple deployments. Let’s create one: we’ll use the default foundation-models scenario and configure it to use gpt-4o-mini. Navigate to ML Operations → Configurations and click Create. Fill in: Name =
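Under the hood, what the Launchpad wizard does is a single POST to the AI Core API (`/v2/lm/configurations`). Here is a rough sketch of the request body it sends; the scenario ID `foundation-models` and executable ID `azure-openai` are typical values for Azure-hosted OpenAI models, but verify the IDs your tenant shows in ML Operations, and the helper function is my own:

```python
import json

def build_configuration_payload(name: str, model_name: str,
                                model_version: str = "latest") -> dict:
    """Build the JSON body for POST /v2/lm/configurations.
    The scenario/executable IDs below are assumed values for an
    Azure-hosted OpenAI model; check your tenant's ML Operations view."""
    return {
        "name": name,
        "scenarioId": "foundation-models",
        "executableId": "azure-openai",
        "parameterBindings": [
            {"key": "modelName", "value": model_name},
            {"key": "modelVersion", "value": model_version},
        ],
    }

payload = build_configuration_payload("gpt-4o-mini-config", "gpt-4o-mini")
print(json.dumps(payload, indent=2))
```

The actual request must also carry an OAuth bearer token and an `AI-Resource-Group` header (set to `default` here), which is exactly where the resource group from the earlier note comes into play.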
By the way, there are various LLMs available to run on SAP AI Core. You can browse them in the Generative AI Hub > Model Library section. The Leaderboard also allows you to compare models based on accuracy, token cost, and other metrics.
Part 2: Building a Chat Model with LangChain
We’ll finally implement the chat model! Make sure your Python development environment is ready—VS Code or a similar IDE will work perfectly!
All the views and opinions in the blog are my own and are made in my personal capacity. SAP shall not be responsible or liable for any of the contents published in this blog.