Integrating prompts into your system
When your prompt is ready for production, you can easily integrate it into your system.
Asserto functions as a CMS for your prompts, separating prompt content from your application code. Once integrated, you can push prompt updates without redeploying your system, keeping your production workflow lean and flexible.
1. Getting the Prompt Content (Messages and Configuration)
Prompt data includes the messages to be sent to the LLM and configuration options such as the model name and temperature.
You can retrieve this information in two ways:
Export as JSON via the UI
From the Prompt Page, click the Use Prompt button and select Export as JSON. You can also export all prompts at once from the prompt-list page.
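The exported JSON contains the prompt's messages and configuration. The exact schema may vary by version, so treat the snippet below as a hypothetical illustration (field names are assumptions; check your own export for the real structure):

```json
{
  "name": "summarize-article",
  "model": "gpt-4o",
  "temperature": 0.7,
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Summarize the following text:\n{{text}}" }
  ]
}
```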
Get Prompt via API
Generate a project API key from the Settings page.
Use the following endpoint to fetch the prompt definition:
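As a minimal sketch, you can fetch the prompt with Python's requests library. The endpoint URL and header name below are placeholders for illustration; substitute the exact values given in the API documentation.

```python
import requests

# NOTE: the endpoint URL and auth header are hypothetical placeholders;
# replace them with the real values from the API documentation.
API_KEY = "your-project-api-key"
PROMPT_ID = "your-prompt-id"

response = requests.get(
    f"https://api.asserto.example/v1/prompts/{PROMPT_ID}",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
response.raise_for_status()
prompt = response.json()  # messages plus configuration (model, temperature, ...)
```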
Refer to the API documentation for more details.
2. Sample Code for Integration with the OpenAI API
Once you have the prompt messages and configuration, you can render them and send the request using the OpenAI API.
Here’s an example in Python using chevron, a Mustache templating library:
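This sketch assumes the prompt definition has the shape shown in the hypothetical export above; adapt the field names to your actual data. It requires the chevron and openai packages.

```python
import chevron
from openai import OpenAI

# Prompt definition as exported from Asserto (shape is illustrative;
# match it to your actual export or API response).
prompt = {
    "model": "gpt-4o",
    "temperature": 0.7,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the following text:\n{{text}}"},
    ],
}

# Variables to substitute into the Mustache placeholders.
variables = {"text": "Asserto separates prompt content from application code."}

# Render each message template with chevron before sending it to the model.
rendered_messages = [
    {"role": m["role"], "content": chevron.render(m["content"], variables)}
    for m in prompt["messages"]
]

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model=prompt["model"],
    temperature=prompt["temperature"],
    messages=rendered_messages,
)
print(response.choices[0].message.content)
```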
By combining this with the prompt API, your system can stay in sync with content changes without code changes or redeployments.