# PromptQL Configuration

## Overview

You can manage PromptQL's LLM settings and system instructions from a single `promptql-config.hml` file, which is automatically created in the `globals/metadata` directory of your project when you initialize a project with the `--with-promptql` flag using the CLI.
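As a sketch, project initialization with the flag might look like the following (the project name `my-project` is a placeholder):

```sh
# Initialize a new project with PromptQL enabled;
# this generates globals/metadata/promptql-config.hml.
ddn supergraph init my-project --with-promptql
```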
## Examples

Minimal configuration:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: hasura
```
Custom providers for the LLM and AI primitives, along with custom system instructions:

```yaml
kind: PromptQlConfig
version: v2
definition:
  llm:
    provider: openai
    model: o3-mini
    apiKey:
      valueFromEnv: OPENAI_API_KEY
  aiPrimitiveLlm:
    provider: openai
    model: gpt-4o
    apiKey:
      valueFromEnv: OPENAI_API_KEY
  system_instructions: |
    You are a helpful AI Assistant.
```
## Mapping environment variables

If you specify environment variables in your `promptql-config.hml`, don't forget to add them to the `globals` subgraph's `subgraph.yaml` under the `envMapping` section.
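A minimal sketch of that mapping, assuming the subgraph file's usual shape (field names other than `envMapping` may differ in your project):

```yaml
kind: Subgraph
version: v2
definition:
  name: globals
  generator:
    rootPath: .
  # Map the variable referenced by valueFromEnv in promptql-config.hml
  # to an environment variable supplied at build/deploy time.
  envMapping:
    OPENAI_API_KEY:
      fromEnv: OPENAI_API_KEY
```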
With the `promptql-config.hml` file, you can:

- Set the LLM provider and model used across the application.
- Define separate LLMs for AI primitives such as classification, summarization, and extraction.
- Add system instructions that apply to every PromptQL interaction.