Understanding the LLM step
Differences between the LLM step, ChatGPT and Claude AI
Under the hood, ChatGPT and Claude AI are built on the same large language models you can select directly in AirOps. Each model comes with a name and version: for example, GPT-4 (OpenAI's ChatGPT) or Claude Sonnet and Claude Opus (Anthropic). Each platform also applies a built-in system prompt that sets the AI's persona, behavior, boundaries and ethics.
- OpenAI does not publicly share ChatGPT’s system prompt.
- Anthropic, by contrast, does publish theirs; searching for "Anthropic system prompt" will find it.
The LLM step in AirOps simply surfaces two things you must define yourself:
- Your chosen model
- Your custom system prompt
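In API terms, those two settings map onto the model name and system message of a chat-style request; the step fills in the rest at run time. A rough sketch of such a payload, with field names that mirror common LLM APIs rather than AirOps internals:

```python
# Illustrative chat-completion payload; field names mirror common
# LLM APIs, not AirOps internals.
request = {
    "model": "claude-sonnet",           # 1. your chosen model
    "system": "You're an SEO expert.",  # 2. your custom system prompt
    "messages": [
        {"role": "user", "content": "Write a meta description for this page."},
    ],
}
```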
Configuring the LLM step in an AirOps Workflow
- Open your meta-description Workflow.
- Drag in the Prompt (LLM) step.
- In the step’s settings, choose a model from the dropdown—Claude Sonnet is a great starting point for writing tasks.
- Paste or type your system prompt into the System field.
System vs User vs Assistant prompts
- System prompt: Defines the AI’s persona, role and task.
- User prompt: Your human-entered message (just like when you type into ChatGPT).
- Assistant prompt: Pre-loaded AI responses you could inject—but we won’t use it right now.
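Side by side, the three roles look like this in a chat-style request (the structure follows common chat APIs and is illustrative only):

```python
# The three prompt roles, side by side. Illustrative structure only.
system_prompt = "You're an SEO expert."  # persona, role and task

# User: the human-entered message, just like typing into ChatGPT
user_message = {"role": "user", "content": "Write me a meta description."}

# Assistant: a pre-loaded AI response you could inject to seed a reply;
# we won't use one for this workflow.
assistant_message = {"role": "assistant", "content": "Sure! Here it is..."}

# A request sends the system prompt plus the user message only
messages = [user_message]
```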
Building a simple meta-description prompt
System prompt example:
“You’re an SEO expert. Write a meta description between 150–160 characters.”
User prompt example:
Please give me a meta description optimizing for the keyword “{{ keyword }}.”
Here is the article content: {{ step_#.output }}
- Reference your Workflow variables (keyword, and step_#.output from the web page scrape step)
- Don’t pass the raw URL; supply the scraped page text so the model sees all of the content.
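At run time, the `{{ keyword }}` and `{{ step_#.output }}` placeholders are swapped for the actual Workflow values. A rough sketch of that substitution (the `render` helper and the `article` variable name are hypothetical; AirOps performs this step for you):

```python
def render(template: str, variables: dict) -> str:
    """Replace {{ name }} placeholders with workflow variable values.
    Hypothetical helper -- AirOps does this substitution itself."""
    for name, value in variables.items():
        template = template.replace("{{ " + name + " }}", str(value))
    return template

user_prompt = render(
    'Please give me a meta description optimizing for the keyword "{{ keyword }}."\n'
    "Here is the article content: {{ article }}",
    {"keyword": "trail running shoes", "article": "Scraped page text..."},
)
print(user_prompt)
```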
Testing and iterating
- Click Test on the Prompt step and watch the output stream.
- Review the generated meta description and adjust any additional rules:
  - Character-length checks
  - Forbidden characters (dashes, ampersands, etc.)
  - Tone or formatting constraints
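Some of these rules can also be checked outside the prompt. A small validation sketch that mirrors the checks above (the exact rules are yours to define):

```python
def check_meta_description(text: str) -> list[str]:
    """Return a list of rule violations; an empty list means it passes.
    Illustrative checks mirroring the rules described above."""
    problems = []
    # Character-length check: 150-160 characters
    if not 150 <= len(text) <= 160:
        problems.append(f"length {len(text)} outside 150-160 characters")
    # Forbidden characters: dashes, ampersands
    for ch in "-&":
        if ch in text:
            problems.append(f"contains forbidden character {ch!r}")
    return problems
```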
By choosing the same underlying models as ChatGPT and Claude AI but providing your own system prompt plus exact context, you gain much tighter control over the output.
Hopefully this gives you a solid understanding of what the LLM step is and how it differs from ChatGPT and Claude AI.