AirOps Academy
Your First Workflow

Understanding Variables

In our AirOps Workflow for generating meta descriptions, we’ve already set up inputs and defined outputs. Before invoking the large language model, we need to fetch the actual page content—models can’t automatically scrape URLs.

Why Scraping Comes First

A language model only sees the literal URL string, not the page behind it. To supply real content, you must scrape the page before passing it along.

Using the Web Page Scrape Step

AirOps provides a Web Page Scrape step. This step accepts a URL and returns the full page content. Simply add it to your Workflow and point it at the target URL.
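Conceptually, the scrape step fetches the URL and reduces the HTML to the visible text a model can actually use. The sketch below is a minimal stand-in for that behavior, not AirOps' implementation; `scrape_page` and `TextExtractor` are hypothetical names.

```python
# Minimal sketch of what a "Web Page Scrape" step does conceptually:
# fetch a URL, then strip the HTML down to visible text.
from html.parser import HTMLParser
import urllib.request


class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside script/style tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())


def scrape_page(url: str) -> str:
    """Fetch a URL and return its visible text content."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)
```

A real scrape step also handles redirects, timeouts, and JavaScript-rendered pages; this sketch only shows the core fetch-and-extract idea.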

Introducing Variables

Hardcoding a URL would scrape the same page every run. Instead, use variables to pass user inputs from one step to the next. Here, the URL input flows into the scrape step via a variable.

Passing Dynamic URLs with Variables

  • Type a backslash \ in the URL field and select the URL variable.
  • Or, open the variables menu on the right and click URL.
    This ensures the scrape step always uses whatever the user entered.
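Under the hood, variable substitution works like template interpolation: the engine replaces the variable reference with the user's input at run time. The sketch below illustrates the idea with a hypothetical `{{url}}` placeholder syntax; AirOps' actual syntax may differ.

```python
# Hypothetical sketch of run-time variable resolution in a workflow field.
# The {{name}} placeholder syntax is an assumption for illustration.
import re


def resolve_variables(field: str, inputs: dict) -> str:
    """Replace {{name}} placeholders with user-supplied input values."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(inputs[m.group(1)]), field)


# A hardcoded field scrapes the same page every run:
resolve_variables("https://example.com/pricing", {})
# → "https://example.com/pricing"

# A variable reference picks up whatever the user entered:
resolve_variables("{{url}}", {"url": "https://ramp.com"})
# → "https://ramp.com"
```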

Testing Your Scrape Step

  1. Click Test Step.
  2. Enter a sample URL (e.g., the Ramp page) and a dummy keyword.
  3. Save the inputs and run the test.
    • The step replaces the variable with your sample URL.
    • AirOps scrapes the page and returns the content.

Passing Data Between Steps

In every AirOps Workflow, inputs move forward via variables, and each step's output can feed into later steps. This dynamic chaining makes your automation both flexible and reusable.
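The chaining pattern can be sketched as a running context that each step reads from and writes to. This is an illustrative model only, with hypothetical step names, not how AirOps is implemented.

```python
# Sketch of step chaining: each step sees the user inputs plus
# every prior step's output, and adds its own output to the context.
def run_workflow(steps, inputs):
    """Run steps in order, accumulating outputs in a shared context."""
    context = dict(inputs)
    for name, step in steps:
        context[name] = step(context)
    return context


# Hypothetical steps mirroring the meta-description workflow:
steps = [
    # Scrape step: reads the url input (stubbed here, no real fetch).
    ("page_content", lambda ctx: f"content of {ctx['url']}"),
    # LLM step: reads the scrape step's output.
    ("meta_description", lambda ctx: ctx["page_content"][:160]),
]

result = run_workflow(steps, {"url": "https://example.com"})
# result["meta_description"] is derived from result["page_content"],
# which was derived from the url input.
```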
