Service: Airflow - EyevinnOSC/community GitHub Wiki

Getting Started

Apache Airflow is an open-source workflow orchestration platform for authoring, scheduling, and monitoring data pipelines as code. DAGs (Directed Acyclic Graphs) written in Python define your pipeline logic. Available as an open web service in Eyevinn Open Source Cloud, Airflow gives you a visual web UI for triggering and monitoring runs, a scheduler, and a metadata database to persist state.

Prerequisites

Step 1: Create a PostgreSQL database

Airflow requires a metadata database. Create a PostgreSQL instance first by navigating to the PostgreSQL service in the OSC web console. Click Create psql-db and enter a name and password.

Once running, note the IP and port from the instance card. The connection string is:

postgres://postgres:<password>@<IP>:<PORT>/postgres
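If the password contains characters such as @, / or :, it must be percent-encoded before being placed in the URL, or the connection string will not parse. A minimal Python sketch of assembling the URL (the host, port, and password values below are made up for illustration):

```python
from urllib.parse import quote

def build_db_url(password: str, host: str, port: int) -> str:
    """Build the metadata-database URL for the default `postgres` user and database.

    The password is percent-encoded so reserved characters like @ or /
    do not break URL parsing.
    """
    return f"postgres://postgres:{quote(password, safe='')}@{host}:{port}/postgres"

# Example with a password containing reserved characters:
print(build_db_url("p@ss/word", "203.0.113.10", 5432))
# postgres://postgres:p%40ss%2Fword@203.0.113.10:5432/postgres
```

The resulting string is what goes into the dburl secret in the next step.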

Step 2: Store the database URL as a secret

It is good practice to store credentials as a service secret rather than entering them in plain text. Navigate to the Airflow service page, open the Service Secrets tab, and click New Secret. Name the secret dburl and paste in the full PostgreSQL connection string.

Step 3: Create an Airflow instance

Go to the My airflows tab on the Airflow service page and click Create airflow.

Field          Description
name           Unique name for this Airflow instance
AdminPassword  Password for the admin user (used to log in to the web UI)
DatabaseUrl    PostgreSQL connection string — reference the secret with {{secrets.dburl}}

Click Create. Wait for the instance status to turn green.

Step 4: Log in

Click the instance card to open the Airflow web UI. Log in with:

  • Username: admin
  • Password: the AdminPassword you set in Step 3

Change the password after first login under Security → Users.

Usage example

Deploy DAGs by placing Python files in the dags/ directory of a repository and pointing the instance at it, or use the Airflow REST API to trigger runs programmatically:

curl -X POST \
  -H "Content-Type: application/json" \
  -u admin:<password> \
  https://<instance>.apache-airflow.auto.prod.osaas.io/api/v1/dags/<dag_id>/dagRuns \
  -d '{"conf": {}}'
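A DAG file itself is plain Python. The sketch below shows roughly what a minimal file in the dags/ directory could look like; the file name, dag_id, and task are hypothetical, and it assumes Airflow 2.4+ (where the schedule parameter replaced schedule_interval):

```python
# dags/hello_osc.py — a hypothetical minimal DAG definition.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def say_hello():
    print("Hello from Airflow on Eyevinn Open Source Cloud")

with DAG(
    dag_id="hello_osc",          # shows up in the web UI and in the REST API path
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # cron expressions also work here
    catchup=False,               # don't backfill runs for past dates
) as dag:
    PythonOperator(task_id="say_hello", python_callable=say_hello)
```

Once the scheduler picks up the file, hello_osc appears in the web UI and can be triggered there or via the dagRuns endpoint shown above.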

CLI usage

osc create apache-airflow myairflow \
  -o AdminPassword="mysecretpassword" \
  -o DatabaseUrl="{{secrets.dburl}}"

Resources
