🧚 Why Run dbt Inside Airflow Docker Container

In modern data engineering pipelines, dbt and Airflow often work side by side. A common design decision is how to run dbt alongside Airflow: should dbt run in its own container, orchestrated via an API or CLI call, or directly inside Airflow’s Docker container as part of the DAG? After experimenting with both approaches, I prefer running dbt inside Airflow’s Docker container. ...

June 4, 2025

🔧 ARM Mac + Docker + dbt: Troubleshooting Startup Issues

While setting up Airflow + dbt projects with Docker, you may run into the following common error. 🔍 Problem 1: Platform Architecture Mismatch Error message: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) My Mac runs on ARM (Apple Silicon: M1/M2/M3), while the official dbt Docker image is built for amd64 (x86). As a result, Docker runs the image cross-architecture under QEMU emulation, which sometimes causes internal Python path issues that surface as the `dbt --version` error. This is not a simple dbt bug: the root cause is the platform mismatch. ...
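One common workaround is to pin the platform explicitly so Docker runs the amd64 image under emulation deliberately rather than by accident. A sketch in docker-compose terms, where the service name and image tag are illustrative placeholders:

```yaml
# docker-compose.yml (fragment) -- service name and image tag are hypothetical.
# "platform" forces the amd64 image to run under QEMU emulation on Apple Silicon.
services:
  dbt:
    image: ghcr.io/dbt-labs/dbt-postgres:1.7.0
    platform: linux/amd64
```

If an ARM-native image exists for your dbt adapter, using it avoids emulation (and its occasional Python path issues) entirely.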

May 30, 2025

🔧 Solving Airflow Docker Startup Issues

Common issues you may encounter when running Airflow with Docker. ❗ Issue 1 — .env file is not visible inside the Airflow container 🔍 Symptom Summary The .env file exists at the project root, but inside the Airflow container load_dotenv() fails to read it. The reason: Docker Compose reads .env automatically and passes its values in as environment variables, but it does not copy or mount the file itself into the container, so load_dotenv() has no file to read. ✅ Solution 1️⃣ Add a volume mount for .env in docker-compose.yml. This way, the .env file becomes available inside the container at the expected path. ...
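The volume-mount fix is a one-line addition. A sketch, assuming the container expects the file under `/opt/airflow` (the target path is an assumption; match it to whatever path your code passes to load_dotenv()):

```yaml
# docker-compose.yml (fragment) -- mount the .env file itself into the
# container so load_dotenv() has an actual file to read.
# /opt/airflow/.env is a hypothetical target path.
services:
  airflow-webserver:
    volumes:
      - ./.env:/opt/airflow/.env
```

Inside the container, `load_dotenv("/opt/airflow/.env")` can then read the file directly.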

May 30, 2025

🔧 Why Do We Split Airflow into init, scheduler, and webserver?

If you start working with Airflow a bit more seriously, you’ll quickly notice that it’s usually split into multiple services: airflow-init, airflow-scheduler, and airflow-webserver. At first, you may wonder: “Why do we need to split them up like this?” Well, this is actually the standard production architecture. Let’s break it down in simple, practical terms. 1️⃣ airflow-init — Preparation Step Also sometimes called airflow-db-migrate or airflow-bootstrap, this runs only once, when you initialize Airflow. ...
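In docker-compose terms, the split typically looks like this. A minimal sketch, not a complete file; the image tag is illustrative, and `airflow db migrate` assumes a recent Airflow 2.x (older versions use `airflow db init`):

```yaml
# docker-compose.yml (fragment) -- one-shot init, then long-running services
# that wait for init to finish successfully before starting.
services:
  airflow-init:
    image: apache/airflow:2.9.0
    command: airflow db migrate   # runs once, then exits

  airflow-scheduler:
    image: apache/airflow:2.9.0
    command: airflow scheduler
    depends_on:
      airflow-init:
        condition: service_completed_successfully

  airflow-webserver:
    image: apache/airflow:2.9.0
    command: airflow webserver
    depends_on:
      airflow-init:
        condition: service_completed_successfully
```

The `service_completed_successfully` condition is what makes the one-shot init step safe: the scheduler and webserver only start after the database is prepared.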

May 30, 2025

🚀 Building a Batch Data Pipeline with AWS, Airflow, and Spark

✨ Project Summary Assuming I work for a fintech company, I built a batch pipeline that automatically aggregates → transforms → analyzes credit card data. Since I couldn’t use real data, I generated synthetic transaction data with Faker, which was sufficient for designing the overall data flow and structure. 🎯 Goal “Build an Airflow pipeline that processes realistic financial data with Spark, then analyzes and stores the results.” ...

May 1, 2025