"If a worker wants to do his job well, he must first sharpen his tools." - Confucius, "The Analects of Confucius. Lu Linggong"

Automating import of CSV to PostgreSQL using Airflow and Docker

Posted on 2025-04-12

This tutorial demonstrates how to build a robust data pipeline with Apache Airflow, Docker, and PostgreSQL that reads data from CSV files and writes it to a database automatically. Along the way we cover core Airflow concepts such as DAGs, tasks, and operators, and combine these components so the pipeline handles data efficiently while preserving data integrity.

Learning Objectives:

  • Grasp core Apache Airflow concepts: DAGs, tasks, and operators.
  • Set up and configure Apache Airflow with Docker for workflow automation.
  • Integrate PostgreSQL for data management within Airflow pipelines.
  • Master reading CSV files and automating data insertion into a PostgreSQL database.
  • Build and deploy scalable, efficient data pipelines using Airflow and Docker.

Prerequisites:

  • Docker Desktop, VS Code, Docker Compose
  • Basic understanding of Docker containers and commands
  • Basic Linux commands
  • Basic Python knowledge
  • Experience building Docker images from Dockerfiles and using Docker Compose

What is Apache Airflow?

Apache Airflow (Airflow) is a platform for programmatically authoring, scheduling, and monitoring workflows. Defining workflows as code improves maintainability, version control, testing, and collaboration. Its user interface simplifies visualizing pipelines, monitoring progress, and troubleshooting.

Automating CSV to PostgreSQL Ingestion with Airflow and Docker

Airflow Terminology:

  • Workflow: A step-by-step process to achieve a goal (e.g., baking a cake).
  • DAG (Directed Acyclic Graph): The blueprint of a workflow, a visual representation that defines its tasks, their dependencies, and the execution order.
  • Task: A single action within a workflow (e.g., mixing ingredients).
  • Operators: Building blocks of tasks, defining actions such as running Python code or executing SQL. Key operators include PythonOperator, EmptyOperator (formerly DummyOperator), and PostgresOperator.
  • XComs (Cross-Communications): Enable tasks to communicate and share data.
  • Connections: Manage credentials for connecting to external systems (e.g., databases).
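
To see how these terms map onto code, here is a minimal, hypothetical DAG with two tasks chained together (the DAG id, schedule, and callable are placeholders, not the pipeline built later in this tutorial):

from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import PythonOperator

def say_hello():
    # The operator defines how a task runs; PythonOperator runs this callable.
    print("hello from Airflow")

with DAG(dag_id="terminology_demo", start_date=datetime(2025, 1, 1),
         schedule=None, catchup=False) as dag:
    start = EmptyOperator(task_id="start")   # a do-nothing placeholder task
    hello = PythonOperator(task_id="say_hello", python_callable=say_hello)
    start >> hello                           # the dependency edge that makes this a DAG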

Setting up Apache Airflow with Docker and Dockerfile:

Using Docker ensures a consistent and reproducible environment. A Dockerfile automates image creation. The following instructions should be saved as Dockerfile (no extension):

FROM apache/airflow:2.9.1-python3.9

# System packages (build tools and a JDK for the Spark provider) must be installed as root
USER root
RUN apt-get update && apt-get install -y gcc python3-dev openjdk-17-jdk && apt-get clean

# Python packages should be installed as the airflow user, not root
USER airflow
COPY requirements.txt /requirements.txt
RUN pip install --upgrade pip && pip install --no-cache-dir -r /requirements.txt
RUN pip install --no-cache-dir apache-airflow-providers-apache-spark apache-airflow-providers-amazon

This Dockerfile starts from an official Airflow image, installs the system packages some providers need (gcc, Python headers, and a JDK for Spark) as root, then switches back to the airflow user to install the dependencies listed in requirements.txt plus extra Airflow providers. The Spark and AWS providers are shown as examples and you may need others; the Postgres provider used in this tutorial already ships with the base image.
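
If you want to confirm the image builds before wiring it into Compose, a quick standalone build might look like this (the tag is arbitrary; the Compose sketch in the next section builds the same image automatically via build: .):

docker build -t custom-airflow:2.9.1 .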

Docker Compose Configuration:

docker-compose.yml orchestrates the containers. The configuration defines services for the Airflow webserver, scheduler, triggerer, CLI, init job, and PostgreSQL. Note the x-airflow-common section for settings shared across the Airflow services and the connection to the PostgreSQL database. (The full docker-compose.yml is too long to reproduce here; a trimmed sketch of the key sections follows.)
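
The sketch below follows the shape of the official Airflow 2.9 Compose file, trimmed to a webserver, scheduler, init job, and the PostgreSQL database; service names, credentials, and mounted paths are assumptions you should adapt to your setup:

x-airflow-common: &airflow-common
  build: .                                   # build the custom image from the Dockerfile above
  environment:
    AIRFLOW__CORE__EXECUTOR: LocalExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
  volumes:
    - ./dags:/opt/airflow/dags               # DAG files and the generated SQL live here
    - ./input.csv:/opt/airflow/input.csv     # the sample CSV read by the pipeline
  depends_on:
    - postgres

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  airflow-init:
    <<: *airflow-common
    command: bash -c "airflow db migrate && airflow users create --username admin --password admin --firstname Admin --lastname User --role Admin --email admin@example.com"

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - "8080:8080"

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler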

Project Setup and Execution:

  1. Create a project directory.
  2. Add the Dockerfile and docker-compose.yml files.
  3. Create requirements.txt listing necessary Python packages (e.g., pandas).
  4. Run docker-compose up -d to start the containers.
  5. Access the Airflow UI at http://localhost:8080.
  6. Create a PostgreSQL connection in the Airflow UI, using write_to_psql as the connection ID (or create it from the CLI, as sketched after this list).
  7. Create a sample input.csv file.
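
For illustration, a hypothetical input.csv with id, name, and age columns could contain:

id,name,age
1,Alice,30
2,Bob,25
3,Carol,41

The write_to_psql connection can also be created from the command line instead of the UI; the host and credentials below are assumptions matching the Compose sketch above:

docker compose exec airflow-webserver airflow connections add write_to_psql \
    --conn-type postgres \
    --conn-host postgres \
    --conn-schema airflow \
    --conn-login airflow \
    --conn-password airflow \
    --conn-port 5432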

DAG and Python Function:

The Airflow DAG (sample.py) defines the workflow:

  • A PostgresOperator creates the database table.
  • A PythonOperator (generate_insert_queries) reads the CSV and generates SQL INSERT statements, saving them to dags/sql/insert_queries.sql.
  • Another PostgresOperator executes the generated SQL.

(The full sample.py code is too long to reproduce here; a condensed sketch follows.)
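
Here is a condensed sketch of what sample.py can look like, assuming a table named sample_table, the hypothetical id/name/age columns from the example CSV, and the container paths used in the Compose sketch above:

import os
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator

def generate_insert_queries():
    # Read the mounted CSV and write one INSERT per row into a .sql file
    # that the downstream PostgresOperator will execute.
    df = pd.read_csv("/opt/airflow/input.csv")
    os.makedirs("/opt/airflow/dags/sql", exist_ok=True)
    with open("/opt/airflow/dags/sql/insert_queries.sql", "w") as f:
        for _, row in df.iterrows():
            f.write(
                f"INSERT INTO sample_table (id, name, age) "
                f"VALUES ({row['id']}, '{row['name']}', {row['age']});\n"
            )

with DAG(
    dag_id="csv_to_postgres",
    start_date=datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_table = PostgresOperator(
        task_id="create_table",
        postgres_conn_id="write_to_psql",
        sql="CREATE TABLE IF NOT EXISTS sample_table (id INT, name TEXT, age INT);",
    )
    generate_sql = PythonOperator(
        task_id="generate_insert_queries",
        python_callable=generate_insert_queries,
    )
    insert_rows = PostgresOperator(
        task_id="insert_rows",
        postgres_conn_id="write_to_psql",
        sql="sql/insert_queries.sql",   # resolved relative to the DAG folder
    )
    create_table >> generate_sql >> insert_rows

Generating a plain SQL file and letting a PostgresOperator run it keeps the Python task free of database connections; only the operators talk to PostgreSQL, through the write_to_psql connection.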

Conclusion:

This project demonstrates a complete data pipeline using Airflow, Docker, and PostgreSQL. It highlights the benefits of automation and the use of Docker for reproducible environments. The use of operators and the DAG structure are key to efficient workflow management.

(The original post's remaining sections, including FAQs and a link to the GitHub repository, are omitted here for brevity.)
