Daily Postgres Table Backup to GitHub in CSV Format

This workflow automatically backs up all public Postgres tables into a GitHub repository as CSV files every 24 hours.
It keeps your database snapshots up to date: existing files are updated when data changes, and new backups are created for new tables.

How it works:
Schedule Trigger – Runs daily to start the backup process.
GitHub Integration – Lists the files already in the target repository so existing backups can be detected.
Postgres Query – Fetches all table names from the public schema (see the first sketch after this list).
Data Extraction – Selects all rows from each table.
Convert to CSV – Serializes each table's rows into a CSV file.
Conditional Upload (see the second sketch below) –
If the table already has a file in GitHub → update the file.
If it is new → upload a new file.
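
The table-discovery, extraction, and CSV steps translate directly into SQL against the information_schema catalog. Below is a minimal sketch of those steps outside n8n, assuming node-postgres ("pg") and a DATABASE_URL connection string; both are stand-ins for the workflow's Postgres credentials, not part of the template itself.

import { Client } from "pg";

// Minimal CSV escaping: wrap a field in quotes if it contains a comma,
// quote, or newline, doubling any embedded quotes.
function toCsvField(value: unknown): string {
  const s = value === null || value === undefined ? "" : String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

async function dumpPublicTables(): Promise<Map<string, string>> {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();

  // Postgres Query step: list every base table in the public schema.
  const tables = await client.query(
    "SELECT table_name FROM information_schema.tables " +
    "WHERE table_schema = 'public' AND table_type = 'BASE TABLE'"
  );

  const csvByTable = new Map<string, string>();
  for (const { table_name } of tables.rows) {
    // Data Extraction step: select all rows. The identifier is quoted
    // because it comes from the catalog, not from user input.
    const result = await client.query(`SELECT * FROM "${table_name}"`);

    // Convert to CSV step: header row from column names, then data rows.
    const header = result.fields.map((f) => toCsvField(f.name)).join(",");
    const rows = result.rows.map((row) =>
      result.fields.map((f) => toCsvField(row[f.name])).join(",")
    );
    csvByTable.set(`${table_name}.csv`, [header, ...rows].join("\n"));
  }

  await client.end();
  return csvByTable;
}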
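
The conditional upload maps onto GitHub's contents API: reading an existing file returns the sha needed for an update, while a 404 means the file must be created. Here is a sketch using @octokit/rest; the owner, repo, token variable, and commit message are placeholders, not values from the template.

import { Octokit } from "@octokit/rest";

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN }); // placeholder token

// Create the file if it is missing; update it (passing the existing sha)
// if a backup for this table is already in the repository.
async function upsertCsv(path: string, csv: string): Promise<void> {
  const owner = "your-org";  // placeholder
  const repo = "pg-backups"; // placeholder

  let sha: string | undefined;
  try {
    const { data } = await octokit.repos.getContent({ owner, repo, path });
    if (!Array.isArray(data)) sha = data.sha; // file exists → this is an update
  } catch (err: any) {
    if (err.status !== 404) throw err; // 404 just means "new table, new file"
  }

  await octokit.repos.createOrUpdateFileContents({
    owner,
    repo,
    path,
    message: `daily backup: ${path}`,
    content: Buffer.from(csv).toString("base64"), // contents API expects base64
    sha, // undefined on create, required on update
  });
}

Calling upsertCsv for every entry returned by dumpPublicTables reproduces the update-or-create behavior of the workflow's conditional branch.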

Postgres Tables Preview

GitHub Backup Preview

Use case:
Perfect for developers, analysts, and data engineers who want daily automated backups of Postgres data without manual exports, keeping both history and version control in GitHub.

Requirements:
Postgres credentials with read access.
A GitHub repository, with OAuth2 credentials connected in n8n.

Author: Jay Emp0
Created: 9/10/2025
Updated: 11/17/2025


