Daily Postgres Table Backup to GitHub in CSV Format
This workflow automatically backs up every table in your Postgres public schema to a GitHub repository as CSV files every 24 hours.
It keeps your database snapshots current, updating existing files when data changes and creating new files for newly added tables.
How it works:
Schedule Trigger – Runs daily to start the backup process.
GitHub Integration – Lists the existing files in the target repo so the workflow knows whether each table's CSV should be updated or created.
Postgres Query – Fetches all table names from the public schema (see the query sketch after this list).
Data Extraction – Selects all rows from each table.
Convert to CSV – Serializes each table's rows into a CSV file (see the conversion sketch after this list).
Conditional Upload (sketched after this list) –
If a CSV for the table already exists in GitHub → Update the file.
If it is new → Upload a new file.
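A minimal sketch of the table-listing step, assuming Node.js with the pg client and a DATABASE_URL environment variable. The template itself configures this inside n8n's Postgres node; the names here are illustrative, not taken from the workflow:

```typescript
// Sketch: fetch all base-table names from the public schema, as the
// workflow's Postgres Query step does. "pg" and DATABASE_URL are
// assumptions for a standalone script, not part of the n8n template.
import { Client } from "pg";

async function listPublicTables(): Promise<string[]> {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  const { rows } = await client.query(
    `SELECT table_name
       FROM information_schema.tables
      WHERE table_schema = 'public'
        AND table_type = 'BASE TABLE'  -- skip views
      ORDER BY table_name`
  );
  await client.end();
  return rows.map((r) => r.table_name as string);
}
```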
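The CSV step can be pictured as a plain serializer over the query rows. This sketch hand-rolls RFC 4180 quoting; n8n's own conversion node may differ in details such as null handling and line endings:

```typescript
// Sketch: turn query rows into CSV text, roughly what the workflow's
// "Convert to CSV" step produces for each table.
function toCsv(rows: Record<string, unknown>[]): string {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (value: unknown): string => {
    const s = value === null || value === undefined ? "" : String(value);
    // Quote fields containing commas, quotes, or newlines; double quotes.
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = [
    headers.join(","),
    ...rows.map((row) => headers.map((h) => escape(row[h])).join(",")),
  ];
  return lines.join("\n");
}
```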
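The conditional upload maps onto GitHub's contents endpoint, which creates a file on PUT and updates it when the existing blob's sha is included. Below is a sketch assuming a personal-access token in GITHUB_TOKEN and placeholder owner/repo values; the template itself uses n8n's OAuth2-connected GitHub node rather than raw HTTP:

```typescript
// Sketch: create-or-update a CSV in the backup repo via GitHub's
// contents API. OWNER, REPO, and GITHUB_TOKEN are placeholders.
const OWNER = "your-user";
const REPO = "db-backups";

async function upsertCsv(path: string, csv: string): Promise<void> {
  const url = `https://api.github.com/repos/${OWNER}/${REPO}/contents/${path}`;
  const headers = {
    Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
    Accept: "application/vnd.github+json",
  };

  // Mirrors the workflow's file-listing step: does the file exist yet?
  const existing = await fetch(url, { headers });
  const sha = existing.ok ? (await existing.json()).sha : undefined;

  // One PUT handles both branches: create (no sha) or update (sha set).
  await fetch(url, {
    method: "PUT",
    headers,
    body: JSON.stringify({
      message: `Daily backup: ${path}`,
      content: Buffer.from(csv).toString("base64"),
      ...(sha ? { sha } : {}),
    }),
  });
}
```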
Previews: the source Postgres tables and the resulting CSV backups in GitHub (screenshots in the original listing).
Use case:
Perfect for developers, analysts, or data engineers who want daily automated backups of Postgres data without manual exports, keeping both history and version control in GitHub.
Requirements:
Postgres credentials with read access.
GitHub repository (OAuth2 connected in n8n).