Daily Postgres Table Backup to GitHub in CSV Format
This workflow automatically backs up every table in your Postgres public schema to a GitHub repository as CSV files every 24 hours.
It keeps your snapshots current by updating existing files when data changes and uploading new files for newly created tables.
How it works:
Schedule Trigger – Runs daily to start the backup process.
GitHub Integration – Lists existing files in the target repo to avoid duplicates.
Postgres Query – Fetches all table names from the public schema.
Data Extraction – Selects all rows from each table.
Convert to CSV – Saves table data as CSV files.
Conditional Upload –
If the table already exists in GitHub → Update the file.
If new → Upload a new file (the sketches after this list walk through these steps).
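The query, extraction, and CSV steps can be approximated outside n8n with a short script. This is a minimal sketch, assuming Node.js 18+ and the pg package; the DATABASE_URL variable and helper names are illustrative, not part of the template.

```typescript
// Minimal sketch: list public tables and render each one as a CSV string.
// Assumes Node.js 18+ and the `pg` package; DATABASE_URL is a placeholder.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Escape one CSV field per RFC 4180 (quote when it contains , " or newline).
function csvField(value: unknown): string {
  const s = value == null ? "" : String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

async function listPublicTables(): Promise<string[]> {
  const { rows } = await pool.query(
    `SELECT table_name
       FROM information_schema.tables
      WHERE table_schema = 'public' AND table_type = 'BASE TABLE'`
  );
  return rows.map((r) => r.table_name);
}

async function tableToCsv(table: string): Promise<string> {
  // The identifier is double-quoted; it comes from information_schema,
  // not from user input.
  const { rows, fields } = await pool.query(`SELECT * FROM public."${table}"`);
  const header = fields.map((f) => csvField(f.name)).join(",");
  const body = rows.map((r) => fields.map((f) => csvField(r[f.name])).join(","));
  return [header, ...body].join("\n");
}
```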
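The conditional upload maps onto GitHub's Contents API, where a PUT to the same endpoint either creates or updates a file, and an update must include the file's current blob SHA. A hedged sketch follows; REPO, the commit message, and token handling are assumptions rather than values from the template.

```typescript
// Sketch of the create-or-update step against the GitHub Contents API.
// REPO and GITHUB_TOKEN are placeholders for your own repository and token.
const REPO = "owner/db-backups"; // assumption, not from the template
const HEADERS = {
  Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
  Accept: "application/vnd.github+json",
};

async function upsertCsv(path: string, csv: string): Promise<void> {
  const url = `https://api.github.com/repos/${REPO}/contents/${path}`;

  // If the file already exists, GitHub requires its current SHA to update it.
  const existing = await fetch(url, { headers: HEADERS });
  const sha = existing.ok ? (await existing.json()).sha : undefined;

  const res = await fetch(url, {
    method: "PUT",
    headers: HEADERS,
    body: JSON.stringify({
      message: `Daily backup: ${path}`,
      content: Buffer.from(csv).toString("base64"),
      ...(sha ? { sha } : {}), // present → update, absent → create
    }),
  });
  if (!res.ok) throw new Error(`GitHub upload failed: ${res.status}`);
}
```

A daily scheduler would then tie the two sketches together: for each name returned by listPublicTables(), call upsertCsv with the output of tableToCsv, mirroring the n8n flow above.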
Postgres Tables Preview
GitHub Backup Preview
Use case:
Perfect for developers, analysts, and data engineers who want daily automated backups of Postgres data without manual exports, keeping both history and version control in GitHub.
Requirements:
Postgres credentials with read access (a provisioning sketch follows this list).
GitHub repository (OAuth2 connected in n8n).
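For the read-access requirement, a dedicated least-privilege role is a sensible setup. This is a hypothetical one-off sketch: the role name, password, and ADMIN_DATABASE_URL are placeholders, and the connecting user must be allowed to create roles and grant privileges.

```typescript
// Hypothetical provisioning script for a read-only backup role.
// Role name and password are placeholders; change them before use.
import { Client } from "pg";

async function provision(): Promise<void> {
  const client = new Client({ connectionString: process.env.ADMIN_DATABASE_URL });
  await client.connect();
  await client.query(`CREATE ROLE backup_reader LOGIN PASSWORD 'change-me'`);
  await client.query(`GRANT USAGE ON SCHEMA public TO backup_reader`);
  await client.query(`GRANT SELECT ON ALL TABLES IN SCHEMA public TO backup_reader`);
  // Tables the provisioning user creates later will also be readable.
  await client.query(
    `ALTER DEFAULT PRIVILEGES IN SCHEMA public
       GRANT SELECT ON TABLES TO backup_reader`
  );
  await client.end();
}

provision().catch(console.error);
```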