by Robert Breen
This workflow transforms raw marketing data from Google Sheets into a pivot-like summary table. It merges lookup data, groups spend by name, and appends the results into a clean reporting tab — all automatically, without needing to manually build pivot tables in Sheets.

## 🧑‍💻 Who's it for

- Marketing analysts who track channel spend across campaigns
- Small businesses that rely on Google Sheets for reporting
- Teams that need automated daily rollups without rebuilding pivot tables manually

## ⚙️ How it works

1. **Get Marketing Data (Google Sheets)** – Pulls raw spend data.
2. **Vlookup Data (Google Sheets)** – Brings in reference/lookup fields (e.g., channel labels).
3. **Merge Tables** – Joins marketing data and lookup data on the Channel column.
4. **Summarize** – Groups data by Name and sums up Spend ($).
5. **Clear Sheet** – Wipes the reporting tab to avoid duplicates.
6. **Append to Pivot Sheet** – Writes the aggregated results into the "render pivot" sheet.

The result: a pivot-style summary table inside Google Sheets, automatically refreshed by n8n.

## 🔑 Setup Instructions

### 1) Connect Google Sheets (OAuth2)

- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- In each Google Sheets node, select your Spreadsheet and the appropriate Worksheet:
  - data (raw spend)
  - Lookup (channel reference table)
  - render pivot (output tab)

### 2) Configure the Summarize Node

- Group by: Name
- Summarize: Spend ($) → sum

### 3) Test the Workflow

- Execute the workflow manually
- Check your "render pivot" tab — it should display aggregated spend by Name

## 🛠️ How to customize

- Change grouping fields (e.g., by Channel, Campaign, or Region)
- Add more aggregations (e.g., average CPC, max impressions)
- Use the Merge node to join extra data sources before summarizing
- Schedule execution to run daily for fresh rollups

## 📋 Requirements

- n8n (Cloud or self-hosted)
- Google Sheets account with structured data in data and Lookup tabs

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your pivot)?

📧 rbreen@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
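If you later outgrow the Summarize node (say, for weighted or multi-level aggregations), the same grouping can be done in a Code node. A minimal sketch, assuming the Name and Spend ($) column headers described above:

```js
// n8n Code node: group incoming rows by Name and sum Spend ($).
// Assumes each item carries json.Name and json['Spend ($)'],
// matching the column headers described above.
const totals = {};
for (const item of $input.all()) {
  const name = item.json.Name;
  const spend = Number(item.json['Spend ($)']) || 0;
  totals[name] = (totals[name] || 0) + spend;
}
return Object.entries(totals).map(([name, spend]) => ({
  json: { Name: name, 'Spend ($)': spend },
}));
```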
by AI/ML API | D1m7asis
## 🧠 Telegram Search Assistant — Tavily + AI/ML API

This n8n workflow lets users ask questions in Telegram and receive concise, fact-based answers. It performs a web search with Tavily, then uses AIMLAPI (GPT-5) to summarize results into a clear 3–4 sentence reply. The flow ensures grounded, non-hallucinated answers.

### 🚀 Features

- 📩 Telegram-based input
- ⌨️ Typing indicator for better UX
- 🔎 Web search with Tavily (JSON results)
- 🧠 Summarization with AIMLAPI (openai/gpt-5-chat-latest)
- 📤 Replies in the same chat/thread
- ✅ Guardrails against hallucinations

### 🛠 Setup Guide

1. **📲 Create Telegram Bot**
   - Talk to @BotFather
   - Use /newbot → choose a name and username
   - Save the bot token
2. **🔐 Set Up Credentials in n8n**
   - **Telegram API**: use your bot token
   - **Tavily**: add your Tavily API key
   - **AI/ML API**: add your API key (Base URL: https://api.aimlapi.com/v1)
3. **🔧 Configure the Workflow**
   - Open the n8n editor and import the JSON
   - Update credentials for Telegram, Tavily, and AIMLAPI

### ⚙️ Flow Summary

| Node | Function |
|------|----------|
| 📩 Receive Telegram Msg | Triggered when user sends text |
| ⌨️ Typing Indicator | Shows "typing…" to user |
| 🔎 Web Search | Queries Tavily with user's message |
| 🧠 LLM Summarize | Summarizes search JSON into a factual answer |
| 📤 Reply to Telegram | Sends concise answer back to same thread |

### 📁 Data Handling

- By default: no data stored
- Optional: log queries & answers to Google Sheets or a database

### 💡 Example Prompt Flow

User sends: *When is the next solar eclipse in Europe?*

Bot replies: *The next solar eclipse in Europe will occur on August 12, 2026. It will be visible as a total eclipse across Spain, with partial views in much of Europe. The maximum eclipse will occur around 17:46 UTC.*

### 🔄 Customization

- Add commands: /help, /sources, /news
- Apply rate limits per user
- Extend logging to Google Sheets / a database
- Add NSFW / profanity filters before search

### 🧪 Testing

- Test end-to-end in Telegram (not just "Execute Node")
- Add a fallback reply if Tavily returns empty results
- Use sticky notes for debugging & best practices

### 📎 Resources

- 🔗 AI/ML API Docs
- 🔗 Tavily Search API
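For reference, here is a minimal sketch of the search request the Web Search step issues. The endpoint and Bearer-style auth follow Tavily's public docs, but verify the exact request shape against your account's API reference:

```js
// Minimal Tavily search call (sketch), equivalent to the Web Search
// HTTP Request node. Verify endpoint and auth style in Tavily's docs.
const response = await fetch('https://api.tavily.com/search', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.TAVILY_API_KEY}`,
  },
  body: JSON.stringify({
    query: 'When is the next solar eclipse in Europe?',
    max_results: 5, // keep the JSON small for the summarizer
  }),
});
const results = await response.json();
console.log(results);
```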
by Akash Kankariya
Easily ensure your n8n workflows are never lost! This template automates the process of backing up all your n8n workflows to a GitHub repository every 6 hours. Set it up once and enjoy worry-free workflow versioning and disaster recovery! 🔄✨

## 📝 What This Workflow Does

- **Schedules backups**: Triggers the workflow automatically every 6 hours — no manual steps needed. ⏰
- **Exports all current workflows**: Collects a JSON snapshot of every workflow in your n8n instance. 📦
- **Pushes backups to GitHub**: Commits each backup file to your specified GitHub repository with a time-stamped commit message for easy tracking. 🗂️🚀
- **Smart file handling**: Checks whether a backup file already exists and creates or updates it as needed, keeping your repository clean and organized. 🤖

## ⚡️ Why Use This Template?

- **Automate your workflow backups** – never miss a backup again!
- **Seamless integration with GitHub** for team collaboration, change management, and rollback.
- **Simple, reliable, and fully customizable** to match your backup intervals and repository setup.
- **Peace of mind** that your critical automation assets are always protected.

## 📦 How the Template Works: Step-by-Step Overview

1. **Scheduled Trigger**: Fires every 6 hours to launch the backup sequence.
2. **Get All Workflows**: Uses the HTTP Request node to fetch all n8n workflows from your instance as JSON data.
3. **Move Binary Data**: Converts the JSON into a binary format, ready for GitHub storage.
4. **Edit/Create Backup File**: Attempts to edit (update) an existing backup file in your GitHub repo. If the file does not exist, the workflow creates a new one.
5. **Conditional Logic**: Checks after each run whether the backup file exists and ensures previous versions can be recovered or merged as needed.
6. **Repeat**: The process auto-loops every 6 hours — no further intervention required!

## 🔧 How To Set Up On Your Server

1. Import the template into your n8n instance.
2. Configure your GitHub credentials in the workflow nodes.
3. Update the GitHub repository details (owner, repository, and filePath) to use your own repo and desired file path.
4. Set your n8n API key and update the API endpoint URL to match your deployment.
5. Save and activate the workflow — now your backups are on autopilot!

## 👨‍💻 Example Use Cases

- Version control for rapidly changing automation environments.
- Safeguarding business-critical automation assets.
- Easy rollback in case of workflow corruption or accidental deletion.
- Team collaboration through GitHub's pull request and review process.

## 🌟 Pro Tips

- Adjust the backup interval in the Schedule Trigger node if you require more or less frequent backups.
- Use GitHub branch protection rules for enhanced workflow security.
- Pair this backup workflow with notifications (e.g., Slack or email) for backup alerts.

Protect your n8n workflows with automated, reliable, and versioned GitHub backups — set it and forget it! 🚦🔒
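The "Get All Workflows" step uses n8n's public REST API, which exposes GET /api/v1/workflows authenticated via an X-N8N-API-KEY header. A minimal sketch, assuming your own instance URL:

```js
// Sketch of the "Get All Workflows" step against n8n's public API.
// Replace N8N_URL with your instance's base URL (assumption below).
const N8N_URL = 'https://your-n8n-instance.example.com';
const response = await fetch(`${N8N_URL}/api/v1/workflows`, {
  headers: { 'X-N8N-API-KEY': process.env.N8N_API_KEY },
});
const { data } = await response.json(); // array of workflow objects
console.log(`Backing up ${data.length} workflows`);
```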
by Jessica
## 🎬 Auto Add AI Captions to Videos from Google Drive with ZapCap

### Description

Stop wasting hours on video captioning. Upload your videos to a Google Drive folder, and ZapCap automatically generates professional subtitles for you. Download the finished video from your Google Drive and it's ready to post. Fast, simple, and effortless.

### How It Works

1. **Google Drive Trigger** – Watches your folder for new uploads.
2. **Send to ZapCap** – Instantly creates accurate subtitles.
3. **Wait & Check Status** – Automatically tracks progress.
4. **Download Captioned Video** – Gets your finished, captioned video.
5. **Upload Back to Drive** – Saves it where you need it, ready to share.

### Why You'll Love It

- 🚀 Save time — captions are added automatically.
- ⚡ Speed up content creation — get post-ready videos in minutes.
- 🎯 Professional results — subtitles are accurate and consistent.
- ☁️ Fully cloud-based — no local software, no manual work.

### Requirements

- **ZapCap account & API key** — get your free API key here
- **Google Drive account** (with OAuth credentials)
- **n8n** (Cloud or self-hosted)

### Support

Need help? Join our ZapCap Discord or email us at hi@zapcap.ai for assistance.
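The Wait & Check Status step follows a standard polling pattern. Here is a minimal sketch; the status URL, header name, and response fields below are placeholders, not ZapCap's actual API, so consult the ZapCap docs for the real routes:

```js
// Generic polling sketch for the "Wait & Check Status" step.
// NOTE: statusUrl, the x-api-key header, and the `status` field are
// hypothetical placeholders; check ZapCap's API docs for real values.
async function waitForCaptions(statusUrl, apiKey, intervalMs = 10000) {
  while (true) {
    const res = await fetch(statusUrl, { headers: { 'x-api-key': apiKey } });
    const job = await res.json();
    if (job.status === 'completed') return job; // download URL now available
    if (job.status === 'failed') throw new Error('Captioning failed');
    await new Promise((r) => setTimeout(r, intervalMs)); // wait, then re-check
  }
}
```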
by David Olusola
## Gmail Attachment Extractor to Google Drive

**Description**: This workflow monitors your Gmail inbox for new emails, specifically those with attachments, and automatically saves those attachments to a designated folder in your Google Drive.

**Use Case**: Automatically archive invoices, client documents, reports, or photos sent via email to structured cloud storage.

### How It Works

This workflow operates in three main steps:

1. **Gmail New Email Trigger**: The workflow starts with a Gmail Trigger node, set to monitor for new emails in your specified Gmail inbox (e.g., your primary inbox). It checks for emails that contain attachments.
2. **Conditional Check (Optional but Recommended)**: An If node checks whether the email actually has attachments. This prevents errors if an email without an attachment somehow triggers the workflow.
3. **Upload to Google Drive**: A Google Drive node receives the email data and its attachments. It's configured to upload these attachments to a specific folder in your Google Drive. The attachments are named dynamically based on their original filenames.

### Setup Steps

To get this workflow up and running, follow these instructions:

#### Step 1: Create Gmail and Google Drive Credentials in n8n

1. In your n8n instance, click on Credentials in the left sidebar.
2. Click New Credential.
3. Search for and select "Gmail OAuth2 API" and follow the authentication steps with your Google account. Save it.
4. Click New Credential again. Search for and select "Google Drive OAuth2 API" and follow the authentication steps with your Google account. Save it.
5. Make note of the credential names (e.g., "My Gmail Account", "My Google Drive Account").

#### Step 2: Create a Destination Folder in Google Drive

1. Go to your Google Drive (drive.google.com).
2. Create a new folder where you want to save the email attachments (e.g., Email Attachments Archive).
3. Copy the Folder ID from the URL (e.g., https://drive.google.com/drive/folders/YOUR_FOLDER_ID_HERE).
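If an email carries several attachments, a small Code node between the trigger and the Drive upload can split them into one item per file. A minimal sketch, assuming the Gmail Trigger stores attachments under binary keys like attachment_0 (the subject field name may vary with your trigger settings):

```js
// Split each email's binary attachments into one item per file so the
// Google Drive node can upload them individually. Assumes attachment
// binary keys like attachment_0; json.subject is an assumed field name.
const out = [];
for (const item of $input.all()) {
  for (const file of Object.values(item.binary ?? {})) {
    out.push({
      json: { fileName: file.fileName, sourceSubject: item.json.subject },
      binary: { data: file }, // Drive node reads the "data" binary property
    });
  }
}
return out;
```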
by Grant Warfield
This workflow auto-generates and posts a tweet once per day using real-time insights from the web. It uses Perplexity to fetch trending topics, OpenAI to summarize them into a tweet, and the Twitter API to publish.

## ⚙️ Setup steps

1. Set your Perplexity API key in the HTTP Request node.
2. Add your OpenAI API key to the Message Model node.
3. Authenticate your Twitter API credentials in the second HTTP Request node.
4. Modify the Schedule Trigger to run daily at your preferred time.

All logic is pre-configured — simply plug in your credentials and you're live.
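For context, the publish step hits the Twitter/X API v2 "create tweet" endpoint. A minimal sketch, assuming an OAuth 2.0 user-context access token with the tweet.write scope (app-only bearer tokens cannot post):

```js
// Sketch of the final publish step: POST /2/tweets on the X API v2.
// Assumes a user-context OAuth 2.0 token with tweet.write scope.
const response = await fetch('https://api.twitter.com/2/tweets', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.X_USER_ACCESS_TOKEN}`,
  },
  body: JSON.stringify({ text: 'Your AI-summarized trending insight here' }),
});
console.log(await response.json()); // contains the new tweet's id on success
```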
by Chris Rudy
## Who's it for

Marketing teams, copywriters, and agencies who need to quickly generate and iterate on ad copy for Meta and TikTok campaigns. Perfect for brands that want AI-powered copy generation with human review and approval built into the workflow.

## What it does

This workflow automates the ad copy creation process by:

- Collecting brand and product information through a form
- Using AI to generate tailored ad copy based on brand type (Fashion or Problem-Solution)
- Sending copy to Slack for team review and approval
- Handling revision requests with feedback incorporation
- Limiting revisions to 3 rounds to maintain efficiency

## How to set up

1. Configure your OpenAI credentials in the OpenAI nodes
2. Set up Slack integration and select your review channel in all Slack nodes
3. Customize the AI prompts in the OpenAI nodes to match your brand voice
4. Test the form to ensure file uploads and data collection work properly
5. Activate the workflow when ready

## Requirements

- OpenAI API access (GPT-3.5 or GPT-4)
- Slack workspace with appropriate channel permissions
- Self-hosted n8n instance (for file upload functionality)

## How to customize

- Adjust the AI prompts in OpenAI nodes to match your specific industry or brand guidelines
- Modify the revision limit in the "Edit Fields: Revision Counter Max 3" node
- Add additional brand types in the form dropdown and corresponding AI nodes
- Customize Slack messages to match your team's communication style
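The 3-round cap is just a counter check. A minimal sketch of the guard logic, assuming a revisionCount field carried on the item; the actual field name set by the "Edit Fields: Revision Counter Max 3" node may differ:

```js
// Sketch of the revision-cap logic. json.revisionCount is a
// hypothetical field name; match it to what the "Edit Fields" node sets.
const MAX_REVISIONS = 3;
return $input.all().map((item) => {
  const count = (item.json.revisionCount ?? 0) + 1; // this revision round
  return {
    json: {
      ...item.json,
      revisionCount: count,
      // An If node routes on this: true sends copy back for another pass
      canReviseAgain: count < MAX_REVISIONS,
    },
  };
});
```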
by Automate With Marc
## 🎥 Auto-Caption Videos for Instagram with Google Drive + Submagic

### Description

Save hours on video editing with this workflow! Whenever you upload a video to a specific Google Drive folder, it's automatically sent to Submagic to generate engaging captions (using your chosen template). Once the captioned video is ready, it's pulled back, downloaded, and uploaded into your Google Drive — fully captioned and Instagram-ready.

Watch build-along videos for workflows like these on: www.youtube.com/@automatewithmarc

### How It Works

1. **Google Drive Trigger** – Listens for new video uploads in your chosen folder.
2. **Post to Submagic** – Sends the video URL to Submagic's API with your caption style (e.g., Hormozi).
3. **Wait Loop + Status Check** – Polls Submagic until the captioning job is complete.
4. **Download Captioned Video** – Retrieves the finished captioned video file.
5. **Upload to Google Drive** – Saves the captioned version back into Drive, ready for Instagram posting.

### Why You'll Love It

- 🎯 Zero manual steps — captioning happens automatically.
- ⚡ Faster IG content pipeline — ready-to-post reels in minutes.
- 🎨 Consistent style — apply your favorite Submagic caption templates every time.
- ☁️ Cloud-first — works entirely with Google Drive + Submagic, no local processing needed.

### Requirements

- Google Drive account (with OAuth credentials)
- Submagic API key
- n8n (Cloud or self-hosted)
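Submagic needs a downloadable URL for the video. If your Drive files are shared as "anyone with the link", a common pattern is to build a direct-download URL from the file ID exposed by the Drive Trigger. A sketch, assuming public sharing is enabled:

```js
// Code node (Run Once for Each Item): build a direct-download URL from
// the Drive Trigger's file ID. Works only for files shared as "anyone
// with the link"; private files need an authenticated download instead.
const fileId = $json.id; // file ID field from the Google Drive Trigger
const videoUrl = `https://drive.google.com/uc?export=download&id=${fileId}`;
return { json: { ...$json, videoUrl } };
```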
by Adrian Kendall
## Key Features

- Implements a simple round-robin distribution mechanism using a Data Table to track the last route used.
- Supports multiple downstream workflows or resources, balancing workload across them sequentially.
- Uses Switch and Code nodes for flexible routing logic.
- Designed for easy customization — replace placeholder "Route" nodes with sub-workflow calls or API triggers.
- Works with any trigger type, and includes merge logic to preserve input data.

## Nodes in Use

| Node Name | Type | Purpose |
|-----------|------|---------|
| When clicking ‘Execute workflow’ | Manual Trigger | Test entry point for manual execution. |
| Calculate the next route to use | Data Table | Retrieves the last used route number. |
| Code in JavaScript | Code | Increments the route counter (0–3 cycle). |
| Update last_used in the datatable | Data Table | Updates the "Last_Used" field to track the next route. |
| Round Robin Router | Switch | Routes workflow execution to the correct path based on the Last_Used value. |
| Route 1 / Route 2 / Route 3 | NoOp | Placeholder routes — replace with your own workflows. |
| Merge trigger data to pass to subworkflow if needed | Merge | Combines trigger data with routing data for sub-workflows. |
| Sticky Notes | Annotations | Explain workflow logic and intended replacements. |

## How It Works

1. The workflow starts when triggered manually (or by another workflow).
2. The Data Table node fetches the current value of Last_Used, which identifies which route was last used.
3. The Code node increments that value, resetting to 0 after 3, creating a round-robin cycle.
4. The Data Table update node stores the new Last_Used value.
5. The Switch node reads Last_Used and routes execution to the correct downstream branch.
6. Each route can represent a duplicated workflow, resource, or API endpoint.
7. Optionally, the Merge node reattaches trigger data before sending it to sub-workflows.

## Step-by-Step

1. Trigger the workflow manually or via webhook, cron, etc.
2. Retrieve the current route index using the Data Table node.
3. Increment the route counter with the JavaScript Code node (see the sketch at the end of this section): if Last_Used = 3, it resets to 0; otherwise, it increments by 1.
4. Update the Data Table with the new Last_Used value.
5. Route execution using the Switch node based on that value.
6. Send data to the corresponding sub-workflow (Route 1, Route 2, Route 3).
7. Replace the NoOp nodes with your target workflow or HTTP call nodes for real routing.

## Use Cases

- Distribute load across multiple API endpoints to prevent throttling.
- Run identical sub-workflows on different worker instances for parallel processing.
- Simulate load balancing during testing of n8n workflows.
- Sequentially alternate between external systems or servers handling similar tasks.
- Act as a proof of concept for balancing strategies before scaling up.
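The counter increment described above fits in a few lines of Code-node JavaScript. A minimal sketch, assuming the Data Table lookup returns the previous value as json.Last_Used:

```js
// Code node (Run Once for Each Item): advance the round-robin counter.
// Assumes the Data Table lookup returned json.Last_Used; cycles
// 0 → 1 → 2 → 3 → 0, matching "resets to 0 after 3" above.
const last = Number($json.Last_Used ?? 0);
return { json: { ...$json, Last_Used: (last + 1) % 4 } };
```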
by Wayne Simpson
This template is a practical introduction to n8n Webhooks with built-in examples for all major HTTP methods and authentication types. It is designed as a learning resource to help you understand how webhooks work in n8n, how to connect them to a data store, and how to secure them properly.

## What's included

- Webhook nodes for GET, POST, PUT, PATCH, DELETE, and HEAD
- Demonstrations of Basic Auth, Header Auth, and JWT Auth
- Supabase integration for creating, retrieving, updating, and deleting rows
- Example response handling with Respond to Webhook nodes
- Sticky notes explaining each method, response type, and security option

## Use this template to

- Learn how to configure and test webhooks in n8n
- Explore different authentication strategies
- Connect webhooks to a simple Supabase table
- Understand best practices for securing webhook endpoints

This workflow is intended as an educational starting point. It shows you how to receive requests, map data, and return responses securely. For production use, adapt the structure, apply your own security policies, and extend the logic as needed.

Check out the YouTube video here: https://www.youtube.com/watch?v=o6F36xsiuBk
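To exercise one of the secured endpoints, send a test request from any HTTP client. A minimal sketch, assuming a POST webhook configured with Header Auth; the instance URL, path, and header name below are placeholders you should match to your Webhook node's configuration:

```js
// Test call against a Header Auth webhook (sketch). The URL, the
// "my-webhook" path, and the X-Auth-Token header name are placeholders;
// match them to your Webhook node and its Header Auth credential.
const response = await fetch(
  'https://your-n8n-instance.example.com/webhook/my-webhook',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Auth-Token': 'your-secret-value', // must match the credential
    },
    body: JSON.stringify({ name: 'Ada', role: 'tester' }),
  },
);
console.log(response.status, await response.json()); // from Respond to Webhook
```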