by Reinhard Schmidbauer
## Overview

This template automatically exports Meta (Facebook) Ads campaign performance into Google Sheets — both daily and for historical backfills. It's ideal for performance marketers, agencies, and analytics teams who want a reliable data pipeline from Meta Ads into their reporting stack.

## What this workflow does

- Runs a daily cron job to pull yesterday's campaign-level performance from the Meta Ads Insights API.
- Flattens the API response and calculates key KPIs like CPL, CPA, ROAS, CTR, CPC, CPM, frequency and more.
- Appends one row per campaign per day to a Google Sheet (for dashboards and further analysis).
- Provides a separate Manual Backfill section to import historical data using a `time_range` parameter (e.g. the last 12–24 months).

## Use cases

- Build Looker Studio / Power BI dashboards on top of a clean, daily Meta Ads dataset.
- Track ROAS, CPL, CPA, CTR, and frequency trends over time.
- Combine campaign data with CRM or ecommerce data in the same spreadsheet.
- Quickly backfill past performance when onboarding a new Meta Ads account.

## How it works

### Daily Incremental Flow

1. A Schedule Trigger runs every day at 05:00.
2. The **Set config** node defines the ad account, date preset (`yesterday`), and Google Sheet details.
3. The **Meta Insights** node calls the Facebook Graph insights edge at `level=campaign`.
4. The **Code** node flattens the data and derives CPL, CPA, ROAS, and other KPIs.
5. The **Google Sheets** node appends the rows to your `Meta_Daily_Data` sheet.

### Manual Backfill Flow

1. A Manual Trigger lets you run the flow on demand.
2. The **Set backfill config** node defines `backfillSince` and `backfillUntil`.
3. The **Meta Insights (time_range)** node fetches performance for that historical range.
4. The same transform logic is applied, and rows are appended to the same sheet.

## Prerequisites

- A Meta Business account with a system user and a long-lived access token with `ads_read` / `read_insights` permissions.
- A Google Sheet with a header row that matches the mapped column names.
- n8n credentials for: Facebook Graph API, Google Sheets OAuth2.

## Setup steps

1. Import this template into your n8n instance.
2. Open the **Set config** and **Set backfill config** nodes:
   - Set your `adAccountId` (e.g. `act_123456789012345`).
   - Set your `sheetId` (Google Sheet ID) and sheet name (e.g. `Meta_Daily_Data`).
3. Configure your Facebook Graph API and Google Sheets credentials in n8n.
4. (Optional) Run the Manual Backfill section for your desired historical ranges (e.g. per quarter).
5. Enable the workflow so the Daily Incremental section runs automatically.

## Customization

- Change `level` from `campaign` to `adset` or `ad` if you need more granular reporting.
- Add breakdowns (e.g. `publisher_platform`, `platform_position`) to split by platform and placement.
- Extend the transform code with additional KPIs or dimensions that match your reporting needs.
- Use a separate sheet for raw data and build dashboards on top of a cleaned or pivoted view.

## Consulting & support

If you need help with:

- **E-Commerce Strategy & Development** (Shopify, Shopware 6, Magento 2, SAP Commerce Cloud, etc.)
- **Growth & Performance Marketing** (Google / Meta / Microsoft Ads, etc.)
- **Data & Analytics Setups** (tracking, dashboards, attribution, GDPR, etc.)

please reach out to Serendipity Technologies: 👉 https://www.serendipity.at

We can help you turn this workflow into a full analytics stack and reporting system tailored to your business.
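To illustrate the KPI derivation performed by the Code node in "How it works", here is a minimal sketch. The input field names (`leads`, `purchases`, `purchase_value`) are assumptions for illustration — the real Meta Insights API nests conversions in an `actions` array, and the template's actual mapping may differ:

```javascript
// Sketch of the Code node's transform step (hypothetical flattened fields).
// Derives CPL, CPA, ROAS, CTR, CPC, and CPM from raw Meta Insights numbers.
function deriveKpis(row) {
  const spend = Number(row.spend) || 0;
  const impressions = Number(row.impressions) || 0;
  const clicks = Number(row.clicks) || 0;
  const leads = Number(row.leads) || 0;            // assumed field
  const purchases = Number(row.purchases) || 0;    // assumed field
  const revenue = Number(row.purchase_value) || 0; // assumed field

  // Guard against division by zero so empty campaigns produce 0, not NaN.
  const safeDiv = (a, b) => (b > 0 ? a / b : 0);

  return {
    ...row,
    cpl: safeDiv(spend, leads),
    cpa: safeDiv(spend, purchases),
    roas: safeDiv(revenue, spend),
    ctr: safeDiv(clicks, impressions) * 100, // percent
    cpc: safeDiv(spend, clicks),
    cpm: safeDiv(spend, impressions) * 1000,
  };
}
```

In an n8n Code node, this would typically be applied to each item of the flattened Insights response before the Google Sheets append step.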
by Tristan V
# Quickstart Guide: Facebook Messenger Chatbot with Pinecone RAG

Step-by-step instructions to get this workflow running in n8n.

## Prerequisites

- Self-hosted n8n instance (v1.113.0+ recommended)
- Community nodes enabled in n8n
- Facebook Page (you must be an admin)
- OpenAI account
- Pinecone account (free Starter plan works)

## Workflow Architecture

This workflow uses two webhooks with the same URL path but different HTTP methods:

| Webhook | Method | Purpose |
|---------|--------|---------|
| Facebook Verification Webhook | GET | Handles Facebook's webhook verification |
| Facebook Message Webhook | POST | Receives incoming messages |

Both webhooks share the same URL: `https://your-n8n.com/webhook/facebook-messenger-webhook`

n8n automatically routes requests based on HTTP method:

- **GET** requests → Verification flow
- **POST** requests → Message processing flow

### RAG Enhancement

```
User Message → Batching → AI Agent ─┬─ OpenAI Chat Model
                                    ├─ Conversation Memory
                                    └─ Pinecone Assistant Tool (RAG)
                                              │
                                              ▼
                                      Search your documents
                                              │
                                              ▼
                                      Answer with citations
```

## Step 1: Install the Pinecone Assistant Community Node

The Pinecone Assistant node is a community node that must be installed separately.

1. In n8n, go to **Settings → Community Nodes**
2. Click **Install a community node**
3. Enter: `@pinecone-database/n8n-nodes-pinecone-assistant`
4. Click **Install**
5. Restart n8n if prompted

> Note: Community nodes must be enabled in your n8n instance. For Docker, set `N8N_COMMUNITY_PACKAGES_ALLOW_INSTALL=true`.

## Step 2: Create Pinecone Account & Assistant

### 2.1 Create Pinecone Account

1. Go to Pinecone
2. Sign up for a free account (Starter plan includes 100 files per assistant)
3. Complete the onboarding

### 2.2 Create an Assistant

1. In the Pinecone console, go to **Assistants**
2. Click **Create Assistant**
3. Name it `n8n-assistant` (or choose your own name)
4. Select your preferred region
5. Click **Create**

### 2.3 Upload Your Documents

1. Click on your newly created assistant
2. Go to the **Files** tab
3. Click **Upload Files**
4. Upload your documents (PDFs, text files, etc.)
5. Wait for processing to complete

### 2.4 Get Your Pinecone API Key

1. In the Pinecone console, click on your profile/account
2. Go to **API Keys**
3. Copy your API key (or create a new one)

## Step 3: Get Your OpenAI API Key

1. Go to OpenAI Platform
2. Sign in with your OpenAI account
3. Click **Create new secret key**
4. Copy and save the API key

## Step 4: Create Facebook App & Get Page Access Token

### 4.1 Create Facebook App

1. Go to Facebook Developers
2. Click **My Apps → Create App**
3. Select **Other** → **Next**
4. Select **Business** → **Next**
5. Enter the app name and contact email
6. Click **Create App**

### 4.2 Add Messenger Product

1. In your app dashboard, scroll to **Add products to your app**
2. Find **Messenger** and click **Set up**

### 4.3 Connect Your Facebook Page

1. In Messenger settings, find the **Access Tokens** section
2. Click **Add or Remove Pages**
3. Select your Facebook Page and grant permissions
4. Click **Done**

### 4.4 Generate Page Access Token

1. Back in Messenger settings, find your page in the list
2. Click **Generate Token**
3. Copy and save the token

## Step 5: Create Your Verify Token

The verify token is a secret string used for Facebook webhook verification.
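Conceptually, when you register the webhook (Step 10), Facebook sends a GET request carrying `hub.mode`, `hub.verify_token`, and `hub.challenge`, and the verification flow must echo the challenge back only if the token matches. A sketch (the token value is a hypothetical example):

```javascript
// Sketch of Facebook's webhook verification handshake (the GET flow).
// Facebook expects hub.challenge echoed back as plain text on success.
const VERIFY_TOKEN = "my-secret-token-12345"; // hypothetical value from Step 5

function handleVerification(query) {
  if (
    query["hub.mode"] === "subscribe" &&
    query["hub.verify_token"] === VERIFY_TOKEN
  ) {
    return { status: 200, body: query["hub.challenge"] }; // echo the challenge
  }
  return { status: 403, body: "Forbidden" }; // token mismatch → reject
}
```

In this workflow the same comparison is done by the "Is Token Valid?" node (configured in Step 7.1).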
1. Create a random string (e.g., `my-secret-token-12345`)
2. Save this value — you'll need it in Steps 7 and 10

## Step 6: Create n8n Credentials

### 6.1 Pinecone Credential

1. In n8n, go to **Credentials → Add Credential**
2. Search for "Pinecone" (or "Pinecone API")
3. Configure:
   - Name: `Pinecone API`
   - API Key: Paste your Pinecone API key from Step 2.4
4. Click **Save**

### 6.2 OpenAI API Credential

1. In n8n, go to **Credentials → Add Credential**
2. Search for "OpenAI API"
3. Configure:
   - Name: `OpenAI API`
   - API Key: Paste your OpenAI API key from Step 3
4. Click **Save**

### 6.3 Facebook Graph API Credential

1. In n8n, go to **Credentials → Add Credential**
2. Search for "Facebook Graph API"
3. Configure:
   - Name: `Facebook Page Access Token`
   - Access Token: Paste your Page Access Token from Step 4.4
4. Click **Save**

## Step 7: Import the Workflow

1. In n8n, click **Add Workflow → Import from File**
2. Select the `workflow.json` file from this folder
3. The workflow will open in the editor

### 7.1 Configure the Verify Token

1. Find the "Is Token Valid?" node
2. Click on the node to open its settings
3. In the conditions, find **Value 2** showing `YOUR_VERIFY_TOKEN_HERE`
4. Replace it with your verify token from Step 5

### 7.2 Configure the Pinecone Assistant Name

1. Find the "Get context snippets in Pinecone Assistant" node
2. Click on the node to open its settings
3. Change **Assistant Name** from `n8n-assistant` to your actual assistant name (from Step 2.2)

## Step 8: Connect Credentials to Nodes

### 8.1 Connect Facebook Credential

Update these 3 nodes with your Facebook credential:

1. Click on "Send Seen Indicator" → select your Facebook Page Access Token credential
2. Click on "Send Typing Indicator" → select your Facebook Page Access Token credential
3. Click on "Send Response to User" → select your Facebook Page Access Token credential

### 8.2 Connect OpenAI Credential

Click on "OpenAI Chat Model" → select your OpenAI API credential

### 8.3 Connect Pinecone Credential

Click on "Get context snippets in Pinecone Assistant" → select your Pinecone API credential

## Step 9: Publish the Workflow

1. Click **Save** to save the workflow
2. Click the **Publish** button to make the workflow live
3. Copy the webhook URL (e.g., `https://your-n8n.com/webhook/facebook-messenger-webhook`)

## Step 10: Configure Facebook Webhook

1. Go to Facebook Developers → Your App → Messenger Settings
2. Find the **Webhooks** section
3. Click **Add Callback URL**
4. Enter:
   - Callback URL: your n8n webhook URL from Step 9
   - Verify Token: the same value from Step 5
5. Click **Verify and Save**
6. After verification, subscribe to webhook fields:
   - `messages` (required)
   - `messaging_postbacks` (recommended)

## Step 11: Test Your Chatbot

### 11.1 Add Test Users (if needed)

With Standard Access, only users with app roles can message the bot:

1. Go to your Facebook App → **App Roles → Roles**
2. Add users as **Testers**
3. Those users must accept the invitation

### 11.2 Send a Test Message

1. Open Facebook Messenger
2. Search for your Facebook Page
3. Try these test messages:
   - "Hello!" — should get a friendly greeting
   - "What information do you have?" — should search your documents
   - "Tell me about [topic in your docs]" — should return relevant information with context

## How Pinecone RAG Works

```
User asks: "What are your return policies?"
          │
          ▼
┌─────────────────────────┐
│ AI Agent receives msg   │
└─────────┬───────────────┘
          │
          ▼
┌─────────────────────────┐
│ Calls Pinecone Tool     │ → Searches your documents
└─────────┬───────────────┘
          │
          ▼
┌─────────────────────────┐
│ Gets relevant snippets  │ ← "Return Policy.pdf: Items can be..."
└─────────┬───────────────┘
          │
          ▼
┌─────────────────────────┐
│ AI formulates answer    │ → "According to our policy, items can be..."
└─────────────────────────┘
```

## Troubleshooting

| Problem | Solution |
|---------|----------|
| "Pinecone Assistant Tool" not found | Ensure the community node is installed (Step 1) |
| "No relevant information found" | Upload more documents to your Pinecone Assistant |
| Webhook verification fails | Check that the verify token matches in n8n and Facebook |
| No response from bot | Check n8n execution logs for errors |
| "Error validating access token" | Regenerate the Page Access Token in Facebook |
| AI Agent not using Pinecone tool | After import, open the "AI Agent1" node, make a small edit to the system message (e.g., add a space), and save. This re-initializes the tool bindings. |

## Customization

### Change the AI Behavior

Edit the "AI Agent1" node's system message to:

- Adjust how it cites sources
- Change personality/tone
- Add specific instructions for your use case

### Change the Assistant

Update the "Get context snippets in Pinecone Assistant" node to use a different assistant name.

### Adjust Response Length

The workflow truncates responses to 1900 characters for Messenger. Edit the "Format Response" node to change this.

## Resources

- Pinecone Assistant Documentation
- Pinecone Assistant n8n Node (GitHub)
- Facebook Messenger Platform
- OpenAI API Documentation
- n8n Documentation
by WeblineIndia
# Webhook from Payment Provider → Jira Finance Ticket → Slack Invoice Follow-up Automation

This workflow automates failed subscription renewal processing by validating webhook data, using AI to analyze urgency and churn risk, creating a Jira Finance Task, and notifying the finance team via Slack. If required fields are missing, it sends an error alert for manual review instead.

## ⚡ Quick Implementation Steps (Start Using in 60 Seconds)

1. Import the workflow JSON into n8n.
2. Add Jira & Slack credentials.
3. Configure the webhook URL `/payment-failed-renewal` in your payment provider.
4. Test with:

```json
{
  "customerId": "C-101",
  "customerEmail": "user@example.com",
  "subscriptionId": "S-500",
  "amount": 39.99
}
```

5. Activate the workflow.

## What It Does

This automation connects your payment system with your financial operations. When a subscription renewal fails, the payment provider sends a webhook. The workflow validates the fields, uses OpenAI to analyze the payment failure reason (determining urgency and churn risk), routes high-value failures to high priority, creates a Jira task with an AI-drafted recovery email, and alerts the finance team on Slack.

If required data is missing, the workflow prevents incomplete Jira tickets by routing the event to an error handler and sending a detailed Slack alert listing all missing fields and the full payload for manual inspection.

## Who's It For

- Finance & billing departments
- SaaS companies with recurring billing
- Teams using Jira for billing operations
- Slack-based financial support teams
- Companies wanting automated revenue recovery workflows

## Requirements to Use This Workflow

- n8n instance
- OpenAI API key (or compatible LLM credential)
- Jira Software account with permissions for the FIN project
- Slack bot token with channel posting rights
- Payment provider that supports POST JSON webhooks
- Webhook configured to: `https://YOUR-N8N-URL/webhook/payment-failed-renewal`

## How It Works & How To Set Up

### Step-by-Step Flow

1. Webhook receives the payment failure payload.
2. Validation node checks the required fields: `customerId`, `customerEmail`, `subscriptionId`, `amount`.
3. AI Analysis: OpenAI analyzes the failure reason, sets urgency, and drafts an email.
4. Logic: a Switch node routes high-value failures (>$500) to 'High' priority.
5. Jira Finance Task is created (with the AI draft).
6. Slack message is sent (with the churn risk score).

### Setup Steps

**Step 1 — Webhook Setup**
- Method: POST
- Path: `payment-failed-renewal`

**Step 2 — Jira Setup**
- Select Jira credentials in the Create Jira Finance Ticket node.
- Ensure: Project: `FIN`, Issue type: `Task`

**Step 3 — Slack Setup**
- Add Slack credentials to both Slack nodes.
- Select the finance alert channel.

**Step 4 — OpenAI Setup**
- Add OpenAI credentials in the AI Analysis node.

**Step 5 — Test**

```json
{
  "customerId": "CUST-001",
  "customerEmail": "billing@example.com",
  "subscriptionId": "SUB-1001",
  "amount": 19.99
}
```

**Step 6 — Activate**
- Enable the workflow.

## How To Customize Nodes

**Webhook**
- Add Basic Auth
- Add token-based security
- Add JSON schema validation

**Validate Payload** — enhance with:
- Email format validation
- Numeric validation for `amount`
- Auto-fallback values

**Jira Node** — customize:
- Ticket summary structure
- Labels (`billing-recovery`, `urgent`, etc.)
- Custom fields
- Issue type or project

**Slack Nodes** — enhance:
- Mentions: `@finance-team`
- Threads instead of channel posts
- Rich blocks, buttons, or attachments

## Add-ons (Optional Enhancements)

- Automated email to the customer for payment recovery
- Retry count–based escalation (e.g., retry ≥ 3 → escalate to manager)
- Log data to Airtable / Google Sheets
- Sync events into a CRM (HubSpot, Salesforce, Zoho)
- Notify Sales for high-value customer failures

## Use Case Examples

- Stripe renewal payment fails → Create Jira task → Slack finance alert.
- Chargebee retry attempts exhausted → Notify billing team immediately.
- Declined credit card → Jira ticket with failure reason.
- Razorpay/PayPal renewal failure → Automated follow-up.
- Webhook missing data → Slack error alert ensures nothing is silently ignored.
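The required-field check described in the step-by-step flow above can be sketched in a few lines. This is a minimal illustration of the validation logic, not the template's exact node configuration:

```javascript
// Sketch of the "Validate Payload" step: collect any required fields that
// are absent or empty so the error handler can report them all at once.
const REQUIRED = ["customerId", "customerEmail", "subscriptionId", "amount"];

function validatePayload(body) {
  const missing = REQUIRED.filter(
    (f) => body[f] === undefined || body[f] === null || body[f] === ""
  );
  return { valid: missing.length === 0, missing };
}
```

When `valid` is false, the workflow routes to the error branch and the Slack alert lists the `missing` array alongside the full payload.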
## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Webhook not triggering | Wrong URL / method | Use POST + the correct endpoint |
| Jira ticket missing | No permissions or invalid payload | Check Jira permissions + required fields |
| Slack shows undefined values | Missing fields in payload | Confirm the payload structure |
| Error alert triggered incorrectly | Field names mismatch | Match exact names: `customerId`, `customerEmail`, `subscriptionId`, `amount` |
| Payment provider not sending events | Firewall/CDN blocking | Whitelist the n8n webhook URL |
| Workflow silent | Not activated | Turn the workflow ON |

## Need Help?

If you want help customizing this workflow or extending it into a complete revenue recovery automation suite, WeblineIndia can support you with:

- Jira & Slack automation pipelines
- Payment provider webhook integrations
- Finance workflow optimization
- AI-based billing insights
- End-to-end automation solutions

Reach out anytime for expert implementation or enhancements.
by Meak
# Firecrawl Web Search Agent → Google Sheets Logger with OpenRouter + n8n

Most teams craft search operators by hand and copy results into spreadsheets. This workflow automates query generation, multi-operator searches, scraping, and logging — all from a single webhook call.

## Benefits

- Auto-generate Firecrawl queries from natural language (OpenRouter Agent)
- Use pro operators: `site:`, `inurl:`, `intitle:`, exclusions, related
- Run parallel searches (site match, in-URL, exclusions, YouTube/intitle)
- Append titles/URLs/results to Google Sheets automatically
- Return results to the caller via webhook response
- Optional scraping of markdown + full-page screenshots

## How It Works

1. Webhook receives a natural-language search request
2. The OpenRouter-powered Agent converts it to a Firecrawl query (+ limit)
3. Firecrawl Search runs with `scrapeOptions` (markdown, screenshot)
4. Parallel queries: `site:`, `inurl:`, negative filters, YouTube `intitle:automation`
5. Collect results (title, url, data fields) from each call
6. Append rows to Google Sheets (one per result)
7. Respond to the webhook with the aggregated payload
8. Ready to chain into alerts, enrichment, or CRM sync

## Who Is This For

- Researchers and content teams building source lists
- Growth/SEO teams needing precise operator queries
- Agencies automating discovery, monitoring, and logging

## Setup

1. Connect OpenRouter (select your LLM; e.g., GPT-4.1-mini)
2. Add your Firecrawl API key and endpoint (`/v1/search`)
3. Connect Google Sheets (Document ID + Sheet/Tab)
4. Set the webhook path and allow POST from your app
5. Define the default limit (fallback = 5) and `scrapeOptions`

## ROI & Monetization

- Save 3–6 hours/week on manual searching & copy/paste
- Offer as a $500–$2k/month research automation for clients
- Upsell alerts (cron/webhook) and data enrichment for premium retainers

## Strategy Insights

In the full walkthrough, I show how to:

- Prompt the Agent to produce flawless `site:`/`inurl:`/`intitle:`/exclusion queries
- Map Firecrawl data fields cleanly into Sheets
- Handle rate limits, empty results, and retries
- Extend with dedupe, domain filtering, and Slack/Telegram alerts

## Check Out My Channel

For more advanced AI automation systems that generate real business results, check out my YouTube channel, where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
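To make step 3 of "How It Works" concrete, here is a sketch of assembling the `/v1/search` request body from the Agent's output. The `scrapeOptions` shape is an assumption based on Firecrawl's search API; verify the exact format names against Firecrawl's current documentation before relying on them:

```javascript
// Hypothetical Firecrawl /v1/search request body builder.
// `query` comes from the Agent (operators already applied), `limit` from the
// webhook payload; the fallback limit of 5 matches the Setup section above.
function buildSearchBody(query, limit) {
  return {
    query, // e.g. 'site:example.com inurl:blog -inurl:tag'
    limit: limit || 5, // default limit when the caller omits one
    scrapeOptions: {
      formats: ["markdown", "screenshot@fullPage"], // assumed format names
    },
  };
}
```

The resulting object is what the HTTP Request node would POST (with your Firecrawl API key in the Authorization header).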
by furuidoreandoro
# Automated TikTok Repurposing & Video Generation Workflow

## Who's it for

This workflow is designed for content creators, social media managers, and marketers — specifically those in the career, recruitment, or "job change" (転職/就職) niches. It is ideal for anyone looking to automate the process of finding trending short-form content concepts and converting them into fresh AI-generated videos.

## How it works / What it does

This workflow automates the pipeline from content research to video creation:

1. **Scrape Data**: Triggers an Apify actor (`clockworks/tiktok-scraper`) to search and scrape TikTok videos related to "Job Change" (転職) and "Employment" (就職).
2. **Store Raw Data**: Saves the scraped TikTok metadata (text, stats, author info) into a Google Sheet.
3. **AI Analysis & Prompting**: An AI Agent (via OpenRouter) analyzes the scraped video content and creates a detailed prompt for a new video (concept, visual cues, aspect ratio).
4. **Log Prompts**: The generated prompt is saved to a separate tab in the Google Sheet.
5. **Video Generation**: The prompt is sent to Fal AI (Veo3 model) to generate a new 8-second, vertical (9:16) video with audio.
6. **Wait & Retrieve**: The workflow waits for the generation to complete, then retrieves the video file.
7. **Cloud Storage**: Finally, it uploads the generated video file to a specific Google Drive folder.

## How to set up

1. **Credentials**: Configure the following credentials in n8n:
   - **Apify API**: Currently passed via URL query params in the workflow; switching to Header Auth is recommended.
   - **Google Sheets OAuth2**: Connect your Google account.
   - **OpenRouter API**: For the AI Agent.
   - **Fal AI (Header Auth)**: For the video generation API.
   - **Google Drive OAuth2**: For uploading the final video.
2. **Google Sheets**: Create a spreadsheet. Note the `documentId` and update the Google Sheets nodes. Ensure you have the necessary sheet names (e.g., "シート1" ("Sheet1") for raw data, "生成済み" ("Generated") for prompts) and columns mapped.
3. **Google Drive**: Create a destination folder. Update the Upload file node with the correct `folderId`.
4. **Apify**: Update the token in the HTTP Request and HTTP Request1 URLs with your own Apify API token.

## Requirements

- **n8n version**: 1.x or higher (the workflow uses version 4.3 nodes).
- **Apify account**: With access to `clockworks/tiktok-scraper` and sufficient credits.
- **Fal.ai account**: With credits for the `fal-ai/veo3` model.
- **OpenRouter account**: With credits for the selected LLM.
- **Google Workspace**: Access to Drive and Sheets.

## How to customize the workflow

- **Change the niche**: Update the `searchQueries` JSON body in the first **HTTP Request** node (e.g., change "転職" to "Cooking" or "Fitness").
- **Adjust AI logic**: Modify the **AI Agent** system prompt to change the style, tone, or structure of the video prompts it generates.
- **Video settings**: In the **Fal Submit** node, adjust `bodyParameters` to change the duration (e.g., 5s), aspect ratio (e.g., 16:9), or disable audio.
- **Scale**: Increase the amount in the **Limit** node to process more than one video per execution.
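As an illustration of the "Video settings" customization above, the **Fal Submit** body might look roughly like this. The parameter names are assumptions for illustration — check the Fal.ai docs for `fal-ai/veo3` for the authoritative schema:

```javascript
// Hypothetical bodyParameters for the Fal Submit node (fal-ai/veo3).
// Values mirror the workflow defaults described above: 8-second,
// vertical 9:16 video with audio.
function buildVeo3Body(prompt) {
  return {
    prompt, // generated by the AI Agent
    duration: "8s", // e.g. change to "5s" for shorter clips
    aspect_ratio: "9:16", // change to "16:9" for landscape
    generate_audio: true, // set to false to disable audio
  };
}
```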
by Rapiwa
# Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

## Who is this for?

This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It's perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don't use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

## What this Workflow Does

This n8n workflow connects to a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers, checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

## Key Features

- **Runs Every 5 Minutes**: Automatically triggers the workflow
- **Google Sheets Integration**: Reads and writes data from a specific sheet
- **Phone Number Validation**: Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message Sending**: Sends a message using Rapiwa's `/send-message` endpoint
- **Status Update**: The sheet is updated with a success or failure status
- **Safe API Usage**: Delays are added between requests to prevent rate limits
- **Batch Limit**: Processes a maximum of 60 rows per cycle
- **Conditional Checks**: Skips rows without a "check" value

## Requirements

- A Google Sheet with the necessary columns
- A **Rapiwa account** with an active subscription (you get 200 free messages)
- Your WhatsApp number connected to Rapiwa
- A valid Bearer Token
- An **n8n instance** (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

## How to Use: Step-by-Step Setup

1. **Webhook**: Receives the Shopify webhook (discount creation) via an HTTP POST request. This is triggered when a discount is created in your Shopify store.
2. **Configure Google Sheets in n8n**: Use the Google Sheets node with OAuth2 access.
3. **Get a Rapiwa API Token**:
   - Create an account on Rapiwa
   - Connect your WhatsApp number
   - Copy your Bearer Token from the Rapiwa dashboard
4. **Set Up HTTP Request Nodes**:
   - Validate numbers via: `https://app.rapiwa.com/api/verify-whatsapp`
   - Send messages via: `https://app.rapiwa.com/api/send-message`
   - Add your Bearer Token to the headers

## Google Sheet Column Structure

A Google Sheet formatted like this ➤ Sample

| discount_code | created_at | shop_domain | name | number | verify | status |
|---------------|------------|-------------|------|--------|--------|--------|
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified | sent |

## Support & Help

- **Rapiwa Website**: https://rapiwa.com
- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
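The number-cleaning step mentioned under "What this Workflow Does" can be as simple as stripping everything that isn't a digit, so numbers land in the bare international format the sheet sample above uses. A sketch:

```javascript
// Sketch of the phone-number cleaning step: remove spaces, "+", dashes,
// dots, and parentheses so "+880 1322-827798" becomes "8801322827798"
// before the Rapiwa verify-whatsapp call.
function cleanNumber(raw) {
  return String(raw).replace(/\D/g, ""); // keep digits only
}
```

Numbers that come out empty (or obviously too short) after cleaning can be routed straight to the "not sent" status instead of being verified.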
by Alex Berman
## Who is this for

This workflow is for B2B sales teams, growth hackers, and revenue operators who need a reliable, low-cost pipeline of verified leads from Apollo.io -- without manually exporting CSVs or hitting Apollo's export limits. If you are prospecting into a specific industry, job title, or company size, this workflow automates the entire sourcing and storage process.

## How it works

1. You configure your target audience once in the "Configure Search Parameters" node (job titles, industry, company size, lead count).
2. The workflow sends a scrape request to ScraperCity's Apollo filter endpoint, which pulls verified contacts at $0.0039 per contact.
3. Because scrapes run asynchronously and can take 10-60 minutes, the workflow polls ScraperCity every 60 seconds until the job completes.
4. Once the scrape succeeds, the workflow downloads the results, parses the CSV data, removes duplicates, and appends clean rows to your Google Sheet.

## How to set up

1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, create a "Header Auth" credential named "ScraperCity API Key" with the header name "Authorization" and the value "Bearer YOUR_KEY".
3. Connect your Google Sheets OAuth2 credential.
4. Set your Google Sheet document ID and sheet name in the "Save Leads to Google Sheets" node.
5. Edit the "Configure Search Parameters" node with your target filters.

## Requirements

- A ScraperCity account (scrapercity.com)
- A Google Sheets OAuth2 credential
- A Google Sheet with headers matching the contact fields

## How to customize the workflow

- Change job titles, industry, company size, and lead count in "Configure Search Parameters".
- Swap Google Sheets for HubSpot, Airtable, or a webhook to push leads directly into your CRM.
- Add a Slack notification node after the final write step to alert your team when new leads arrive.
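The 60-second polling described in "How it works" boils down to a wait-and-recheck loop. A sketch, where the status-fetching function and the status values ("succeeded", "failed") are assumptions for illustration rather than ScraperCity's documented API:

```javascript
// Sketch of the polling loop around ScraperCity's asynchronous scrape.
// `fetchStatus` stands in for the HTTP Request node that checks the job.
async function waitForScrape(jobId, fetchStatus, sleepMs = 60_000) {
  for (;;) {
    const job = await fetchStatus(jobId); // e.g. GET /jobs/{id} (assumed)
    if (job.status === "succeeded") return job; // ready to download results
    if (job.status === "failed") throw new Error("Scrape failed");
    await new Promise((r) => setTimeout(r, sleepMs)); // wait 60s and retry
  }
}
```

In n8n this is typically modelled with a Wait node plus an IF node looping back, rather than a literal `for` loop, but the control flow is the same.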
by WeblineIndia
# (Retail Automation) Transfer Inventory Updates Across Systems

This workflow automatically synchronizes inventory quantity updates between systems using a webhook-driven approach. When an inventory update is received, the workflow validates the source, prepares a clean payload, sends the update to a secondary system via an HTTP API, and logs the update into Google Sheets for tracking and auditing.

## Quick Implementation Steps

1. Import the workflow JSON into n8n.
2. Configure the Webhook URL in your source system.
3. Update the HTTP Request node with the secondary system's API endpoint.
4. Connect Google Sheets and select the target spreadsheet.
5. Activate the workflow.

## What It Does

This workflow listens for inventory update events sent from an external system such as an online store, POS, ERP, or warehouse platform. Once an update is received, the workflow normalizes the incoming data by extracting key fields like product ID, SKU, stock quantity, source system, and modification timestamp.

To avoid circular synchronization issues, the workflow validates the origin of the update and ensures that updates originating from the secondary system are not reprocessed. Valid inventory updates are then transformed into a clean, API-ready payload and sent to the secondary system using an HTTP Request node.

After the inventory update is successfully pushed to the secondary system, the workflow logs the inventory details into Google Sheets. This provides a simple audit trail for tracking inventory movements and troubleshooting sync issues.
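The loop-prevention check described above is a simple guard on the update's origin. A sketch — the `"secondary-system"` value is a placeholder that should match your own source naming (see the Check Sync Origin node):

```javascript
// Sketch of the "Check Sync Origin" guard: drop updates that the secondary
// system itself emitted, so an update pushed out never comes back around
// and triggers another push (a circular sync loop).
const SECONDARY_SOURCE = "secondary-system"; // placeholder value

function shouldSync(update) {
  return update.source !== SECONDARY_SOURCE;
}
```

Updates for which `shouldSync` returns false simply end the execution instead of reaching the HTTP Request node.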
## Who's It For

This workflow is suitable for:

- Retail businesses managing inventory across multiple systems
- Teams using WooCommerce, POS, ERP, or warehouse tools
- Operations teams requiring inventory audit logs
- Developers building middleware-based inventory synchronization
- Businesses aiming to reduce overselling and manual stock corrections

## Prerequisites

To use this workflow, you need:

- An active n8n instance (self-hosted or cloud)
- A source system capable of sending inventory updates via webhook
- SKU-based inventory management
- Access to a secondary system API endpoint
- A Google Sheets account with edit permissions
- A Google Sheet with predefined column headers

## How to Use & Setup

1. Import the workflow JSON into your n8n instance.
2. Copy the Webhook URL from the Inventory Webhook node.
3. Configure your source system to send inventory updates to this Webhook URL.
4. Review the Normalize Inventory Data node to ensure the required fields are mapped correctly.
5. Verify that the Check Sync Origin node matches your source system naming.
6. Update the Send Inventory To Secondary API node with the correct API endpoint.
7. Configure the Log Inventory Sync To Google Sheet node with your target spreadsheet.
8. Save and activate the workflow.

Once activated, the workflow runs automatically whenever an inventory update is received.

## How To Customize Nodes

- **Normalize Inventory Data**: Add or remove inventory-related fields as needed; adjust field names to match your source system payload.
- **Check Sync Origin**: Modify the source comparison value to prevent loops in your setup.
- **Prepare Inventory Payload**: Change the payload structure to match the secondary system's API requirements.
- **Send Inventory To Secondary API**: Add authentication headers or modify the HTTP method if required.
- **Google Sheets Logging**: Add additional columns such as execution ID or API response status.
## Add-ons (Optional Enhancements)

This workflow can be extended to:

- Add retry logic for failed API requests
- Log failed sync attempts into a separate Google Sheet
- Send Slack or email alerts on sync failures
- Perform scheduled inventory reconciliation between systems
- Support bi-directional inventory synchronization

## Use Case Examples

- Sync inventory changes from WooCommerce to an ERP system.
- Push POS stock deductions to a warehouse management system.
- Maintain a centralized inventory audit log in Google Sheets.
- Prevent overselling across multiple sales channels.
- Monitor and troubleshoot inventory sync issues efficiently.

Many more use cases are possible depending on business requirements.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Workflow not triggering | Webhook URL not configured | Verify the Webhook URL and HTTP method |
| Inventory not syncing | Source validation blocking flow | Check the source value in the payload |
| API request failing | Invalid endpoint or payload | Validate the API URL and request body |
| Google Sheet not updating | Incorrect sheet configuration | Verify sheet permissions and headers |
| Duplicate updates | Missing source control | Ensure the sync origin logic is correct |

## Need Help?

If you need assistance setting up, customizing, or extending this workflow, or want to build similar automation workflows tailored to your business, feel free to contact the n8n automation experts at WeblineIndia. Our team can help you design, optimize, and deploy robust n8n automation solutions.
by Muhammad Anas Farooq
# n8n Workflows GitHub Manager

> A comprehensive n8n workflow that provides complete bidirectional sync between your n8n instance and GitHub - it automatically backs up all your workflows with intelligent change detection AND restores them when needed.

This workflow combines two powerful features in one:

- **Backup**: Automatically detects new, edited, renamed, and deleted workflows in your n8n instance, then syncs them to GitHub with smart commit messages and an index tracking system.
- **Restore**: Easily restores all workflows from your GitHub repository back to n8n - perfect for disaster recovery, new instance setup, or environment cloning.

## How It Works

### 🔄 Backup Mode (Automatic)

1. **Trigger**: Runs automatically every day at 7 PM UTC (or manually via the Schedule Trigger).
2. **Get/Create Index**: Attempts to fetch `index.json` from your GitHub repository.
   - If found → downloads and parses it.
   - If not found → creates a new empty index file and waits 3 seconds for GitHub to process it.
3. **Fetch All Workflows**: Retrieves all workflows from your n8n instance via the n8n API.
4. **Smart Comparison**: The "C,E,D Checker" (Create, Edit, Delete) analyzes the differences:
   - CREATE → new workflow not in the index.
   - RENAME → workflow name changed (deletes the old file, creates a new one).
   - EDIT → existing workflow (flagged for content comparison).
   - DELETE → workflow removed from n8n but still in GitHub.
   - INDEX UPDATE → triggered if any changes are detected.
5. **Route Actions**: A Switch node routes each action to the appropriate branch:
   - **Create Branch** → creates new workflow files in GitHub.
   - **Edit Branch** → performs smart edit detection:
     - Fetches the current file from GitHub.
     - Compares the GitHub version vs. the n8n version (normalized JSON).
     - Only commits if the content actually changed (avoids timestamp-only updates).
   - **Delete Branch** → removes workflow files from GitHub.
   - **Update Index Branch** → updates `index.json` with the latest mappings.
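The C,E,D classification above can be sketched as a diff between the live workflow list and the index mapping. The shapes (an `id → {name}` index, workflows with `id`/`name`) are assumptions for illustration:

```javascript
// Sketch of the "C,E,D Checker": compare the workflows currently in n8n
// against the index.json mapping previously stored in GitHub.
function classify(n8nWorkflows, index) {
  const actions = [];
  const seen = new Set();

  for (const wf of n8nWorkflows) {
    seen.add(wf.id);
    const indexed = index[wf.id];
    if (!indexed) {
      actions.push({ id: wf.id, action: "CREATE" });
    } else if (indexed.name !== wf.name) {
      actions.push({ id: wf.id, action: "RENAME" });
    } else {
      actions.push({ id: wf.id, action: "EDIT" }); // content compared later
    }
  }

  // Anything still in the index but no longer in n8n was deleted.
  for (const id of Object.keys(index)) {
    if (!seen.has(id)) actions.push({ id, action: "DELETE" });
  }
  return actions;
}
```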
Commit Messages: Auto-generated in the format [Workflow Name] (Action) YYYY-MM-DD.

⬇️ Restore Mode (Manual)
1. Trigger: Manually execute via the "When clicking 'Execute workflow'" manual trigger.
2. Set GitHub Details: Configure your repository owner and name.
3. List Workflow Files: Fetches all workflow JSON files from the workflows/ folder in your GitHub repository. If the folder is not found, the workflow stops gracefully (make sure the backup has run at least once first).
4. Loop Through Files: Sequentially processes each workflow file: downloads the JSON content from GitHub, then creates the workflow in your n8n instance via the n8n API.
5. Sequential Processing: Handles one workflow at a time to prevent conflicts and respect rate limits.
6. Result: All workflows from GitHub are restored to your n8n instance.

Requirements
- **GitHub OAuth2 Credentials**: Go to GitHub Developer Settings → OAuth Apps → New OAuth App. Set the Authorization callback URL to your n8n instance URL (e.g., https://your-n8n.com/rest/oauth2-credential/callback). Copy the Client ID and Client Secret, then add them as an OAuth2 credential in n8n (Credentials → New → GitHub OAuth2).
- **GitHub Repository**: Create a new repository (public or private). Note your username (repo owner) and repository name.
- **n8n API Credentials**: In your n8n instance → Settings → API → Create new API key. Add it as an n8n API credential in the workflow.

How to Use

Initial Setup
1. Import the Workflow: Copy the provided JSON file. In your n8n instance, click Import Workflow, then paste or upload the JSON.
2. Create a GitHub Repository: Go to GitHub → Create a new repository (e.g., n8n-workflows-manager). Leave it empty (no README, no .gitignore).
3. Set Up GitHub OAuth2: In n8n → Credentials → New → GitHub OAuth2. Fill in the Client ID and Client Secret from your GitHub OAuth App, then click Connect my account and authorize.
4. Set Up n8n API Credentials: In n8n → Settings → API → Create new API key and copy it. In the workflow → Credentials → New → n8n API → paste the key.
Set the Base URL to your n8n instance (e.g., https://your-n8n.com).

5. Configure Repository Details: Find both "Set Github Data" nodes in the workflow (one for backup, one for restore) and edit the assignments in each:
   - repo_owner: Replace "your-github-username" with your GitHub username.
   - repo_name: Replace "your-github-repository-name" with your repository name.
6. Connect Credentials to Nodes: Open each GitHub node (there are 8 in total):
   - Backup section: Create Index File, Get Download Url for Index File, Create New Files, Update Index File, Get Download Url for Github File, Delete Files, Edit Files
   - Restore section: List Workflow Files
   Set the credential for GitHub OAuth2 to the one you created. Then open the n8n API nodes (Get All Workflows, Create Workflow) and set the credential for n8n API to the one you created.

Using Backup Mode
1. Test Backup: Click the "Schedule Trigger" node at the top of the workflow, then click "Test workflow". Monitor the execution: all nodes in the backup section should turn green. Check your GitHub repository: you should see index.json and a workflows/ folder containing your workflows.
2. Activate Auto Backup: Once tested successfully, toggle the workflow to Active. It will now run automatically every day at 7 PM UTC.

Using Restore Mode
1. Test Restore (only after you have backups in GitHub): Click the "When clicking 'Execute workflow'" manual trigger node at the bottom, then click "Test workflow". Monitor the execution: all nodes in the restore section should turn green. Check your n8n workflows list: all workflows from GitHub should now be present.
2. When to Use Restore: Setting up a new n8n instance, recovering after data loss, cloning workflows to another environment, or rolling back to a previous state (manually download older commits from GitHub first).

Important Notes
- **Smart Edit Detection**: Uses normalized JSON comparison to avoid unnecessary commits when only timestamps change.
- **Credentials**: Credential IDs are included in backups, but not the actual secrets. You must reconnect credentials after a restore.
- **Restored Workflows**: Created as new workflows with new IDs, in an **inactive** state by default.
- **File Structure**: index.json tracks all workflows; the workflows/ folder contains the individual workflow files.
- **Security**: Use a private repository if your workflows contain sensitive data. Credential secrets are never backed up.

Customization
- **Change Schedule**: Edit the "Schedule Trigger" node and modify triggerAtHour (default: 19 = 7 PM UTC).
- **File Path**: Modify filePath in the GitHub nodes to change the storage location.
- **Notifications**: Add email/notification nodes to get alerts on backup completion.
- **Selective Restore**: Add IF nodes to filter which workflows to restore.
- **Multiple Repos**: Duplicate the workflow for separate prod/dev backups.

Author: Muhammad Anas Farooq
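The commit-message format described in this section, [Workflow Name] (Action) YYYY-MM-DD, reduces to a one-line formatter. This is a minimal sketch of the convention, not the template's exact node code.

```javascript
// Minimal sketch of the auto-generated commit-message format:
// "[Workflow Name] (Action) YYYY-MM-DD".
function commitMessage(workflowName, action, date = new Date()) {
  const iso = date.toISOString().slice(0, 10); // keep only YYYY-MM-DD
  return `[${workflowName}] (${action}) ${iso}`;
}

// Example: an EDIT detected on 2024-03-05.
const msg = commitMessage("Lead Sync", "EDIT", new Date("2024-03-05T12:00:00Z"));
// msg === "[Lead Sync] (EDIT) 2024-03-05"
```

Keeping the action keyword (CREATE, EDIT, RENAME, DELETE) in every message makes the GitHub history scannable without opening diffs.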
by Evoort Solutions
🚀 Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

⚙️ Node-by-Node Workflow Breakdown

1. 🟢 On Form Submission
- **Trigger**: The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
- **Use Case**: When you want to track multiple websites and monitor their performance over time.

2. 🌐 Website Traffic Checker
- **API Request**: The workflow makes a POST request to the **SEMrush Website Traffic Checker API** via RapidAPI using the submitted website URL.
- **API Data**: The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

3. 🔄 Reformat
- **Parsing**: The raw API response is parsed to extract the relevant data under trafficSummary.
- **Data Structure**: The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

4. 📄 Google Sheets
- **Logging Data**: The traffic data is appended as a new row in your Google Sheet.
- **Google Sheet Setup**: The data is organized and updated in a structured format, allowing you to track website performance over time.

💡 Use Cases
- 📊 **SEO & Digital Marketing Agencies**: Automate client website audits by pulling live traffic data into reports.
- 🌐 **Website Owners & Bloggers**: Monitor traffic growth and analyze content performance automatically.
- 📈 **Data Analysts & Reporting Teams**: Feed traffic data into dashboards and integrate with other KPIs for deeper analysis.
- 🕵️ **Competitor Tracking**: Regularly log competitor site metrics for comparative benchmarking.

🎯 Key Benefits
✅ Automated Traffic Monitoring — Run reports on demand or on a scheduled basis.
✅ Real-Time Google Sheets Logging — Easily centralize and structure traffic data for sharing and visualization.
✅ Zero Code Required — Powered by n8n's visual builder; set up workflows quickly without writing a single line of code.
✅ Scalable & Flexible — Extend the workflow to include alerts, additional API integrations, or other automated tasks.

🔐 How to Get Your SEMrush API Key via RapidAPI
1. Visit the API listing: 👉 SEMrush Website Traffic Checker API
2. Sign in to RapidAPI or sign up for a free account.
3. Subscribe to the API: choose the appropriate pricing plan and click Subscribe.
4. Access your API key: go to the Endpoints tab; your key is shown under the X-RapidAPI-Key header.
5. Secure and use the key: add it to the request headers in your workflow, and never expose it publicly.

🔧 Step-by-Step Setup Instructions

1. Create the Form to Capture the URL
- In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
- Configure the webhook to accept URL submissions from your form.
- Add a form to your website or app that triggers the webhook when a URL is submitted.

2. Configure the SEMrush API Request Node
- Add an HTTP Request node after the webhook.
- Set the method to POST and the URL to the SEMrush API endpoint.
- Add the necessary headers:
  - X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com
  - X-RapidAPI-Key: [Your API Key]
- Pass the captured website URL from the webhook as a parameter in the request body.

3. Reformat the API Response
- Add a Set node to parse and structure the API response.
- Extract only the necessary data, such as trafficSummary.visits, trafficSummary.bounceRate, trafficSummary.pageViews, and trafficSummary.sessions.
- Format the response so it is clean and suitable for Google Sheets.

4. Store Data in Google Sheets
- Add the Google Sheets node to your workflow.
- Authenticate with your Google account.
- Select the spreadsheet and worksheet where you want to store the traffic data.
- Configure the node to append new rows with the extracted traffic data.

Google Sheets Columns Setup
- **A**: Website URL
- **B**: Visits
- **C**: Bounce Rate
- **D**: Page Views
- **E**: Sessions
- **F**: Date/Time (optional; you can use a timestamp)

5. Test and Deploy
- Run a test submission through your form to ensure the workflow works as expected.
- Check the Google Sheets document to verify that the data is being logged correctly.
- Set up scheduling or additional workflows as needed (e.g., periodic updates).

📈 Customizing the Template

You can modify the workflow to suit your specific needs:
- **Add more data points**: Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords, etc.).
- **Create separate sheets**: If you're tracking multiple websites, create a different sheet for each website or group websites by category.
- **Add alerts**: Set up email or Slack notifications when specific traffic conditions (like sudden drops) are met.
- **Visualize data**: Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

🚀 Start Automating in Minutes

Build your automated website traffic dashboard with n8n today — no coding required. 👉 Start with n8n for Free

Save time, improve accuracy, and supercharge your traffic insights workflow!
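The Reformat step described above can be sketched as a small transform that flattens trafficSummary into one sheet row. The response shape used here is an assumption based on the fields this template lists, not the exact RapidAPI schema.

```javascript
// Illustrative version of the "Reformat" step: pull the fields under
// trafficSummary out of the raw API response and shape one flat row
// matching the sheet columns (URL, Visits, Bounce Rate, Page Views,
// Sessions, Date/Time). The response shape is an assumption.
function toSheetRow(apiResponse, websiteUrl) {
  const t = (apiResponse && apiResponse.trafficSummary) || {};
  return {
    websiteUrl,
    visits: t.visits ?? null,
    bounceRate: t.bounceRate ?? null,
    pageViews: t.pageViews ?? null,
    sessions: t.sessions ?? null,
    loggedAt: new Date().toISOString(), // optional column F
  };
}

const sample = {
  trafficSummary: { visits: 1200, bounceRate: 0.43, pageViews: 5100, sessions: 1500 },
};
const row = toSheetRow(sample, "https://example.com");
```

Defaulting missing metrics to null keeps the sheet columns aligned even when the API omits a field.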
by Daniel
Secure your n8n automations with this comprehensive template that automates periodic backups to Telegram for instant access, while enabling flexible restores from Google Drive links or direct file uploads — ensuring quick recovery without data loss.

📋 What This Template Does

This dual-branch workflow handles full n8n instance backups and restores seamlessly. The backup arm runs every 3 days, fetching all workflows via the n8n API, aggregating them into a JSON array, converting the array to a text file, and sending it to Telegram for offsite storage and sharing. The restore arm supports two entry points: manual execution to pull a backup from Google Drive, or a form-based upload for local files. It then parses the JSON, cleans the workflows for compatibility, and loops to create missing workflows or update existing ones by name, handling batches efficiently to respect API limits.

- Scheduled backups with Telegram delivery for easy stakeholder access
- Dual restore paths: Drive download or direct file upload via form
- Intelligent create-or-update logic with data sanitization to avoid conflicts
- Looped processing with existence checks and error continuation

🔧 Prerequisites
- n8n instance with the API enabled (self-hosted or cloud)
- Telegram account for bot setup
- Google Drive account (optional, for Drive-based restores)

🔑 Required Credentials

n8n API Setup
1. In n8n, navigate to Settings → n8n API.
2. Enable the API and generate a new key.
3. Add it to n8n as an "n8n API" credential, pasting the key in the API Key field.

Telegram API Setup
1. Message @BotFather on Telegram to create a new bot and get your token.
2. Find your chat ID by messaging @userinfobot.
3. Add it to n8n as a "Telegram API" credential, entering the Bot Token.

Google Drive OAuth2 API Setup
1. In the Google Cloud Console, go to APIs & Services → Credentials.
2. Create an OAuth 2.0 Client ID for a Web application and enable the Drive API.
3. Add the redirect URI: [your-n8n-instance-url]/rest/oauth2-credential/callback.
4. Add it to n8n as a "Google Drive OAuth2 API" credential and authorize.

⚙️
Configuration Steps
1. Import the workflow JSON into your n8n instance.
2. Assign the n8n API, Telegram API, and Google Drive credentials to their nodes.
3. Update the Telegram chat ID in the "Send Backup to Telegram" node.
4. Set the Google Drive file ID in the "Download Backup from Drive" node (taken from the file URL).
5. Activate the workflow and test the backup by executing the Schedule node manually.
6. Test the restore: run the manual trigger for Drive, or use the form for upload.

🎯 Use Cases
- Dev teams backing up staging workflows to Telegram for rapid production restores during deployments
- Solo automators uploading local backups via form to sync across devices after n8n migrations
- Agencies sharing client workflow archives via Drive links for secure, collaborative restores
- Educational setups scheduling exports to Telegram for student template distribution and recovery

⚠️ Troubleshooting
- Backup file empty: Verify that the n8n API permissions include read access to workflows.
- Restore parse errors: Check the JSON validity of the backup file; adjust the Code node property reference if needed.
- API rate limits hit: Increase the Wait node duration or reduce the batch size in the Loop.
- Form upload fails: Ensure the file is valid JSON text; test with a small backup first.
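The restore-side parsing that the troubleshooting notes refer to can be sketched as follows. This is a hedged illustration, assuming the backup text file holds a JSON array of workflows and that items are emitted in the n8n `{ json: ... }` shape; the template's own Code node may differ.

```javascript
// Sketch of restore-side parsing: the Telegram/Drive backup is one
// text file containing a JSON array of workflows, which must be
// parsed and split into items before the create-or-update loop.
// The array-of-workflows format is an assumption.
function parseBackup(fileText) {
  let data;
  try {
    data = JSON.parse(fileText);
  } catch (err) {
    // Surfacing a clear error here is what the "restore parse errors"
    // troubleshooting entry is about.
    throw new Error(`Backup file is not valid JSON: ${err.message}`);
  }
  if (!Array.isArray(data)) {
    throw new Error("Expected a JSON array of workflows");
  }
  // One n8n item per workflow, ready for looped batch processing.
  return data.map((wf) => ({ json: wf }));
}

const items = parseBackup('[{"name":"A","nodes":[]},{"name":"B","nodes":[]}]');
```

Failing fast on a non-array payload makes it obvious when the wrong file was uploaded via the form.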
by Ms. Phuong Nguyen (phuongntn)
An AI Recruiter that screens, scores, and ranks candidates in minutes — directly inside n8n.

🧠 Overview

An AI-powered recruiter workflow that compares multiple candidate CVs with a single Job Description (JD). It analyzes text content, calculates fit scores, identifies strengths and weaknesses, and provides automated recommendations.

⚙️ How it works
🔹 Webhook Trigger – Upload one Job Description (JD) and multiple CVs (PDF or text)
🔹 File Detector – Auto-identifies JD vs CV
🔹 Extract & Merge – Reads the text and builds the candidate dataset
🔹 🤖 AI Recruiter Agent – Compares the JD and CVs → returns Fit Score, Strengths, Weaknesses, and a Recommendation
🔹 📤 Output Node – Sends structured JSON or a summary table for HR dashboards or a Chat UI

Example: Upload JD.pdf + 3 candidate CVs → get an instant JSON report with the top match and recommendations.

🧩 Requirements
- OpenAI or a compatible AI Agent connection (no hardcoded API keys).
- Input files in PDF or text format (English or Vietnamese supported).
- n8n Cloud or Self-Hosted v1.50+ with AI Agent nodes enabled.
- 🔸 An OpenAI API key or n8n AI Agent credential is required.

🧱 Customizing this workflow
- Swap the AI model with Gemini, Claude, or another LLM.
- Add a Google Sheets export node to save results.
- Connect to SAP HR or internal employee APIs.
- Adjust the scoring logic or include additional attributes (experience, skills, etc.).

👩💼 Author
https://www.linkedin.com/in/nguyen-phuong-17a71a147/

Empowering HR through intelligent, data-driven recruitment.
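The structured JSON report is described only loosely above; as a hypothetical illustration, the agent's per-candidate output might be shaped and ranked like this. The field names (`candidate`, `fitScore`, `recommendation`) are assumptions, not guaranteed to match the template.

```javascript
// Hypothetical shape of the AI Recruiter Agent's structured output,
// plus the trivial ranking step that puts the top match first.
// Field names are illustrative assumptions.
function rankCandidates(results) {
  // Highest fit score first; copy the array to avoid mutating input.
  return [...results].sort((a, b) => b.fitScore - a.fitScore);
}

const report = rankCandidates([
  { candidate: "cv_b.pdf", fitScore: 62, recommendation: "Maybe" },
  { candidate: "cv_a.pdf", fitScore: 88, recommendation: "Interview" },
  { candidate: "cv_c.pdf", fitScore: 45, recommendation: "Reject" },
]);
```

Emitting one ranked array per run keeps the output directly usable by HR dashboards or a summary-table node.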