by Arlin Perez
🔍 Description: Effortlessly delete unused or inactive workflows from your n8n instance while automatically backing them up as .json files to your Google Drive. Keep your instance clean, fast, and organized — no more clutter slowing you down. This workflow is ideal for users managing large self-hosted n8n setups, or anyone who wants to maintain optimal performance while preserving full workflow backups.

✅ What it does:
- Accepts a full n8n workflow URL via a form
- Retrieves the workflow info automatically
- Converts the workflow’s full JSON definition into a file
- Uploads that file to Google Drive
- Deletes the workflow safely using the official n8n API
- Sends a Telegram notification confirming backup and deletion

⚙️ How it works:
- 📝 Form – Collects the full workflow URL from the user
- 🔍 n8n Node (Get Workflow) – Uses the URL to fetch workflow details
- 📦 Code Node ("JSON to File") – Converts the workflow JSON into a properly formatted .json file with UTF-8 encoding, ready to be uploaded to Google Drive
- ☁️ Google Drive Upload – Uploads the .json backup file to your selected Drive folder
- 🗑️ n8n Node (Delete Workflow) – Deletes the workflow from your instance using its ID
- 📬 Telegram Notification – Notifies you that the workflow was backed up and deleted, showing its title, ID, and date

📋 Requirements:
- Google Drive connected to your n8n account
- Telegram bot connected to n8n
- An n8n instance with API access (self-hosted or Cloud)
- Your n8n API key (create one in the settings)

🛠️ How to Set Up:
- ✅ Add your Google Drive credentials
- ✅ Add your Telegram bot credentials
- 🧾 The “JSON to File” Code node requires no additional setup — it automatically converts the workflow JSON into a downloadable .json file using the correct encoding and filename format
- ☁️ In the Google Drive node, set Binary Property to data and Folder ID to your target folder in Google Drive
- 🔑 Create a new credential for the n8n node using your personal n8n API key, with the Base URL set to your full n8n instance API path (e.g. https://your-n8n-instance.com/api/v1)
- ⚙️ Use this credential in both the Get Workflow and Delete Workflow n8n nodes
- 📬 In the Telegram node, use this message template:
  🗑️ Workflow "{{ $json.name }}" (ID: {{ $json.id }}) was backed up to Google Drive and deleted from n8n. 📅 {{ $now }}

🔒 Important: This workflow backs up the entire workflow data to Google Drive. Be careful with the permissions of your Google Drive folder and avoid sharing it publicly, as the backups may contain sensitive information. Proper security and access control are essential to protect your data.

🚀 Activate the workflow and you're ready to safely back up and remove workflows from your n8n instance.
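As a minimal sketch of what the "JSON to File" Code node does internally (the function name and filename pattern here are illustrative, not the template's actual code), the workflow object is serialized to pretty-printed JSON and wrapped in the base64 binary descriptor that n8n's Google Drive node expects under the `data` property:

```javascript
// Hypothetical sketch of the "JSON to File" logic: serialize a workflow
// object into a UTF-8 .json file descriptor for the binary property "data".
function toBackupFile(workflow) {
  const content = JSON.stringify(workflow, null, 2);
  return {
    fileName: `${workflow.name}-${workflow.id}.json`, // assumed naming scheme
    mimeType: 'application/json',
    // n8n stores binary payloads base64-encoded
    data: Buffer.from(content, 'utf-8').toString('base64'),
  };
}

// Inside the Code node you would return something like:
// return [{ json: { name: wf.name, id: wf.id }, binary: { data: toBackupFile(wf) } }];
```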
by Belen
This n8n template automatically transcribes GoHighLevel (GHL) call recordings and creates an AI-generated summary that is added as a note directly to the related contact in your GHL CRM. It’s designed for real estate investors, agencies, and sales teams that handle a large volume of client calls and want to keep detailed, searchable notes without spending hours on manual transcription.

Who’s it for
- Sales and acquisitions teams that want instant call notes in their CRM
- Real estate wholesalers or agencies using GoHighLevel for deal flow
- Support and QA teams that need summarized transcripts for review
- Any business owner who wants to automatically document client conversations

How it works
1. A HighLevel automation workflow triggers when a call is marked “Completed” and automatically sends a webhook to n8n.
2. The n8n workflow receives this webhook and waits briefly to ensure the call recording is ready.
3. It retrieves the conversation and message IDs from the webhook payload.
4. The call recording is fetched from GHL’s API.
5. An AI transcription node converts the audio to text.
6. A summarization node condenses the transcript into bullet points or a concise paragraph.
7. A Code node formats the AI output into proper JSON for GHL’s “Create Note” endpoint.
8. Finally, an HTTP Request node posts the summary to the contact’s record in GHL.

How to set up
1. Add your GoHighLevel OAuth credential and connect your agency account.
2. Add your AI credential (e.g., OpenAI, Anthropic, or Gemini).
3. Replace the sample webhook URL with your n8n endpoint.
4. Test with a recent call and confirm the summary appears in the contact timeline.

Requirements
- GoHighLevel account with API and OAuth access
- AI service for transcription and summarization (e.g., OpenAI Whisper + GPT)

Customizing this workflow
You can tailor this automation for your specific team or workflow:
- Add sentiment analysis or keyword extraction to the summary.
- Change the AI prompt to focus on “action items,” “objections,” or “next steps.”
- Send summaries to Slack, Notion, or Google Sheets for reporting.
- Trigger follow-up tasks automatically in your CRM based on keywords.

Good to know
- AI transcription and summarization costs vary by provider — check your LLM’s pricing.
- GoHighLevel’s recording availability may take up to 1 minute after the call ends; adjust the delay accordingly.
- For OAuth setup help, refer to GHL’s OAuth documentation.

Happy automating! ⚙️
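The Code-node step that shapes the AI output for the "Create Note" call could be sketched as follows. Note this is an assumption-laden illustration: the endpoint path and the `body` field name are guesses based on GHL's public API conventions, so verify them against GHL's documentation for your API version.

```javascript
// Hedged sketch: shape an LLM summary into a payload for GHL's
// create-note endpoint. The URL and field names are ASSUMPTIONS.
function buildNotePayload(contactId, summaryText) {
  // GHL notes are plain text; normalize markdown bullets the LLM may emit
  const body = summaryText
    .split('\n')
    .map((line) => line.replace(/^[-*]\s+/, '• '))
    .join('\n')
    .trim();
  return {
    url: `https://services.leadconnectorhq.com/contacts/${contactId}/notes`, // assumed path
    payload: { body },
  };
}
```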
by AFK Crypto
Try It Out!

🚀 Reddit Crypto Intelligence & Market Spike Detector

🧠 Workflow Description
Reddit Crypto Intelligence & Market Spike Detector is an automated market sentiment and price-monitoring workflow that connects social chatter with real-time crypto price analytics. It continuously scans new posts from r/CryptoCurrency, extracts recently mentioned coins, checks live price movements via CoinGecko, and alerts you on Discord when a significant spike or drop occurs. This automation empowers traders, analysts, and communities to spot early market trends before they become mainstream — all using free APIs and open data.

⚙️ How It Works
1. Monitor Reddit Activity
   - Automatically fetches the latest posts from r/CryptoCurrency using Reddit’s free RSS feed.
   - Captures trending titles, post timestamps, and mentions of coins or tokens (e.g., $BTC, $ETH, $SOL, $PEPE).
2. Extract Coin Mentions
   - A Code node parses the feed using the regex \$[A-Za-z0-9]{2,10} to identify any symbols or tickers discussed.
   - Removes duplicates and normalizes all results for accurate data mapping.
3. Fetch Market Data
   - Each detected coin symbol is matched with CoinGecko’s public API to fetch live market data, including current price, market rank, and 24-hour price change.
   - No API key required — a completely free and reliable source.
4. Detect Market Movement
   - A second Code node filters the fetched data to identify price movements greater than ±5% within the last 24 hours.
   - This helps isolate meaningful market action from routine fluctuations.
5. Generate and Send Alerts
   - When a spike or dip is detected, the workflow composes a rich alert message including the 💎 coin name and symbol, 💰 current price, 📈 24h percentage change, and 🕒 timestamp of detection.
   - The message is sent automatically to your Discord channel using a preconfigured webhook.

💬 Example Output
🚨 Crypto Reddit Mention & Price Spike Alert! 🚨
💎 ETHEREUM (ETH) 💰 $3,945.23 📈 Change: +6.12%
💎 SOLANA (SOL) 💰 $145.88 📈 Change: +8.47%
🕒 Checked at: 2025-10-31T15:00:00Z
🔔 #MarketIntel #CryptoSentiment #PriceAlert

If no coins cross the ±5% threshold: “No price spikes detected in the latest Reddit check.”

🪄 Key Features
- 🧠 Social + Market Intelligence – Combines Reddit sentiment with live market data to detect potential early signals.
- 🔎 Automated Coin Detection – Dynamically identifies newly discussed tokens from live posts.
- 📊 Smart Spike Filtering – Highlights only meaningful movements above configurable thresholds.
- 💬 Discord Alerts – Delivers clear, structured, and timestamped alerts to your community automatically.
- ⚙️ Fully No-Cost Stack – Utilizes the free Reddit and CoinGecko APIs with no authentication required.

🧩 Use Cases
- Crypto Traders: Detect early hype or momentum shifts driven by social chatter.
- Analysts: Automate social sentiment tracking tied directly to live market metrics.
- Community Managers: Keep members informed about trending coins automatically.
- Bots & AI Assistants: Integrate this logic to enhance automated trading signals or alpha alerts.

🧰 Required Setup
- Discord Webhook URL – for automatic alert posting.
- (Optional) CoinGecko API endpoint (no API key required).
- n8n instance – self-hosted or Cloud; the free tier is sufficient.
- Workflow schedule – recommended: hourly (Cron node interval = 1 hour).

AFK Crypto Website: afkcrypto.com
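The ticker-extraction Code node can be sketched with the exact regex the workflow describes, plus the dedupe/normalize step (the function name and lowercase normalization are illustrative choices, not the template's literal code):

```javascript
// Sketch of the coin-mention extraction: find $-prefixed tickers in
// post titles using the workflow's regex, dedupe, and lowercase them
// so they can be matched against CoinGecko identifiers.
function extractTickers(titles) {
  const pattern = /\$([A-Za-z0-9]{2,10})/g;
  const found = new Set();
  for (const title of titles) {
    for (const match of title.matchAll(pattern)) {
      found.add(match[1].toLowerCase()); // normalize for API lookup
    }
  }
  return [...found]; // duplicates already removed by the Set
}
```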
by Edson Encinas
🧩 Template Description
File Hash Reputation Checker is a security automation workflow that validates file hashes (MD5, SHA1, SHA256) and checks their reputation using the VirusTotal API. It is designed for SOC teams, security engineers, and automation pipelines that need fast and consistent malware verdicts from a single hash input.

The workflow supports two input methods:
- An HTTP webhook for API-based integrations
- A Slack slash command (/hash-check) for quick analyst-driven checks directly from Slack

Once a hash is submitted, the workflow normalizes and validates the input, queries VirusTotal for detection statistics, and determines whether the file is Malicious, Suspicious, Clean, or Unknown. Results are returned as a structured JSON response and also posted to Slack with severity-based formatting.

⚙️ How It Works
1. A file hash is submitted via HTTP POST or Slack using /hash-check FILE_HASH.
2. The hash is normalized (lowercased and trimmed).
3. The workflow validates the hash format (MD5, SHA1, or SHA256).
4. VirusTotal is queried for hash reputation data.
5. Detection statistics are analyzed to calculate a verdict: Malicious, Suspicious, Clean, or Unknown.
6. A Slack message is sent for all verdicts, with alert-style formatting for malicious results.
7. A structured JSON response is returned to the requester.

🛠️ Setup Steps
1. VirusTotal API
   - Create or use an existing VirusTotal account.
   - Add your API key to n8n as VirusTotal API credentials.
2. Slack Configuration
   - Create a Slack app.
   - Enable Slash Commands and create /hash-check.
   - Set the Request URL to the n8n webhook endpoint.
   - Connect your Slack account in n8n credentials.
3. Activate the Workflow
   - Activate the workflow in n8n.
   - Test using HTTP POST: { "text": "file_hash" } or Slack: /hash-check FILE_HASH

🎛️ Customization Ideas
- Route Slack messages to different channels based on severity.
- Add additional outputs (email, SIEM, ticketing systems).
- Extend the workflow to support multiple hashes per request.
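The normalize/validate and verdict steps can be sketched as below. Hash type is inferred from hex length (32 = MD5, 40 = SHA1, 64 = SHA256); the verdict thresholds (any `malicious` engine hit wins over `suspicious`) are a plausible reading of the workflow, not its confirmed logic, and the `last_analysis_stats` shape follows VirusTotal's v3 API:

```javascript
// Sketch of input normalization and validation: trim, lowercase,
// then accept only pure-hex strings of a known digest length.
function classifyHash(raw) {
  const hash = String(raw).trim().toLowerCase();
  const types = { 32: 'md5', 40: 'sha1', 64: 'sha256' };
  if (!/^[a-f0-9]+$/.test(hash) || !types[hash.length]) {
    return { valid: false };
  }
  return { valid: true, hash, type: types[hash.length] };
}

// Sketch of the verdict calculation from VirusTotal detection stats.
function verdict(stats) {
  if (!stats) return 'Unknown';
  if (stats.malicious > 0) return 'Malicious';
  if (stats.suspicious > 0) return 'Suspicious';
  return 'Clean';
}
```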
by Fahmi Fahreza
TikTok Trend Analyzer with Apify + Gemini + Airtable

Automatically scrape trending TikTok videos, analyze their virality using Gemini AI, and store insights directly into Airtable for creative research or content planning.

Who’s it for?
Marketing analysts, creators, and creative agencies looking to understand why videos go viral and how to replicate successful hooks and formats.

How it works
1. A scheduled trigger runs the Apify TikTok Trends Scraper weekly.
2. The scraper collects trending video metadata.
3. Data is stored in Airtable (views, likes, captions, sounds, etc.).
4. When a specific video is submitted via webhook, the workflow fetches it from Airtable.
5. Gemini AI analyzes the video and extracts structured insights: summary, visual hook, audio, and subtitle analysis.
6. The workflow updates the Airtable record with these AI insights.

How to set up
Connect Apify and Airtable credentials, link Gemini or OpenAI keys, and adjust the schedule frequency. Add your Airtable base and table IDs. You can trigger analysis manually via the webhook endpoint.
by WeblineIndia
Webhook from Payment Provider → Jira Finance Ticket → Slack Invoice Follow-up Automation

This workflow automates failed subscription-renewal processing by validating webhook data, using AI to analyze urgency and churn risk, creating a Jira Finance task, and notifying the finance team via Slack. If required fields are missing, it sends an error alert for manual review instead.

⚡ Quick Implementation Steps (Start Using in 60 Seconds)
1. Import the workflow JSON into n8n.
2. Add Jira & Slack credentials.
3. Configure the webhook URL /payment-failed-renewal in your payment provider.
4. Test with:
   { "customerId": "C-101", "customerEmail": "user@example.com", "subscriptionId": "S-500", "amount": 39.99 }
5. Activate the workflow.

What It Does
This automation connects your payment system with your financial operations. When a subscription renewal fails, the payment provider sends a webhook. The workflow validates the fields, uses OpenAI to analyze the payment failure reason (determining urgency and churn risk), routes high-value failures to high priority, creates a Jira task with an AI-drafted recovery email, and alerts the finance team on Slack. If required data is missing, the workflow prevents incomplete Jira tickets by routing the event to an error handler and sending a detailed Slack alert listing all missing fields and the full payload for manual inspection.

Who’s It For
- Finance & billing departments
- SaaS companies with recurring billing
- Teams using Jira for billing operations
- Slack-based financial support teams
- Companies wanting automated revenue-recovery workflows

Requirements to Use This Workflow
- n8n instance
- OpenAI API key (or compatible LLM credential)
- Jira Software account with permissions for the FIN project
- Slack bot token with channel posting rights
- Payment provider that supports POST JSON webhooks
- Webhook configured to: https://YOUR-N8N-URL/webhook/payment-failed-renewal

How It Works & How To Set Up

Step-by-Step Flow
1. Webhook receives the payment-failure payload.
2. Validation node checks required fields: customerId, customerEmail, subscriptionId, amount.
3. AI Analysis: OpenAI analyzes the failure reason, sets urgency, and drafts an email.
4. Logic: a Switch node routes high-value failures (>$500) to 'High' priority.
5. A Jira Finance task is created (with the AI draft).
6. A Slack message is sent (with the churn-risk score).

Setup Steps
- Step 1 — Webhook Setup: Method: POST, Path: payment-failed-renewal
- Step 2 — Jira Setup: Select Jira credentials in the Create Jira Finance Ticket node. Ensure Project: FIN, Issue type: Task.
- Step 3 — Slack Setup: Add Slack credentials to both Slack nodes and select the finance alert channel.
- Step 4 — OpenAI Setup: Add OpenAI credentials in the AI Analysis node.
- Step 5 — Test:
  { "customerId": "CUST-001", "customerEmail": "billing@example.com", "subscriptionId": "SUB-1001", "amount": 19.99 }
- Step 6 — Activate: Enable the workflow.

How To Customize Nodes
- Webhook: Add Basic Auth, token-based security, or JSON schema validation.
- Validate Payload: Enhance with email-format validation, numeric validation for amount, or auto-fallback values.
- Jira node: Customize the ticket summary structure, add labels (billing-recovery, urgent, etc.) or custom fields, or change the issue type or project.
- Slack nodes: Add mentions (@finance-team), use threads instead of channel posts, or add rich blocks, buttons, or attachments.

Add-ons (Optional Enhancements)
- Automated email to the customer for payment recovery
- Retry count–based escalation (e.g., retry ≥ 3 → escalate to manager)
- Log data to Airtable / Google Sheets
- Sync events into a CRM (HubSpot, Salesforce, Zoho)
- Notify Sales for high-value customer failures

Use Case Examples
- Stripe renewal payment fails → create Jira task → Slack finance alert.
- Chargebee retry attempts exhausted → notify the billing team immediately.
- Declined credit card → Jira ticket with the failure reason.
- Razorpay/PayPal renewal failure → automated follow-up.
- Webhook missing data → Slack error alert ensures nothing is silently ignored.
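The validation and routing steps can be sketched as below (the field list comes straight from the workflow's requirements; the function names and the 'Medium' fallback priority are illustrative assumptions):

```javascript
// Sketch of the "Validate Payload" check: collect every required field
// that is absent or empty, so the error branch can list them in Slack.
const REQUIRED = ['customerId', 'customerEmail', 'subscriptionId', 'amount'];

function validatePayload(body) {
  const missing = REQUIRED.filter(
    (field) => body[field] === undefined || body[field] === null || body[field] === ''
  );
  return { valid: missing.length === 0, missing };
}

// Sketch of the Switch-node rule: failures over $500 get 'High' priority.
function priority(amount) {
  return amount > 500 ? 'High' : 'Medium'; // 'Medium' fallback is an assumption
}
```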
Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Webhook not triggering | Wrong URL / method | Use POST + correct endpoint |
| Jira ticket missing | No permissions or invalid payload | Check Jira permissions + required fields |
| Slack shows undefined values | Missing fields in payload | Confirm payload structure |
| Error alert triggered incorrectly | Field names mismatch | Match exact names: customerId, customerEmail, subscriptionId, amount |
| Payment provider not sending events | Firewall/CDN blocking | Whitelist the n8n webhook URL |
| Workflow silent | Not activated | Turn the workflow ON |

Need Help?
If you want help customizing this workflow or extending it into a complete revenue-recovery automation suite, WeblineIndia can support you with:
- Jira & Slack automation pipelines
- Payment provider webhook integrations
- Finance workflow optimization
- AI-based billing insights
- End-to-end automation solutions

Reach out anytime for expert implementation or enhancements.
by Meak
Firecrawl Web Search Agent → Google Sheets Logger with OpenRouter + n8n

Most teams craft search operators by hand and copy results into spreadsheets. This workflow automates query generation, multi-operator searches, scraping, and logging — all from a single webhook call.

Benefits
- Auto-generate Firecrawl queries from natural language (OpenRouter Agent)
- Use pro operators: site:, inurl:, intitle:, exclusions, related
- Run parallel searches (site match, in-URL, exclusions, YouTube/intitle)
- Append titles/URLs/results to Google Sheets automatically
- Return results to the caller via webhook response
- Optional scraping of markdown + full-page screenshots

How It Works
1. A webhook receives a natural-language search request.
2. The OpenRouter-powered Agent converts it to a Firecrawl query (+ limit).
3. Firecrawl Search runs with scrapeOptions (markdown, screenshot).
4. Parallel queries run: site:, inurl:, negative filters, and YouTube intitle:automation.
5. Results (title, url, data fields) are collected from each call.
6. Rows are appended to Google Sheets (one per result).
7. The webhook responds with the aggregated payload.
8. Ready to chain into alerts, enrichment, or CRM sync.

Who Is This For
- Researchers and content teams building source lists
- Growth/SEO teams needing precise operator queries
- Agencies automating discovery, monitoring, and logging

Setup
1. Connect OpenRouter (select your LLM, e.g., GPT-4.1-mini).
2. Add your Firecrawl API key and endpoint (/v1/search).
3. Connect Google Sheets (Document ID + Sheet/Tab).
4. Set the webhook path and allow POST from your app.
5. Define the default limit (fallback = 5) and scrapeOptions.

ROI & Monetization
- Save 3–6 hours/week on manual searching and copy/paste
- Offer this as a $500–$2k/month research automation for clients
- Upsell alerts (cron/webhook) and data enrichment for premium retainers

Strategy Insights
In the full walkthrough, I show how to:
- Prompt the Agent to produce flawless site:/inurl:/intitle:/-exclusion queries
- Map Firecrawl data fields cleanly into Sheets
- Handle rate limits, empty results, and retries
- Extend with dedupe, domain filtering, and Slack/Telegram alerts

Check Out My Channel
For more advanced AI automation systems that generate real business results, check out my YouTube channel, where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
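A sketch of the request body one of the parallel branches might send to Firecrawl's /v1/search endpoint, with the workflow's default limit fallback of 5. The field names (`query`, `limit`, `scrapeOptions.formats`) follow Firecrawl's v1 API, but treat them as assumptions and check the current docs; the example operator queries are illustrative:

```javascript
// Hedged sketch: build a Firecrawl /v1/search request with the
// scrapeOptions (markdown + screenshot) the workflow describes.
function buildSearchRequest(query, limit = 5) {
  return {
    query,
    limit, // fallback = 5, per the workflow's Setup section
    scrapeOptions: { formats: ['markdown', 'screenshot'] },
  };
}

// Example operator queries the Agent might emit (illustrative only):
const queries = [
  'site:example.com ai automation',
  'inurl:case-study workflow automation -reddit',
  'site:youtube.com intitle:automation n8n',
].map((q) => buildSearchRequest(q));
```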
by Rapiwa
Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

Who is this for?
This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It’s perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don’t use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

What this Workflow Does
This n8n workflow connects to a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers, checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

Key Features
- **Runs Every 5 Minutes**: Automatically triggers the workflow
- **Google Sheets Integration**: Reads and writes data from a specific sheet
- **Phone Number Validation**: Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message Sending**: Sends a message using Rapiwa's /send-message endpoint
- **Status Update**: The sheet is updated with a success or failure status
- **Safe API Usage**: Delays are added between requests to prevent rate limits
- **Batch Limit**: Processes a maximum of 60 rows per cycle
- **Conditional Checks**: Skips rows without a "check" value

Requirements
- A Google Sheet with the necessary columns
- **Rapiwa account** with an active subscription (you get 200 free messages)
- Your WhatsApp number connected to Rapiwa
- A valid Bearer token
- **n8n instance** (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

How to Use: Step-by-Step Setup
1. Webhook: receives the Shopify webhook (discount creation) via an HTTP POST request. This is triggered when a discount is created in your Shopify store.
2. Configure Google Sheets in n8n: use the Google Sheets node with OAuth2 access.
3. Get a Rapiwa API token: create an account on Rapiwa, connect your WhatsApp number, and copy your Bearer token from the Rapiwa dashboard.
4. Set up the HTTP Request nodes:
   - Validate numbers via: https://app.rapiwa.com/api/verify-whatsapp
   - Send messages via: https://app.rapiwa.com/api/send-message
   - Add your Bearer token to the headers.

Google Sheet Column Structure
A Google Sheet formatted like this ➤ Sample

| discount_code | created_at | shop_domain | name | number | verify | status |
| ------------- | ---------- | ----------- | ---- | ------ | ------ | ------ |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified | sent |

Support & Help
- **Rapiwa Website:** https://rapiwa.com
- **WhatsApp:** Chat on WhatsApp
- **Discord:** SpaGreen Community
- **Facebook Group:** SpaGreen Support
- **Website:** https://spagreen.net
- **Developer Portfolio:** Codecanyon SpaGreen
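The phone-number cleanup step that runs before the Rapiwa verification call could look like the sketch below. This is an assumed normalization (digits only, international-prefix stripping), not the template's literal code; adjust it to your own country-code conventions:

```javascript
// Sketch of phone-number cleanup before /verify-whatsapp: keep digits
// only and drop a leading "00" international prefix. The "+" sign is
// removed by the digits-only filter.
function cleanNumber(raw) {
  let digits = String(raw).replace(/\D/g, ''); // strip spaces, dashes, "+"
  if (digits.startsWith('00')) digits = digits.slice(2);
  return digits;
}
```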
by Alex Berman
Who is this for
This workflow is for B2B sales teams, growth hackers, and revenue operators who need a reliable, low-cost pipeline of verified leads from Apollo.io — without manually exporting CSVs or hitting Apollo's export limits. If you are prospecting into a specific industry, job title, or company size, this workflow automates the entire sourcing and storage process.

How it works
1. You configure your target audience once in the "Configure Search Parameters" node (job titles, industry, company size, lead count).
2. The workflow sends a scrape request to ScraperCity's Apollo filter endpoint, which pulls verified contacts at $0.0039 per contact.
3. Because scrapes run asynchronously and can take 10–60 minutes, the workflow polls ScraperCity every 60 seconds until the job completes.
4. Once the scrape succeeds, the workflow downloads the results, parses the CSV data, removes duplicates, and appends clean rows to your Google Sheet.

How to set up
1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, create a "Header Auth" credential named "ScraperCity API Key" with the header name "Authorization" and value "Bearer YOUR_KEY".
3. Connect your Google Sheets OAuth2 credential.
4. Set your Google Sheet document ID and sheet name in the "Save Leads to Google Sheets" node.
5. Edit the "Configure Search Parameters" node with your target filters.

Requirements
- ScraperCity account (scrapercity.com)
- Google Sheets OAuth2 credential
- A Google Sheet with headers matching the contact fields

How to customize the workflow
- Change job titles, industry, company size, and lead count in "Configure Search Parameters".
- Swap Google Sheets for HubSpot, Airtable, or a webhook to push leads directly into your CRM.
- Add a Slack notification node after the final write step to alert your team when new leads arrive.
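The dedupe step that runs before rows are appended to the sheet could be sketched as below. Keying on a lowercased email field is an assumption about the contact schema, not the template's confirmed logic:

```javascript
// Sketch of the dedupe pass over parsed CSV rows: contacts are keyed
// by lowercased email; the first occurrence wins, rows without an
// email are dropped.
function dedupeContacts(rows) {
  const seen = new Set();
  return rows.filter((row) => {
    const key = (row.email || '').toLowerCase();
    if (!key || seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```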
by WeblineIndia
(Retail Automation) Transfer Inventory Updates Across Systems

This workflow automatically synchronizes inventory quantity updates between systems using a webhook-driven approach. When an inventory update is received, the workflow validates the source, prepares a clean payload, sends the update to a secondary system via an HTTP API, and logs the update in Google Sheets for tracking and auditing.

Quick Implementation Steps
1. Import the workflow JSON into n8n.
2. Configure the webhook URL in your source system.
3. Update the HTTP Request node with the secondary system's API endpoint.
4. Connect Google Sheets and select the target spreadsheet.
5. Activate the workflow.

What It Does
This workflow listens for inventory update events sent from an external system such as an online store, POS, ERP, or warehouse platform. Once an update is received, the workflow normalizes the incoming data by extracting key fields like product ID, SKU, stock quantity, source system, and modification timestamp. To avoid circular synchronization issues, the workflow validates the origin of the update and ensures that updates originating from the secondary system are not reprocessed. Valid inventory updates are then transformed into a clean, API-ready payload and sent to the secondary system using an HTTP Request node. After the inventory update is successfully pushed, the workflow logs the inventory details in Google Sheets, providing a simple audit trail for tracking inventory movements and troubleshooting sync issues.
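The origin check that prevents circular synchronization can be sketched as below. The system names and the `source_system` field name are placeholders for illustration; match them to your own payloads:

```javascript
// Sketch of the "Check Sync Origin" logic: skip updates that report
// the secondary system as their source, so the two systems don't
// ping-pong the same change back and forth.
const SECONDARY_SYSTEM = 'erp'; // assumed name of the target system

function shouldSync(update) {
  const source = (update.source_system || '').toLowerCase();
  return source !== SECONDARY_SYSTEM;
}
```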
Who’s It For
This workflow is suitable for:
- Retail businesses managing inventory across multiple systems
- Teams using WooCommerce, POS, ERP, or warehouse tools
- Operations teams requiring inventory audit logs
- Developers building middleware-based inventory synchronization
- Businesses aiming to reduce overselling and manual stock corrections

Prerequisites
To use this workflow, you need:
- An active n8n instance (self-hosted or cloud)
- A source system capable of sending inventory updates via webhook
- SKU-based inventory management
- Access to a secondary system's API endpoint
- A Google Sheets account with edit permissions
- A Google Sheet with predefined column headers

How to Use & Set Up
1. Import the workflow JSON into your n8n instance.
2. Copy the webhook URL from the Inventory Webhook node.
3. Configure your source system to send inventory updates to this webhook URL.
4. Review the Normalize Inventory Data node to ensure required fields are mapped correctly.
5. Verify that the Check Sync Origin node matches your source system's naming.
6. Update the Send Inventory To Secondary API node with the correct API endpoint.
7. Configure the Log Inventory Sync To Google Sheet node with your target spreadsheet.
8. Save and activate the workflow.

Once activated, the workflow runs automatically whenever an inventory update is received.

How To Customize Nodes
- **Normalize Inventory Data**: Add or remove inventory-related fields as needed; adjust field names to match your source system's payload.
- **Check Sync Origin**: Modify the source comparison value to prevent loops in your setup.
- **Prepare Inventory Payload**: Change the payload structure to match the secondary system's API requirements.
- **Send Inventory To Secondary API**: Add authentication headers or modify the HTTP method if required.
- **Google Sheets Logging**: Add additional columns such as execution ID or API response status.

Add-ons (Optional Enhancements)
This workflow can be extended to:
- Add retry logic for failed API requests
- Log failed sync attempts to a separate Google Sheet
- Send Slack or email alerts on sync failures
- Perform scheduled inventory reconciliation between systems
- Support bidirectional inventory synchronization

Use Case Examples
- Sync inventory changes from WooCommerce to an ERP system.
- Push POS stock deductions to a warehouse management system.
- Maintain a centralized inventory audit log in Google Sheets.
- Prevent overselling across multiple sales channels.
- Monitor and troubleshoot inventory sync issues efficiently.

There can be many more use cases depending on business requirements.

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|------|---------------|----------|
| Workflow not triggering | Webhook URL not configured | Verify the webhook URL and HTTP method |
| Inventory not syncing | Source validation blocking flow | Check the source value in the payload |
| API request failing | Invalid endpoint or payload | Validate the API URL and request body |
| Google Sheet not updating | Incorrect sheet configuration | Verify sheet permissions and headers |
| Duplicate updates | Missing source control | Ensure the sync-origin logic is correct |

Need Help?
If you need assistance setting up, customizing, or extending this workflow, or want to build similar automation workflows tailored to your business, feel free to contact the n8n automation experts at WeblineIndia. Our team can help you design, optimize, and deploy robust n8n automation solutions.
by Muhammad Anas Farooq
n8n Workflows GitHub Manager > A comprehensive n8n workflow that provides complete bidirectional sync between your n8n instance and GitHub - automatically backs up all your workflows with intelligent change detection AND restores them when needed. This workflow combines two powerful features in one: Backup**: Automatically detects new, edited, renamed, and deleted workflows in your n8n instance, then syncs them to GitHub with smart commit messages and an index tracking system. Restore**: Easily restore all workflows from your GitHub repository back to n8n - perfect for disaster recovery, new instance setup, or environment cloning. How It Works 🔄 Backup Mode (Automatic) Trigger: Runs automatically every day at 7 PM UTC (or manually when triggered via the Schedule Trigger). Get/Create Index: Attempts to fetch index.json from your GitHub repository. If found → Downloads and parses it. If not found → Creates a new empty index file and waits 3 seconds for GitHub to process. Fetch All Workflows: Retrieves all workflows from your n8n instance via the n8n API. Smart Comparison: The "C,E,D Checker" (Create, Edit, Delete) analyzes differences: CREATE → New workflow not in index. RENAME → Workflow name changed (deletes old file, creates new one). EDIT → Existing workflow (flagged for content comparison). DELETE → Workflow removed from n8n but still in GitHub. INDEX UPDATE → Triggered if any changes detected. Route Actions: Switch node routes each action to the appropriate branch: Create Branch → Creates new workflow files in GitHub. Edit Branch → Performs smart edit detection: Fetches current file from GitHub. Compares GitHub version vs. n8n version (normalized JSON). Only commits if content actually changed (avoids timestamp-only updates). Delete Branch → Removes workflow files from GitHub. Update Index Branch → Updates index.json with latest mappings. 
Commit Messages: Auto-generated with format: [Workflow Name] (Action) YYYY-MM-DD ⬇️ Restore Mode (Manual) Trigger: Manually execute via the "When clicking 'Execute workflow'" manual trigger. Set GitHub Details: Configure your repository owner and name. List Workflow Files: Fetches all workflow JSON files from the workflows/ folder in your GitHub repository. If folder not found → Workflow stops gracefully (ensure backup ran at least once first). Loop Through Files: Sequentially processes each workflow file: Downloads the JSON content from GitHub. Creates the workflow in your n8n instance via the n8n API. Sequential Processing: Handles one workflow at a time to prevent conflicts and respect rate limits. Result: All workflows from GitHub are restored to your n8n instance. Requirements GitHub OAuth2 Credentials**: Go to GitHub Developer Settings → OAuth Apps → New OAuth App. Set Authorization callback URL to your n8n instance URL (e.g., https://your-n8n.com/rest/oauth2-credential/callback). Copy Client ID and Client Secret. Add as OAuth2 credential in n8n (Credentials → New → GitHub OAuth2). GitHub Repository**: Create a new repository (public or private). Note your username (repo owner) and repository name. n8n API Credentials**: In your n8n instance → Settings → API → Create new API key. Add as n8n API credential in the workflow. How to Use Initial Setup Import the Workflow: Copy the provided JSON file. In your n8n instance → click Import Workflow → paste or upload the JSON. Create GitHub Repository: Go to GitHub → Create a new repository (e.g., n8n-workflows-manager). Leave it empty (no README, no .gitignore). Set Up GitHub OAuth2: In n8n → Credentials → New → GitHub OAuth2. Fill in: Client ID → from GitHub OAuth App. Client Secret → from GitHub OAuth App. Click Connect my account and authorize. Set Up n8n API Credentials: In n8n → Settings → API → Create new API key. Copy the key. In workflow → Credentials → New → n8n API → paste the key. 
Set the Base URL to your n8n instance (e.g., `https://your-n8n.com`).

5. **Configure repository details**: Find both "Set Github Data" nodes in the workflow (one for backup, one for restore) and edit the assignments in each:
   - `repo_owner`: replace "your-github-username" with your GitHub username.
   - `repo_name`: replace "your-github-repository-name" with your repository name.
6. **Connect credentials to nodes**:
   - Open each GitHub node (there are 8 in total) and set its credential to the GitHub OAuth2 credential you created:
     - Backup section: Create Index File, Get Download Url for Index File, Create New Files, Update Index File, Get Download Url for Github File, Delete Files, Edit Files.
     - Restore section: List Workflow Files.
   - Open the n8n API nodes (Get All Workflows, Create Workflow) and set their credential to the n8n API credential you created.

### Using Backup Mode

1. **Test the backup**: Click the "Schedule Trigger" node at the top of the workflow, then click "Test workflow". Monitor the execution: all nodes in the backup section should turn green. Check your GitHub repository: you should see `index.json` and a `workflows/` folder containing your workflows.
2. **Activate auto backup**: Once tested successfully, toggle the workflow to Active. It will now run automatically every day at 7 PM UTC.

### Using Restore Mode

1. **Test the restore** (only after you have backups in GitHub): Click the "When clicking 'Execute workflow'" manual trigger node at the bottom, then click "Test workflow". Monitor the execution: all nodes in the restore section should turn green. Check your n8n workflows list: all workflows from GitHub should now be present.
2. **When to use restore**:
   - Setting up a new n8n instance.
   - Recovering after data loss.
   - Cloning workflows to another environment.
   - Rolling back to a previous state (manually download older commits from GitHub first).

## Important Notes

- **Smart edit detection**: Uses normalized JSON comparison to avoid unnecessary commits when only timestamps change.
- **Credentials**: Credential IDs are included in the backups, but the actual secrets are not. You must reconnect credentials after a restore.
- **Restored workflows**: Created as new workflows with new IDs, in an inactive state by default.
- **File structure**: `index.json` tracks all workflows; the `workflows/` folder contains the individual workflow files.
- **Security**: Use a private repository if your workflows contain sensitive data. Credential secrets are never backed up.

## Customization

- **Change the schedule**: Edit the "Schedule Trigger" node and modify `triggerAtHour` (default: 19 = 7 PM UTC).
- **File path**: Modify `filePath` in the GitHub nodes to change the storage location.
- **Notifications**: Add email/notification nodes to get alerts on backup completion.
- **Selective restore**: Add IF nodes to filter which workflows to restore.
- **Multiple repos**: Duplicate the workflow for separate prod/dev backups.

Author: Muhammad Anas Farooq
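Three of the mechanisms described above (normalized comparison for smart edit detection, the commit message format, and the create payload used on restore) can be sketched together. This is a minimal sketch: the stripped field names and all function names are assumptions, not the exact node code.

```javascript
// Smart edit detection: drop volatile fields (assumed here to be the
// timestamps and versionId) before comparing, so a commit happens only
// when the workflow content actually changed.
function normalize(workflow) {
  const { updatedAt, createdAt, versionId, ...rest } = workflow;
  return JSON.stringify(rest);
}

function hasContentChanged(githubVersion, n8nVersion) {
  return normalize(githubVersion) !== normalize(n8nVersion);
}

// Commit message format: [Workflow Name] (Action) YYYY-MM-DD
function commitMessage(name, action, date = new Date()) {
  return `[${name}] (${action}) ${date.toISOString().slice(0, 10)}`;
}

// Restore payload: only the core fields are sent when recreating a
// workflow; read-only fields such as id and active are stripped, which
// is why restored workflows get new IDs and start inactive.
function buildCreatePayload(file) {
  return {
    name: file.name,
    nodes: file.nodes,
    connections: file.connections,
    settings: file.settings || {},
  };
}
```

Note that `normalize` compares serialized JSON directly, so it assumes both versions serialize keys in the same order; a key-sorting deep stringify would make the comparison order-insensitive.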
by Evoort Solutions
# 🚀 Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the powerful SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

## ⚙️ Node-by-Node Workflow Breakdown

### 1. 🟢 On Form Submission
- **Trigger**: The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
- **Use case**: When you want to track multiple websites and monitor their performance over time.

### 2. 🌐 Website Traffic Checker
- **API request**: The workflow makes a POST request to the **SEMrush Website Traffic Checker API** via RapidAPI, using the submitted website URL.
- **API data**: The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

### 3. 🔄 Reformat
- **Parsing**: The raw API response is parsed to extract the relevant data under `trafficSummary`.
- **Data structure**: The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

### 4. 📄 Google Sheets
- **Logging data**: The traffic data is appended as a new row in your Google Sheet.
- **Sheet setup**: The data is organized and updated in a structured format, allowing you to track website performance over time.

## 💡 Use Cases

- 📊 **SEO & digital marketing agencies**: Automate client website audits by pulling live traffic data into reports.
- 🌐 **Website owners & bloggers**: Monitor traffic growth and analyze content performance automatically.
- 📈 **Data analysts & reporting teams**: Feed traffic data into dashboards and integrate it with other KPIs for deeper analysis.
- 🕵️ **Competitor tracking**: Regularly log competitor site metrics for comparative benchmarking.

## 🎯 Key Benefits

- ✅ **Automated traffic monitoring**: Run reports automatically, on demand or on a schedule.
- ✅ **Real-time Google Sheets logging**: Centralize and structure traffic data for easy sharing and visualization.
- ✅ **Zero code required**: Powered by n8n's visual builder; set up workflows quickly without writing a single line of code.
- ✅ **Scalable & flexible**: Extend the workflow to include alerts, additional API integrations, or other automated tasks.

## 🔐 How to Get Your SEMrush API Key via RapidAPI

1. **Visit the API listing**: 👉 SEMrush Website Traffic Checker API.
2. **Sign in or create an account**: Log in to RapidAPI or sign up for a free account.
3. **Subscribe to the API**: Choose the appropriate pricing plan and click Subscribe.
4. **Access your API key**: Go to the Endpoints tab; your API key is located under the `X-RapidAPI-Key` header.
5. **Secure and use the key**: Add your API key to the request headers in your workflow. Never expose the key publicly.

## 🔧 Step-by-Step Setup Instructions

### 1. Create the Form to Capture the URL
- In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
- Configure the webhook to accept URL submissions from your form.
- Add a form to your website or app that triggers the webhook when a URL is submitted.

### 2. Configure the SEMrush API Request Node
- Add an HTTP Request node after the webhook.
- Set the method to POST and the URL to the SEMrush API endpoint.
- Add the necessary headers:
  - `X-RapidAPI-Host`: semrush-website-traffic-checker.p.rapidapi.com
  - `X-RapidAPI-Key`: [Your API Key]
- Pass the captured website URL from the webhook as a parameter in the request body.

### 3. Reformat the API Response
- Add a Set node to parse and structure the API response.
- Extract only the necessary data, such as:
  - `trafficSummary.visits`
  - `trafficSummary.bounceRate`
  - `trafficSummary.pageViews`
  - `trafficSummary.sessions`
- Format the response so it is clean and suitable for Google Sheets.

### 4. Store Data in Google Sheets
- Add the Google Sheets node to your workflow.
- Authenticate with your Google account.
- Select the spreadsheet and worksheet where you want to store the traffic data.
- Configure the node to append new rows with the extracted traffic data.

**Google Sheets columns setup:**
- **A**: Website URL
- **B**: Visits
- **C**: Bounce Rate
- **D**: Page Views
- **E**: Sessions
- **F**: Date/Time (optional; you can use a timestamp)

### 5. Test and Deploy
- Run a test submission through your form to ensure the workflow works as expected.
- Check the Google Sheets document to verify that the data is being logged correctly.
- Set up scheduling or additional workflows as needed (e.g., periodic updates).

## 📈 Customizing the Template

You can modify the workflow to suit your specific needs:

- **Add more data points**: Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords).
- **Create separate sheets**: If you're tracking multiple websites, create a different sheet for each website, or group websites by category.
- **Add alerts**: Set up email or Slack notifications when specific traffic conditions (like sudden drops) are met.
- **Visualize data**: Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

## 🚀 Start Automating in Minutes

Build your automated website traffic dashboard with n8n today, no coding required.

👉 Start with n8n for Free

Save time, improve accuracy, and supercharge your traffic insights workflow!
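The Reformat step (step 3) can be sketched as a small Code-node-style function that flattens `trafficSummary` into one row matching the column layout above. This is a sketch under assumptions: the field names (`visits`, `bounceRate`, `pageViews`, `sessions`) are taken from the extraction list in step 3 and should be verified against the actual RapidAPI response.

```javascript
// Flatten the API's trafficSummary object into one row per submission,
// matching columns A-F of the Google Sheets setup described above.
// Field names are assumed from the step 3 extraction list.
function toSheetRow(apiResponse, websiteUrl) {
  const t = apiResponse.trafficSummary || {};
  return {
    'Website URL': websiteUrl,
    'Visits': t.visits,
    'Bounce Rate': t.bounceRate,
    'Page Views': t.pageViews,
    'Sessions': t.sessions,
    'Date/Time': new Date().toISOString(), // optional timestamp column (F)
  };
}
```

In the workflow, this row object would be passed straight to the Google Sheets node's append operation, with each key mapped to the corresponding column.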