by PollupAI
## Who’s it for
This workflow is built for B2B SaaS and CX teams that are drowning in unstructured customer feedback across tools. It’s ideal for Customer Success, Product and Support leaders who want a light “voice of customer engine” without rebuilding their stack: Gmail for interactions, Slack for conversations, Pipedrive for notes and Zendesk for tickets, plus Notion for follow-up tasks.

## How it works / What it does
The workflow runs on a schedule or manual trigger and first sets the CSM’s email address. It then uses an AI “Data agent” to pull recent customer signals from multiple sources: Gmail messages, Slack messages, Pipedrive notes and Zendesk tickets. A “Signals agent” compresses each piece of feedback into a concise, neutral summary, which is then grouped by topic via a “Clustering agent”. Each cluster gets a label, count and examples. Finally, an “Action agent” routes clusters based on their label:
- Create Zendesk tickets for product/performance issues
- Post to a dedicated Slack channel for billing/contract topics
- Create Notion tasks for sales-related feedback
- Send targeted Gmail messages to the CSM for high-risk or engagement-related items

## How to set up
1. Import the workflow into n8n.
2. Connect credentials for Gmail, Slack, Pipedrive, Zendesk, Notion and OpenAI.
3. Update the CSM email in the “Set CSM email” node.
4. Adjust date filters, send-to addresses and Slack channel IDs as needed.
5. Enable the schedule trigger for weekly or daily digests.

## Requirements
- Active accounts & credentials for: Gmail, Slack, Pipedrive, Zendesk and Notion
- OpenAI (or compatible) API key for the LLM node
- At least one Slack channel for posting feedback (e.g. #billing-feedback)

## How to customize the workflow
- Change the time window or filters (sender, channel, query) for each data source.
- Edit the clustering and routing prompts to match your own categories and teams.
- Add new destinations (e.g. Jira, HubSpot) by connecting more tools to the Action agent.
- Modify thresholds (e.g. minimum count) before a cluster triggers an action.
- Localize labels and email copy to your team’s language and tone.
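The Action agent's label-based routing can be sketched in a Code node like this. The labels and destination names here are illustrative assumptions, not the template's exact values:

```javascript
// Hypothetical routing rules mirroring the Action agent's behavior.
// Cluster labels and destination names are assumptions for illustration.
function routeCluster(cluster) {
  const { label } = cluster;
  if (/product|performance/i.test(label)) return 'zendesk'; // create a ticket
  if (/billing|contract/i.test(label)) return 'slack';      // post to #billing-feedback
  if (/sales/i.test(label)) return 'notion';                // create a follow-up task
  if (/risk|engagement/i.test(label)) return 'gmail';       // alert the CSM
  return 'review';                                          // unmatched clusters need a human look
}
```

In the actual template this decision is made by an AI agent with tool access, so the matching is fuzzier than these regexes; the sketch only shows the routing shape.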
by Cheng Siong Chin
## Introduction
Upload invoices via Telegram and receive structured data instantly. Perfect for accountants and finance teams.

## How It Works
The Telegram bot receives invoices, downloads the files, extracts data using OpenAI, then returns the analysis.

## Workflow Template
Telegram Trigger → Document Check → Get File → HTTP Download → AI Extract → Format Response → Send to Telegram

## Workflow Steps
1. **Telegram Trigger**: Listens for uploads.
2. **Document Check**: Validates files; routes errors.
3. **Get File**: Retrieves metadata.
4. **HTTP Download**: Fetches content.
5. **AI Extract**: OpenAI parses invoice fields.
6. **Format Response**: Structures data.
7. **Send Analysis**: Delivers to chat.

## Setup Instructions
1. **Telegram Bot**: Create via BotFather, add credentials.
2. **OpenAI Agent**: Add API key and extraction prompt.
3. **HTTP Node**: Set authentication.
4. **Parser**: Define invoice schema.
5. **Error Handling**: Configure fallbacks.

## Prerequisites
- n8n instance
- Telegram Bot Token
- OpenAI API key

## Customization
- Database storage
- Accounting software integration

## Benefits
- Eliminates manual entry
- Reduces errors
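The "Format Response" step might look roughly like this in a Code node. The field names are assumptions about what the AI extraction returns, not the template's exact schema:

```javascript
// Illustrative shape of the Format Response step: normalize the AI's raw
// JSON into a fixed schema before sending the analysis back to Telegram.
// All field names here are assumptions for illustration.
function formatInvoice(raw) {
  return {
    invoiceNumber: raw.invoice_number ?? 'N/A',
    vendor: raw.vendor ?? 'Unknown',
    date: raw.date ?? null,
    // Render numeric totals with two decimals; pass strings through untouched
    total: typeof raw.total === 'number' ? raw.total.toFixed(2) : raw.total,
    currency: raw.currency ?? 'USD',
  };
}
```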
by Nick Canfield
## Try It Out!
This n8n template uses AI to automatically respond to your Gmail inbox by drafting responses for your approval via email.

## How it works
- **Gmail Trigger** monitors your inbox for new emails
- **AI Analysis** determines if a response is needed based on your criteria
- **Draft Generation** creates contextually appropriate replies using your business information
- **Human Approval** sends you the draft for review before sending
- **Auto-Send** replies automatically once approved

## Setup
1. Connect your Gmail account to the Gmail Trigger node
2. Update the "Your Information" node with:
   - Entity name and description
   - Approval email address
   - Resource guide (FAQs, policies, key info)
   - Response guidelines (tone, style, formatting preferences)
3. Configure your LLM provider (OpenAI, Claude, Gemini, etc.) with API credentials
4. Test with a sample email

## Requirements
- n8n instance (self-hosted or cloud)
- Gmail account with API access
- LLM provider API key

## Need Help?
Email Nick at nick@tropicflare.com
by Ronnie Craig
# Healthcare Email Autoresponder - Daily Outreach 📧
A production-ready n8n workflow for automated healthcare email marketing with AI-powered personalization.

## 🎯 What This Workflow Does
This automated email system sends daily personalized healthcare-themed emails to your contact list. Perfect for:
- Healthcare professionals building patient relationships
- Medical practices maintaining client engagement
- Wellness coaches staying connected with clients
- Health educators sharing daily motivation

## ✨ Key Features
- **AI-Powered Personalization**: Uses OpenAI to customize each email with the recipient's name
- **Smart Rate Limiting**: Random 2-5 minute delays between emails to avoid spam filters
- **Batch Processing**: Limits to 10 emails per run for better deliverability
- **Email Tracking**: Updates Google Sheets to prevent duplicates and track progress
- **Professional Templates**: Healthcare-themed content with customizable signatures
- **Automated Scheduling**: Runs daily at 1 PM (customizable)

## 🛠️ Setup Instructions

### Prerequisites
- n8n instance (cloud or self-hosted)
- Gmail account for sending emails
- Google Sheets for contact management
- OpenAI API key

### Step 1: Import the Workflow
1. Download the Healthcare_Email_Autoresponder_Community_Template.json file
2. In n8n, go to Templates and click "Import from File"
3. Select the downloaded JSON file
4. The workflow will be imported as inactive

### Step 2: Configure Credentials
**Gmail OAuth2 Setup:**
1. Click on the "Send Email" node
2. Create a new Gmail OAuth2 credential
3. Follow n8n's Gmail setup guide
4. Test the connection

**Google Sheets Setup:**
1. Click on the "Healthcare_Contact_List" node
2. Create a new Google Sheets OAuth2 credential
3. Replace YOUR_GOOGLE_SHEET_ID_HERE with your actual sheet ID
4. Ensure your sheet has these columns: First Name, Email, Emailed (for tracking timestamps)

**OpenAI API Setup:**
1. Click on the "OpenAI Chat Model" node
2. Create a new OpenAI credential
3. Add your OpenAI API key
4. Select your preferred model (gpt-4o-mini recommended for cost efficiency)

### Step 3: Customize Your Email Template
1. Click on the "AI Email Generator" node
2. Edit the system message to include your details:
   - Replace [YOUR NAME HERE] with your actual name
   - Replace [YOUR TITLE HERE] with your professional title
   - Replace [YOUR COMPANY HERE] with your company name
   - Replace [YOUR PHONE NUMBER] with your phone number
   - Replace [YOUR EMAIL] with your email address
   - Replace [YOUR WEBSITE] with your website URL

### Step 4: Prepare Your Contact List
Create a Google Sheet with the following structure:

| First Name | Email            | Emailed |
|------------|------------------|---------|
| John       | john@example.com |         |
| Jane       | jane@example.com |         |

**Important Notes:**
- Leave the "Emailed" column empty initially
- The workflow will populate timestamps as emails are sent
- Only contacts with empty "Emailed" cells will receive emails

### Step 5: Test and Activate
1. Test the workflow with a few sample contacts
2. Check that emails are being generated and sent correctly
3. Verify that Google Sheets is being updated with timestamps
4. Once satisfied, activate the workflow

## 📊 Google Sheets Structure
Your contact sheet should include these columns:
- **First Name** (required): Used for personalization
- **Email** (required): Recipient email address
- **Emailed** (required): Timestamp tracking (leave empty initially)

Optional columns you can add: Last Name, Company, Phone, Notes

## ⚙️ Customization Options

### Change Email Frequency
- Edit the "Daily Trigger (1 PM)" node
- Modify the schedule (hourly, daily, weekly)
- Set preferred time zones

### Adjust Batch Size
- Edit the "Limit to 10 Contacts" node
- Change the maxItems value (recommend staying under 50)

### Modify Wait Times
- Edit the "Random Wait (2-5min)" node
- Adjust the random delay formula
- Current: `{{ Math.floor(Math.random() * 4) + 2 }}` (2-5 minutes)

### Update Email Content
- Edit the system message in the "AI Email Generator" node
- Change the joke, signature, or entire email structure
- Add seasonal content or special promotions

## 🔧 Troubleshooting
**Common Issues:**
- **Emails not sending**: Verify Gmail credentials are active; check email quota limits; ensure recipient emails are valid
- **Google Sheets not updating**: Confirm the sheet ID is correct; check that column names match exactly; verify Google Sheets credentials
- **AI not generating content**: Validate the OpenAI API key; check API quota and billing; test with a different model if needed
- **Rate limiting issues**: Increase wait times between emails; reduce batch size; check Gmail sending limits

## 📈 Best Practices
1. **Start Small**: Begin with 5-10 contacts to test deliverability
2. **Monitor Metrics**: Track open rates and responses
3. **Respect Privacy**: Include unsubscribe options
4. **Stay Relevant**: Update content regularly
5. **Follow Regulations**: Comply with CAN-SPAM and GDPR

## 🤝 Contributing to the Community
This template is designed to be:
- **Easy to understand**: Clear node names and documentation
- **Production ready**: Includes error handling and rate limiting
- **Customizable**: Template placeholders for personalization
- **Well documented**: Comprehensive setup instructions

Feel free to adapt this workflow for your specific healthcare niche!

## 📄 License
This workflow template is provided free to the n8n community under the MIT License.

## 🆘 Support
For questions or issues:
- Check the n8n community forum
- Review n8n's official documentation
- Test each node individually to isolate problems

Made with ❤️ for the n8n community
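For reference, the template's delay expression `Math.floor(Math.random() * 4) + 2` produces an integer number of minutes from 2 to 5 inclusive (`Math.random()` returns a value in [0, 1), so the floored product is 0-3):

```javascript
// The Random Wait node's delay formula: a whole number of minutes, 2..5.
function randomWaitMinutes() {
  return Math.floor(Math.random() * 4) + 2;
}
```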
by GYEONGJUN CHAE
# Get top Binance Earn yields sent to Email
This workflow automates the tracking of passive income opportunities on Binance by fetching real-time Flexible Earn APY rates, calculating potential returns, and delivering a daily summary to your inbox.

Manually checking crypto savings rates is tedious. This template handles the complex authentication required by Binance (HMAC-SHA256 signing), filters through hundreds of assets to find the highest yields, and calculates exactly how much you would earn daily on a simulated capital amount.

## 👥 Who is this for?
- **Crypto Investors** looking to maximize passive income on idle assets.
- **DeFi/CeFi Analysts** monitoring market rate fluctuations.
- **Hodlers** who want to ensure their assets are generating the best possible yield without daily manual checks.

## ⚙️ How it works
1. **Schedule Trigger**: Initiates the workflow every 24 hours (customizable).
2. **Authentication Handler**: Generates a timestamp and creates a secure HMAC-SHA256 signature required by the Binance API.
3. **Data Fetching**: A dedicated HTTP Request retrieves the full list of "Flexible Earn" products.
4. **Processing**: The workflow splits the data, standardizes field names, and sorts all assets by Annual Percentage Rate (APR) in descending order.
5. **Analysis**: It limits the list to the top 15 assets and calculates estimated daily earnings based on a $10,000 reference capital (customizable via code).
6. **Reporting**: A formatted HTML email is sent via Gmail with the top opportunities.

## 🛠️ Setup Guide
1. **Binance API**: Log in to Binance and create a Read-Only API Key.
2. **Configure Nodes**: Open the Set Credentials 1 node and paste your API_KEY and SECRET_KEY in each field.
3. **Email Setup**: Open the Send Email node, select or connect your Gmail credentials, and update the "To" address to your email.

## 🔌 Requirements
- An active Binance account.
- Binance API Key & Secret (Read-Only permissions are sufficient).
- A Gmail account (or you can swap the Gmail node for Outlook/Slack/Telegram).
## 🎨 How to customize
- **Change Capital Calculation**: Open the "Filter & Analyze" Code node and change `const capital = 10000;` to your actual portfolio size to see real projected earnings.
- **Filter Assets**: Add an If node after the "Split Out" node to filter for specific assets only (e.g. USDT, BTC, ETH) if you don't want to see altcoins.
- **Change Frequency**: Open the "Daily Trigger" node to run this hourly or weekly.
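The earnings estimate in the "Filter & Analyze" node boils down to simple arithmetic. A minimal sketch, assuming a simple (non-compounding) daily interest approximation:

```javascript
// Estimated daily earnings on a reference capital at a given APR percentage,
// assuming simple interest spread evenly over 365 days.
function dailyEarnings(capital, aprPercent) {
  return (capital * (aprPercent / 100)) / 365;
}

// e.g. $10,000 at 7.3% APR comes out to about $2 per day.
```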
by Florent
# Restore workflows & credentials from FTP - Remote Backup Solution
This n8n template provides a safe and intelligent restore solution for self-hosted n8n instances, allowing you to restore workflows and credentials from FTP remote backups. Perfect for disaster recovery or migrating between environments, this workflow automatically identifies your most recent FTP backup and provides a manual restore capability that intelligently excludes the current workflow to prevent conflicts. Works seamlessly with date-organized backup folders stored on any FTP/SFTP server.

## Good to know
- This workflow uses n8n's native import commands (`n8n import:workflow` and `n8n import:credentials`)
- Works with date-formatted backup folders (YYYY-MM-DD) stored on FTP servers
- The restore process intelligently excludes the current workflow to prevent overwriting itself
- Requires FTP/SFTP server access and proper Docker volume configuration
- All downloaded files are temporarily stored server-side before import
- Compatible with backups created by n8n's export commands and uploaded to FTP
- Supports selective restoration: restore only credentials, only workflows, or both

## How it works

### Restore Process (Manual)
1. Manual trigger with configurable pinned data options (credentials: true/false, worflows: true/false)
2. The Init node sets up all necessary paths, timestamps, and configuration variables using your environment settings
3. The workflow connects to your FTP server and scans for available backup dates
4. Automatically identifies the most recent backup folder (latest YYYY-MM-DD date)
5. Creates temporary restore folders on your local server for downloaded files

If restoring credentials:
- Lists all credential files from the FTP backup folder
- Downloads credential files to a temporary local folder
- Writes files to disk using the "Read/Write Files from Disk" node
- Imports directly using n8n's import command
- Credentials are imported with their encrypted format intact

If restoring workflows:
- Lists all workflow JSON files from the FTP backup folder
- Downloads workflow files to a temporary local folder
- Filters out the credentials subfolder to prevent importing it as a workflow
- Writes workflow files to disk
- Intelligently excludes the current restore workflow to prevent conflicts
- Imports all other workflows using n8n's import command

Optional email notifications provide detailed restore summaries with command outputs. Temporary files remain on the server for verification (manual cleanup recommended).

## How to use

### Prerequisites
- Existing n8n backups on an FTP server in a date-organized folder structure (format: /ftp-backup-folder/YYYY-MM-DD/)
- Workflow backups as JSON files in the date folder
- Credentials backups in subfolder: /ftp-backup-folder/YYYY-MM-DD/n8n-credentials/
- FTP/SFTP access credentials configured in n8n
- For new environments: N8N_ENCRYPTION_KEY from the source environment (see dedicated section below)

### Initial Setup
1. Configure your environment variables:
   - N8N_ADMIN_EMAIL: Your email for notifications (optional)
   - FTP_BACKUP_FOLDER: FTP path where backups are stored (e.g. /n8n-backups)
   - N8N_PROJECTS_DIR: Projects root directory (e.g. /files/n8n-projects-data)
   - GENERIC_TIMEZONE: Your local timezone (e.g. Europe/Paris)
   - N8N_ENCRYPTION_KEY: Required if restoring credentials to a new environment (see dedicated section below)
2. Create your FTP credential in n8n:
   - Add a new FTP/SFTP credential
   - Configure host, port, username, and password/key
   - Test the connection
3. Update the Init node:
   - (Optional) Configure your email here: `const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';`
   - Set PROJECT_FOLDER_NAME to "Workflow-backups" (or your preferred name)
   - Set FTP_BACKUP_FOLDER to match your FTP backup path (default: /n8n-backups)
   - Set credentials to "n8n-credentials" (or your backup credentials folder name)
   - Set FTPName to a descriptive name for your FTP server (used in notifications)
4. Configure FTP credentials in nodes:
   - Update the FTP credential in the "List Credentials Folders" node
   - Verify all FTP nodes use the same credential
   - Test the connection by executing the "List Credentials Folders" node
5. Optional: Configure SMTP for email notifications:
   - Add an SMTP credential in n8n
   - Activate the "SUCCESS email Credentials" and "SUCCESS email Workflows" nodes
   - Or remove the email nodes if not needed

### Performing a Restore
1. Open the workflow and locate the "Start Restore" manual trigger node
2. Edit the pinned data to choose what to restore:

   ```json
   {
     "credentials": true,
     "worflows": true
   }
   ```

   - credentials: true - Restore credentials from FTP
   - worflows: true - Restore workflows from FTP (note: the typo is preserved from the original workflow)
   - Set both to true to restore everything
3. Update the node's notes to reflect your choice (for documentation)
4. Click "Execute workflow" on the "Start Restore" node
5. The workflow will:
   - Connect to FTP and find the most recent backup
   - Download selected files to temporary local folders
   - Import credentials and/or workflows
   - Send a success email with detailed operation logs
6. Check the console logs or email for a detailed restore summary

### Important Notes
- The workflow automatically excludes itself during restore to prevent conflicts
- Credentials are restored with their encryption intact. If restoring to a new environment, you must configure the N8N_ENCRYPTION_KEY from the source environment (see dedicated section below)
- Existing workflows/credentials with the same names will be overwritten
- Temporary folders are created with a date prefix (e.g. 2025-01-15-restore-credentials)
- Test in a non-production environment first if unsure

## Critical: N8N_ENCRYPTION_KEY Configuration
**Why this is critical:** n8n generates an encryption key automatically on first launch and saves it in the ~/.n8n/config file. However, if this file is lost (for example, due to missing Docker volume persistence), n8n will generate a NEW key, making all previously encrypted credentials inaccessible.
**When you need to configure N8N_ENCRYPTION_KEY:**
- Restoring to a new n8n instance
- When your data directory is not persisted between container recreations
- Migrating from one server to another
- As a best practice to ensure key persistence across updates

**How credentials encryption works:**
- Credentials are encrypted with a specific key unique to each n8n instance
- This key is auto-generated on first launch and stored in /home/node/.n8n/config
- When you back up credentials, they remain encrypted but the key is NOT included
- If the key file is lost or a new key is generated, restored credentials cannot be decrypted
- Setting N8N_ENCRYPTION_KEY explicitly ensures the key remains consistent

**Solution: Retrieve and configure the encryption key**

Step 1: Get the key from your source environment

```shell
# Check if the key is defined in environment variables
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY
```

If this command returns nothing, the key is auto-generated and stored in n8n's data volume:

```shell
# Enter the container
docker-compose exec n8n sh
# Check the configuration file
cat /home/node/.n8n/config
# Exit the container
exit
```

Step 2: Configure the key in your target environment

Option A: Using a .env file (recommended for security)

```shell
# Add to your .env file
N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Then reference it in docker-compose.yml:

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
```

Option B: Directly in docker-compose.yml (less secure)

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Step 3: Restart n8n

```shell
docker-compose restart n8n
```

Step 4: Now restore your credentials

Only after configuring the encryption key, run the restore workflow with credentials: true.
**Best practice for future backups:**
- Always save your N8N_ENCRYPTION_KEY in a secure location alongside your backups
- Consider storing it in a password manager or secure vault
- Document it in your disaster recovery procedures

## Requirements

**FTP Server**
- FTP or SFTP server with existing n8n backups
- Read access to the backup folder structure
- Network connectivity from the n8n instance to the FTP server

**Existing Backups on FTP**
- Date-organized backup folders (YYYY-MM-DD format)
- Backup files created by n8n's export commands or a compatible format
- Credentials in subfolder structure: YYYY-MM-DD/n8n-credentials/

**Environment**
- Self-hosted n8n instance (Docker recommended)
- Docker volumes mounted with write access to the project folder
- Access to n8n CLI commands (`n8n import:credentials` and `n8n import:workflow`)
- Proper file system permissions for temporary folder creation

**Credentials**
- FTP/SFTP credential configured in n8n
- Optional: SMTP credentials for email notifications

## Technical Notes

**FTP Connection and Download Process**
- Uses n8n's built-in FTP node for all remote operations
- Supports both FTP and SFTP protocols
- Downloads files as binary data before writing to disk
- Temporary local storage is required for the import process

**Smart Workflow Exclusion**
- During workflow restore, the current workflow's name is cleaned and matched against backup files
- This prevents the restore workflow from overwriting itself
- The exclusion logic handles special characters and spaces in workflow names
- A bash command removes the current workflow from the temporary restore folder before import

**Credentials Subfolder Filtering**
- The "Filter out Credentials sub-folder" node checks for binary data presence
- Only items with binary data (actual files) proceed to the disk write
- Prevents the credentials subfolder from being imported as a workflow

**Timezone Handling**
- All timestamps use UTC for technical operations
- Display times use the local timezone for user-friendly readability
- FTP backup folder scanning works with the YYYY-MM-DD format regardless of timezone
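The Smart Workflow Exclusion step could look roughly like the following. The folder path, workflow name, and name-cleaning rule are illustrative assumptions, not the template's exact command:

```shell
# Hypothetical sketch: clean the current workflow's name, delete its file
# from the temporary restore folder, then import everything that remains.
RESTORE_DIR="/tmp/2025-01-15-restore-workflows"
WORKFLOW_NAME="Restore workflows & credentials from FTP"

# Keep only letters, digits, spaces, underscores and hyphens (assumed cleaning rule)
SAFE_NAME=$(printf '%s' "$WORKFLOW_NAME" | tr -cd '[:alnum:] _-')

# Remove the current workflow so the import cannot overwrite it
rm -f "$RESTORE_DIR/${SAFE_NAME}.json"

# Import the remaining workflow JSON files (only if the n8n CLI is available)
if command -v n8n >/dev/null 2>&1; then
  n8n import:workflow --separate --input="$RESTORE_DIR"
fi
```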
## Security
- FTP connections should use SFTP or FTPS for encrypted transmission
- Credentials are imported in n8n's encrypted format (encryption preserved)
- Temporary files are stored in project-specific folders
- Consider access controls for who can trigger restore operations
- No sensitive credential data is logged in console output

## Troubleshooting

**Common Issues**
- **FTP connection fails**: Verify FTP credentials are correctly configured and the server is accessible
- **No backups found**: Ensure the FTP_BACKUP_FOLDER path is correct and contains date-formatted folders (YYYY-MM-DD)
- **Permission errors**: Ensure the Docker user has write access to N8N_PROJECTS_DIR for temporary folders
- **Path not found**: Verify all volume mounts in docker-compose.yml match your project folder location
- **Import fails**: Check that backup files are in valid n8n export format
- **Download errors**: Verify the FTP path structure matches the expected format (date folder / credentials subfolder / files)
- **Workflow conflicts**: The workflow automatically excludes itself, but ensure backup files are properly named
- **Credentials not restored**: Verify the FTP backup contains an n8n-credentials subfolder with credential files
- **Credentials decrypt error**: Ensure N8N_ENCRYPTION_KEY matches the source environment

**Error Handling**
- The "Find Last Backup" node has an error output configured to catch FTP listing issues
- The "Download Workflow Files" node continues on error to handle the presence of the credentials subfolder
- All critical nodes log detailed error information to the console
- Email notifications include stdout and stderr from the import commands

**Version Compatibility**
- Tested with n8n version 1.113.3
- Compatible with Docker-based n8n installations
- Requires n8n CLI access (available in official Docker images)
- Works with any FTP/SFTP server (Synology NAS, dedicated FTP servers, cloud FTP services)

This workflow is designed for FTP/SFTP remote backup restoration. For local disk backups, see the companion workflow "n8n Restore from Disk".
Works best with backups from: "Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP"
by Jitesh Dugar
Streamline your manufacturing quality control process with automated inspection tracking, compliance documentation, and real-time alerts. This workflow eliminates manual QC paperwork while ensuring ISO compliance and instant visibility into product quality.

## 🎯 Use Case
Perfect for manufacturing facilities that need to:
- Document quality inspections for compliance audits
- Track product defects and non-conformities
- Generate certificates of compliance automatically
- Alert teams instantly when products fail inspection
- Maintain ISO 9001:2015 documentation requirements

## ✨ Key Features

**Automated Data Collection**
- Accepts inspection data from web forms (Jotform) or Google Sheets
- Processes measurements against predefined specifications
- Calculates PASS/FAIL status automatically

**Smart Documentation**
- Stores all inspection records in Google Drive
- Maintains a searchable tracking spreadsheet
- Generates HTML compliance certificates
- Creates an audit-ready documentation trail

**Real-Time Alerts**
- Instant Slack notifications for failed inspections
- Detailed non-conformity reporting
- Escalation to quality managers

**Daily Analytics**
- Automated daily quality summaries at 8 AM
- Pass rate calculations and trend analysis
- Product and inspector performance metrics

## 🔧 Setup Requirements
- Google Workspace - For Sheets and Drive storage
- Slack - For team notifications
- Jotform (optional) - For web-based inspection forms
- Email (SMTP) - For sending compliance certificates

## 📝 Customization Tips
- Modify specifications in the "Process Inspection Data" node to match your products
- Add custom fields for industry-specific requirements
- Adjust alert thresholds and notification channels
- Extend certificate templates with your company branding

## 🏭 Industries
Ideal for: Electronics, Automotive Parts, Medical Devices, Consumer Goods, Food & Beverage, Aerospace Components

## 💡 Example Scenario
An electronics manufacturer uses this workflow to inspect PCB assemblies. When an inspector submits measurements via Jotform, the workflow automatically checks whether dimensions and weight meet specifications, stores the report, and generates a certificate. If any board fails, the quality manager receives an immediate Slack alert with details.

- Time Saved: ~2 hours daily on documentation and reporting
- Error Reduction: 90% fewer data entry mistakes
- Compliance: 100% audit-ready documentation
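The PASS/FAIL check in the "Process Inspection Data" node can be sketched like this. The spec fields and tolerance values are illustrative assumptions, not the template's actual specifications:

```javascript
// Hypothetical spec table: each measured field gets min/max limits.
const specs = {
  width_mm: { min: 49.8, max: 50.2 },
  weight_g: { min: 9.5, max: 10.5 },
};

// Compare measurements against the specs and derive PASS/FAIL plus the
// list of non-conforming fields. (Missing fields pass in this sketch.)
function inspect(measurements) {
  const failures = Object.entries(specs)
    .filter(([field, { min, max }]) =>
      measurements[field] < min || measurements[field] > max)
    .map(([field]) => field);
  return { status: failures.length ? 'FAIL' : 'PASS', failures };
}
```

The `failures` array is what would feed the non-conformity details in the Slack alert.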
by Avkash Kakdiya
## How it works
This workflow acts as an instant SDR that replies to new inbound leads across multiple channels in real time. It first captures and normalizes all incoming lead data into a unified structure. The workflow then evaluates IST working days and hours, generates a context-aware AI response, and routes the reply to the correct channel. Finally, it logs the full interaction, response status, and timing into Google Sheets.

## Step-by-step

**Step 1: Lead intake & normalization**
- Incomming Lead whatsapp1 – Receives new WhatsApp lead messages via webhook.
- Incomming Lead facebook1 – Captures incoming Facebook lead messages.
- Incomming Lead instagram1 – Listens for Instagram lead messages.
- Incomming Lead linkdin1 – Captures LinkedIn lead messages.
- Incomming Lead Website1 – Receives website form submissions.
- Normalize Lead Data6 – Normalizes WhatsApp lead fields.
- Normalize Lead Data7 – Normalizes Facebook lead fields.
- Normalize Lead Data8 – Normalizes Instagram lead fields.
- Normalize Lead Data9 – Normalizes LinkedIn lead fields.
- Normalize Lead Data5 – Normalizes website lead data.
- Switch2 – Merges all normalized leads into a single processing path.

**Step 2: Working hours & AI response**
- Extract Day and Hours1 – Converts timestamps to IST and extracts day and time.
- Is Working Day and Working Hour?1 – Determines whether the lead arrived during business hours.
- Code in JavaScript3 – Builds the AI prompt using lead details and timing context.
- Get Ai Response1 – Generates a short, human-like response.

**Step 3: Send reply & log data**
- Code in JavaScript4 – Combines AI output with normalized lead data.
- Switch3 – Routes the response based on the source channel.
- Send message – Sends WhatsApp replies.
- Send Instagram Message1 – Sends Instagram responses.
- Send Facebook Messages1 – Sends Facebook replies.
- Send Linkdin Messages1 – Sends LinkedIn responses.
- Send a message1 – Sends email replies for website leads.
- Code in JavaScript5 – Finalizes response status and metadata.
- google-sheet-name – Appends or updates lead and response data in Google Sheets.

## Why use this?
- Replies instantly to leads across all major inbound channels
- Keeps all lead data standardized and easy to manage
- Automatically respects IST working days and hours
- Reduces manual SDR workload without losing response quality
- Maintains a complete response log for reporting and follow-up
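The IST working-hours check can be sketched as below. The Monday-Friday, 09:00-18:00 window is an assumption; adjust it to the template's configured business hours:

```javascript
// Shift a UTC timestamp to IST (UTC+5:30) and check whether it falls in an
// assumed Mon-Fri, 09:00-18:00 business window.
function isWorkingTime(utcDate) {
  const ist = new Date(utcDate.getTime() + 5.5 * 60 * 60 * 1000); // add 5h30m
  const day = ist.getUTCDay();   // 0 = Sunday ... 6 = Saturday
  const hour = ist.getUTCHours();
  return day >= 1 && day <= 5 && hour >= 9 && hour < 18;
}
```

Shifting the timestamp and then reading the UTC fields avoids depending on the server's local timezone, which is useful when n8n runs in a container pinned to UTC.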
by Abdullah Alshiekh
## What Problem Does It Solve?
SEO professionals and marketers spend hours manually searching keywords to analyze competitor content. Copying and pasting SERP results into spreadsheets is tedious and prone to formatting errors. Analyzing *why* a page ranks requires significant mental effort and time for every single keyword.

This workflow solves these problems by:
- Automatically fetching live Google search results for a list of keywords.
- Using AI to instantly analyze the top ranking pages for Intent, Strengths, and Weaknesses.
- Delivering a consolidated, strategic SEO report directly to your email inbox.

## How to Configure It
1. **API Setup**:
   - Connect your Decodo credentials (for scraping Google results).
   - Connect your Google Gemini credentials (for the AI analysis).
   - Connect your Gmail account (to send the final report).
2. **Keyword Input**: Open the "Edit Fields" node and replace the placeholder items (keyword_1, etc.) with the actual search terms you want to track.
3. **Email Recipient**: Update the "Send a message" node with your email address.

## How It Works
1. The workflow triggers manually (or can be scheduled).
2. It loops through your defined list of keywords one by one.
3. Decodo performs a real-time Google search for each term and extracts organic results.
4. A JavaScript node cleans the data, removing ads and irrelevant snippets.
5. The AI Agent acts as an expert SEO analyst, processing the top results to generate a concise audit.
6. Finally, the workflow compiles all insights into a single email report and sends it to you.

## Customization Ideas
- **Change the output**: Save the analysis to a Google Sheet or Notion database instead of email.
- **Adjust the AI Persona**: Modify the system prompt to focus on specific metrics (e.g. content gaps or backlink opportunities).
- **Automate the Input**: Connect a Google Sheet to dynamically pull new keywords every week.
- **Schedule It**: Replace the Manual Trigger with a Cron node to run this report automatically every Monday morning.

## If you need any help
Get in Touch
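The cleanup step (step 4 above) might look roughly like this. The field names are assumptions about the scraper's output shape, not Decodo's actual response format:

```javascript
// Illustrative cleanup: keep only organic results with a title and URL,
// drop ads, and reduce each result to the fields the AI analyst needs.
// Field names (is_ad, position, snippet, ...) are assumptions.
function cleanResults(results) {
  return results
    .filter(r => !r.is_ad && r.url && r.title)
    .map(r => ({
      position: r.position,
      title: r.title,
      url: r.url,
      snippet: r.snippet || '',
    }));
}
```

Trimming the payload like this keeps the AI prompt focused and reduces token usage per keyword.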
by Kirill Khatkevich
This workflow continuously monitors the Meta Ads Library for new creatives from a specific competitor pages, logs them into Google Sheets, and sends a concise Telegram notification with the number of newly discovered ads. It is built as a safe, idempotent loop that can run on a schedule without creating duplicates in your sheet. Use Case Manually checking the Meta Ads Library for competitor creatives is time‑consuming, and it’s easy to lose track of which ads you’ve already seen. This workflow is ideal if you want to: Track competitor creatives over time** in a structured Google Sheet. Avoid duplicates** by matching ads via their unique id field. Get lightweight notifications* in Telegram that tell you *how many new ads appeared, without spamming you with full ad lists. Run the process on autopilot** (daily, weekly, etc.) with a single schedule. How it Works The workflow is organized into three logical blocks: 1. Fetch Ads & Handle Pagination Configuration:** The Add parameters Set node stores all key request variables: ad_active_status (e.g. active), search_page_ids (competitor page IDs), ad_reached_countries, access_token. Routing:** Page or keywords routes execution into one of two HTTP Request nodes: Facebook Ads API by page — the main branch that queries ads by page ID. Facebook Ads API by keywords — an optional branch for keyword‑based searches. Normalization:** Facebook Ads API by ... returns the raw ads_archive response. Check the pagination then: extracts data (array of ad objects) into a dedicated field, reads paging.next into next_url for pagination. Pagination Loop:** If checks whether next_url is not empty. Set Next URL assigns next_url to a generic url field. Facebook Ads API pagination requests the next page and feeds it back into Check the pagination. This loop continues until there is no next_url, ensuring all pages of the Ads Library response are processed. 2. 
### 2. De-duplicate Ads & Log to Google Sheets

- **Load existing IDs:** Read existing IDs pulls the existing `id` column from your Google Sheet (configured to read a specific column/range). Collect ID list converts these into a unique, normalized string array `existingIds`, which represents all ads you have already logged.
- **Attach state:** Attach existing ids (a Merge node) combines, for each execution, the freshly fetched Meta response (`data`) with the historical `existingIds` array from Sheets.
- **Filter new creatives:** The Filter new creatives Code node compares each ad's `id` (as a string) against the `existingIds` set and builds a new `data` array containing only ads that are not yet present in the sheet. It also protects against duplicates inside the same batch by tracking seen IDs in a local Set.
- **Write new ads:** Split Out expands the filtered `data` array into individual items (one item per new ad). Add to sheet then performs an appendOrUpdate into Google Sheets, mapping core fields such as `id`, `ad_creation_time`, `page_name`, `ad_creative_bodies`, `ad_snapshot_url`, `languages`, `publisher_platforms`, and link fields. The column mapping uses `id` as the matching column so that existing rows can be updated if needed.

### 3. Count New Ads & Notify in Telegram

- **Count:** In parallel with the write step, Split Out also feeds into Count new ads. This Code node returns a single summary item with `newCount = items.length`, i.e. the total number of new creatives processed in this run.
- **Guard:** Any new ads? checks whether `newCount` is greater than 0. If not, the workflow ends silently and no message is sent, avoiding noise.
- **Notify:** When there are new creatives, Send a text message sends a Telegram message to the configured `chatId`. The message includes `{{$json.newCount}}` and a fixed link to the Google Sheet, giving you a quick heads-up without listing individual ads.

## Setup Instructions

To use this template, configure the following components.
### 1. Credentials

- **Meta Ads / HTTP Header Auth:** Configure the Meta Ads HTTP Header credentials used by Facebook Ads API by page, Facebook Ads API by keywords, and Facebook Ads API pagination.
- **Google Sheets:** Connect your Google account in Read existing IDs and Add to sheet.
- **Telegram:** Connect your Telegram account credentials in Send a text message.

### 2. The Add parameters Node

Open the Add parameters Set node and customize:

- `ad_active_status`: Which ads to monitor (`active`, `all`, etc.).
- `search_page_ids`: The numeric ID of the competitor Facebook Page you want to track.
- `ad_reached_countries`: Comma-separated list of country codes (e.g. `US, CA`).
- `access_token`: A valid long-lived access token with permission to query the Ads Library.

### 3. Google Sheets Configuration

- **Read existing IDs:** Set `documentId` and `sheetName` to your tracking spreadsheet and sheet (e.g. an ads tab). Configure the range to read only the column holding the ad `id` values.
- **Add to sheet:** Point `documentId` and `sheetName` to the same spreadsheet/sheet. Make sure your sheet has the columns expected by the node (e.g. id, creation time, page, title, description, delivery_start_time, snapshot, languages, platforms, link). Confirm that `id` is included in `matchingColumns` so de-duplication works correctly.

### 4. Telegram Configuration

In Send a text message, set:

- `chatId`: Your target Telegram chat or channel ID.
- `text`: Customize the message template as needed, but keep `{{$json.newCount}}` to show the number of new creatives.

### 5. Schedule

Open Schedule Trigger and configure when you want the workflow to run (e.g. every morning). Save and activate the workflow.

## Further Ideas & Customization

This workflow is a solid foundation for systematic competitor monitoring. You can extend it to:

- **Track multiple competitors** by turning `search_page_ids` into a list and iterating over it with a loop or separate executions.
- **Enrich the log with performance data** by creating a second workflow that reads the sheet, pulls spend/impressions/CTR for each logged `ad_id` from Meta, and merges the metrics back.
- **Add more notification channels** such as Slack or email, or send a weekly summary that aggregates new ads by page, format, or country.
- **Tag or categorize creatives** (e.g. "video vs image", "country", "language") directly in the sheet to make later analysis easier.
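For reference, the de-duplication logic of the Filter new creatives Code node described above can be sketched roughly as follows (input/output shapes are simplified relative to real n8n items):

```javascript
// Compare each fetched ad's id (as a string) against the existingIds loaded
// from the sheet, and guard against duplicates inside the same batch with a
// local Set — exactly the two checks the workflow description outlines.
function filterNewCreatives(ads, existingIds) {
  const known = new Set(existingIds.map(String)); // ids already in the sheet
  const seen = new Set();                         // ids seen in this batch
  const fresh = [];
  for (const ad of ads) {
    const id = String(ad.id);
    if (known.has(id) || seen.has(id)) continue;  // skip anything not new
    seen.add(id);
    fresh.push(ad);
  }
  return fresh;
}
```

The Count new ads node then reports `fresh.length` as `newCount`, which Any new ads? checks before the Telegram message is sent.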
by Deepak Singh
## How it works

This workflow automatically generates a daily Indian marketing & advertising newsletter. It fetches articles from Campaign India and Economic Times BrandEquity feeds, merges them, and evaluates each story using an AI relevance filter. Only meaningful updates, such as brand launches, marketing campaigns, and changes to digital media, are retained.

Relevant stories are stored in an n8n Data Table and later used to build a clean HTML newsletter. Every day at 7:30 PM IST, the workflow composes the email and sends it via Gmail, with an optional SMTP fallback if Gmail fails. After sending, processed entries are removed to keep the next day's digest fresh.

## Set up steps

1. Add your Gmail and (optional) SMTP credentials.
2. Update the recipient email inside the Gmail/SMTP nodes.
3. Confirm the Data Table exists or let n8n create it automatically.
4. Adjust the schedule timing if you want the newsletter at a different time.
5. Add or remove RSS feeds as needed (inside the brown "RSS Fetching Block").

(Full explanations for each block are included as sticky notes inside the workflow.)

If you need help or want custom automations: deepakbiz@outlook.com
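The newsletter-composition step — turning stored Data Table rows into an HTML digest — might look like the minimal sketch below. The column names (`title`, `link`, `summary`) are assumptions for illustration; match them to your actual Data Table schema.

```javascript
// Minimal sketch of building an HTML digest from stored stories.
// Field names are hypothetical — adapt them to the Data Table columns.
function buildNewsletterHtml(stories, dateLabel) {
  const items = stories
    .map((s) => `<li><a href="${s.link}">${s.title}</a><br><small>${s.summary}</small></li>`)
    .join('\n');
  return [
    `<h2>Indian Marketing &amp; Advertising Digest — ${dateLabel}</h2>`,
    '<ul>',
    items,
    '</ul>',
  ].join('\n');
}
```

The resulting string would be passed to the Gmail node (or the SMTP fallback) as the message's HTML body.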
by Panth1823
## Automate Personalized HR Email Outreach with Rate Limiting

This workflow streamlines HR outreach by fetching contact data, validating emails, enforcing daily sending limits, and sending personalized emails with attachments, all while logging activity.

## How it works

1. Read HR contact data from Google Sheets.
2. Remove duplicates and validate email formats.
3. Apply dynamic daily email sending limits.
4. Generate personalized email content.
5. Download resumes for attachments.
6. Send emails via Gmail with attachments.
7. Log sending status (success/failure) to Google Sheets.

## Setup

1. Configure Google Sheets credentials.
2. Configure Gmail OAuth2 credentials.
3. Update the 'Google Sheets - Read HR Data' node with your document and sheet IDs.
4. Define the email content in the 'Email Creator' node.
5. Set the 'Download Resume' URL to your resume repository.
6. Update 'Log to Google Sheets' with your tracking sheet IDs.

## Customization

Adjust the 'Rate Limiter' node's RAMP_START and LIMIT_BY_WEEK variables to match your desired sending schedule and volume.
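One plausible reading of the ramp-up rate limiting is sketched below. `RAMP_START` and `LIMIT_BY_WEEK` are the variable names the template exposes, but the formula shown here is purely an illustration of a week-based ramp — the node's actual logic may differ.

```javascript
// Hedged sketch of a week-based ramp-up daily sending limit.
// RAMP_START and LIMIT_BY_WEEK come from the template; values and formula
// here are illustrative assumptions only.
const RAMP_START = new Date('2024-01-01');  // example: when sending began
const LIMIT_BY_WEEK = [10, 20, 40, 50];     // daily cap for weeks 1, 2, 3, 4+

function dailyLimit(today) {
  const msPerWeek = 7 * 24 * 60 * 60 * 1000;
  const week = Math.floor((today - RAMP_START) / msPerWeek);
  if (week < 0) return 0;                   // before the ramp starts: send nothing
  // clamp to the final week's cap once the ramp is complete
  return LIMIT_BY_WEEK[Math.min(week, LIMIT_BY_WEEK.length - 1)];
}
```

In the workflow, a limit like this would gate how many contacts pass through to the Gmail node on any given day, with the remainder deferred to later runs.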