by Guy
**General principles**

This template automates marketing campaigns from an Airtable-based CRM, using Brevo to send and track emails. It is built around two main steps:

- Sending emails to targeted companies in Airtable and creating an interaction record for each sent email.
- Real-time tracking of events (email delivered / opened / clicked, unsubscribe request) via an n8n webhook, updating the corresponding interaction or company record.

This workflow provides precise tracking of marketing actions and keeps a history of interactions with prospects and clients.

**Prerequisites**

- Airtable with at least 3 tables:
  - Company: contains information about your clients or prospects.
  - Interaction: log of exchanges and events.
  - Campaign: information about the ongoing campaign.
- Brevo:
  - A predefined email template, which can include a link to a form.
  - API access for sending and tracking emails.

**Step-by-step description**

**Part 1: Sending Emails**

1. **Selecting targeted companies**: The workflow queries the Company table in Airtable, filtering on a tag or a specific field related to the campaign (e.g., Campaign = "1").
2. **Preparing data for Brevo**: For each company, the workflow retrieves the necessary information, such as the email address.
3. **Sending the email via Brevo**: A Brevo Send Template node sends a predefined template. Brevo returns a unique identifier for each email sent.
4. **Creating an interaction in Airtable**: A record is added to the Interaction table containing the Company, the date and time of the interaction, Media = email, and the Brevo email identifier.

**Part 2: Tracking Events**

1. **Receiving Brevo events**: Brevo triggers a webhook to n8n for each event: Delivered, Opened, Clicked, Unsubscribed.
2. **Matching with the interaction**: Based on the Brevo identifier received, the workflow finds the corresponding Interaction in Airtable.
3. **Updating the interaction**: The interaction status is updated based on the event (e.g., "Opened" with date and time).
4. **Managing unsubscribes**: In case of an unsubscribe, the workflow finds the Company associated via the interaction, and a specific field in the Company table (e.g., Opt-in) is set to "No" to exclude this company from future campaigns.

**Benefits of this template**

- Automated sending and real-time tracking.
- Interaction history stored in the CRM.
- GDPR (European data protection regulation) compliance through automatic unsubscribe management.
- Clear view of campaign effectiveness (open rates, clicks, etc.).
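For illustration, here is a minimal sketch (TypeScript, as it might appear in an n8n Code node) of how Part 2 could map an incoming Brevo webhook event onto the fields updated on the matching Interaction record. The payload field names (`event`, `message-id`) follow Brevo's webhook documentation, while the Airtable column names (`Status`, `Last event at`) are assumptions you would adapt to your own base:

```typescript
// Hypothetical shape of a Brevo transactional webhook payload.
// Verify the field names against a test payload from your own webhook.
interface BrevoWebhookEvent {
  event: string;        // e.g. "delivered", "opened", "click", "unsubscribed"
  "message-id": string; // identifier returned by Brevo when the email was sent
  email: string;
  date: string;
}

// Map a webhook event to the fields to update on the Interaction record.
function interactionUpdateFor(evt: BrevoWebhookEvent): Record<string, string> {
  const statusByEvent: Record<string, string> = {
    delivered: "Delivered",
    opened: "Opened",
    unique_opened: "Opened",
    click: "Clicked",
    unsubscribed: "Unsubscribed",
  };
  return {
    Status: statusByEvent[evt.event] ?? evt.event,
    "Last event at": evt.date,
  };
}

// Example: inside an n8n Code node, the webhook body would arrive as $json.
const update = interactionUpdateFor({
  event: "opened",
  "message-id": "<202401020304.123456789@smtp-relay.example>",
  email: "prospect@example.com",
  date: "2024-01-02 03:05:00",
});
console.log(update); // { Status: "Opened", "Last event at": "2024-01-02 03:05:00" }
```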
by Davide
**Verify Leads Email Address with Anymail Finder**

This workflow automates the process of validating email addresses stored in a Google Sheets file using the Anymail Finder API and writing the results back to the sheet.

**Key Advantages**

- **Automated Lead Validation**: No need for manual copy-paste or bulk uploads; emails are verified directly inside your existing Google Sheet.
- **Improved Data Quality**: Ensures your CRM or outreach campaigns only target valid and deliverable email addresses, reducing bounce rates.
- **Real-Time Updates**: Results are automatically written to the spreadsheet, making it easy for your team to see which leads are safe to contact.

**How It Works**

The workflow operates in a loop to process a list of leads from a Google Sheet one by one.

1. **Manual Trigger**: The workflow is started manually by a user within n8n.
2. **Get Leads**: The "Get Leads" node reads data from a specified Google Sheet. It is configured with a filter to only fetch rows where the "VERIFY" column is empty, ensuring it only processes new leads that haven't been checked yet.
3. **Loop Over Items**: The "Split In Batches" node iterates over each row (lead) retrieved from the Google Sheet, sending each lead individually to the next node for processing.
4. **Check Email Status**: For each lead, the "HTTP Request" node sends a POST request to the Anymail Finder API (/v5.1/verify-email). The email address from the current sheet row ($json["EMAIL "]) is sent in the request body.
5. **Update Email Status**: The response from Anymail Finder (which contains the verification status) is received. The "Update email status" node writes this result back to the "VERIFY" column of the original Google Sheet, using row_number to identify the correct row so data is placed accurately.

**Set Up Steps**

To use this workflow, configure the following credentials and nodes:

1. **Google Sheets Credentials**: Create a credential named "Google Sheets account" of type OAuth2. Follow n8n's guide to authenticate with Google Sheets. This gives the workflow permission to read from and write to your spreadsheet.
2. **Anymail Finder Credentials**: Create a credential named "Anymail Finder" of type HTTP Header Auth. In the "Name" field, enter Authorization (or the header name required by Anymail Finder's API documentation). In the "Value" field, enter your Anymail Finder API key (in the format YOUR_API_KEY).
3. **Configure Google Sheet**: Ensure your Google Sheet has at least the following columns: COMPANY NAME, EMAIL (note the trailing space), and VERIFY. The "VERIFY" column must be empty for rows you want to verify; the "Get Leads" node only processes rows where this column is blank.
4. **Activate and Execute**: Once the credentials are set and the sheet ID is configured, activate the workflow. Click on the "Manual Trigger" node and execute it. It will begin processing all leads with an empty "VERIFY" field.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
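For reference, here is a minimal sketch of the call made by the "Check Email Status" node. The /v5.1/verify-email path comes from the workflow; the API host, the body field name, and the Authorization header format are assumptions to verify against Anymail Finder's documentation:

```typescript
// Sketch of the verification request, assuming a JSON body with an "email" field.
const ANYMAIL_API_KEY = "YOUR_API_KEY"; // placeholder

async function verifyEmail(email: string): Promise<unknown> {
  const res = await fetch("https://api.anymailfinder.com/v5.1/verify-email", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: ANYMAIL_API_KEY, // header format per Anymail Finder's docs
    },
    body: JSON.stringify({ email }),
  });
  if (!res.ok) throw new Error(`Anymail Finder returned ${res.status}`);
  // The response is expected to contain a verification status field;
  // inspect a real response to confirm its exact name before writing it back to "VERIFY".
  return res.json();
}

verifyEmail("lead@example.com").then(console.log).catch(console.error);
```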
by Davide
This workflow automates the management of Zoom OAuth tokens and the creation of new Zoom users through the Zoom API. Before creating a user it first ensures a valid OAuth access token is available, handling the fact that Zoom access tokens are short-lived (1 hour) by using a longer-lived refresh token (90 days) stored in an n8n Data Table. It includes two main phases:

1. **Token Generation & Management**: The workflow initially requests a Zoom access token using the OAuth 2.0 "authorization code" method. The resulting access token (valid for 1 hour) and refresh token (valid for 90 days) are stored in an n8n Data Table. When executed again, the workflow checks for the most recent token, refreshes it using the refresh token, and updates the Data Table automatically.
2. **User Creation in Zoom**: Once a valid token is retrieved, the workflow collects the user's first name, last name, and email (set in the "Data" node). It then generates a secure random password for the new user. Using the Zoom API, it sends a POST request to create the new user, automatically triggering an invitation email from Zoom.

**Key Features**

- **Full Automation of Zoom Authentication**: Eliminates manual token handling by automatically refreshing and updating OAuth credentials.
- **Centralized Token Storage**: Securely stores access and refresh tokens in an n8n Data Table, simplifying reuse across workflows.
- **Error Prevention**: Ensures that expired tokens are replaced before API requests, avoiding failed Zoom operations.
- **Automatic User Provisioning**: Creates Zoom users automatically with prefilled credentials and triggers Zoom's built-in invitation process.
- **Scalability**: Can be easily extended to handle bulk user creation, role assignments, or integration with other systems (e.g., HR, CRM).
- **Transparency & Modularity**: Each node is clearly labeled with Sticky Notes explaining every step, making maintenance and handover simple.

**How it works**

1. **Trigger and Data Retrieval**: The workflow starts manually. It first retrieves the user data (first name, last name, email) from the "Data" node. In parallel, it fetches all stored token records from a Data Table.
2. **Token Management**: The retrieved tokens are sorted and limited to get only the most recent one. This latest token (which contains the refresh_token) is then used in an HTTP Request to Zoom's OAuth endpoint to generate a fresh, valid access_token.
3. **User Creation**: The new access_token and refresh_token are saved back to the Data Table for future use. The workflow then generates a random password for the new user, merges this password with the initial user data, and finally sends a POST request to the Zoom API to create the new user. If the creation is successful, Zoom automatically sends an invitation email to the new user.

**Set up steps**

1. **Prepare the Data Table**: Create a new Data Table in your n8n project and add two columns to it: accessToken and refreshToken.
2. **Configure the Zoom OAuth App**: Create a standard OAuth app in the Zoom Marketplace (not a Server-to-Server app). Note your Zoom account_id. Encode your Zoom app's client_id and client_secret in Base64 format (as client_id:client_secret). In both the "Get new token" and "Zoom First Access Token" nodes, replace the "XXX" in the Authorization header with this Base64-encoded string.
3. **Generate Initial Tokens (First Run Only)**: Manually execute the "Zoom First Access Token" node once. This node uses an authorization code to fetch the first-ever access and refresh tokens and saves them to your Data Table. The main workflow will use these stored tokens from this point forward.
4. **Configure User Data**: In the "Data" node, set the default values for the new Zoom user by replacing the "XXX" placeholders for first_name, last_name, and email.

After these setup steps, the main workflow (triggered via "When clicking 'Execute workflow'") can be run whenever you need to create a new Zoom user. It will automatically refresh the token and use the provided user data to create the account.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
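For orientation, here is a rough TypeScript sketch of the two Zoom calls performed by the workflow's HTTP Request nodes: refreshing the access token with the stored refresh token, and creating the user. It follows Zoom's published OAuth and user-creation endpoints, but treat the exact fields (e.g., the user type) as assumptions to adjust for your Zoom app:

```typescript
// Placeholders: replace with your own app credentials.
const BASIC_AUTH = Buffer.from("CLIENT_ID:CLIENT_SECRET").toString("base64");

async function refreshAccessToken(refreshToken: string) {
  const res = await fetch(
    `https://zoom.us/oauth/token?grant_type=refresh_token&refresh_token=${encodeURIComponent(refreshToken)}`,
    { method: "POST", headers: { Authorization: `Basic ${BASIC_AUTH}` } },
  );
  if (!res.ok) throw new Error(`Token refresh failed: ${res.status}`);
  // Contains a new access_token and refresh_token to store back in the Data Table.
  return res.json() as Promise<{ access_token: string; refresh_token: string }>;
}

async function createZoomUser(accessToken: string, email: string, firstName: string, lastName: string) {
  const res = await fetch("https://api.zoom.us/v2/users", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      action: "create", // "create" triggers Zoom's invitation email to the new user
      user_info: { email, first_name: firstName, last_name: lastName, type: 1 }, // type 1 = Basic (assumption)
    }),
  });
  if (!res.ok) throw new Error(`User creation failed: ${res.status}`);
  return res.json();
}
```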
by Sk developer
An automated workflow that scrapes Shopify store information and product data using the Shopify Scraper API from RapidAPI. It is triggered by a user submitting a store URL, then logs the data into Google Sheets for easy access and analysis.

**Node-by-Node Explanation**

- **On form submission**: Triggers when a user submits a Shopify store website URL.
- **Store Info Scrape Request**: Sends a POST request to shopify-scraper4.p.rapidapi.com/shopinfo.php to fetch store metadata (name, location, domain, etc.).
- **Products Scrape Request**: Sends a POST request to shopify-scraper4.p.rapidapi.com/products.php to retrieve detailed product data (titles, prices, tags, etc.).
- **Append Store Info Google Sheets**: Appends store metadata into the "Shop Info" sheet in Google Sheets.
- **Append Products Data In Google Sheets**: Appends product data into the "Products" sheet in Google Sheets.

**Use Case**

Ideal for businesses or analysts who want to quickly gather Shopify store insights and product catalogs without manual data collection, enabling data-driven decision-making or competitive analysis.

**Benefits**

- Automates Shopify data extraction with the powerful Shopify Scraper API on RapidAPI.
- Saves time by collecting and organizing data automatically into Google Sheets.
- Easily scalable and adaptable for multiple Shopify stores.

**How to Get an API Key from RapidAPI (Shopify Scraper)**

Follow these steps to get your API key and start using it in your workflow:

1. **Visit the API page**: Open the Shopify Scraper API page on RapidAPI.
2. **Log in or sign up**: Use your Google, GitHub, or email account to sign in. If you're new, complete a quick sign-up.
3. **Subscribe to a pricing plan**: Go to the Pricing tab on the API page, select a plan (free or paid, depending on your needs), and click Subscribe.
4. **Access your API key**: Navigate to the Endpoints tab and look for X-RapidAPI-Key under Request Headers. Copy the value shown; this is your API key.
5. **Use the key in your workflow**: In your n8n workflow (HTTP Request node), replace "x-rapidapi-key": "your key" with "x-rapidapi-key": "YOUR_ACTUAL_API_KEY".
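As a reference, here is a minimal sketch of the request the "Store Info Scrape Request" node sends. The host and headers come from the workflow description; the request body field name and format are assumptions to check against the API's Endpoints tab on RapidAPI:

```typescript
const RAPIDAPI_KEY = "YOUR_ACTUAL_API_KEY"; // placeholder

async function fetchShopInfo(storeUrl: string): Promise<unknown> {
  const res = await fetch("https://shopify-scraper4.p.rapidapi.com/shopinfo.php", {
    method: "POST",
    headers: {
      "Content-Type": "application/json", // body format is an assumption; the API may expect form data
      "x-rapidapi-key": RAPIDAPI_KEY,
      "x-rapidapi-host": "shopify-scraper4.p.rapidapi.com",
    },
    body: JSON.stringify({ url: storeUrl }), // field name "url" is an assumption
  });
  if (!res.ok) throw new Error(`RapidAPI returned ${res.status}`);
  return res.json(); // store metadata: name, location, domain, etc.
}

fetchShopInfo("https://examplestore.myshopify.com").then(console.log).catch(console.error);
```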
by CentralStationCRM
**Workflow Overview**

This workflow benefits anyone who wants to automatically write new contacts (with an associated "Newsletter" tag) in CentralStationCRM to a Rapidmail list.

**Tools in this workflow**

- CentralStationCRM, the simple and intuitive CRM for small teams. Here is our API documentation if you want to modify the workflow.
- Slack, brings people and information together.
- Rapidmail, the really good newsletter software.

**Workflow Description**

This workflow consists of:

- a Schedule Trigger (set to 5 pm, daily)
- three HTTP Request nodes calling the CentralStationCRM API
- two If nodes as logic gates
- a Slack node
- a "Do nothing, end workflow" node

**The Schedule Trigger**: The Schedule Trigger fires at 5 pm every day. The thinking here: if you added new contacts to CentralStationCRM during your workday, the workflow collects them afterwards. You can of course set your own time.

**First HTTP Request: Get new people of the day**: This node talks to the CentralStationCRM API and gets every newly created person of the day, including the person's tags, addresses, associated companies, and emails.

**First If gate: Does the person have a "Newsletter" tag?**: If you tagged the person with "Newsletter" in CentralStationCRM after creating them, this If node detects it (a sketch of this check appears at the end of this description). If true, go to the last HTTP Request (Rapidmail list); if false, go to the Slack node.

**Slack node: ask if the person should get a "Newsletter" tag**: Pretty much what the title says. The Slack user sees a private message with the question "Should <person> be on your newsletter list?" and can click "yes" or "no".

**Second If gate: Did the user click the "Yes" button?**: If false, end the workflow (Do nothing node); if true, go to the second HTTP Request node.

**Second HTTP Request: give the person a "Newsletter" tag in CentralStationCRM**: This node again talks to the CentralStationCRM API and creates a new tag for the person, called "Newsletter". This is so you know later that this person is also receiving your newsletter.

**Last HTTP Request: write the person's name and email to the Rapidmail list**: This node uses the Rapidmail API and the Rapidmail list ID to add a person with the "Newsletter" tag to a pre-created Rapidmail recipient list for your newsletter.

**Customization ideas**

With this workflow, you could:

- change the interval the workflow triggers on, e.g. to once per week; you'd have to change the JSON code in the second node to fetch all new people per week instead of per day
- customize the Slack approval message to include more information about the person than the name, e.g. the company the person works for
- take the beginning (give the tag to a person in CentralStationCRM, trigger the workflow, check for the tag, ask for approval in Slack) and then do something else with the last node, e.g. write the person an email in Gmail, write their info to a Google Sheet, or run AI-assisted web research on the person. Go experiment a bit!

**Preconditions**

For this workflow, you need:

- a CentralStationCRM account with API access
- an n8n account with API access
- a Rapidmail account with API access
- a Rapidmail recipient list

Have fun with our workflow!
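As an illustration of the first If gate, here is a small TypeScript sketch of the tag check. The response shape from the CentralStationCRM API is an assumption (a `tags` array with `name` fields); consult the linked API documentation for the exact structure:

```typescript
// Assumed (simplified) shape of a person returned by the "Get new people of the day" request.
interface CrmPerson {
  id: number;
  first_name: string;
  name: string;
  tags?: { name: string }[];
}

// The If gate's condition: does this person carry the "Newsletter" tag?
function hasNewsletterTag(person: CrmPerson): boolean {
  return (person.tags ?? []).some(
    (tag) => tag.name.toLowerCase() === "newsletter",
  );
}

const example: CrmPerson = {
  id: 42,
  first_name: "Ada",
  name: "Lovelace",
  tags: [{ name: "Newsletter" }],
};
console.log(hasNewsletterTag(example)); // true -> send straight to the Rapidmail list
```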
by Robert Breen
This workflow transforms raw marketing data from Google Sheets into a pivot-like summary table. It merges lookup data, groups spend by name, and appends the results into a clean reporting tab, all automatically, without needing to manually build pivot tables in Sheets.

**Who's it for**

- Marketing analysts who track channel spend across campaigns
- Small businesses that rely on Google Sheets for reporting
- Teams that need automated daily rollups without rebuilding pivot tables manually

**How it works**

1. **Get Marketing Data (Google Sheets)** → Pulls raw spend data.
2. **Vlookup Data (Google Sheets)** → Brings in reference/lookup fields (e.g., channel labels).
3. **Merge Tables** → Joins marketing data and lookup data on the Channel column.
4. **Summarize** → Groups data by Name and sums up Spend ($).
5. **Clear Sheet** → Wipes the reporting tab to avoid duplicates.
6. **Append to Pivot Sheet** → Writes the aggregated results into the "render pivot" sheet.

The result: a pivot-style summary table inside Google Sheets, automatically refreshed by n8n.

**Setup Instructions**

1. **Connect Google Sheets (OAuth2)**
   - In n8n → Credentials → New → Google Sheets (OAuth2)
   - Sign in with your Google account and grant access
   - In each Google Sheets node, select your Spreadsheet and the appropriate Worksheet: data (raw spend), Lookup (channel reference table), render pivot (output tab)
2. **Configure the Summarize node**
   - Group by: Name
   - Summarize: Spend ($) → sum
3. **Test the workflow**
   - Execute the workflow manually
   - Check your "render pivot" tab; it should display aggregated spend by Name

**How to customize**

- Change grouping fields (e.g., by Channel, Campaign, or Region)
- Add more aggregations (e.g., average CPC, max impressions)
- Use the Merge node to join extra data sources before summarizing
- Schedule execution to run daily for fresh rollups

**Requirements**

- n8n (Cloud or self-hosted)
- Google Sheets account with structured data in the data and Lookup tabs

**Contact**

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your pivot)?

- rbreen@ynteractive.com
- Robert Breen
- ynteractive.com
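To make the aggregation concrete, here is a small TypeScript sketch of what the Summarize node effectively does with the merged rows. Column names follow the workflow's sheet; adapt them if yours differ:

```typescript
// Group rows by Name and sum Spend ($), like the Summarize node configuration above.
interface Row {
  Name: string;
  "Spend ($)": number;
}

function summarizeSpend(rows: Row[]): Row[] {
  const totals = new Map<string, number>();
  for (const row of rows) {
    totals.set(row.Name, (totals.get(row.Name) ?? 0) + row["Spend ($)"]);
  }
  return [...totals.entries()].map(([name, spend]) => ({ Name: name, "Spend ($)": spend }));
}

const raw: Row[] = [
  { Name: "Search", "Spend ($)": 120 },
  { Name: "Social", "Spend ($)": 80 },
  { Name: "Search", "Spend ($)": 60 },
];
console.log(summarizeSpend(raw));
// [ { Name: "Search", "Spend ($)": 180 }, { Name: "Social", "Spend ($)": 80 } ]
```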
by AI/ML API | D1m7asis
**Telegram Search Assistant – Tavily + AI/ML API**

This n8n workflow lets users ask questions in Telegram and receive concise, fact-based answers. It performs a web search with Tavily, then uses AIMLAPI (GPT-5) to summarize the results into a clear 3–4 sentence reply. The flow ensures grounded, non-hallucinated answers.

**Features**

- Telegram-based input
- Typing indicator for better UX
- Web search with Tavily (JSON results)
- Summarization with AIMLAPI (openai/gpt-5-chat-latest)
- Replies in the same chat/thread
- Guardrails against hallucinations

**Setup Guide**

1. **Create a Telegram bot**
   - Talk to @BotFather
   - Use /newbot, then choose a name and username
   - Save the bot token
2. **Set up credentials in n8n**
   - Telegram API: use your bot token
   - Tavily: add your Tavily API key
   - AI/ML API: add your API key; Base URL: https://api.aimlapi.com/v1
3. **Configure the workflow**
   - Open the n8n editor and import the JSON
   - Update credentials for Telegram, Tavily, and AIMLAPI

**Flow Summary**

| Node | Function |
|------|----------|
| Receive Telegram Msg | Triggered when the user sends text |
| Typing Indicator | Shows "typing…" to the user |
| Web Search | Queries Tavily with the user's message |
| LLM Summarize | Summarizes the search JSON into a factual answer |
| Reply to Telegram | Sends the concise answer back to the same thread |

**Data Handling**

- By default: no data is stored
- Optional: log queries & answers to Google Sheets or a database

**Example Prompt Flow**

User sends: When is the next solar eclipse in Europe?

Bot replies: The next solar eclipse in Europe will occur on August 12, 2026. It will be visible as a total eclipse across Spain, with partial views in much of Europe. The maximum eclipse will occur around 17:46 UTC.

**Customization**

- Add commands: /help, /sources, /news
- Apply rate limits per user
- Extend logging to Google Sheets / DB
- Add NSFW / profanity filters before search

**Testing**

- Test end-to-end in Telegram (not just "Execute Node")
- Add a fallback reply if Tavily returns empty results
- Use sticky notes for debugging & best practices

**Resources**

- AI/ML API Docs
- Tavily Search API
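For illustration, here is a condensed sketch of the two API calls the flow chains together. The Tavily request shape and the response handling are assumptions to verify against the Tavily docs; AIMLAPI is addressed through its OpenAI-compatible chat-completions endpoint with the model named in this template:

```typescript
const TAVILY_API_KEY = "YOUR_TAVILY_KEY"; // placeholder
const AIML_API_KEY = "YOUR_AIML_KEY";     // placeholder

async function answerQuestion(question: string): Promise<string> {
  // 1) Web search with Tavily (request body shape is an assumption)
  const searchRes = await fetch("https://api.tavily.com/search", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ api_key: TAVILY_API_KEY, query: question, max_results: 5 }),
  });
  const search = await searchRes.json();

  // 2) Summarize the raw search JSON into a short, grounded answer
  const llmRes = await fetch("https://api.aimlapi.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${AIML_API_KEY}`,
    },
    body: JSON.stringify({
      model: "openai/gpt-5-chat-latest",
      messages: [
        { role: "system", content: "Answer in 3-4 factual sentences using only the provided search results." },
        { role: "user", content: `Question: ${question}\nSearch results: ${JSON.stringify(search)}` },
      ],
    }),
  });
  const completion = await llmRes.json();
  return completion.choices?.[0]?.message?.content ?? "No answer found.";
}

answerQuestion("When is the next solar eclipse in Europe?").then(console.log).catch(console.error);
```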
by Akash Kankariya
Easily ensure your n8n workflows are never lost! This template automates the process of backing up all your n8n workflows to a GitHub repository every 6 hours. Set it up once and enjoy worry-free workflow versioning and disaster recovery.

**What This Workflow Does**

- **Schedules backups**: Triggers the workflow automatically every 6 hours, with no manual steps needed.
- **Exports all current workflows**: Collects a JSON snapshot of every workflow in your n8n instance.
- **Pushes backups to GitHub**: Commits each backup file to your specified GitHub repository with a time-stamped commit message for easy tracking.
- **Smart file handling**: Checks if a backup file already exists and creates or updates it as needed, keeping your repository clean and organized.

**Why Use This Template?**

- Automate your workflow backups and never miss a backup again.
- Seamless integration with GitHub for team collaboration, change management, and rollback.
- Simple, reliable, and fully customizable to match your backup intervals and repository setup.
- Peace of mind that your critical automation assets are always protected.

**How the Template Works: Step-by-Step Overview**

1. **Scheduled Trigger**: Fires every 6 hours to launch the backup sequence.
2. **Get All Workflows**: Uses the HTTP Request node to fetch all n8n workflows from your instance as JSON data.
3. **Move Binary Data**: Converts the JSON into a binary format, ready for GitHub storage.
4. **Edit/Create Backup File**: Attempts to edit (update) an existing backup file in your GitHub repo. If the file does not exist, the workflow creates a new one.
5. **Conditional Logic**: Checks after each run whether the backup file exists and ensures previous versions can be recovered or merged as needed.
6. **Repeat**: The process auto-loops every 6 hours, with no further intervention required.

**How To Set Up On Your Server**

1. Import the template into your n8n instance.
2. Configure your GitHub credentials in the workflow nodes.
3. Update the GitHub repository details (owner, repository, and filePath) to use your own repo and desired file path.
4. Set your n8n API key and update the API endpoint URL to match your deployment.
5. Save and activate the workflow; your backups are now on autopilot.

**Example Use Cases**

- Version control for rapidly changing automation environments.
- Safeguarding business-critical automation assets.
- Easy rollback in case of workflow corruption or accidental deletion.
- Team collaboration through GitHub's pull request and review process.

**Pro Tips**

- Adjust the backup interval in the Schedule Trigger node if you require more or less frequent backups.
- Use GitHub branch protection rules for enhanced workflow security.
- Pair this backup workflow with notifications (e.g., Slack or email) for backup alerts.

Protect your n8n workflows with automated, reliable, and versioned GitHub backups. Set it and forget it!
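For readers who want to see what the HTTP Request nodes are doing, here is a hedged TypeScript sketch of the export-and-commit sequence using the n8n public API and GitHub's contents API. All URLs, the repo, and the file path are placeholders to replace with your own values:

```typescript
const N8N_URL = "https://your-n8n-instance.example.com"; // placeholder
const N8N_API_KEY = "YOUR_N8N_API_KEY";                  // placeholder
const GITHUB_TOKEN = "YOUR_GITHUB_TOKEN";                // placeholder
const REPO = "your-user/n8n-backups";                    // placeholder
const FILE_PATH = "backups/workflows.json";              // placeholder

async function backupWorkflows(): Promise<void> {
  // 1) Export all workflows as JSON via the n8n public API
  const wfRes = await fetch(`${N8N_URL}/api/v1/workflows`, {
    headers: { "X-N8N-API-KEY": N8N_API_KEY },
  });
  const workflows = await wfRes.json();
  const content = Buffer.from(JSON.stringify(workflows, null, 2)).toString("base64");

  // 2) Check whether the backup file already exists (its sha is needed for updates)
  const fileUrl = `https://api.github.com/repos/${REPO}/contents/${FILE_PATH}`;
  const headers = { Authorization: `Bearer ${GITHUB_TOKEN}`, Accept: "application/vnd.github+json" };
  const existing = await fetch(fileUrl, { headers });
  const sha = existing.ok ? (await existing.json()).sha : undefined;

  // 3) Create or update the file with a time-stamped commit message
  await fetch(fileUrl, {
    method: "PUT",
    headers: { ...headers, "Content-Type": "application/json" },
    body: JSON.stringify({
      message: `n8n backup ${new Date().toISOString()}`,
      content,
      ...(sha ? { sha } : {}),
    }),
  });
}

backupWorkflows().catch(console.error);
```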
by Angel Menendez
This workflow enables seamless, privacy-first capture of meeting notes from your iPhone. It pairs with an iOS Shortcut that leverages Apple's on-device transcription from the Voice Memos app and optionally passes the output to ChatGPT or a local AI model for summarization.

**Who it's for**

- Anyone who wants fast, secure note capture on iOS
- Professionals (e.g., lawyers, therapists) who require on-device processing for privacy
- Obsidian users who want to sync mobile notes via Google Drive

**What it does**

1. You record a voice memo in the iOS Voice Memos app.
2. The Shortcut transcribes it locally (no API or cloud involved).
3. Optionally, a summarization step is done via ChatGPT or a replaceable local model.
4. The data is sent to an n8n webhook, where it's converted into a .md file.
5. The Markdown file is uploaded to a Google Drive folder synced with your Obsidian vault.

**Key Benefits**

- Keeps your meeting notes private; no cloud APIs required
- Easily searchable in Obsidian as structured Markdown files
- Fully local if you swap out ChatGPT for a local model (can be less stable)

**Limitations**

- Transcriptions longer than ~1 hour may fail or produce unstable results.
- Some setup is required to replace ChatGPT with a local model in the Shortcut.

**Setup**

1. Install and configure the iOS Shortcut.
2. Replace the ChatGPT step in the Shortcut if you need full local-only processing.
3. Point the webhook in the Shortcut to your n8n instance.
4. Make sure your Obsidian vault is synced with the Google Drive folder used in the workflow.
5. Update any of the prompts in the iOS Shortcut to personalize them.

> Pro Tip: Use a Set node early on to clearly define the filename, title, and text so it's easier to adapt this workflow for other note types (e.g., journals, therapy sessions).
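As a sketch of what happens on the n8n side once the webhook fires, here is how the payload could be turned into an Obsidian-friendly Markdown file before the Google Drive upload. The payload field names (title, transcript, summary) are assumptions; match them to whatever your Shortcut actually sends:

```typescript
interface NotePayload {
  title: string;
  transcript: string;
  summary?: string;
}

// Build a filename and Markdown body (with frontmatter) from the webhook payload.
function buildMarkdownNote(payload: NotePayload): { filename: string; content: string } {
  const date = new Date().toISOString().slice(0, 10);
  const safeTitle = payload.title.replace(/[^\w\- ]/g, "").trim() || "meeting-note";
  const content = [
    "---",
    `title: ${payload.title}`,
    `date: ${date}`,
    "tags: [meeting]",
    "---",
    "",
    payload.summary ? `## Summary\n\n${payload.summary}\n` : "",
    "## Transcript",
    "",
    payload.transcript,
  ].join("\n");
  return { filename: `${date} ${safeTitle}.md`, content };
}

const note = buildMarkdownNote({
  title: "Weekly sync",
  transcript: "Discussed roadmap priorities...",
  summary: "Agreed to ship the beta next week.",
});
console.log(note.filename); // e.g. "2024-05-01 Weekly sync.md" (depends on today's date)
```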
by Jessica
**Auto Add AI Captions to Videos from Google Drive with ZapCap**

**Description**

Stop wasting hours on video captioning. Upload your videos to a Google Drive folder, and ZapCap automatically generates professional subtitles for you. Download the finished video from your Google Drive and it's ready to post. Fast, simple, and effortless.

**How It Works**

1. **Google Drive Trigger**: Watches your folder for new uploads.
2. **Send to ZapCap**: Instantly creates accurate subtitles.
3. **Wait & Check Status**: Automatically tracks progress.
4. **Download Captioned Video**: Gets your finished, captioned video.
5. **Upload Back to Drive**: Saves it where you need it, ready to share.

**Why You'll Love It**

- Save time: captions are added automatically.
- Speed up content creation: get post-ready videos in minutes.
- Professional results: subtitles are accurate and consistent.
- Fully cloud-based: no local software, no manual work.

**Requirements**

- ZapCap account & API key (get your free API key from ZapCap)
- Google Drive account (with OAuth credentials)
- n8n (Cloud or self-hosted)

**Support**

Need help? Join our ZapCap Discord or email us at hi@zapcap.ai for assistance.
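To show the shape of the "Wait & Check Status" step, here is a generic polling sketch of how a task's status could be checked until the captioned video is ready. The ZapCap endpoint, header name, and response fields below are hypothetical placeholders; only the submit-then-poll structure is the point, so consult the ZapCap API docs for the real values:

```typescript
const ZAPCAP_API_KEY = "YOUR_ZAPCAP_KEY"; // placeholder

// Poll a task status URL until ZapCap reports the captioned video is ready.
async function waitForCaptions(taskStatusUrl: string): Promise<string> {
  for (let attempt = 0; attempt < 60; attempt++) {
    const res = await fetch(taskStatusUrl, {
      headers: { "x-api-key": ZAPCAP_API_KEY }, // header name is an assumption
    });
    const task = await res.json();
    if (task.status === "completed") return task.downloadUrl; // field names are assumptions
    if (task.status === "failed") throw new Error("Captioning failed");
    await new Promise((resolve) => setTimeout(resolve, 10_000)); // wait 10s, like the Wait node
  }
  throw new Error("Timed out waiting for captions");
}
```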
by David Olusola
**Gmail Attachment Extractor to Google Drive**

**Description**: This workflow monitors your Gmail inbox for new emails with attachments and automatically saves those attachments to a designated folder in your Google Drive.

**Use Case**: Automatically archive invoices, client documents, reports, or photos sent via email to structured cloud storage.

**How It Works**

This workflow operates in three main steps:

1. **Gmail New Email Trigger**: The workflow starts with a Gmail Trigger node, set to monitor for new emails in your specified Gmail inbox (e.g., your primary inbox). It checks for emails that contain attachments.
2. **Conditional Check (optional but recommended)**: An If node checks whether the email actually has attachments. This prevents errors if an email without an attachment somehow triggers the workflow.
3. **Upload to Google Drive**: A Google Drive node receives the email data and its attachments. It is configured to upload these attachments to a specific folder in your Google Drive, naming them dynamically based on their original filenames.

**Setup Steps**

To get this workflow up and running, follow these instructions:

**Step 1: Create Gmail and Google Drive credentials in n8n**

1. In your n8n instance, click on Credentials in the left sidebar.
2. Click New Credential, search for and select "Gmail OAuth2 API", follow the authentication steps with your Google account, and save it.
3. Click New Credential again, search for and select "Google Drive OAuth2 API", follow the authentication steps with your Google account, and save it.
4. Make note of the credential names (e.g., "My Gmail Account", "My Google Drive Account").

**Step 2: Create a destination folder in Google Drive**

1. Go to your Google Drive (drive.google.com).
2. Create a new folder where you want to save the email attachments (e.g., Email Attachments Archive).
3. Copy the Folder ID from the URL (e.g., https://drive.google.com/drive/folders/YOUR_FOLDER_ID_HERE).
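If you want to see the logic of steps 2 and 3 in code form, here is a hedged sketch as it might appear in an n8n Code node: it skips emails without attachments and builds Drive filenames from the original attachment names. The binary property layout is an assumption; Gmail trigger items typically expose attachments as binary properties, so adjust to your actual data:

```typescript
interface GmailItem {
  json: { subject: string; date: string };
  binary?: Record<string, { fileName?: string }>;
}

// Return the Drive filenames to use; an empty array means "no attachments, skip this email".
function attachmentsWithNames(item: GmailItem): string[] {
  const binaries = item.binary ?? {};
  return Object.values(binaries).map((file, index) => {
    const original = file.fileName ?? `attachment-${index}`;
    // Prefix with the email date so files sort chronologically in Drive.
    return `${item.json.date.slice(0, 10)}_${original}`;
  });
}

const names = attachmentsWithNames({
  json: { subject: "Invoice March", date: "2024-03-31T10:00:00Z" },
  binary: { attachment_0: { fileName: "invoice-march.pdf" } },
});
console.log(names); // ["2024-03-31_invoice-march.pdf"]
```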
by Grant Warfield
This workflow auto-generates and posts a tweet once per day using real-time insights from the web. It uses Perplexity to fetch trending topics, OpenAI to summarize them into a tweet, and the Twitter API to publish.

**Set up steps**

1. Set your Perplexity API key in the HTTP Request node.
2. Add your OpenAI API key to the Message Model node.
3. Authenticate your Twitter API credentials in the second HTTP Request node.
4. Modify the Schedule Trigger to run daily at your preferred time.

All logic is pre-configured; simply plug in your credentials and you're live.
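For context, here is a rough TypeScript sketch of the three calls the workflow chains together. Model names and the X/Twitter OAuth details are assumptions (posting requires a user-context OAuth token, not an app-only bearer token):

```typescript
const PPLX_KEY = "YOUR_PERPLEXITY_KEY";               // placeholder
const OPENAI_KEY = "YOUR_OPENAI_KEY";                 // placeholder
const TWITTER_USER_TOKEN = "YOUR_OAUTH2_USER_TOKEN";  // placeholder

// Helper for the two OpenAI-compatible chat endpoints (Perplexity and OpenAI).
async function chat(url: string, key: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(url, {
    method: "POST",
    headers: { Authorization: `Bearer ${key}`, "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  const data = await res.json();
  return data.choices?.[0]?.message?.content ?? "";
}

async function postDailyTweet(): Promise<void> {
  // 1) Fetch trending topics with Perplexity (web-grounded; model name is an assumption)
  const trends = await chat("https://api.perplexity.ai/chat/completions", PPLX_KEY, "sonar",
    "List three topics trending today in tech, with one sentence each.");

  // 2) Summarize into a single tweet with OpenAI
  const tweet = await chat("https://api.openai.com/v1/chat/completions", OPENAI_KEY, "gpt-4o-mini",
    `Write one engaging tweet (max 280 characters) based on: ${trends}`);

  // 3) Publish via the Twitter/X API
  await fetch("https://api.twitter.com/2/tweets", {
    method: "POST",
    headers: { Authorization: `Bearer ${TWITTER_USER_TOKEN}`, "Content-Type": "application/json" },
    body: JSON.stringify({ text: tweet.slice(0, 280) }),
  });
}

postDailyTweet().catch(console.error);
```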