by Nazmy
Bearer Token Validation

This n8n template helps you manage and validate tokens easily using:

- n8n as your backend workflow engine
- Airtable as your lightweight token store

🚀 What It Does

- Stores user tokens securely in Airtable with expiry or usage metadata.
- Validates incoming tokens in your workflows (e.g., webhook APIs).
- Rejects invalid or expired tokens automatically for security.
- Can be extended to generate, rotate, or revoke tokens for user management.

How It Works

1. A Webhook node receives requests with a Bearer token in the Authorization header.
2. An Airtable query looks up the provided token.
3. Validation logic (Code node, see the sketch below):
   - Checks whether the token exists.
   - Verifies expiry or usage limits if configured.
   - Returns success if the token is valid, or an error describing the issue.

Note: This is deliberately the simplest way to do auth, kept minimal for clarity.

Why Use This

- No need for a full backend to manage secure token validation.
- Clean, modular, and ready for your SaaS workflows.

Enjoy building secure automations with n8n + Airtable! 🚀

Built by: Nazmy
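To make the validation step concrete, here is a minimal sketch of what the Code node could contain. The field names (token, expires_at, uses_left) are assumptions for illustration; rename them to match your Airtable columns.

```javascript
// Minimal n8n Code node sketch for token validation (assumed field names:
// token, expires_at, uses_left; rename to match your Airtable columns).
const record = $input.first().json;

// No matching Airtable record means the token is unknown.
if (!record || !record.token) {
  return [{ json: { valid: false, error: 'Token not found' } }];
}

// Reject tokens past their expiry date, if an expiry is configured.
if (record.expires_at && new Date(record.expires_at) < new Date()) {
  return [{ json: { valid: false, error: 'Token expired' } }];
}

// Reject tokens that have exhausted their usage limit, if one is configured.
if (record.uses_left !== undefined && record.uses_left <= 0) {
  return [{ json: { valid: false, error: 'Usage limit reached' } }];
}

return [{ json: { valid: true } }];
```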
by Meak
Google Maps Email Scraper System

Most lead generation tools charge $2–$5 per lead and lock you into expensive subscriptions. This workflow lets you scrape unlimited business emails from Google Maps for free — no paid APIs required.

Benefits

- Zero API costs – scrape data directly from Google Maps
- Unlimited leads – extract thousands of emails per day
- Geographic targeting – search by city, region, or business type
- Complete automation – from search to clean email list
- Built-in data cleaning – removes duplicates & invalid entries

How It Works

1. Reads search queries from a Google Sheet (e.g., "Calgary dentist")
2. Sends HTTP requests to Google Maps and scrapes business listings
3. Extracts website URLs with custom JavaScript regex
4. Visits each site, scrapes HTML, and finds email addresses (see the sketch below)
5. Cleans and validates data
6. Exports the organized lead list back to Google Sheets

Who Is This For

- B2B sales teams generating leads for outreach
- Marketing agencies building client lead databases
- Local businesses researching competitors & partners
- Real estate professionals analyzing target neighborhoods
- Franchise developers scouting new markets

Setup

1. Create a Google Sheet with two tabs: "searches" & "emails"
2. Add search queries to the "searches" tab (one per row)
3. Connect Google Sheets OAuth credentials in n8n
4. Configure HTTP Request nodes with SSL ignore enabled
5. Add custom JavaScript regex code for URL and email extraction

ROI & Monetization

- $0 per lead vs. $2–$5 from paid tools
- Generate 1,000+ leads per day without hitting API limits
- Sell lead lists or offer the service at $500–$2,000 per niche/location
- Perfect upsell for agencies offering outreach or local SEO

Strategy Insights

In the YouTube walkthrough, I show how to:

- Write custom JavaScript + regex for clean URL extraction
- Build a robust loop system with error handling & rate limiting
- Avoid IP blocking with batching & delays
- Sell lead generation as a high-margin recurring service
- Automate outreach to monetize the leads you scrape

Check Out My Channel

For more advanced AI automation systems that generate real business results, check out my YouTube channel, where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
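The exact regexes used in the template are shown in the video walkthrough; as a rough illustration, the email-extraction step in a Code node could look like the sketch below. The property names (data, url) are assumptions; adjust them to your HTTP Request node's output.

```javascript
// Minimal sketch of the email-extraction step in an n8n Code node.
// Assumes the previous HTTP Request node returned the page HTML in item.json.data;
// adjust the property name to match your node output.
const emailRegex = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;

const results = [];
for (const item of $input.all()) {
  const html = item.json.data || '';
  const emails = html.match(emailRegex) || [];
  // Deduplicate and drop obvious false positives such as image filenames.
  const cleaned = [...new Set(emails)].filter(
    (e) => !/\.(png|jpg|jpeg|gif|svg|webp)$/i.test(e)
  );
  for (const email of cleaned) {
    results.push({ json: { email, source: item.json.url } });
  }
}
return results;
```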
by Evoort Solutions
AI-Powered Image Background Removal Workflow with Google Sheets Integration

Flow Description

This workflow combines AI-powered image background removal with Google Sheets to create a fully automated, streamlined process for handling and managing image files.

The flow is triggered when a user uploads an image through a form. The image is sent to the Background Remover AI API, where the background is removed automatically. Upon successful processing, the new image is uploaded to a temporary file storage service via the Temp File Upload API. The relevant data, including the image link and status, is then logged in a Google Sheets document for easy access and tracking (see the sketch below for how such a row can be assembled).

If the process fails, the system automatically logs a failure status in the same Google Sheet, along with the reason (if available). This gives users a transparent, organized, real-time view of both successful and failed background removal attempts.

Used APIs

- **Background Remover AI**: An AI-powered service that removes backgrounds from images, offering a fast, accurate, and scalable solution.
- **Temp File Upload**: Uploads processed images to a temporary file storage service, making it easy to access and manage files before permanent storage.

Use Case

This workflow is highly beneficial for businesses and developers who need to process many images automatically. It removes tedious manual background editing, making it an efficient tool for industries such as:

- **E-commerce**: Automatically remove backgrounds from product images for clean, professional-looking listings on platforms such as Amazon, eBay, or Shopify.
- **Content Creation**: Quickly remove backgrounds from images for blogs, social media posts, and marketing campaigns, saving significant time in photo editing.
- **Real Estate**: Enhance property images by removing unwanted backgrounds, making listings look more polished and appealing.
- **Advertising & Marketing**: Simplify image preparation for digital ads, banners, and promotional content for a more professional look.

Benefits

- **Time-Saving**: Automating background removal via the Background Remover AI API eliminates manual image editing, saving time and resources.
- **AI-Powered Accuracy**: The AI-powered service delivers precise, high-quality results consistently.
- **Seamless Integration with Google Sheets**: All successful and failed processing attempts are automatically logged into a Google Sheets document, giving you a transparent, real-time record of each operation.
- **Error Tracking**: On failure, detailed error logs are written to Google Sheets for easy tracking and troubleshooting.
- **Efficient Cloud Storage**: The Temp File Upload API stores processed images securely in the cloud as a temporary solution before permanent storage.

Google Sheets Table Example

The workflow automatically appends a row to a Google Sheets document for each processed image:

| Image Name | Link      | Status  | Expire At            |
|------------|-----------|---------|----------------------|
| image1.jpg | Link      | Success | 2025-07-25T12:00:00Z |
| image2.jpg | Link      | Success | 2025-07-25T12:00:00Z |
| image3.jpg | Not found | Failed  | 2025-07-24T12:00:00Z |
| image4.jpg | Link      | Success | 2025-07-25T12:00:00Z |

Columns Explained

- **Image Name**: The name of the image file uploaded by the user.
- **Link**: A direct link to the processed image stored in temporary file storage.
- **Status**: Indicates whether the background removal was Successful or Failed.
- **Expire At**: The date and time when the temporary file link stops being accessible.

This table provides real-time tracking of each image-processing event, offering full visibility of the workflow results. It is ideal for businesses or developers who need to keep a record of their image-processing operations.

Additional Features

- **Automatic Error Logging**: If background removal fails for any reason, a failure entry is recorded in Google Sheets with a timestamp and an error message.
- **Custom Expiry Time**: The system automatically sets an expiry time for the processed image, allowing temporary access before it is removed from storage.
- **Scalable Process**: The workflow can handle multiple form submissions and process images in bulk, making it scalable for various use cases.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and automate your image processing effortlessly!
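As a rough illustration of how a result row for the table above could be assembled in a Code node, here is a minimal sketch. The input field names (fileName, url, success) and the 24-hour expiry are assumptions; the actual workflow may derive these values differently.

```javascript
// Minimal n8n Code node sketch that builds one Google Sheets row per image.
// Assumed upstream fields: fileName from the form node, url + success from the
// background-removal / upload responses. Rename to match your own nodes.
const item = $input.first().json;

// Temporary links expire; this sketch assumes a 24-hour lifetime.
const expireAt = new Date(Date.now() + 24 * 60 * 60 * 1000).toISOString();

return [{
  json: {
    'Image Name': item.fileName || 'unknown',
    'Link': item.success ? item.url : 'Not found',
    'Status': item.success ? 'Success' : 'Failed',
    'Expire At': expireAt,
  },
}];
```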
by Tom
This simple workflow demonstrates how to get an end user's browser to download a file. It makes use of the Content-Disposition header to set a filename and control the browser behaviour. A use case could be downloading a PDF file at the end of an application process, or exporting data from a database without replacing the current page content in the browser. With this approach, the current page remains open and the file is simply downloaded instead. The original idea was first presented here by @dickhoning in the n8n community.
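For reference, these are the response headers the download endpoint needs to send; in n8n you would set them in the Respond to Webhook node's response headers option. The filename and content type below are only examples.

```javascript
// Sketch of the response headers a file-download endpoint sends.
// In n8n, configure these in the Respond to Webhook node; the values here
// (filename, content type) are examples, not the template's exact settings.
const responseHeaders = {
  // Tells the browser to save the body as a file instead of rendering it.
  'Content-Disposition': 'attachment; filename="export.pdf"',
  // Should match the actual binary data being returned.
  'Content-Type': 'application/pdf',
};

console.log(responseHeaders);
```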
by Rosh Ragel
Automatically Send Weekly Sales Reports from Square via Outlook

What It Does

This workflow automatically connects to the Square API and generates a weekly sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run weekly, pull the previous week's sales into a CSV file, and send that file to a manager or finance team for analysis.

This workflow builds on my previous template, which lets users pull data from the Square API into n8n for processing (see here: https://n8n.io/workflows/6358).

Prerequisites

To use this workflow, you'll need:

- A Square API credential (configured as a Header Auth credential)
- A Microsoft Outlook credential

How to Set Up Square Credentials

1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

How It Works

1. Trigger: The workflow runs every Monday at 4:00 AM
2. Fetch Locations: An HTTP request retrieves all Square locations linked to your account
3. Fetch Orders: For each location, an HTTP request pulls completed orders for the previous week (Monday to Sunday)
4. Filter Empty Locations: Locations with no sales are ignored
5. Aggregate Sales Data: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report (see the sketch below)
6. Create CSV File: A CSV file is created containing the relevant data
7. Send Email: An email is sent via Microsoft Outlook to the chosen third party

Example Use Cases

- Automatically send weekly Square sales data to management to improve the quality of planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent, who is paid on commission
- Automatically send data to a bookkeeper for entry into QuickBooks

How to Use

1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

Customization Options

- Add pagination to handle locations with more than 1,000 orders per week

Why It's Useful

This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data — whether for operations, finance, or performance monitoring.
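As a simplified illustration of the aggregation step (not the template's exact Code node), the sketch below sums order totals assuming the Square Orders API's total_money / total_tax_money fields, which report amounts in the smallest currency unit.

```javascript
// Simplified sketch of the aggregation step. The real Sales Summary has more
// line items (discounts, refunds, net sales, etc.); this only shows the idea.
let grossSales = 0;
let totalTax = 0;
let orderCount = 0;

for (const item of $input.all()) {
  const order = item.json;
  grossSales += order.total_money?.amount ?? 0;   // amount in cents
  totalTax += order.total_tax_money?.amount ?? 0; // amount in cents
  orderCount += 1;
}

return [{
  json: {
    orderCount,
    // Convert cents to a currency amount for the CSV.
    grossSales: (grossSales / 100).toFixed(2),
    totalTax: (totalTax / 100).toFixed(2),
  },
}];
```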
by Rahi
Workflow 1: Domain and Email Health 🩺

This part of the workflow is triggered every 5 hours by the Schedule Trigger1 node. Its purpose is to pull health metrics for both email domains and individual email addresses.

How it works:

1. Schedule Trigger: The Schedule Trigger1 node initiates the workflow every 5 hours.
2. API Requests: Two separate HTTP Request nodes, HTTP Request5 and HTTP Request6, make API calls to Smartlead.
   - HTTP Request5 calls the endpoint for domain-wise health metrics.
   - HTTP Request6 calls the endpoint for email-wise health metrics.
   - Both requests use the same api_key and a date range from 2025-07-04 to the current day (see the sketch below for building such a range).
3. Data Splitting: The Split Out5 and Split Out6 nodes take the JSON response from the API calls and split the data into individual items, so each row of data can be processed and added to Google Sheets separately.
4. Google Sheets Integration: The Append or update row in sheet5 and Append or update row in sheet6 nodes update two different Google Sheets:
   - Append or update row in sheet5 adds or updates rows in the DomainHealth sheet, matching on the domain column.
   - Append or update row in sheet6 adds or updates rows in the EmailHealth sheet, matching on the from_email column.

Workflow 2: Global and Campaign-Specific Analytics 📊

This second part of the workflow is triggered every 2 hours by the Schedule Trigger node. Its goal is to get a day-by-day overview of email engagement and campaign-specific performance.

How it works:

1. Schedule Trigger: The Schedule Trigger node starts this workflow every 2 hours.
2. API Requests: Two HTTP Request nodes, HTTP Request and HTTP Request1, call different Smartlead API endpoints.
   - HTTP Request retrieves day-wise overall stats for email engagement.
   - HTTP Request1 retrieves overall stats for each campaign.
3. Data Splitting: The Split Out and Split Out1 nodes separate the JSON responses into individual data items for processing.
4. Google Sheets Integration: The Append or update row in sheet and Append or update row in sheet1 nodes then write the data to Google Sheets.
   - Append or update row in sheet updates the Sheet1 sheet with day-wise metrics, using the date as the matching column.
   - Append or update row in sheet1 updates the CampaignWise sheet with campaign performance metrics, using the campaign id to match rows.
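As a small illustration of how the date range could be prepared for those requests, see the sketch below; the parameter names are placeholders, so check the Smartlead API documentation for the exact ones.

```javascript
// Minimal sketch of preparing the date-range values for the Smartlead
// HTTP Request nodes. Parameter names (start_date, end_date) are illustrative;
// consult the Smartlead API docs for the actual names.
const startDate = '2025-07-04';
// Current day in YYYY-MM-DD, matching the template's "to the current day" range.
const endDate = new Date().toISOString().slice(0, 10);

return [{
  json: {
    start_date: startDate,
    end_date: endDate,
  },
}];
```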
by Nima Salimi
Overview

Automate your daily contact imports from NocoDB to Brevo. The workflow updates the record status in NocoDB at each step.

For every email campaign, it's essential to keep your Brevo contact list updated so you can send personalized and targeted emails. This flow automates that process.

✅ Tasks

- ⏰ Runs automatically every day
- 🗂 Fetches only new/unimported records from NocoDB
- 🔍 Checks for missing required fields
- 🚫 Filters out disposable/temporary emails (see the sketch below)
- 📬 Creates contacts in Brevo
- 📝 Updates NocoDB status after each step

🛠 How to Use

1️⃣ Set your schedule
The Schedule Trigger node runs the flow daily; adjust it to your preferred time.

2️⃣ Prepare your table in NocoDB
Your NocoDB table should contain at least: id, first_name, last_name, email, status (default: 0-not-imported)

3️⃣ Configure your credentials
Connect your NocoDB API Token in the NocoDB nodes. Connect your Brevo API Key in the Brevo node.

4️⃣ Map your fields
In the Brevo: Create Contact node, make sure first name, last name, and email match your NocoDB column names.

📌 Notes

- 🛡 Make sure your NocoDB project/table IDs match the ones in this template.
- 🚀 This workflow processes contacts one by one to avoid heavy API calls and rate-limit issues with Brevo.
- ✅ status values:
  - 0-not-imported → new record
  - 1-empty-fields → missing required fields
  - 2-disposal-email → disposable email detected
  - 3-contact-created → successfully created in Brevo
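As a rough sketch of the field and disposable-email checks (the template's actual domain list or detection method may differ), a Code node could look like this:

```javascript
// Rough sketch of the required-field and disposable-email checks.
// The domain list is a tiny illustrative sample; the template's real
// list or detection service may be larger or different.
const DISPOSABLE_DOMAINS = new Set([
  'mailinator.com',
  'tempmail.com',
  '10minutemail.com',
  'guerrillamail.com',
]);

const results = [];
for (const item of $input.all()) {
  const email = (item.json.email || '').toLowerCase().trim();
  const domain = email.split('@')[1] || '';

  if (!email || !item.json.first_name) {
    // Missing required fields → status 1-empty-fields
    results.push({ json: { ...item.json, status: '1-empty-fields' } });
  } else if (DISPOSABLE_DOMAINS.has(domain)) {
    // Disposable email → status 2-disposal-email
    results.push({ json: { ...item.json, status: '2-disposal-email' } });
  } else {
    // Passes checks → continue to the Brevo node
    // (status becomes 3-contact-created after successful creation).
    results.push({ json: item.json });
  }
}
return results;
```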
by Abdullah
What it does

Automatically respond to Google Form entries submitted via Google Sheets. This workflow notifies your Slack team, sends a personalized Gmail response to the user, and adds the user to Google Contacts — all triggered instantly when a new row is added to your connected Sheet.

Who's it for

Perfect for lead capture forms, client inquiries, or feedback submissions.

How it works

1. Trigger: When a new row is added to a connected Google Sheet (usually linked to a Google Form).
2. Slack Notification: Sends a Slack message to your selected channel with the form data.
3. Gmail Message: Sends an automatic email reply to the submitter, using their email from the form (see the optional sketch below).
4. Add Google Contact: Automatically creates a new contact in Google Contacts using the form data.

This setup is ideal for automating client communication and internal team alerts without manual input.
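If you want to personalize the Gmail reply, a small Code node (or equivalent expressions in the Gmail node) could assemble the message from the sheet row, roughly like the sketch below; the column names are examples.

```javascript
// Optional sketch: building a personalized reply from the sheet row.
// Field names (Name, Email, Message) are examples; match your form's columns.
const row = $input.first().json;

const subject = `Thanks for reaching out, ${row.Name || 'there'}!`;
const body = [
  `Hi ${row.Name || 'there'},`,
  '',
  'Thanks for your submission! We received the following message:',
  `"${row.Message || ''}"`,
  '',
  'We will get back to you shortly.',
].join('\n');

return [{ json: { to: row.Email, subject, body } }];
```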
by NanaB
This n8n workflow provides a comprehensive solution for user authentication and management, using Airtable as the backend database. It includes flows for user sign-up and login, as well as sample CRUD operations for retrieving and updating user details.

YouTube video of me explaining the flow: https://www.youtube.com/watch?v=gKcGfyq3dPM

How it Works

User Sign-Up Flow

1. Receives POST request: A webhook listens for POST requests containing new user details (email, first name, last name, password).
2. Checks for existing email: The workflow queries Airtable to see if the submitted email already exists.
3. Handles email in use: If the email is found, it responds with {"response": "email in use"}.
4. Creates new user: If the email is unique, the password is SHA256 hashed (Base64 encoded) and the user's information (including the hashed password) is stored in Airtable. A success response of {"response": "success"} is then sent.

User Login Flow

1. Receives POST request: A webhook listens for POST requests with the user's email and password.
2. Verifies user existence: It checks Airtable for a user with the provided email. If no user is found, it responds with a failure message ("wrong email").
3. Compares passwords: If a user is found, the submitted password is hashed (SHA256, Base64 encoded) and compared with the stored hash in Airtable (see the sketch below).
4. Responds with JWT or error: If the passwords match, a JWT token containing the user's ID and email is issued. If they don't match, a "wrong password" response is sent.

Flows for a Logged-In User

These flows require a JWT-authenticated request.

Get User Details:
- Webhook (GET): Receives a JWT-authenticated request.
- Airtable (Read): Fetches the current user's record using jwtPayload.id.
- Set node ("Specify Current Details"): Maps fields like "First Name," "Last Name," "Email," and "Date" from Airtable to a standard output format.

Update User Details:
- Webhook (POST): Receives updated user data (email, name, password).
- Airtable (Upsert): Updates the record matching jwtPayload.id using the submitted fields.
- Set node ("Specify New Details"): Outputs the updated data in a standard format.

Set Up Steps (Approx. 5 Minutes)

Step 1: Set up your Airtable Base and Table

You'll need an Airtable base and a table to store your user data. Make sure the table has at least the following columns:

- Email (Single Line Text)
- First Name (Single Line Text)
- Last Name (Single Line Text)
- Password (Single Line Text - this will store the hashed password)
- Date (Date - optional, for the user's sign-up date)

Step 2: Obtain an Airtable Personal Access Token

1. Go to the Airtable website and log in to your account.
2. Navigate to your personal access token page (usually found under your developer settings or by searching for "personal access tokens").
3. Click "Create new token."
4. Give your token a name (e.g., "n8n User Management").
5. Grant the necessary permissions: the data.records:read and data.records:write scopes for the specific base you will be using, and select the Airtable base where your user management table resides.
6. Generate the token and copy it immediately. You won't be able to see it again, so store it securely.

Step 3: Create a JWT Auth Credential in n8n

1. In your n8n instance, go to "Credentials" (usually found in the left-hand sidebar).
2. Click "New Credential" and search for "JWT Auth".
3. Give the credential a name (e.g., "UserAuthJWT").
4. For the "Signing Secret," enter a strong, random string of characters. This secret will be used to sign and verify your JWT tokens. Keep it highly confidential.
5. Save the credential.

Customization Options

This workflow is designed to be highly adaptable:

- Database Integration: Easily switch from Airtable to other databases like PostgreSQL, MySQL, MongoDB, or even Google Sheets by replacing the Airtable nodes with the appropriate database nodes in n8n.
- Authentication Methods: Extend the authentication to include multi-factor authentication (MFA), social logins (Google, Facebook), or integration with existing identity providers (IdPs) by adding additional nodes.
- User Profile Fields: Add or remove user profile fields (e.g., phone number, address, user roles) by adjusting the Airtable table columns and the Set nodes in the workflow.
- Notification System: Integrate notifications (e.g., email, SMS) for events like new user sign-ups, password resets, or account changes.
- Admin Panel: Build an admin panel in n8n to manage users directly, including adding, deleting, or updating user records and resetting passwords.

This workflow provides a solid foundation for building robust user management systems, adaptable to a wide range of applications and security requirements.

Need Assistance or Customization?

Do you have specific integrations in mind, or are you looking to add more user management features to this workflow? If you need help setting this up, or want to adapt it for a unique use case, don't hesitate to reach out! You can contact me directly at nanabrownsnr@gmail.com. I'd be glad to assist you.
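For clarity, here is a minimal sketch of the password comparison described in the login flow, written as plain Node.js; inside n8n the same hashing is typically done with the Crypto node, or in a Code node if built-in modules are allowed. The example inputs are hypothetical.

```javascript
// Minimal sketch of the login check: hash the submitted password
// (SHA256, Base64) and compare it with the hash stored in Airtable.
const crypto = require('crypto');

function hashPassword(plain) {
  return crypto.createHash('sha256').update(plain).digest('base64');
}

// Hypothetical inputs; in the workflow these come from the webhook body
// and the matching Airtable record respectively.
const submittedPassword = 'correct horse battery staple';
const storedHash = hashPassword('correct horse battery staple');

const matches = hashPassword(submittedPassword) === storedHash;
console.log(matches ? 'issue JWT' : 'respond with "wrong password"');
```

Note that unsalted SHA256 is a deliberately simple scheme, in line with the template's "solid foundation" framing; for production systems a salted algorithm such as bcrypt or argon2 is the safer choice.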
by Wessel Bulte
What this template does

Receives meeting data via a web form, cleans/structures it, fills a Word DOCX template, uploads the file to SharePoint, appends a row to Excel 365, and sends an Outlook email with the document attached.

Good to know

- Uses a community node, DocxTemplater, to render the DOCX from a template. Install it from the Community Nodes catalog.
- The template context is the workflow item JSON. Use matching placeholders in your DOCX file.
- Includes a minimal HTML form snippet (outside n8n) you can host anywhere. Replace the placeholder WEBHOOK_URL with your Webhook URL before testing.
- Microsoft nodes require Azure app credentials with the correct permissions (SharePoint, Excel/Graph, Outlook).

How it works

1. Webhook — Receives meeting form JSON (POST).
2. Code (Parse Meeting Data) — Parses/normalizes fields, builds semicolon-separated strings for attendees/absentees, and flattens discussion points / action items (see the sketch below).
3. SharePoint (Download) — Fetches the DOCX template (e.g., meeting_minutes_template.docx).
4. Merge — Combines the template binary + JSON context by position.
5. DocxTemplater — Renders meeting_{{now:yyyy-MM-dd}}.docx using the JSON context.
6. SharePoint (Upload) — Saves the generated DOCX to a target folder (e.g., /Meetings).
7. Microsoft Excel 365 (Append) — Appends a row to your sheet (Date, Time, Attendees, etc.).
8. Microsoft Outlook (Send message) — Emails the generated DOCX as an attachment.

Requirements

- Community node DocxTemplater installed
- Microsoft 365 access with credentials for:
  - SharePoint (download template + upload output)
  - Excel 365 (append to table/worksheet)
  - Outlook (send email)
- A Word template with placeholders matching the JSON keys

Need Help

🔗 LinkedIn – Wessel Bulte
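As a rough sketch of the Parse Meeting Data step (the field names here are examples; match them to your form payload and DOCX placeholders):

```javascript
// Rough sketch of the "Parse Meeting Data" Code node. Field names
// (title, date, attendees, absentees, actionItems) are examples only.
const body = $input.first().json.body || $input.first().json;

const toSemicolonList = (value) =>
  Array.isArray(value)
    ? value.map((v) => String(v).trim()).join('; ')
    : String(value || '');

return [{
  json: {
    title: body.title || 'Untitled meeting',
    date: body.date || new Date().toISOString().slice(0, 10),
    attendees: toSemicolonList(body.attendees),
    absentees: toSemicolonList(body.absentees),
    // Flatten action items into "owner: task" entries for the DOCX placeholder.
    actionItems: (body.actionItems || [])
      .map((a) => `${a.owner || 'unassigned'}: ${a.task || ''}`)
      .join('; '),
  },
}];
```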
by Hugues Stock
What does this template do?

This workflow sets a small "lock" value in Redis so that only one copy of a long job can run at the same time. If another trigger fires while the job is still busy, the workflow sees the lock, stops early, and throws a clear error. This protects your data and keeps you from hitting rate limits. Because the workflow also stores simple progress flags ("working", "loading", "finishing"), you can poll the current status and show live progress for very long jobs.

Use Case

Great when the same workflow can be called many times in parallel (for example by webhooks, cron jobs, or nested Execute Workflow calls) and you need an "only run once at a time" guarantee without building a full queue system.

What the Workflow Does

- ⚡ Starts through an Execute Workflow Trigger called by another workflow
- 🔄 A Switch sends the run to Get, Set, or Unset actions
- 💾 Redis reads or writes a key named process_status_<key> with a time-to-live (default 600 s)
- 🚦 If nodes check the key and decide to continue or stop
- ⏱️ Wait nodes stand in for the slow part of your job (replace these with your real work)
- 📈 Updates the key with human-readable progress values that another workflow can fetch with action = get
- 🏁 When done, the lock is removed so the next run can start

(See the code sketch at the end of this description for the underlying locking pattern.)

Apps & Services Used

- Redis
- Core n8n nodes (Switch, If, Set, Wait, Stop and Error)

Pre-requisites

- A Redis server that n8n can reach
- Redis credentials stored in n8n
- A second workflow that calls this one and sends:
  - action set to get, set, or unset
  - key set to a unique name for the job
  - an optional timeout in seconds

Customization Tips

- Increase or decrease the TTL in the Set Timeout node to match how long your job usually runs
- Add or rename status values ("working", "loading", "finishing", and so on) to show finer progress
- Replace Stop and Error with a Slack or email alert, or even push the extra trigger into a queue if you prefer waiting instead of failing
- Use different Redis keys if you need separate locks for different tasks
- Build a small "status endpoint" workflow that calls this one with action = get to display real-time progress to users

Additional Use Cases

- 🛑 Telegram callback spam filter: If a Telegram bot sends many identical callbacks in a burst, call this workflow first to place a lock. Only the first callback will proceed; the rest will exit cleanly until the lock clears. This keeps your bot from flooding downstream APIs.
- 🧩 External API rate-limit protection: Run heavy API syncs one after the other so parallel calls do not break vendor rate limits.
- 🔔 Maintenance window lock: Block scheduled maintenance tasks from overlapping, making sure each window finishes before the next starts.
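The sketch below shows the same locking pattern as standalone code using the ioredis client; this is only to illustrate the idea, since the template itself implements it with n8n's Redis, If, and Set nodes.

```javascript
// Standalone sketch of the locking pattern, using the ioredis client.
// The template implements this with n8n Redis nodes, not custom code.
const Redis = require('ioredis');
const redis = new Redis(); // defaults to localhost:6379

async function runExclusively(key, ttlSeconds, job) {
  const lockKey = `process_status_${key}`;
  // NX = only set if the key does not exist; EX = expire after ttlSeconds.
  const acquired = await redis.set(lockKey, 'working', 'EX', ttlSeconds, 'NX');
  if (acquired !== 'OK') {
    throw new Error(`Job "${key}" is already running`);
  }
  try {
    await redis.set(lockKey, 'loading', 'EX', ttlSeconds);   // progress flag
    await job();
    await redis.set(lockKey, 'finishing', 'EX', ttlSeconds); // progress flag
  } finally {
    await redis.del(lockKey); // release the lock so the next run can start
  }
}

// Example: only one copy of this job can run at a time.
runExclusively('nightly-sync', 600, async () => {
  /* slow work goes here */
}).catch(console.error);
```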
by Grigory Frolov
WordPress Blog to Google Sheets Sync: Posts • Categories • Tags • Media

🧩 Overview

This n8n workflow automatically syncs your WordPress website content — including posts, categories, tags, and media — into Google Sheets. It helps automate content reporting, SEO analysis, and data backups. The workflow can run on a schedule or on demand via a webhook.

💡 Use cases

- Maintain a live database of blog posts in Google Sheets.
- Create dashboards in Google Data Studio or Looker Studio.
- Track new articles for newsletters or social media scheduling.
- Back up all WordPress content and media outside of your CMS.

⚙️ Prerequisites

Before importing the workflow, ensure you have:

- A WordPress website with the REST API enabled (default in WP 4.7+).
- Authentication: either Application Passwords or Basic Auth credentials.
- A Google Sheet with the following tabs: Posts, Categories, Tags, Media.
- The following credentials configured in n8n:
  - HTTP Basic Auth (for WordPress)
  - Google Sheets OAuth2

🚀 Setup instructions

1. Import the workflow into your n8n instance.
2. Replace all example WordPress API URLs with your domain, for example: https://yourdomain.com/wp-json/wp/v2/
3. Connect your HTTP Basic Auth credentials (WordPress username + Application Password).
4. Connect your Google Sheets OAuth2 account.
5. Update the spreadsheet ID in each Google Sheets node with your own.
6. Adjust the Schedule Trigger (e.g. run daily at 2:00 AM).
7. Run once manually to verify the data sync.

🧠 Workflow structure

| Section | Description |
|----------|--------------|
| Schedule / Webhook Trigger | Starts the workflow manually or automatically |
| Variables & Loop Vars | Initialize pagination for REST API requests |
| Get Posts → Split Out → Update Posts | Fetch and update all WordPress posts |
| Get Categories → Update Categories | Sync WordPress categories |
| Get Tags → Update Tags | Sync WordPress tags |
| Get Media → Split Out → Update Media | Sync the media library (images, videos, etc.) |
| IF Loops | Handle pagination logic until all items are retrieved |

⚠️ Notes & Limitations

- Works with standard WordPress REST API endpoints only. Custom post types require editing the endpoint URLs.
- The per_page value defaults to 10; increase it for faster syncs (see the pagination sketch below).
- For large sites, consider increasing n8n memory or adding execution logs.
- Avoid running the workflow too frequently to prevent API rate limits.

🎥 Video Tutorial

A step-by-step setup guide is available here:
👉 https://www.youtube.com/watch?v=czSMWyD6f-0

Please subscribe to my YouTube channel to support me:
👉 https://www.youtube.com/@gregfrolovpersonal

👨‍💻 Author

Created by: Grigory Frolov
SEO & Automation Specialist — helping businesses integrate WordPress, AI, and data tools with n8n.

🧾 License

This workflow is provided under the MIT License. Feel free to use, modify, and share improvements with the community.
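For readers who want to see the pagination logic in plain code, here is a standalone sketch using the standard WordPress REST API per_page/page parameters and the X-WP-TotalPages response header; the workflow itself implements the same loop with HTTP Request and IF nodes.

```javascript
// Standalone sketch of the pagination loop the workflow's IF/Loop nodes implement.
// Uses the standard WordPress REST API; swap in your own domain.
const BASE = 'https://yourdomain.com/wp-json/wp/v2';

async function fetchAllPosts() {
  const all = [];
  let page = 1;
  let totalPages = 1;

  do {
    const res = await fetch(`${BASE}/posts?per_page=100&page=${page}`);
    if (!res.ok) throw new Error(`WP API error: ${res.status}`);
    // WordPress reports how many pages exist for this query.
    totalPages = Number(res.headers.get('x-wp-totalpages') || 1);
    all.push(...(await res.json()));
    page += 1;
  } while (page <= totalPages);

  return all;
}

fetchAllPosts().then((posts) => console.log(`Fetched ${posts.length} posts`));
```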