by Miquel Colomer
This workflow extracts data from a multi-page website. The workflow:
1) Starts from the country list at https://www.theswiftcodes.com/browse-by-country/.
2) Loads every country page (e.g., https://www.theswiftcodes.com/albania/).
3) Paginates through every page within a country.
4) Extracts the data from each country page.
5) Saves the data to MongoDB.
6) Repeats the pagination for all pages in all countries.

It uses the getWorkflowStaticData('global') method to recover the next page (saved while processing the previous page) and then continues through the remaining pages. A first section recovers and extracts the country list. Next, the workflow checks whether a locally cached copy of the page is available and, if so, reads it from disk. Finally, it saves the data to MongoDB and paginates through all pages of every country.

I have added a cache system that saves each visited page to the n8n local disk. If you relaunch the workflow, it checks whether a cache file exists and skips any requests to the website that are no longer needed (see the sketch below). If the data on the website changes, you can add a Cron node to re-check the site once per week. Before inserting data into MongoDB, the best way to avoid duplicates is to check that the swift_code (the primary key of the collection) does not already exist.

I recommend using a proxy for all requests to avoid IP blocks; a good solution for proxying plus IP rotation is scrapoxy.io. This workflow is well suited to small data requirements. If you need to scrape dynamic content, use a headless browser or a similar service. If you want to scrape huge lists of URIs, I recommend Scrapy + Scrapoxy.
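To make the caching and pagination logic concrete, here is a minimal sketch of what the relevant Function/Code node could look like. The cache directory, file naming, and output fields are illustrative assumptions rather than the original implementation, and on self-hosted n8n the fs and crypto modules must be allowed via NODE_FUNCTION_ALLOW_BUILTIN.

```javascript
// Minimal sketch (assumed cache path and naming, not the original implementation).
const fs = require('fs');
const crypto = require('crypto');

// Recover the next page URL saved while processing the previous page
const staticData = getWorkflowStaticData('global');
const url = staticData.nextPageUrl || 'https://www.theswiftcodes.com/browse-by-country/';

// One cache file per URL, keyed by an MD5 hash of the URL
const cacheDir = '/tmp/swift-cache';
const cacheFile = `${cacheDir}/${crypto.createHash('md5').update(url).digest('hex')}.html`;

let html = null;
let fromCache = false;
if (fs.existsSync(cacheFile)) {
  // A cached copy exists: skip the HTTP request on relaunch
  html = fs.readFileSync(cacheFile, 'utf8');
  fromCache = true;
}

// Downstream nodes fetch the page only when fromCache is false and then
// write the response back to cacheFile for the next run.
return [{ json: { url, fromCache, html } }];
```

Before the MongoDB insert, a lookup on swift_code (for example with the MongoDB node's find operation) keeps duplicates out of the collection.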
by Zacharia Kimotho
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

What does this workflow do?
This workflow speeds up the analysis of top-ranking titles and meta descriptions to identify the patterns and styles that will help us rank on Google for a given keyword.

How does it work?
We provide a keyword we are interested in on our Google Sheet. When executed, we scrape the top 10 pages using the Bright Data SERP API, analyse the style and patterns of the top-ranking pages, and generate a new title and meta description.

Technical setup
- Make a copy of this Google Sheet
- Update your desired keywords on the cell/row
- Set your Bright Data credentials
- Update the zone to your preset zone
- We get the results as JSON. You can change this in the URL https://www.google.com/search?q={{ $json.search_term.replaceAll(" ", "+")}}&start=0&brd_json=1 by removing the brd_json=1 query parameter (a sketch of building this URL follows below)
- Store the generated results on the duplicated sheet
- Run the workflow

Setting up the SERP scraper in Bright Data
- On Bright Data, go to the Proxies & Scraping tab
- Under SERP API, create a new zone
- Give it a suitable name and description; the default is serp_api
- Add this to your account
- Add your credentials as a header credential and rename it to "Bright data API"
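As a companion to the URL shown in the setup list, here is a hedged sketch of how the SERP request URL can be assembled in a Code node from the sheet's search_term column; the serp_url output field is an illustrative name, not part of the original template.

```javascript
// Sketch: build the Bright Data SERP request URL for each Google Sheet row.
// `search_term` is the sheet column referenced in the original expression;
// `serp_url` is an illustrative output field name.
return $input.all().map(item => {
  const keyword = (item.json.search_term || '').trim();
  // Remove &brd_json=1 if you prefer raw HTML instead of JSON results
  const url = `https://www.google.com/search?q=${keyword.replaceAll(' ', '+')}&start=0&brd_json=1`;
  return { json: { ...item.json, serp_url: url } };
});
```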
by Emmanuel Bernard
🎉 Do you want to master AI automation, so you can save time and build cool stuff? I’ve created a welcoming Skool community for non-technical yet resourceful learners. 👉🏻 Join the AI Atelier 👈🏻

Unlock streamlined Zoom Meeting organization and exclusive access management with this n8n workflow. Designed for educators, event organizers, and businesses, this tool automates your event logistics, so you can focus on delivering valuable content.

Features
- Zoom Meetings Creation: Instantly generate new Zoom meetings with the n8n built-in form.
- Collect Payments Using Stripe: Effortlessly monetize your events with secure, automatically created Stripe payment pages for each meeting.
- Exclusive Gated Access: Ensure your content remains exclusive by sending Zoom meeting passwords only to verified subscribers who have completed their payment through Stripe.
- Participants Email Notifications: Automate the distribution of Zoom meeting details post-payment, eliminating the need for manual email management and ensuring participants are promptly informed.
- Instant and Easy Participants Overview: Manage and track your event registrations with ease. All related data is stored in a Google Sheets document that you own. You're notified via email with each new subscription, simplifying participant management.

Set Up Steps
1. Connect your Zoom, Stripe, Gmail and Google Sheet credentials.
2. Create an empty Google Sheet in your Google Drive.
3. Fill the config node (Sheet URL, email and currency).
4. Edit the email text.

This n8n workflow template is designed to minimize setup time and maximize efficiency, allowing you to focus on delivering value to your subscribers. With just a few clicks, you can automate the entire process of organizing and monetizing your Zoom meetings.

Created by the n8ninja.
by galelem
This n8n workflow automates the entire pipeline of generating, formatting, and publishing SEO-rich blog posts to a Blogger site, ideal for auto service businesses.

What it does:
⏱ Runs on a schedule via the Schedule Trigger
📰 Fetches trending news from Mediastack (technology category)
🖼 Generates relevant images using the Pexels API
🧠 Creates SEO-optimized content using AI agents (LangChain & OpenRouter)
📝 Formats content into Blogger-compatible HTML, including title, metadata, images, FAQs, and internal linking
🔄 Posts directly to Blogger via the authenticated Google Blogger API
📢 Sends Telegram notifications with previews and publishing confirmations
🔐 Uses secure credentials (no hardcoded API keys)

Ideal for:
- Bloggers and marketers looking to automate content creation
- Auto repair, dealership, or detailing businesses maintaining a content strategy
- Agencies managing multiple Blogger-based SEO campaigns
by Julien DEL RIO
This template is inspired by "Save your workflows into a GitHub repository" by hikerspath and "Back Up Your n8n Workflows To Github" by jon-n8n.

Basic
Retrieves all workflows from an n8n instance and saves them in a GitLab project. If a workflow already exists, only the changes are committed.

Flow
What the workflow does:
- Sets custom parameters
- Gets the workflows
- Iterates through each workflow one by one
- Gets the file from GitLab if it exists
- Compares the files as objects (not as strings); see the sketch below
- Returns a status for the workflow
- Creates, edits, or ignores the file depending on the status
- Returns a list of statuses for all workflows

Configuration
Select a credential in each GitLab node. Edit the data in the "Globals" node:
- repo.owner: slug of the user or team owning the repo
- repo.name: slug of the repository
- repo.branch: branch to commit on
- repo.path: path from the root of the repository; should end with /

Comments
An error in a GitLab node will not stop the run, but the current workflow will be listed as an error in the results. Some fields are ignored when determining whether there are changes:
- updatedAt: should be ignored, since it changes even when only ignored fields are modified
- globals: runtime information, so there is no need to track its changes
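A minimal sketch of the "compare as objects" step, assuming the workflow fetched from the n8n API and the decoded GitLab file content arrive as parsed JSON on the item; the property names holding them are illustrative, not the template's actual field names.

```javascript
// Sketch of the object comparison. The ignored-field list follows the description
// above; `n8nWorkflow` and `gitlabFile` are assumed property names.
const ignoredFields = ['updatedAt'];

function normalize(workflow) {
  if (!workflow) return undefined;
  const copy = JSON.parse(JSON.stringify(workflow));
  for (const field of ignoredFields) delete copy[field];
  return copy;
}

const current = normalize($json.n8nWorkflow); // workflow fetched from the n8n API
const stored = normalize($json.gitlabFile);   // decoded file content from GitLab, if any

let status;
if (stored === undefined) {
  status = 'create';   // file does not exist in the repository yet
} else if (JSON.stringify(current) === JSON.stringify(stored)) {
  status = 'same';     // nothing changed, ignore
} else {
  status = 'update';   // commit the new version
}

return [{ json: { name: current.name, status } }];
```

Note that a plain JSON.stringify comparison is sensitive to key order, so a deep-equality check is safer when the two sources may serialize keys differently.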
by Oneclick AI Squad
This automated n8n workflow performs daily forecasting of sales and raw material needs for a restaurant. By analyzing historical data and predicting future usage with AI, businesses can minimize food waste, optimize inventory, and improve operational efficiency. The forecast is stored in Google Sheets and sent via email for easy review by staff and management.

What is AI Forecast Generator?
The AI Forecast Generator is a machine learning component that analyzes historical sales data, weather patterns, and seasonal trends to predict future food demand and recommend optimal inventory levels to minimize waste.

Good to Know
- AI forecasting accuracy improves over time with more historical data
- Weather and seasonal factors significantly impact food demand predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications help ensure timely review of daily forecasts
- The system works with two main data sources: historical food wastage data and predicted low-waste food requirements

How It Works
1. Daily Trigger - Initiates the workflow every day to perform food waste prediction
2. Fetch Historical Sales Data - Reads past food usage & sales data from Google Sheets to understand trends
3. Format Data for AI Forecasting - Cleans and organizes raw data into a structured format for AI processing
4. AI Forecast Generator - Uses Gemini AI to forecast food demand and recommend waste reduction strategies
5. Clean & Structure AI Output - Parses the AI response into a structured and actionable format for reporting (see the sketch below)
6. Log Forecast to Google Sheets - Stores the AI-generated forecast back into Google Sheets for historical tracking
7. Create Email Summary - Creates a concise, human-friendly summary of the forecast findings
8. Send Email Forecast Report - Delivers the forecast report via email to decision makers and management

Data Sources
The workflow utilizes two Google Sheets:
- Food Wastage Data Sheet - Contains historical data with columns: Date (date), Food Item (text), Quantity Wasted (number), Cost Impact (currency), Category (text), Reason for Waste (text)
- Predicted Food Data Sheet - Contains AI predictions with columns: Date (date), Food Item (text), Predicted Demand (number), Recommended Order Quantity (number), Waste Risk Level (text), Optimization Notes (text)

How to Use
1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up email credentials for forecast report delivery
4. Create the two required Google Sheets with the specified column structures
5. Configure the AI model credentials (Gemini API key)
6. Test with sample historical data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

Requirements
- Google Sheets API access
- Email service credentials (Gmail, SMTP, etc.)
- AI model API credentials (Gemini AI)
- Historical food wastage data for initial training

Customizing This Workflow
Modify the AI Forecast Generator prompts to focus on specific food categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences and add additional data sources like supplier information or menu planning data.
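For step 5 (Clean & Structure AI Output), here is a hedged sketch of a Code node that parses the Gemini reply into rows for the Predicted Food Data sheet. The input property holding the model text and the lower-case JSON keys returned by the model are assumptions that depend on your prompt, not the template's exact implementation.

```javascript
// Sketch: clean the model reply and map it onto the "Predicted Food Data Sheet"
// columns. `$json.text` and the lower-case keys are assumed output formats.
const raw = $json.text || '';

// Drop markdown code-fence lines the model may wrap around its JSON
const cleaned = raw
  .split('\n')
  .filter(line => !line.trim().startsWith('`'))
  .join('\n')
  .trim();

let forecast;
try {
  forecast = JSON.parse(cleaned);
} catch (error) {
  throw new Error('AI response was not valid JSON: ' + error.message);
}

// One n8n item per forecast row, ready for the Google Sheets node
return forecast.map(row => ({
  json: {
    'Date': row.date,
    'Food Item': row.food_item,
    'Predicted Demand': row.predicted_demand,
    'Recommended Order Quantity': row.recommended_order_quantity,
    'Waste Risk Level': row.waste_risk_level,
    'Optimization Notes': row.optimization_notes,
  },
}));
```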
by Joachim Hummel
This workflow connects a USB scanner to Nextcloud via ScanservJS and its integrated API. It checks for new scans at a scheduled interval (e.g., every 5 minutes). If there are any, they are automatically retrieved via HTTP request and then saved to the desired Nextcloud folder (a rough code sketch of the retrieval step follows below). Ideal for home offices, offices, or maker projects with a Raspberry Pi and network scanners.

Nodes used:
- Schedule Trigger – starts the flow cyclically
- HTTP Request – retrieves document data from ScanservJS
- Nextcloud node – uploads the file directly to your Nextcloud account

Requirements:
- Local installation of ScanservJS (e.g., on a Raspberry Pi)
- Configured USB scanner
- Nextcloud access with write permissions in the target folder
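If you prefer to see the polling step as code rather than an HTTP Request node, here is a rough sketch from a Code node. The ScanservJS host and the endpoint path are assumptions and may differ between versions, so verify them against your installation's API before relying on this.

```javascript
// Rough sketch of listing finished scans from ScanservJS inside a Code node.
// The base URL and the /api/v1/files route are assumptions; check your
// ScanservJS version's API documentation.
const baseUrl = 'http://raspberrypi.local:8080'; // hypothetical scanner host

const files = await this.helpers.httpRequest({
  method: 'GET',
  url: `${baseUrl}/api/v1/files`,
  json: true,
});

// Hand each scan to the next node, which downloads it and uploads it to Nextcloud
return files.map(file => ({
  json: {
    name: file.name,
    downloadUrl: `${baseUrl}/api/v1/files/${encodeURIComponent(file.name)}`,
  },
}));
```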
by Ari Nakos
This n8n workflow automates lead generation by searching Reddit for relevant posts based on keywords, filtering them (an illustrative filter sketch follows below), using OpenRouter AI to analyze and summarize the content, and logging the findings (link, summary, etc.) to Google Sheets. Watch the full tutorial on how I set up this ETL pipeline with n8n: https://youtu.be/F3-fbU3UmYQ

Required Authentication
To run this workflow, you need to set up credentials in n8n for:
- Reddit: uses OAuth 2.0. Requires creating an app on Reddit to get a Client ID & Secret. (YT tutorial for Reddit app creation: https://youtu.be/zlGXtW4LAK8)
- OpenRouter: uses an API key. Generate this key directly from your OpenRouter account settings. (YT tutorial: https://youtu.be/Cq5Y3zpEhlc)
- Google Sheets: uses OAuth 2.0. Requires setup in the Google Cloud Console (enable the Sheets API, create an OAuth Client ID with the n8n redirect URI) to get a Client ID & Secret.

Ensure these credentials are created and selected in the respective n8n nodes (Get Posts, OpenRouter Chat Model nodes, Output The Results).
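Between the Reddit search and the AI step, a simple relevance filter keeps token usage down. The keywords and thresholds below are illustrative assumptions, not the exact criteria shipped in the template; the Reddit field names (title, selftext, ups, num_comments) are standard API fields.

```javascript
// Illustrative filter for Reddit posts before sending them to OpenRouter.
// Keywords and thresholds are placeholder assumptions; tune them to your niche.
const keywords = ['lead generation', 'automation', 'n8n'];

return $input.all().filter(item => {
  const post = item.json;
  const text = `${post.title || ''} ${post.selftext || ''}`.toLowerCase();
  const matchesKeyword = keywords.some(k => text.includes(k));
  const hasEngagement = (post.ups || 0) >= 5 || (post.num_comments || 0) >= 2;
  return matchesKeyword && hasEngagement;
});
```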
by Nazmy
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

OAuth Token Generator and Validator
This n8n template helps you generate, validate, and store tokens for your customers securely using:
- n8n as your backend automation engine
- Airtable as your lightweight client and token store

🚀 What It Does
- Accepts client_id and client_secret via a POST webhook.
- Validates the client credentials against Airtable.
- Generates a long token on success.
- Stores the generated token in Airtable with metadata.
- Responds with JSON containing the token, expiry, and type.
- Returns clear error messages if validation fails.

How It Works
1. Webhook node receives client_id and client_secret.
2. Validator (Code node) checks that the body contains only client_id and client_secret, and rejects missing or extra fields.
3. Airtable search looks up the client_id and rejects the request if it is not found.
4. Secret validation (If node) compares the provided client_secret with the stored value and rejects the request if it is incorrect.
5. Token generation (Code node) generates a 128-character secure token (see the sketch below).
6. Airtable create stores the token, client ID, creation date, and type.
7. Webhook response returns JSON { access_token, expires_in, token_type } on success, or an appropriate JSON error message on failure.

Related Workflow
You can also use it with the published Bearer Token Validation workflow: 👉 Validate API Requests with Bearer Token Authentication and Airtable, to securely validate the tokens you generate with this workflow across your protected endpoints.

Why Use This
- Provides OAuth-like flows without a complex backend.
- Uses n8n + Airtable for client management and token storage.
- Clean, modular, and ready for your SaaS or internal API automations.
- Extendable for token expiry, refresh, and rotation handling.

Enjoy building secure token-based APIs using n8n + Airtable! 🚀

Built by: Nazmy
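As a reference for step 5, here is a minimal sketch of the token-generation Code node. The expiry value and the exact fields handed to the Airtable create step are assumptions, and on self-hosted n8n the crypto module must be allowed via NODE_FUNCTION_ALLOW_BUILTIN.

```javascript
// Minimal sketch: 64 random bytes -> 128 hex characters, plus the metadata
// stored in Airtable. The expires_in value is an assumption.
const crypto = require('crypto');

const accessToken = crypto.randomBytes(64).toString('hex'); // 128-character token

return [{
  json: {
    access_token: accessToken,
    token_type: 'Bearer',
    expires_in: 3600,                    // assumed: 1 hour
    client_id: $json.client_id,          // carried over for the Airtable create step
    created_at: new Date().toISOString(),
  },
}];
```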
by Edoardo Guzzi
This template integrates OpenAI's image generation and editing endpoints via the GPT-Image-1 model to create and manipulate images from prompts. It features base64 conversion, binary handling, and prompt chaining. Perfect for marketing, design, product visuals and creative workflows.

🛠️ Requirements
- OpenAI account with access to gpt-image-1 (organization verification may be required to access this model)
- OpenAI API credentials configured in n8n
- A self-hosted or cloud n8n instance
- Basic familiarity with the n8n UI (no programming required)

🔧 Step-by-step Instructions
Step 1: Manual Trigger
Starts the workflow on click. Ideal for testing the generation and edit logic.

Step 2: Generate Image
The "Create image call" node sends a prompt to OpenAI and returns a base64 image. Example prompt: A cyberpunk city at night with flying cars and neon lights

Step 3: Convert to Binary
The base64 image is converted into a usable binary PNG file with the "Convert json binary to File" node (see the sketch below).

Step 4: Edit the Image
The binary file is passed to OpenAI's /images/edits endpoint. A new prompt applies changes to the image. Example: Add a glowing robot in the foreground with a neon sword
✅ Supports model: gpt-image-1
⚠️ Requires a binary file (not base64)

Step 5: Final Conversion
Converts the final edited image from base64 to a file so it can be downloaded or used in other nodes.

🎯 Real-World Use Cases
- 🎨 Artists & Creators: concept art and illustration variations
- 🛍️ E-commerce: auto-generate product mockups
- 📰 Marketing: create eye-catching blog or social visuals

💡 Bonus Ideas
- Add a Telegram or Slack node to generate or edit images via chat
- Use a Webhook to feed prompts from a form or frontend
- Add a mask to restrict edits to specific areas (e.g., background only)
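If you would rather do Step 3 in a Code node instead of the conversion node, here is a hedged sketch. The data[0].b64_json path follows the OpenAI images response format, while the output binary property and file name are illustrative choices.

```javascript
// Sketch: turn the base64 image returned by the images endpoint into n8n binary data.
// `data[0].b64_json` is the OpenAI images response shape; `image.png` is an
// illustrative file name.
const b64 = $json.data?.[0]?.b64_json;
if (!b64) {
  throw new Error('No base64 image found on the incoming item');
}

const binary = await this.helpers.prepareBinaryData(
  Buffer.from(b64, 'base64'),
  'image.png',
  'image/png',
);

return [{ json: {}, binary: { data: binary } }];
```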
by Preston Zeller
How It Works
This n8n workflow creates an automated system for discovering high-potential real estate investment opportunities. The workflow runs on a customizable schedule to scan the market for properties that match your specific criteria, then alerts your team about the most promising leads.

The process follows these steps:
1. Connects to the BatchData API on a regular schedule to search for properties matching your parameters
2. Compares new results with previous scans to identify new listings and property changes
3. Applies intelligent filtering to focus on high-potential opportunities (high equity, absentee owners, etc.); see the filter sketch below
4. Retrieves comprehensive property details and owner information for qualified leads
5. Delivers formatted alerts through multiple channels (email and Slack/Teams)

Each email alert includes detailed property information, owner details, equity percentage, and a direct Google Maps link to view the property location. The workflow also posts concise notifications to your team's communication channels for quick updates.

Who It's For
This workflow is designed for:
- Real Estate Investors: find off-market properties with high equity and motivated sellers
- Real Estate Agents: identify potential listing opportunities before they hit the market
- Property Acquisition Teams: streamline the lead generation process with automated scanning
- Real Estate Wholesalers: discover properties with significant equity spreads for potential deals
- REITs and Property Management Companies: monitor market changes and expansion opportunities

The workflow is especially valuable for professionals who want to:
- Save hours of manual market research time
- Get early notifications about high-potential properties
- Access comprehensive property and owner information in one place
- Focus their efforts on the most promising opportunities

About BatchData
BatchData is a powerful property data platform for real estate professionals. Their API provides access to comprehensive property and owner information across the United States, including:
- Property details (bedrooms, bathrooms, square footage, year built, etc.)
- Valuation and equity estimates
- Owner information (name, mailing address, contact info)
- Transaction history and sales data
- Foreclosure and distressed property status
- Demographic and neighborhood data

The platform specializes in providing accurate, actionable property data that helps real estate professionals make informed decisions and identify opportunities efficiently. BatchData's extensive database covers millions of properties nationwide and is regularly updated to ensure data accuracy. The API's flexible search capabilities allow you to filter properties based on numerous criteria, making it an ideal data source for automated lead generation workflows like this one.
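The filtering step (step 3 above) could look roughly like this in a Code node. The field names and the equity threshold are assumptions to illustrate the idea; map them to the fields your BatchData response actually returns.

```javascript
// Illustrative lead-qualification filter. `equityPercent` and `ownerOccupied`
// are assumed field names; adjust them to the actual BatchData response.
const MIN_EQUITY_PERCENT = 40; // assumed threshold

return $input.all().filter(item => {
  const property = item.json;
  const highEquity = (property.equityPercent || 0) >= MIN_EQUITY_PERCENT;
  const absenteeOwner = property.ownerOccupied === false; // owner's mailing address differs
  return highEquity || absenteeOwner;
});
```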
by Cheney Zhang
Create a Paul Graham Essay Q&A System with OpenAI and Milvus Vector Database

How It Works
This workflow creates a question-answering system based on Paul Graham's essays. It has two main steps:
1. Data Collection & Processing: scrapes Paul Graham's essays, extracts the text content, and loads it into a Milvus vector store
2. Chat Interaction: provides a question-answering interface over the stored vector embeddings, using OpenAI embeddings for semantic search

Set Up Steps
1. Set up a Milvus server following the official guide
2. Create a collection named "my_collection" (see the sketch below)
3. Run the workflow to scrape and load Paul Graham's essays
4. Start chatting with the QA system

The workflow handles the entire process from fetching essays, extracting content, generating embeddings via OpenAI, storing vectors in Milvus, and providing retrieval for question answering.
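Before running the workflow, the my_collection collection has to exist in Milvus. A rough sketch with the Milvus Node.js SDK is shown below; the exact call shape can differ between SDK versions, and the 1536 dimension assumes an OpenAI embedding model of that size, so treat this as a starting point rather than the workflow's own setup code.

```javascript
// Rough sketch: create the "my_collection" collection with the Milvus Node.js SDK.
// The simplified createCollection signature and the 1536 dimension are assumptions;
// check your SDK version and the embedding model configured in n8n.
const { MilvusClient } = require('@zilliz/milvus2-sdk-node');

const client = new MilvusClient({ address: 'localhost:19530' });

await client.createCollection({
  collection_name: 'my_collection',
  dimension: 1536, // must match the OpenAI embedding size used in the workflow
});
```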