by Simon Mayerhofer
This workflow batch updates/inserts Airtable rows in groups of 10, significantly reducing the number of API calls and improving performance.

🚀 How It Works

Copy the 3 Nodes
Copy the three nodes inside the red note box into your workflow.

Set Your Fields
In the Set Fields node, define the fields you want to update.
➤ Only include fields that match column names in your Airtable table.
➤ Make sure the field names are spelled exactly as they appear in Airtable.
➤ Make sure the field types are set correctly: number columns in Airtable need the field set to a number type.

Configure the Airtable Batch Node
Enter your Airtable Base ID (the part starting with app... in the URL): airtable.com/app8pqOLeka1Cglwg/tblnXZOdy8VtkAAJD/...
Enter your Airtable Table ID (the part starting with tbl... in the URL): airtable.com/app8pqOLeka1Cglwg/tblXXZOdy8VtkAAJD/...

Set Matching Fields (fieldsToMergeOn)
Provide a string array that tells Airtable how to match existing rows. Examples:
- Match by one field (e.g. TaskID): {{["TaskID"]}}
- Match by multiple fields (e.g. firstname and lastname): {{["firstname", "lastname"]}}

Choose the Mode (mode field)
Available options:
- upsert: update if a record exists, otherwise insert a new one
- insert: always insert as new records
- update: only update existing records (you must provide a field named id)
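The batching idea behind these nodes can be sketched in a few lines of JavaScript. This is an illustrative sketch, not the template's actual Code node: the field names (TaskID, Status) are assumptions, while the request-body shape follows Airtable's documented performUpsert format.

```javascript
// Airtable's record endpoints accept at most 10 records per request,
// so incoming items are chunked into groups of 10 before each API call.
function chunkRecords(records, size = 10) {
  const chunks = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

// Build one upsert request body per chunk; the merge key and field names
// (TaskID, Status) are illustrative, not taken from the template.
function buildUpsertBody(chunk, fieldsToMergeOn = ["TaskID"]) {
  return {
    performUpsert: { fieldsToMergeOn },
    records: chunk.map((fields) => ({ fields })),
  };
}

const rows = Array.from({ length: 23 }, (_, i) => ({ TaskID: i + 1, Status: "done" }));
const bodies = chunkRecords(rows).map((chunk) => buildUpsertBody(chunk));
// 23 rows become 3 PATCH calls instead of 23 single-record calls.
```

Each body would then be sent as one PATCH request to `/v0/{baseId}/{tableId}`, which is why the call count drops by roughly a factor of ten.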
by Raquel Giugliano
++HOW IT WORKS:++

This workflow automates the processing of invoices sent via Telegram. It extracts the data using LlamaIndex OCR, logs it in Google Sheets, and optionally pushes the structured data to SAP Business One.

🔹 1. Receive Invoice via Telegram
A user sends a PDF of an invoice through Telegram. A Telegram Trigger node listens for incoming messages and captures the file and metadata. The document is downloaded and prepared for OCR.

🔹 2. OCR with LlamaIndex
The file is uploaded to the LlamaIndex OCR API. The workflow polls the API until the processing status returns SUCCESS. Once ready, the parsed content is fetched in Markdown format.

🔹 3. Data Extraction via LLM (editable)
The Markdown content is sent to a language model (LLM) using LangChain. A Structured Output Parser transforms the result into clean, structured, editable JSON.

🔹 4. Save to Google Sheets
The structured JSON is split into:
- Header (main invoice metadata)
- Detail (individual line items)
Each part is stored in a dedicated tab within a connected Google Sheets file.

🔹 5. Ask for SAP Confirmation
The bot replies to the user via Telegram: "Do you want to send the data to SAP?" If the user clicks "Yes", the next automation path is triggered.

🔹 6. Push Data to SAP B1
A connection is made to SAP Business One's Service Layer API. Header and detail data are fetched from Google Sheets, and the invoice structure is rebuilt as required by SAP (DocumentLines, CardCode, etc.). A POST request creates the Purchase Invoice in SAP, and a confirmation message with the created DocEntry is sent back to the user on Telegram.

++SET UP STEPS:++

Follow these steps to properly configure the workflow before execution:

1️⃣ Create Required Credentials
Go to Credentials > + New Credential and create the following:
- Telegram API (set your bot token; get it from BotFather)
- Google Sheets
- OpenAI

2️⃣ Set Up Environment Variables (Optional but Recommended)
- LLAMAINDEX_API_KEY
- SAP_USER
- SAP_PASSWORD
- SAP_COMPANY_DB
- SAP_URL

3️⃣ Prepare Google Sheets
Ensure your Google Spreadsheet has the following:
➤ Sheet 1: Header
➤ Sheet 2: Details (contains columns for invoice lines)
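The polling loop in step 2 can be sketched as follows. This is a hedged illustration, not the template's actual node configuration: the terminal status value SUCCESS comes from the description above, while the `fetchStatus` callback, the ERROR status, and the backoff numbers are assumptions.

```javascript
// A job is terminal when the OCR API reports a final status.
// "SUCCESS" is taken from the workflow description; "ERROR" is an assumption.
function isJobFinished(statusResponse) {
  return statusResponse.status === "SUCCESS" || statusResponse.status === "ERROR";
}

// Linear backoff between polls, capped at 30 seconds (illustrative values).
function nextDelayMs(attempt, baseMs = 2000, capMs = 30000) {
  return Math.min(baseMs * (attempt + 1), capMs);
}

// fetchStatus is a caller-supplied async function, e.g. a GET request to the
// LlamaIndex job-status endpoint, returning the parsed JSON response.
async function pollJob(fetchStatus, maxAttempts = 20, baseMs = 2000) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetchStatus();
    if (isJobFinished(res)) return res;
    await new Promise((resolve) => setTimeout(resolve, nextDelayMs(attempt, baseMs)));
  }
  throw new Error("OCR job did not finish in time");
}
```

In n8n the same pattern is usually built with a Wait node and an IF node looping back until the status check passes.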
by Joseph LePage
🤖 This n8n workflow creates an intelligent Telegram bot that processes multiple types of messages and provides automated responses using AI capabilities. The bot serves as a personal assistant that can handle text, voice messages, and images through a sophisticated processing pipeline.

Core Components

Message Reception and Validation 📥
🔄 Implements webhook-based message reception for real-time processing
🔐 Features a robust user validation system that verifies sender credentials
🔀 Supports both testing and production webhook endpoints for development flexibility

Message Processing Pipeline ⚡
🔄 Uses a smart router to detect and categorize incoming message types
📝 Processes three main message formats:
💬 Text messages
🎤 Voice recordings
📸 Images with captions

AI Integration 🧠
🤖 Leverages OpenAI's GPT-4 for message classification and processing
🗣️ Incorporates voice transcription capabilities for audio messages
👁️ Features image analysis using GPT-4 Vision API for processing visual content

Technical Architecture

Webhook Management 🔌
🌐 Maintains separate endpoints for testing and production environments
📊 Implements automatic webhook status monitoring
⚡ Provides real-time webhook configuration updates

Error Handling ⚠️
🔍 Features comprehensive error detection and reporting
🔄 Implements fallback mechanisms for unprocessable messages
💬 Provides user feedback for failed operations

Message Classification System 📋
🏷️ Categorizes incoming messages into tasks and general conversation
🔀 Implements separate processing paths for different message types
🧩 Maintains context awareness across message processing

Security Features

User Authentication 🔒
✅ Validates user credentials against predefined parameters
👤 Implements first name, last name, and user ID verification
🚫 Restricts access to authorized users only

Response System

Intelligent Responses 💡
🤖 Generates contextual responses based on message classification
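The smart router's decision can be illustrated with a small sketch. This is not the template's actual router node, only a simplified version of the same idea; the property names (text, voice, photo) follow the Telegram Bot API message object.

```javascript
// Classify an incoming Telegram message by the fields present on it.
// Order matters: voice and photo messages carry a caption rather than text,
// so media checks come first.
function classifyMessage(message) {
  if (message.voice) return "voice";  // route to transcription
  if (message.photo) return "image";  // route to GPT-4 Vision analysis
  if (message.text) return "text";    // route to text classification
  return "unsupported";               // fallback to the error-handling path
}
```

Each returned label would map to one output branch of the router, feeding the matching processing path described above.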
by Dataki
Important Notes

Check Legal Regulations: This workflow involves scraping, so ensure you comply with the legal regulations in your country before getting started. Better safe than sorry!

Workflow Description

😮‍💨 Tired of struggling with XPath, CSS selectors, or DOM specificity when scraping? This AI-powered solution is here to simplify your workflow!

With a vision-based AI Agent, you can extract data effortlessly without worrying about how the DOM is structured. This workflow leverages a vision-based AI Agent, integrated with Google Sheets, ScrapingBee, and the Gemini-1.5-Pro model, to extract structured data from webpages. The AI Agent primarily uses screenshots for data extraction but switches to HTML scraping when necessary, ensuring high accuracy.

Key Features
- **Google Sheets Integration**: Manage URLs to scrape and store structured results.
- **ScrapingBee**: Capture full-page screenshots and retrieve HTML data for fallback extraction.
- **AI-Powered Data Parsing**: Use Gemini-1.5-Pro for vision-based scraping and a Structured Output Parser to format extracted data into JSON.
- **Token Efficiency**: HTML is converted to Markdown to optimize processing costs.

This template is designed for e-commerce scraping but can be customized for various use cases.
by Ezema Kingsley Chibuzo
🧠 What It Does

This n8n workflow collects leads from Google Maps, scrapes their websites via direct HTTP requests, and extracts valid email addresses, all while mimicking real user behavior to improve scraping reliability. It rotates User-Agent headers, introduces randomized delays, and refines URLs by removing only query parameters and fragments to preserve valid page paths (like social media links). The workflow blends Apify actors, raw HTTP requests, HTML-to-Markdown conversion, and smart email extraction to deliver clean, actionable lead data, ready to be sent to Airtable, Google Sheets, or any CRM. Perfect for lean, scalable B2B lead generation using n8n's native logic and no external scrapers.

💡 Why This Workflow

Most lead scrapers rely on heavy tools or APIs like Firecrawl. This workflow:
- Uses lightweight HTTP requests (with randomized user-agents) to scrape websites.
- Adds natural wait times to avoid rate limits and IP bans.
- Avoids full-page crawlers, yet still pulls emails effectively.
- Works great for freelancers, marketers, or teams targeting niche B2B leads.
- Is designed for stealth and resilience.

👤 Who It's For
- Lead generation freelancers or consultants.
- B2B marketers looking to extract real contact info.
- Small businesses doing targeted outreach.
- Developers who want a fast, low-footprint scraper.
- Anyone who wants email + website leads from Google Maps.

⚙️ How It Works

1. 📥 Form Submission (Lead Input)
A Form Trigger collects:
- Keyword
- Location
- No. of Leads (defaults to 10)
This makes the workflow dynamic and user-friendly, ready for multiple use cases and teams.

2. 📊 Scrape Business Info (via Apify)
Apify's Google Maps Actor searches for matching businesses. The Dataset node fetches all relevant business details. A Set node parses key fields like name, phone, website, and category. A Limit node ensures the workflow only processes the desired number of leads.

3. 🔁 First Loop – Visit & Scrape Website
Each business website is processed in a loop.
- A Code node cleans the website URL by removing only query parameters/fragments, keeping full paths like /contact.
- An HTTP Request node fetches the raw HTML of the site, using randomized User-Agent headers (5 variants) to mimic real devices and browsers. This makes requests appear more human and reduces the risk of detection or blocking.
- HTML is converted to Markdown using the Markdown node, making it easier to scan for text patterns.
- A Wait node introduces a random delay between 2 and 7 seconds, which helps avoid triggering rate limits and reduces the likelihood of being flagged as a bot.
- A Merge node combines the scraped Markdown + lead info for use in the second loop.

4. 🔁 Second Loop – Extract Emails
In this second loop, the Markdown data is processed.
- A Code node applies a regex to extract the first valid email address. If no email is found, "N/A" is returned.
- A brief 1-second Wait node simulates realistic browsing time.
- Another Merge node attaches the email result to the original lead data.

5. ✅ Filter, Clean & Store
- A Filter node removes all entries with "N/A" or invalid email results.
- A Set node ensures only required fields (like website, email, and company name) are passed forward.
- The clean leads are saved to Airtable (or optionally, Google Sheets) using an upsert-style insert to avoid duplicates.

🛡️ Anti-Flagging Design

This workflow is optimized for stealth:
- No scraping tools or headless browsers (like Puppeteer or Firecrawl).
- Direct HTTP requests with rotating User-Agents.
- Randomized wait intervals (2-7 s).
- Only non-intrusive parsing, no automation footprints.

🛠 How to Set It Up
1. Open n8n (Cloud or Self-Hosted).
2. Install the Apify node: search for Apify and click Install. Do this before importing your file.
3. Import the provided .json file into your n8n editor.
4. Set up the required credentials:
🔑 **Apify API Key** (used for Google Maps scraping)
🔑 **Airtable API Key** (or connect Google Sheets instead)
5. Recommended: prepare your Airtable base or Google Sheet with fields like Email, Website, Phone, Company Name.
6. Review the Set node if you'd like to collect more fields from Apify (e.g., Ratings, Categories, etc.).

🔁 Customization Tips
- The Apify scraper returns rich business data. By default, this workflow collects name, phone, and website, but you can add more in the "Grab Desired Fields" node.
- Need safer scraping at scale? Swap the HTTP Request for Firecrawl's Single URL scraper (or any headless service like Browserless, Oxylabs, Bright Data, or ScrapingBee); they handle rendering and IP rotation.
- Want to extract from internal pages (like /contact or /about)? Use Firecrawl's async crawl mode; just note it takes longer.
- For speed and efficiency, this built-in HTTP + Markdown setup is usually the fastest way to grab emails.
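The Code-node steps from the two loops (URL cleanup, email extraction, and the randomized delay) can be sketched like this. A hedged illustration, not the template's exact code; the regex is one common email pattern, and the example URLs are made up.

```javascript
// Strip only the query string and fragment, keeping the path (e.g. /contact),
// as described in the first loop.
function cleanUrl(raw) {
  const u = new URL(raw);
  u.search = "";
  u.hash = "";
  return u.toString();
}

// Pull the first email-looking token out of the page's Markdown;
// return "N/A" when nothing matches, as in the second loop.
function extractFirstEmail(markdown) {
  const match = markdown.match(/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/);
  return match ? match[0] : "N/A";
}

// Random 2-7 second delay, the range the Wait node uses to mimic browsing.
function randomDelayMs(minSeconds = 2, maxSeconds = 7) {
  return (minSeconds + Math.random() * (maxSeconds - minSeconds)) * 1000;
}
```

Entries that come back as "N/A" are then dropped by the Filter node before the upsert into Airtable.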
by Jeyson Orozco
Description

This template creates a nightly backup of all n8n workflows and saves them to a Google Drive folder. Each night, the previous night's backups are moved to an "n8n_old" folder and renamed with the corresponding date. Backups older than a specified age are automatically deleted (this purge is set to 30 days; you can remove it if you don't want backups to be deleted).

Prerequisites
- A Google Drive account and credentials. Get them from the following link. Link
- n8n version from v1.63.4 to v1.70.1 or higher.
- An n8n API key. Guide at the following link. Link
- Destination folders for backups: "n8n_old" and "n8n_backups" (if they don't exist, create them).

Configuration
1. Update all Google Drive nodes with your credentials.
2. Edit the Schedule Trigger node with the desired time to run the backup.
3. If you want to automatically purge old backups, edit the "PURGE DAYS" node to specify the age of the backups you want to delete, then enable the "PURGE DAYS" node and the 3 subsequent nodes.
4. Enable the workflow to run on the specified schedule.

Last updated January 2025
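The age check behind the purge step can be sketched as follows. This is a hypothetical illustration of the logic, not the template's actual "PURGE DAYS" node: a backup qualifies for deletion once its creation time falls before the cutoff.

```javascript
// Return true when a backup is older than purgeDays relative to `now`.
// `now` is injectable so the check is deterministic and testable.
function isExpired(createdTime, purgeDays = 30, now = new Date()) {
  const cutoff = now.getTime() - purgeDays * 24 * 60 * 60 * 1000;
  return new Date(createdTime).getTime() < cutoff;
}
```

In the workflow, each file's Drive `createdTime` would be run through a check like this, and only the expired ones passed on to the delete node.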
by Harshil Agrawal
This workflow demonstrates the use of the HTTP Request node to upload binary files with the multipart/form-data content type. This example workflow updates the Twitter banner.

HTTP Request node: This node fetches an image from Unsplash. Replace this node with any other node that fetches the image file.

HTTP Request1 node: This node uploads the Twitter profile banner. The Twitter API requires OAuth 1.0 authentication; follow the Twitter documentation to learn how to configure it.
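Outside n8n, the same multipart upload pattern looks like this in plain JavaScript. A generic sketch of the technique the HTTP Request node implements; the field name "banner" and the placeholder bytes are illustrative, and the commented-out endpoint call assumes OAuth headers you would build separately.

```javascript
// Build a multipart/form-data body containing one binary file part.
// FormData and Blob are global in Node.js 18+ and in browsers.
function buildUploadForm(fieldName, bytes, filename) {
  const form = new FormData();
  form.append(fieldName, new Blob([bytes]), filename);
  return form;
}

// Four placeholder bytes (the PNG magic-number prefix) stand in for the image.
const form = buildUploadForm("banner", new Uint8Array([137, 80, 78, 71]), "banner.png");

// The form would then be posted with the OAuth 1.0 headers attached, e.g.:
// await fetch(uploadUrl, { method: "POST", body: form, headers: oauthHeaders });
```

fetch sets the `Content-Type: multipart/form-data; boundary=...` header automatically from the FormData body, which is exactly what the HTTP Request node does for you.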
by The { AI } rtist
Ghost + Sendy Integration

This is an integration from the Ghost CMS to Sendy.
Sendy (www.sendy.co)
Ghost (www.ghost.org)

With this integration you can import members from the Ghost CMS (in its new version, which includes the Memberships feature) into the Sendy newsletter software. The integration also notifies you via Telegram when a new member registers.

To set it up, you need to create a custom integration in Ghost. From the administration panel, go to CUSTOM INTEGRATIONS / + Add custom integration. You will be asked for a name (use whatever you like), then add a new hook: in Target URL, paste the URL generated by your webhook inside n8n.

Then finish filling in the HTTP Request1 node with the details of your Sendy list, completing these fields:
- api_key
- list
You will find both in your Sendy installation.

Finally, add the Telegram credentials of your bot (https://docs.n8n.io/credentials/telegram/) and specify the group or user that should receive the notification.

Regards,
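The call that HTTP Request1 makes can be sketched like this. A hedged illustration only: the parameter names follow Sendy's documented subscribe API, but the member shape, placeholder values, and the host URL are assumptions.

```javascript
// Build the form-encoded body for Sendy's POST /subscribe endpoint,
// fired when the Ghost "member added" webhook delivers a new member.
function buildSendySubscribeBody(member, apiKey, listId) {
  return new URLSearchParams({
    api_key: apiKey,
    list: listId,
    email: member.email,
    name: member.name || "",
    boolean: "true", // makes Sendy return a plain-text result instead of an HTML page
  });
}

const body = buildSendySubscribeBody(
  { email: "jane@example.com", name: "Jane" }, // illustrative member payload
  "YOUR_API_KEY",
  "YOUR_LIST_ID"
);
// await fetch("https://your-sendy-host/subscribe", { method: "POST", body });
```

In the workflow this is exactly the body the HTTP Request1 node sends, with api_key and list filled in from your Sendy installation.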
by Boriwat Chanruang
Template Detail

This template automates the process of converting a list of addresses into their latitude and longitude (LatLong) coordinates using Google Sheets and the Google Maps API. It's designed for businesses, developers, and analysts who need accurate geolocation data for use cases like delivery routing, event planning, or market analysis.

What the Template Does
1. Fetch Address Data: Retrieves addresses from a Google Sheet.
2. Google Maps API Integration: Sends each address to the Google Maps API and retrieves the corresponding LatLong coordinates.
3. Update Google Sheets: Automatically updates the same Google Sheet with the LatLong data for each address.

Enhancements
- Google Sheets Template: A pre-configured Google Sheets template that users can copy. Example link: Google Sheets Template.
- Columns required:
  - Address: Column to input addresses.
  - LatLong: Column for the latitude and longitude results.

Updated Workflow Structure
1. Trigger: A manual trigger node starts the workflow.
2. Retrieve Data from Google Sheets: Fetch addresses from a Google Sheet.
3. Send to Google Maps API: For each address, retrieve the LatLong coordinates directly via the Google Maps API.
4. Update Google Sheets: Write the LatLong results back into the Google Sheet.

Steps to Use
1. Prepare Google Sheet: Copy the provided Google Sheets template and add your addresses to the Address column.
2. Configure Google Cloud API: Enable the Maps API for your Google Cloud project and generate an API key with the required permissions.
3. Run the Workflow: Start the workflow in n8n; it will process the addresses automatically. Updated LatLong data will appear in the corresponding Google Sheet.
4. Review the Results: Use the enriched LatLong data for mapping or analysis.
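The per-address API step can be sketched as follows. The endpoint is Google's documented Geocoding API; the YOUR_API_KEY placeholder and the "lat,lng" cell format are assumptions about how the template writes results back.

```javascript
// Build the Geocoding API request URL for one address.
function geocodeUrl(address, apiKey) {
  const params = new URLSearchParams({ address, key: apiKey });
  return `https://maps.googleapis.com/maps/api/geocode/json?${params}`;
}

// Turn the first geocoding result into the value written to the LatLong column.
function toLatLong(geocodeResponse) {
  const loc = geocodeResponse.results[0].geometry.location;
  return `${loc.lat},${loc.lng}`;
}

// In the workflow, each address row would be processed roughly as:
// const res = await fetch(geocodeUrl(row.Address, "YOUR_API_KEY"));
// row.LatLong = toLatLong(await res.json());
```

Each response's `results[0].geometry.location` carries `lat` and `lng`, which is the shape `toLatLong` expects.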
by Lucas Peyrin
Check the online version: [https://n8n-tools.streamlit.app/](https://n8n-tools.streamlit.app/)

Who is it for?
This workflow is perfect for n8n users who want to maintain clean and organized workflows without manually repositioning nodes. Whether you're building complex workflows or sharing them with a team, maintaining visual clarity is essential for efficiency and collaboration. This template automates the positioning process, saving time and ensuring consistent layout standards.

How does it work?
The template is divided into two parts:

Positioning Engine:
1. A Webhook node kicks off the process by receiving a workflow ID.
2. Using the provided workflow ID, an n8n API node fetches the workflow details.
3. The fetched workflow is sent to a processing webhook that calculates optimized positions for the nodes.
4. Finally, an n8n API node updates the workflow with the newly positioned nodes, ensuring a clean and professional layout.

Reusable Positioning Block:
This is an HTTP Request node that can be seamlessly integrated into any workflow you create. When triggered, it sends the current workflow for automatic positioning via the first part of this template.

How to set it up?
1. Enable n8n API access: Ensure that your n8n instance has API access enabled with the appropriate credentials.
2. Input your n8n API URL and credentials: Open the template, locate the n8n API nodes, and update them with your instance's API key. Update the URL of the 'Magic Positioning' HTTP Request node to point to your n8n instance's webhook URL.
3. Embed the reusable block: Add the provided HTTP Request node to any of your workflows to instantly connect to the auto-positioning engine.
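To make the "calculates optimized positions" step concrete, here is a deliberately simple toy version of a positioning pass: lay nodes out left-to-right on a fixed grid. The template's actual engine is more sophisticated (it respects connections between nodes); the spacing values are arbitrary assumptions.

```javascript
// Assign each node a [x, y] position on a grid, row by row.
// n8n stores node positions as a two-element array, which this mirrors.
function positionNodes(nodes, xSpacing = 280, ySpacing = 160, perRow = 6) {
  return nodes.map((node, i) => ({
    ...node,
    position: [(i % perRow) * xSpacing, Math.floor(i / perRow) * ySpacing],
  }));
}
```

The engine's final step then PUTs the workflow (with these rewritten `position` arrays) back through the n8n API.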
by Ranjan Dailata
Notice

Community nodes can only be installed on self-hosted instances of n8n.

Description

This workflow automates the process of scraping local business data from Google Maps and enriching it using AI to generate lead profiles. It's designed to help sales, marketing, and outreach teams collect high-quality B2B leads from Google Maps and enrich them with contextual insights without manual data entry.

Overview

This workflow scrapes business listings from Google Maps, extracts critical information like name, category, phone, address, and website using Bright Data, and passes the results to Google Gemini to generate enriched summaries and lead insights such as company description, potential services offered, and engagement score. The data is then structured and stored in spreadsheets for outreach.

Tools Used
- n8n: The core automation engine to manage flow and trigger actions.
- Bright Data: Scrapes business information from Google Maps at scale with proxy rotation and CAPTCHA-solving.
- Google Gemini: Enriches the raw scraped data with smart business summaries, categorization, and lead scoring.
- Google Sheets: For storing and acting upon the enriched leads.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Set Up Bright Data: Insert your Bright Data credentials and configure the Google Maps scraping proxy endpoint.
3. Configure the Gemini API: Add your Google Gemini API key (or use it via the Make.com plugin).
4. Customize the Inputs: Choose your target location, business category, and number of results per query.
5. Choose Storage: Connect to your preferred storage, like Google Sheets.
6. Test and Deploy: Run a test scrape and enrichment before deploying for bulk runs.

Use Cases
- Sales Teams: Auto-generate warm B2B lead lists with company summaries and relevance scores.
- Marketing Agencies: Identify local business prospects for SEO, web development, or ads services.
- Freelancers: Find high-potential clients in specific niches or cities.
- Business Consultants: Collect and categorize local businesses for competitive analysis or partnerships.
- Recruitment Firms: Identify and score potential company clients for talent acquisition.

Connect with Me
Email: ranjancse@gmail.com
LinkedIn: https://www.linkedin.com/in/ranjan-dailata/
Get Bright Data: Bright Data (Supports free workflows with a small commission)

#n8n #automation #leadscraping #googlemaps #brightdata #leadgen #b2bleads #salesautomation #nocode #leadprospecting #marketingautomation #googlemapsdata #geminiapi #googlegemini #aiworkflow #scrapingworkflow #businessleads #datadrivenoutreach #crm #workflowautomation #salesintelligence #b2bmarketing
by Sean Lon
AI-Powered Tech Radar Advisor

This project is built on top of the famous open-source ThoughtWorks Tech Radar. You can use this template to build your own AI-Powered Tech Radar Advisor for your company or group of companies.

Target Audience

This template is perfect for:
- **Tech Audit & Governance Leaders:** Those seeking to build a tech landscape AI platform portal.
- **Tech Leaders & Architects:** Those aiming to provide modern AI platforms that help others understand the rationale behind strategic technology adoption.
- **Product Managers:** Professionals looking to align product innovation with the company's current tech trends.
- **IT & Engineering Teams:** Teams that need to aggregate, analyze, and visualize technology data from multiple sources efficiently.
- **Digital Transformation Experts:** Innovators aiming to leverage AI for actionable insights and strategic recommendations.
- **Data Analysts & Scientists:** Individuals who want to combine structured SQL analysis with advanced semantic search using vector databases.
- **Developers:** Those interested in integrating RAG chatbot functionality with conversation storage.

1. Description

Tech Constellation is an AI-powered Tech Radar solution designed to help organizations visualize and steer their technology adoption strategy. It seamlessly ingests data from a Tech Radar Google Sheet, converting it into both a MySQL database and a vector index, to consolidate your tech landscape in one place. The platform integrates an interactive AI chat interface powered by four specialized agents:
- **AI Agent Router:** Analyzes and routes user queries to the most suitable processing agent.
- **SQL Agent:** Executes precise SQL queries on structured data.
- **RAG Agent:** Leverages semantic, vector-based search for in-depth insights.
- **Output Guardrail Agent:** Validates responses to ensure they remain on-topic and accurate.

This powerful template is perfect for technology leaders, product managers, engineering teams, and digital transformation experts looking to make data-driven decisions aligned with strategic initiatives across groups of parent-child companies.

2. Features

Data Ingestion
- A Google Sheet containing tech radar data is used as the primary source.
- The data is ingested and converted into a MySQL database.
- Simultaneously, the data is indexed into a vector database for semantic (vector-based) search.

Interactive AI Chat
- **Chat Integration:** An AI-powered chat interface allows users to ask questions about the tech radar.
- **Customizable AI Agents:**
  - AI Agent Router: Determines the query type and routes it to the appropriate agent.
  - SQL Agent: Processes queries using SQL on structured data.
  - RAG Agent: Performs vector-based searches on document-like data.
  - Output Guardrail Agent: Validates queries and ensures that the responses remain on-topic and accurate.

Usage Examples
- Tell me, is TechnologyABC adopted or on hold, and why?
- List all the tools that are considered part of the strategic direction for company3 but are not adopted.

Project Links & Additional Details
- **GitHub Repository (Frontend Interface Source Code):** github.com/dragonjump/techconstellation
- **Try It:** https://scaler.my