by Dvir Sharon
**Extract Google My Business Leads by Service & Location with Bright Data to Google Sheets**

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts Google My Business listings by service type and geographic location using Bright Data's Google Maps dataset, with intelligent city expansion and automatic duplicate removal.

**Who is this for?**
- Lead generation professionals
- Sales teams
- Marketing agencies
- Business development representatives
- Entrepreneurs conducting outreach or market research

**What problem is this solving?**

Manual lead generation from Google Maps is time-consuming and inefficient. This workflow automates the process of finding businesses by service type and location, expanding searches across cities, removing duplicates, and organizing results in a structured format.

**What this workflow does**

*Input Processing*
- Accepts service type, state, and country via web form
- Uses Claude AI to generate city lists
- Auto-categorizes services
- Creates search queries per city

*Data Collection*
- Uses Bright Data's Google Maps dataset
- Processes in batches with rate limits
- Monitors scraping with retry logic
- Formats and handles API responses

*Quality Control*
- Removes duplicates by name and phone
- Maintains clean data in Google Sheets
- Ensures structured, usable datasets

**Output Data Points**

| Field | Description | Example |
| :-------------- | :-------------------------- | :---------------------------- |
| Business Name | Company or business name | TechFix Computer Repair |
| Category | Business category type | Electronics |
| Country | Country location | US |
| City | Specific city searched | Austin |
| Phone Number | Contact phone number | +1 (555) 123-4567 |
| Website URL | Business website | https://techfix.com |
| Google Maps URL | Direct Maps link | https://maps.google.com/... |
| Address | Full business address | 123 Main St, Austin, TX |
| Operating Hours | Business hours | Mon-Fri 9AM-6PM |
| Google Rating | Star rating | 4.5 |
| Total Reviews | Number of reviews | 127 |
| Reviews URL | Link to reviews | https://maps.google.com/reviews... |

**Setup Instructions**

*Prerequisites*
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- Anthropic API key for Claude AI

*Step-by-Step*
1. Import the workflow JSON into n8n
2. Configure Bright Data credentials and dataset access
3. Set up Google Sheets and OAuth2 credentials
4. Configure Claude AI with your API key
5. Replace all placeholder credential IDs and tokens. For improved security, use credentials instead of hardcoding the API token placeholder in the HTTP Request node.
6. Test with sample data (e.g., "Coffee Shop" in California, US)
7. Activate the workflow and use the form for submissions

**How to Customize**

*Modify Geographic Scope*
- Add countries to the form dropdown
- Customize Claude prompts for city generation
- Adjust search logic for international markets

*Enhance Data Collection*
- Add more fields from Bright Data
- Include revenue, employee count, social profiles

*Improve Duplicate Detection* (see the sketch at the end of this section)
- Use fuzzy matching for similar names
- Include address-based checks

*Customize Output Format*
- Transform data for CRM compatibility
- Export to CSV, database, or multiple destinations

*Implement Advanced Features*
- Integrate email finder services
- Include lead scoring logic
- Discover social media profiles

*Batch Processing Optimization*
- Adjust batch sizes per Bright Data limits
- Use parallel processing and retry logic

*Integration Options*
- Connect to CRMs like HubSpot or Salesforce
- Trigger email automation
- Integrate with marketing platforms
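As a reference for the name-and-phone deduplication step, and a starting point if you later add fuzzy matching, here is a minimal sketch of how an n8n Code node could implement it. The `name` and `phone` field names are assumptions; map them to the fields your Bright Data dataset actually returns.

```javascript
// n8n Code node (Run Once for All Items) - illustrative sketch only.
// Keeps the first occurrence of each business, keyed on a normalized
// business name + phone number pair.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const name = (item.json.name || '').toLowerCase().trim();
  const phone = (item.json.phone || '').replace(/\D/g, ''); // digits only
  const key = `${name}|${phone}`;
  if (!seen.has(key)) {
    seen.add(key);
    unique.push(item);
  }
}

return unique;
```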
by Jimleuk
This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings:
- Survey Insights
- Customer Insights
- Community Insights

This template demonstrates the Community Insights scenario, where HN comments can be quickly grouped by similarity and an AI agent can generate insights on those groupings.

With this workflow, researchers or HN users can quickly break down community consensus on a particular topic and identify frequently mentioned positives and negatives.

Sample Output: https://docs.google.com/spreadsheets/d/e/2PACX-1vQXaQU9XxsxnUIIeqmmf1PuYRuYtwviVXTv6Mz9Vo6_a4ty-XaJHSeZsptjWXS3wGGDG8Z4u16rvE7l/pubhtml

**How it works**
- HN comments are imported via the Hacker News API node.
- Comments are then inserted into a Qdrant collection, carefully tagged with the Hacker News API metadata.
- Comments are then fetched and put through a clustering algorithm using the Python Code node (illustrated below). The Qdrant points are returned in clustered groups.
- Each group is looped over to fetch the payloads of its points and feed them to the AI agent, which summarises them and generates insights.
- The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the researcher or the HN user.

**Requirements**
- Works best with lots of comments!
- Qdrant vector store for storing embeddings.
- OpenAI account for embeddings and LLM.

**Customising the Template**
- Adjust the clustering parameters to values that make sense for your data.
- Adjust the sentimentality setting if comments are overwhelmingly negative at times.
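The clustering step itself lives in the template's Python Code node (typically a library routine such as k-means over the embedding vectors); purely to illustrate the mechanics, here is a compact k-means sketch in JavaScript. This is not the template's actual code, just the nearest-centroid idea it relies on.

```javascript
// Illustrative k-means over embedding vectors: each comment's vector is
// assigned to the nearest of k centroids, producing the clustered groups.
function kmeans(vectors, k, iterations = 20) {
  // naive init: use the first k vectors as starting centroids
  let centroids = vectors.slice(0, k).map(v => [...v]);
  let labels = new Array(vectors.length).fill(0);

  // squared Euclidean distance (sufficient for argmin comparisons)
  const dist = (a, b) => a.reduce((s, x, i) => s + (x - b[i]) ** 2, 0);

  for (let it = 0; it < iterations; it++) {
    // assignment step: nearest centroid per vector
    labels = vectors.map(v => {
      let best = 0;
      for (let c = 1; c < k; c++) {
        if (dist(v, centroids[c]) < dist(v, centroids[best])) best = c;
      }
      return best;
    });
    // update step: each centroid becomes the mean of its members
    centroids = centroids.map((old, c) => {
      const members = vectors.filter((_, i) => labels[i] === c);
      if (members.length === 0) return old;
      return members[0].map((_, d) =>
        members.reduce((s, m) => s + m[d], 0) / members.length
      );
    });
  }
  return labels; // labels[i] = cluster index of comment i
}
```

In practice you would use a library implementation (e.g. scikit-learn's KMeans inside the Python Code node) and tune k to your data.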
by IvanCore
Disclaimer: This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Important distinction: This template manages Telegram Copilot's UserBots (client accounts), not Telegram Bots.

**UserBot vs. Bot: Key Differences**

*Telegram Copilot's UserBots*
- Authenticate as real user accounts (phone number required)
- Can join groups/channels without a "Bot" label
- Subject to Telegram's client API limits
- Require manual login (MFA supported)

*Telegram Bots*
- Use @BotFather-created tokens
- Limited to bot API functionality
- Can't initiate chats with users who haven't contacted them first
- No phone number required

This template solves the unique challenges of UserBot management through:

**Core Functionality**

*Session Reliability*
- Automatic crash recovery (5-step restart sequence)
- Persistent session monitoring (checks every 6h)
- Database cleanup via the /clear command

*Multi-Device Support*
- Manages sessions independently from mobile clients
- Tracks active devices via the /stat command
- Isolates session data per credential

*Smart Notifications*
- Real-time alerts to the admin chat
- Detailed error context with authState snapshots
- Success confirmations with session metadata

**Setup Guide**

*Prerequisites*
- Self-hosted n8n instance (community node required)
- Valid Telegram account for the UserBot
- Telegram bot token for notifications
- TelePilot credentials with api_id/api_hash

*Configuration Steps*
1. Credential Setup
   - Add TelePilot credentials in n8n
   - Configure the Telegram bot token in the notification nodes
   - Set the admin chat ID for alerts
2. Monitoring Customization
   - Adjust the check frequency in the Schedule Trigger
   - Modify alert thresholds in the Filter nodes
   - Configure retry logic in the recovery sequence
3. Session Management
   - Test the /start command flow
   - Verify the /stat output format
   - Confirm notification delivery

**Workflow Customization**

*Advanced Options*
- Add secondary notification channels (Email, Slack)
- Implement an escalating alert system
- Integrate with monitoring dashboards
- Customize recovery attempt limits

**Compliance Notes**
- UserBots must comply with Telegram's Terms of Service
- Not intended for bulk messaging or spam
- Recommended for legitimate automation use cases

Why This Matters: UserBots enable automation scenarios impossible with regular bots (e.g., group management as a normal user, reacting as a human account). This workflow keeps them reliably online 24/7.
by Evoort Solutions
**Text-to-Image Generator using n8n + Flux AI**

This n8n workflow automates image generation from text prompts using the Text-to-Image Flux AI API. It reads prompts from Google Sheets, generates images via the API, uploads them to Google Drive, and logs the outcome.

**Key Features**
- Integrates with Text-to-Image Flux AI on RapidAPI
- Converts base64 image data to downloadable files
- Stores images on Google Drive
- Updates logs and errors back into Google Sheets
- Skips prompts already processed

**Google Sheet Column Structure**

Your source Google Sheet should include the following columns:

| Column Name | Description |
|-------------------|--------------------------------------------------|
| Prompt | The text prompt to generate an image from |
| drive path | (Optional) File path or URL of saved image |
| Generated Date | Date/time the image was generated |
| Base64 | Base64 string or error message (for logging) |

Only rows with a non-empty Prompt and an empty drive path will be processed.

**Use Case**

Perfect for:
- Bulk AI image generation for content marketing
- Creative automation with prompt-based image creation
- Building image assets based on structured datasets
- Any workflow where prompts are tracked via Google Sheets

Uses the Text-to-Image Flux AI API to generate high-quality images on demand.

**Workflow Summary**

| Step | Node | Description |
|------|------|-------------|
| 1 | Manual Trigger | Manually start the workflow |
| 2 | Google Sheets2 | Reads prompts from Google Sheets |
| 3 | Loop Over Items | Processes rows one by one |
| 4 | If2 | Skips rows that already have images |
| 5 | HTTP Request1 | Calls Text-to-Image Flux AI via RapidAPI |
| 6 | Code1 | Converts the base64 image to a binary file (sketch below) |
| 7 | Google Drive1 | Uploads the image file to a Drive folder |
| 8 | Google Sheets1 | Logs base64 result and timestamp back |
| 9 | If1 | Handles errors from the API |
| 10 | Google Sheets4 | Logs errors to the sheet |
| 11 | Wait | Adds delay between batches to prevent rate-limiting |

**RapidAPI: Text-to-Image Flux AI**

This flow is powered by Text-to-Image Flux AI. Be sure to:
1. Sign up at RapidAPI and subscribe to the API.
2. Copy your API key.
3. Replace "your key" in the HTTP Request1 node's x-rapidapi-key header.

You can test the API directly on RapidAPI before connecting it to n8n.

**Tips for Setup**
- Ensure you've set up a Google Service Account with access to both Sheets and Drive.
- Fill only the Prompt column; leave drive path and Base64 empty for new prompts.
- Monitor your RapidAPI dashboard for usage and quota.

Create your free n8n account and set up the workflow in just a few minutes using the link below:

Start Automating with n8n

Save time, stay consistent, and scale your image production effortlessly!
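For reference, step 6 (Code1) comes down to wrapping the base64 string returned by the API in an n8n binary property so Google Drive1 can upload it as a file. A minimal sketch; the `data.image` response field, `prompt` field, and PNG mime type are assumptions, so adjust them to the actual Flux AI response.

```javascript
// n8n Code node (Run Once for All Items) sketch: wrap the API's base64
// payload as an n8n binary property for the Google Drive upload node.
const items = [];

for (const item of $input.all()) {
  const base64 = item.json.data?.image || '';
  items.push({
    json: { prompt: item.json.prompt },
    binary: {
      data: {
        data: base64,             // n8n stores binary payloads as base64
        mimeType: 'image/png',
        fileName: `flux-${Date.now()}.png`,
      },
    },
  });
}

return items;
```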
by inderjeet Bhambra
**Who is this for?**

This workflow is designed for travel bloggers, content creators, social media managers, and anyone who wants to transform their travel photos into engaging written narratives. It's perfect for travelers looking to create compelling stories from their photo collections without spending hours crafting content manually, families wanting to document memorable trips, and digital nomads who need to produce travel content efficiently.

**What problem is this workflow solving?**

Converting travel photos into engaging stories is time-consuming and requires both creative writing skills and the ability to analyze visual content meaningfully. This workflow solves the challenge of:
- Transforming visual memories into compelling written narratives
- Organizing photos chronologically to create logical story flow
- Generating professional-quality travel content without writing expertise
- Analyzing photo content to extract meaningful themes and emotions
- Creating day-by-day structured narratives from unorganized photo collections
- Reducing the time spent on manual content creation for travel documentation

**What this workflow does**

This AI-powered photo storyteller takes your travel photos and automatically generates immersive, first-person travel narratives. The workflow:
- Accepts multiple photos through a webhook endpoint
- Uses the OpenAI Vision API (GPT-4o) to analyze each photo's content, emotions, and themes
- Automatically organizes photos chronologically by date and timestamp
- Groups photos by travel days and extracts daily themes
- Leverages GPT-4.1 (minimum required) to craft engaging, first-person travel stories with creative day titles
- Generates structured narratives with sensory details, cultural observations, and emotional insights
- Outputs JSON-formatted content ready for formatting
- Creates a day-by-day story structure with memorable moments and reflective conclusions

**Setup**

Required Credentials:
- OpenAI API key configured in n8n for both the Vision Analysis and Story Generation nodes
- Ensure you have sufficient OpenAI credits for image analysis and text generation

Webhook Configuration:
- The workflow creates a webhook endpoint at /tripteller-upload
- Configure your photo upload interface to POST a photos array to this endpoint
- Photos should be sent as base64-encoded data with filename and metadata

Photo Requirements:
- Supported formats: standard image formats (JPEG, PNG, etc.)
- Photos should include timestamp metadata for chronological organization

Caution: Do not upload all photos at once. Start with a small number of photos, like 5 at a time.

**How to customize this workflow to your needs**

Story Style Customization:
- Modify the system prompt in the "Generate Travel Story" node to adjust the writing tone (nostalgic, adventurous, poetic, etc.)
- Customize the story structure by editing the output format requirements
- Add specific cultural or geographical context prompts for location-specific storytelling

Photo Analysis Enhancement:
- Adjust the Vision Analysis node prompt to focus on specific elements (architecture, food, people, landscapes)
- Modify the grouping logic in the "Group Photos by Day" node for different time-based organization (see the sketch below)
- Add location extraction from EXIF data for geographical context

Output Format Adjustment:
- Customize the final response structure in the "Format Final Response" node
- Add integration with publishing platforms (blog APIs, social media, etc.)
- Include additional metadata like location tags, travel duration, or trip statistics

Performance Optimization:
- Adjust the execution timeout based on your typical photo volume
- Modify the parallel processing approach for large photo collections
- Add progress tracking for longer processing workflows
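As an illustration of the "Group Photos by Day" logic you might modify, here is a minimal Code-node sketch that buckets analyzed photos by the calendar date in their timestamp metadata. The `timestamp`, `filename`, and `analysis` field names are assumptions for illustration; map them to your actual item structure.

```javascript
// n8n Code node (Run Once for All Items) sketch: group photo items by day.
const byDay = {};

for (const item of $input.all()) {
  // Derive a YYYY-MM-DD bucket from the photo's timestamp metadata
  const day = new Date(item.json.timestamp).toISOString().slice(0, 10);
  if (!byDay[day]) byDay[day] = [];
  byDay[day].push({
    filename: item.json.filename,
    analysis: item.json.analysis,
  });
}

// Emit one output item per travel day, sorted chronologically
return Object.keys(byDay).sort().map(day => ({
  json: { day, photos: byDay[day] },
}));
```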
by n8n Team
This workflow automatically matches Zendesk tickets to Pipedrive contacts and makes the contact owners the ticket assignees. The automation is triggered every 5 minutes: Zendesk checks for and collects new tickets, which are then individually assigned to a Pipedrive contact owner.

**Prerequisites**
- Pipedrive account and Pipedrive credentials
- Zendesk account and Zendesk credentials

Note: The Pipedrive and Zendesk accounts need to be created by the same person / with the same email.

**How it works**
1. The Cron node triggers the workflow every 5 minutes.
2. The Zendesk node collects all the tickets received after the last execution timestamp.
3. The Set node passes only the requester's email and ID on to the Merge node.
4. The Merge by Key node merges both inputs together: the tickets and their contact emails.
5. The Pipedrive node then searches for the requester.
6. The HTTP Request node gets the owner information of the Pipedrive contact.
7. The Set nodes keep only the requester owner's email and the agent's email and ID.
8. The Merge by Key node merges the information and adds the contact owner to the ticket data.
9. The Zendesk node changes the assignee to the Pipedrive contact owner, or adds a note if the requester is not found.
10. The Function Item node sets the new last execution timestamp (see the sketch below).
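For reference, the last-execution-timestamp pattern in step 10 can be implemented with n8n's workflow static data. A minimal sketch, assuming this shape for the Function Item node; note that static data only persists when the workflow runs in production, not during manual test runs.

```javascript
// Function Item node sketch: persist a timestamp in workflow static data
// so the next run only fetches tickets created after this execution.
const staticData = getWorkflowStaticData('global');

// Expose the previous timestamp for debugging, then advance it to "now"
item.lastExecution = staticData.lastExecution || null;
staticData.lastExecution = new Date().toISOString();

return item;
```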
by AlexAy
**Who is this workflow template for?**

This workflow template is perfect for freelancers, small business owners, accounting teams, or anyone responsible for managing and recording invoices regularly. If you deal with multiple invoices and spend considerable time manually entering invoice data into a database, this automation will significantly simplify your daily operations and reduce potential errors.

**What this workflow does**

The workflow automates the entire invoice logging process. It continuously monitors a designated Google Drive folder every minute for new PDF invoice uploads. Once a new invoice is detected, it is automatically converted from PDF to an image format using the ILovePDF API. After conversion, Google's Gemini AI analyzes the image, intelligently extracting essential details such as vendor name, item description, invoice amount, invoice date, payment date, and bank reference numbers. Finally, this structured data is automatically recorded in an Airtable database (or optionally in a Google Sheet), ensuring organized, accessible records.

**Detailed Workflow Explanation**
- **Step 1: Invoice Detection.** Monitors Google Drive for newly uploaded PDF invoices.
- **Step 2: PDF to Image Conversion.** Converts PDFs into images using ILovePDF.
- **Step 3: Data Extraction via Gemini AI.** Uses Gemini AI to analyze the invoice image, extracting data such as Vendor, Description, Amount, Invoice Date, Paid Date, and Bank Reference. Provides clear descriptions even when original invoice descriptions are vague or missing by analyzing vendor context.
- **Step 4: Structured Data Storage.** Automatically sends extracted data to Airtable or Google Sheets.
- **Step 5: File Management.** Moves processed PDF files into a separate "Done" folder to clearly differentiate between processed and unprocessed invoices.

**Step-by-Step Setup Instructions**
1. Set Up Google Drive:
   - Log in to Google Drive and create two folders: one named Invoices (for incoming PDF files) and one named Processed (for processed files).
2. Obtain API Credentials:
   - ILovePDF API: Sign up at ILovePDF Developers and retrieve your API key from your account dashboard.
   - Google Gemini AI API: Register at Google AI and generate an API key.
3. Airtable Database Preparation:
   - Create an Airtable base with the following columns: Vendor (Text), Description (Text), Amount (Number or Text), Invoice Date (Date), Paid Date (Date), Bank Reference (Text).
4. Import and Configure the Workflow in n8n:
   - Import the provided workflow JSON file into your n8n instance.
   - Connect your Google Drive, ILovePDF, Google Gemini AI, and Airtable accounts by entering your credentials in their respective nodes.
5. Adjust Workflow Settings:
   - In the Google Drive nodes, ensure your newly created Invoices and Processed folders are correctly selected.
   - Update the ILovePDF public key in the appropriate HTTP Request node.
   - Customize the Gemini AI prompt to refine or expand data extraction according to your specific needs.
6. Test Your Setup:
   - Upload a sample PDF invoice into the Invoices folder.
   - Execute the workflow by clicking Test Workflow in n8n and verify that data extraction and Airtable logging operate correctly.
**Airtable Column Specifications**

Ensure your Airtable base includes the following structure:
- **Vendor**: Single Line Text
- **Description**: Single Line Text
- **Amount**: Currency or Single Line Text
- **Invoice Date**: Date (formatted as YYYY-MM-DD)
- **Paid Date**: Date (formatted as YYYY-MM-DD)
- **Bank Reference**: Single Line Text

**How to Customize the Workflow**
- **System Prompt:** Adjust the AI instructions by modifying the prompt text to focus on additional or fewer invoice details.
- **Structured Output Parser:** Modify the JSON schema in the parser node to match the structure and data points your project specifically requires (see the example below).

By following these instructions, you'll have a fully automated, reliable system for handling and logging invoice data, significantly enhancing your productivity.
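As a starting point for the Structured Output Parser, a schema along these lines would mirror the Airtable columns above. The property names and required fields are illustrative assumptions, not the template's exact schema; align them with your prompt and base.

```javascript
// Illustrative output schema, expressed as a JS object. Paste the
// equivalent JSON (JSON.stringify(invoiceSchema, null, 2)) into the
// Structured Output Parser node.
const invoiceSchema = {
  type: 'object',
  properties: {
    vendor:         { type: 'string' },
    description:    { type: 'string' },
    amount:         { type: 'string' },
    invoice_date:   { type: 'string', description: 'YYYY-MM-DD' },
    paid_date:      { type: 'string', description: 'YYYY-MM-DD' },
    bank_reference: { type: 'string' },
  },
  required: ['vendor', 'amount', 'invoice_date'],
};
```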
by Robert Breen
**Extract Local Business Contacts with Google Sheets, SerpAPI & GPT-4o**

Status: Ready for Use

Disclaimer: This workflow relies on community nodes that are not part of n8n's core package. Install the following from n8n > Community Nodes before running:
- n8n-nodes-langchain
- n8n-nodes-openai (Structured Output Parser)
- n8n-nodes-apify

**Description**

This n8n workflow automates the discovery of local-business contact details by search term and location, then enriches the results with publicly listed email addresses using GPT-4o.

**Key Features**

*Google Sheets Integration*
- Reads search terms and locations from a Google Sheet.
- Processes only rows that are not marked Complete, preventing duplicates.

*Google Maps Search via SerpAPI*
- Queries Google Maps through SerpAPI for every search-term-and-location pair.
- Retrieves the following fields: business name, website, street address, and phone number.

*Website Scraping & Email Extraction*
- Scrapes the business homepage content with Apify's Fast Website Content Crawler.
- Sends the scraped HTML to a GPT-4o AI Agent.
- Extracts any publicly listed email address.
- Returns a clean, structured JSON object for downstream use.

*Data Storage & Tracking*
- Writes every result to a Results tab in the same Google Sheet.
- Marks the corresponding row in the Searches tab as Complete once finished.

*Extensible Design*
- The workflow uses modular sub-workflows and AI agents. You can easily extend it to add:
  - Phone-number verification with Twilio
  - Social-media enrichment with Clearbit
  - Exports to HubSpot, Salesforce, Airtable, PostgreSQL, or CSV files

**Google Sheet Setup**

Create a Searches tab with these exact columns (one header row):

Search | Area | Area Name | Complete

Create a Results tab with these columns:

title | website | address | phone | Search | Search Name | Area | email (Manual Entry)

**Prerequisites**
- Google Cloud Project with the Google Sheets API and Google Drive API enabled
- SerpAPI account (free trial or paid) with an API key
- Apify account (free trial or paid) with the Fast Website Content Crawler actor installed
- OpenAI account with an API key that can access GPT-4o models

**Setup Instructions**

1. Copy the Google Sheet
   - Make a personal copy of the template sheet. Ensure the tab names are Searches and Results.
   - https://docs.google.com/spreadsheets/d/1QgcVMlXRlM_5ZFFUHr6bVK-93Tzia9XseTX03ZYnowI/edit?usp=sharing
2. Configure the Google Sheets nodes in n8n
   - Open the workflow and update the Extract Search Terms and Save Emails to Sheet nodes to point at your copied sheet.
   - Authenticate using Google OAuth2 credentials that have access to the sheet.
3. Add SerpAPI credentials
   - Sign in at https://serpapi.com and copy your API key.
   - In the Search Google Maps node, create a new credential and paste the key.
4. Set up Apify
   - Sign up at https://apify.com and add the Fast Website Content Crawler actor to your account.
   - In the Scrape Web Page HTTP node, append ?token=YOUR_API_KEY to the actor URL.
5. Add your OpenAI API key
   - Go to https://platform.openai.com and generate an API key.
   - Add it to the AI Agent and OpenAI Chat Model node credentials.

**Running the Workflow**

Click Execute Workflow in n8n. For each unprocessed row in the Searches tab, the automation will:
1. Retrieve business information from Google Maps via SerpAPI.
2. Scrape the business website using Apify.
3. Use GPT-4o to extract a public email address (see the optional sketch below).
4. Write all collected data to the Results tab.
5. Mark the original row as Complete.
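Optionally, a cheap regex pass over the scraped page text can complement the GPT-4o extraction, catching plainly listed addresses even when the model output is malformed. A minimal Code-node sketch; the `text` field name for the Apify crawler output is an assumption.

```javascript
// n8n Code node (Run Once for All Items) sketch: collect email-shaped
// strings from the scraped page text as a fallback alongside the AI agent.
const emailPattern = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;

for (const item of $input.all()) {
  const text = item.json.text || '';
  const matches = text.match(emailPattern) || [];
  item.json.regexEmails = [...new Set(matches)]; // dedupe
}

return $input.all();
```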
**Example Use Cases**
- Build highly targeted lead lists for sales and marketing outreach.
- Compile local business directories for regional websites or apps.
- Automate contact-information collection for lead-generation campaigns and reduce manual data entry.

**Connect with Me**

I'm Robert Breen, founder of Ynteractive, a consulting firm that helps businesses automate operations using n8n, AI agents, and custom workflows. I've helped clients build everything from intelligent chatbots to complex sales automations, and I'm always excited to collaborate or support new projects. If you found this workflow helpful or want to talk through an idea, I'd love to hear from you.

Links
- Website: https://www.ynteractive.com
- YouTube: @ynteractivetraining
- LinkedIn: https://www.linkedin.com/in/robert-breen
- Email: rbreen@ynteractive.com
by Audun
**Who is this for?**
- Security professionals
- Developers
- Individuals interested in data breach awareness

**Use Case**
- Automated monitoring for new breaches
- Proactive identity protection
- Demonstration of a simple cache mechanism

**What this workflow does**
- Checks the Have I Been Pwned API every 15 minutes for the latest breaches.
- Compares new breach data against previously notified breaches.
- Demonstrates a simple cache mechanism to track previously seen breaches.

**How the Cache Functionality Works**
- **Read from Cache**: Retrieves the last known breach from cache.json to avoid redundant alerts for the same breach.
- **Compare Against Current Breach**: The workflow checks whether the latest fetched breach differs from the cached one.
- **Update the Cache**: If a new breach is detected, it updates cache.json with the latest breach data.

A minimal code sketch of this pattern appears at the end of this description.

**Setup instructions**
- The endpoint used in this workflow does not require an API key.
- Add your desired alert mechanism in the red box attached to the New breach node.

**How to customize this workflow to your needs**
- **Modify Notification Settings**: Tailor where alerts are sent (email, Slack, etc.) by adding the desired node after the New breach node. This node contains all the data from the breach, so it is easily available.

You can choose from a variety of n8n nodes to send alerts when a new breach is detected. Below are a few common options you might consider adding after the New breach node:

*Email Node*
- What it does: Sends an email notification to one or more recipients.
- Use case: Great for simple alerts to your inbox or a team distribution list.
- Customization: You can include breach details in the subject or body of the email, using data from the New breach node.

*Slack Node*
- What it does: Sends a message to a Slack channel or user.
- Use case: Perfect for real-time alerts to your team in Slack.
- Customization: You can post breach details directly in a channel or DM, and format the message (bold, code blocks, etc.).

*Microsoft Teams Node*
- What it does: Sends a message to a Teams channel.
- Use case: For organizations that use Microsoft Teams for communication.
- Customization: Similar to Slack; you can customize the message content and include all relevant breach information.

*Discord Node*
- What it does: Sends an alert message to a Discord channel.
- Use case: Useful for teams or communities that coordinate via Discord.
- Customization: Add formatted messages with breach details for easy viewing.

*Telegram Node*
- What it does: Sends messages to a Telegram chat or group.
- Use case: Good for mobile notifications and fast alerts.
- Customization: You can include breach summaries or detailed information, and even use bots to automate this.

*Webhook Node (as a sender)*
- What it does: Sends breach data to another service via a webhook.
- Use case: If you have an external system or app that handles alerts, you can push the data directly to it.
- Customization: Send JSON payloads with detailed breach information to trigger actions in other systems.

*SMS Nodes (like Twilio)*
- What it does: Sends an SMS notification to one or more phone numbers.
- Use case: For urgent alerts that need to be seen immediately.
- Customization: Keep messages concise, including key breach details like the time, type of breach, and affected system.

- **Adjust Check Frequency**: Change the interval in the Schedule Trigger node (e.g., hourly or daily).
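As promised above, here is a minimal sketch of the compare-then-update cache logic, condensed into a single Code node. This variant keeps the cache in workflow static data rather than the cache.json file the workflow uses; the file-based version follows the same shape, just with read/write file nodes around the comparison.

```javascript
// n8n Code node (Run Once for All Items) sketch of the cache mechanism.
const cache = getWorkflowStaticData('global');
const latest = $input.first().json; // newest breach from the HIBP API

// HIBP breach objects carry a unique "Name" field
const isNew = cache.lastBreachName !== latest.Name;

if (isNew) {
  cache.lastBreachName = latest.Name; // update the cache
}

// A downstream IF node can route on "isNew" to fire the alert
return [{ json: { isNew, ...latest } }];
```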
by Billy Christi
**Who is this for?**

This workflow is perfect for:
- Digital marketers who need to scale SEO-optimized content production
- Bloggers and content creators who want to maintain consistent publishing schedules
- Small business owners who need regular blog content but lack writing resources

**What problem is this workflow solving?**

Creating high-quality, SEO-optimized blog content consistently is time-consuming and resource-intensive. This workflow solves that by:
- Automating the content generation process from topic to final draft
- Ensuring quality control through human-in-the-loop approval
- Managing topic queues and preventing duplicate content creation
- Streamlining the revision process based on human feedback
- Organizing and archiving all generated content for future reference

**What this workflow does**

From topics stored in Google Sheets, this workflow:
- Automatically retrieves pending topics from your Google Sheets tracking document
- Generates SEO-optimized blog posts (800-1200 words) using OpenAI GPT-4 with structured prompts
- Sends content for human approval via email with custom approval forms
- Handles revision requests by incorporating feedback while maintaining SEO best practices
- Updates topic status to prevent duplicate processing
- Adds approved generated content to Google Sheets for easy access and management
- Routes the workflow based on approval decisions (approve, revise, or cancel)

**Setup**
1. Copy the Google Sheet template: Automate Blog Content Creation - Google Sheet Template
2. Connect Google Sheets with your topic tracking document (requires "Topic List" and "Generated Content" sheets)
3. Add your OpenAI API key to the AI agent nodes for content generation
4. Configure Gmail for the approval notification system
5. Set up your topic list in Google Sheets with "Topic" and "Status" columns
6. Customize the schedule trigger to run at your preferred intervals
7. Update the email recipient in the approval node to your email address
8. Test with a sample topic marked as "Pending" in your Google Sheet

**How to customize this workflow to your needs**
- **Adjust content length**: modify the word count requirements in the AI agent prompts
- **Change writing style**: customize the copywriter prompts for different tones (formal, casual, technical)
- **Add multiple reviewers**: extend the approval system to include additional stakeholders
- **Integrate with CMS**: add nodes to automatically publish approved content to WordPress, Webflow, or other platforms
- **Include keyword research**: add Ahrefs or SEMrush nodes to incorporate keyword data
- **Add image generation**: integrate DALL-E or Midjourney for automatic featured image creation
- **Customize approval criteria**: modify the approval form to include specific feedback categories
- **Add content scoring**: integrate readability checkers or SEO analysis tools before approval
by Jimleuk
If you have a shared or personal drive location with a high frequency of files created by humans, it can become difficult to organise. This may not matter... until you need to search for something!

This n8n workflow works with the local filesystem to target the messy folder and categorise as well as organise its files into subdirectories automatically.

**Disclaimer**

Unfortunately, due to the intended use-case, this workflow will not work on n8n Cloud; a self-hosted version of n8n is required.

**How it works**
- Uses the Local File Trigger to activate once a new file is introduced to the directory.
- The new file's filename and filetype are analysed using AI to determine the best location to move the file to. The AI assesses the current subdirectories so as not to create duplicates. If a relevant subdirectory is not found, a new subdirectory is suggested.
- Finally, an Execute Command node uses the AI's suggestions to move the new file into the correct location (see the sketch below).

**Requirements**
- Self-hosted version of n8n. The nodes used in this workflow only work in the self-hosted version. If you are using Docker, you must create a bind mount to a host directory.
- Mistral.ai account for the LLM model.

**Customise this workflow**
- If the frequency of files created is high enough, you may not want the trigger to activate on every new-file event. Switch to a timer to avoid concurrency issues.
- Want to go fully local? A version of this workflow is available which uses Ollama instead. You can download this template here: https://drive.google.com/file/d/1iqJ_zCGussXpfaUBYGrN5opziEFAEQMu/view?usp=sharing
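As an illustration of the final step, a Code node could sanitise the AI's suggestion and assemble the shell command for the Execute Command node. The `suggestedDir` and `filePath` field names and the `/data/organised` base path are assumptions for this sketch, not the template's actual values.

```javascript
// n8n Code node (Run Once for Each Item) sketch: build a safe move command
// from the AI's suggested subdirectory.
const suggested = ($json.suggestedDir || 'misc')
  .replace(/[^a-zA-Z0-9 _-]/g, '') // strip shell-unsafe characters
  .trim();
const target = `/data/organised/${suggested}`;

return {
  json: {
    // The Execute Command node can reference this as {{ $json.command }}
    command: `mkdir -p "${target}" && mv "${$json.filePath}" "${target}/"`,
  },
};
```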
by Sarfaraz Muhammad Sajib
**Email Validation Workflow Using APILayer API**

This n8n workflow enables users to validate email addresses in real time using the APILayer Email Verification API. It's particularly useful for preventing invalid email submissions during lead generation, user registration, or newsletter sign-ups, ultimately improving data quality and reducing bounce rates.

**Step-by-Step Setup Instructions**
1. Trigger the Workflow Manually: The workflow starts with the Manual Trigger node, allowing you to test it on demand from the n8n editor.
2. Set Required Fields: The Set Email & Access Key node allows you to enter:
   - email: The target email address to validate.
   - access_key: Your personal API key from apilayer.net.
3. Make the API Call: The HTTP Request node dynamically constructs the URL:
   https://apilayer.net/api/check?access_key={{ $json.access_key }}&email={{ $json.email }}
   It sends a GET request to the APILayer endpoint and returns a detailed response about the email's validity.
4. (Optional): You can add additional nodes to filter, store, or react to the results depending on your needs (see the sketch below).

**How to Customize**
- Replace the manual trigger with a webhook or schedule trigger to automate validations.
- Dynamically map the email and access_key values from previous nodes or external data sources.
- Add conditional logic to filter out invalid emails, log them into a database, or send alerts via Slack or email.

**Use Case & Benefits**

Email validation is crucial in maintaining a clean and functional mailing list. This workflow is especially valuable in:
- Sign-up forms, where real-time email checks prevent fake or disposable emails.
- CRM systems, to ensure user-entered emails are valid before saving them.
- Marketing pipelines, to minimize email bounce rates and increase campaign deliverability.

Using APILayer's trusted validation service, you can verify whether an email exists, check if it's a role-based address (like info@ or support@), and identify disposable email services, all with a simple workflow.

Keywords: email validation, n8n workflow, APILayer API, verify email, real-time email check, clean email list, reduce bounce rate, data accuracy, API integration, no-code automation
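As one way to implement the optional step 4, a Code node could flag addresses that fail the basic checks in the APILayer response. The `format_valid`, `smtp_check`, `disposable`, and `score` fields follow APILayer's documented response shape, but verify them against your actual API output before relying on this sketch.

```javascript
// n8n Code node (Run Once for Each Item) sketch: mark an address as
// usable only when it passes the basic APILayer checks.
const r = $json;

const isUsable =
  r.format_valid === true &&
  r.smtp_check === true &&
  r.disposable === false;

// A downstream IF node can route on "isUsable"
return { json: { email: r.email, isUsable, score: r.score } };
```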