by Yang
**Who is this for?**

This workflow is for digital marketers, small business owners, lead generation agencies, and VAs who need a scalable way to find and store local business leads using AI. It's especially useful for teams that want to enrich leads with real-time news insights and save the structured data to Airtable.

**What problem is this workflow solving?**

Manually researching local businesses and staying up to date with relevant news is time-consuming and inefficient. This automation eliminates that burden by using Dumpling AI chat agents to generate leads and context, GPT-4o to summarize, and Airtable to store everything in one place.

**What this workflow does**

This AI workflow listens for a manual trigger in n8n and executes the following steps:

- Extracts local business leads using a Local Business Agent from Dumpling AI.
- Pulls current news related to the business type or location using a News Agent from Dumpling AI.
- Uses GPT-4o to combine both responses into a human-readable summary.
- Extracts structured lead data like name, category, and city.
- Saves the summary and lead data into Airtable for easy follow-up.

**Setup**

1. **Create AI Agents in Dumpling AI**
   - Sign in at Dumpling AI and create two separate agents:
     - Local Business Agent: designed to respond with structured lists of businesses by location and category.
     - News Agent: designed to fetch relevant recent news and summaries about a specific industry or region.
   - After setting up each agent, copy the Agent Key from Dumpling AI. These keys are required in the headers of your HTTP Request nodes in n8n.
2. **Manual Trigger**
   - This workflow begins with a manual trigger inside n8n, which is the When chat message is received node. This makes it easy to test and reuse, especially during setup.
3. **Get Local Business Data from Dumpling AI**
   - The first HTTP Request node sends a prompt like "List 5 top real estate companies in Atlanta with full address and services." Include your Local Business Agent Key in the `x-agent-key` header. The response returns a structured list of business leads.
4. **Get News Context from Dumpling AI**
   - The second HTTP Request node sends a prompt such as "Give me the latest news related to the real estate market in Atlanta." Use your News Agent Key in the header. This fetches a brief set of recent news summaries relevant to the businesses being researched. (A hedged sketch of these two requests follows below.)
5. **Use GPT-4o to Merge and Summarize**
   - The GPT node combines the list of businesses and news into one coherent summary. You can modify the prompt to output in paragraph format, bullet points, or structured notes.
6. **Save Lead to Airtable**
   - The Airtable node sends all structured fields into your selected base and table. Be sure to connect your Airtable account and confirm the columns match exactly.

**How to customize this workflow**

- Replace the prompt inside the HTTP node to focus on different types of businesses or cities.
- Expand the GPT output to include additional lead info like websites, phone numbers, or emails if the agent includes them.
- Add a webhook trigger to allow this flow to be run via a chatbot, external app, or button.
- Link to HubSpot or another CRM to sync the leads automatically.
- Duplicate the process to run for multiple industries in parallel.

**Final Notes**

- You must create and configure your Dumpling AI agents before running this workflow.
- The Agent Keys from Dumpling AI are required in both HTTP Request nodes.
- This flow is modular and flexible, ready for deeper CRM integrations.
- The manual trigger is great for testing, but you can add a Webhook node to automate it.
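To make steps 3 and 4 concrete, here is a minimal TypeScript sketch of the two agent calls. The endpoint URL, request body shape, and response field name are assumptions for illustration only; check the Dumpling AI docs for the real values. Only the `x-agent-key` header and the prompts come from the workflow above.

```typescript
// Hypothetical sketch of the two Dumpling AI agent calls made by the HTTP
// Request nodes. DUMPLING_AGENT_URL and the response shape are assumptions;
// replace them with the values from your Dumpling AI agent settings.
const DUMPLING_AGENT_URL = "https://app.dumplingai.com/api/v1/agent/chat"; // assumed

async function askAgent(agentKey: string, prompt: string): Promise<string> {
  const res = await fetch(DUMPLING_AGENT_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-agent-key": agentKey, // Agent Key copied from Dumpling AI
    },
    body: JSON.stringify({ message: prompt }),
  });
  if (!res.ok) throw new Error(`Dumpling AI request failed: ${res.status}`);
  const data = await res.json();
  return data.answer ?? JSON.stringify(data); // response field is an assumption
}

// Usage mirroring steps 3 and 4:
const leads = await askAgent(
  process.env.LOCAL_BUSINESS_AGENT_KEY!,
  "List 5 top real estate companies in Atlanta with full address and services."
);
const news = await askAgent(
  process.env.NEWS_AGENT_KEY!,
  "Give me the latest news related to the real estate market in Atlanta."
);
```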
This workflow helps you launch an intelligent lead gen process that combines location-targeted business discovery, AI-generated insights, and structured CRM-friendly output, all powered by Dumpling AI and OpenAI.
by Yaron Been
🚀 **Automated Job Market Tracker: Upwork Scraper to Google Sheets Workflow!**

**Workflow Overview**

This cutting-edge n8n automation is a sophisticated job market intelligence tool designed to transform freelance job tracking into a seamless, data-driven process. By intelligently connecting Apify, data processing, and Google Sheets, this workflow:

- **Discovers Job Opportunities**: Automatically scrapes Upwork job listings, tracks recent freelance postings, and eliminates manual job market research
- **Intelligent Data Processing**: Filters and extracts key job details, structures job information, and ensures comprehensive opportunity tracking
- **Seamless Data Logging**: Automatically updates Google Sheets, creating a real-time job market database that enables rapid market trend analysis
- **Scheduled Intelligence Gathering**: Periodic automated tracking and consistent job listing updates with zero manual intervention required

**Key Benefits**

- 🤖 **Full Automation**: Zero-touch job market research
- 💡 **Smart Filtering**: Targeted job opportunity insights
- 📊 **Comprehensive Tracking**: Detailed freelance market intelligence
- 🌐 **Multi-Source Synchronization**: Seamless data flow

**Workflow Architecture**

🔹 **Stage 1: Job Discovery**
- **Scheduled Trigger**: Periodic market scanning
- **Apify Integration**: Upwork job scraping
- **Intelligent Filtering**: Recent job postings, specific keywords, relevant opportunities

🔹 **Stage 2: Data Extraction**
- **Comprehensive Job Metadata Parsing**
- **Key Information Retrieval**
- **Structured Data Preparation**

🔹 **Stage 3: Data Logging**
- **Google Sheets Integration**
- **Automatic Row Appending**
- **Real-Time Database Updates**

**Potential Use Cases**

- **Freelancers**: Market trend tracking
- **Job Seekers**: Opportunity intelligence
- **Recruitment Agencies**: Market analysis
- **Skill Development Professionals**: Skill demand monitoring
- **Business Strategists**: Labor market insights

**Setup Requirements**

- **Apify**: Upwork scraping actor, API token, configured scraping parameters
- **Google Sheets**: Connected Google account, prepared job tracking spreadsheet, appropriate sharing settings
- **n8n Installation**: Cloud or self-hosted instance, workflow configuration, API credential management

**Future Enhancement Suggestions**

- 🤖 Advanced job matching algorithms
- 📊 Multi-platform job aggregation
- 🔔 Customizable alert mechanisms
- 🌐 Expanded job category tracking
- 🧠 Machine learning job recommendations

**Technical Considerations**

- Implement robust error handling
- Use secure API authentication
- Maintain flexible data processing
- Ensure compliance with platform guidelines

**Ethical Guidelines**

- Respect job poster privacy
- Use data for legitimate research
- Maintain transparent information gathering
- Provide proper attribution

**Hashtag Performance Boost 🚀**

#FreelanceJobTracking #JobMarketIntelligence #WorkflowAutomation #CareerTech #MarketResearch #JobInsights #SkillsDemand #TechInnovation #DataDrivenCareer #ProfessionalGrowth

**Workflow Visualization**

[Scheduled Trigger] ⬇️ [Fetch Upwork Jobs] ⬇️ [Format Job Data] ⬇️ [Log to Google Sheets]

**Connect With Me**

Ready to revolutionize your job market research?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your job market intelligence with intelligent, automated workflows!
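As a rough illustration of Stages 1 and 2, the sketch below runs an Apify actor synchronously and shapes the results into rows for the Google Sheets node. The `run-sync-get-dataset-items` endpoint is Apify's real API, but the actor ID, its input schema, and the fields on each job item are assumptions; they depend on which Upwork scraper actor you configure.

```typescript
// Sketch of Stage 1–2 outside n8n: run an Apify actor and shape rows for
// the Google Sheets node. Actor ID and item fields are hypothetical.
const APIFY_TOKEN = process.env.APIFY_TOKEN!;
const ACTOR_ID = "your-username~upwork-scraper"; // hypothetical actor ID

async function fetchUpworkJobs(keyword: string) {
  const url = `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: keyword, maxItems: 50 }), // input schema varies per actor
  });
  if (!res.ok) throw new Error(`Apify run failed: ${res.status}`);
  const items: any[] = await res.json();
  // Shape each item into the columns the Google Sheets node appends.
  return items.map((job) => ({
    title: job.title,      // field names are assumptions
    budget: job.budget,
    postedAt: job.postedAt,
    url: job.url,
  }));
}
```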
by Manuel
Effortlessly optimize your workflow by automatically importing hundreds of manufacturers from a Google Sheet into your Shopware online store, saving countless hours of manual work.

**How it works**

- Retrieve all manufacturers from a Google Sheet
- Add each manufacturer to Shopware via the Shopware sync API endpoint
- Upload a logo for each manufacturer from a provided public URL to Shopware

**Set Up Steps**

1. Add your Shopware URL to the first node, called Settings.
2. Create a Google Sheet in your Google account with the following columns (Demo Sheet):
   - name (the name of the manufacturer; must be unique and is required)
   - website (URL of the manufacturer website)
   - description
   - logo_url (public manufacturer logo URL; must be a PNG, JPG, or SVG file)
   - translation_language_code_1 (optional; language code of your language, for example 'es-ES' for Spanish. Make sure a language with this code exists in your Shopware shop.)
   - translation_name_1 (optional; manufacturer name translated into the language defined in translation_language_code_1)
   - translation_description_1 (optional; manufacturer description translated into the language defined in translation_language_code_1)
   - translation_language_code_2 (optional; same as translation_language_code_1 for another language)
   - translation_name_2 (optional; same as translation_name_1 for another language)
   - translation_description_2 (optional; same as translation_description_1 for another language)
   - translation_language_code_3 (optional; same as translation_language_code_1 for another language)
   - translation_name_3 (optional; same as translation_name_1 for another language)
   - translation_description_3 (optional; same as translation_description_1 for another language)
3. Connect to your Google account.
4. Connect to your Shopware account:
   - Create a Shopware Integration.
   - Connect to Shopware at the nodes "Import Manufacturer" and "Upload Manufacturer Logo" using Generic OAuth2 API authentication with grant type "Client Credentials". The access token URL is https://your-shopware-domain.com/api/oauth/token. (A hedged sketch of this token exchange and the sync call follows below.)
5. Run the workflow.
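For reference, here is a minimal sketch of what the OAuth2 client-credentials exchange and the sync upsert look like against the Shopware 6 Admin API. The payload field names (`name`, `link`, `description`) are illustrative assumptions; verify them against your shop's entity schema before relying on this.

```typescript
// Sketch of the calls the "Import Manufacturer" node performs, assuming a
// Shopware 6 Admin API. Payload fields are illustrative.
const SHOP = "https://your-shopware-domain.com";

async function getAccessToken(clientId: string, clientSecret: string) {
  const res = await fetch(`${SHOP}/api/oauth/token`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      grant_type: "client_credentials",
      client_id: clientId,
      client_secret: clientSecret,
    }),
  });
  const { access_token } = await res.json();
  return access_token as string;
}

async function upsertManufacturers(
  token: string,
  rows: { name: string; website?: string; description?: string }[]
) {
  // The sync endpoint accepts named operations; "upsert" creates or updates.
  const res = await fetch(`${SHOP}/api/_action/sync`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      "write-manufacturers": {
        entity: "product_manufacturer",
        action: "upsert",
        payload: rows.map((r) => ({
          name: r.name,
          link: r.website,
          description: r.description,
        })),
      },
    }),
  });
  if (!res.ok) throw new Error(`Sync failed: ${res.status}`);
}
```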
by Haqi Ramadhani
Automatically detect new n8n releases (stable or beta) from GitHub, update Coolify environment variables, and trigger deployments.

**Functionality**

This workflow automates deployment of n8n releases to a Coolify instance. It supports two tracks:

- **Beta Releases**: Checks GitHub every minute for prereleases, filters duplicates, updates the N8N_VERSION environment variable, and deploys.
- **Stable Releases** (disabled by default): Checks the latest stable release hourly and deploys.

**Key Features:**

- **Deduplication**: Ensures no repeated deployments for the same release.
- **Version Parsing**: Extracts the semantic version (e.g., 1.34.0) from GitHub release names.
- **Coolify Integration**: Updates environment variables and triggers deployments via API.

**Expected Outcomes**

- New n8n beta/stable releases detected via the GitHub API.
- Coolify environment variable N8N_VERSION updated to the latest version.
- Automatic deployment triggered in Coolify.

**Setup Guide**

1. **Replace Placeholders**: Update m8ccg8k44coogsk84swk8kgs in the Update ENV and Deploy nodes with your Coolify Application UUID.
2. **Configure Credentials**: Add Coolify API credentials (httpHeaderAuth) with a valid API token in the headers.
3. **Enable Triggers**: Toggle the Auto Update Latest Release node if stable releases are desired. Adjust schedule intervals as needed.
4. **Test**: Run the workflow manually to validate API connections and version parsing. (A hedged sketch of the detection and parsing logic follows below.)

**SEO Keywords**

Automated Deployment, n8n CI/CD, Coolify Integration, GitHub Release Monitoring, Environment Variable Management, Beta Release Automation.
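The detection half of this workflow can be sketched as follows. The GitHub releases endpoint is the public API and the regex mirrors the version parsing described above; the Coolify deploy call at the end is a hypothetical placeholder, so use the endpoints documented by your Coolify instance instead.

```typescript
// Fetch the latest n8n prerelease from GitHub and extract its semantic
// version. Deduplication (skipping already-deployed versions) is noted
// in comments; persist the last version however suits your setup.
async function latestBetaVersion(): Promise<string | null> {
  const res = await fetch("https://api.github.com/repos/n8n-io/n8n/releases");
  const releases: { name: string; tag_name: string; prerelease: boolean }[] =
    await res.json();
  const beta = releases.find((r) => r.prerelease);
  if (!beta) return null;
  // Extract a semantic version like 1.34.0 from the release name or tag.
  const match = (beta.name ?? beta.tag_name).match(/\d+\.\d+\.\d+/);
  return match ? match[0] : null;
}

// Hypothetical Coolify deploy trigger; the path and auth header shape may
// differ on your instance — treat this as an assumption, not Coolify's API.
async function deploy(version: string, appUuid: string, token: string) {
  await fetch(`https://coolify.example.com/api/v1/deploy?uuid=${appUuid}`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` },
  });
  console.log(`Deployment triggered for n8n ${version}`);
}
```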
by Ahmed Saadawi
⚠️ **This Workflow Requires a Community Node and a Self-Hosted n8n Instance**

> This workflow uses the Vtiger CRM community node. To use it, you must be running a self-hosted version of n8n with Community Nodes enabled.

🔧 **How to Install the Node**

1. Go to Settings → Community Nodes
2. Click Install Node
3. Enter the package name: n8n-nodes-vtiger-crm
4. Restart your n8n instance if prompted

💬 **Real-time Vtiger Support Tickets to Telegram with Auto Status Updates**

📌 **Overview**

Keep your support team instantly informed when new tickets are created in Vtiger CRM. This workflow:

- Fetches the most recent ticket marked as Open
- Sends its details to a Telegram chat
- Updates the status in Vtiger to In Progress to prevent re-sending

🔄 **What This Workflow Does**

- 📨 Pulls the latest open ticket from Vtiger HelpDesk
- 📲 Sends a rich-text message to Telegram with all key ticket details
- 🔁 Updates the ticket's status to "In Progress"

📲 **Telegram Output Example**

> New ticket with the following details:
> Ticketid: TT2
> Title: Internet down
> Status: Open
> Priority: High
> Severity: Minor
> Category: Small Problem
> Description: The internet was slow from yesterday and today is down completely

🛠️ **Setup Instructions**

🔗 Telegram Bot Setup

1. Open Telegram and search for @BotFather
2. Run /newbot and follow the instructions
3. Save the bot token
4. Add the bot to your chat or group
5. Use @userinfobot to get your chat_id
6. Paste the token and chat ID in the Telegram node inside n8n

(A sketch of the underlying Telegram API call follows below.)

🔗 Vtiger CRM Setup

- Make sure your Vtiger HelpDesk module includes: ticket_no, ticket_title, ticketstatus, ticketpriorities, ticketseverities, ticketcategories, description
- Connect your Vtiger API credentials inside n8n

👥 **Who This Is For**

- Customer support and IT helpdesk teams using Vtiger CRM
- Teams that want instant alerts in Telegram
- Anyone syncing CRM activity with chat-based notifications

🔐 **Credentials Required**

- ✅ Vtiger CRM API credentials
- ✅ Telegram Bot Token

🏷 **Tags**

vtiger, telegram, crm automation, helpdesk alerts, no-code crm, realtime notifications, n8n telegram integration, support ticket automation, self-hosted n8n, community nodes, workflow automation, vtiger crm integration, helpdesk sync, n8n crm alerts
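For context, here is a minimal sketch of the call the Telegram node makes under the hood, using the real Bot API `sendMessage` method. The ticket field names mirror the Vtiger HelpDesk fields listed above; fetching and updating the ticket is handled by the community node.

```typescript
// Send a ticket summary to a Telegram chat via the Bot API.
const BOT_TOKEN = process.env.TELEGRAM_BOT_TOKEN!;
const CHAT_ID = process.env.TELEGRAM_CHAT_ID!;

async function notifyTicket(t: {
  ticket_no: string; ticket_title: string; ticketstatus: string;
  ticketpriorities: string; ticketseverities: string;
  ticketcategories: string; description: string;
}) {
  // Build the message shown in the output example above.
  const text = [
    "New ticket with the following details:",
    `Ticketid: ${t.ticket_no}`,
    `Title: ${t.ticket_title}`,
    `Status: ${t.ticketstatus}`,
    `Priority: ${t.ticketpriorities}`,
    `Severity: ${t.ticketseverities}`,
    `Category: ${t.ticketcategories}`,
    `Description: ${t.description}`,
  ].join("\n");

  await fetch(`https://api.telegram.org/bot${BOT_TOKEN}/sendMessage`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chat_id: CHAT_ID, text }),
  });
}
```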
by Ahmed Saadawi
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 **Vtiger CRM – Auto-Answer FAQs with DeepSeek AI**

**Description:**

This workflow automates the process of answering FAQ drafts in Vtiger CRM using the DeepSeek LLM via LangChain. It's perfect for teams who want to accelerate knowledge base creation, improve support response consistency, or reduce the manual effort of writing FAQ content.

Every 1 minute, this workflow:

- 📥 Retrieves the most recent FAQ record marked as Draft in Vtiger CRM
- 🧠 Sends the question to a LangChain agent powered by DeepSeek AI
- 📝 Receives a plain-text answer
- 📤 Updates the original FAQ with the generated answer and changes its status to Published

⚙️ **How It Works**

- **Trigger:** Scheduled to run every 1 minute
- **Query:** Pulls the latest FAQ from Vtiger where faqstatus = 'Draft'
- **AI Agent:** Uses LangChain + DeepSeek to generate a natural-language answer
- **Memory Buffer:** Keeps context using LangChain memory
- **Update:** Pushes the answer back to Vtiger and marks it as Published

(A hedged sketch of the answer-generation step follows below.)

🛠️ **Setup Instructions**

1. Connect credentials for:
   - Vtiger CRM API
   - DeepSeek API
2. Ensure your Vtiger CRM has a Faq module with fields: question, faq_answer, faqstatus
3. Install the required Community Node:
   - Go to Settings → Community Nodes
   - Click Install Node and enter: n8n-nodes-vtiger-crm
   - Restart your instance when prompted.
4. Optionally customize the schedule or field names as needed.

👤 **Who Is This For?**

- Customer support teams building a knowledge base
- Businesses using Vtiger as a CRM or internal helpdesk
- Teams looking to automate repetitive content creation using LLMs

🔐 **Credentials Required**

- ✅ Vtiger CRM API credentials
- ✅ DeepSeek AI API key

✅ **Highlights**

- Fully automated LLM-powered FAQ generation
- Uses a custom community node for Vtiger support
- Lightweight and runs on a short interval (1 min)
- Includes a sticky note for clarity and onboarding
- Clean conditional logic and memory context built in

🏷 **Tags**

vtiger, crm, faq automation, ai automation, deepseek, langchain, llm, open source crm, faq generation, customer support, n8n, n8n community nodes, workflow automation, ai generated answers, vtiger integration, deepseek ai, langchain integration
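The core step can be sketched as below using DeepSeek's OpenAI-compatible chat completions endpoint. The system prompt is an illustrative assumption, and the Vtiger-style query in the comment is a sketch of the selection criterion rather than the exact query the community node issues.

```typescript
// Answer a draft FAQ question with DeepSeek.
// Vtiger selection sketch (the community node handles this for you):
//   SELECT * FROM Faq WHERE faqstatus = 'Draft';  -- latest record wins

async function answerFaq(question: string): Promise<string> {
  const res = await fetch("https://api.deepseek.com/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY}`,
    },
    body: JSON.stringify({
      model: "deepseek-chat",
      messages: [
        // Illustrative system prompt; the workflow's agent prompt may differ.
        { role: "system", content: "Answer the FAQ question in plain text." },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```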
by Kees Bosch - Browserflow
**Auto find & invite LinkedIn Leads**

This n8n template automates LinkedIn lead generation by scraping profiles, filtering out existing connections, and sending connection requests — all in a controlled, looped workflow. Ideal for outreach campaigns, recruitment, or lead gen efforts.

⚠️ **Disclaimer – Community Node Notice**

This template uses a verified community node available inside the n8n cloud environment. To use it, go to "Nodes" → search for: Browserflow for Linkedin …and click Install. It's officially verified and accessible directly from n8n cloud.

If you wish to run this template locally, go to the settings, click Community Nodes, and search for n8n-nodes-browserflow. After installing, you can start using the actions in this node.

🛠️ **How to Use**

1. **Trigger: Manual Start.** Initiates the workflow manually via the "Test workflow" button, giving you full control.
2. **Scrape LinkedIn Profiles.** Uses the Browserflow automation to extract profile links from a LinkedIn search or keyword query.
3. **Split Out Results.** Converts the list of profiles into individual items for single-profile processing.
4. **Loop Through Each Profile.** Ensures each LinkedIn profile is handled one at a time, avoiding simultaneous actions.
5. **Check Existing Connection.** Verifies if you're already connected with the lead on LinkedIn.
6. **Conditional Logic.** ✅ Already connected → skip to next profile; ❌ Not connected → continue to next step.
7. **Send Connection Invite.** Sends a LinkedIn connection request, optionally with a personalized message.

📦 **Requirements**

- n8n (cloud or self-hosted)
- Installed community node: Browserflow for Linkedin
- LinkedIn account
- Valid Browserflow account (you can set up a free 7-day trial at https://browserflow.io)

⚙️ **Setup Instructions**

1. **Install the Browserflow Community Node.** Search "Browserflow for Linkedin" > Install.
2. **Get your API key.** Get your API key at https://browserflow.io
3. **Set up your Browserflow account.** After registering, set up Browserflow and connect it with LinkedIn using the wizard at https://browserflow.io
4. **Connect with Browserflow by creating a credential.** Click on the Browserflow actions to set up a connection with Browserflow by adding your API key to a credential.

🧩 **Customization Tips**

- **Targeting**: Adjust the Browserflow actions to scrape specific roles, industries, or locations.
- **Messaging**: You can add a message to the connection invite, but keep in mind that LinkedIn limits the number of messages that can be sent each month. Use variables in the message for personalization (e.g., {firstName}) — see the sketch below.
- **Trigger**: Replace the manual trigger with a cron node for scheduled outreach.
- **Integration**: Combine with CRM tools (e.g., HubSpot, Notion, Airtable) for syncing leads, or integrate with AI agents.
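As a small illustration of the `{firstName}`-style personalization mentioned in the tips, here is a self-contained template renderer. The available variable names are assumptions; they depend on the fields your scrape step actually returns.

```typescript
// Substitute {name}-style template variables into an invite message.
function renderMessage(template: string, vars: Record<string, string>): string {
  // Replace each {key} placeholder with its value; leave unknown ones intact.
  return template.replace(/\{(\w+)\}/g, (match: string, key: string) => vars[key] ?? match);
}

// Usage:
const message = renderMessage(
  "Hi {firstName}, I came across your profile and would love to connect!",
  { firstName: "Ada" } // hypothetical field from the scraped profile
);
// → "Hi Ada, I came across your profile and would love to connect!"
```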
by Dave Long
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Using the serial number of assets, this workflow creates a ticket with the subject "Found duplicate Serial Numbers" containing a list of all the duplicate assets for a technician to review and merge. Duplicate assets cause incorrect billing (if customers are billed based on asset counts) and additional overhead when reviewing the history of assets when that history is spread across multiple instances.

Note: Due to limitations of the Syncro API, automatically merging duplicate assets is not possible.

**How it works:**

- Get a list of all assets in Syncro and summarize the list based on the Customer ID, Asset Type, and Asset Serial (see the grouping sketch after this list)
- Create a new ticket listing all of the duplicate assets

**Set up steps:**

1. Install the Syncro RMM community node
2. Connect a Syncro RMM account*
3. Open the "Create a ticket" node and update the customer ID

*See the Syncro RMM Community Node documentation for details about how to get a Syncro API key and what permissions the Syncro API key needs.
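The summarize-and-detect step amounts to grouping assets by a composite key. Here is a minimal sketch of that logic; the `Asset` field names are assumptions standing in for whatever the Syncro node returns.

```typescript
// Group assets by (customer ID, asset type, serial) and keep only groups
// with more than one entry — those are the duplicates to list in the ticket.
interface Asset {
  customerId: number; // field names are assumptions
  assetType: string;
  serial: string;
  name: string;
}

function findDuplicates(assets: Asset[]): Asset[][] {
  const groups = new Map<string, Asset[]>();
  for (const a of assets) {
    if (!a.serial) continue; // skip assets without a serial number
    const key = `${a.customerId}|${a.assetType}|${a.serial}`;
    const group = groups.get(key) ?? [];
    group.push(a);
    groups.set(key, group);
  }
  // Only groups with 2+ assets are duplicates worth a ticket line.
  return [...groups.values()].filter((g) => g.length > 1);
}
```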
by ist00dent
This n8n template allows you to monitor hourly weather conditions in a specific city using OpenWeatherMap and log the results to a Google Sheet. It's perfect for anyone needing periodic weather tracking—whether you're managing logistics, travel planning, or environmental monitoring.

🔧 **How it works**

1. A Schedule Trigger activates the workflow every hour.
2. The Get Weather Data from OpenWeatherMap node fetches real-time weather details using the city name you specify.
3. An IF node checks if the weather description contains "rain" or the temperature is below a set threshold.
4. If the condition is true, the data is formatted with city, temperature, humidity, and conditions.
5. The Google Sheets node appends this formatted information to your designated sheet.

(A sketch of the fetch-and-check step follows below.)

👤 **Who is it for?**

This workflow is ideal for:

- Operations teams monitoring weather-sensitive logistics
- Researchers collecting climate data
- Developers and hobbyists learning how to connect APIs with Google Sheets

🗂️ **Google Sheet Structure**

Your Google Sheet should have the following columns:

- city (string)
- temperature (K) (number)
- humidity (number)
- conditions (string)
- status (string)

⚙️ **Setup Instructions**

1. Create a Google Sheet with the above columns.
2. Set up your Google Service Account credentials in n8n.
3. Replace the API key in the HTTP Request node with your own OpenWeatherMap credential.
4. Specify your target city and ensure your OpenWeatherMap account is active.
5. Adjust the frequency in the Schedule Trigger as needed (default: every hour).
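Steps 2–4 can be sketched like this against OpenWeatherMap's current-weather API, which returns temperatures in Kelvin by default (matching the sheet's "temperature (K)" column). The threshold value is an example, not a value from the workflow.

```typescript
// Fetch current weather for a city and apply the IF node's condition.
const OWM_KEY = process.env.OPENWEATHERMAP_API_KEY!;
const COLD_THRESHOLD_K = 288; // example threshold (~15 °C)

async function checkWeather(city: string) {
  const url = `https://api.openweathermap.org/data/2.5/weather?q=${encodeURIComponent(city)}&appid=${OWM_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`OpenWeatherMap error: ${res.status}`);
  const data = await res.json();

  const description: string = data.weather[0].description;
  const tempK: number = data.main.temp; // Kelvin by default
  const shouldLog = description.includes("rain") || tempK < COLD_THRESHOLD_K;

  // Row shape matching the Google Sheet columns above; null means "skip".
  return shouldLog
    ? {
        city,
        "temperature (K)": tempK,
        humidity: data.main.humidity,
        conditions: description,
        status: "alert",
      }
    : null;
}
```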
by Pedro Santos
🎥 **Summarize YouTube Videos using SearchApi & LLM**

**Who is this for?**

This workflow is ideal for content creators, students, digital marketers, educators, and researchers who want to quickly summarize YouTube videos.

**What problem does this workflow solve?**

Manually extracting important information from lengthy YouTube videos can be tedious and prone to errors. This workflow streamlines the process by automatically fetching video transcripts using SearchApi.io and producing concise, informative summaries through a summarization chain powered by any LLM provider. This allows users to quickly access crucial information without the need for manual transcription or detailed viewing.

**What this workflow does**

1. Fetches the complete transcript of a YouTube video using SearchApi.
2. Combines the retrieved transcript into a single, continuous text.
3. Utilizes a Summarization Chain with an LLM (e.g., OpenRouter models) to create a concise summary of the video content.

**Setup**

1. Install the SearchApi community node:
   - Open Settings → Community Nodes inside your self-hosted n8n instance.
   - Fill npm Package Name with @searchapi/n8n-nodes-searchapi.
   - Accept the risk prompt and hit Install. It should now appear as a node when you search for it.
2. API configuration:
   - Set up your SearchApi.io credentials in n8n.
   - Add your preferred LLM provider credentials (e.g., OpenRouter API).
3. Input requirements:
   - Provide the YouTube video ID (e.g., wBuULAoJxok).
4. Connect the LLM integration:
   - Configure the summarization chain with your chosen model and parameters for text splitting.

**How to customize this workflow to meet your needs**

- Adjust the summarization model or modify text-splitter parameters to accommodate different lengths and complexities of video transcripts.
- Integrate additional nodes to export summaries directly into your preferred tools, such as Google Drive, Slack, or email.
- Customize prompt templates in the summarization chain to obtain various summary styles (bullet points, paragraphs, etc.).
- Modify the trigger to suit your workflow.

**Example Usage**

- Input: YouTube video ID (wBuULAoJxok).
- Output: A concise, actionable summary that highlights key ideas, recommendations, and insights from the video. (See the hedged transcript-fetch sketch below.)
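Steps 1 and 2 (fetch the transcript, then join it into continuous text) can be sketched as follows. The `youtube_transcripts` engine name reflects SearchApi's YouTube transcripts offering, but treat the exact parameter and response field names as assumptions and verify them in the SearchApi docs.

```typescript
// Fetch a YouTube transcript via SearchApi and join segments into one text.
async function fetchTranscript(videoId: string): Promise<string> {
  const params = new URLSearchParams({
    engine: "youtube_transcripts", // engine name per SearchApi; verify in docs
    video_id: videoId,
    api_key: process.env.SEARCHAPI_KEY!,
  });
  const res = await fetch(`https://www.searchapi.io/api/v1/search?${params}`);
  const data = await res.json();
  // Join segment texts into a single continuous transcript (field names assumed).
  return (data.transcripts ?? []).map((s: { text: string }) => s.text).join(" ");
}

// Usage with the example from this template:
const transcript = await fetchTranscript("wBuULAoJxok");
```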
by Tom
**Markdown to Notion Blocks Converter**

Transform markdown-formatted text into properly structured Notion page content with this comprehensive workflow.

**Overview**

This workflow automatically converts markdown text into Notion's block format and inserts it directly into a Notion page. Perfect for content creators, documentation teams, and anyone who needs to migrate markdown content to Notion.

**Features**

- **Complete Markdown Support**: Handles headers (H1-H4), paragraphs, lists, quotes, code blocks, and horizontal rules
- **Rich Text Formatting**: Preserves bold, italic, and link formatting
- **Smart Text Processing**: Generates plain text excerpts and maintains original content structure
- **Direct Notion Integration**: Automatically inserts converted blocks into your specified Notion page
- **Batch Processing**: Efficiently handles large content blocks

**What It Does**

1. Takes markdown-formatted text as input
2. Parses and converts it to Notion's block structure
3. Handles complex formatting, including:
   - Headers and subheaders
   - Bulleted and numbered lists
   - Code blocks with syntax highlighting
   - Blockquotes
   - Bold and italic text
   - Links
   - Horizontal dividers
4. Uploads the converted content directly to your Notion page (a reduced sketch of this conversion follows below)

**Use Cases**

- **Content Migration**: Move existing markdown documentation to Notion
- **Automated Publishing**: Convert blog posts or articles from markdown to Notion
- **Documentation Workflows**: Streamline technical documentation processes
- **Content Syndication**: Publish the same content across multiple platforms

**Requirements**

- Notion API credentials
- Target Notion page ID
- Markdown-formatted source content

**Setup**

1. Configure your Notion API credentials
2. Replace the page ID in the HTTP request node with your target Notion page
3. Connect your markdown data source (replace the mock data node)
4. Execute the workflow
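To show the shape of the conversion, here is a deliberately reduced sketch that maps just two markdown constructs (H1 headings and paragraphs) to Notion block objects and appends them via the Notion API. The full workflow handles many more block types and rich-text formatting; this is an illustration, not the workflow's converter.

```typescript
// Map a minimal subset of markdown to Notion blocks and append them to a page.
function markdownToBlocks(md: string) {
  const richText = (content: string) => [{ type: "text", text: { content } }];
  return md
    .split("\n")
    .filter((l) => l.trim())
    .map((line) =>
      line.startsWith("# ")
        ? { object: "block", type: "heading_1", heading_1: { rich_text: richText(line.slice(2)) } }
        : { object: "block", type: "paragraph", paragraph: { rich_text: richText(line) } }
    );
}

async function appendToNotion(pageId: string, md: string) {
  // PATCH /v1/blocks/{page_id}/children appends child blocks to a page.
  await fetch(`https://api.notion.com/v1/blocks/${pageId}/children`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${process.env.NOTION_TOKEN}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ children: markdownToBlocks(md) }),
  });
}
```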
by Polina Medvedieva
**Who is this template for**

This template is for marketers, SEO specialists, or content managers who need to analyze keywords to identify which ones contain references to a specific area or topic, in this case – IT software, services, tools, or apps.

**Use case**

Automating the process of scanning a large list of keywords to determine if they reference known IT products or services (like ServiceNow, Salesforce, etc.), and updating a Google Sheet with this classification. This helps in categorizing keywords for targeted SEO campaigns, content creation, or market analysis.

**How this workflow works**

1. Fetches keyword data from a Google Sheet
2. Processes keywords in batches to prevent rate limiting
3. Uses an AI agent (OpenAI) to analyze each keyword and determine if it contains a reference to an IT service/software
4. Updates the original Google Sheet with the results in a "Service?" column
5. Continues processing until all keywords are analyzed

**Set up steps**

1. Connect your Google Sheets account credentials
2. Set the Google Sheet document ID (currently using "Copy of Sheet1 1")
3. Configure the OpenAI API credentials for the AI agent
4. Adjust the batch size (currently 6) if needed based on your API rate limits
5. Ensure the Google Sheet has the required columns: "Number", "Keyword", and "Service?"

The AI agent's prompt is highly customizable to match different identification needs. For example, instead of looking for IT software/services, you could modify the prompt to identify:

- Industry-specific terms (healthcare, finance, education)
- Geographic references (cities, countries, regions)
- Product categories (electronics, clothing, food)
- Competitor brand mentions

Here's how you could modify the prompt for different use cases:

```
// For identifying educational content keywords
"Check the keyword I provided and define if this keyword relates to educational content, courses, or learning materials and return yes or no."

// For identifying local service keywords
"Check the keyword I provided and determine if it contains location-specific terms (city names, neighborhoods, regions) that suggest local service intent and return yes or no."

// For identifying competitor mentions
"Check the keyword I provided and determine if it mentions any of our competitors (CompetitorA, CompetitorB, CompetitorC) and return yes or no."
```
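For illustration, here is a sketch of the per-keyword classification call and the batching loop, using OpenAI's chat completions API. The model choice and the exact prompt wording are assumptions; only the yes/no classification task and the batch size of 6 come from the template above.

```typescript
// Classify one keyword as referencing an IT service/software or not.
async function isItService(keyword: string): Promise<"yes" | "no"> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model choice
      messages: [{
        role: "user",
        content:
          `Check the keyword I provided and define if this keyword contains ` +
          `a reference to an IT service or software and return yes or no. ` +
          `Keyword: ${keyword}`,
      }],
    }),
  });
  const data = await res.json();
  const answer: string = data.choices[0].message.content;
  return answer.trim().toLowerCase().startsWith("yes") ? "yes" : "no";
}

// Process keywords in batches of 6 to respect rate limits, mirroring the
// workflow's batch setting; results feed the "Service?" column.
async function classifyAll(keywords: string[]) {
  const results: Record<string, string> = {};
  for (let i = 0; i < keywords.length; i += 6) {
    const batch = keywords.slice(i, i + 6);
    const answers = await Promise.all(batch.map(isItService));
    batch.forEach((kw, j) => (results[kw] = answers[j]));
  }
  return results;
}
```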