by Babish Shrestha
**Who is this template for?**

This workflow powers a simple yet effective customer and sales support chatbot for your webshop. It's perfect for solopreneurs who want to automate customer interactions without relying on expensive or complex support tools.

**How it works**

The chatbot listens to user requests—such as checking product availability—and automatically handles the following:

- Fetches product information from a Google Sheet
- Answers customer queries
- Places an order
- Updates the stock after a successful purchase

Everything runs through a single Google Sheet used for both stock tracking and order management.

**Setup Instructions**

Before you begin, connect your Google Sheets credentials by following this guide: 👉 Setup Google Sheets credentials. These credentials will be used to connect all the tools to Google Sheets.

Get Stock
- Open the "Get Stock" tool node and select the Google Sheets credentials you created.
- Choose the correct Google Sheets document and sheet name, and you are done.

Place Order
- Open the "Place Order" tool node and select the Google Sheets credentials you created.
- Choose the correct Google Sheets document and sheet name.

Update Stock
- Open the "Update Stock" tool node and select the Google Sheets credentials you created.
- Choose the correct Google Sheets document and sheet name.
- In the "Mapping Column Mode" section, select "Map Each Column Manually".
- In "Column to match on", select the column with a unique identifier (e.g., Product ID) to match stock items.
- In the "Values to Update" section, add only the column(s) that need to be updated—usually the stock count.

AI Agent node
- Adjust the prompt according to your use case and customize what you need.

**Google Sheet Template**

Stock sheet

|Case ID|Phone Model|Case Name|Case Type|Image URL|Quantity Available|Initial Inventory|Sold|
|-|-|-|-|-|-|-|-|
|1023|iPhone 14 Pro|Black Leather|MagSafe|https://example.com/url|90|100|10|

Order sheet

|Case ID|Phone Model|Case Name|Name|Phone Number|Address|
|-|-|-|-|-|-|
|1023|iPhone 14 Pro|Black Leather|Fernando Torres|9998898888|Paris, France|
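For reference, here is a minimal sketch of the stock-update logic the "Update Stock" tool performs, written in an n8n Code-node style (the field names `Case ID`, `Quantity Available`, and `Sold`, and the input shape, are assumptions based on the sample sheet above):

```typescript
// Minimal sketch: recompute stock after an order (assumed field names from the sample sheet).
// In an n8n Code node this runs as JavaScript; the type annotations are illustrative only.
interface StockRow {
  "Case ID": number;
  "Quantity Available": number;
  Sold: number;
}

function applyOrder(row: StockRow, orderedQty: number): StockRow {
  if (orderedQty > row["Quantity Available"]) {
    throw new Error(`Only ${row["Quantity Available"]} left for case ${row["Case ID"]}`);
  }
  return {
    ...row,
    "Quantity Available": row["Quantity Available"] - orderedQty,
    Sold: row.Sold + orderedQty,
  };
}

// Example: a customer orders 2 units of case 1023.
const updated = applyOrder({ "Case ID": 1023, "Quantity Available": 90, Sold: 10 }, 2);
console.log(updated); // { "Case ID": 1023, "Quantity Available": 88, Sold: 12 }
```

The "Update Stock" Google Sheets node then writes the new quantity back to the row matched on the unique identifier column (e.g., Case ID).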
by Mihai Farcas
This n8n workflow automates the process of saving web articles or links shared in a chat conversation directly into a Notion database, using Google's Gemini AI and Browserless for web scraping.

**Who is this AI automation template for?**

It's useful for anyone wanting to reduce manual copy-pasting and organize web findings seamlessly within Notion. A smarter web clipping tool!

**What this AI automation workflow does**

- Starts when a message is received.
- Uses a Google Gemini AI Agent node to understand the context and manage the subsequent steps. It identifies whether a message contains a request to save an article/link.
- If a URL is detected, it uses a tool configured with the Browserless API (via the HTTP Request node) to scrape the content of the web page.
- Creates a new page in a specified Notion database, populating it with a summary of the scraped content in a specific format, never leaving out any important details. It also saves the original URL, smart tags, publication date, and other metadata extracted by the AI.
- Posts a confirmation message (e.g., to a Discord channel) indicating whether the article was saved successfully or if an error occurred.

**Setup**

1. Import Workflow: Import this template into your n8n instance.
2. Configure Credentials & Notion Database:
   - Notion Database: Create or designate a Notion database (like the example "Knowledge Database") where articles will be saved. Ensure this database has the following properties (fields):
     - Name (Type: Text) - stores the article title.
     - URL (Type: URL) - stores the original article link.
     - Description (Type: Text) - stores the AI-generated summary.
     - Tags (Type: Multi-select) - optional, for categorization.
     - Publication Date (Type: Date) - optional, stores the date the article was published.
   - Ensure the n8n integration has access to this specific database. If you require a different format for the Notion database, note that you will have to update the Notion tool configuration in this n8n workflow accordingly.
   - Notion Credential: Obtain your Notion API key and add it as a Notion credential in n8n. Select this credential in the save_to_notion tool node.
3. Configure save_to_notion Tool: In the save_to_notion tool node within the workflow, set the 'Database ID' field to the ID of the Notion database you prepared above. Map the workflow data (URL, AI summary, etc.) to the corresponding database properties (URL, Description, etc.). In the blocks section of the Notion tool, you can define a custom format for the research page, allowing the AI to fill in the exact details you want extracted from any web page!
4. Google Gemini AI: Obtain your API key from Google AI Studio or Google Cloud Console (if using Vertex AI) and add it as a credential. Select this credential in the "Tools Agent" node.
5. Discord (or other notification service): If using Discord notifications, create a Webhook URL (instructions) or set up a Bot Token. Add the credential in n8n and select it in the discord_notification tool node. Configure the target Channel ID.
6. Browserless/HTTP Request:
   - Cloud: Obtain your API key from Browserless and configure the website_scraper HTTP Request tool node with the correct API endpoint and authentication header.
   - Self-hosted: Ensure your Browserless Docker container is running and accessible by n8n. Configure the website_scraper HTTP Request tool node with your self-hosted Browserless instance URL.
7. Activate Workflow: Save, test, and activate the workflow.
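As a reference for step 6, here is a minimal sketch of the kind of request the website_scraper HTTP Request node might send to Browserless. The `/content` endpoint and `token` query parameter reflect Browserless's hosted API, but treat the exact endpoint and base URL as assumptions and confirm them against your Browserless plan:

```typescript
// Sketch: fetch rendered page HTML from Browserless (assumed /content endpoint).
const BROWSERLESS_TOKEN = "YOUR_BROWSERLESS_API_KEY"; // placeholder

async function scrape(url: string): Promise<string> {
  const res = await fetch(
    `https://chrome.browserless.io/content?token=${BROWSERLESS_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // Browserless renders the page in a headless browser and returns its HTML.
      body: JSON.stringify({ url }),
    }
  );
  if (!res.ok) throw new Error(`Browserless returned ${res.status}`);
  return res.text();
}

// Usage: the AI agent passes the detected URL to this tool.
scrape("https://example.com/article").then((html) => console.log(html.length));
```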
**How to customize this workflow to your needs**

- **Change AI Model:** Experiment with different AI models supported by n8n (like OpenAI GPT models or Anthropic Claude) in the Agent node if Gemini 2.5 Pro doesn't fit your needs or budget, keeping in mind potential differences in context window size and processing capabilities for large content.
- **Modify Notion Saving:** Adjust the save_to_notion tool node to map different data fields (e.g., change the summary style by modifying the AI prompt, add specific tags, or alter the page content structure) to your Notion database properties.
- **Adjust Scraping:** Modify the prompt/instructions for the website_scraper tool or change the parameters sent to the Browserless API if you need different data extracted from the web pages. You could also swap Browserless for another scraping service/API accessible via the HTTP Request node.
by Hueston
**Who is this for?**

- Content strategists analyzing web page semantic content
- SEO professionals conducting entity-based analysis
- Data analysts extracting structured data from web pages
- Marketers researching competitor content strategies
- Researchers organizing and categorizing web content
- Anyone needing to automatically extract entities from web pages

**What problem is this workflow solving?**

Manually identifying and categorizing entities (people, organizations, locations, etc.) on web pages is time-consuming and error-prone. This workflow solves this challenge by:

- Automating the extraction of named entities from any web page
- Leveraging Google's powerful Natural Language API for accurate entity recognition
- Processing web pages through a simple webhook interface
- Providing structured entity data that can be used for analysis or further processing
- Eliminating hours of manual content analysis and categorization

**What this workflow does**

This workflow creates an automated pipeline between a webhook and Google's Natural Language API to:

1. Receive a URL through a webhook endpoint
2. Fetch the HTML content from the specified URL
3. Clean and prepare the HTML for processing
4. Submit the HTML to Google's Natural Language API for entity analysis
5. Return the structured entity data through the webhook response
6. Extract entities including people, organizations, locations, and more with their salience scores

**Setup**

Prerequisites:
- An n8n instance (cloud or self-hosted)
- Google Cloud Platform account with the Natural Language API enabled
- Google API key with access to the Natural Language API

Google Cloud Setup:
1. Create a project in Google Cloud Platform
2. Enable the Natural Language API for your project
3. Create an API key with access to the Natural Language API
4. Copy your API key for use in the workflow

n8n Setup:
1. Import the workflow JSON into your n8n instance
2. Replace "YOUR-GOOGLE-API-KEY" in the "Google Entities" node with your actual API key
3. Activate the workflow to enable the webhook endpoint
4. Copy the webhook URL from the "Webhook" node for later use

Testing:
1. Use a tool like Postman or cURL to send a POST request to your webhook URL
2. Include a JSON body with the URL you want to analyze: {"url": "https://example.com"}
3. Verify that you receive a response containing the entity analysis data

**How to customize this workflow to your needs**

Analyzing Specific Entity Types:
- Modify the "Google Entities" node parameters to include entityType filters
- Add a "Function" node after "Google Entities" to filter specific entity types
- Create conditions to extract only entities of interest (people, organizations, etc.)

Processing Multiple URLs in Batch:
- Replace the webhook with a different trigger (HTTP Request, Google Sheets, etc.)
- Add a "Split In Batches" node to process multiple URLs
- Use a "Merge" node to combine results before sending the response

Enhancing Entity Data:
- Add additional API calls to enrich extracted entities with more information
- Implement sentiment analysis alongside entity extraction
- Create a data transformation node to format entities by type or relevance

**Additional Notes**

- This workflow respects Google's API rate limits by processing one URL at a time
- The Natural Language API may not identify all entities on a page, particularly for highly technical content
- HTML content is trimmed to 100,000 characters if longer to avoid API limitations
- Consider legal and privacy implications when analyzing and storing entity data from web pages
- You may want to adjust the HTML cleaning process for specific website structures

❤️ Hueston SEO Team
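For orientation, here is a minimal sketch of what the "Google Entities" step does under the hood: it posts the fetched HTML to the Natural Language API's `documents:analyzeEntities` method. The endpoint and request shape follow Google's public API, and the trimming constant mirrors the 100,000-character note above:

```typescript
// Sketch: analyze entities in page HTML with Google's Natural Language API.
const GOOGLE_API_KEY = "YOUR-GOOGLE-API-KEY"; // same placeholder as in the workflow

async function analyzeEntities(html: string) {
  const trimmed = html.slice(0, 100_000); // stay within API limits, as the workflow does
  const res = await fetch(
    `https://language.googleapis.com/v1/documents:analyzeEntities?key=${GOOGLE_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        document: { type: "HTML", content: trimmed },
        encodingType: "UTF8",
      }),
    }
  );
  if (!res.ok) throw new Error(`Natural Language API returned ${res.status}`);
  const data = await res.json();
  // Each entity carries a name, a type (PERSON, ORGANIZATION, LOCATION, ...) and a salience score.
  return data.entities as Array<{ name: string; type: string; salience: number }>;
}
```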
by Halfbit 🚀
**Daily YouTrack In-Progress Tasks Summary to Discord by Assignee**

Keep your team in sync with a daily summary of tasks currently In Progress in YouTrack — automatically posted to your Discord channel. This workflow queries issues, filters them by status, groups them by assignee and priority, and sends a formatted message to Discord. It's perfect for teams that need a lightweight, automated stand-up report.

> 📝 This workflow uses Discord as an example. You can easily replace the messaging integration with Slack, Mattermost, MS Teams, or any other platform that supports incoming webhooks.

**Use Case**

- Remote development teams using YouTrack + Discord
- Replacing daily stand-up meetings with async updates
- Project managers needing quick visibility into active tasks

**Features**

- **Scheduled** daily execution (default: weekdays at 09:00)
- **Status filter**: only issues marked as In Progress
- **Grouping** by assignee and priority
- **Custom mapping** for user mentions (YouTrack → Discord)
- **Clean Markdown output** for Discord, with direct task links

**Setup Instructions**

YouTrack Configuration:
1. Get a permanent token:
   - Go to your YouTrack profile → Account Security → Authentication
   - Create a new permanent token with "Read Issue" permissions
   - Copy the token value
2. Set the base API URL:
   - Format: https://yourdomain.youtrack.cloud/api/issues
   - Replace yourdomain with your actual YouTrack instance
3. Identify custom field IDs:
   - Method 1: Go to YouTrack → Administration → Custom Fields → find your "Status" field and note its ID
   - Method 2: Use the API call GET /api/admin/customFieldSettings/customFields to list all field IDs
   - Method 3: Inspect a task's API response and look for field IDs in the customFields array
   - Example Status field ID: 105-0 or 142-1

Discord Configuration:
1. Create a webhook URL in your Discord server:
   - Server Settings → Integrations → Webhooks → New Webhook
   - Choose the target channel and copy the webhook URL
   - Extract the webhook ID from the URL (the numbers after /webhooks/)

**Environment Variables & Placeholders**

| Placeholder | Description |
|-------------|-------------|
| {{API_URL}} | Your YouTrack API base URL |
| {{TOKEN}} | YouTrack permanent token |
| {{FIELD_ID}} | ID of the "Status" custom field |
| {{QUERY_FIELDS}} | Fields to fetch (e.g., summary, id) |
| {{PROJECT_LINK}} | Link to your YouTrack project |
| {{USER_X}} | YouTrack usernames |
| {{DISCORD_ID_X}} | Discord mentions or usernames |
| {{NAME_X}} | Display names |
| {{WEBHOOK_ID}} | Discord webhook ID |
| {{DISCORD_CHANNEL}} | Discord channel name |
| {{CREDENTIAL_ID}} | Your credential ID in n8n |

**Testing the Workflow**

1. Test the YouTrack connection:
   - Execute the "HTTP Request YT" node individually
   - Verify that issues are returned from your YouTrack instance
   - Check if the Status field ID is correctly filtering tasks
2. Verify filtering:
   - Run the "Filter fields" node
   - Confirm only "In Progress" tasks pass through
3. Check message formatting:
   - Execute the "Discord message" node
   - Review the generated message content and formatting
4. Test Discord delivery:
   - Run the complete workflow manually
   - Verify the message appears in your Discord channel
5. Schedule verification:
   - Enable the workflow
   - Test weekend-skip functionality by temporarily changing dates

**Customization Tips**

- **Language**: All labels/messages are in English — customize if needed
- **User mapping**: Adjust the assignee → Discord mention logic in the message builder
- **Priorities**: Update the priorityMap to reflect your own naming structure
- **Schedule**: Modify the trigger time in the Schedule Trigger node
- **Alternative platforms**: Swap out the Discord webhook for another messaging service if preferred
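For reference, here is a minimal sketch of the YouTrack query the "HTTP Request YT" node issues and one way the results could be grouped by assignee. The query syntax (`State: {In Progress}`) and the `fields` selection follow YouTrack's REST API, but your instance's field names may differ, so treat them as assumptions:

```typescript
// Sketch: fetch In Progress issues from YouTrack and group them by assignee.
const API_URL = "https://yourdomain.youtrack.cloud/api/issues"; // {{API_URL}} placeholder
const TOKEN = "perm:...";                                        // {{TOKEN}} placeholder

async function fetchInProgress(): Promise<any[]> {
  const params = new URLSearchParams({
    query: "State: {In Progress}",
    fields: "idReadable,summary,customFields(name,value(name,login))",
  });
  const res = await fetch(`${API_URL}?${params}`, {
    headers: { Authorization: `Bearer ${TOKEN}`, Accept: "application/json" },
  });
  return res.json();
}

// Group issues by assignee login so the Discord message can list tasks per person.
function groupByAssignee(issues: any[]): Record<string, string[]> {
  const groups: Record<string, string[]> = {};
  for (const issue of issues) {
    const assigneeField = issue.customFields?.find((f: any) => f.name === "Assignee");
    const login = assigneeField?.value?.login ?? "Unassigned";
    (groups[login] ??= []).push(`${issue.idReadable}: ${issue.summary}`);
  }
  return groups;
}
```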
by Ria
This workflow demonstrates how to use the getWorkflowStaticData() function to set any type of variable that will persist across workflow executions. https://docs.n8n.io/code/cookbook/builtin/get-workflow-static-data/

This can be useful, for example, when working with access tokens that expire after a certain time period. Using static data we can keep a record of that access token and its expiry time and build our workflow logic around it.

**Important**

Static data only persists across production executions, i.e. executions triggered by Webhooks or Schedule Triggers (not manual executions!). For this, the workflow has to be activated.

**Setup**

- Configure the HTTP Request node to fetch an access token from your API (optional)
- Activate the workflow
- Test the workflow with the webhook production link
- You can check how the static data is populated in the individual executions

**Feedback**

If you found this useful or want to report some missing information - I'd be happy to hear from you at ria@n8n.io
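Here is a minimal sketch of the token-caching pattern described above, written for an n8n Code node (it runs as JavaScript; the `/oauth/token` URL and the response fields are hypothetical placeholders for your own API):

```typescript
// Inside an n8n Code node: cache an access token in workflow static data.
// Remember: static data only persists in production executions of an activated workflow.
const staticData = $getWorkflowStaticData('global');

const now = Date.now();
if (!staticData.accessToken || now >= (staticData.expiresAt ?? 0)) {
  // Token is missing or expired: fetch a new one (placeholder endpoint and fields).
  const response = await this.helpers.httpRequest({
    method: 'POST',
    url: 'https://api.example.com/oauth/token',
    body: { client_id: 'YOUR_ID', client_secret: 'YOUR_SECRET' },
    json: true,
  });
  staticData.accessToken = response.access_token;
  staticData.expiresAt = now + response.expires_in * 1000;
}

// Pass the (possibly cached) token downstream for later nodes to use.
return [{ json: { accessToken: staticData.accessToken, expiresAt: staticData.expiresAt } }];
```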
by Marcelo Abreu
**Who is this workflow for?**

If you're using Meta Ads to generate new leads for your sales pipeline, this workflow is for you! 🙌🏻

**What this workflow does**

- Triggers every time you have a new calendar event on a chosen Google Account
- Filters only events with the same name as your "Schedule a demo" event
- Formats and sends the event to the Meta Conversion API

**What events can I send?**

Any event you'd like! It's preconfigured with the "Schedule" event, but you can change it to "Purchase", "InitiateCheckout", "Lead", or custom events.

**Setup Guide**

1. Connect Google OAuth2 to n8n
2. Get your Pixel ID and Access Token from Meta
3. Set your configuration node with Pixel ID, Access Token, source_url and event_name

**Requirements**

- Meta Access Token + Pixel ID (via Meta Conversion API): Documentation
- Google access (via OAuth2): Documentation

This free template was created by pdforge. Feel free to contact us via the founder's LinkedIn if you have any questions! 👋🏻
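For context, here is a minimal sketch of the payload shape the workflow ultimately sends to the Meta Conversions API. The Graph API version (`v19.0`) and the hashed-email field are assumptions; check Meta's Conversions API documentation for the identifiers your pixel requires:

```typescript
// Sketch: send a "Schedule" event to the Meta Conversions API.
import { createHash } from "node:crypto";

const PIXEL_ID = "YOUR_PIXEL_ID";         // from the configuration node
const ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"; // from the configuration node

const sha256 = (value: string) =>
  createHash("sha256").update(value.trim().toLowerCase()).digest("hex");

async function sendScheduleEvent(attendeeEmail: string, sourceUrl: string) {
  const payload = {
    data: [
      {
        event_name: "Schedule",               // or "Purchase", "Lead", a custom event, ...
        event_time: Math.floor(Date.now() / 1000),
        action_source: "website",
        event_source_url: sourceUrl,
        user_data: { em: [sha256(attendeeEmail)] }, // Meta expects hashed identifiers
      },
    ],
  };
  const res = await fetch(
    `https://graph.facebook.com/v19.0/${PIXEL_ID}/events?access_token=${ACCESS_TOKEN}`,
    { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify(payload) }
  );
  return res.json();
}
```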
by Giacomo Lanzi
Extract the title tag and meta description from URLs for SEO analysis.

**How it works**

The workflow takes records from Airtable, gets the URL in each record, and extracts the title tag (<title>) and meta description (<meta name="description" content="Some content">) from the related webpage. If the title tag and/or meta description tag isn't available on the webpage, the result will be empty.

**Setup**

Set up a base in Airtable with a table with the following structure: url (field type url), title tag (field type text string), meta desc (field type text field).

A minimum suggested table structure is: url (https://example.com), title tag (Title example), meta desc (This is the meta description of the example page).

Connect Airtable to both Airtable nodes in the template and, with the following formula, get all the records that are missing the title tag and meta desc:

AND(url != "", {title tag} = "", {meta desc} = "")

Insert the URL to be analyzed in the url field of the table and let the workflow do the rest.

**Extra**

You can also calculate the length of the title tag and meta desc using a formula field inside Airtable: LEN({title tag}) or LEN({meta desc})

You can automate the process by calling a webhook from Airtable. For this, you need an Airtable paid plan.
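Here is a minimal sketch of the extraction step, in the style of an n8n Code node working on the fetched page HTML (the regex approach is an assumption about how the template does it; a full HTML parser would be more robust for unusual markup):

```typescript
// Sketch: pull <title> and <meta name="description"> out of raw HTML with regexes.
function extractSeoTags(html: string): { titleTag: string; metaDesc: string } {
  const titleMatch = html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
  const metaMatch = html.match(
    /<meta[^>]+name=["']description["'][^>]*content=["']([^"']*)["'][^>]*>/i
  );
  return {
    // Empty strings mirror the workflow's behaviour when a tag is missing.
    titleTag: titleMatch ? titleMatch[1].trim() : "",
    metaDesc: metaMatch ? metaMatch[1].trim() : "",
  };
}

// Example usage with a fetched page body:
const sample = `<html><head><title>Title example</title>
<meta name="description" content="This is the meta description of the example page"></head></html>`;
console.log(extractSeoTags(sample));
// → { titleTag: 'Title example', metaDesc: 'This is the meta description of the example page' }
```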
by JaredCo
**Real-time Weather Forecasts with MCP Tools**

This n8n workflow demonstrates how to integrate real-time weather intelligence into any automation using the Model Context Protocol (MCP). Get current conditions and 5-day forecasts with natural language queries like "What's the weather like in Miami?" or "Will it rain next Tuesday in Seattle?" - all powered by live weather data and AI.

**Good to know**

- No API keys required - uses a hosted MCP weather server with built-in WorldWeatherOnline integration
- Provides current conditions and detailed 5-day forecasts
- Natural language queries work for any location worldwide
- Powered by WorldWeatherOnline - the world's most accurate weather system
- Fully preconfigured and ready to run out-of-the-box
- Enterprise-ready with error handling and rate limiting

**How it works**

- **Natural Language Input**: Receives weather queries via webhook, chat, email, or voice
- **AI Agent Processing**: The n8n Agent node interprets requests and determines:
  - Location extraction from natural language
  - Weather data type needed (current or 5-day forecast)
  - Response formatting preferences
- **MCP Weather Tool**: A live hosted server provides:
  - Real-time current conditions (temperature, humidity, wind, conditions)
  - 5-day detailed forecasts with daily highs/lows
  - Weather descriptions and condition codes
  - Powered by WorldWeatherOnline's premium data
- **Intelligent Responses**: The AI formats weather data into:
  - Conversational natural language responses
  - Structured data for downstream automation
  - Action-triggering data for workflows

**How to use**

1. Import the workflow into n8n from the template
2. Add your preferred AI model API key to the Agent node
3. Customize the system prompt for your specific use case
4. Connect to your preferred input/output channels
5. Run and start querying weather with natural language

**Use Cases**

- **Smart Home Automation**: "Turn on sprinklers if no rain forecast for 3 days"
- **Travel Planning**: "Check weather for my Paris trip next week"
- **Event Management**: "Will outdoor wedding conditions be good Saturday?"
- **Agriculture/Farming**: "Check 5-day forecast for planting schedule"
- **Logistics**: "Delay shipping if severe weather forecast in delivery zone"
- **Personal Assistant**: "Should I wear a jacket today in Chicago?"
- **Sports/Recreation**: "Surf conditions and wind forecast for weekend"
- **Construction**: "Safe working conditions for outdoor project this week"

**Requirements**

- n8n instance (cloud or self-hosted)
- AI model provider account (OpenAI, Anthropic, Google, etc.)
- Internet connection for MCP weather server access
- Optional: Webhook endpoints for external integrations

**Customizing this workflow**

- **Location Intelligence**: Add geocoding for address-to-coordinates conversion
- **Data Storage**: Save weather history to databases for trend analysis
- **Dashboard Integration**: Connect to Grafana, Tableau, or custom visualizations
- **Voice Integration**: Add speech-to-text for voice weather queries
- **Scheduling**: Set up automated daily/weekly weather briefings
- **Conditional Logic**: Trigger different actions based on weather conditions

**Sample Input/Output**

Natural language queries:

- "What's the weather like in Miami?"
- "Will it rain next Tuesday in Seattle?"
- "5-day forecast for London"
- "Temperature in Tokyo tomorrow"
- "Weather conditions for outdoor event Saturday"

Rich responses:

```json
{
  "location": "Miami, FL",
  "current": {
    "temperature": "78°F",
    "condition": "Partly Cloudy",
    "humidity": "65%",
    "wind": "10 mph SE"
  },
  "forecast": {
    "today": "High 82°F, Low 71°F, 20% rain",
    "tomorrow": "High 85°F, Low 73°F, Sunny"
  },
  "ai_summary": "Perfect beach weather in Miami today! Partly cloudy with comfortable temperatures and light winds."
}
```

**Why This Workflow is Unique**

- **Zero Setup Weather Data**: No API key management - the MCP server handles everything
- **World-Class Accuracy**: Powered by WorldWeatherOnline's premium weather data
- **AI-Powered Intelligence**: Natural language understanding of complex weather queries
- **Enterprise Ready**: Built-in error handling, rate limiting, and reliability
- **Global Coverage**: Worldwide weather data with location intelligence
- **Action-Oriented**: Designed for automation decisions, not just information display

Transform your automations with intelligent weather awareness powered by the world's most accurate weather system!

**🧪 Setup Steps**

✅ The Agent node is already configured:
- The system prompt is included
- The tool endpoint is pre-set

All you need to do is:
1. Add your AI model API key to the existing Agent credential
2. Hit run and you're done ✅

🔗 Full project link: Github: weathertrax-mcp-agent-demo
by LukaszB
This workflow is designed for freelancers, solopreneurs, and business owners who receive a high volume of irrelevant messages in their Gmail inbox — from cold offers to spammy promotions — and want to automatically filter and delete them using AI.

Its main purpose is to scan new emails with the help of OpenAI, classify their content, and automatically delete those considered marketing (OFFER) or junk (SPAM). The result is a cleaner inbox without the need to manually sift through low-value messages. The classification logic uses a detailed system prompt with practical examples, so even complex or borderline messages are categorized accurately. Important emails — such as payment confirmations, shipping updates, or genuine business inquiries — remain untouched. This helps maintain a professional inbox with only valuable and relevant communication.

The entire process runs automatically in the background and can be customized further — for example, to archive instead of delete, or log deleted emails for review.

**How it works**

When triggered (every hour), the workflow fetches new Gmail messages using the Gmail Trigger node. Each message is passed to an AI classifier powered by OpenAI, which reads the message body (email snippet) and returns one of three labels:

- SPAM: Obvious junk messages, scams, or low-effort bulk messages
- OFFER: Cold outreach, discount promotions, cart reminders, or generic advertising
- IMPORTANT: Valuable information for the user, even if commercial (e.g., invoices, order updates, personal inquiries)

The workflow then routes the result through an IF node. If the message is marked as SPAM or OFFER, it is immediately deleted from Gmail via the Gmail Delete node. Emails marked as IMPORTANT are ignored and remain in the inbox. The classification is entirely AI-driven based on message content — sender address, headers, or metadata are not used.

**How to set up**

To get started, simply connect two credentials:

- A Gmail account using OAuth2 (via the Gmail Trigger and Gmail Delete nodes)
- An OpenAI API key (used by the AI classifier node)

No advanced setup is needed beyond these two connections. Optionally, you can review or modify the system prompt used for classification — it's available inside the workflow's LangChain AI Agent node. The prompt is in English, so it's recommended to use this workflow with English-language emails for best results.

By default, the workflow deletes matching emails immediately. If you prefer safer testing, you can modify the Gmail node to archive, label, or log emails instead of deleting them. The full workflow takes around 5–10 minutes to configure and includes a sticky note with additional instructions and warnings.
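To make the routing step under "How it works" concrete, here is a minimal sketch of the decision the IF node encodes. The `category` field on the classifier output is an assumption; adjust it to whatever field your AI node actually returns:

```typescript
// Sketch: decide what to do with a message based on the classifier's label.
type Label = "SPAM" | "OFFER" | "IMPORTANT";

function actionFor(label: Label): "delete" | "keep" {
  // SPAM and OFFER are deleted by the Gmail Delete node; IMPORTANT stays in the inbox.
  return label === "IMPORTANT" ? "keep" : "delete";
}

// Example classifier outputs (hypothetical shape):
const messages = [
  { id: "m1", category: "OFFER" as Label },
  { id: "m2", category: "IMPORTANT" as Label },
];
for (const msg of messages) {
  console.log(msg.id, actionFor(msg.category)); // m1 delete, m2 keep
}
```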
by bangank36
This workflow restores all n8n instance workflows from GitHub backups using the n8n API node. It complements the Backup Your Workflows to GitHub template by allowing users to seamlessly restore previously saved workflows.

**How It Works**

The workflow fetches workflows stored in a GitHub repository and imports them into your n8n instance.

**Setup Instructions**

To configure the workflow, update the Globals node with the following values:

- **repo.owner** – Your GitHub username
- **repo.name** – The name of your GitHub repository storing the workflows
- **repo.path** – The folder path within the repository where workflows are stored

For example, if your GitHub username is john-doe, your repository is named n8n-backups, and workflows are stored in a workflows/ folder, you would set:

- repo.owner → john-doe
- repo.name → n8n-backups
- repo.path → workflows/

**Required Credentials**

- **GitHub API** – Access to your repository
- **n8n API** – To import workflows into your n8n instance

**Who Is This For?**

This template is ideal for users who want to restore their workflows from GitHub backups, ensuring easy migration and recovery in case of data loss.

Check out my other templates: 👉 My n8n Templates
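For orientation, here is a minimal sketch of what the GitHub and n8n API steps boil down to outside of n8n: list the workflow JSON files in the backup folder, download each one, and create it via n8n's public REST API. The GitHub contents endpoint and n8n's `POST /api/v1/workflows` with an `X-N8N-API-KEY` header are the standard public APIs; the placeholder values mirror the example above:

```typescript
// Sketch: restore workflow JSON files from a GitHub repo into an n8n instance.
const REPO = { owner: "john-doe", name: "n8n-backups", path: "workflows" };
const GITHUB_TOKEN = "ghp_...";               // GitHub API credential
const N8N_URL = "https://your-n8n-instance";  // base URL of your instance
const N8N_API_KEY = "n8n_api_...";            // n8n API credential

async function restoreAll() {
  // 1. List files in the backup folder via the GitHub contents API.
  const listRes = await fetch(
    `https://api.github.com/repos/${REPO.owner}/${REPO.name}/contents/${REPO.path}`,
    { headers: { Authorization: `Bearer ${GITHUB_TOKEN}`, Accept: "application/vnd.github+json" } }
  );
  const files: Array<{ name: string; download_url: string }> = await listRes.json();

  // 2. Download each workflow JSON and create it through the n8n public API.
  for (const file of files.filter((f) => f.name.endsWith(".json"))) {
    const workflow = await (await fetch(file.download_url)).json();
    await fetch(`${N8N_URL}/api/v1/workflows`, {
      method: "POST",
      headers: { "X-N8N-API-KEY": N8N_API_KEY, "Content-Type": "application/json" },
      body: JSON.stringify({
        name: workflow.name,
        nodes: workflow.nodes,
        connections: workflow.connections,
        settings: workflow.settings ?? {},
      }),
    });
  }
}
```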
by Jonathan | NEX
**Supercharge Your Security Operations for Free**

Stop wasting time manually investigating suspicious IP addresses. This workflow template is your launchpad to automating real-time IP cybersecurity analysis using the NixGuard platform, which you can use for free.

This is the first of a two-part system designed to integrate seamlessly into your existing security stack, especially with Wazuh. It calls our main workflow, Automate IP Reputation Checks and Get AI Risk Summaries from NixGuard, to do the heavy lifting.

**What This Workflow Unlocks for You**

- **Free AI-Powered Risk Summaries:** Don't just get data; get answers. NixGuard provides a clear, human-readable summary of why an IP is considered risky.
- **Automated IP Reputation Checks:** Programmatically check any IP against a vast array of threat intelligence sources.
- **A Foundation for Your SOC Automation:** Use the results to trigger your incident response process. The template includes a pre-built example of how to send a detailed alert to Slack, which you can easily adapt for Jira, TheHive, or any other tool.

**How the Two-Workflow System Works**

This "Dispatcher" workflow is designed for flexibility. It holds your API key and input, then calls the main analysis workflow. This allows you to easily create multiple triggers (e.g., one for Slack bots, one for webhooks) without duplicating the core logic.

**Critical Setup Instructions**

1. Get the Main Workflow: First, add the main analysis engine to your n8n instance from the community page: NixGuard Analysis Workflow.
2. Add Your Free API Key: In this workflow, click the blue Set API Key & Initial Prompt node. Paste your free NixGuard API key into the apiKey value field.
3. Connect the Workflows: Click the purple Execute NixGuard & Wazuh Workflow node. In the parameters, use the dropdown to select the main analysis workflow you added in Step 1.

Ready to automate your threat intelligence? Get your free API key and learn more at:

🔗 Learn more about NixGuard: https://thenex.world
🔗 Get started with a free security subscription: https://thenex.world/security/subscribe

Tags: Free, IP Analysis, NixGuard, Wazuh, Security, Automation, AI, Cybersecurity, Threat Intelligence, SOC, Incident Response, IP Reputation, DevSecOps, API
by Ramsey Njire
**Who Is This For?**

This workflow is perfect for content creators, marketers, and business professionals who receive regular newsletters and want to effortlessly convert them into engaging LinkedIn posts. By automating the extraction and repurposing process, you can save time and consistently share thoughtful updates with your network.

**What Problem Does This Workflow Solve?**

Manually reading newsletters, extracting the key points, and then formatting that content into professional, engaging LinkedIn posts can be time-consuming and error-prone. This workflow automates those steps by:

- **Filtering Emails:** Uses the Gmail node to process only those emails from a specific sender (e.g., newsletter@example.com).
- **Extracting Content:** Leverages OpenAI to identify and summarize the top news items in your newsletter.
- **Generating Posts:** Crafts concise, insightful LinkedIn posts in a smart, deadpan style with a touch of subtle humor.
- **Publishing:** Posts the generated content directly to LinkedIn.

**What This Workflow Does**

- **Filter Newsletters:** The Gmail node is set up to only handle emails from your chosen sender, ensuring that only relevant newsletters are processed.
- **Extract Key Content:** An OpenAI node analyzes the newsletter text to pull out the most important news items, including headlines and summaries.
- **Split Content:** A Split Out node divides the extracted content so each news item is processed on its own.
- **Generate LinkedIn Posts:** Another OpenAI node takes each news item's details and produces a well-structured LinkedIn post that delivers practical insights and ends with a reflective observation or question.
- **Publish to LinkedIn:** The LinkedIn node publishes the crafted posts directly to your account.

**Setup**

1. Gmail Node: Rename it to "Filter Gmail Newsletter" and configure it to filter emails by your newsletter sender.
2. OpenAI Nodes: Ensure your OpenAI API credentials are set up correctly. Customize the prompt if needed to match your desired tone.
3. LinkedIn Node: Rename it to "Post to LinkedIn" and confirm that your LinkedIn OAuth2 credentials are properly configured.

**How to Customize**

- **OpenAI Prompts:** Adjust the prompts in the OpenAI nodes to fine-tune the post tone and output formatting.
- **Email Filter:** Change the Gmail filter to match the sender of your newsletters.
- **Post Processing:** Optionally, add extra formatting (using Function nodes) to further enhance the readability of the generated LinkedIn posts.

This template offers an automated, hands-off solution to transform your newsletter content into engaging LinkedIn updates, keeping your audience informed and inspired with minimal effort.