by Ria
This workflow demonstrates how to use the getWorkflowStaticData() function to set any type of variable that persists across workflow executions: https://docs.n8n.io/code/cookbook/builtin/get-workflow-static-data/

This can be useful, for example, when working with access tokens that expire after a certain time period. Using static data we can keep a record of that access token and its expiry time and build our workflow logic around it (a sketch of this pattern follows below).

**Important**: static data only persists across production executions, i.e. executions triggered by Webhook or Schedule Trigger nodes (not manual executions!). For this, the workflow has to be activated.

**Setup**
- Configure the HTTP Request node to fetch an access token from your API (optional)
- Activate the workflow
- Test the workflow with the webhook production link
- Check how the static data is populated in the individual executions

**Feedback**
If you found this useful or want to report some missing information, I'd be happy to hear from you at ria@n8n.io
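A minimal Code node sketch of the token-caching pattern described above, assuming the previous HTTP Request node returns `access_token` and `expires_in` fields (adjust the names to your API):

```js
// Runs in an n8n Code node – cache the access token in workflow static data
const staticData = $getWorkflowStaticData('global');
const now = Date.now();

// Refresh only when there is no cached token or it has expired
if (!staticData.accessToken || now >= (staticData.expiresAt ?? 0)) {
  const response = $input.first().json; // output of the HTTP Request node
  staticData.accessToken = response.access_token; // assumed field name
  staticData.expiresAt = now + response.expires_in * 1000; // assumed field name
}

return [{ json: { accessToken: staticData.accessToken, expiresAt: staticData.expiresAt } }];
```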
by Giacomo Lanzi
Extract the title tag and meta description from URLs for SEO analysis.

**How it works**
The workflow takes records from Airtable, gets the url in each record, and extracts the title tag (<title>) and meta description (<meta name="description" content="Some content">) from the related webpage. If the title tag and/or meta description isn't available on the webpage, the result will be empty.

**Setup**
- Set up a Base in Airtable with a table with the following structure: url (field type url), title tag (field type text string), meta desc (field type text field).
- Minimum suggested table content: url (https://example.com), title tag (Title example), meta desc (This is the meta description of the example page).
- Connect Airtable to both Airtable nodes in the template and, with the following formula, get all the records that are missing title tag and meta desc: AND(url != "", {title tag} = "", {meta desc} = "")
- Insert the URLs to be analyzed in the url field of the table and let the workflow do the rest (an extraction sketch follows below).

**Extra**
You can also calculate the length of title tag and meta desc using a formula field inside Airtable. The formula is LEN({title tag}) or LEN({meta desc}).
You can automate the process by calling a Webhook from Airtable. For this, you need an Airtable paid plan.
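A hedged sketch of the extraction step as a Code node (the template itself may use an HTML Extract node instead; treating `data` as the field holding the fetched page body is an assumption):

```js
// Extract <title> and the meta description from the fetched page HTML
const html = $input.first().json.data ?? ''; // assumed field holding the page body

const titleMatch = html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
const metaMatch = html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i);

return [{
  json: {
    'title tag': titleMatch ? titleMatch[1].trim() : '', // empty when missing
    'meta desc': metaMatch ? metaMatch[1] : '',
  },
}];
```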
by JaredCo
**Real-time Weather Forecasts with MCP Tools**

This n8n workflow demonstrates how to integrate real-time weather intelligence into any automation using the Model Context Protocol (MCP). Get current conditions and 5-day forecasts with natural language queries like "What's the weather like in Miami?" or "Will it rain next Tuesday in Seattle?" – all powered by live weather data and AI.

**Good to know**
- No API keys required – uses a hosted MCP weather server with built-in WorldWeatherOnline integration
- Provides current conditions and detailed 5-day forecasts
- Natural language queries work for any location worldwide
- Powered by WorldWeatherOnline – the world's most accurate weather system
- Fully preconfigured and ready to run out of the box
- Enterprise-ready with error handling and rate limiting

**How it works**
- **Natural Language Input**: Receives weather queries via webhook, chat, email, or voice.
- **AI Agent Processing**: The n8n Agent node interprets requests and determines the location extracted from natural language, the weather data type needed (current or 5-day forecast), and response formatting preferences.
- **MCP Weather Tool**: The live hosted server provides real-time current conditions (temperature, humidity, wind, conditions), detailed 5-day forecasts with daily highs/lows, and weather descriptions and condition codes – powered by WorldWeatherOnline's premium data.
- **Intelligent Responses**: The AI formats weather data into conversational natural language responses, structured data for downstream automation, and action-triggering data for workflows.

**How to use**
1. Import the workflow into n8n from the template
2. Add your preferred AI model API key to the Agent node
3. Customize the system prompt for your specific use case
4. Connect to your preferred input/output channels
5. Run and start querying weather with natural language (see the example call below)

**Use cases**
- **Smart Home Automation**: "Turn on sprinklers if no rain forecast for 3 days"
- **Travel Planning**: "Check weather for my Paris trip next week"
- **Event Management**: "Will outdoor wedding conditions be good Saturday?"
- **Agriculture/Farming**: "Check 5-day forecast for planting schedule"
- **Logistics**: "Delay shipping if severe weather forecast in delivery zone"
- **Personal Assistant**: "Should I wear a jacket today in Chicago?"
- **Sports/Recreation**: "Surf conditions and wind forecast for weekend"
- **Construction**: "Safe working conditions for outdoor project this week"

**Requirements**
- n8n instance (cloud or self-hosted)
- AI model provider account (OpenAI, Anthropic, Google, etc.)
- Internet connection for MCP weather server access
- Optional: webhook endpoints for external integrations

**Customizing this workflow**
- **Location Intelligence**: Add geocoding for address-to-coordinates conversion
- **Data Storage**: Save weather history to databases for trend analysis
- **Dashboard Integration**: Connect to Grafana, Tableau, or custom visualizations
- **Voice Integration**: Add speech-to-text for voice weather queries
- **Scheduling**: Set up automated daily/weekly weather briefings
- **Conditional Logic**: Trigger different actions based on weather conditions

**Sample input/output**

Natural language queries:
- "What's the weather like in Miami?"
- "Will it rain next Tuesday in Seattle?"
- "5-day forecast for London"
- "Temperature in Tokyo tomorrow"
- "Weather conditions for outdoor event Saturday"

Rich response example:

    {
      "location": "Miami, FL",
      "current": {
        "temperature": "78°F",
        "condition": "Partly Cloudy",
        "humidity": "65%",
        "wind": "10 mph SE"
      },
      "forecast": {
        "today": "High 82°F, Low 71°F, 20% rain",
        "tomorrow": "High 85°F, Low 73°F, Sunny"
      },
      "ai_summary": "Perfect beach weather in Miami today! Partly cloudy with comfortable temperatures and light winds."
    }

**Why this workflow is unique**
- **Zero Setup Weather Data**: No API key management – the MCP server handles everything
- **World-Class Accuracy**: Powered by WorldWeatherOnline's premium weather data
- **AI-Powered Intelligence**: Natural language understanding of complex weather queries
- **Enterprise Ready**: Built-in error handling, rate limiting, and reliability
- **Global Coverage**: Worldwide weather data with location intelligence
- **Action-Oriented**: Designed for automation decisions, not just information display

Transform your automations with intelligent weather awareness powered by the world's most accurate weather system!

**🧪 Setup steps**
✅ The Agent node is already configured: the system prompt is included and the tool endpoint is pre-set. All you need to do is:
1. Add your AI model API key to the existing Agent credential
2. Hit run and you're done ✅

🔗 Full project link – GitHub: weathertrax-mcp-agent-demo
by bangank36
This workflow restores all n8n instance workflows from GitHub backups using the n8n API node. It complements the Backup Your Workflows to GitHub template by allowing users to seamlessly restore previously saved workflows.

**How it works**
The workflow fetches workflows stored in a GitHub repository and imports them into your n8n instance (a sketch of the import call follows below).

**Setup instructions**
To configure the workflow, update the Globals node with the following values:
- **repo.owner** – Your GitHub username
- **repo.name** – The name of the GitHub repository storing the workflows
- **repo.path** – The folder path within the repository where workflows are stored

For example, if your GitHub username is john-doe, your repository is named n8n-backups, and workflows are stored in a workflows/ folder, you would set:
- repo.owner → john-doe
- repo.name → n8n-backups
- repo.path → workflows/

**Required credentials**
- **GitHub API** – Access to your repository
- **n8n API** – To import workflows into your n8n instance

**Who is this for?**
This template is ideal for users who want to restore their workflows from GitHub backups, ensuring easy migration and recovery in case of data loss.

Check out my other templates: 👉 My n8n Templates
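For reference, a hedged sketch of what the import step does under the hood, assuming `workflowJson` is one workflow file fetched from GitHub (the template itself uses the n8n API node rather than a raw HTTP call):

```js
// Create the workflow on the target instance via the n8n public REST API
const apiBase = 'https://your-n8n-instance/api/v1'; // adjust to your instance

const res = await fetch(`${apiBase}/workflows`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-N8N-API-KEY': process.env.N8N_API_KEY, // n8n API credential
  },
  body: JSON.stringify({
    name: workflowJson.name,
    nodes: workflowJson.nodes,
    connections: workflowJson.connections,
    settings: workflowJson.settings ?? {},
  }),
});

console.log('Imported workflow id:', (await res.json()).id);
```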
by Cameron Wills
**Who is this for?**
Content creators, social media managers, digital marketers, and researchers who need to download original TikTok videos without watermarks for analysis, repurposing, or archiving purposes.

**What problem does this workflow solve?**
Downloading TikTok videos without watermarks typically requires using questionable third-party websites that may have limitations, ads, or privacy concerns. This workflow provides a clean, automated solution that can be integrated into your own systems and processes.

**What this workflow does**
This workflow automates the process of downloading TikTok videos without watermarks in three simple steps:
1. Fetch the TikTok video page by providing the video URL
2. Extract the raw video URL from the page's HTML data (see the sketch below)
3. Download the original video file without a watermark

(Optional) Upload to Google Drive with public sharing link generation.

The workflow uses web scraping techniques to extract the original video source directly from TikTok's own servers, maintaining the highest possible quality without any added watermarks or branding.

**Setup (est. time: 5–10 minutes)**
Before getting started, you'll need:
- An n8n installation
- The URL of a TikTok video you want to download
- (Optional) Google Drive API enabled in Google Cloud Console, with OAuth Client ID and Client Secret credentials, if you want to use the upload feature

**How to customize this workflow to your needs**
- Replace the example TikTok URL with your desired video links
- Modify the file naming convention for downloaded videos
- Integrate with other nodes to process videos after downloading
- Create a webhook to trigger the workflow from external applications
- Set up a schedule to regularly download videos from specific accounts

This workflow can be extended to support various use cases like trending content analysis, competitor research, creating compilation videos, or building a content library for inspiration. It provides a foundation that can be customized to fit into larger automated workflows for content creation and social media management.
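A heavily hedged sketch of the extraction step (step 2). TikTok embeds page state as JSON inside a script tag; the script id and the path to the playback URL shown here are assumptions that change over time, so verify them against the page source of your target video:

```js
// Pull the embedded JSON out of the fetched TikTok page HTML
const html = $input.first().json.data ?? ''; // assumed field holding the page body

// Script id observed on TikTok pages – treat as an assumption
const match = html.match(
  /<script id="__UNIVERSAL_DATA_FOR_REHYDRATION__"[^>]*>([\s\S]*?)<\/script>/
);
if (!match) {
  throw new Error('Embedded JSON not found – TikTok may have changed its page structure');
}

const pageData = JSON.parse(match[1]);
// Assumed path to the raw (watermark-free) video source – inspect pageData to confirm
const videoUrl =
  pageData?.__DEFAULT_SCOPE__?.['webapp.video-detail']?.itemInfo?.itemStruct?.video?.playAddr;

return [{ json: { videoUrl } }];
```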
by Airtop
**Automating LinkedIn Company URL Verification**

**Use case**
This automation verifies that a given LinkedIn URL actually belongs to a company by comparing the website listed on their LinkedIn page against the expected company domain. It is essential for ensuring data accuracy in lead qualification, enrichment, and CRM updates.

**What this automation does**
Input parameters:
- **Company LinkedIn**: The LinkedIn URL to be verified.
- **Company Domain**: The expected domain (e.g., example.com) for validation.
- **Airtop Profile (connected to LinkedIn)**: Airtop Profile with LinkedIn authentication.

Output:
- Confirmation of whether the LinkedIn page corresponds to the provided domain.
- Returns the verified LinkedIn URL if the match is confirmed.

**How it works**
1. Extracts the website URL from the specified LinkedIn company profile.
2. Compares the extracted URL with the provided company domain (see the sketch below).
3. If the domain is contained in the extracted website, the LinkedIn profile is confirmed as valid.
4. Returns the original LinkedIn URL if the match is successful.

**Setup requirements**
- Airtop API key
- LinkedIn-authenticated Airtop Profile

**Next steps**
- **Use for LinkedIn discovery validation**: Ensure correctness after automated LinkedIn page discovery.
- **Combine with CRM updates**: Prevent incorrect LinkedIn links from being stored in your CRM.
- **Automate in data pipelines**: Use this as a validation gate before enrichment or scoring steps.
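A minimal sketch of the comparison logic, assuming the previous step outputs the website extracted from the LinkedIn page as `extractedWebsite` alongside the `companyDomain` and `companyLinkedIn` inputs (all three field names are assumptions):

```js
// Normalize both sides to bare hostnames before checking containment
const { extractedWebsite, companyDomain, companyLinkedIn } = $input.first().json;

const normalize = (value) =>
  String(value || '')
    .toLowerCase()
    .replace(/^https?:\/\//, '')
    .replace(/^www\./, '')
    .split('/')[0];

const isMatch = normalize(extractedWebsite).includes(normalize(companyDomain));

// Return the verified LinkedIn URL only when the domain matches
return [{ json: { isMatch, verifiedLinkedIn: isMatch ? companyLinkedIn : null } }];
```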
by Monospace Design
**What is this workflow doing?**
This simple workflow pulls the latest Euro foreign exchange reference rates from the European Central Bank and responds with the requested values to an incoming HTTP GET request via a Webhook trigger node.

**Setup**
- No authentication needed – the workflow is ready to use
- Test the workflow template by hitting the Test workflow button and calling the URL in the Webhook node
- Optional: choose your own Webhook listening path in the Webhook trigger node

**Usage**
There are two possible usage scenarios (see the example calls below):
- get all Euro exchange rates as an array of objects
- get only a specific currency exchange rate as a single object

Single exchange rate
Using the HTTP query ?foreign=USD (where USD is one of the available currency symbols) will return only that specifically requested rate. Response example:

    {"currency":"USD","rate":"1.0852"}

All available rates
If no query is provided, all available rates are returned. Response example:

    [{"currency":"USD","rate":"1.0852"},{"currency":"JPY","rate":"163.38"},{"currency":"BGN","rate":"1.9558"},{"currency":"CZK","rate":"25.367"},{"currency":"DKK","rate":"7.4542"},{"currency":"GBP","rate":"0.85495"},{"currency":"HUF","rate":"389.53"},{"currency":"PLN","rate":"4.3053"},{"currency":"RON","rate":"4.9722"},{"currency":"SEK","rate":"11.1675"},{"currency":"CHF","rate":"0.9546"},{"currency":"ISK","rate":"149.30"},{"currency":"NOK","rate":"11.4285"},{"currency":"TRY","rate":"33.7742"},{"currency":"AUD","rate":"1.6560"},{"currency":"BRL","rate":"5.4111"},{"currency":"CAD","rate":"1.4674"},{"currency":"CNY","rate":"7.8100"},{"currency":"HKD","rate":"8.4898"},{"currency":"IDR","rate":"16962.54"},{"currency":"ILS","rate":"3.9603"},{"currency":"INR","rate":"89.9375"},{"currency":"KRW","rate":"1444.46"},{"currency":"MXN","rate":"18.5473"},{"currency":"MYR","rate":"5.1840"},{"currency":"NZD","rate":"1.7560"},{"currency":"PHP","rate":"60.874"},{"currency":"SGD","rate":"1.4582"},{"currency":"THB","rate":"38.915"},{"currency":"ZAR","rate":"20.9499"}]

**Further info**
Read more about Euro foreign exchange reference rates here.
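Hedged usage example – the webhook path `ecb-rates` is an assumption, so use the production URL shown in your Webhook node:

```js
// Query the workflow's webhook for exchange rates
const base = 'https://your-n8n-instance/webhook/ecb-rates';

// Single rate for one currency symbol
const usd = await (await fetch(`${base}?foreign=USD`)).json();
console.log(usd); // { currency: 'USD', rate: '1.0852' }

// All available rates as an array
const all = await (await fetch(base)).json();
console.log(`${all.length} currencies returned`);
```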
by Niklas Hatje
**Use case**
In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible for everyone to add their ideas to a formatted database – somewhere everyone already spends their time and can add a new idea without much extra effort. Since we're using Slack, this seemed like the perfect place to easily add ideas and collect them in Notion.

**What this workflow does**
This workflow waits for a webhook call from Slack, fired when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea (a sketch of the incoming Slack payload follows below).

**Creating your Slack bot**
1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes. Add the chat:write scope.
3. Head over to Slash Commands and click on Create New Command. Use /idea as the command.
4. Copy the test URL from the Webhook node into Request URL.
5. Add whatever feels best to the description and usage hint.
6. Go to Install App and click install.

**Setup**
1. Add a database in Notion with the columns Name and Creator.
2. Add your Notion credentials and add the integration to your Notion page.
3. Fill the setup node below.
4. Create your Slack app (see the other sticky).
5. Click Test workflow and use the /idea command in Slack.
6. Activate the workflow and exchange the Request URL with the production URL from the Webhook node.

**How to adjust it to your needs**
- Adjust the table in Notion, for example to add different types of ideas or areas that they impact
- Add different templates in Notion to make it easier for users to fill their ideas with details
- Rename the Slack command to whatever works best for you

**How to enhance this workflow**
At n8n we use this workflow in combination with some others. For example, we have the following on top:
- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team. (see this template and this template)
- We also added other types, like /pain, to be less solution-driven
- To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here)
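A hedged sketch of handling the incoming slash-command call in a Code node. Slack sends slash commands as application/x-www-form-urlencoded, so the fields arrive under the webhook's `body` (field names per Slack's slash command documentation):

```js
// Read the Slack slash command payload from the Webhook node
const body = $input.first().json.body;

// body.command   -> '/idea'
// body.text      -> everything the user typed after the command
// body.user_name -> Slack username of the submitter

if (body.command !== '/idea') {
  throw new Error(`Unexpected command: ${body.command}`);
}

return [{ json: { idea: body.text, creator: body.user_name } }];
```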
by Daniel Shashko
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the process of scraping product data from e-commerce websites and using it to fine-tune a custom OpenAI GPT model for generating high-quality marketing copy and product descriptions.

**Main use cases**
- Fine-tune OpenAI models with real product data from hundreds of supported e-commerce websites for marketing content generation.
- Create custom AI models specialized in writing compelling product descriptions across different industries and platforms.
- Automate the entire pipeline from data collection to model training using Bright Data's extensive scraper library.
- Generate marketing copy using your custom-trained model via an interactive chat interface.

**How it works**
The workflow operates in two main phases – model training and model usage – organized into these stages:

Data collection & processing
- Manually triggered to start the fine-tuning process.
- Uses Bright Data's web scraper to extract product information from any supported e-commerce platform (Amazon, eBay, Shopify stores, Walmart, Target, and hundreds of other websites).
- Collects product titles, brands, features, descriptions, ratings, and availability status from your chosen platform.
- Easily customizable to scrape different websites by simply changing the dataset configuration and product URLs.

Training data preparation
- A Code node processes the scraped product data to create training examples in OpenAI's required JSONL format (see the sketch after this description).
- For each product, it generates a complete training example with: a system message defining the AI's role as a marketing assistant, a user prompt containing specific product details (title, brand, features, original description snippet), and an assistant response providing an ideal marketing description template.
- Compiles all training examples into a single JSONL file ready for OpenAI fine-tuning.

Model fine-tuning
- Uploads the training file to OpenAI using the OpenAI File Upload node.
- Initiates a fine-tuning job via an HTTP Request to OpenAI's fine-tuning API, using the GPT-4o-mini model as the base.
- The fine-tuning process runs on OpenAI's servers to create your custom model.

Interactive chat interface
- Provides a chat trigger that allows real-time interaction with your fine-tuned model.
- An AI Agent node connects to your custom-trained OpenAI model.
- Users can chat with the model to generate product descriptions, marketing copy, or other content based on the training.

Custom model integration
- The OpenAI Chat Model node is configured to use your specific fine-tuned model ID.
- Delivers responses trained on your product data for consistent, high-quality marketing content.

**Summary flow**
Manual Trigger → Scrape E-commerce Products (Bright Data) → Process & Format Training Data (Code) → Upload Training File (OpenAI) → Start Fine-Tuning Job (HTTP Request) | Parallel: Chat Trigger → AI Agent → Custom Fine-Tuned Model Response

**Benefits**
- Fully automated pipeline from raw product data to trained AI model.
- Works with hundreds of different e-commerce websites through Bright Data's extensive scraper library.
- Creates specialized models trained on real e-commerce data for authentic marketing copy across various industries.
- Scalable solution that can be adapted to different product categories, niches, or websites.
- Interactive chat interface for immediate access to your custom-trained model.
- Cost-effective fine-tuning using OpenAI's most efficient model (GPT-4o-mini).
- Easily customizable with different websites, product URLs, training prompts, and model configurations.

**Setup requirements**
- Bright Data API credentials for web scraping (supports hundreds of e-commerce websites).
- OpenAI API key with fine-tuning access.
- Replace placeholder credential IDs and model IDs with your actual values.
- Customize the product URLs list and the Bright Data dataset for your specific website and use case.

The workflow can be adapted for any e-commerce platform supported by Bright Data's scraping infrastructure.
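A hedged sketch of the "Process & Format Training Data" Code node. The product field names (`title`, `brand`, `features`, `description`) are assumptions – align them with what your Bright Data dataset actually returns:

```js
// Build one JSONL line per scraped product in OpenAI's chat fine-tuning format
const lines = $input.all().map((item) => {
  const p = item.json;
  const example = {
    messages: [
      {
        role: 'system',
        content: 'You are a marketing assistant that writes compelling product descriptions.',
      },
      {
        role: 'user',
        content:
          `Write a product description for "${p.title}" by ${p.brand}. ` +
          `Key features: ${(p.features ?? []).join(', ')}. ` +
          `Current description snippet: ${(p.description ?? '').slice(0, 200)}`,
      },
      {
        role: 'assistant',
        content: p.description ?? '', // ideal marketing description template
      },
    ],
  };
  return JSON.stringify(example); // one JSON object per line = JSONL
});

return [{ json: { jsonl: lines.join('\n') } }];
```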
by Guillaume Duvernay
**Description**
This template provides a simple and powerful backend for adding speech-to-text capabilities to any application. It creates a dedicated webhook that receives an audio file, transcribes it using OpenAI's gpt-4o-mini model, and returns the clean text. To help you get started immediately, you'll find a complete, ready-to-use HTML code example inside the workflow in a sticky note. This code creates a functional recording interface you can use for testing or as a foundation for your own design.

**Who is this for?**
- **Developers**: Quickly add a transcription feature to your application by calling this webhook from your existing frontend or backend code.
- **No-code/low-code builders**: Embed a functional audio recorder and transcription service into your projects by using the example code found inside the workflow.
- **API enthusiasts**: A lean, practical example of how to use n8n to wrap a service like OpenAI into your own secure and scalable API endpoint.

**What problem does this solve?**
- **Provides a ready-made API**: Instantly gives you a secure webhook to handle audio file uploads and transcription processing without any server setup.
- **Decouples frontend from backend**: Your application only needs to know about one simple webhook URL, allowing you to change the backend logic in n8n without touching your app's code.
- **Offers a clear implementation pattern**: The included example code provides a working demonstration of how to send an audio file from a browser and handle the response – a pattern you can replicate in any framework (a minimal sketch also follows below).

**How it works**
This solution works by defining a clear API contract between your application (the client) and the n8n workflow (the backend).

The client-side technique:
1. Your application's interface records or selects an audio file.
2. It makes a POST request to the n8n webhook URL, sending the audio file as multipart/form-data.
3. It waits for the response from the webhook, parses the JSON body, and extracts the value of the Transcript key.
You can see this exact pattern in action in the example code provided in the workflow's sticky note.

The n8n workflow (backend):
1. The Webhook node catches the incoming POST request and grabs the audio file.
2. The HTTP Request node sends this file to the OpenAI API.
3. The Set node isolates the transcript text from the API's response.
4. The Respond to Webhook node sends a clean JSON object ({"Transcript": "your text here..."}) back to your application.

**Setup**
Configure the n8n workflow:
1. In the Transcribe with OpenAI node, add your OpenAI API credentials.
2. Activate the workflow to enable the endpoint.
3. Click the "Copy" button on the Webhook node to get your unique Production Webhook URL.

Integrate with the frontend:
1. Inside the workflow, find the sticky note labeled "Example Frontend Code Below" and copy the complete HTML from the note below it.
2. ⚠️ Important: In the code you just copied, find the line const WEBHOOK_URL = 'YOUR WEBHOOK URL'; and replace the placeholder with the Production Webhook URL from n8n.
3. Save the code as an HTML file and open it in your browser to test.

**Taking it further**
- **Save transcripts**: Add an Airtable or Google Sheets node to log every transcript that comes through the workflow.
- **Error handling**: Enhance the workflow to catch potential errors from the OpenAI API and respond with a clear error message.
- **Analyze the transcript**: Add a Language Model node after the transcription step to summarize the text, classify its sentiment, or extract key entities before sending the response.
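A minimal client-side sketch of the pattern described above (the full working HTML ships in the workflow's sticky note). The form field name `audio` is an assumption – use whatever field your Webhook node expects:

```js
// Send a recorded audio blob to the n8n webhook and read back the transcript
const WEBHOOK_URL = 'YOUR WEBHOOK URL'; // replace with your Production Webhook URL

async function transcribe(audioBlob) {
  const formData = new FormData();
  formData.append('audio', audioBlob, 'recording.webm');

  const response = await fetch(WEBHOOK_URL, { method: 'POST', body: formData });
  if (!response.ok) throw new Error(`Transcription failed: ${response.status}`);

  const { Transcript } = await response.json();
  return Transcript;
}
```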
by Agent Studio
This workflow is an experiment to build HTML pages from a user input using the new Structured Output feature from OpenAI.

**How it works**
1. Users add what they want to build as a query parameter.
2. The OpenAI node generates an interface following a structured output schema defined in the request body (a sketch follows below).
3. The JSON output is then converted to HTML along with a title.
4. The HTML is encapsulated in an HTML node (where the Tailwind CSS script is added).
5. The HTML is rendered to the user via the Webhook response.

**Set up steps**
1. Create an OpenAI API key.
2. Create the OpenAI credentials.
3. Use the credentials for both nodes: the HTTP Request node (as a Predefined Credential type) and the OpenAI node.
4. Activate your workflow.
5. Once active, go to the production URL and add what you'd like to build as the "query" parameter.
Example: https://production_url.com?query=a%20signup%20form

Example of generated page
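A hedged sketch of the Structured Output request the HTTP Request node sends to OpenAI's Chat Completions API. The schema shown (a title plus a Tailwind-styled HTML string) is illustrative – the template defines its own schema in the request body:

```js
// Request body asking OpenAI to return the interface as strict, schema-validated JSON
const body = {
  model: 'gpt-4o-2024-08-06', // a model that supports Structured Outputs
  messages: [
    { role: 'system', content: 'You generate simple web interfaces as structured JSON.' },
    { role: 'user', content: 'a signup form' }, // value of the "query" parameter
  ],
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'html_interface',
      strict: true,
      schema: {
        type: 'object',
        properties: {
          title: { type: 'string' },
          html: { type: 'string', description: 'Body markup styled with Tailwind classes' },
        },
        required: ['title', 'html'],
        additionalProperties: false,
      },
    },
  },
};
```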
by bangank36
This workflow retrieves all Squarespace orders and saves them into a Google Sheets spreadsheet using the Squarespace Commerce API. It uses pagination to ensure all orders are collected efficiently.

**How it works**
- The workflow queries your Squarespace Orders API.
- It fetches data in paginated batches and inserts them into Google Sheets (a sketch of the pagination loop follows below).
- The Global node is used to configure API parameters dynamically, allowing users to set date filters, pagination, and fulfillment status.
- The workflow runs on demand or on a schedule, ensuring your data stays up to date.

**Parameters**
This workflow allows you to customize the API request using the Global node settings:
- **api-version** (string, required) – The current API version (see the Squarespace Orders API documentation).
- **modifiedAfter**={a-datetime} (string, conditional) – Fetch orders modified after a specific date (ISO 8601 format).
- **modifiedBefore**={b-datetime} (string, conditional) – Fetch orders modified before a specific date (ISO 8601 format).
- **cursor**={c} (string, conditional) – Used for pagination; cannot be combined with other filters.
- **fulfillmentStatus**={status} (optional, enum) – Filter by fulfillment status: PENDING, FULFILLED, or CANCELED.
- **maxPage** – Set to -1 to enable infinite pagination and fetch all available orders.

**Requirements**
Credentials:
- Squarespace API key – Retrieve from your Squarespace settings.
- Google Sheets API credentials – Required to insert data into a spreadsheet.

Google Sheets setup:
- Use the Squarespace order export feature to create a reference sheet.
- A Google Sheets template is available.

**Who is this for?**
This workflow is designed for:
- Squarespace store owners exporting orders for tax reports, analytics, or sales tracking.
- Businesses automating order data retrieval for external reporting.
- Anyone needing an efficient way to extract Squarespace order data without manual effort.

**Explore more templates**
- Get all orders in Shopify to Google Sheets
- Sync Shopify customers to Google Sheets + Squarespace compatible csv
👉 Check out my other n8n templates
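A hedged sketch of the paginated request loop, assuming the Squarespace Commerce API v1.0 orders endpoint and a response shaped as `result` plus `pagination.nextPageCursor`; verify both against the current API documentation:

```js
// Fetch all orders by following the pagination cursor until it runs out
const orders = [];
let cursor = null;

do {
  const url = new URL('https://api.squarespace.com/1.0/commerce/orders');
  if (cursor) url.searchParams.set('cursor', cursor);

  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${process.env.SQUARESPACE_API_KEY}` },
  });
  const page = await res.json();

  orders.push(...(page.result ?? []));
  cursor = page.pagination?.hasNextPage ? page.pagination.nextPageCursor : null;
} while (cursor);

console.log(`Fetched ${orders.length} orders`);
```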