by Strategiflows
**Who Is This For?**
E-commerce managers, data analysts, and n8n beginners who need a hands-off way to pull all Shopify orders (even from stores with thousands of orders) into Google Sheets for reporting or BI.

**What Problem Does It Solve?**
Shopify's GraphQL API only returns up to 250 orders per call, forcing you to manually manage cursors and loops. This template handles the "get next 250" logic for you, so you never miss an order.

**What This Workflow Does**
- Schedule Trigger – Runs at your chosen cadence (daily, hourly, or manual).
- Set Date Range – Defines startDay and endDay based on $now.
- GraphQL Loop – Fetches orders 250 at a time, using pageInfo.hasNextPage and endCursor until complete (see the sketch below).
- Code Node – Flattens orders into line-item rows and summarizes by SKU/vendor.
- Google Sheets – Appends results to your sheet for easy analysis.
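The cursor-handling that the GraphQL Loop performs is easier to see in code. Below is a minimal, hypothetical sketch of the same "fetch 250, then follow endCursor" pattern against the Shopify Admin GraphQL API; the store domain, API version, access token, and the selected fields are placeholders, not values taken from the template.

```javascript
// Hypothetical sketch of the pagination loop the GraphQL nodes implement.
const SHOP = 'your-store.myshopify.com';   // placeholder
const API_VERSION = '2024-01';             // placeholder
const ACCESS_TOKEN = 'shpat_...';          // placeholder

const query = `
  query ($cursor: String, $search: String) {
    orders(first: 250, after: $cursor, query: $search) {
      pageInfo { hasNextPage endCursor }
      edges { node { id name createdAt } }
    }
  }`;

async function fetchAllOrders(search) {
  const orders = [];
  let cursor = null;
  let hasNextPage = true;
  while (hasNextPage) {
    const res = await fetch(`https://${SHOP}/admin/api/${API_VERSION}/graphql.json`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-Shopify-Access-Token': ACCESS_TOKEN,
      },
      body: JSON.stringify({ query, variables: { cursor, search } }),
    });
    const { data } = await res.json();
    orders.push(...data.orders.edges.map((edge) => edge.node));
    hasNextPage = data.orders.pageInfo.hasNextPage;   // keep looping until the last page
    cursor = data.orders.pageInfo.endCursor;          // "get next 250" from here
  }
  return orders;
}
```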
by Davide
This workflow integrates Flowise multi-agent chatflows into a custom-branded n8n chatbot, enabling real-time interaction between users and AI agents powered by large language models (LLMs).

**Key Advantages**
- ✅ Easy integration with Flowise: a low-code HTTP node sends user questions to Flowise's API (/api/v1/prediction/FLOWISE_ID) and receives intelligent responses. Multi-agent chatflows are supported, allowing complex, dynamic interactions.
- 🎨 Customizable chatbot UI: includes pre-built JavaScript for embedding the n8n chatbot into any website, with customization options such as welcome messages, branding, placeholder text, chat modes (e.g., popup or embedded), and language support.
- 🔐 Secure & configurable: authorization via Bearer token headers for Flowise API access, with clearly marked notes in the workflow for setting values like FLOWISE_URL and FLOWISE_ID.

**How It Works**
1. Chat Trigger: the workflow starts with the When chat message received node, which acts as a webhook for incoming chat messages from users.
2. HTTP Request to Flowise: the received message is forwarded to the Flowise node, which sends a POST request to the Flowise API endpoint (https://FLOWISE_URL/api/v1/prediction/FLOWISE_ID). The request includes the user's input as a JSON payload ({"question": "{{ $json.chatInput }}"}) and uses HTTP header authentication (e.g., Authorization: Bearer FLOWISE_API).
3. Response Handling: the response from Flowise is passed to the Edit Fields node, which maps the output ($json.text) for further processing or display.

**Set Up Steps**
1. Configure the Flowise integration: replace FLOWISE_URL and FLOWISE_ID in the HTTP Request node with your Flowise instance URL and flow ID, and ensure the Authorization header is set correctly in the credentials (e.g., Bearer FLOWISE_API).
2. Embed the n8n chatbot: use the JavaScript snippet provided in the sticky notes to embed the chatbot on your website. Replace YOUR_PRODUCTION_WEBHOOK_URL with the webhook URL generated by the When chat message received node, and customize the chatbot's appearance and behavior (welcome messages, language, UI elements) via the createChat configuration options (see the sketch below).
3. Optional branding: adjust the sticky note examples to include branding details such as custom messages, colors, or metadata for the chatbot.
4. Activate the workflow: toggle it to "Active" in n8n and test the chat functionality end to end.

**Ideal Use Cases**
- Embedding branded AI assistants into websites.
- Connecting Flowise-powered agents with customer support chatbots.
- Creating dynamic, smart conversational flows with LLMs via n8n automation.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
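For reference, here is a minimal embed snippet in the style the sticky notes describe, using the public @n8n/chat package. Treat the options other than webhookUrl as assumptions to verify against your n8n version and the template's own snippet; the @n8n/chat stylesheet (dist/style.css) also needs to be loaded on the page.

```javascript
// Example embed script (ES module) for the n8n chat widget, based on the @n8n/chat package.
import { createChat } from 'https://cdn.jsdelivr.net/npm/@n8n/chat/dist/chat.bundle.es.js';

createChat({
  webhookUrl: 'YOUR_PRODUCTION_WEBHOOK_URL', // from the "When chat message received" node
  mode: 'window',                            // or 'fullscreen' for an embedded chat
  defaultLanguage: 'en',
  initialMessages: ['Hi there! 👋', 'How can I help you today?'],
});
```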
by Le Thua Phu
**Overview**
This n8n workflow automates the process of crawling a website's sitemap to extract URLs, which is particularly useful for SEO analysis, website auditing, or content monitoring. By leveraging n8n's nodes, the workflow fetches the sitemap from a specified URL, processes the XML data, and extracts individual URLs, which can then be converted into a downloadable file or integrated with tools like Google Sheets.

**How It Works**
The workflow operates sequentially, using a series of nodes to fetch, parse, and process sitemap data:
1. Trigger: initiates when the user clicks "Test workflow" (Manual Trigger node).
2. Set URL: defines the base domain (e.g., https://phu.io.vn/) for the sitemap (Set URL node).
3. Crawl Sitemap: fetches the main sitemap file (sitemap.xml) from the specified domain using an HTTP request (Crawl sitemap node).
4. Parse XML: converts the sitemap XML into a JSON format for easier processing (XML node).
5. Split Sitemap: extracts individual sitemap entries (e.g., <sitemap> tags) from the parsed data (Split Out node).
6. Crawl Sub-Sitemap: fetches each sub-sitemap URL listed in the main sitemap (Crawl sitemap 2 node).
7. Parse Sub-Sitemap XML: converts the sub-sitemap XML into JSON (XML 2 node).
8. Split URLs: extracts individual URLs (e.g., <url> tags) from the sub-sitemap (Split Out 2 node).
9. Convert to File: saves the extracted URLs into a file for download or further use (Convert to File node).

This workflow supports both single sitemap files and sitemap indexes that reference multiple sub-sitemaps, ensuring comprehensive URL extraction. A rough sketch of the parsed data shapes follows the FAQ below.

**How to Use**
To implement this workflow in n8n, follow these steps:
1. Set up n8n: ensure you have an active n8n instance (Cloud, npm, or self-hosted). Refer to the n8n documentation for setup instructions.
2. Import the workflow: copy the JSON from the provided Extract Website URLs from Sitemap.XML for SEO Analysis.json file and import it into your n8n instance via the workflow editor.
3. Configure the domain: in the Set URL node, update the Domain parameter with the target website's base URL (e.g., https://example.com/). Alternatively, paste the full sitemap URL directly into the Crawl sitemap node if known (e.g., https://example.com/sitemap.xml).
4. Test the workflow: click "Test workflow" to execute the Manual Trigger node and verify that the workflow fetches the sitemap and processes the URLs correctly.
5. Download or integrate: the Convert to File node generates a file containing the extracted URLs. Optionally, replace this node with a Google Sheets node to append URLs to a spreadsheet (see the Google Sheets node documentation for setup).
6. Save and activate: save the workflow and activate it for production use if needed, using a trigger like a schedule or webhook (see Trigger Nodes).

**Requirements**
- **n8n Instance**: An active n8n instance (version 1.0 or later recommended) on n8n Cloud, npm, or self-hosted (Docker). See Choose your n8n for details.
- **Technical Knowledge**: Basic understanding of n8n's editor UI and node configuration. Familiarity with XML sitemaps is helpful but not mandatory.
- **Permissions**: For self-hosted setups, ensure the n8n process has network access to fetch the sitemap URL. For Docker deployments, verify permissions as outlined in the n8n v1.0 migration guide.
- **Optional**: If integrating with Google Sheets, valid Google Sheets credentials are required (see Credentials).
- **Timeout Configuration**: The HTTP Request nodes (Crawl sitemap and Crawl sitemap 2) have a 10-second timeout. Adjust the timeout parameter in the node settings if dealing with slow-responding servers.

**FAQ**
Q: What happens if the sitemap is large or contains many sub-sitemaps?
A: The workflow handles sitemap indexes by splitting and processing each sub-sitemap individually. For very large sitemaps, ensure your n8n instance has sufficient resources (memory and CPU) to avoid performance issues. See Scaling n8n for optimization tips.

Q: Can I use this workflow with a specific sitemap URL instead of a domain?
A: Yes. In the Crawl sitemap node, replace the url parameter ({{ $json.Domain }}sitemap.xml) with the direct sitemap URL (e.g., https://example.com/sitemap.xml). Update the node's notes for clarity.

Q: Why am I getting a timeout error?
A: The HTTP Request nodes have a default timeout of 10 seconds. If the target server is slow, increase the timeout value in the options parameter of the Crawl sitemap or Crawl sitemap 2 nodes.

Q: How can I save the URLs to Google Sheets instead of a file?
A: Replace the Convert to File node with a Google Sheets node. Configure it with your Google Sheets credentials and map the loc field from the Split Out 2 node to the desired spreadsheet column. Refer to the Google Sheets node documentation.

Q: Is this workflow compatible with older n8n versions?
A: The workflow uses nodes compatible with n8n version 1.0 and later. For older versions, check for deprecated features (e.g., MySQL support) in the n8n v1.0 migration guide.

Q: Can I automate this workflow to run periodically?
A: Yes. Replace the Manual Trigger node with a Schedule Trigger node to run the workflow at set intervals. See Trigger Nodes for configuration details.

For further assistance, consult the n8n Community Forum or submit an issue on the n8n GitHub repository. Need help customizing? Contact me for consulting and support or add me on Facebook or email.
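If you prefer to inspect or post-process the parsed sitemap in a single Code node instead of the Split Out nodes, a rough sketch looks like this. The property paths (sitemapindex.sitemap, urlset.url, loc) are assumptions about how the XML node names its output and may need adjusting to your XML node settings.

```javascript
// Hypothetical Code-node alternative to the Split Out nodes, shown only to illustrate
// the shape of the parsed sitemap data. Exact property paths depend on the XML node options.
const parsed = $input.first().json;

// A sitemap index typically parses to { sitemapindex: { sitemap: [{ loc: '...' }, ...] } },
// while a plain sitemap parses to { urlset: { url: [{ loc: '...' }, ...] } }.
const entries = parsed.sitemapindex?.sitemap ?? parsed.urlset?.url ?? [];

// Return one n8n item per URL so downstream nodes (Convert to File, Google Sheets) can use `loc`.
return entries.map((entry) => ({ json: { loc: entry.loc } }));
```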
by Emmanuel Bernard
🎉 Do you want to master AI automation, so you can save time and build cool stuff?

I've created a welcoming Skool community for non-technical yet resourceful learners.
👉🏻 Join the AI Atelier 👈🏻

This workflow exposes an API endpoint that lets you dynamically replace an image in Google Slides, perfect for automating deck presentations like updating backgrounds or client logos.

*📺 Youtube Overview 📺*

Here's how to get started:

**Step 1: Set Up a Key Identifier in Google Slides**
Add a unique key identifier to the images you want to replace:
1. Click on the image.
2. Go to Format Options and then Alt Text.
3. Enter your unique identifier, like client_logo or background.

**Step 2: Use a POST Request to Update the Image**
Send a POST request to the workflow endpoint with the following parameters in the body:
- presentation_id: The ID of your Google Slides presentation. You can find it in the URL of your Google presentation (https://docs.google.com/presentation/d/<this-part>/edit).
- image_key: The unique identifier you created.
- image_url: The URL of the new image.

That's it! The specified image in your Google Slides presentation will be replaced with the new one from the provided URL.

This workflow is designed to be flexible, allowing you to use the same identifier across multiple slides and presentations. I hope it streamlines your slide automation process!

Example curl request to execute:

```
curl --location 'https://workflow.url' \
  --form 'presentation_id="google-presentation-id"' \
  --form 'image_key="background"' \
  --form 'image_url="https://picsum.photos/536/354"'
```

Happy automating!
The n8Ninja 🥷
by Audun
**Description**
This workflow reads a sitemap.xml file, extracts all URLs, and allows you to filter out specific types of links, such as PDF files, images, or any other content, based on your needs.

**Who Is This For?**
- **SEO Specialists** looking to analyze specific URLs in their sitemap.
- **Developers** who need to extract links for automated processing.
- **Content Managers** filtering out downloadable assets like PDFs or images.

**How It Works**
1. Fetch sitemap.xml – The workflow reads the sitemap file from a given URL.
2. Extract URLs – Parses all the URLs listed in the sitemap.
3. Filter URLs – Use a simple filter to extract only the links you need (e.g., *.pdf); see the sketch below.
4. Export or Process – The filtered list can be sent via email, stored in a database, or used in another workflow.

**Customization**
- Edit the Set sitemap URL block and change the sitemapUrl value to the sitemap you want to fetch.
- Edit the Filter URLs block and adjust the filter conditions to meet your needs.
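As an illustration of the filter step, here is a hypothetical Code-node version of "keep only PDF links"; it assumes each incoming item carries the URL in a loc field, which may differ in your setup, and the Filter URLs block's UI conditions remain the simpler option.

```javascript
// Hypothetical Code-node equivalent of the Filter URLs step.
const wanted = ['.pdf'];               // extensions to keep; adjust to your needs

return $input.all().filter((item) =>
  wanted.some((ext) => item.json.loc?.toLowerCase().endsWith(ext))
);
```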
by Vitali
📌 Validate Seatable Webhooks with HMAC SHA256 Authentication

This mini workflow is designed to securely validate incoming Seatable webhooks using HMAC SHA256 signature verification.

🔐 **What it does**
- Listens for incoming Seatable webhook requests.
- Calculates a SHA256 HMAC hash of the raw request body using your shared secret.
- Compares the computed hash with the x-seatable-signature header (after removing the sha256= prefix).
- If the hashes match: responds with 200 OK and forwards the request to subsequent nodes.
- If the hashes don't match: responds with 403 Forbidden.

⚠️ **Important Notes**
This workflow is provided as a template and is not intended to work standalone. Please duplicate it and integrate it with your custom logic at the "Add nodes for processing" node.

**Configuration steps**
1. Set your secret key in the "Calculate sha256" crypto node (replace the placeholder).
2. Adjust the webhook path to suit your environment (or set it to "manual" for testing).
3. Connect your actual logic after the verification step.
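To make the verification logic explicit, here is a hedged sketch of the same check written as a single Code node using Node's crypto module (on self-hosted n8n this requires the crypto built-in to be allowed for Code nodes). The template itself uses the Webhook, Crypto, and If nodes instead; the field access and raw-body handling below are assumptions.

```javascript
// Illustrative sketch of the HMAC SHA256 check the workflow performs.
const crypto = require('crypto');

const secret = 'YOUR_SHARED_SECRET';                         // same secret configured in Seatable
const rawBody = JSON.stringify($json.body);                  // ideally the raw, unmodified request body
const header = $json.headers['x-seatable-signature'] || '';
const received = Buffer.from(header.replace(/^sha256=/, ''), 'hex');

const expected = crypto
  .createHmac('sha256', secret)
  .update(rawBody)
  .digest();                                                 // Buffer, for a constant-time comparison

// timingSafeEqual avoids leaking information through comparison timing,
// but requires both buffers to have the same length.
const valid = received.length === expected.length && crypto.timingSafeEqual(received, expected);

return [{ json: { valid } }];
```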
by Jaruphat J.
This workflow integrates LINE BOT, an AI Agent (GPT), Google Sheets, and Google Drive to enable users to search for file URLs using natural language. The AI Agent extracts the filename from the message, searches for the file in Google Sheets, and returns the corresponding Google Drive URL via the LINE BOT.

- Supports natural language queries (e.g., "Find file 1.pdf for me")
- AI-powered filename extraction
- Google Sheets lookup for file URLs
- Auto-response via LINE BOT

**How to Use This Template**
1. Download & import: copy and save the template code as a .json file, then go to the n8n editor, click Import, and upload the file.
2. Update required fields: replace YOUR_GOOGLE_SHEET_ID with your actual Google Sheet ID and YOUR_LINE_ACCESS_TOKEN with your LINE BOT Channel Access Token.
3. Activate & test: click Execute Workflow to test manually, then set the Webhook URL in the LINE Developer Console.

**Features of This Template**
- Supports natural language queries (e.g., "Find file 1.pdf for me")
- AI-powered filename extraction using OpenAI (GPT-4/3.5)
- Real-time file lookup in Google Sheets
- Automatic LINE BOT response
- Fully automated workflow
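For context, here is a minimal sketch of the reply call behind the "Auto-response via LINE BOT" step, using the LINE Messaging API reply endpoint. The function name and message text are illustrative; the template's own node configuration is authoritative.

```javascript
// Sketch of replying to a LINE webhook event with the Drive URL found in Google Sheets.
// The replyToken comes from the incoming webhook event; values below are placeholders.
const LINE_ACCESS_TOKEN = 'YOUR_LINE_ACCESS_TOKEN';

async function replyWithFileUrl(replyToken, fileUrl) {
  await fetch('https://api.line.me/v2/bot/message/reply', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${LINE_ACCESS_TOKEN}`,
    },
    body: JSON.stringify({
      replyToken,
      messages: [{ type: 'text', text: `Here is your file: ${fileUrl}` }],
    }),
  });
}
```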
by Anthony
**Use Case**
It is very convenient to add expenses via a simple chat message. This workflow does exactly that using AI-powered n8n magic!

Send a message to the chat, something like "car wash; 59.3 usd; 25 jan 2024", and get a response:
Your expense saved, here is the output of save sub-workflow:{"cost":59.3,"descr":"car wash","date":"2024-01-25","msg":"car wash; 59.3 usd; 25 jan 2024"}

The LLM will smartly parse your message into structured JSON and save the expense as a new row in a Google Sheet!

**Installation**
1. Set up Google Sheets: clone this Sheet: https://docs.google.com/spreadsheets/d/1D0r3tun7LF7Ypb21CmbTKEtn76WE-kaHvBCM5NdgiPU/edit?gid=0#gid=0 (File -> Make a copy) and select it in the "Save expense into Google Sheets" node.
2. Fix the sub-workflow dropdown: open the "Parse msg and save to Sheets" node (which is an n8n sub-workflow executor tool) and make sure the SAME workflow is chosen in the dropdown. This allows n8n to locate and call the "Workflow Input Trigger" properly when needed.
3. Activate the workflow so the chat works properly.

Send a message to the chat, something like "car wash; 59.3 usd; 25 jan 2024". You should get the response:
Your expense saved, here is the output of save sub-workflow:{"cost":59.3,"descr":"car wash","date":"2024-01-25","msg":"car wash; 59.3 usd; 25 jan 2024"}
and a new row should be inserted in Google Sheets!
by ConvertAPI
**Who is this for?**
For developers and organizations that need to convert DOCX files to PDF.

**What problem is this workflow solving?**
The file format conversion problem.

**What this workflow does**
1. Downloads the DOCX file from the web.
2. Converts the DOCX file to PDF.
3. Stores the PDF file in the local file system.

**How to customize this workflow to your needs**
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Adjust url_to_file in the Config node to a URL pointing to your file.
5. Optionally, additional Body Parameters can be added for the converter.
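To show roughly what the HTTP Request node sends, here is a hedged sketch based on ConvertAPI's public REST conventions. Verify the endpoint path and body shape against the endpoint documentation referenced above, since the template's node is the source of truth; YOUR_SECRET and the file URL are placeholders.

```javascript
// Rough sketch of a DOCX-to-PDF conversion request against ConvertAPI (assumed body shape).
const SECRET = 'YOUR_SECRET';
const urlToFile = 'https://example.com/sample.docx';

async function convertDocxToPdf() {
  const res = await fetch(`https://v2.convertapi.com/convert/docx/to/pdf?Secret=${SECRET}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      Parameters: [{ Name: 'File', FileValue: { Url: urlToFile } }],
    }),
  });
  // The response describes the converted file(s); in the workflow the result is written
  // to the local file system by the final node.
  return res.json();
}
```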
by Jonathan
This workflow takes an alert from Syncro, determines whether it's an agent_offline_trigger type, then determines whether it's a new alert or a close for an existing alert, and submits it to OpsGenie. New alerts create a new alert in OpsGenie, and resolved alerts close the corresponding alert in OpsGenie.

It doesn't require any kind of Google Sheets because OpsGenie allows you to submit a unique ID (known as an alias) along with the alert, which can be referenced later when closing the alert. The trigger type can be changed to suit your needs.

You will need to create an API integration in OpsGenie. In Syncro, in addition to setting up the appropriate notification to webhook, you will also need a script that closes the agent_offline_trigger alert and an automated remediation to trigger that script when the asset goes offline (the script is queued and runs when the asset comes back online).

> This workflow is part of an MSP collection. The original can be found here: https://github.com/bionemesis/n8nsyncro
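The alias mechanism is what removes the need for external storage, and it is easiest to see as two API calls. The sketch below is illustrative only: GENIE_API_KEY and the alias format (syncro-&lt;id&gt;) are assumptions, and the workflow's own nodes define the real payloads.

```javascript
// Sketch of the two OpsGenie Alert API calls the workflow relies on, tied together by an alias.
const GENIE_API_KEY = 'YOUR_OPSGENIE_API_KEY';
const headers = {
  'Content-Type': 'application/json',
  Authorization: `GenieKey ${GENIE_API_KEY}`,
};

// New alert: create it with an alias derived from the Syncro alert ID.
async function createAlert(syncroAlertId, message) {
  await fetch('https://api.opsgenie.com/v2/alerts', {
    method: 'POST',
    headers,
    body: JSON.stringify({ message, alias: `syncro-${syncroAlertId}` }),
  });
}

// Resolved alert: close the existing OpsGenie alert by the same alias.
async function closeAlert(syncroAlertId) {
  await fetch(
    `https://api.opsgenie.com/v2/alerts/syncro-${syncroAlertId}/close?identifierType=alias`,
    { method: 'POST', headers, body: JSON.stringify({}) }
  );
}
```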
by Yaron Been
**Prunaai Flux Schnell Image Generator**

**Description**
This is a 3x faster FLUX.1 [schnell] model from Black Forest Labs, optimised with Pruna with minimal quality loss. Contact us for more at pruna.ai.

**Overview**
This n8n workflow integrates with the Replicate API to use the prunaai/flux-schnell model. This powerful AI model can generate high-quality image content based on your inputs.

**Features**
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

**Required Parameters**
- **prompt** (string): Prompt for the generated image

**Optional Parameters**
- **seed** (integer, default: None): Random seed. Set for reproducible generation.
- **megapixels** (string, default: 1): Approximate number of megapixels for the generated image.
- **speed_mode** (string, default: Juiced 🔥 (default)): Run faster predictions with a model optimized for speed.
- **num_outputs** (integer, default: 1): Number of outputs to generate.
- **aspect_ratio** (string, default: 1:1): Aspect ratio of the output image.
- **output_format** (string, default: jpg): Format of the output images.
- **output_quality** (integer, default: 80): Quality when saving the output images, from 0 to 100. 100 is best quality, 0 is lowest quality. Not relevant for .png outputs.
- **num_inference_steps** (integer, default: 4): Number of denoising steps. 4 is recommended; fewer steps produce lower-quality outputs, faster.

**How to Use**
1. Set up your Replicate API key in the workflow.
2. Configure the required parameters for your use case.
3. Run the workflow to generate image content.
4. Access the generated output from the final node.

**API Reference**
- Model: prunaai/flux-schnell
- API Endpoint: https://api.replicate.com/v1/predictions

**Requirements**
- Replicate API key
- n8n instance
- Basic understanding of image generation parameters
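For reference, here is a hedged sketch of the create-and-poll pattern the workflow automates against the Replicate predictions endpoint listed above. The API token, version ID, input values, and polling interval are placeholders, and the exact request depends on how the template references the model.

```javascript
// Illustrative sketch of creating a Replicate prediction and polling for the result.
const REPLICATE_API_TOKEN = 'r8_...';   // placeholder

async function generateImage(prompt) {
  // Create the prediction.
  const createRes = await fetch('https://api.replicate.com/v1/predictions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${REPLICATE_API_TOKEN}`,
    },
    body: JSON.stringify({
      version: 'MODEL_VERSION_ID', // a version hash for prunaai/flux-schnell when using this endpoint
      input: { prompt, aspect_ratio: '1:1', output_format: 'jpg', num_inference_steps: 4 },
    }),
  });
  let prediction = await createRes.json();

  // Poll until the prediction succeeds or fails (the workflow's "status checking" step
  // does the equivalent with Wait + HTTP Request nodes).
  while (prediction.status !== 'succeeded' && prediction.status !== 'failed') {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    const pollRes = await fetch(prediction.urls.get, {
      headers: { Authorization: `Bearer ${REPLICATE_API_TOKEN}` },
    });
    prediction = await pollRes.json();
  }
  return prediction.output; // array of image URLs on success
}
```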
by Harshil Agrawal
This workflow automatically monitors the functionality of a factory. It logs machine data coming from factory sensors in a CrateDB database, generates an incident report in PagerDuty, and notifies the responsible staff members when the temperature of a machine crosses the threshold value.

This workflow builds on a workflow that generates factory data. Read more about this use case and how to build both workflows with step-by-step instructions in the blog post How to automate your factory's incident reporting.

**Prerequisites**
- A PagerDuty account and credentials
- AMQP, an ActiveMQ connection, and credentials
- A CrateDB instance running locally or on a server, and credentials

**Nodes**
- AMQP Trigger node starts the workflow.
- IF node filters sensor values higher than 50°C.
- PagerDuty node creates an incident in the account.
- Set nodes set the required incident information and sensor data, respectively.
- CrateDB nodes ingest the incident information and machine sensor data, respectively.
- Function node converts degrees from Celsius to Fahrenheit.
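Since the description mentions the Function node's Celsius-to-Fahrenheit conversion, here is a minimal sketch of what that node could contain. The incoming field name (temperature_celsius) is an assumption, not taken from the template.

```javascript
// Minimal Function-node sketch: add a Fahrenheit value next to the Celsius reading.
return items.map((item) => {
  const celsius = item.json.temperature_celsius; // assumed field name
  item.json.temperature_fahrenheit = celsius * 9 / 5 + 32;
  return item;
});
```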