by Rodrigue Gbadou
**What this workflow does**

This n8n workflow connects to Google Search Console to fetch SEO performance data (clicks, impressions, CTR, and average position) for the last 7 days. It formats the results into a clean weekly summary and automatically sends it to your email inbox every Monday morning.

Ideal for website owners, bloggers, and SEO consultants who want to track site performance over time without manual reporting.

**Setup steps**

1. Replace YOUR_SITE_URL in the HTTP Request node with your verified domain from Google Search Console.
2. Connect your Google OAuth2 credentials to the HTTP Request node.
3. Set up your SMTP credentials in the "Send Email" node.
4. Adjust the recipient email address and subject line if necessary.
5. (Optional) Customize the Function node to include more queries or format the report as a PDF (a sketch of this formatting step appears below).

Estimated setup time: ~10 minutes. Sticky notes are included on the workflow canvas to guide you step by step.

**Technologies used**

- Google Search Console API
- SMTP Email Node
- n8n Function Node
- n8n HTTP Request Node
- n8n Sticky Notes
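For reference, here is a minimal sketch of what the summary-formatting Function node could look like. It assumes the HTTP Request node queried the Search Analytics API and received the usual `rows` array; every field name here is illustrative, not the template's exact code.

```javascript
// Minimal sketch of a Function node that turns Search Analytics rows into
// an email summary. Assumes the standard response shape:
// { rows: [{ keys: [query], clicks, impressions, ctr, position }] }.
const rows = $input.first().json.rows ?? [];

const totals = rows.reduce(
  (acc, r) => ({
    clicks: acc.clicks + r.clicks,
    impressions: acc.impressions + r.impressions,
  }),
  { clicks: 0, impressions: 0 }
);

// Top 10 queries by clicks, one line each.
const topQueries = rows
  .sort((a, b) => b.clicks - a.clicks)
  .slice(0, 10)
  .map(
    (r) =>
      `${r.keys[0]}: ${r.clicks} clicks, ${r.impressions} impressions, ` +
      `CTR ${(r.ctr * 100).toFixed(1)}%, avg. position ${r.position.toFixed(1)}`
  )
  .join('\n');

return [
  {
    json: {
      subject: `Weekly SEO report: ${totals.clicks} clicks, ${totals.impressions} impressions`,
      body: `Top queries, last 7 days:\n\n${topQueries}`,
    },
  },
];
```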
by Aditya Gaur
**Who is this template for?**

This template is designed for developers, DevOps engineers, and automation enthusiasts who want to streamline their GitLab merge request process using n8n, a low-code workflow automation tool. It eliminates manual intervention by automating the merging of GitLab branches through API calls.

**How it works**

1. Trigger the workflow: The workflow can be triggered by a webhook, a scheduled event, or a GitLab event (e.g., a new merge request is created or approved).
2. Fetch merge request details: n8n makes an API call to GitLab to retrieve merge request details.
3. Check merge conditions: The workflow validates whether the merge request meets predefined conditions (e.g., approvals met, CI/CD pipelines passed). A sketch of this step appears below.
4. Perform the merge: If all conditions are met, n8n sends a request to the GitLab API to merge the branch automatically.

**Setup Steps**

1. Prerequisites
   - An n8n instance (self-hosted or Cloud)
   - A GitLab personal access token with API access
   - A GitLab repository with merge requests enabled

2. Create the n8n Workflow
   - Set up a trigger: choose a trigger node (Webhook, Cron, or GitLab Trigger).
   - Fetch merge request details: add an HTTP Request node to call GET /merge_requests/:id from the GitLab API.
   - Validate conditions: check that the merge request has the necessary approvals and that CI/CD pipelines have passed.
   - Merge the request: use an HTTP Request node to call the PUT /merge_requests/:id/merge API.

3. Test the Workflow
   - Create a test merge request.
   - Check that the workflow triggers and merges automatically.
   - Debug using n8n logs if needed.

4. Deploy and Monitor
   - Deploy the workflow in production.
   - Use n8n's monitoring features to track execution.

This template enables seamless GitLab merge automation, improving efficiency and reducing manual work!

Note: Never hard-code API tokens or secrets in your HTTP requests; use n8n credentials instead.
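As an illustration, the condition check could be a small Code node like the sketch below. It assumes the previous HTTP Request node returned a GitLab merge request object; the field names follow GitLab's REST API, but verify them against your GitLab version before relying on this.

```javascript
// Hypothetical "Check Merge Conditions" step for an n8n Code node.
// Field names (head_pipeline, merge_status, draft) follow GitLab's
// merge request API; confirm them on your instance.
const mr = $input.first().json;

const pipelinePassed = mr.head_pipeline?.status === 'success';
const mergeable = mr.merge_status === 'can_be_merged';
const notDraft = mr.draft !== true;

return [
  {
    json: {
      iid: mr.iid,
      projectId: mr.project_id,
      canMerge: pipelinePassed && mergeable && notDraft,
    },
  },
];
```

An IF node can then route on `canMerge` so only eligible merge requests reach the merge call.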
by The Higher Pitch
This workflow automates publishing PR News articles to your WordPress website.

🔧 How it works:
- Uses an RSS Feed Trigger to monitor new PR News articles.
- Extracts the article content and parses the featured image URL (see the sketch below).
- Uploads the image to WordPress as a media item.
- Creates a new draft post on the WordPress site using the article's content and sets the uploaded image as the featured image.

✅ Features:
- Polls the RSS feed every minute.
- Automatically extracts and sets featured images.
- Posts are created as drafts for editorial review.

📝 Requirements:
- WordPress REST API access with media upload permission.
- Active WordPress credentials in n8n.

Perfect for teams who want to streamline PR content publishing without manual effort.
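A minimal sketch of the image-parsing step, assuming a Code node that pulls the first image URL out of the item's HTML. The `content` field name is an assumption; check what your RSS Feed Trigger actually outputs.

```javascript
// Hypothetical Code node: find the first <img src> in the RSS item's HTML
// so it can be uploaded to WordPress as the featured image.
const html = $input.first().json.content ?? '';

const match = html.match(/<img[^>]+src=["']([^"']+)["']/i);

return [
  {
    json: {
      imageUrl: match ? match[1] : null,
      hasImage: Boolean(match),
    },
  },
];
```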
by Yang
**What this workflow does**

This workflow extracts product details, like name, price, discount, and rating, from website screenshots using Dumpling AI. It starts when a new product page URL is added to a Google Sheet, captures a screenshot of that page, extracts visible product info from the image, and writes the results back into the sheet.

**What problem is this workflow solving?**

Many product pages block traditional scraping tools or use unstructured layouts. This workflow bypasses HTML limitations by using visual AI extraction, making it reliable even when content is embedded in images or hard to parse with code.

**Who is this for?**

This is ideal for eCommerce researchers, pricing analysts, marketers, or anyone building a product database from websites without needing to code or maintain complex scrapers.

**Setup**

1. Create a Google Sheet with a column named "Site" (or update the trigger).
2. Add your product page URLs in this column, one per row.
3. Connect your Google Sheets and Dumpling AI credentials in n8n.
4. Ensure your Dumpling AI account has API access for screenshots and extraction.

**How to customize the workflow**

- **Prompt adjustment**: In the "Extract Text from Screenshot" node, you can modify the prompt to extract other information like brand name, delivery time, or availability.
- **Add more fields**: After the extraction, edit the "Format Extracted Data" node to map additional fields from the response to your Google Sheet columns (a sketch of this node appears below).
- **Change output destination**: You can easily replace the Google Sheets module with Airtable, Notion, or another app if preferred.

> ⚠️ This works best when the product data is clearly visible in the screenshot.
> It won't extract info that's hidden behind popups or loaded via user interaction.
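For orientation, the "Format Extracted Data" node might look like the sketch below. Dumpling AI's actual response shape depends on your extraction prompt, so every field name here is an assumption to adapt to your own output.

```javascript
// Hypothetical "Format Extracted Data" Code node: map the extraction
// response onto the Google Sheet columns. Adjust field names to match
// what your Dumpling AI prompt actually returns.
const extracted = $input.first().json;

return [
  {
    json: {
      Site: extracted.site ?? '',
      Name: extracted.name ?? '',
      Price: extracted.price ?? '',
      Discount: extracted.discount ?? '',
      Rating: extracted.rating ?? '',
    },
  },
];
```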
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Transform your expense tracking with automated AI receipt processing that extracts data and organizes it instantly.

**What this workflow does**

1. Monitors Google Drive for new receipt uploads (images/PDFs)
2. Downloads and processes files automatically
3. Extracts key data using the VLM Run community node (merchant, amount, currency, date)
4. Saves structured data to Google Sheets for easy tracking

**Setup**

Prerequisites: Google Drive/Sheets accounts, VLM Run API credentials, and an n8n instance.

You need to install the VLM Run community node: go to Settings -> Community Nodes -> Install, then search for @vlm-run/n8n-nodes-vlmrun.

Quick setup:
1. Configure Google Drive OAuth2 and create a receipt upload folder
2. Add VLM Run API credentials
3. Create a Google Sheet with columns: Customer, Merchant, Amount, Currency, Date
4. Update folder/sheet IDs in the workflow nodes
5. Test and activate

**How to customize this workflow to your needs**

Extend functionality by:
- Adding expense categories and approval workflows
- Connecting to accounting software (QuickBooks, Xero)
- Including Slack notifications for processed receipts
- Adding data validation and duplicate detection (see the sketch below)

This workflow transforms manual receipt processing into an automated system that saves hours while improving accuracy.
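One of the suggested customizations, duplicate detection, could start with a small Code node like this sketch. The field names are assumptions; match them to the VLM Run node's actual output.

```javascript
// Hypothetical duplicate-detection helper: build a stable key from
// merchant + amount + currency + date so a later IF node can skip
// receipts that are already in the sheet.
const r = $input.first().json;

const dedupeKey = [r.merchant, r.amount, r.currency, r.date]
  .map((v) => String(v ?? '').trim().toLowerCase())
  .join('|');

return [{ json: { ...r, dedupeKey } }];
```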
by RedOne
**Who is this for?**

This workflow is designed for e-commerce store owners, operations managers, and developers who use Shopify as their e-commerce platform and want an automated way to track and analyze their order data. It is particularly useful for businesses that:

- Need a centralized view of all Shopify orders
- Want to analyze order trends without logging into Shopify
- Need to share order data with team members who don't have Shopify access
- Want to build custom reports based on order information

**What problem is this workflow solving?**

While Shopify provides excellent order management within its platform, many businesses need their order data available in other systems for various purposes:

- **Data accessibility**: Not everyone in your organization may have access to Shopify's admin interface
- **Custom reporting**: Google Sheets allows for flexible analysis and report creation
- **Data integration**: Having orders in Google Sheets makes it easier to combine with other business data
- **Backup**: Creates an additional backup of your critical order information

**What this workflow does**

This n8n workflow creates an automated bridge between your Shopify store and Google Sheets:

1. Listens for new order notifications from your Shopify store via webhooks
2. Processes the incoming order data and transforms it into a structured format (see the sketch below)
3. Stores each new order in a dedicated Google Sheets spreadsheet
4. Sends real-time notifications to Telegram when new orders are received or errors occur

**Setup**

Create a Google Sheet
- Create a new Google Sheet to store your orders
- Add a sheet named "orders" with the following columns: orderId, orderNumber, created_at, processed, processed_at, json, customer, shippingAddress, lineItems, totalPrice, currency

Set Up Telegram Bot
- Create a Telegram bot using BotFather (send /newbot to @BotFather)
- Save your bot token for use in n8n credentials
- Start a chat with your bot and get your chat ID (you can use @userinfobot)

Configure the Workflow
- Set your Google Sheet ID in the "Edit Variables" node
- Enter your Telegram chat ID in the "Edit Variables" node
- Set up your Telegram API credentials in n8n

Configure Shopify Webhook
- In your Shopify admin, go to: Settings > Notifications > Webhooks
- Create a new webhook for "Order creation"
- Set the URL to your n8n webhook URL (from the "Receive New Shopify Order" node)
- Set the format to JSON

**How to customize this workflow to your needs**

- **Additional data**: Modify the "Transform Order Data to Standard Format" function to extract more Shopify data
- **Multiple sheets**: Duplicate the Google Sheets node to store different aspects of orders in separate sheets
- **Telegram messages**: Customize the text in the Telegram nodes to include more details or rich formatting
- **Data processing**: Add nodes to perform calculations or transformations on order data
- **Additional notifications**: Add more channels like Slack, Discord, or SMS
- **Integrations**: Extend the workflow to send order data to other systems like CRMs, ERPs, or accounting software

**Final notes**

This workflow serves as a foundation that you can build upon to create a comprehensive order management system tailored to your specific business needs.
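As a reference point, the transform step could look like the sketch below. It assumes the standard Shopify "Order creation" webhook payload; the field names follow Shopify's REST order shape, but verify them against a real webhook body, and note this is a simplified illustration rather than the template's exact function.

```javascript
// Sketch of "Transform Order Data to Standard Format": flatten a Shopify
// order webhook into one row matching the "orders" sheet columns.
const order = $input.first().json.body ?? $input.first().json;

return [
  {
    json: {
      orderId: order.id,
      orderNumber: order.order_number,
      created_at: order.created_at,
      processed: false,
      processed_at: '',
      json: JSON.stringify(order),
      customer: `${order.customer?.first_name ?? ''} ${order.customer?.last_name ?? ''}`.trim(),
      shippingAddress: order.shipping_address
        ? `${order.shipping_address.address1}, ${order.shipping_address.city}, ${order.shipping_address.country}`
        : '',
      lineItems: (order.line_items ?? [])
        .map((li) => `${li.quantity}x ${li.title}`)
        .join('; '),
      totalPrice: order.total_price,
      currency: order.currency,
    },
  },
];
```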
by bangank36
This workflow restores all n8n instance credentials from GitHub backups using the n8n API node. It complements the Backup Your Credentials to GitHub template by allowing users to seamlessly restore previously saved credentials.

**How It Works**

The workflow fetches credentials stored in a GitHub repository and imports them into your n8n instance (a sketch of the import call appears below).

**Setup Instructions**

To configure the workflow, update the Globals node with the following values:

- **repo.owner**: Your GitHub username
- **repo.name**: The name of your GitHub repository storing the credentials
- **repo.path**: The folder path within the repository where credentials are stored

For example, if your GitHub username is john-doe, your repository is named n8n-backups, and credentials are stored in a credentials/ folder, you would set:

- repo.owner → john-doe
- repo.name → n8n-backups
- repo.path → credentials/

**Required Credentials**

- **GitHub API**: Access to your repository
- **n8n API**: To import credentials into your n8n instance

**Who Is This For?**

This template is ideal for users who want to restore their credentials from GitHub backups, ensuring easy migration and recovery in case of data loss.

Check out my other templates: 👉 My n8n Templates
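For orientation, here is an illustrative sketch of the import call made once per backup file. It is written as plain JavaScript for readability; in the workflow itself this is the n8n API node. The endpoint follows n8n's public API, but confirm the path and payload against your n8n version, and note the instance URL, API key, and credential values below are placeholders.

```javascript
// Illustrative restore call for one credential fetched from GitHub.
// https://your-n8n-instance and YOUR_API_KEY are placeholders.
const cred = { name: 'My Slack', type: 'slackOAuth2Api', data: { /* ... */ } };

const response = await fetch('https://your-n8n-instance/api/v1/credentials', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-N8N-API-KEY': 'YOUR_API_KEY', // store this as an n8n credential instead
  },
  body: JSON.stringify(cred),
});

console.log(await response.json());
```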
by Jacob @ vwork Digital
This n8n template allows you to send emails with a custom alias from your Gmail account.

Since the native Gmail node has some limitations regarding the use of email aliases, this template lets you set up your own internal endpoint/sub-workflow to send emails as an email alias.

**How it works**

This workflow uses a Code node and the Gmail API via an HTTP node to format the email content and send it using an alias on your Gmail account (a sketch of the formatting step appears below).

**Setup instructions**

1. You must have added the email address you wish to send as as an alias in your Gmail account; a guide on how to do so is here.
2. You must have created a Gmail credential in n8n; a guide on how to do so is here.
3. Use your Gmail OAuth credential in the HTTP node.
4. Use this template as an API endpoint or a sub-workflow, and send this payload to it via POST:

```json
{
  "senderName": "SENDER NAME HERE",
  "fromEmail": "FROM EMAIL HERE",
  "replyTo": "REPLY TO EMAIL HERE",
  "toEmail": "jacob@vwork.digital",
  "subject": "SUBJECT LINE HERE",
  "htmlBody": "HTML BODY HERE - MUST BE JSON STRINGIFIED",
  "file_urls": [
    "FILE URLS FOR ATTACHMENTS HERE"
  ]
}
```

**Notes**

Only the following fields are required:
- fromEmail
- toEmail
- subject
- htmlBody

**Customizing this workflow**

You can easily convert this to a sub-workflow by swapping out the Webhook trigger for a "When Executed by Another Workflow" trigger.
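A sketch of what the Code node likely does: build an RFC 2822 message and base64url-encode it, since the Gmail API's users.messages.send endpoint expects a `{ "raw": ... }` body. Attachment handling is omitted for brevity; treat this as a simplified illustration, not the template's exact code.

```javascript
// Build the raw message for POST
// https://gmail.googleapis.com/gmail/v1/users/me/messages/send
const p = $input.first().json;

const headers = [
  `From: ${p.senderName ? `"${p.senderName}" ` : ''}<${p.fromEmail}>`,
  `To: ${p.toEmail}`,
  p.replyTo ? `Reply-To: ${p.replyTo}` : null,
  `Subject: ${p.subject}`,
  'MIME-Version: 1.0',
  'Content-Type: text/html; charset="UTF-8"',
].filter(Boolean);

const message = headers.join('\r\n') + '\r\n\r\n' + p.htmlBody;

// Gmail wants URL-safe base64 without padding.
const raw = Buffer.from(message)
  .toString('base64')
  .replace(/\+/g, '-')
  .replace(/\//g, '_')
  .replace(/=+$/, '');

return [{ json: { raw } }];
```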
by Guillaume Duvernay
Unlock a new level of sophistication for your AI agents with this template. While the native n8n Think Tool is great for giving an agent an internal monologue, it's limited to one instance. This workflow provides a clever solution, using a sub-workflow to create multiple custom thinking tools, each with its own specific purpose.

This template provides the foundation for building agents that can plan, act, and then reflect on their actions before proceeding. Instead of just reacting, your agent can now follow a structured, multi-step reasoning process that you design, leading to more reliable and powerful automations.

**Who is this for?**

- **AI and automation developers**: Anyone looking to build complex, multi-tool agents that require robust logic and planning capabilities.
- **LangChain enthusiasts**: Users familiar with advanced agent concepts like ReAct (Reason-Act) will find this a practical way to implement similar frameworks in n8n.
- **Problem solvers**: If your current agent struggles with complex tasks, giving it distinct steps for planning and reflection can dramatically improve its performance.

**What problem does this solve?**

- **Bypasses the single Think Tool limit**: The core of this template is a technique that allows you to add as many distinct thinking steps to your agent as you need.
- **Enables complex reasoning**: You can design a structured thought process for your agent, such as "Plan the entire process," "Execute Step 1," and "Reflect on the result," making it behave more intelligently.
- **Improves agent reliability and debugging**: By forcing the agent to write down its thoughts at different stages, you can easily see its line of reasoning, making it less prone to errors and much easier to debug when things go wrong.
- **Provides a blueprint for sophisticated AI**: This is not just a simple tool; it's a foundational framework for building state-of-the-art AI agents that can handle more nuanced and multi-step tasks.

**How it works**

1. The re-usable "thinking space": The magic of this template is a simple sub-workflow that does nothing but receive text. This workflow acts as a reusable "scratchpad" (a minimal sketch appears at the end of this description).
2. Creating custom thinking tools: In the main workflow, we use the Tool (Workflow) node to call this "scratchpad" sub-workflow multiple times. We give each of these tools a unique name (e.g., Initial thoughts, Additional thoughts).
3. The power of descriptions: The key is the description you give each of these tool nodes. This description tells the agent when and how it should use that specific thinking step. For example, the Initial thoughts tool is described as the place to create a plan at the start of a task.
4. Orchestration via the system prompt: The main AI Agent's system prompt acts as the conductor, instructing the agent on the overall process and telling it about its new thinking abilities (e.g., "Always start by using the Initial thoughts tool to make a plan...").
5. A practical example: This template includes two thinking tools to demonstrate a "Plan and Reflect" cycle, but you can add many more to fit your needs.

**Setup**

1. Add your own "action" tools: This template provides the thinking framework. To make it useful, you need to give the agent something to do. Add your own tools to the AI Agent, such as a web search tool, a database lookup, or an API call.
2. Customize the thinking tools: Edit the descriptions of the existing Initial thoughts and Additional thoughts tools. Make them relevant to the new action tools you've added. For example: "Plan which of the web search or database tools to use."
3. Update the agent's brain: Modify the system prompt in the main AI Agent node. Tell it about the new action tools you've added and how it should use your customized thinking tools to complete its tasks.
4. Connect your AI model: Select the OpenAI Chat Model node and add your credentials.

**Taking it further**

- **Create more granular thinking steps**: Add more thinking tools for different stages of a process, like a "Hypothesize a solution" tool, a "Verify assumptions" tool, or a "Final answer check" tool.
- **Customize the thought process**: You can change *how* the agent thinks by editing the prompt inside the fromAI('Thoughts', ...) field within each tool. You could ask for thoughts in a specific format, like bullet points or a JSON object.
- **Change the workflow trigger**: Switch the chat trigger for a Telegram trigger, email, Slack, whatever you need for your use case!
- **Integrate with memory**: For even more power, combine this framework with a long-term memory solution, allowing the agent to reflect on its thoughts from past conversations.
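To make the "scratchpad" idea concrete, the sub-workflow can be as small as a trigger plus a Code node that echoes the thought back, while each Tool (Workflow) node fills the input via a fromAI expression. The key name, description, and response wording below are illustrative assumptions, not fixed names from the template.

```javascript
// Minimal "scratchpad" sketch: a Code node in the sub-workflow that simply
// echoes the agent's thought back. The Tool (Workflow) node populates the
// Thoughts input with an expression along the lines of
//   {{ $fromAI('Thoughts', 'Write out your step-by-step plan') }}
const thoughts = $input.first().json.Thoughts ?? '';

return [
  {
    json: {
      // Returning the thought lets the agent "read back" its own reasoning.
      response: `Noted. Your current thinking:\n${thoughts}`,
    },
  },
];
```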
by Matheus Pedrosa
**Who is this template for?**

This template is ideal for n8n instance administrators, developers, and DevOps teams who need a proactive and organized way to monitor the health of their automations. If you want to be notified about failures as soon as they happen, without having to manually check execution logs, this workflow is for you.

**What does this template do?**

This workflow automates error monitoring on your n8n instance. Every hour, it performs the following steps:

1. Queries the n8n API to fetch all executions that have failed in the last hour.
2. Groups the errors by workflow to consolidate the information (see the sketch below).
3. Builds a rich message for each failed workflow, including the error count.
4. Sends an alert to a Slack channel with a button to open the workflow directly, allowing for immediate investigation.

**Requirements**

Before you start, you will need the following configured in your n8n instance:

- **n8n API credentials**: You need to generate an API key in your n8n instance settings so the workflow can query execution data.
- **Slack credentials**: A configured **Slack (OAuth2 API)** credential to allow n8n to send messages to your workspace.

**How to set it up**

Setup is simple and only takes a few minutes:

1. Config node: In the node named "Config", set the value of baseUrl to your n8n instance's URL (e.g., https://n8n.yourdomain.com). This is crucial for generating correct workflow links in the Slack message.
2. Schedule Trigger: The workflow is pre-configured to run every hour. You can adjust the frequency in this node to fit your needs.
3. "Get Failed Executions" node (HTTP Request): Under Authentication, select 'Header Auth'. In the Credentials field, select your n8n API credential.
4. "Post to Slack" node (Slack): Select your Slack credential. In the Channel field, enter the name of the channel where error notifications should be sent (e.g., #n8n-alerts).
5. Activate the workflow! After these steps, just activate the workflow to start automatic error monitoring.

**How to customize the workflow**

- **Change the schedule**: Modify the Schedule Trigger node to run at different intervals (every 15 minutes, once a day, etc.).
- **Change the notification channel**: Instead of Slack, you can replace the last node to send notifications to Discord, Microsoft Teams, Telegram, or even email.
- **Add more information**: You can modify the MakeMessage node that generates the message to include more details about the errors, such as the error message or the exact time of failure.
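The grouping step could resemble the sketch below. It assumes the "Get Failed Executions" node called GET {baseUrl}/api/v1/executions?status=error and that each returned execution carries a workflowId; verify the response shape against your n8n version.

```javascript
// Hypothetical "group errors by workflow" Code node.
const executions = $input.first().json.data ?? [];

const byWorkflow = {};
for (const exec of executions) {
  const id = exec.workflowId;
  if (!byWorkflow[id]) byWorkflow[id] = { workflowId: id, errorCount: 0 };
  byWorkflow[id].errorCount += 1;
}

// One item per failed workflow, with a deep link for the Slack button.
const baseUrl = 'https://n8n.yourdomain.com'; // taken from the Config node
return Object.values(byWorkflow).map((w) => ({
  json: { ...w, workflowUrl: `${baseUrl}/workflow/${w.workflowId}` },
}));
```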
by Yaron Been
This cutting-edge n8n automation is a powerful market research tool designed to continuously monitor and capture User-Generated Content (UGC) opportunities on Fiverr. By intelligently scraping, parsing, and logging gig data, this workflow provides:

- Automated market scanning: daily scrapes of Fiverr UGC gigs, real-time market intelligence, and consistent, hands-off data collection
- Intelligent data extraction: parses complex HTML structures, captures key gig details, and transforms unstructured web data into actionable insights
- Seamless data logging: automatic Google Sheets integration, comprehensive gig marketplace tracking, and historical data preservation

**Key Benefits**

- 🤖 Full automation: continuous market research
- 💡 Smart filtering: detailed UGC gig insights
- 📊 Instant reporting: real-time market trends
- ⏱️ Time-saving: eliminate manual research

**Workflow Architecture**

🔍 Stage 1: Automated Triggering
- **Scheduled scraping**: daily gig discovery
- **Precise timing**: configurable run intervals
- **Consistent monitoring**: always-on market intelligence

🌐 Stage 2: Web Scraping
- **HTTP Request**: fetch Fiverr search results
- **Dynamic headers**: bypass potential scraping restrictions
- **Targeted search**: UGC-specific gig discovery

🧩 Stage 3: Data Extraction
- **HTML parsing**: extract critical gig information
- **Structured data collection**: gig prices, seller names, gig titles, direct gig URLs

📋 Stage 4: Data Logging
- **Google Sheets integration**: automatic data storage
- **Historical tracking**: build comprehensive gig databases
- **Easy analysis**: spreadsheet-ready format

**Potential Use Cases**

- **Content creators**: market rate research
- **Freelance platforms**: competitive intelligence
- **Marketing agencies**: UGC trend analysis
- **Recruitment specialists**: talent pool mapping
- **Business strategists**: market opportunity identification

**Setup Requirements**

1. Fiverr search configuration: targeted search keywords and specific UGC categories
2. Web scraping preparation: user-agent rotation strategy, potential proxy configuration, robust error handling
3. Google Sheets setup: connected Google account, prepared spreadsheet, appropriate sharing permissions
4. n8n installation: cloud or self-hosted instance, import the workflow configuration, configure API credentials

**Future Enhancement Suggestions**

- 🤖 AI-powered gig trend analysis
- 📊 Advanced data visualization
- 🔔 Real-time price change alerts
- 🧠 Machine learning market predictions
- 🌐 Multi-platform gig tracking

**Ethical Considerations**

- Respect Fiverr's Terms of Service
- Implement responsible scraping practices
- Avoid overwhelming target websites
- Use data for legitimate research purposes

**Technical Recommendations**

- Implement exponential backoff for requests (see the sketch at the end of this description)
- Use randomized delays between scrapes
- Maintain flexible CSS selector strategies
- Consider rate limiting and IP rotation

**Connect With Me**

Ready to unlock market insights?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your market research with intelligent, automated workflows!
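A sketch of the backoff-with-jitter recommendation, written as a plain JavaScript helper. Adapt it to your environment (in n8n you might instead use the HTTP Request node's built-in retry settings); the URL, User-Agent string, and retry counts are placeholders, not part of the original workflow.

```javascript
// Exponential backoff with randomized jitter between request attempts.
async function fetchWithBackoff(url, maxRetries = 4) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url, {
      headers: { 'User-Agent': 'Mozilla/5.0 (research; contact@example.com)' },
    });
    if (res.ok) return await res.text();

    // Backoff: 1s, 2s, 4s, 8s... plus up to 500ms of random noise.
    const delay = 1000 * 2 ** attempt + Math.random() * 500;
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
  throw new Error(`Failed after ${maxRetries + 1} attempts: ${url}`);
}
```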
by Tony Duffy
**Read and store IoT sensor data with the MQTT Trigger and InfluxDB**

tonyduffy@protonmail.com

This workflow is for users who want a practical example of how to obtain data from remote IoT systems using the MQTT protocol in an n8n environment. The template provides a typical n8n node implementation and the configuration settings necessary to read and store IoT data.

The workflow reads temperature and humidity data from a remote IoT system, in this case a DHT22 sensor connected to an ESP32 microcontroller. The data is parsed into the correct JSON format and then ingested into an InfluxDB data bucket. From there the stored temperature and humidity values can be displayed in real time. The workflow can be easily modified to read data from any MQTT-driven device.

**Remote IoT sensor setup**

The ESP32 controller with the DHT22 sensor runs on a Wokwi simulator. The simulator uses MicroPython to publish an MQTT "wokwi-weather" topic with the temperature and humidity payloads to an online Mosquitto MQTT broker. The n8n MQTT Trigger node subscribes to the topic on the broker and reads the payload values when any changes are published. The Code node then prepares the payload for JSON format. The HTTP Request node ingests the data into an InfluxDB bucket.

**How to customise this workflow to your needs**

Wokwi IoT ESP32 simulator
- You will need to set up a free account at Wokwi.com.
- Once created, search for the project "Micro-Python MQTT Weather Logger (ESP32)".
- With the MQTT weather logger project open, change lines 28 and 29 to the following:

```python
MQTT_CLIENT_ID = ""
MQTT_BROKER = "test.mosquitto.org"
```

- You can then start the simulation by clicking the green arrow; it will connect to the Mosquitto broker and the "wokwi-weather" topic will be published.
- By clicking on the DHT22 sensor, the temperature and humidity bars will appear, and you can change the values to send updated payloads to the broker.

InfluxDB
- You will require access to a functioning InfluxDB database to use this workflow.
- Note: you will have to provide the following for the HTTP Request node to connect to InfluxDB:
  - The URL and port of the desired InfluxDB instance (in this case InfluxDB is running locally on port 8086, i.e. http://localhost:8086).
  - An InfluxDB bucket for the data (in this case the created bucket is named "wokwi-data").
  - The organization ID of the InfluxDB instance, which can be obtained from the InfluxDB admin page.
  - A generated API token to read and write to the InfluxDB bucket, created from the InfluxDB admin page.

n8n workflow
- The MQTT Trigger node is configured to subscribe to the "wokwi-weather" topic on the test Mosquitto MQTT broker. It reads the temperature and humidity data sent by the ESP32.
- The Code node uses JavaScript to move the temperature and humidity payloads into JSON format. This is flexible and can be easily modified (a sketch appears below).
- The HTTP Request node posts the payloads to the InfluxDB bucket.

When the above is configured, the workflow should function correctly.

Thanks to the many who have downloaded this template. Let me know what you would like to build next. Contact me at tonyduffy@protonmail.com
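For reference, here is an illustrative Code node that parses the incoming MQTT payload and prepares it for the write request. Note that InfluxDB v2's write endpoint (POST /api/v2/write?org=...&bucket=wokwi-data, with an "Authorization: Token ..." header) expects line protocol in the request body, so this sketch builds that string from the parsed JSON. The payload field names and the measurement name are assumptions; adjust them to match what the Wokwi simulator actually publishes.

```javascript
// Parse the MQTT message (JSON string or object) and build an InfluxDB
// line-protocol string for the HTTP Request node to post.
const msg = $input.first().json;
const payload =
  typeof msg.message === 'string' ? JSON.parse(msg.message) : msg;

const line =
  'wokwi_weather,source=esp32 ' +
  `temperature=${Number(payload.temp)},humidity=${Number(payload.humidity)}`;

return [{ json: { line } }];
```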