by Not Another Marketer
Your Landing Page is Leaking Sales—Here's How to Fix It in Seconds

Visitors land on your page. But instead of converting, they bounce. Why? Something's broken. Something's missing. But what?

❌ Is your CTA too weak?
❌ Is your messaging unclear?
❌ Is your design creating friction?

You know something is off, but you don't know what. What if you could get an instant, expert-level report on exactly what to fix? This workflow runs an AI analysis of your landing page and delivers a CRO audit so you can optimize it.

Who is This For?
- **SaaS Founders & Startups**: Stop leaving money on the table. Make every visitor count.
- **Marketers & Growth Experts**: Turn landing pages into high-converting assets.
- **E-commerce & Lead Gen Businesses**: More conversions = more revenue.

How It Works
1. Paste your URL.
2. Get an instant roast + fix list.
3. Implement changes & watch conversions jump.

The workflow scrapes the URL you input, gets the HTML source code of the landing page, and sends it to an OpenAI AI Agent. The Agent performs a deep analysis, roasts the landing page, and provides 10 Conversion Rate Optimization tips to improve it.

Setup Guide
You will need OpenAI credentials with an API key to run the workflow. The workflow uses the OpenAI o1 model to deliver the best results; it costs between $0.20 and $0.30 per run. You can adjust the prompt to your liking in the AI Agent parameters. Once the workflow has completed, select Logs to get a readable version. Below is an example.
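For the curious, the scraping step boils down to a single HTTP fetch. Here is a minimal sketch as an n8n Code node, assuming the page URL arrives as `url` in the incoming item's JSON (the template itself uses an HTTP Request node for this):

```javascript
// Minimal sketch of the scraping step (illustrative only — the template
// uses an HTTP Request node). Assumes the landing page URL arrives as
// `url` in the incoming item's JSON.
const url = $json.url;

// n8n exposes an HTTP helper inside Code nodes.
const html = await this.helpers.httpRequest({ url, method: 'GET' });

// Hand the raw HTML to the AI Agent node for the CRO audit.
return [{ json: { url, html } }];
```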
by Olek
How it works
This workflow activates and deactivates a selected other workflow on a schedule.

> ⚠️ Warning!
> This approach won't work for trial users, as it requires the n8n API, which is not available to trial users.
> See https://docs.n8n.io/api/ for details.

Set up steps
1. Adjust the activation/deactivation schedule to your needs. A custom (cron) interval is the recommended approach.
2. Set the target Workflow ID. You will find it in the URL of the workflow you want to manage.
3. Set n8n API credentials:
   - Create an API key: how to
   - Create n8n credentials using the API key: how to

This workflow uses the n8n node. #DevOps #workflow-management

Other useful stuff
Need a universal Error workflow to catch both execution and trigger errors? Here you go: Error handling: Send email via Gmail on execution or trigger-level errors.
More stuff by Olek. And do not forget to back up your workflows often by automating it.
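Under the hood, the n8n node calls the public API's activate/deactivate endpoints. A sketch of the raw calls, with the host, API key, and workflow ID as placeholders you would replace with your own values:

```javascript
// What the n8n node does under the hood, sketched as a raw API call.
// N8N_HOST, WORKFLOW_ID and the API key are placeholders.
const N8N_HOST = 'https://your-n8n-domain.com'; // your instance URL
const WORKFLOW_ID = 'abc123';                   // from the workflow's URL
const API_KEY = process.env.N8N_API_KEY;

// action is 'activate' or 'deactivate', depending on the schedule branch.
async function setWorkflowState(action) {
  const res = await fetch(`${N8N_HOST}/api/v1/workflows/${WORKFLOW_ID}/${action}`, {
    method: 'POST',
    headers: { 'X-N8N-API-KEY': API_KEY },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  return res.json();
}

await setWorkflowState('activate');
```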
by Richard Uren
This template extracts all customers from Shopify using GraphQL and the Shopify Admin API and syncs them into a Baserow table.

Setup Notes
- Update the endpoint in the GraphQL node to reflect your Shopify store.
- In Baserow, create a shopify database with a customer table.
- Create columns in the Baserow customer table for first_name, last_name, and email.

It takes about 1 second per row to insert.
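The query the GraphQL node sends looks roughly like the following. This is a sketch against the Shopify Admin GraphQL API; the store domain, API version, and token are placeholders, and the page size is illustrative:

```javascript
// Sketch of the GraphQL request behind this template (placeholders:
// store domain, API version, access token).
const query = `
  query ($cursor: String) {
    customers(first: 100, after: $cursor) {
      pageInfo { hasNextPage endCursor }
      edges {
        node { firstName lastName email }
      }
    }
  }`;

const res = await fetch(
  'https://your-store.myshopify.com/admin/api/2024-04/graphql.json',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Shopify-Access-Token': process.env.SHOPIFY_TOKEN,
    },
    body: JSON.stringify({ query, variables: { cursor: null } }),
  }
);
const { data } = await res.json();
// Each node maps 1:1 onto the Baserow columns first_name, last_name, email.
```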
by Zakaria Ben
This workflow template is designed for dental assistants and anyone looking to automate appointment scheduling. It integrates Google Calendar for booking appointments and Google Sheets as a database to store patient information.

How It Works
1. The user interacts with the chatbot to schedule an appointment.
2. The chatbot collects the necessary details and checks availability via Google Calendar.
3. If the requested time is available, the AI books the appointment. If unavailable, the AI suggests alternative time slots.
4. Once booked, the AI logs the appointment details into Google Sheets for record-keeping.

Setup Instructions
📌 Watch this 🎥 Setup Video for detailed instructions on running and customizing this workflow.

Step 1: Set Up Credentials
- OpenAI API key (for chatbot functionality).
- Google account (for Google Sheets & Google Calendar integration).

Step 2: Choose the Right Tools
- Select the correct Google Calendar in the Google Calendar tool.
- Choose the appropriate Google Sheets file in the Google Sheets tool.

Step 3: Test
Run a test to ensure everything works correctly. Once tested, you're ready to go live.

Example Templates
Below are sample Google Sheets templates to help you get started.
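The availability check in step 2 amounts to an overlap test against the events Google Calendar returns for the requested day. A minimal sketch of that logic, with illustrative field names (the template's Google Calendar tool performs this for the agent):

```javascript
// Minimal sketch of the availability check. Assumes `events` is the list
// returned for the day, each with ISO start/end times (illustrative shape).
function isSlotFree(events, requestedStart, requestedEnd) {
  const start = new Date(requestedStart).getTime();
  const end = new Date(requestedEnd).getTime();
  // The slot is free if it overlaps no existing event.
  return events.every((ev) => {
    const evStart = new Date(ev.start).getTime();
    const evEnd = new Date(ev.end).getTime();
    return end <= evStart || start >= evEnd;
  });
}

// Example: a 30-minute slot checked against two booked appointments.
const booked = [
  { start: '2024-05-01T09:00:00Z', end: '2024-05-01T09:30:00Z' },
  { start: '2024-05-01T11:00:00Z', end: '2024-05-01T11:30:00Z' },
];
console.log(isSlotFree(booked, '2024-05-01T10:00:00Z', '2024-05-01T10:30:00Z')); // true
```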
by Lucas Peyrin
How it works
This workflow converts an HTML string into a polished PDF file using the powerful open-source Gotenberg service. It's designed to be a reusable utility in your automation stack.

1. Receives Input: The workflow is triggered with a JSON object containing the full html code as a string and a desired file_name for the output.
2. Prepares File: It converts the incoming HTML string into a binary index.html file, which is required for the API call.
3. Calls Gotenberg API: It sends the HTML file to a running Gotenberg instance via an HTTP request. It also dynamically sets the output filename and embeds metadata (like Author, Title, and Creation Date) directly into the PDF.
4. Returns PDF: The workflow outputs the final binary PDF file, ready to be saved, sent in an email, or used in the next step of your main workflow.

Set up steps
Setup time: ~3 minutes

This workflow has one critical prerequisite: a running Gotenberg instance that your n8n can connect to.

1. Prerequisite: Run Gotenberg
You need to have the Gotenberg service running. The easiest way is with Docker. Add the following service to your docker-compose.yml file (the same one you use for n8n):

```yaml
services:
  # ... your n8n service ...
  gotenberg:
    image: gotenberg/gotenberg:8
    restart: always
```

Then, restart your stack with docker compose up -d. This makes Gotenberg available at the address http://gotenberg:3000 from within your n8n container.

2. Use as a Sub-Workflow
This workflow is ready to be used as a sub-workflow:
- In your main workflow, add an Execute Sub-Workflow node.
- In the Workflow parameter, select this "Create PDF from HTML" workflow.
- Provide the input data in the required format: a JSON object with html and file_name keys.
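The HTTP request in step 3 targets Gotenberg's Chromium HTML route. A sketch of the equivalent call in plain Node 18+ follows; the form field names and headers follow Gotenberg 8's documentation, but treat this as an outline rather than the template's literal node configuration:

```javascript
// Sketch of the HTTP call to Gotenberg's Chromium route (Node 18+).
const form = new FormData();
form.append(
  'files',
  new Blob(['<h1>Hello PDF</h1>'], { type: 'text/html' }),
  'index.html' // Gotenberg requires the entry file to be named index.html
);
// Optional PDF metadata embedded into the output file.
form.append('metadata', JSON.stringify({ Author: 'n8n', Title: 'My Report' }));

const res = await fetch('http://gotenberg:3000/forms/chromium/convert/html', {
  method: 'POST',
  headers: { 'Gotenberg-Output-Filename': 'my-report' }, // sets the file name
  body: form,
});
const pdfBuffer = Buffer.from(await res.arrayBuffer()); // the binary PDF
```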
by Jimleuk
This template is for self-hosted n8n instances only.

This n8n template demonstrates how to build a simple SQLite MCP server to perform local database operations as well as use it for business intelligence. This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/sqlite

How it works
- An MCP server trigger is used and connected to 5 tools: 2 Code nodes and 3 Custom Workflows.
- The 2 Code node tools use the sqlite3 library and are simple read-only queries, so the Code node tool can be used as-is.
- The 3 Custom Workflow tools are used for select, insert, and update queries, as these are operations that require a bit more discretion. While it may be easier to let the agent write raw SQL queries, it is a little safer to allow only parameters instead. The Custom Workflow tool lets us define this restricted schema for tool input, which we then use to construct the SQL statement ourselves.
- All 3 Custom Workflow tools trigger the same "Execute workflow" trigger in this very template, which has a switch to route the operation to the correct handler.
- Finally, Code nodes handle the select, insert, and update operations. The responses are then sent back to the MCP client.

How to use
This SQLite MCP server allows any compatible MCP client to manage a SQLite database by supporting select, insert, and update operations. You will need to have a SQLite database available before you can use this server.
- Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
- Try the following queries in your MCP client:
  - "Please create a table to store business insights and add the following..."
  - "What business insights do we have on current retail trends?"
  - "Who has contributed the most business insights in the past week?"

Requirements
- SQLite for the database.
- MCP client or agent for usage, such as Claude Desktop: https://claude.ai/download

Customising this workflow
- If the scope of schemas or tables is too open, try restricting it so the MCP server serves a specific business purpose, e.g. confine querying and editing to HR-only tables before providing access to people in that department.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
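To illustrate the "parameters, not raw SQL" idea, here is a sketch of what the insert handler's Code node could look like. It assumes a self-hosted n8n with NODE_FUNCTION_ALLOW_EXTERNAL=sqlite3 set so the Code node can require the library; the database path, table, and column names are illustrative:

```javascript
// Sketch of the parameterised insert handler (a Code node). The Custom
// Workflow tool hands us structured parameters, never raw SQL.
const sqlite3 = require('sqlite3'); // needs NODE_FUNCTION_ALLOW_EXTERNAL=sqlite3
const db = new sqlite3.Database('/data/insights.db'); // illustrative path

const { topic, insight, author } = $json;

await new Promise((resolve, reject) => {
  // Placeholders (?) keep the agent's input out of the SQL string itself.
  db.run(
    'INSERT INTO business_insights (topic, insight, author) VALUES (?, ?, ?)',
    [topic, insight, author],
    (err) => (err ? reject(err) : resolve())
  );
});

return [{ json: { status: 'ok' } }];
```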
by Jimleuk
This n8n template demonstrates how to build your own GitHub MCP server to personalise it to your organisation's repositories, issues, and pull requests.

This n8n implementation, though not as fully featured as the official MCP server offered by GitHub, allows you to control precisely what access and/or functionality is granted to users, which can make MCP use simpler and, in some cases, more secure. The use case in this template is simply to view and comment on issues within a specific repository, but it can be extended to meet the needs of your team.

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/github

How it works
- An MCP server trigger is used and connected to 3 Custom Workflow tools. We're using Custom Workflow tools as quite a few nodes are required for each task.
- Behind these tools are regular GitHub nodes, preconfigured with credentials and the targeted repository.
- The "Get Issue Comments" and "Create Issue Comment" tools depend on obtaining an issue number first. The agent should call the "Get Latest Issues" tool for this.

How to use
This GitHub MCP server allows any compatible MCP client to view and comment on GitHub issues. You will need a GitHub account and repository access before you can use this server.
- Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
- Try the following queries in your MCP client:
  - "Can you get me the latest issues about MCP?"
  - "What is the current progress on Issue 12345?"
  - "Please can you add a comment to Issue 12345 that they should try installing the latest version and see if that works?"

Requirements
- GitHub for account and repository access. The repository need not be your own, but you'll still need to ensure you have the correct permissions.
- MCP client or agent for usage, such as Claude Desktop: https://claude.ai/download

Customising this workflow
- Extend this template to interact with pull requests or workflows within your own company's GitHub repositories. Alternatively, pull in metrics and generate reports for programme managers.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
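Behind the GitHub node, the "Get Latest Issues" tool amounts to a call to GitHub's REST API. A sketch, with the owner and repo as placeholders:

```javascript
// What "Get Latest Issues" boils down to: a GitHub REST call
// (your-org/your-repo are placeholders).
const res = await fetch(
  'https://api.github.com/repos/your-org/your-repo/issues?state=open&sort=created&direction=desc&per_page=10',
  {
    headers: {
      Accept: 'application/vnd.github+json',
      Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
    },
  }
);
const issues = await res.json();

// The agent needs issue numbers before it can fetch or create comments.
const summary = issues.map((i) => ({ number: i.number, title: i.title }));
```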
by Guillaume Duvernay
This template provides a fully automated system for monitoring news on any topic you choose. It leverages Linkup's AI-powered web search to find recent, relevant articles; extracts key information like the title, date, and summary; and then neatly organizes everything in an Airtable base. Stop manually searching for updates and let this workflow deliver a curated news digest directly to your own database, complete with a Slack notification to let you know when it's done. This is the perfect solution for staying informed without the repetitive work.

Who is this for?
- **Marketing & PR professionals:** Keep a close eye on industry trends, competitor mentions, and brand sentiment.
- **Analysts & researchers:** Effortlessly gather source material and data points on specific research topics.
- **Business owners & entrepreneurs:** Stay updated on market shifts, new technologies, and potential opportunities without dedicating hours to reading.
- **Anyone with a passion project:** Easily follow developments in your favorite hobby, field of study, or area of interest.

What problem does this solve?
- **Eliminates manual searching:** Frees you from the daily or weekly grind of searching multiple news sites for relevant articles.
- **Centralizes information:** Consolidates all relevant news into a single, organized, and easily accessible Airtable database.
- **Provides structured data:** Instead of just a list of links, it extracts and formats key information (title, summary, URL, date) for each article, ready for review or analysis.
- **Keeps you proactively informed:** The automated Slack notification ensures you know exactly when new information is ready, closing the loop on your monitoring process.

How it works
1. Schedule: The workflow runs automatically based on a schedule you set (the default is weekly).
2. Define topics: In the Set news parameters node, you specify the topics you want to monitor and the time frame (e.g., news from the last 7 days).
3. AI web search: The Query Linkup for news node sends your topics to Linkup's API. Linkup's AI searches the web for relevant news articles and returns a structured list containing each article's title, URL, summary, and publication date (see the sketch below).
4. Store in Airtable: The workflow loops through each article found and creates a new record for it in your Airtable base.
5. Notify on Slack: Once all the news has been stored, a final notification is sent to a Slack channel of your choice, letting you know the process is complete and how many articles were found.

Setup
1. Configure the trigger: Adjust the Schedule Trigger node to set the frequency and time you want the workflow to run.
2. Set your topics: In the Set news parameters node, replace the example topics with your own keywords and define the news freshness you'd like.
3. Connect your accounts:
   - Linkup: Add your Linkup API key in the Query Linkup for news node. Linkup's free plan includes €5 of credits monthly, enough for about 1,000 runs of this workflow.
   - Airtable: In the Store one news node, select your Airtable account, then choose the Base and Table where you want to save the news.
   - Slack: In the Notify in Slack node, select your Slack account and the channel where you want to receive notifications.
4. Activate the workflow: Toggle the workflow to "Active", and your automated news monitoring system is live!

Taking it further
- **Change your database:** Don't use Airtable? Easily swap the Airtable node for a Notion, Google Sheets, or any other database node to store your news.
- **Customize notifications:** Replace the Slack node with a Discord, Telegram, or Email node to get alerts on your preferred platform.
- **Add AI analysis:** Insert an AI node after the Linkup search to perform sentiment analysis on the news summaries, categorize articles, or generate a high-level overview before saving them.
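For reference, the Linkup request behind the Query Linkup for news node looks roughly like this. The endpoint and field names follow Linkup's public API docs at the time of writing, and the schema is illustrative, so verify against the node's own configuration:

```javascript
// Sketch of the Linkup structured-search call (verify field names
// against Linkup's current docs).
const res = await fetch('https://api.linkup.so/v1/search', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.LINKUP_API_KEY}`,
  },
  body: JSON.stringify({
    q: 'AI automation news from the last 7 days', // your topics + freshness
    depth: 'standard',
    outputType: 'structured',
    // Ask Linkup to return exactly the fields the Airtable step expects.
    structuredOutputSchema: JSON.stringify({
      type: 'object',
      properties: {
        articles: {
          type: 'array',
          items: {
            type: 'object',
            properties: {
              title: { type: 'string' },
              url: { type: 'string' },
              summary: { type: 'string' },
              date: { type: 'string' },
            },
          },
        },
      },
    }),
  }),
});
const { articles } = await res.json();
```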
by Yang
Who is this for?
This workflow is for content creators, social media managers, marketing teams, and virtual assistants who want to automatically repurpose YouTube videos into ready-to-post social media content. If you need to quickly turn long-form videos into short posts for platforms like Instagram, Facebook, or LinkedIn, this workflow saves you hours of manual work.

What problem is this workflow solving?
Manually extracting ideas from YouTube videos, writing captions, creating images, and preparing social media posts takes a lot of time and effort. This workflow automates the entire process: it reads the video, generates posts with captions and AI images, and organizes everything into Airtable. It lets you focus on growing your audience instead of spending hours repurposing content.

What this workflow does
1. Watches a YouTube channel RSS feed for new videos.
2. Extracts the video transcript automatically using Dumpling AI.
3. Summarizes and transforms the transcript into 3 social media captions (Instagram, Facebook, LinkedIn) using OpenAI.
4. Generates 3 unique AI image prompts.
5. Sends the prompts to Dumpling AI to create realistic social media images.
6. Saves the captions and attaches the AI images in Airtable, ready for posting.

Setup
1. RSS Feed Setup
   - Get your YouTube channel's RSS feed URL.
   - Insert the URL into the RSS Trigger node. This will monitor for new YouTube uploads automatically.
2. Dumpling AI Setup for Transcript Extraction
   - Sign up at Dumpling AI and get your Dumpling AI API key.
   - In the first HTTP Request node after the RSS trigger, insert your API key (use HTTP header authentication). This sends the YouTube URL to Dumpling AI's /extract-transcript endpoint (see the sketch below).
3. OpenAI Setup for Caption and Prompt Generation
   - Get your OpenAI API key and connect your account in the OpenAI node.
   - The AI will generate 3 platform-specific captions and 3 creative prompts to design images related to the video.
4. Edit Fields Node
   - This node organizes the generated captions and prompts into separate fields for easy Airtable mapping. Captions are split for Instagram, Facebook, and LinkedIn.
5. Dumpling AI Setup for AI Image Generation
   - After the Edit Fields node, the second HTTP Request node sends the image prompt to Dumpling AI's /generate-image endpoint. This returns a realistic AI-generated image.
6. Airtable Setup for Saving Posts (Without Image First)
   - Create a new base in Airtable with the following fields: Platform (single select: Instagram, Facebook, LinkedIn), Content (long text), Image (attachment).
   - Connect your Airtable personal access token to the Airtable node. The Airtable node saves the generated captions into separate records, initially without images.
7. Upload Generated Images Back to Airtable
   - The third HTTP Request node PATCHes the Airtable record, updating the Image field with the generated AI image from Dumpling AI.

Credentials Required
- Dumpling AI API key (for transcript extraction and AI image generation)
- OpenAI API key (for caption and prompt creation)
- Airtable personal access token (for inserting and updating records)

How to customize this workflow to your needs
- Change the OpenAI prompt to generate captions in your brand tone (e.g., friendly, professional, witty).
- Modify the image prompts to better match your design style.
- Adjust the Airtable base fields if you want to add more platforms or content formats.
- Add scheduling tools like Buffer or Metricool to automatically post from Airtable.

⚡ Quick Tips
- Make sure Dumpling AI credits are active to allow transcript and image generation.
- Set Airtable permissions properly so PATCH requests can update attachments.
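As a rough guide, the first HTTP Request node's transcript call looks like this. The /extract-transcript path comes from this template's description; the base URL and body field names are assumptions, so check Dumpling AI's docs before relying on them:

```javascript
// Sketch of the transcript request (base URL and field names are
// placeholders — verify against Dumpling AI's documentation).
const res = await fetch('https://app.dumplingai.com/api/v1/extract-transcript', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.DUMPLING_API_KEY}`,
  },
  body: JSON.stringify({
    videoUrl: $json.link, // the video URL from the RSS Trigger item
  }),
});
const { transcript } = await res.json();
// The transcript is then passed to OpenAI for caption and prompt generation.
```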
by darrell_tw
How it works
1. Fetch all workflows from your n8n instance.
2. Filter workflows that contain nodes with a modelId setting.
3. Extract the node names, model IDs, model names, workflow names, and workflow URLs.
4. Save the extracted information into a connected Google Sheet.

Set up steps
1. Connect your n8n API credentials.
2. Connect your Google Sheets account.
3. Replace "Your n8n domain" with your actual domain URL.
4. Use this Google Sheet template to create a new sheet for results.

Setup typically takes 5 minutes. Be cautious: if you have over 100 workflows, performance may be impacted.

Notes
- Sticky notes inside the workflow provide extra guidance.
- This workflow clears old sheet data before writing new results.
- Make sure your n8n instance allows API access.

Result Example
Update: Originally it didn't detect the AI model inside tool nodes. Now it's fixed!
Update 2025-04-29: Supports 1.91.0 with opening the node directly! Optimized the URL with the node ID.
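The filtering and extraction steps could be sketched as a single Code node like the one below. The exact parameter shape varies by node type (modelId may be a plain string or a resource-locator object), so treat this as an outline rather than the template's literal code:

```javascript
// Sketch of the extraction step: walk every workflow returned by the n8n
// API and keep nodes that carry a modelId parameter.
const domain = 'https://your-n8n-domain.com'; // replace with your instance
const rows = [];

for (const item of $input.all()) {
  const wf = item.json;
  for (const node of wf.nodes ?? []) {
    const model = node.parameters?.modelId ?? node.parameters?.model;
    if (!model) continue;
    rows.push({
      json: {
        workflowName: wf.name,
        nodeName: node.name,
        // modelId may be a plain string or a resource-locator object.
        modelId: typeof model === 'object' ? model.value : model,
        workflowUrl: `${domain}/workflow/${wf.id}`,
      },
    });
  }
}

return rows;
```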
by MattF
This workflow helps SEO teams catch top movers in Google Search Console by comparing daily performance across keyword segments like brand, nonbrand, and content categories. More than a routine check, it highlights the queries and pages with the biggest jumps or drops, making it ideal for spotting wins, losses, or unexpected shifts early.

How It Works
1. Runs daily on a scheduled trigger (e.g. every morning).
2. Pulls GSC data for the prior two days (e.g. yesterday vs. the day before).
3. Segments traffic by keyword type or URL pattern (e.g. brand, nonbrand, recipes, blogs, etc.).
4. Calculates changes in clicks, impressions, CTR, and average position.
5. Flags top movers with the biggest positive or negative deltas.
6. Sends structured reports via Slack or email, grouped by segment and sorted by impact.

Setup Steps
1. Connect your Google Search Console account and, optionally, Gmail or Slack.
2. Swap in your own domain(s) and customize the segmentation logic (e.g. brand terms, path filters).
3. By default, the workflow includes Slack alerts, but these can easily be switched to, or combined with, email, webhook, or other channels.

Full setup takes around 15–20 minutes with working GSC credentials.

Note: The "recipes" segment is included as an example of how to segment content. This can be changed to match blog, FAQ, product pages, or any other category.
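The delta calculation in step 4 is simple once the two days of GSC rows are in hand. A sketch, assuming `today` and `yesterday` are arrays of rows keyed by query (field names are illustrative, not the workflow's exact node output):

```javascript
// Sketch of the top-movers calculation. Assumes each row has
// query, clicks, impressions and position fields.
function topMovers(today, yesterday, limit = 10) {
  const prev = new Map(yesterday.map((r) => [r.query, r]));
  return today
    .map((r) => {
      const p = prev.get(r.query) ?? { clicks: 0, impressions: 0, position: 0 };
      return {
        query: r.query,
        clicksDelta: r.clicks - p.clicks,
        impressionsDelta: r.impressions - p.impressions,
        // Negative = ranking improved (position numbers get smaller).
        positionDelta: p.position ? r.position - p.position : null,
      };
    })
    // Sort by absolute click movement so big drops surface alongside big wins.
    .sort((a, b) => Math.abs(b.clicksDelta) - Math.abs(a.clicksDelta))
    .slice(0, limit);
}
```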
by Jimleuk
This n8n template demonstrates how any organisation can quickly and easily build and offer MCP servers to their customers or internal staff to improve productivity. This MCP example uses PayCaptain.com and shows how to create an MCP server that can search for and update employee data.

How it works
- An MCP server trigger is used and connected to 3 Custom Workflow tools: Search Employee, Get Employee by ID, and Update Employee.
- Each tool makes calls to the PayCaptain API to perform its respective task. Extra care is taken to strip out sensitive data and ensure we're not sharing too much.
- The Update Employee tool also guards against updating fields that should preferably remain read-only. When you control the MCP server, you determine the behaviour of the tool.
- Finally, a Google Sheets node is used to log all operations for later audit. This adds a tiny bit of latency but is recommended when sensitive data is being accessed.

How to use
This MCP server allows any compatible MCP client to manage its PayCaptain employee database. You will need a PayCaptain account and developer key to use it.
- Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
- Try the following queries in your MCP client:
  - "When did Sarah start her employment at the company?"
  - "Does Jack work Wednesdays or Fridays?"
  - "Please update Tracy's NI number to ABCD123456"

Requirements
- PayCaptain account and developer key.
- Google Sheets to log actions for later audit.
- MCP client or agent for usage, such as Claude Desktop: https://claude.ai/download

Customising this workflow
- Add or remove employee attributes as required for your use case.
- If Google Sheets is too slow, consider an API call to a faster service to log calls to the MCP server.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
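The stripping and read-only guarding could look like the sketch below. The field names are illustrative (PayCaptain's actual schema may differ), but the pattern is the point: filter responses on the way out, validate changes on the way in:

```javascript
// Sketch of the guard logic inside the Update Employee tool.
// SENSITIVE_FIELDS and READONLY_FIELDS are illustrative, not PayCaptain's
// real schema — adjust to your own policy.
const SENSITIVE_FIELDS = ['salary', 'bankAccount', 'taxCode'];
const READONLY_FIELDS = ['employeeId', 'startDate', 'email'];

// Applied to API responses before they reach the MCP client.
function stripSensitive(employee) {
  const safe = { ...employee };
  for (const field of SENSITIVE_FIELDS) delete safe[field];
  return safe;
}

// Applied to the agent's requested changes before calling the API.
function validateUpdate(changes) {
  const blocked = Object.keys(changes).filter((f) => READONLY_FIELDS.includes(f));
  if (blocked.length > 0) {
    throw new Error(`These fields are read-only: ${blocked.join(', ')}`);
  }
  return changes;
}
```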