by n8n Team
This workflow automatically adds closed deals from Pipedrive as new customers in Stripe.

Prerequisites
- Pipedrive account and Pipedrive credentials
- Stripe account and Stripe credentials

How it works
- The Pipedrive Trigger node starts the workflow when a deal gets updated in Pipedrive.
- The IF node checks that the deal's current won time differs from the previous one and continues the workflow only if it does.
- The Pipedrive node extracts the organization's details to pass them on.
- The HTTP Request node searches Stripe for a customer matching the organization's details.
- If the customer doesn't exist in Stripe, the Merge node passes the new customer's details on, and the Stripe node creates the customer.
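To make the "search, then create" logic concrete, here is a minimal TypeScript sketch of the equivalent Stripe API calls (the workflow's HTTP Request and Stripe nodes do this for you). The organization name and email arguments are placeholders for the fields your Pipedrive node returns.

```typescript
// Minimal sketch of the "search, then create if missing" logic against the Stripe API.
// Assumes STRIPE_SECRET_KEY is set; field names are illustrative placeholders.
const STRIPE_KEY = process.env.STRIPE_SECRET_KEY!;

async function ensureStripeCustomer(orgName: string, email: string): Promise<string> {
  // 1. Search for an existing customer by name (Stripe customer search API).
  const query = encodeURIComponent(`name:'${orgName}'`);
  const searchRes = await fetch(`https://api.stripe.com/v1/customers/search?query=${query}`, {
    headers: { Authorization: `Bearer ${STRIPE_KEY}` },
  });
  const found = (await searchRes.json()) as { data: Array<{ id: string }> };
  if (found.data.length > 0) return found.data[0].id; // already a Stripe customer

  // 2. Otherwise create a new customer (form-encoded body, as Stripe expects).
  const body = new URLSearchParams({ name: orgName, email });
  const createRes = await fetch("https://api.stripe.com/v1/customers", {
    method: "POST",
    headers: { Authorization: `Bearer ${STRIPE_KEY}` },
    body,
  });
  const created = (await createRes.json()) as { id: string };
  return created.id;
}
```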
by Praveena
Purpose
The purpose of this automation is to help you context-switch from office work to side projects or passion gigs, so you can let go of distracting thoughts and reset your perspective.

Benefits
Anyone who works full time and also does something on the side (a side gig, being a mom, or simply following a passion project) will benefit.

What you need
- n8n (lol)
- Any LLM API key (I used OpenAI GPT-4.1)
- iPhone (Automations and Shortcuts)

Template Setup
1. Set up your LLM API key.
2. Import the template file into a new workflow.
3. On your iPhone, create a new shortcut as shown in the video.
4. Create the automation steps.

Resources
YouTube
by Thomas Janssen
Build an AI Agent that accesses two MCP servers: a RAG MCP server and a Search Engine API MCP server.

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Tutorial
Click here to watch the full tutorial on YouTube!

How it works
We build an AI Agent which has access to two MCP servers:
- An MCP server with a RAG database (click here for the RAG MCP Server)
- An MCP server which can access a search engine, so the AI Agent also has access to data about more current events

Installation
1. In order to use the MCP Client, you also have to use the MCP Server template.
2. Open the "MCP Client: RAG" node and update the SSE Endpoint to point to the MCP Server workflow.
3. Install the "n8n-nodes-mcp" community node via Settings > Community Nodes.
4. ONLY FOR SELF-HOSTING: In Docker, click on your n8n container, navigate to "Exec", and execute the command below to allow community nodes as tools: N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
5. Navigate to Bright Data and create a new "Web Unlocker API" with the name "mcp_unlocker".
6. Open the "MCP Client" node and add your credentials.

How to use it
Run the Chat node and start asking questions.

More detailed instructions
Missed a step? Find more detailed instructions here: Personal Newsfeed With Bright Data and n8n

What is Retrieval Augmented Generation (RAG)?
Large Language Models (LLMs) are trained on data up to a specific cutoff date. Imagine a model trained in December 2023 on data that only goes up to September 2023: it has no knowledge of anything that happened in 2024. Ask that LLM who the Formula 1 World Champion of 2024 was, and it cannot answer. The solution is Retrieval Augmented Generation: the user's question is sent to a semantic database, and the LLM uses the information retrieved from that database to answer the question.

What is Model Context Protocol (MCP)?
MCP is a communication protocol used by AI agents to call tools hosted on external servers. When an MCP client connects to an MCP server, the server provides an overview of all its tools, prompts, and resources. The client (driven by the AI agent) then chooses which tools to call based on the user's request, and the server executes them. An MCP client can communicate with multiple MCP servers, and each server can host multiple tools.
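As a rough sketch of the RAG loop described above (not the MCP server's actual implementation), the TypeScript below embeds the question, retrieves similar documents from a semantic database, and lets the LLM answer from that context. The vectorStore object is a hypothetical stand-in for whatever semantic database you use; the OpenAI endpoints are real, but the model names are interchangeable.

```typescript
// Rough RAG sketch: retrieve context for a question, then answer with an LLM.
// OPENAI_API_KEY is assumed; vectorStore is a hypothetical semantic-database client.
const vectorStore = {
  async search(embedding: number[], topK: number): Promise<string[]> {
    // Query your vector database here; returning an empty list keeps the sketch runnable.
    return [];
  },
};

async function answerWithRag(question: string): Promise<string> {
  const headers = {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    "Content-Type": "application/json",
  };

  // 1. Embed the user's question.
  const embRes = await fetch("https://api.openai.com/v1/embeddings", {
    method: "POST",
    headers,
    body: JSON.stringify({ model: "text-embedding-3-small", input: question }),
  });
  const embedding = (await embRes.json()).data[0].embedding as number[];

  // 2. Retrieve the most similar documents from the semantic database.
  const context = (await vectorStore.search(embedding, 3)).join("\n---\n");

  // 3. Ask the LLM to answer using only the retrieved context.
  const chatRes = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        { role: "system", content: `Answer using only this context:\n${context}` },
        { role: "user", content: question },
      ],
    }),
  });
  return (await chatRes.json()).choices[0].message.content;
}
```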
by Airtop
Extracting Comments from an X Post

Use Case
Engaging with conversations on X (formerly Twitter) is critical for brands and individuals monitoring sentiment, leads, or emerging trends. Manually collecting comments is time-consuming—this automation enables scalable extraction of comment data to inform your outreach or analysis.

What This Automation Does
This automation extracts comments from a specified X post, with the following input parameters:
- **airtop_profile**: The name of your Airtop Profile connected to X.
- **x_post_url**: The URL of the X post to extract comments from.
- **max_number_of_comments**: The maximum number of comments to retrieve.

How It Works
1. Takes input via a form or another workflow.
2. Normalizes the input values.
3. Creates a new browser session using Airtop.
4. Navigates to the provided X post.
5. Uses a prompt to extract up to the specified number of comments, returning:
   - Author name
   - Author profile URL
   - Comment text

Setup Requirements
- Airtop API Key — free to generate.
- An Airtop Profile connected to X (requires a one-time login).

Next Steps
- **Pair with X Monitoring**: Use this with the X monitoring automation to detect relevant posts and extract discussion context automatically.
- **Feed into Analytics**: Combine with summarization or sentiment analysis tools to understand audience response at scale.
- **Export for CRM/BI**: Pipe the structured comment data into your CRM or business intelligence stack for lead tracking or reporting.

Read more about Extracting Comments from X Posts
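For reference, the inputs and the comment records returned by the extraction prompt can be described with the TypeScript shapes below. The input parameters mirror the list above; the interface names and output field names are illustrative, not the exact keys the automation emits.

```typescript
// Illustrative shapes for this automation's inputs and outputs (names are hypothetical).
interface ExtractCommentsInput {
  airtop_profile: string;        // Airtop Profile connected to X
  x_post_url: string;            // URL of the X post to extract comments from
  max_number_of_comments: number;
}

interface ExtractedComment {
  authorName: string;
  authorProfileUrl: string;
  commentText: string;
}

// Example input passed via the form or a calling workflow.
const input: ExtractCommentsInput = {
  airtop_profile: "my-x-profile",
  x_post_url: "https://x.com/someuser/status/1234567890",
  max_number_of_comments: 50,
};
```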
by Yang
Who is this for?
This workflow is perfect for marketers, SEO specialists, product teams, and competitive analysts who want to monitor and summarize public reviews of their competitors. It's especially helpful for small teams who want fast insights from Google reviews without spending hours manually reading and sorting them.

What problem is this workflow solving?
Manually going through competitor reviews is time-consuming and repetitive. You risk missing patterns or insights, and it's hard to share summaries with your team quickly. This workflow automatically scrapes reviews from Google and generates a structured summary of pain points and positive feedback. That way, you can focus on strategy instead of sorting through dozens of reviews.

What this workflow does
This automation watches for new competitor entries in a Google Sheet, then:
1. Uses Dumpling AI to scrape the latest Google reviews (up to 20) for each business.
2. Splits and cleans the reviews for analysis.
3. Sends them to GPT-4o, which summarizes the most common complaints and praises.
4. Saves the structured result back to the same Google Sheet.
You'll instantly get an overview of what people are saying about any competitor.

Setup
Google Sheet Setup
- Create a Google Sheet with at least one column: Business
- Add names or search queries for the competitors you want to analyze
- Optional: Add columns for Summary of Reviews and Pain Points
Connect Dumpling AI
- Sign up at Dumpling AI
- Create an agent using the get-google-reviews endpoint
- Copy your agent key
- Use it in the HTTP Request node in this workflow
OpenAI Setup
- Use your API key with GPT-4o access
- The prompt is already structured to generate grouped summaries from reviews
Run the Workflow
- Trigger it manually or schedule it
- Make sure your Google Sheets, OpenAI, and Dumpling AI connections are active

How to customize this workflow to your needs
- Expand the number of reviews retrieved by changing the Dumpling AI agent config
- Replace Google Sheets with Airtable if you want more robust data views
- Add more fields like star ratings or review dates in your agent for richer analysis
- Change the GPT prompt to highlight emotional tone, urgency, or feature mentions

🧠 Node Details
- **Google Sheets Trigger**: Watches for new competitor names
- **HTTP Request (Dumpling AI)**: Scrapes 20 recent reviews from Google
- **SplitOut Node**: Breaks the review array into individual items
- **Code Node**: Extracts and combines review text (see the sketch below)
- **Edit Fields Node**: Structures the review content before GPT
- **GPT-4o Node**: Analyzes and summarizes top pain points and praise
- **Google Sheets Output**: Saves the summary back to the same sheet

Dependencies
- Dumpling AI account and review scraping agent setup
- OpenAI API key with GPT-4o access
- Google Sheets OAuth2 credentials
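As an illustration of the Code node step, here is a small TypeScript sketch that flattens the scraped reviews into one block of text for the GPT-4o prompt. The text and rating field names are assumptions about the Dumpling AI response shape and may differ in your agent.

```typescript
// Sketch of the "extract and combine review text" step (field names are assumed).
interface ScrapedReview {
  text: string;
  rating?: number;
}

function combineReviews(reviews: ScrapedReview[]): string {
  return reviews
    .map((r, i) => `${i + 1}. ${r.rating != null ? `[${r.rating}/5] ` : ""}${r.text.trim()}`)
    .join("\n");
}

// The combined string is then sent to GPT-4o to summarize pain points and praise.
const prompt = `Summarize the main complaints and praises in these reviews:\n${combineReviews([
  { text: "Great service but delivery was slow.", rating: 4 },
  { text: "Support never answered my emails.", rating: 2 },
])}`;
```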
by Angel Menendez
Enhance Query Resolution with the Knowledge Base Tool!
Our KB Tool - Confluence KB is crafted to seamlessly integrate into the IT Ops AI SlackBot Workflow, enhancing the IT support process by enabling sophisticated search and response capabilities via Slack.

Workflow Functionality:
- **Receive Queries**: Directly accepts user queries from the main workflow, initiating a dynamic search process.
- **AI-Powered Query Transformation**: Uses OpenAI's models or a local AI to refine user queries into the searchable keywords most likely to retrieve relevant information from the Knowledge Base.
- **Confluence Integration**: Executes searches within Confluence using the refined keywords to find the most applicable articles and information.
- **Deliver Accurate Responses**: Gathers essential details from the Confluence results, including article titles, links, and summaries, preparing them to be sent back to the parent workflow for the final user response.

To view a demo video of this workflow in action, click here.

Quick Setup Guide:
1. Ensure the correct configurations are set for the OpenAI and Confluence API integrations.
2. Customize the query transformation logic to your specific Knowledge Base structure to improve search accuracy.

Need Help?
Dive into our Documentation or get support from the Community Forum!

Deploy this tool to provide precise and informative responses, significantly boosting the efficiency and reliability of your IT support workflow.
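To make the Confluence step concrete, here is a minimal TypeScript sketch of searching Confluence Cloud with the keywords produced by the query-transformation step. The base URL, credentials, and CQL expression are placeholders to adapt to your instance.

```typescript
// Minimal sketch of a keyword search against Confluence Cloud's REST API.
// CONFLUENCE_EMAIL / CONFLUENCE_API_TOKEN and the base URL are placeholders.
async function searchConfluence(keywords: string) {
  const auth = Buffer.from(
    `${process.env.CONFLUENCE_EMAIL}:${process.env.CONFLUENCE_API_TOKEN}`
  ).toString("base64");

  const cql = encodeURIComponent(`type = page AND text ~ "${keywords}"`);
  const res = await fetch(
    `https://your-domain.atlassian.net/wiki/rest/api/content/search?cql=${cql}&limit=5`,
    { headers: { Authorization: `Basic ${auth}` } }
  );
  const data = (await res.json()) as {
    results: Array<{ title: string; _links: { webui: string } }>;
  };

  // Return title + link pairs, ready to be summarized for the Slack reply.
  return data.results.map((r) => ({
    title: r.title,
    url: `https://your-domain.atlassian.net/wiki${r._links.webui}`,
  }));
}
```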
by n8n Team
This workflow syncs Outlook Calendar events to a Notion database. An Outlook Calendar event must fall within a specific time frame (by default, within the next year) for the workflow to pick it up. The event subject becomes the title of the Notion page, and the event link is added to the Notion page as a property.

Prerequisites
- Notion account and Notion credentials.
- Microsoft account and Microsoft credentials.

How it works
1. On scheduled intervals, find all Outlook Calendar events within the specified time frame.
2. For each event, check if the event already exists in the Notion database.
3. If it does not exist, create a new page in the Notion database; otherwise, update the existing page.

Setup
This workflow requires that you set up a Notion database, or use an existing one, with at least the following fields:
- Title (title)
- Date (date)
- Event ID (text)
- Link (URL)
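The "check if the event already exists" step can be pictured with the TypeScript sketch below, which queries the Notion database by the Event ID property and returns the matching page, if any. The property name follows the setup above; the database ID, credentials, and helper name are placeholders.

```typescript
// Sketch of the "does this event already exist in Notion?" check (placeholders throughout).
const NOTION_HEADERS = {
  Authorization: `Bearer ${process.env.NOTION_API_KEY}`,
  "Notion-Version": "2022-06-28",
  "Content-Type": "application/json",
};

async function findNotionPageForEvent(databaseId: string, eventId: string): Promise<string | null> {
  const res = await fetch(`https://api.notion.com/v1/databases/${databaseId}/query`, {
    method: "POST",
    headers: NOTION_HEADERS,
    body: JSON.stringify({
      filter: { property: "Event ID", rich_text: { equals: eventId } },
    }),
  });
  const data = (await res.json()) as { results: Array<{ id: string }> };
  // Returns the existing page ID (to update) or null (to create a new page).
  return data.results.length > 0 ? data.results[0].id : null;
}
```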
by Jihene
AI-Agent Code Review for GitHub Pull Requests

Description
This n8n workflow automates the process of reviewing code changes in GitHub pull requests using an OpenAI-powered agent. It connects to your GitHub repo, extracts modified files, analyzes diffs, and uses an AI agent to generate a code review based on your internal code best practices (fed from a Google Sheet). It ends by posting the review as a comment on the PR and tagging it with a visual label like ✅ Reviewed by AI.

🔧 What It Does
- Triggered on PR creation
- Extracts code diffs from the PR
- Formats and feeds them into an OpenAI prompt
- Enriches the prompt using a Google Sheet of Swift best practices
- Posts an AI-generated review as a comment on the PR
- Applies a PR label to visually mark reviewed PRs

✅ Prerequisites
Before deploying this workflow, ensure you have the following:
- n8n instance (self-hosted or Cloud)
- GitHub repository with PR activity
- **OpenAI API Key** for GPT-4o, GPT-4-turbo, or GPT-3.5
- **GitHub OAuth App** (or PAT) connected to n8n to post comments and access PR diffs
- (Optional) Google Sheets API credentials if using the code best practices lookup node

⚙️ Setup Instructions
1. Import the Workflow
In n8n, click Workflows → Import from file or JSON, then paste or upload the JSON code of this template.
2. Configure Triggers and Connections
🔁 GitHub Trigger
- Node: PR Trigger
- Repository: Select the GitHub repo(s) to monitor
- Events: Set to pull_request
- Auth: Use GitHub OAuth2 credentials
📥 HTTP Request
- Node: Get file's Diffs from PR
- No authentication needed; it uses a dynamic path from the trigger
🧠 OpenAI Model
- Node: OpenAI Chat Model
- Model: Select gpt-4o, gpt-4-turbo, or gpt-3.5-turbo
- Credential: Provide your OpenAI API key
🧑‍💻 Code Review Agent
- Node: Code Review Agent
- Connected to OpenAI and, optionally, to tools like Google Sheets
💬 GitHub Comment Poster
- Node: GitHub Robot
- Uses the GitHub API to post review comments back on the PR
- Credential: Use the agent's GitHub account (OAuth or PAT)
- Repo: Pick your own GitHub repository
🏷️ PR Labeler (optional)
- Node: Add Label to PR
- Adds the label ReviewedByAI after a successful comment
- Label: You can customize the label text of your own tag
📊 Google Sheet Best Practices Config (optional)
- Connects to a Google Sheet for coding guideline lookups; you can replace Google Sheets with another tool or database
- First, prepare your best-practices list with a clear description and good/bad code examples
- Add all the best practices to your Google Sheet
- Configure the Code Best Practices node in the template:
  - Credential: Use your Google Sheets account via OAuth2
  - URL: Add your Google Sheet document URL
  - Sheet: Add the name of the best-practices sheet
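For orientation, here is a hedged TypeScript sketch of the GitHub API calls these nodes correspond to: fetching the changed files of a PR, posting the review comment, and adding the label. Owner, repo, and token are placeholders, and the review text is whatever the AI agent produced.

```typescript
// Sketch of the GitHub calls this workflow makes (owner/repo/token are placeholders).
const GH_HEADERS = {
  Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
  Accept: "application/vnd.github+json",
};

// 1. Get the diffs of the files changed in the pull request.
async function getPrFiles(owner: string, repo: string, prNumber: number) {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/pulls/${prNumber}/files`,
    { headers: GH_HEADERS }
  );
  return (await res.json()) as Array<{ filename: string; patch?: string }>;
}

// 2. Post the AI-generated review as a PR comment, then tag the PR.
async function postReview(owner: string, repo: string, prNumber: number, review: string) {
  await fetch(`https://api.github.com/repos/${owner}/${repo}/issues/${prNumber}/comments`, {
    method: "POST",
    headers: GH_HEADERS,
    body: JSON.stringify({ body: review }),
  });
  await fetch(`https://api.github.com/repos/${owner}/${repo}/issues/${prNumber}/labels`, {
    method: "POST",
    headers: GH_HEADERS,
    body: JSON.stringify({ labels: ["ReviewedByAI"] }),
  });
}
```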
by Mariano Kostelec
A fully automated content engine that researches, writes, scores, and visualizes LinkedIn posts — built with n8n, OpenAI, Perplexity, and Replicate.

What it does
✅ Researches any topic using real-time data
✅ Writes a personalized post in your voice
✅ Refines tone and structure
✅ Generates abstract, high-quality visual assets
✅ Scores the output and saves it to Google Sheets

How it works
1. Triggered when you change a row status in Google Sheets
2. Uses Perplexity to research the topic
3. Uses GPT-4o (OpenAI) to create and polish the content
4. Uses Replicate (FLUX Pro) to generate images
5. Scores the post using heuristics
6. Appends everything back to your sheet
by AlQaisi
Template for Kids' Stories in Arabic
The n8n template for creating kids' stories in Arabic offers a versatile platform for storytellers to captivate young audiences with educational and interactive tales. It allows for customization to suit various use cases and can be set up effortlessly. Check this example: https://t.me/st0ries95

Use Cases
- Educational Platforms: Educational platforms can automate the creation and distribution of educational stories in Arabic for children using this template. By incorporating visual and auditory elements into the storytelling process, educational platforms can enhance learning experiences and engage young learners effectively.
- Children's Libraries: Children's libraries can utilize this template to curate and share a diverse collection of Arabic stories with young readers. The automated generation of visual content and audio files enhances the storytelling experience, encouraging children to immerse themselves in new worlds and characters through captivating narratives.
- Language Learning Apps: Language learning apps focused on Arabic can integrate this template to offer culturally rich storytelling experiences for children learning the language. By translating stories into Arabic and supplementing them with visual and auditory components, these apps can facilitate language acquisition in an enjoyable and interactive manner.

Configuration Guide for Nodes
OpenAI Chat Model Nodes
- **Functionality**: Allows interaction with the OpenAI GPT-4 Turbo model.
- **Purpose**: Enables communication with advanced chat capabilities.
Create a Prompt for DALL-E Node
- **Customization**: Tailor prompts for generating relevant visual content.
- **Summarization**: Define prompts for visual content generation without text.
Generate an Image for the Story Node
- **Resource Type**: Specifies image as the resource.
- **Prompt Setup**: Configures the prompt for textless image creation within the visual content.
Generate Audio for the Story Node
- **Resource Type**: Chooses audio as the resource.
- **Input Definition**: Sets the input text for audio file generation.
Translate the Story to Arabic Node
- **Chunking Mode Selection**: Allows choosing an advanced chunking mode.
- **Summarization Configuration**: Sets the method and prompts for translating the story into Arabic.
Send the Story To Channel Node
- **Channel ID**: Specifies the channel ID for sending the story text.
- **Text Configuration**: Sets up the text to be sent to the channel.

By following these node descriptions, users can effectively configure the n8n template for kids' stories in Arabic, tailoring it to specific use cases for a seamless and engaging storytelling experience for young audiences.
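As a rough sketch of what the image and audio nodes do under the hood, the TypeScript below issues one OpenAI image-generation request and one text-to-speech request. Model names, voice, and prompts are placeholders; the template's nodes handle these calls for you.

```typescript
// Sketch of the image and audio generation steps (models and prompts are placeholders).
const OPENAI_HEADERS = {
  Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  "Content-Type": "application/json",
};

// Generate a textless illustration for the story.
async function generateStoryImage(prompt: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/images/generations", {
    method: "POST",
    headers: OPENAI_HEADERS,
    body: JSON.stringify({ model: "dall-e-3", prompt, n: 1, size: "1024x1024" }),
  });
  return (await res.json()).data[0].url as string;
}

// Generate an audio narration of the Arabic story text.
async function generateStoryAudio(storyText: string): Promise<ArrayBuffer> {
  const res = await fetch("https://api.openai.com/v1/audio/speech", {
    method: "POST",
    headers: OPENAI_HEADERS,
    body: JSON.stringify({ model: "tts-1", voice: "alloy", input: storyText }),
  });
  return res.arrayBuffer(); // binary audio data, ready to send to the Telegram channel
}
```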
by Keith Rumjahn
WordPress Post Auto-Categorization Workflow

💡 Click here to read the detailed case study
📺 Click here to watch the YouTube tutorial

🎯 Purpose
Automatically categorize WordPress blog posts using AI, saving hours of manual work. This workflow analyzes your post titles and assigns them to predefined categories using artificial intelligence.

🔄 What This Workflow Does
• Connects to your WordPress site
• Retrieves all uncategorized posts
• Uses AI to analyze post titles
• Automatically assigns appropriate category IDs
• Updates posts with new categories
• Processes dozens of posts in minutes

⚙️ Setup Requirements
1. WordPress site with admin access
2. Predefined categories in WordPress
3. OpenAI API credentials (or your preferred AI provider)
4. n8n with WordPress credentials

🛠️ Configuration Steps
1. Add your WordPress categories (manually, in WordPress)
2. Note down the category IDs
3. Update the AI prompt with your category IDs
4. Configure WordPress credentials in n8n
5. Set up the AI API connection

🔧 Customization Options
• Modify AI prompts for different categorization criteria
• Adjust for multiple category assignments
• Add tag generation functionality
• Customize for different content types
• Add additional metadata updates

⚠️ Important Notes
• Back up your WordPress database before running
• Test with a few posts first
• Review AI categorization results initially
• Categories must be created manually first

🎁 Bonus Features
• Can be modified for tag generation
• Works with scheduled posts
• Handles bulk processing
• Maintains categorization consistency

Perfect for content managers, bloggers, and website administrators looking to organize their WordPress content efficiently.

#n8n #WordPress #ContentManagement #Automation #AI

Created by rumjahn
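To illustrate the WordPress REST calls behind this workflow (the WordPress nodes handle them for you), here is a TypeScript sketch that fetches posts from the default "Uncategorized" category and reassigns one to the category ID chosen by the AI. The site URL and credentials are placeholders, and category ID 1 is only WordPress's usual default for "Uncategorized"; verify it on your site.

```typescript
// Sketch of the WordPress REST calls behind this workflow (placeholders throughout).
const WP_BASE = "https://example.com/wp-json/wp/v2";
const WP_AUTH = "Basic " + Buffer.from("admin:application-password").toString("base64");

// 1. Fetch posts still sitting in the default "Uncategorized" category (usually ID 1).
async function getUncategorizedPosts() {
  const res = await fetch(`${WP_BASE}/posts?categories=1&per_page=50`, {
    headers: { Authorization: WP_AUTH },
  });
  return (await res.json()) as Array<{ id: number; title: { rendered: string } }>;
}

// 2. Update a post with the category ID the AI picked from your predefined list.
async function assignCategory(postId: number, categoryId: number) {
  await fetch(`${WP_BASE}/posts/${postId}`, {
    method: "POST",
    headers: { Authorization: WP_AUTH, "Content-Type": "application/json" },
    body: JSON.stringify({ categories: [categoryId] }),
  });
}
```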
by Dmytro
AI-Powered Product Assistant for E-commerce

Transform your online store's customer service with an intelligent AI assistant that automatically processes customer inquiries, searches your product database, and provides personalized responses about product availability, pricing, and specifications. Perfect for shoe stores, fashion retailers, and any business with an extensive product catalog. This workflow eliminates manual customer service work while increasing response speed and accuracy.

How it works
1. Customer sends a product inquiry via webhook (Instagram DM, website chat, or messaging app)
2. AI extracts key product details (brand, model, size, color) from the natural-language text
3. The system searches your Google Sheets product database with smart filtering
4. AI generates a friendly, personalized response with availability, pricing, and stock information
5. An automatic response is sent back to the customer with product details or alternatives

Example
Customer inquiry: "Do you have Nike Air Max 40 size?"
AI response: "Nike Air Max 90, size 40 - in stock 3 pieces, price 120$"

Set up steps
1. Prepare your product database - Create a Google Sheet with columns: Brand, Model, Size, Color, Price, Quantity
2. Configure AI settings - Connect the OpenAI API for natural language processing
3. Set up the webhook endpoint - Configure the trigger for your messaging platform (Instagram, Telegram, website chat)
4. Test with sample inquiries - Verify the AI correctly parses requests and finds products
5. Deploy and monitor - Launch your automated assistant and track performance

Time investment: 30-45 minutes of setup; works immediately with any product catalog up to 1000+ items.
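To make the matching step concrete, here is a TypeScript sketch of what happens after the AI has extracted the structured query: filter the product rows from Google Sheets and build the reply. The row fields mirror the sheet columns described above; the parsed-inquiry shape and helper names are assumptions.

```typescript
// Sketch of the product lookup after the AI has parsed the customer's message.
// Row fields mirror the Google Sheet columns; the query shape is an assumption.
interface ProductRow {
  Brand: string;
  Model: string;
  Size: string;
  Color: string;
  Price: number;
  Quantity: number;
}

interface ParsedInquiry {
  brand?: string;
  model?: string;
  size?: string;
  color?: string;
}

function findMatches(rows: ProductRow[], q: ParsedInquiry): ProductRow[] {
  return rows.filter(
    (r) =>
      (!q.brand || r.Brand.toLowerCase().includes(q.brand.toLowerCase())) &&
      (!q.model || r.Model.toLowerCase().includes(q.model.toLowerCase())) &&
      (!q.size || r.Size === q.size) &&
      (!q.color || r.Color.toLowerCase() === q.color.toLowerCase())
  );
}

// Example: "Do you have Nike Air Max 40 size?"
const reply = findMatches(
  [{ Brand: "Nike", Model: "Air Max 90", Size: "40", Color: "White", Price: 120, Quantity: 3 }],
  { brand: "Nike", model: "Air Max", size: "40" }
)
  .map((r) => `${r.Brand} ${r.Model}, size ${r.Size} - in stock ${r.Quantity} pieces, price ${r.Price}$`)
  .join("\n");
```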