by Julian Kaiser
**How it works**

Many users have asked in the support forum about different methods to analyze images and PDF documents with Google Gemini AI in n8n. This workflow answers that question by demonstrating five different approaches:

1. **Single image with auto binary passthrough** - The simplest approach, using the AI Agent's automatic binary handling
2. **Multiple images with predefined prompts** - For customized analysis with different instructions per image
3. **Native n8n item-by-item processing** - For handling multiple items using n8n's standard workflow paradigm
4. **PDF analysis via direct API** - For document analysis and text extraction
5. **Image analysis via direct API** - For direct control over API parameters

Each method has advantages depending on your specific use case, data volume, and customization needs.

**Set up steps**

Setup time: ~5-10 minutes

You'll need:

- A Google Gemini API key
- n8n with HTTP Request and AI Agent nodes

Important: For the HTTP Request nodes making direct API calls to Gemini (methods 3, 4, and 5), you'll need to set up Query Authentication with your Gemini API key. Add a parameter named "key" with your API key as its value in the Query Auth section of these nodes; a sketch of the equivalent raw request follows below.

I'll update this if I find better ways. Also let me know if you know other ways. Eager to learn :)
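For reference, here is a minimal sketch of the kind of raw Gemini request those HTTP Request nodes make (method 5, image analysis). It shows where the "key" query parameter goes; the model name and prompt are assumptions you should adapt:

```typescript
// Minimal sketch of a direct Gemini image-analysis call (method 5).
// Assumes GEMINI_API_KEY is set and `imageBase64` holds a base64-encoded JPEG.
const url =
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent" +
  `?key=${process.env.GEMINI_API_KEY}`; // same "key" query parameter as the n8n Query Auth setup

async function analyzeImage(imageBase64: string): Promise<string> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{
        parts: [
          { text: "Describe this image in detail." },
          { inline_data: { mime_type: "image/jpeg", data: imageBase64 } },
        ],
      }],
    }),
  });
  const data = await res.json();
  // First candidate's text part; production code should also check for errors.
  return data.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
}
```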
by Lucas Peyrin
**How it works**

Ever wonder how to make your workflows smarter? How to handle different types of data in different ways? This template is a hands-on tutorial that teaches you the three most fundamental nodes for controlling the flow of your automations: Merge, IF, and Switch.

To make it easy to understand, we use a simple package sorting center analogy:

- **Data Items** are packages on a conveyor belt.
- **The Merge Node** is where multiple conveyor belts combine into one.
- **The IF Node** is a simple sorting gate with two paths (e.g., "Fragile" or "Not Fragile").
- **The Switch Node** is an advanced sorting machine that routes packages to many different destinations.

This workflow takes you on a step-by-step journey through the sorting center:

1. **Creating Packages**: Three different "packages" (two letters and one parcel) are created using Set nodes.
2. **Merging**: The first Merge node combines all three packages onto a single conveyor belt so they can be processed together.
3. **Simple Sorting**: An IF node checks if a package is fragile. If true, it's sent down one path; if false, it's sent down another.
4. **Re-Grouping**: After being processed separately, another Merge node brings the packages back together. This "Split > Process > Merge" pattern is a critical concept in n8n!
5. **Advanced Sorting**: A Switch node inspects each package's destination and routes it to the correct output (London, New York, Tokyo, or a Default bin). A code sketch of this routing logic follows below.

By the end, you'll see how all packages have been correctly sorted, and you'll have a solid understanding of how to build intelligent, branching logic in your own workflows.

**Set up steps**

Setup time: 0 minutes! This template is a self-contained tutorial and requires zero setup. There are no credentials or external services to configure.

Simply click the "Execute Workflow" button. Follow the flow from left to right, clicking on each node to see its output and reading the detailed sticky notes to understand what's happening at each stage.
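If you prefer reading code to analogies, here is a purely illustrative sketch of what the IF and Switch nodes do conceptually. The package shape and city list are assumptions mirroring the tutorial, not the template's actual data:

```typescript
// Conceptual sketch of the tutorial's two sorting nodes.
interface Package {
  fragile: boolean;
  destination: string;
}

// IF node: one condition, exactly two outputs (true / false).
function sortFragile(pkg: Package): "fragile path" | "normal path" {
  return pkg.fragile ? "fragile path" : "normal path";
}

// Switch node: one value, many outputs, plus a fallback (the "Default bin").
function routeByDestination(pkg: Package): string {
  switch (pkg.destination) {
    case "London":
    case "New York":
    case "Tokyo":
      return pkg.destination;
    default:
      return "Default bin";
  }
}
```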
by n8n Team
This workflow digests mentions of n8n on Reddit that can be sent as a single email or Slack summary each week. We use OpenAI to classify whether a specific Reddit post is really about n8n or not, and then summarise it into a bullet-point sentence.

**How it works**

1. Get posts from Reddit that might be about n8n;
2. Filter for the most relevant posts (posted in the last 7 days, more than 5 upvotes, and original content) - a sketch of this filter follows below;
3. Check if the post is actually about n8n;
4. If it is, categorise it with OpenAI.

Bear in mind: the workflow only considers the first 500 characters of each Reddit post. So if n8n is mentioned after this point, the post won't register as being about n8n.io.

**Next steps**

- Improve the OpenAI Summary node prompt to return cleaner summaries;
- Extend to more platforms/sources - e.g. it would be really cool to monitor larger Slack communities in this way;
- Do some classification on the type of user to highlight users likely to be in our ICP;
- Separate a list of data sources (Reddit, Twitter, Slack, Discord, etc.), extract messages from there and have them go to a sub-workflow for classification and summarisation.
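A sketch of the relevance filter, written as an n8n Code node might express it. The field names (created, ups, is_original_content) follow Reddit's API shape but are assumptions - check your Reddit node's actual output:

```typescript
// Keep only posts from the last 7 days, with more than 5 upvotes,
// that are marked as original content.
const sevenDaysAgo = Date.now() / 1000 - 7 * 24 * 60 * 60;

return $input.all().filter((item) => {
  const post = item.json as any;
  return (
    Number(post.created) > sevenDaysAgo && // posted in the last 7 days
    Number(post.ups) > 5 &&                // more than 5 upvotes
    post.is_original_content === true      // original content only
  );
});
```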
by Mujahid Kabae
**How it works**

This workflow scrapes the latest Artificial Intelligence articles from TechCrunch, then processes and classifies the content using OpenAI and LangChain nodes. The final result is saved to Google Sheets and sent as a summary to a Telegram group.

Workflow logic:

1. **Trigger**: Runs on a schedule, daily at 6 AM Bangkok time.
2. **Scraper**: Extracts URLs and publish dates from TechCrunch's AI category.
3. **Filter**: Only continues if the article is from yesterday, to avoid duplication (see the sketch below).
4. **Content fetch**: Downloads and extracts the article body text.
5. **AI Agent**: Summarizes the article in Thai, scores it using strict journalism criteria (max 100), and categorizes the news into one of 9 predefined categories.
6. **Output**: Saves all structured data to Google Sheets and sends a summary to a Telegram group.

**Set up steps**

🕒 Estimated setup time: 10-15 minutes

1. Connect your credentials: Google Sheets (OAuth2), Telegram, and an OpenAI account (via the LangChain model).
2. Update the Telegram chatId and the Google Sheets documentId/sheetName values.
3. Deploy and activate the workflow. It runs daily without manual intervention.
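A minimal sketch of the "published yesterday" check the Filter step performs, assuming dates are compared in Bangkok time. The publishDate input is an assumption about the scraper's output:

```typescript
// Returns true if the article's publish date falls on yesterday's
// calendar date in the Asia/Bangkok time zone.
function isFromYesterday(publishDate: string): boolean {
  const fmt = (d: Date) =>
    d.toLocaleDateString("en-CA", { timeZone: "Asia/Bangkok" }); // YYYY-MM-DD

  const yesterday = new Date();
  yesterday.setDate(yesterday.getDate() - 1);

  return fmt(new Date(publishDate)) === fmt(yesterday);
}
```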
by Humble Turtle
**Manage Jira Issues with Natural Language via Telegram and GPT-4o**

**Overview**

The Jira Agent is an AI-powered assistant that allows users to interact with Jira directly through the Telegram messaging platform. It leverages OpenAI's GPT-4o model to interpret natural language commands and perform various Jira-related actions. On Telegram, it enables users to create Jira stories by triggering a guided form when prompted with "create story". It also provides more extensive functionality, including creating, updating, searching, and transitioning Jira issues through natural language commands.

**How it works**

Normal interaction: send messages such as "Please give all my issues".

Standardized process for creating stories:

1. Message: "create story"
2. Open the form that Telegram sends back to you
3. Fill in the essential story information in the form
4. The story automatically gets created in your backlog (a sketch of the underlying Jira call follows below).

**Required connections**

To use the Jira Agent effectively, you need:

- A Telegram account. Telegram setup involves deploying the bot and starting a chat; story creation is triggered with a simple text command.
- A connected Jira workspace
- Permissions to create and modify Jira issues
- An OpenAI API key with access to GPT-4o

Detailed configuration instructions are provided in the workflow.

**Setup time**

<15 minutes

**Customising this workflow**

- Try adding more details to the form for more complete Jira ticket creation.
- Try connecting a Google Calendar node to plan your work.
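For context, the "create story" action ultimately maps to a Jira REST call like the sketch below. The site URL, project key, and credentials are placeholders, and in the template the n8n Jira node handles this for you:

```typescript
// Hypothetical sketch of creating a Jira story via the REST API v3.
// JIRA_SITE, JIRA_EMAIL, JIRA_TOKEN, and the "PROJ" key are placeholders.
async function createStory(summary: string): Promise<void> {
  const auth = Buffer.from(
    `${process.env.JIRA_EMAIL}:${process.env.JIRA_TOKEN}`
  ).toString("base64");

  await fetch("https://JIRA_SITE.atlassian.net/rest/api/3/issue", {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      fields: {
        project: { key: "PROJ" },     // your project key
        issuetype: { name: "Story" }, // issue type the form creates
        summary,                      // title from the Telegram form
      },
    }),
  });
}
```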
by Jimleuk
This n8n template demonstrates how you can automate community moderation using human-in-the-loop functionality for Discord.

The use case is detecting and dealing with spam messages in a predefined and consistent way. Human-in-the-loop strikes a balance between overly aggressive bots and the time and effort of the moderation team.

**How it works**

1. A scheduled trigger is used to scan the most recent messages in a Discord channel.
2. Messages are tagged via the "Remove Duplicates" node so they don't get processed again in the future.
3. Messages are grouped by user to minimise the number of notifications sent (see the sketch below).
4. An AI text classifier node is then used to detect spam in each user's messages.
5. When spam is detected, a notification is sent to a moderation channel using the send-and-wait mode for Discord. This notification comes with an n8n form and a dropdown list of predefined actions for dealing with the spam messages.
6. Once sent, the workflow waits until a response is received.
7. Once a moderator selects an action, the workflow continues and carries out the predefined moderation action.

**How to use**

- Depending on how busy your community is and how often it is targeted by spammers, you may need to increase the scheduled interval.
- Add as many or as few moderation actions as required.
- Remember to activate the workflow to get it started.

**Requirements**

- Discord channel for messages to moderate
- OpenAI for text classification

**Customising this template**

- It is possible to cover multiple channels. Add as many as your community needs.
- Not using Discord? The template can also work in Slack or other services that offer the same bot functionality.
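A sketch of the "group messages by user" step, written as an n8n Code node might express it. The author.id and content field names follow Discord's message shape but should be checked against your node's actual output:

```typescript
// Collapse many message items into one item per user, so each user
// triggers at most one moderation notification.
const byUser = new Map<string, string[]>();

for (const item of $input.all()) {
  const msg = item.json as { author: { id: string }; content: string };
  const list = byUser.get(msg.author.id) ?? [];
  list.push(msg.content);
  byUser.set(msg.author.id, list);
}

return [...byUser.entries()].map(([userId, messages]) => ({
  json: { userId, messages },
}));
```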
by Alex Huang
**Use case**

Manually monitoring Reddit for viable business ideas is time-consuming and inconsistent. This workflow automatically analyzes trending Reddit discussions using AI to surface high-potential opportunities, filter irrelevant content, and generate actionable insights - saving entrepreneurs 10+ hours weekly in market research.

**What this workflow does**

This AI-powered workflow automatically collects trending Reddit discussions, analyzes posts for viable business opportunities using GPT-4, applies smart filters to exclude low-value content, and generates scored opportunity reports with market insights. It identifies unmet customer needs through sentiment analysis, prioritizes high-potential ideas using custom criteria, and outputs structured data to Google Sheets for actionable decision-making.

**Setup**

1. Add Reddit, Google, and OpenAI credentials
2. Configure target subreddits in the Subreddit node
3. Run the workflow manually to test it
4. Review the generated opportunity report in Google Sheets

**How to adjust this template**

- **Change data sources**: Replace the Reddit trigger with the Twitter/X or Hacker News API
- **Modify criteria**: Adjust scoring thresholds in the Opportunity Calculator node (see the sketch below)
- **Add integrations**: Create automatic Slack alerts for urgent opportunities
- Generate draft business plans using an AI Document Writer
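An illustrative sketch of the kind of threshold logic an Opportunity Calculator step might apply. The field names, weights, and cutoff are assumptions, not the template's actual scoring:

```typescript
// Hypothetical opportunity scoring: blend AI scores with engagement,
// then keep only posts above a tunable minimum.
interface ScoredPost {
  title: string;
  demandScore: number;    // 0-10, from the AI analysis
  sentimentScore: number; // 0-10, from sentiment analysis
  upvotes: number;
}

function opportunityScore(p: ScoredPost): number {
  // Weighted blend, with a logarithmic bonus for engagement.
  return 0.5 * p.demandScore + 0.3 * p.sentimentScore
       + 0.2 * Math.min(10, Math.log10(p.upvotes + 1) * 5);
}

const MIN_SCORE = 6; // raise or lower to tune how strict the filter is
const keep = (p: ScoredPost) => opportunityScore(p) >= MIN_SCORE;
```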
by Joseph LePage
This workflow template creates an AI agent chatbot with long-term memory and note storage using Google Docs and Telegram integration.

**Google Docs Integration** 📄

- n8n Google Docs Node
- Setup Google Credentials

**Telegram Integration** 💬

- Telegram Setup

**Core Features** 🌟

AI Agent Integration 🤖

- Implements a sophisticated AI agent with memory management capabilities
- Uses GPT-4o-mini and DeepSeek models for intelligent conversation handling
- Maintains context awareness through session management

Memory System 🧠

- Long-term memory storage using Google Docs
- Separate note storage system for specific information
- Window buffer memory for maintaining conversation context
- Intelligent memory retrieval and storage mechanisms

Communication Interface 💬

- Telegram integration for message handling
- Real-time message processing and response generation

**Technical Components** 🔧

Memory Architecture 📚

- Dual storage system separating memories from notes
- Automated memory retrieval before each interaction
- Structured memory saving with timestamps (see the sketch below)

AI Models 🤖

- Primary GPT-4o-mini model for general interactions
- DeepSeek-V3 Chat for specialized processing
- Custom agent system with tool integration

Storage Integration 💾

- Google Docs integration for persistent storage
- Separate document management for memories and notes
- Automated document updates and retrievals
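An illustrative sketch of what a timestamped, dual-type memory entry could look like before being appended to the Google Doc. The exact format the template writes is an assumption:

```typescript
// Hypothetical memory entry format for the Google Docs store.
interface MemoryEntry {
  timestamp: string;        // ISO 8601
  type: "memory" | "note";  // dual storage: memories vs. notes
  content: string;
}

function formatMemory(content: string, type: MemoryEntry["type"]): string {
  const entry: MemoryEntry = {
    timestamp: new Date().toISOString(),
    type,
    content,
  };
  // One line per entry keeps the document easy to retrieve and parse later.
  return `[${entry.timestamp}] (${entry.type}) ${entry.content}`;
}
```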
by Nico Kowalczyk
**Description**

This template facilitates the transfer of a folder, along with all its files and subfolders, within a Nextcloud instance. The Nextcloud user must have access to both the source and destination folders.

While Nextcloud allows folder movement, complications may arise when dealing with external storage that has rate limits. This workflow ensures the individual transfer of each file to avoid exceeding rate limits, which is particularly useful for setups involving external storage with rate limitations.

**How it works**

1. Identify all files and subfolders within the specified source folder.
2. Recursively search within subfolders for additional files.
3. Replicate the folder structure in the target folder.
4. Individually move each identified file to the corresponding location in the target folder (see the sketch below).

**Set up steps**

1. Set Nextcloud credentials for all Nextcloud nodes involved in the process.
2. Edit the trigger settings. Detailed instructions can be found within the respective trigger configuration.
3. Initiate the workflow to commence the folder transfer process.

**Help**

If you need assistance with applying this template, feel free to reach out to me. You can find additional information about me and my services here: https://nicokowalczyk.de/links

I have also produced a video where I explain the workflow and provide an example: https://youtu.be/K1kmG_Q_jRk

Cheers,
Nico Kowalczyk
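A conceptual sketch of the "recurse, replicate structure, move files one by one" pattern. The listFolder/createFolder/moveFile helpers are hypothetical stand-ins for the Nextcloud nodes the workflow actually uses:

```typescript
// Hypothetical helpers standing in for the Nextcloud (WebDAV) operations.
declare function listFolder(path: string): Promise<{ name: string; isFolder: boolean }[]>;
declare function createFolder(path: string): Promise<void>;
declare function moveFile(from: string, to: string): Promise<void>;

async function moveTree(source: string, target: string): Promise<void> {
  await createFolder(target); // replicate the folder in the target first

  for (const entry of await listFolder(source)) {
    if (entry.isFolder) {
      // Depth-first recursion mirrors the subfolder structure.
      await moveTree(`${source}/${entry.name}`, `${target}/${entry.name}`);
    } else {
      // One file per request keeps each call small, so external-storage
      // rate limits are far less likely to be hit than with a bulk move.
      await moveFile(`${source}/${entry.name}`, `${target}/${entry.name}`);
    }
  }
}
```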
by Chris Carr
**Split Test Agent Prompts with Supabase and OpenAI**

**Use Case**

Oftentimes, it's useful to test different settings for a large language model in production against various metrics. Split testing is a good method for doing this.

**What it Does**

This workflow randomly assigns chat sessions to one of two prompts: the baseline and the alternative. The agent will use the same prompt for all interactions in that chat session.

**How it Works**

1. When a message arrives, a table containing the session ID and which prompt to use is checked to see if the chat session already exists.
2. If it does not, the session ID is added to the table and a prompt is randomly assigned (see the sketch below).
3. These values are then used to generate a response.

**Setup**

1. Create a table in Supabase called split_test_sessions. It needs the following columns: session_id (text) and show_alternative (bool).
2. Add your Supabase, OpenAI, and PostgreSQL credentials.
3. Modify the Define Path Values node to set the baseline and alternative prompt values.
4. Activate the workflow and test by sending messages through n8n's built-in chat.
5. Experiment with different chat sessions to see both prompts in action.

**Next Steps**

- Modify the workflow to test different LLM settings, such as temperature.
- Add a method to measure the efficacy of the two alternative prompts.
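A minimal sketch of the session-assignment logic, assuming the split_test_sessions table described above (the workflow itself does this with Supabase/PostgreSQL nodes rather than code):

```typescript
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_KEY!
);

// Returns whether this session should see the alternative prompt,
// assigning and persisting a random choice the first time it's seen.
async function getPromptVariant(sessionId: string): Promise<boolean> {
  const { data } = await supabase
    .from("split_test_sessions")
    .select("show_alternative")
    .eq("session_id", sessionId)
    .maybeSingle();

  if (data) return data.show_alternative; // existing session keeps its prompt

  const showAlternative = Math.random() < 0.5; // fair coin flip
  await supabase
    .from("split_test_sessions")
    .insert({ session_id: sessionId, show_alternative: showAlternative });
  return showAlternative;
}
```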
by David Olusola
📊 Google Sheets MCP Workflow – AI Meets Spreadsheets! 😄

**✨ What It Does**

This n8n workflow lets you chat with your spreadsheets using AI + MCP! From reading and updating data to creating sheets, it's your smart assistant for Google Sheets 📈🤖

**🚀 Cool Features**

- 💬 Natural language commands (e.g. "Add a new lead: John Doe")
- ✏️ Full CRUD (Create, Read, Update, Delete)
- 🧠 AI-powered analysis & smart workflows
- 🗂️ Multi-sheet support
- 🔗 Works with ChatGPT, Claude, and more (via MCP)

**💡 Use Cases**

- Data tasks: "Update status to 'Done' in row 3"
- Sheet ops: "Create a 'Marketing 2024' sheet"
- Business flows: "Summarize top sales from Q2"

**🛠️ Quick Setup**

1. Import the workflow into n8n: copy the JSON, then in n8n → Import JSON → Paste & Save ✅
2. Connect Google Sheets: create a project in Google Cloud, enable the Sheets & Drive APIs, create OAuth2 credentials, then in n8n → Add Google Sheets OAuth2 credential → Connect 🔐
3. Add your credentials: get your credential ID, then open each Google Sheets node → update it with your new credential ID
4. Link to AI (optional 😊): the MCP webhook is pre-set; plug it into your AI tool (like ChatGPT), send a test command (see the sketch below) → watch the magic happen ✨

**✅ Test It Out**

Try these fun commands:

- 🆕 "Add entry: Jane Doe, janed@example.com"
- 🔍 "Read data from Sales 2024"
- 🧹 "Clear data from A1:C5"
- ➕ "Create sheet 'Budget 2025'"
- ❌ "Delete sheet 'Test'"

**🧠 MCP Command List (AI-Callable Functions)**

These are the tasks the AI can perform via MCP:

- Add a new entry to a sheet
- Read data from a sheet
- Update a row in a sheet
- Delete a row from a sheet
- Create a new sheet
- Delete an existing sheet
- Clear data from a specific range
- Summarize data from a sheet using AI

**⚙️ Tips & Fixes**

- OAuth2 errors? Re-authenticate and check scopes; confirm the redirect URI is exact.
- Permissions? The spreadsheet must be shared with edit access; use service accounts for production.
- Webhook not firing? Double-check the URL and trigger it manually to test.
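A hypothetical sketch of manually testing the webhook with a natural-language command. The URL and payload shape are placeholders - check your workflow's trigger node for the real endpoint and expected body:

```typescript
// Send one test command to the workflow's webhook and print the response.
async function testCommand(): Promise<void> {
  const res = await fetch("https://YOUR_N8N_HOST/webhook/sheets-mcp", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      command: "Add entry: Jane Doe, janed@example.com", // natural language in
    }),
  });
  console.log(await res.json()); // the workflow's structured response
}
```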
by Zacharia Kimotho
**What it does**

This workflow scrapes the top 10 pages on the SERP and conducts an in-depth analysis of the keyword intent for each ranking keyword, saving the information to a Google Sheet for further analysis.

**How does this workflow work?**

1. We add the keywords and country codes we want to monitor and research to a Google Sheet.
2. Run the system.
3. Scrape the top 10 pages.
4. Analyze the intents of the top 10 results and update the Google Sheet.

**Technical Setup**

1. Make a copy of this Google Sheet.
2. Add your desired keywords to the Google Sheet.
3. Map the keyword and country code.
4. Update the zone name to match your zone on Bright Data.
5. Run the scraper (a sketch of the underlying SERP request follows below).
6. Upon successful scraping, we run an intent classifier to determine the intents for each ranking page and update the Google Sheet.

**Setting up the SERP Scraper in Bright Data**

1. On Bright Data, go to the Proxies & Scraping tab.
2. Under SERP API, create a new zone.
3. Give it a suitable name and description. The default is serp_api.
4. Add this to your account.
5. Add your credentials as a header credential.
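A hedged sketch of a Bright Data SERP request for one keyword. The endpoint and payload follow Bright Data's direct API access pattern, but verify them against their current docs, and make sure the zone name matches the one you created:

```typescript
// Fetch the top 10 Google results for a keyword in a given country.
async function fetchSerp(keyword: string, countryCode: string) {
  const searchUrl =
    `https://www.google.com/search?q=${encodeURIComponent(keyword)}` +
    `&gl=${countryCode}&num=10`; // gl = country, num = top 10 results

  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.BRIGHTDATA_API_KEY}`, // header credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ zone: "serp_api", url: searchUrl, format: "raw" }),
  });
  return res.text(); // raw HTML (or JSON, depending on zone settings)
}
```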