by Jeremiah Wright
**Who’s it for**
Recruiters, freelancers, and ops teams who scan job briefs and want quick, relevant n8n template suggestions, saved in a Google Sheet for tracking.

**What it does**
Parses any job text, extracts exactly 5 search keywords, queries the n8n template library, and appends the matched templates (ID, name, description, author) to Google Sheets, including the canonical template URL.

**How it works**
1. Trigger receives a message or pasted-in job brief.
2. An LLM agent returns 5 concise search terms (JSON).
3. For each keyword, an HTTP request searches the n8n templates API.
4. Results are split and written to Google Sheets; the workflow builds the public URL from ID + slug.

**Set up**
1. Add credentials for OpenAI (or swap the LLM node to your provider).
2. Create a Google Sheet with columns: Template ID, Name, User, Description, URL.
3. In the ⚙️ Config node, set: GOOGLE_SHEETS_DOC_ID, GOOGLE_SHEET_NAME, N8N_TEMPLATES_API_URL.

**Requirements**
- n8n (cloud or self-hosted)
- OpenAI (or alternative LLM) credentials
- Google Sheets OAuth credentials

**Customize**
- Change the model/system prompt to tailor keyword extraction.
- Swap Google Sheets for Airtable/Notion.
- Extend filters (e.g., only AI/CRM templates) before writing rows.
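The final step builds the public template URL from the template ID and name. A minimal Python sketch of that step, assuming n8n's public gallery pattern `https://n8n.io/workflows/<id>-<slug>/` (the slug rule below is an assumption, not taken from the workflow itself):

```python
import re

def template_url(template_id: int, name: str) -> str:
    """Build a public n8n template URL from an ID and template name.

    The slug format (lowercased name, runs of non-alphanumerics
    collapsed to single hyphens) is an assumption modeled on the
    public gallery's URL style.
    """
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return f"https://n8n.io/workflows/{template_id}-{slug}/"
```

For example, `template_url(1954, "AI Agent Chat")` yields `https://n8n.io/workflows/1954-ai-agent-chat/`.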
by Shun Nakayama
This workflow implements cutting-edge concepts from Google DeepMind's OPRO (Optimization by PROmpting) and Stanford's DSPy to automatically refine AI prompts. It iteratively generates, evaluates, and optimizes responses against a ground truth, allowing you to "compile" your prompts for maximum accuracy.

**Why this is powerful**
Instead of manually tweaking prompts (trial and error), this workflow treats prompt engineering as an optimization problem:
- **OPRO-style Optimization**: The "Optimizer" LLM analyzes past performance scores and reasons to deduce a better prompt.
- **DSPy-style Logic**: It separates the "Logic" (Workflow) from the "Parameters" (Prompts), allowing the system to self-correct until it matches the Ground Truth.

**How it works**
1. **Define**: Set your initial prompt and a test case with the expected answer (Ground Truth).
2. **Generate**: The workflow generates a response using the current prompt.
3. **Evaluate**: An AI Evaluator scores the response (0-100) based on accuracy and format.
4. **Optimize**: If the score is low, the Optimizer AI analyzes the failure and rewrites the prompt.
5. **Loop**: The process repeats until the score reaches 95/100 or the loop limit is hit.

**Setup steps**
1. Configure OpenAI: Ensure you have an OpenAI credential set up in the OpenAI Chat Model node.
2. Customize: Open the Define Initial Prompt & Test Data node and set your initial_prompt, test_input, and ground_truth.
3. Run: Execute the workflow and check the Manage Loop & State node output for the optimized prompt.
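The generate/evaluate/optimize loop can be sketched in plain Python. The three callables stand in for the workflow's three LLM calls (responder, evaluator, optimizer); this is an illustrative sketch, not the workflow's actual node code:

```python
def optimize_prompt(initial_prompt, test_input, ground_truth,
                    generate, evaluate, rewrite,
                    target=95, max_loops=5):
    """OPRO-style loop: generate a response, score it, and rewrite the
    prompt until the score reaches `target` or `max_loops` is exhausted.

    `generate(prompt, input)` -> response       (responder LLM)
    `evaluate(response, truth)` -> (score, why) (evaluator LLM)
    `rewrite(prompt, history)` -> new prompt    (optimizer LLM)
    """
    prompt, history = initial_prompt, []
    for _ in range(max_loops):
        response = generate(prompt, test_input)
        score, reason = evaluate(response, ground_truth)
        history.append({"prompt": prompt, "score": score, "reason": reason})
        if score >= target:
            break
        # The optimizer sees all past prompts and scores, OPRO-style.
        prompt = rewrite(prompt, history)
    return prompt, history
```

In the workflow itself, the history and current prompt live in the Manage Loop & State node rather than in a local variable.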
by Alberto Idrio
Gmail → AI Summary → Notion + Audio Digest

This n8n workflow turns incoming Gmail emails into structured AI summaries and optional audio digests, automatically delivered to Notion and Google Drive. It is designed to reduce email overload by transforming raw messages into concise, readable, and listenable content.

**What this workflow does**
On a scheduled basis, the workflow:
- Retrieves Gmail messages (all subjects or filtered)
- Marks processed emails as read to avoid duplicates
- Extracts and normalizes the email body
- Uses OpenAI to generate a clean, structured summary

From the summary, the workflow branches into two outputs:

*Text summary*
- The final AI-generated summary is appended as a block in Notion
- Ideal for daily logs, knowledge bases, or team dashboards

*Audio transcript (optional)*
- The summary text is converted into speech using a TTS model
- The audio file is uploaded to Google Drive
- A shareable link is generated
- The audio reference is added back into Notion

**Key features**
- Automated Gmail ingestion
- AI-powered email summarization
- JavaScript preprocessing for clean input
- Notion integration for structured storage
- Text-to-speech audio generation
- Google Drive hosting for audio files
- Error-aware branching for TTS generation
- Idempotent and schedule-safe execution

**Typical use cases**
- Daily or weekly email digests
- Executive summaries of inbox activity
- Audio briefings you can listen to on the go
- Knowledge capture from important emails
- Reducing cognitive load from long email threads

**Who this template is for**
- Professionals dealing with high email volume
- Teams using Notion as a central workspace
- n8n users building AI productivity automations
- Anyone who wants emails summarized instead of skimmed

This template is designed to be practical, extensible, and production-ready, and can be easily adapted to:
- multiple Gmail labels
- different summary styles
- alternative TTS providers
- additional destinations (Slack, Docs, databases)
by CentralStationCRM
**Overview**
This template benefits anyone who wants to:
- automate web research on a prospect company
- compile that research into an easily readable note
- save the note into CentralStationCRM

**Tools in this workflow**
- CentralStationCRM, the easy and intuitive CRM software for small teams. Here is our API documentation if you want to customize the workflow.
- ChatGPT, the well-known AI chatbot
- Tavily, a web search service for large language models

**Disclaimer**
Tavily Web Search is (as of yet) a community node. You have to activate the use of community nodes inside your n8n account to use this workflow.

**Workflow description**
The workflow consists of:
- a Webhook trigger
- an AI Agent node
- an HTTP Request node

*The Webhook trigger*
The webhook is set up in CentralStationCRM to trigger when a new company is created inside the CRM. The Webhook Trigger node in n8n then fetches the company data from the CRM.

*The AI Agent node*
The node uses ChatGPT as its AI chat model and two Tavily Web Search operations ('search for information' and 'extract URLs') as tools. Additionally, it uses a simple prompt as a tool, telling the AI model to re-iterate on the research data if applicable. The AI Agent node takes the company name and prompts ChatGPT to "do a deep research" on this company on the web. "The research shall help sales people get a good overview about the company and allow to identify potential opportunities." The AI Agent then formats the results into Markdown and passes them to the next node.

*The CentralStationCRM protocol node*
This is an HTTP Request to the CentralStationCRM API. It creates a 'protocol' (the API's name for notes in the CRM) with the Markdown data it received from the previous node. This protocol is saved in CentralStationCRM, where it can easily be accessed as a note when clicking on the new company entry.

**Customization ideas**
Even though this workflow is pretty simple, it poses interesting possibilities for customization.
For example, you can alter the webhook trigger (in CentralStationCRM and n8n) to fire when a person is created. You then have to alter the AI prompt as well and make sure the third node adds the research note to the person, not a company, via the CentralStationCRM API. You could also swap the AI model used here for another one, compare the resulting research data, and gain a deeper understanding of AI chat models. Then of course there is the prompt itself: you can double down on the information you are most interested in and refine your prompt to make the AI focus on those areas of search. Start experimenting a bit!

**Preconditions**
For this workflow to work, you need:
- a CentralStationCRM account with API access
- an n8n account with API access
- an OpenAI account with API access

Have fun with our workflow!
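As a rough illustration of what the third node does, here is how the protocol-creating HTTP request could be assembled in Python. The endpoint path, header name, and payload shape below are placeholders (assumptions, not taken from the CentralStationCRM docs); check the API documentation referenced above for the real contract:

```python
import json
import urllib.request

def build_protocol_request(base_url, api_key, company_id, markdown_note):
    """Assemble a POST that saves a research note ('protocol') on a
    company. Endpoint path, header name, and payload keys are
    hypothetical placeholders -- consult the CentralStationCRM API docs.
    """
    payload = json.dumps(
        {"protocol": {"name": "AI research", "content": markdown_note}}
    ).encode()
    return urllib.request.Request(
        f"{base_url}/companies/{company_id}/protocols.json",
        data=payload,
        headers={"X-apikey": api_key, "Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request is then a matter of passing it to `urllib.request.urlopen` (or letting n8n's HTTP Request node do the equivalent).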
by Afareayo Soremekun
ChannelCrawler API to Google Slides Template

This template shows how you can use the ChannelCrawler API alongside ChatGPT (or any LLM) to generate Google Slides using images and text received from the API.

**How it works**
1. A user inputs the link(s) to the YouTube channel(s) of their target creators.
2. The list is parsed by a Python script, returning it in a format that can be run in a loop.
3. The workflow iterates over each channel URL.
4. The URL is passed to the ChannelCrawler API, which returns a JSON of the creator's profile.
5. The OpenAI node processes the description and content of the creator's profile to create a summary.
6. We retrieve the Google Slides presentation using the Get Presentation node.
7. We use the Google Slides API to duplicate an existing page and pull back the original page, as it has a new revision ID.
8. We use the Google Slides API to replace the image placeholder in the presentation.
9. Lastly, we update the other placeholders with text from the ChannelCrawler and ChatGPT outputs.

**How to use**
On executing the workflow, a pop-up form will come up where you can insert the YouTube channel URLs. On submission, provided the prerequisites are set up, the rest of the workflow will be triggered.

**Use cases**
You can create profiles on influencers and creators with extensive data points from the ChannelCrawler API and consistent summarisation from GPT.

**Prerequisites**
- ChannelCrawler account - there's a great pay-as-you-go option for access to the API
- OpenAI account - you can access free OpenAI credit if you are a first-time n8n user! Check the credentials options in the node
- Google account (for Slides) - you should have a Google account, or sign up for Google with your non-Google email
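The parsing step (step 2 above) amounts to splitting a free-form paste of channel URLs into a clean list the loop can iterate over. A minimal sketch, assuming comma- or newline-separated input (the function name and separators are assumptions, not the template's actual script):

```python
def parse_channel_urls(raw: str) -> list:
    """Split a free-form paste of YouTube channel URLs into a clean,
    whitespace-stripped list, accepting commas or newlines as separators."""
    parts = raw.replace(",", "\n").splitlines()
    return [p.strip() for p in parts if p.strip()]
```

Each resulting URL can then be fed to the ChannelCrawler API call one item at a time.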
by Robert Breen
This n8n workflow template creates an intelligent data analysis chatbot that can answer questions about data stored in Google Sheets using OpenAI's GPT-5 Mini model. The system automatically analyzes your spreadsheet data and provides insights through natural language conversations.

**What This Workflow Does**
- **Chat Interface**: Provides a conversational interface for asking questions about your data
- **Smart Data Analysis**: Uses AI to understand column structures and data relationships
- **Google Sheets Integration**: Connects directly to your Google Sheets data
- **Memory Buffer**: Maintains conversation context for follow-up questions
- **Automated Column Detection**: Automatically identifies and describes your data columns

🚀 **Try It Out!**

**1. Set Up OpenAI Connection**
- Visit the OpenAI API Keys page to get your API key.
- Go to OpenAI Billing and add funds to your billing account.
- Copy your API key into your OpenAI credentials in n8n (or your chosen platform).

**2. Prepare Your Google Sheet**
Data must follow this format (see the Sample Marketing Data sheet):
- The first row contains column names.
- Data should be in rows 2–100.
- Log in using OAuth, then select your workbook and sheet.

**3. Ask Questions of Your Data**
You can ask natural language questions to analyze your marketing data, such as:
- Total spend across all campaigns
- Spend for Paid Search only
- Month-over-month changes in ad spend
- Top-performing campaigns by conversion rate
- Cost per lead for each channel

📬 **Need Help or Want to Customize This?**
📧 rbreen@ynteractive.com
🔗 LinkedIn
🔗 n8n Automation Experts
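The automated column detection can be approximated by reading the header row of the fetched sheet data and handing the model a short description of each column. A hedged sketch (the template may implement this differently; the dict-per-row shape mirrors how n8n's Google Sheets node emits data):

```python
def describe_columns(rows: list) -> str:
    """Summarize the sheet's columns, with a sample value for each, so
    a system prompt can tell the model what the data looks like."""
    if not rows:
        return "No data found."
    sample = rows[0]  # first data row: keys are the header names
    return "\n".join(f"- {col}: e.g. {val!r}" for col, val in sample.items())
```

The resulting text can be interpolated into the agent's system prompt so follow-up questions can reference columns by name.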
by Arkadiusz
📝 **Workflow Description**
This workflow creates a conversational bridge between Telegram / n8n Chat and Home Assistant. It allows users to control smart home devices or request information using natural language (text or voice).

🔑 **Key Features**
- **Multi-channel input**: Works with both Telegram and n8n's chat interface.
- **Voice support**: Telegram voice messages are transcribed to text using OpenAI Whisper.
- **AI-driven assistant**: Google Gemini processes queries in natural language.
- **Home Assistant integration**: Uses MCP client tools to execute actions like turning devices on/off, adjusting lights, or broadcasting messages.
- **Memory management**: Short-term memory keeps context within conversations.
- **Smart reply routing**: Responses are automatically sent back to the correct channel (Telegram or chat).
- **Message formatting**: Telegram replies are beautified (bold, bullet points, inline code, links).

📌 **Node Overview**
- **Telegram Trigger**: Captures incoming Telegram messages (text or voice).
- **Bot Is Typing**: Sends a "typing…" action to indicate the bot is working.
- **Voice or Text**: Separates voice and text inputs.
- **Get Voice File → Speech to Text → Transcription to ChatInput**: Handles Telegram voice notes by downloading the file, transcribing it, and preparing it for the chat pipeline.
- **When Chat Message Received**: Captures messages from n8n's built-in chat interface.
- **Process Messages**: Normalizes incoming data (input text, source, session ID, voice flag).
- **Home Agent**: Main AI agent that processes queries.
- **Google Gemini Chat Model**: Language model for intent understanding and conversation.
- **Simple Memory & Simple Memory1**: Buffer memories to preserve conversation context.
- **Home Assistant Connector**: MCP client node that executes smart home actions (turn on/off devices, adjust lights, etc.).
- **Reply Router**: Routes the assistant's response either to Telegram or to the n8n chat webhook.
- **Telegram Message Beautifier → Telegram Send**: Formats and sends responses back to Telegram.
- **Respond to Webhook**: Sends responses to the n8n chat.

🚀 **Example Use Cases**
- Send "Turn on the living room lights" via Telegram → the bot triggers a Home Assistant action.
- Ask "What's the temperature in the bedroom?" → the response comes back formatted in Telegram.
- Record a voice note "Goodnight mode" → automatically transcribed and executed by Home Assistant.
- Use n8n chat to quickly trigger automations or check device statuses.

⚡️ **Benefits**
- Unified chat and voice control for Home Assistant.
- AI-powered natural language understanding.
- Works seamlessly across platforms (Telegram and n8n chat).
- Extensible: new tools or intents can be added easily.
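The Process Messages normalization described above can be sketched like this; the field names are illustrative assumptions based on Telegram's update shape and a generic n8n chat payload, not the node's actual code:

```python
def normalize_message(update: dict) -> dict:
    """Map a Telegram update or an n8n chat event onto the common shape
    (text, source, session ID, voice flag) the Home Agent expects."""
    if "message" in update:                      # Telegram payload
        msg = update["message"]
        return {
            "text": msg.get("text", ""),
            "source": "telegram",
            "session_id": str(msg["chat"]["id"]),
            "is_voice": "voice" in msg,          # voice notes carry a 'voice' key
        }
    return {                                     # n8n chat payload
        "text": update.get("chatInput", ""),
        "source": "chat",
        "session_id": update.get("sessionId", ""),
        "is_voice": False,
    }
```

The Reply Router then only needs to inspect `source` to decide whether the answer goes back to Telegram or to the chat webhook.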
by Julian Kaiser
Turn Your Reading Habit into a Content Creation Engine

This workflow is built for one core purpose: to maximize the return on your reading time. It turns your passive consumption of articles and highlights into an active system for generating original content and rediscovering valuable ideas you may have forgotten.

**Why This Workflow is Valuable**
- **End Writer's Block Before It Starts**: This workflow is your personal content strategist. Instead of staring at a blank page, you'll start your week with a list of AI-generated content ideas—from LinkedIn posts and blog articles to strategic insights—all based on the topics you're already deeply engaged with. It finds the hidden connections between articles and suggests novel angles for your next piece.
- **Rescue Your Insights from the Digital Abyss**: Readwise is fantastic for capturing highlights, but the best ones can get lost over time. This workflow acts as your personal curator, automatically excavating the most impactful quotes and notes from your recent reading. It doesn't just show them to you; it contextualizes them within the week's key themes, giving them new life and relevance.
- **Create an Intellectual Flywheel**: By systematically analyzing your reading, generating content ideas, and saving those insights back into your "second brain," you create a powerful feedback loop. Your reading informs your content, and the process of creating content deepens your understanding, making every reading session more valuable than the last.

**How it works**
This workflow automates the process of generating a "Weekly Reading Insights" summary based on your activity in Readwise.
1. **Trigger**: It can be run manually or on a weekly schedule.
2. **Fetch Data**: It fetches all articles and highlights you've updated in the last 7 days from your Readwise account.
3. **Filter & Match**: It filters for articles that you've read more than 10% of and then finds all the corresponding highlights for those articles.
4. **Generate Insights**: It constructs a detailed prompt with your reading data and sends it to an AI model (via OpenRouter) to create a structured analysis of your reading patterns, key themes, and content ideas.
5. **Save to Readwise**: Finally, it takes the AI-generated Markdown, converts it to HTML, and saves it back to your Readwise account as a new article titled "Weekly Reading Insights".

**Set up steps**
Estimated set up time: 5–10 minutes.
1. **Readwise credentials**: Authenticate the two HTTP Request nodes and the two Fetch nodes with your Readwise API token (get it from the Reader API). Also check how to set up Header Auth.
2. **AI model credentials**: Add your OpenRouter API key to the OpenRouter Chat Model node. You can swap this for any other AI model if you prefer.
3. **Customize the prompt**: Open the Prepare Prompt Code node to adjust the persona, questions, and desired output format. This is where you can tailor the AI's analysis to your specific needs.
4. **Adjust the schedule**: Modify the Monday - 09:00 Schedule Trigger to run on your preferred day and time.
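The filter-and-match step can be sketched in Python. The field names (`reading_progress`, `article_id`) are assumptions loosely modeled on the Readwise Reader API, not copied from the workflow:

```python
def weekly_reading_digest(articles, highlights, min_progress=0.1):
    """Keep articles read past `min_progress` (10% by default) and
    attach each one's highlights, matched on article ID."""
    by_article = {}
    for h in highlights:
        by_article.setdefault(h["article_id"], []).append(h["text"])
    return [
        {**a, "highlights": by_article.get(a["id"], [])}
        for a in articles
        if a.get("reading_progress", 0) > min_progress
    ]
```

The joined records are what the Prepare Prompt Code node would turn into the AI model's prompt.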
by Țugui Dragoș
**How it works**
This workflow fetches articles from any RSS feed, processes them with an AI model (DeepSeek), and sends only the most relevant alerts directly to Slack.
- Normalizes and deduplicates RSS items
- Extracts article text and cleans HTML
- Summarizes and classifies with AI (sentiment + flags)
- Filters out irrelevant news
- Sends real-time alerts to your Slack channel

**Setup steps**
1. Add your Slack Bot Token (via the Slack API)
2. Add your DeepSeek API key
3. Import this workflow into n8n
4. Deploy and start receiving smart news alerts in Slack

**Use case**
Perfect for tracking AI, startup, finance, and tech industry news without the noise.
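The normalize-and-deduplicate step can be illustrated in plain Python (a sketch of the idea; the workflow itself does this inside n8n nodes):

```python
def dedupe_items(items: list) -> list:
    """Normalize RSS items and drop duplicates, keyed on guid (falling
    back to link), preserving first-seen order."""
    seen, out = set(), []
    for item in items:
        key = item.get("guid") or item.get("link")
        if key and key not in seen:
            seen.add(key)
            out.append({
                "title": item.get("title", "").strip(),
                "link": item.get("link"),
                "guid": key,
            })
    return out
```

Only the surviving items move on to text extraction and AI classification, which keeps the DeepSeek API usage down.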
by Abrar Sami
**How it works**
1. Fetches a blog post's HTML from your blog URL using an HTTP Request node
2. Extracts readable content using Cheerio (code node)
3. Saves the raw blog text to Airtable
4. Translates the content to a language of your choice using Google Translate
5. Updates the same Airtable record with the translated version in a different column

**Set up steps**
Estimated setup time: 15–20 minutes (includes connecting Airtable and Google Translate credentials).
- You'll need an Airtable base with HTML and TRANSLATED fields, or use this pre-made base: Airtable Template
- Simply add your blog post URL inside the HTTP Request node
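The Cheerio extraction runs in a JavaScript code node; as a rough stand-in, here is the same idea in plain Python using the standard library's HTML parser (a simplification for illustration, not the template's actual code):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from HTML, skipping <script> and <style>."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

The extracted text is what lands in the Airtable HTML field before translation.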
by Kumar Shivam
The AI-powered MIS Agent is an intelligent, automated system built using n8n that streamlines email-based data collection and document organization for businesses. It classifies incoming emails, extracts and processes attachments or Drive links, and routes them to the correct destination folders in Google Drive. Additionally, it provides advanced file operations like cleaning, merging, joining, and transforming data.

**Advantages**
- 📥 **Automated Email and File Management**: Detects and processes emails containing attachments or Drive links, ensuring seamless classification and routing of business-critical files.
- 🧠 **AI-Based Classification**: Uses LLMs (like GPT-4o Mini) to classify emails into categories such as Daily Sales, Customer Info, and Address based on their content.
- 📂 **Smart File Routing and Upload**: Recognizes whether a file is a direct attachment or a Google Drive link, extracts the file ID if necessary, and uploads it to predefined folders.
- 📊 **Powerful Data Operations**: Supports operations like append, join, group by, aggregation, and standardization of data directly from spreadsheets using Python and Pandas within the workflow.
- 🔁 **Scheduled and Triggered Automation**: Supports scheduled runs and real-time email triggers, making it highly reliable and timely.
- 🔧 **Fully Modular and Scalable**: Easily expandable with more logic, new folders, or different workflows. Clean architecture and annotations make maintenance simple.

**How It Works**
1. **Email trigger**: A Gmail trigger monitors incoming emails with specific labels or attachments.
2. **Classification**: An LLM-based text classifier identifies the purpose of the email (e.g., sales data, address list, customer details).
3. **Conditional logic**: Regex-based conditions check whether the email contains Google Drive links or attachments.
4. **File handling**: If it's a Drive link, the workflow extracts the file ID and copies it to the correct folder. If it's an attachment, it uploads directly.
5. **Scheduled data management**: Periodically moves or logs files from predefined folders using a schedule trigger.
6. **Data cleaning and processing**: Performs data cleaning and transformation tasks like replacing missing values, standardizing formats, and joining datasets based on criteria provided by the user.
7. **Final output**: Cleaned and processed files are saved in designated folders, with their public links shared back through the system.

**Set Up Steps**
Configure the nodes:
- **Gmail Trigger**: Detects relevant incoming emails.
- **Text Classifier**: Uses an OpenAI model to categorize email content.
- **Regex conditions**: Determine whether a link or attachment is present.
- **Google Drive operations**: Upload or copy files to categorized folders.
- **Python nodes**: Handle data manipulation using Pandas.
- **Google Sheets nodes**: Extract, clean, and write structured data.
- **LLM-based chat models**: Extract and apply cleaning configurations.

Connect the nodes: seamlessly connect Gmail inputs, classification, file processing, and data logic. Output links or processed files are uploaded back to Drive, ready to share.

Credentials: ensure OAuth credentials for Gmail, Google Drive, and OpenAI are correctly set.

**Ideal For**
- Sales & CRM teams managing large volumes of email-based reports.
- Data teams needing structured pipelines from unstructured email inputs.
- Businesses looking to automate classification, storage, and transformation of routine data.

**Testing and Additional Customization**
If you want to test this bot's capabilities before purchasing the workflow, ask me at kumar.shivam19oce@gmail.com. I will share the chat URL and the links to the associated Google Drives so you can see the result; once you are satisfied, we are good to go. I have kept the price at just $1 for testing purposes because of the paid OpenAI usage. If you need any customization, such as charts, or other requests like adding databases, feel free to let me know and I can do it accordingly. This is the first version; I will come out with more advancements based on requests and responses.
Use it and let me know at kumar.shivam19oce@gmail.com.
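The group-by/aggregation that the Python nodes perform with Pandas can be illustrated in plain Python (a sketch of the operation, not the workflow's actual Pandas code; the example column names are made up):

```python
from collections import defaultdict

def group_and_sum(rows, key, value):
    """Plain-Python sketch of a 'group by + aggregate' step, e.g.
    total sales per region from spreadsheet rows."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)
```

With Pandas inside the workflow, the equivalent would be a one-liner along the lines of `df.groupby(key)[value].sum()`.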
by Jacob @ vwork Digital
This workflow helps small business owners using Wave Apps to easily access the Wave Accounting API using n8n.

In this example, the workflow is triggered by a new payout from Stripe. It then logs the transaction as a journal entry in Wave Accounting, helping you automate your accounting without needing to pay for expensive subscriptions!

**What this workflow template does**
This workflow triggers when a payout is paid in Stripe and sends a GraphQL request to the Wave Accounting API, recording the payout as a journal entry automatically. The benefits of this workflow are keeping your books up to date instantaneously and ensuring accuracy.

**How to set up**
1. Set up your Stripe credential in n8n using the native Stripe nodes.
2. Follow this guide to get your Wave Apps authorization token.
3. Set up the node with the correct header auth -> {"Authorization": "Bearer TOKEN-HERE"}
4. Get your account IDs from Wave.
5. The payload uses GraphQL, so make sure to update it according to what you are trying to achieve; the current example creates a journal entry.

**Tips**

*Getting Wave account IDs*
It is easiest to use network logs to obtain the right account IDs from Wave, especially if you have created custom accounts (default Wave account IDs can be obtained by making that API call).
1. Go to Wave and make a new dummy transaction using the accounts you want to use in n8n. Have the network logs open while you do this.
2. Search the network logs for the name of the account you are trying to use. You should see account IDs in the log data.

*Sales tax*
This example uses sales tax baked into the total Stripe payout amount (5%). You can find the sales tax account IDs the same way you found the Wave account IDs, using network logs.

*Use AI to help you with your API call*
Ask Claude or ChatGPT to draft your ideal payload.
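To sketch the GraphQL call: the snippet below assembles a bearer-authenticated POST against Wave's public GraphQL endpoint. The endpoint URL and Authorization header follow the setup steps above, but the query body is left as a caller-supplied string, since the exact journal-entry mutation must come from Wave's own schema (draft it from their docs, or ask an AI assistant as suggested):

```python
import json
import urllib.request

WAVE_GRAPHQL_URL = "https://gql.waveapps.com/graphql/public"

def wave_graphql_request(token: str, query: str, variables: dict):
    """Assemble a GraphQL POST for the Wave API. The `query` string
    (e.g. the journal-entry mutation) must match Wave's schema."""
    body = json.dumps({"query": query, "variables": variables}).encode()
    return urllib.request.Request(
        WAVE_GRAPHQL_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Passing the returned request to `urllib.request.urlopen` sends it; in n8n, the HTTP Request node plays that role with the same header and JSON body.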