by Anthony
Use Case

Adding expenses via a simple chat message is very convenient. This workflow attempts to do exactly that using AI-powered n8n magic! Send a message to the chat, something like:

car wash; 59.3 usd; 25 jan 2024

and get a response:

Your expense saved, here is the output of save sub-workflow: {"cost":59.3,"descr":"car wash","date":"2024-01-25","msg":"car wash; 59.3 usd; 25 jan 2024"}

The LLM smartly parses your message into structured JSON and saves the expense as a new row in a Google Sheet! A minimal parsing sketch follows the installation steps.

Installation

1. Set up Google Sheets: clone this Sheet (File -> Make a copy): https://docs.google.com/spreadsheets/d/1D0r3tun7LF7Ypb21CmbTKEtn76WE-kaHvBCM5NdgiPU/edit?gid=0#gid=0 Then choose your copy in the "Save expense into Google Sheets" node.
2. Fix the sub-workflow dropdown: open the "Parse msg and save to Sheets" node (an n8n sub-workflow executor tool) and make sure the SAME workflow is chosen in the dropdown. This allows n8n to locate and call the "Workflow Input Trigger" properly when needed.
3. Activate the workflow to make the chat work properly. Send a message to the chat, something like "car wash; 59.3 usd; 25 jan 2024". You should get the response shown above, and a new row should be inserted in Google Sheets!
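For illustration, here is a minimal deterministic sketch of the parsing the LLM performs, written as n8n Code-node JavaScript. The output field names (cost, descr, date, msg) mirror the example above; the month table, the helper function, and the `$json.chatInput` input field are assumptions for illustration, not part of the template.

```javascript
// Minimal sketch: parse "car wash; 59.3 usd; 25 jan 2024" into the
// {cost, descr, date, msg} shape shown above. In the real workflow an
// LLM does this; a deterministic fallback could look like:
const MONTHS = { jan: '01', feb: '02', mar: '03', apr: '04', may: '05', jun: '06',
                 jul: '07', aug: '08', sep: '09', oct: '10', nov: '11', dec: '12' };

function parseExpense(msg) {
  const [descr, costPart, datePart] = msg.split(';').map(s => s.trim());
  const cost = parseFloat(costPart);              // "59.3 usd" -> 59.3
  const [day, mon, year] = datePart.split(/\s+/); // "25 jan 2024"
  const date = `${year}-${MONTHS[mon.toLowerCase()]}-${day.padStart(2, '0')}`;
  return { cost, descr, date, msg };
}

// `chatInput` is an assumed field name for the incoming chat message.
return [{ json: parseExpense($json.chatInput) }];
```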
by Evoort Solutions
🎥 YouTube Video Summarizer for Social Media

Turn any YouTube video into a short, structured summary using AI — perfect for content creators, marketers, or social media managers.

🔧 What We Built

We created a no-code automation in n8n that:
- Accepts a YouTube Video ID via a form
- Fetches the video transcript using an external API
- Summarizes the transcript using AI (Google Gemini)
- Automatically saves the summary to Google Docs for team use

🧩 Flow Overview

| Step | Description |
|------|-------------|
| ✅ Form Trigger | User submits a YouTube video ID using an n8n form |
| 🔁 Set Node | Maps the YouTube video ID for use in the API request |
| 🌐 HTTP Request (External API) | Calls the YouTube Transcriptor AI API via RapidAPI to fetch the transcript |
| 🧹 Formatter (Code Node) | Joins transcript lines into a readable text block (see the sketch at the end of this description) |
| 🧠 AI Agent + Google Gemini (via Langchain) | Summarizes the full transcript into bullet points and tone |
| 🧽 Optimizer (Code Node) | Extracts just the summary from the AI response |
| 📝 Google Docs Node | Appends the clean summary to a shared Google Doc |

🌍 Real-World Problem Solved

❌ The Challenge: Creators and marketers waste hours watching full videos just to extract the key points. Manual summarization is inconsistent, repetitive, and delays content planning.

✅ Our Solution:
- ⏱️ Reduces time spent watching videos
- 🧠 AI-powered summaries keep tone consistent and structured
- 📄 Auto-sync with Google Docs makes summaries instantly available for teams

🔥 Bonus: This uses the YouTube Transcriptor AI API, so there is no need to manually scrape captions or use browser extensions.

🚀 Ideal Use Cases

- Repurpose YouTube content into Instagram Reels, LinkedIn posts, or blog content
- Build a video summary library for your editorial team
- Quickly extract talking points from podcast episodes

🛠️ Tech Stack

- **n8n** – workflow automation engine
- **YouTube Transcriptor AI API** – via RapidAPI
- **Google Gemini (via Langchain)** – AI summarization
- **Google Docs** – stores the final summary
- **JavaScript nodes** – custom text parsing & formatting

💡 Want to customize it? Add Slack, Airtable, Notion, or Tweet auto-posting to expand the flow.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and grow your social media presence effortlessly!
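As a rough illustration of the Formatter step, here is a minimal n8n Code-node sketch that joins transcript lines into one readable block. The `transcription` array and its `subtitle` field are assumptions about the RapidAPI response shape, not something the template specifies.

```javascript
// Minimal sketch of the Formatter (Code Node): join transcript segments
// into a single readable block for the AI summarizer.
// NOTE: `transcription` and `subtitle` are assumed field names for the
// YouTube Transcriptor AI response.
const segments = $json.transcription || [];

const fullText = segments
  .map(seg => (seg.subtitle || '').trim())
  .filter(Boolean)
  .join(' ');

return [{ json: { transcript: fullText } }];
```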
by Jaruphat J.
This workflow integrates LINE BOT, an AI Agent (GPT), Google Sheets, and Google Drive to let users search for file URLs using natural language. The AI Agent extracts the filename from the message, looks the file up in Google Sheets, and returns the corresponding Google Drive URL via LINE BOT.

- Supports natural language queries (e.g., "Find file 1.pdf for me")
- AI-powered filename extraction
- Google Sheets lookup for file URLs
- Auto-response via LINE BOT (the reply call is sketched below)

How to Use This Template

1. Download & Import
   - Copy and save the template code as a .json file.
   - Go to the n8n Editor → click Import → upload the file.
2. Update Required Fields
   - Replace YOUR_GOOGLE_SHEET_ID with your actual Google Sheet ID.
   - Replace YOUR_LINE_ACCESS_TOKEN with your LINE BOT Channel Access Token.
3. Activate & Test
   - Click Execute Workflow to test manually.
   - Set the Webhook URL in the LINE Developer Console.

Features of This Template

- Supports natural language queries (e.g., "Find file 1.pdf for me")
- AI-powered filename extraction using OpenAI (GPT-4/3.5)
- Real-time file lookup in Google Sheets
- Automatic LINE BOT response
- Fully automated workflow
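For context, here is a minimal JavaScript sketch of the reply the LINE BOT ultimately sends, assuming LINE's standard Messaging API reply endpoint. The `replyToken` path and `fileUrl` value are illustrative placeholders, not the template's exact wiring.

```javascript
// Minimal sketch: reply to the user with the Google Drive URL via the
// LINE Messaging API (/v2/bot/message/reply).
const LINE_ACCESS_TOKEN = 'YOUR_LINE_ACCESS_TOKEN';      // from the LINE Developer Console
const replyToken = $json.events?.[0]?.replyToken;        // assumed webhook payload path
const fileUrl = 'https://drive.google.com/file/d/...';   // value looked up in Google Sheets

await fetch('https://api.line.me/v2/bot/message/reply', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${LINE_ACCESS_TOKEN}`,
  },
  body: JSON.stringify({
    replyToken,
    messages: [{ type: 'text', text: `Here is your file: ${fileUrl}` }],
  }),
});

return [{ json: { sent: true } }];
```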
by scrapeless official
> ⚠️ Disclaimer: This workflow uses Scrapeless and Claude AI via community nodes, which require a self-hosted n8n instance to work properly.

🔁 How It Works

This intelligent B2B lead generation workflow combines search automation, website crawling, AI analysis, and multi-channel output:
- It starts by using Scrapeless's Deep SERP API to find company websites from targeted Google Search queries.
- Each result is then individually crawled using Scrapeless's Crawler module, retrieving key business information from pages like /about, /contact, and /services.
- The raw web content is processed via a Code node to clean, extract, and prepare structured data.
- The cleaned data is passed to Claude Sonnet (Anthropic), which analyzes and qualifies the lead based on content richness, contact data, and relevance.
- A filter step ensures only high-quality leads (e.g., lead score ≥ 6) are kept.
- Qualified leads are sent via a Discord webhook for real-time notification (this can be replaced with Slack, email, or CRM tools).

> 📌 The result is a fully automated system that finds, qualifies, and organizes B2B leads with high efficiency and minimal manual input.

✅ Pre-Conditions

Before using this workflow, make sure you have:
- An n8n self-hosted instance
- A Scrapeless account and API key (get it here)
- An Anthropic Claude API key
- A configured Discord webhook URL (or alternative notification service)

⚙️ Workflow Overview

Manual Trigger → Scrapeless Google Search → Item Lists → Scrapeless Crawler → Code (Data Cleaning) → Claude Sonnet → Code (Response Parser) → Filter → Discord Notification

🔨 Step-by-Step Breakdown

1. Manual Trigger – for testing purposes (can be replaced with Cron or Webhook)
2. Scrapeless Google Search – queries target B2B topics via Scrapeless's Deep SERP API
3. Item Lists – splits search results into individual items
4. Scrapeless Crawler – visits each company domain and scrapes structured content
5. Code Node (Data Cleaner) – extracts and formats content for LLM input
6. Claude Sonnet (via HTTP Request) – evaluates lead quality, relevance, and contact info
7. Code Node (Parser) – parses Claude's JSON response (see the sketch at the end of this description)
8. IF Filter – filters leads based on the score threshold
9. Discord Webhook – sends a formatted message with company info

🧩 Customization Guidance

You can easily adjust the workflow to match your needs:
- **Lead Criteria**: Modify the Claude prompt and scoring logic in the Code node
- **Output Channels**: Replace the Discord webhook with Slack, Email, Airtable, or any CRM node
- **Search Topics**: Change your query in the Scrapeless SERP node to find leads in different niches or countries
- **Scoring Threshold**: Adjust the filter logic (lead_score >= 6) to match your quality tolerance

🧪 How to Use

1. Insert your Scrapeless and Claude API credentials in the designated nodes
2. Replace or configure the Discord webhook (or alternative outputs)
3. Run the workflow manually (or schedule it)
4. View qualified leads directly in your chosen notification channel

📦 Output Example

Each qualified lead includes:
- 🏢 Company Name
- 🌐 Website
- ✉️ Email(s)
- 📞 Phone(s)
- 📍 Location
- 📈 Lead Score
- 📝 Summary of relevant content

👥 Ideal Users

This workflow is perfect for:
- **AI SaaS companies** targeting mid-market & enterprise leads
- **Marketing agencies** looking for B2B-qualified leads
- **Automation consultants** building scraping solutions
- **No-code developers** working with n8n, Make, Pipedream
- **Sales teams** needing enriched prospecting data
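Here is a minimal sketch of what the Code (Response Parser) step might look like. The Anthropic response path and the lead field names are assumptions for illustration; only the lead_score >= 6 threshold comes from the workflow itself.

```javascript
// Minimal sketch of the Code (Response Parser) step: pull the JSON lead
// report out of Claude's text response and normalize it for the IF filter.
const raw = $json.content?.[0]?.text ?? '';   // assumed Anthropic HTTP response shape
const match = raw.match(/\{[\s\S]*\}/);       // grab the JSON object embedded in the text
if (!match) {
  return [{ json: { lead_score: 0, error: 'No JSON found in model output' } }];
}

const lead = JSON.parse(match[0]);
return [{
  json: {
    company: lead.company ?? null,
    website: lead.website ?? null,
    emails: lead.emails ?? [],
    lead_score: Number(lead.lead_score) || 0, // downstream IF node keeps lead_score >= 6
    summary: lead.summary ?? '',
  },
}];
```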
by Lorena
This workflow is triggered when a new deal is created in HubSpot. It then processes the deal based on its value and stage.

The first branching follows three cases:
- If the deal is closed and won, a message is sent in a Slack channel, so that the whole team can celebrate the success.
- If a presentation has been scheduled for the deal, a Google Slides presentation template is created.
- If the deal is closed and lost, the deal's details are added to an Airtable table. From there, you can analyze the data to get insights into which deals don't get closed and why.

The second branching follows two cases (a sketch of this routing follows below):
- If the deal is for a new business and has a value above 500, a high-priority ticket assigned to an experienced team member is created in HubSpot.
- If the deal is for an existing business and has a value below 500, a low-priority ticket is created.
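For illustration only, the second branch's routing could be expressed as the following Code-node sketch. The HubSpot property names (dealtype, amount) are assumptions; the actual template implements this with IF nodes.

```javascript
// Sketch of the second branching: route deals to high- or low-priority
// tickets based on business type and value. `dealtype` and `amount` are
// assumed HubSpot deal properties, used here for illustration.
const deal = $json.properties ?? {};
const value = Number(deal.amount) || 0;
const isNewBusiness = deal.dealtype === 'newbusiness';

let route = null;
if (isNewBusiness && value > 500) {
  route = 'high-priority-ticket';  // assigned to an experienced team member
} else if (!isNewBusiness && value < 500) {
  route = 'low-priority-ticket';
}

return [{ json: { ...deal, route } }];
```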
by Agent Circle
This n8n template helps you automatically discover, analyze, and track trending topics and videos on YouTube using an AI-powered agent.

Use cases are many: this workflow is perfect for YouTube creators needing fresh video ideas, digital marketers scouting new campaign topics, social media managers who want to catch trends early, and researchers who want to analyze what's viral.

How It Works

1. The workflow starts whenever a chat message is received (e.g., a trend question, a topic prompt, or a request for insights).
2. The incoming chat is routed to the AI Agent – Trend Explorer node:
   - First, the agent triggers the Workflow – YouTube Search tool to gather the latest trending topics and keywords from YouTube.
   - Next, the agent supplies this real-time YouTube data to the OpenAI Chat Model for deep analysis, trend interpretation, and unique insights.
   - To provide context-aware answers and track ongoing interests, the agent also references a Simple Memory module, recalling past queries and user instructions.
3. Finally, the result is a fast, data-driven, smart trend report delivered instantly to your chat.

How To Set Up

1. Download the workflow package (including 2 .json files) and import it into your n8n interface.
2. Set up the necessary access in the following components of the AI Agent – Trend Explorer node:
   - OpenAI Chat Model: allows the API connection for trend insights.
   - Workflow – YouTube Search: searches for trending videos based on the query.
   - Simple Memory (optional): enhances the experience across ongoing sessions.
3. Start by sending a chat message on n8n.
4. Check the response from the AI agent in the same chat box.
5. Ask follow-ups, explore deeper, or trigger new searches - all in one chat thread.

Requirements

- n8n instance (self-hosted or cloud).
- API access to the OpenAI Chat Model for chat-based AI.

How To Customize

- **Connect to your favorite chat platforms**: Easily integrate additional chat triggers such as Telegram, Slack, or your preferred messaging app.
- **Choose your preferred AI model**: If you want a different viewpoint, simply swap the OpenAI Chat Model for Google Gemini, Claude, or any compatible LLM in your workflow.
- **Upgrade memory for smarter conversations**: For long-term recall or more advanced, context-aware chats, replace Simple Memory with a vector database like Pinecone or Redis.

Need Help?

If you'd like this workflow customized to fit your tools and platforms, or if you're looking to build a tailored AI agent for your own business, please feel free to reach out to Agent Circle. We're always here to support you and help bring automation ideas to life.

Join our community on different platforms for support, inspiration, and tips from others:
- Website: https://www.agentcircle.ai/
- Etsy: https://www.etsy.com/shop/AgentCircle
- Gumroad: http://agentcircle.gumroad.com/
- Discord Global: https://discord.gg/d8SkCzKwnP
- FB Page Global: https://www.facebook.com/agentcircle/
- FB Group Global: https://www.facebook.com/groups/aiagentcircle/
- X: https://x.com/agent_circle
- YouTube: https://www.youtube.com/@agentcircle
- LinkedIn: https://www.linkedin.com/company/agentcircle
by JPres
n8n Template: Store Chat Data in Supabase PostgreSQL for WhatsApp/Slack Integration

This n8n template captures chat data (like user ID, name, or address) and saves it to a Supabase PostgreSQL database. It's built for testing now but designed to work with WhatsApp, Slack, or similar platforms later, where chat inputs aren't predefined.

A guide with images can be found at: https://github.com/JimPresting/Supabase-n8n-Self-Hosted-Integration/

Step 1: Configure Firewall Rules in Your VPC Network

To let your n8n instance talk to Supabase, add a firewall rule in your VPC network settings (e.g., Google Cloud, AWS, etc.):
1. Go to VPC Network settings.
2. Add a new firewall rule:
   - Name: allow-postgres-outbound
   - Direction: Egress (outbound traffic)
   - Destination Filter: IPv4 ranges
   - Destination IPv4 Ranges: 0.0.0.0/0 (allows all; restrict to Supabase IPs for security)
   - Source Filter: pick IPv4 ranges and add the n8n VM's IP range, or pick None if any VM can connect
   - Protocols and Ports: TCP, port 5432 (the default PostgreSQL port)
3. Save the rule.

Step 2: Get the Supabase Connection String

1. Log into your Supabase Dashboard.
2. Go to your project and click the Connect button in the header.
3. Copy the PostgreSQL connection string:
   postgresql://postgres.fheraruzdahjd:[YOUR-PASSWORD]@aws-0-eu-central-1.pooler.supabase.com:6543/postgres
4. Replace [YOUR-PASSWORD] with your Supabase account password (no brackets) and replace the identifier before it with your actual unique identifier.
5. Note the port (6543 or 5432) and use whatever appears in the string.

Step 3: Set Up the n8n Workflow

This workflow takes chat data, maps it to variables, and stores it in Supabase. It's built to handle messy chat inputs from platforms like WhatsApp or Slack in production.

Workflow Steps

1. Trigger Node: "When clicking 'Test workflow'" (manual trigger). For now, it's manual. In production, this will be a WhatsApp or Slack message trigger, which won't have a fixed input format.
2. Set Node: "Set sample input variables (manual)". This node sets variables like id, name, and address to mimic chat data (a minimal mapping sketch follows the design notes below). Why? Chat platforms send unstructured data (e.g., a message with a user's name or address). We map it to variables so we can store it properly. The id will be something unique like a phone number, account ID, or account number.
3. Sample Agent Node: uses a model (e.g., Gemini Flash 2.0, though the choice doesn't matter). This is a placeholder to process data (e.g., clean or validate it) before saving. You can skip or customize it.
4. Supabase PostgreSQL Node: "Supabase PostgreSQL Database". Connects to Supabase using the connection string from Step 2 and saves the variables (id, name, address) to a table. Why store extra fields? The id (like a phone number or account ID) is the key. Extra fields like name or address let us keep all user info in one place for later use (e.g., analytics or replies).
5. Output Node: "Update additional values e.g., name, address". Confirms the data is saved. In production, this could send a reply to the chat platform.

Why This Design?

- **Handles unstructured chat data**: WhatsApp or Slack messages don't have a fixed format. The Set node lets us map any incoming data (e.g., id, name) to our database fields.
- **Scales for production**: Using id as a key (phone number, account ID, etc.) with extra fields like name makes this workflow flexible for many use cases, like user profiles or support logs.
- **Future-ready**: It's built to swap the manual trigger for a real chat platform trigger without breaking.
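As a minimal sketch of what the Set step might do in production, the following Code-node JavaScript maps an unstructured chat payload to the three stored fields. The incoming field names (from, profile.name, text) are assumptions about a WhatsApp/Slack-style payload, not part of the template.

```javascript
// Minimal sketch: normalize an unstructured chat payload into the
// {id, name, address} fields stored in Supabase. Incoming field names
// are assumed; adapt them to your platform's webhook payload.
const msg = $json;

return [{
  json: {
    id: msg.from ?? msg.user_id ?? null,  // phone number / account ID as the unique key
    name: msg.profile?.name ?? null,
    address: msg.text?.match(/address:\s*(.+)/i)?.[1] ?? null,
  },
}];
```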
Step 4: Configure the Supabase PostgreSQL Node

In the n8n workflow, set up the Supabase PostgreSQL node:
- Host: aws-0-eu-central-1.pooler.supabase.com (from the connection string)
- Port: 6543 (or whatever is in the connection string)
- Database: postgres
- User: postgres.fhspudlibstmpgwqmumo (from the connection string)
- Password: your Supabase password
- SSL: enable (Supabase usually requires it)

Set the node to Insert or Update:
- Map id to a unique column in your Supabase table (e.g., phone number, account ID).
- Map fields like name and address to their columns.

Test the workflow to confirm data saves correctly; a hypothetical table schema and matching upsert are sketched after the security tips.

Security Tips

- **Limit Firewall Rules**: Don't use 0.0.0.0/0. Find Supabase's IP ranges in their docs and use those.
- **Hide Passwords**: Store your Supabase password in n8n's environment variables.
- **Use SSL**: Enable SSL in the n8n node for secure data transfer.
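For reference, here is a hypothetical SQL schema matching the fields the workflow stores, together with the upsert that the node's Insert or Update mode corresponds to. The table name, column names, and sample values are illustrative assumptions.

```sql
-- Hypothetical Supabase table for the fields the workflow stores.
-- Column names mirror the n8n mapping (id, name, address).
create table if not exists chat_users (
  id      text primary key,  -- phone number, account ID, etc.
  name    text,
  address text
);

-- The node's "Insert or Update" behavior corresponds to an upsert:
insert into chat_users (id, name, address)
values ('4915112345678', 'Jane Doe', 'Example Street 1')
on conflict (id) do update
  set name    = excluded.name,
      address = excluded.address;
```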
by phil
This workflow automates the process of summarizing or transcribing a WordPress article, converting the text into speech using the Eleven Labs API, and uploading the resulting MP3 file back to WordPress.

How It Works

1. Trigger – The workflow starts manually when the user clicks "Test Workflow".
2. Retrieve Article – It fetches a WordPress article based on a given post ID.
3. Summarize or Transcribe – An LLM (GPT-4o-mini) generates either:
   • a summary of the article, or
   • a full transcription, depending on the chosen prompt.
4. Generate Speech – The processed text (summary or transcription) is converted into an MP3 audio file using the Eleven Labs API (sketched below).
5. Upload MP3 to WordPress – The generated MP3 file is uploaded to WordPress.
6. Update WordPress Post – The article is updated with an embedded audio player, allowing users to listen to the summary or transcription.

Set Up Steps

1. WordPress API Credentials – Configure your WordPress API credentials in n8n.
2. Eleven Labs API Key – Obtain an API key from Eleven Labs and configure it in n8n.
3. Choose Between Summary or Transcription – Modify the AI prompt to either generate a summary or keep the full transcription.
4. Test the Workflow – Run the workflow and ensure the MP3 file is correctly generated and uploaded.

💡 Customization Options
• Modify the AI prompt to switch between a summary and a transcription.
• Change the voice model in Eleven Labs for different speech styles.
• Adjust the output format for higher/lower quality MP3.

🚀 This automation improves content accessibility and engagement by allowing users to listen to a summarized or full version of the article.

Phil | Inforeole
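For orientation, here is a minimal Node.js sketch of the text-to-speech call, assuming Eleven Labs' standard POST /v1/text-to-speech/{voice_id} endpoint. The voice ID, model name, input text, and file handling are illustrative placeholders; in the workflow this step runs with your configured Eleven Labs credentials.

```javascript
// Minimal sketch of the speech-generation step against the Eleven Labs API.
import { writeFile } from 'node:fs/promises';

const ELEVENLABS_API_KEY = 'YOUR_API_KEY'; // store as an n8n credential, not in code
const VOICE_ID = 'YOUR_VOICE_ID';          // pick any voice from your Eleven Labs account

const res = await fetch(`https://api.elevenlabs.io/v1/text-to-speech/${VOICE_ID}`, {
  method: 'POST',
  headers: { 'xi-api-key': ELEVENLABS_API_KEY, 'Content-Type': 'application/json' },
  body: JSON.stringify({
    text: 'Summary text produced by the LLM step goes here.',
    model_id: 'eleven_multilingual_v2',    // assumed model name; substitute your own
  }),
});

// The response body is the raw MP3 audio, which the next step uploads
// to the WordPress media library.
await writeFile('summary.mp3', Buffer.from(await res.arrayBuffer()));
```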
by n8n Team
This workflow provides dynamic, intelligent conversational capabilities. It uses OpenAI's GPT-4o model for natural language understanding and generation, supplemented by SerpAPI and Wikipedia tools for enriched, data-driven responses. The workflow is triggered manually and uses a 'Window Buffer Memory' to maintain the context of the last 20 interactions for better conversational continuity. All of these components are orchestrated through n8n nodes, ensuring seamless interconnectivity. To use this template, you need to be on n8n version 1.50.0 or later.
by iamvaar
This n8n workflow automatically detects high-spending hotel guests after checkout and emails them a personalized, one-time reward offer.

🔧 What it does

- Watches the Salesforce Guest__c custom object for checkout updates.
- Pulls guest spend data on optional paid amenities: Room Service, Minibar, Laundry, Late Checkout, Extra Bed, and Airport Transfer.
- Calculates total spend to identify VIP guests (≥ $50; see the sketch below).
- Uses AI to spot unused services, randomly pick one unused service, and generate a realistic, short promo like: "Free late checkout on your next stay".
- Parses the AI output into JSON.
- Sends a polished HTML email to the guest with their personalized offer.

📦 Key nodes

- Salesforce Trigger → monitors new checkouts.
- Salesforce → fetches detailed spend data.
- Function → sums up total amenity spend.
- IF → filters for VIP guests.
- LangChain LLM + Google Vertex AI → drafts the offer text.
- Structured Output Parser → cleans the AI output.
- Brevo → delivers the branded email.

📊 Example output

> Subject: John, We Have Something Special for Your Next Stay
> Offer in email: Enjoy a complimentary minibar selection on your next stay.

✨ Why it matters

Rewarding guests who already spend boosts loyalty and repeat bookings — without generic discounts. The offer feels personal, relevant, and exclusive.
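As an illustration of the Function node, the following sketch sums the optional amenity spend and applies the $50 VIP threshold. The Salesforce field names on Guest__c (e.g., Room_Service__c) are assumptions for illustration, not the template's actual schema.

```javascript
// Minimal sketch of the Function node: sum optional amenity spend and
// flag VIP guests (total >= $50). Guest__c field names are assumed.
const g = $json;

const amenities = {
  roomService: Number(g.Room_Service__c) || 0,
  minibar: Number(g.Minibar__c) || 0,
  laundry: Number(g.Laundry__c) || 0,
  lateCheckout: Number(g.Late_Checkout__c) || 0,
  extraBed: Number(g.Extra_Bed__c) || 0,
  airportTransfer: Number(g.Airport_Transfer__c) || 0,
};

const totalSpend = Object.values(amenities).reduce((sum, v) => sum + v, 0);
// Unused services feed the AI step, which picks one at random for the offer.
const unusedServices = Object.keys(amenities).filter(k => amenities[k] === 0);

return [{ json: { ...g, totalSpend, isVip: totalSpend >= 50, unusedServices } }];
```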
by Itamar
🧠 ICP Scoring Agent (n8n + Explorium + LLM)

This workflow automates Ideal Customer Profile (ICP) scoring for any company using a combination of Explorium data and an LLM-driven evaluation framework.

🔧 How It Works

1. Input: A company name is submitted via form.
2. Data Enrichment: Explorium's MCP Server is used to fetch firmographic, hiring, and tech data about the company.
3. Scoring Logic: An AI agent (LLM) applies a 3-pillar framework to assess and score the company.
4. Output: A structured JSON or Google Doc summary is generated using the AgentGeeks formatter.

📊 Scoring System (100 points total)

| Pillar | Max Points |
|------------------------------|------------|
| Strategic Fit | 40 |
| AI / Tech Readiness | 40 |
| Engagement & Reachability | 20 |

🧠 Scoring Criteria

- **Strategic Fit**: Industry, size, use case, buyer roles
- **Tech Readiness**: AI maturity, hiring trends, stack visibility
- **Reachability**: Geography, contactability, data quality

🎯 Verdict Scale

- 🟩 90–100: Ideal ICP
- ✅ 70–89: Good Fit
- 🟨 40–69: Medium Fit
- ❌ < 40: Poor Fit

A sketch of this score-to-verdict mapping appears at the end of this description.

📦 Workflow Components

- **Trigger**: Form submission via webhook
- **MCP Client**: Pulls enriched company data via Explorium's MCP API
- **AI Agent**: Uses Anthropic Claude (or another LLM) to calculate scores
- **Output**: Results are posted to a structured endpoint (e.g., Google Doc or JSON API)

🧰 Dependencies

- n8n (self-hosted or cloud)
- Explorium MCP credentials and access
- LLM API (e.g., Anthropic Claude, OpenAI, etc.)
- Optional: AgentGeeks formatter or a similar doc generator

💼 Use Case

This ICP scoring system is designed for GTM and sales teams to:
- Automate lead prioritization
- Qualify accounts before outbounding
- Sync ICP data into CRMs, routing systems, or reporting layers

📈 Example Output in Google Doc

{
  "company": "Acme Inc.",
  "score": 87,
  "verdict": "Good Fit",
  "pillars": {
    "strategic_fit": 35,
    "tech_readiness": 37,
    "reachability": 15
  },
  "summary": "Acme Inc. is a mid-sized SaaS company with strong AI hiring activity and a buyer profile aligned to enterprise IT. Moderate reachability via firmographic signals."
}
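For illustration, the verdict scale could be applied with a small Code-node step like the one below. The field names mirror the example output above; the function itself is a hypothetical addition, not part of the template.

```javascript
// Sketch: map a 0–100 ICP score to the verdict scale above.
// Field names (score, verdict, pillars) mirror the example output.
function verdictFor(score) {
  if (score >= 90) return 'Ideal ICP';
  if (score >= 70) return 'Good Fit';
  if (score >= 40) return 'Medium Fit';
  return 'Poor Fit';
}

const pillars = $json.pillars ?? { strategic_fit: 0, tech_readiness: 0, reachability: 0 };
const score = pillars.strategic_fit + pillars.tech_readiness + pillars.reachability;

return [{ json: { ...$json, score, verdict: verdictFor(score) } }];
```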
by Don Jayamaha Jr
Get real-time cryptocurrency prices directly in Telegram! This workflow integrates the CoinMarketCap API with Telegram, allowing users to request live crypto prices simply by sending a message to the bot. Ideal for crypto traders, analysts, and enthusiasts who need quick and easy access to market data.

How It Works

1. A Telegram bot listens for user input (e.g., "BTC" for Bitcoin).
2. The workflow sends a request to the CoinMarketCap API to fetch the latest price (see the sketch below).
3. The response is processed using an AI-powered language model (GPT-4o-mini) for structured messaging.
4. The workflow logs session data using a memory buffer for better response tracking.
5. The latest price is sent back to the user via Telegram.

Set Up Steps

1. Create a Telegram Bot – Use @BotFather on Telegram to create a bot and obtain an API token.
2. Get a CoinMarketCap API Key – Sign up at CoinMarketCap and retrieve your API key.
3. Configure API Credentials in n8n – Add the CoinMarketCap API key under HTTP Header Auth, and add your Telegram bot token under Telegram API credentials.
4. Deploy and Test – Send a message (e.g., "BTC") to your Telegram bot and receive live price updates instantly!

Automate your crypto price tracking today with this powerful Telegram bot!
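For orientation, here is a minimal JavaScript sketch of the price lookup, assuming CoinMarketCap's standard /v1/cryptocurrency/quotes/latest endpoint, which the HTTP Header Auth credential would authenticate. The Telegram message path and response parsing are illustrative assumptions.

```javascript
// Minimal sketch: fetch the latest USD quote for a symbol typed in Telegram.
const CMC_API_KEY = 'YOUR_COINMARKETCAP_API_KEY';      // stored as HTTP Header Auth in n8n
const symbol = ($json.message?.text || 'BTC').trim().toUpperCase(); // assumed Telegram payload path

const res = await fetch(
  `https://pro-api.coinmarketcap.com/v1/cryptocurrency/quotes/latest?symbol=${symbol}&convert=USD`,
  { headers: { 'X-CMC_PRO_API_KEY': CMC_API_KEY } }
);
const body = await res.json();
const price = body.data?.[symbol]?.quote?.USD?.price;

return [{
  json: {
    symbol,
    price,
    text: `${symbol}: $${price?.toFixed(2) ?? 'unavailable'}`, // message sent back to the user
  },
}];
```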