by Jimleuk
This n8n workflow demonstrates a simple approach to improving chat UX by staggering an AI Agent's reply for users who send a sequence of partial messages in short bursts.

**How it works**
- A Twilio webhook receives the user's messages, which are recorded in a message stack powered by Redis.
- The execution pauses for 5 seconds and then checks the message stack again for the latest message. This check tells us whether the user is still sending more messages or is waiting for a reply (a sketch of the check follows this description).
- The execution is aborted if the latest message on the stack differs from the incoming message, and continues if they are the same. In the latter case, the agent receives all messages buffered up to that point and responds to them in a single reply.

**Requirements**
- A Twilio account and an SMS-enabled phone number to receive messages.
- A Redis instance for the message stack.
- An OpenAI account for the language model.

**Customising the workflow**
- This approach should also work for other common messaging platforms such as WhatsApp and Telegram.
- 5 seconds too long or too short? Adjust the wait threshold to suit your customers.
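Below is a minimal Python sketch of the debounce check described above. It assumes a Redis list keyed per sender and a fixed 5-second wait; the key naming scheme, Redis host, and helper names are illustrative rather than taken from the workflow itself.

```python
import time
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def should_reply(sender: str, incoming_message: str, wait_seconds: int = 5) -> bool:
    """Push the incoming message onto the sender's stack, wait, then check
    whether it is still the newest message. If a newer message arrived while
    waiting, abort and let the later execution handle the reply."""
    key = f"message_stack:{sender}"          # hypothetical key naming scheme
    r.rpush(key, incoming_message)
    time.sleep(wait_seconds)
    latest = r.lindex(key, -1)               # newest message on the stack
    return latest == incoming_message

def buffered_messages(sender: str) -> list[str]:
    """All messages accumulated so far, to be answered in one agent reply."""
    return r.lrange(f"message_stack:{sender}", 0, -1)
```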
by Leonardo Grigorio
YouTube Video

This n8n workflow is designed to assist YouTube content creators in identifying trending topics within a specific niche. By leveraging YouTube's search and data APIs, it gathers and analyzes video performance metrics from the past two days to provide insights into what content is gaining traction.

**How the workflow operates**
- **Trigger Setup**: The workflow begins when a user sends a query through the chat_message_received node. If no niche is provided, the AI prompts the user to select or input one.
- **AI Agent (Language Model)**: The central node uses a GPT-based AI agent to understand the user's niche or content preferences, generate tailored search terms related to the niche, and process the YouTube API responses, summarizing trends such as common themes, tags, and audience engagement metrics (views, likes, and comments).
- **YouTube Search**: The youtube_search node runs a secondary workflow to query YouTube for relevant videos published within the last two days. It retrieves basic video data such as video IDs, relevance scores, and publication dates.
- **Video Details Retrieval**: The workflow fetches additional details for each video: the video snippet (metadata like title, description, and tags), video statistics (metrics such as views, likes, and comments), and content details (video duration, ensuring only content longer than 3 minutes and 30 seconds is analyzed). A sketch of these API calls follows this description.
- **Data Processing**: Video metadata is cleaned, sanitized, and stored in memory. Tags, titles, and descriptions are analyzed to identify patterns and trends across multiple videos.
- **Output**: The workflow compiles the insights and presents them to the user, highlighting the most common themes or patterns within the niche, URLs to trending videos and their respective channels, and engagement statistics that help the user gauge the popularity of the content.

**Key notes for setup**
- **API Keys**: Ensure valid YouTube API credentials are configured in the get_videos, find_video_snippet, find_video_statistics, and find_video_data nodes.
- **Memory Buffer**: The window_buffer_memory node ensures the AI agent retains context during analysis, enhancing the quality of the generated insights.
- **Search Term Customization**: The AI agent dynamically creates search terms based on the user's niche to improve search precision.

**Use case**
This workflow is ideal for YouTubers or marketers seeking data-driven inspiration for creating content that aligns with current trends, maximizing the potential to engage their audience.

**Example output** (for the niche "digital marketing")
- Trending topic: videos about "mental triggers" and "psychological marketing."
- Tags: "SEO," "Conversion Rates," "Social Proof."
- Engagement: videos with over 200K views and high like/comment ratios are leading the trend.
- Video links: https://www.youtube.com/watch?v=video_id1 and https://www.youtube.com/watch?v=video_id2
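For orientation, here is a minimal Python sketch of the search and details calls described above, using the official google-api-python-client. The query string, API key placeholder, result limit, and variable names are illustrative assumptions; only the two-day window and the 3-minutes-30-seconds cutoff come from the description.

```python
import re
from datetime import datetime, timedelta, timezone
from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")  # placeholder key

# Search for videos published within the last two days for one generated search term.
published_after = (datetime.now(timezone.utc) - timedelta(days=2)).strftime("%Y-%m-%dT%H:%M:%SZ")
search = youtube.search().list(
    q="digital marketing",            # search term generated by the AI agent
    part="id",
    type="video",
    publishedAfter=published_after,
    maxResults=25,
).execute()
video_ids = [item["id"]["videoId"] for item in search["items"]]

# Fetch snippet, statistics, and duration for all found videos in one call.
details = youtube.videos().list(
    part="snippet,statistics,contentDetails",
    id=",".join(video_ids),
).execute()

def iso8601_duration_seconds(duration: str) -> int:
    """Convert an ISO 8601 duration such as 'PT4M13S' to seconds."""
    h, m, s = re.match(r"PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?", duration).groups()
    return int(h or 0) * 3600 + int(m or 0) * 60 + int(s or 0)

# Keep only content longer than 3 minutes and 30 seconds.
trending = [
    v for v in details["items"]
    if iso8601_duration_seconds(v["contentDetails"]["duration"]) > 210
]
```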
by Federico De Ponte
Loop & Optimize Meta Tags with Google Gemini

This workflow automates the shortening of meta titles and descriptions for SEO, directly from your Google Sheet, row by row, using Google Gemini.

**What it does**
- Reads rows from a Google Sheet (meta_title, meta_description, row_index)
- Loops through each row and checks whether content exists
- Sends the data to Google Gemini for length-optimized output (see the sketch below)
- Cleans and parses the response
- Updates the original sheet with the shortened results

**Setup Requirements**
- Google Sheets (OAuth2 credentials connected in n8n)
- Google Gemini API key (configured in n8n credentials)
- The sheet must contain: row_index, meta_title, meta_description
- Output will be written into: meta_titleFixed, meta_descriptionFixed
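A rough Python sketch of the Gemini call at the heart of the loop, using the google-generativeai client. The model name, character limits, prompt wording, and JSON cleanup are assumptions for illustration, not values taken from the workflow.

```python
import json
import google.generativeai as genai

genai.configure(api_key="YOUR_GEMINI_API_KEY")      # placeholder
model = genai.GenerativeModel("gemini-1.5-flash")   # assumed model choice

def shorten_meta(meta_title: str, meta_description: str) -> dict:
    """Ask Gemini for length-optimized meta tags and parse the JSON reply."""
    prompt = (
        "Shorten the following meta tags for SEO. Keep the title under 60 characters "
        "and the description under 155 characters. Reply with JSON only, using the "
        'keys "meta_titleFixed" and "meta_descriptionFixed".\n\n'
        f"Title: {meta_title}\nDescription: {meta_description}"
    )
    response = model.generate_content(prompt)
    # Strip a possible markdown code fence before parsing the JSON.
    cleaned = response.text.strip().removeprefix("```json").removesuffix("```").strip()
    return json.loads(cleaned)
```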
by David Olusola
AI-Powered Lead Enrichment with Explorium MCP & Telegram

**Who it's for**
Sales reps, agencies, and growth teams who want to turn basic company info into qualified leads with automated research. Perfect for B2B prospecting.

**What it does**
This workflow lets you send a company name or domain via Telegram and instantly returns:
- An enriched company profile (industry, size, tech, pain points)
- A clean, structured JSON, ready for your CRM or sales tools

**How it works**
1. Send company info to your Telegram bot
2. The workflow pulls data from Explorium MCP + Tavily
3. AI analyzes the company's model, tools, pain points, and goals
4. The JSON response is sent back via Telegram or logged to your database

**Requirements**
- OpenAI API (GPT-4)
- Explorium MCP API
- Tavily Web Search API
- Telegram Bot API
- PostgreSQL (for memory/logging)

**How to set up**
1. Add API keys in n8n
2. Connect your Telegram bot to the webhook
3. Set up PostgreSQL for memory persistence
4. Customize prompts (tone, niche, etc.)
5. Test by sending a company name via Telegram

**Customization Options**
- Focus enrichment on specific industries or keywords
- Adjust the email sequence structure & style
- Add extra data sources (e.g. Clearbit, Crunchbase)
- Format the JSON to match your CRM schema
- Add an approval step before sending emails

**Highlights**
- Uses multi-source enrichment
- Works 100% from Telegram
- Integrates into any sales pipeline
by Matt Chong
**Who is this for?**
This workflow is ideal for freelancers, business owners, and finance teams who receive receipts via Gmail and want expenses logged automatically for tax, bookkeeping, and year-end audits.

**What problem is this workflow solving?**
When tax season hits, missing receipts create panic. This workflow keeps everything in one place. It uses AI to extract details from Gmail attachments, logs them in a Google Sheet, and stores the PDFs in Google Drive. No digging. No copying. Just everything where it should be.

**How it works**
1. Apply the label receipt to any incoming Gmail email. Do not mark it as read.
2. On a schedule (e.g. daily at 8:00 AM), the workflow triggers.
3. It searches for unread emails with the label receipt.
4. For each matching email, it downloads the attached receipt file.
5. It extracts text content from the receipt file.
6. It uploads the original receipt file to a specified folder in Google Drive.
7. It merges the extracted text with email metadata.
8. It sends this combined data to OpenAI, which extracts structured fields: date, merchant, category, description, subtotal, tax, total (see the extraction sketch below).
9. The extracted data is appended as a new row in the specified Google Sheet.
10. Finally, the email is marked as read so it is not processed again.

**How to set up**
1. Connect these services in your n8n credentials: Gmail (OAuth2), Google Drive, Google Sheets, OpenAI.
2. Configure the Google Drive upload: in the "Upload File" node, select the target folder where you want receipt PDFs stored.
3. Set your execution schedule: open the "Schedule Trigger" node and choose when it should run (default is once daily at 8:00 AM).
4. Choose your Google Sheet and tab: in the "Append to Google Sheet" node, select your document and tab. Ensure the sheet contains these columns: Date, Merchant, Category, Description, Subtotal, Tax, Total.

**How to customize this workflow to your needs**
- **Change the Gmail label or search filter** to match your needs.
- **Modify the OpenAI schema** to extract additional fields like currency, project, or notes.
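A minimal Python sketch of the OpenAI extraction step, assuming the OpenAI Python SDK and JSON-mode output. The model name and prompt wording are illustrative; the field names match the sheet columns listed above.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_receipt_fields(receipt_text: str, email_metadata: str) -> dict:
    """Ask the model for the structured fields appended to the Google Sheet."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",                       # assumed model choice
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": (
                    "Extract receipt details and reply with JSON containing exactly "
                    "these keys: date, merchant, category, description, subtotal, tax, total."
                ),
            },
            {"role": "user", "content": f"{email_metadata}\n\n{receipt_text}"},
        ],
    )
    return json.loads(response.choices[0].message.content)

# The returned dict maps one-to-one onto the sheet columns:
# Date, Merchant, Category, Description, Subtotal, Tax, Total.
```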
by Mihai Farcas
**How it works**
1. The workflow starts by sending a request to a website to retrieve its HTML content.
2. It then parses the HTML, extracting the relevant information.
3. The extracted data is stored and converted into a CSV file (a sketch of these steps follows below).
4. The CSV file is attached to an email and sent to your specified address.
5. The data is simultaneously saved to both Google Sheets and Microsoft Excel for further analysis or use.

**Set-up steps**
1. Change the website to scrape in the "Fetch website content" node.
2. Configure Microsoft Azure credentials with Microsoft Graph permissions (required for the Save to Microsoft Excel 365 node).
3. Configure Google Cloud credentials with access to the Google Drive, Google Sheets, and Gmail APIs (the latter is required for the Send CSV via e-mail node).
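For orientation, here is a rough Python equivalent of the fetch, parse, and CSV steps. The URL, CSS selectors, and column names are placeholders; the actual workflow does this with the HTTP Request and HTML extraction nodes.

```python
import csv
import requests
from bs4 import BeautifulSoup

# Fetch the website's HTML (equivalent to the "Fetch website content" node).
html = requests.get("https://example.com/listings", timeout=30).text

# Parse the HTML and pull out the relevant fields; selectors are hypothetical.
soup = BeautifulSoup(html, "html.parser")
rows = [
    {
        "title": item.select_one(".title").get_text(strip=True),
        "price": item.select_one(".price").get_text(strip=True),
    }
    for item in soup.select(".listing")
]

# Convert the extracted data into a CSV file ready to attach to an email.
with open("extracted_data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "price"])
    writer.writeheader()
    writer.writerows(rows)
```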
by Nicolas Chourrout
This workflow automatically generates draft replies in Gmail. It's designed for anyone who manages a high volume of email or often faces writer's block when crafting responses. Since it doesn't send the generated message directly, you're still in charge of editing and approving emails before they go out.

**How It Works**
- **Email Trigger**: activates when new emails reach the Gmail inbox.
- **Assessment**: uses OpenAI gpt-4o and a JSON parser to determine whether a response is necessary.
- **Reply Generation**: crafts a reply with OpenAI GPT-4 Turbo.
- **Draft Integration**: after converting the text to HTML, it places the draft into the Gmail thread as a reply to the first message (see the sketch below).

**Set Up Overview (~10 minutes)**
1. OAuth Configuration (follow the n8n instructions here): Set up Google OAuth in the Google Cloud console. Make sure to add the Gmail API with the modify scope. Add the Google OAuth credentials in n8n, and add the n8n redirect URI to the Google Cloud Console consent screen settings.
2. OpenAI Configuration: add your OpenAI API key in the credentials.
3. Tweaking the prompt: edit the system prompt in the "Generate email reply" node to suit your needs.

**Detailed Walkthrough**
Check out this blog post where I go into more details on how I built this workflow. Reach out to me here if you need help building automations for your business.
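A minimal Python sketch of the draft-integration step, assuming Gmail API OAuth credentials with the modify scope stored in a token file. The file name, helper name, and addressing details are illustrative; the workflow itself uses n8n's Gmail node for this.

```python
import base64
from email.mime.text import MIMEText
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/gmail.modify"]
)
service = build("gmail", "v1", credentials=creds)

def create_reply_draft(thread_id: str, to_addr: str, subject: str,
                       html_body: str, original_message_id: str) -> dict:
    """Place an HTML draft into an existing Gmail thread as a reply."""
    msg = MIMEText(html_body, "html")
    msg["To"] = to_addr
    msg["Subject"] = subject if subject.startswith("Re:") else f"Re: {subject}"
    msg["In-Reply-To"] = original_message_id   # RFC 2822 Message-ID of the first message
    msg["References"] = original_message_id
    raw = base64.urlsafe_b64encode(msg.as_bytes()).decode()
    return service.users().drafts().create(
        userId="me",
        body={"message": {"raw": raw, "threadId": thread_id}},
    ).execute()
```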
by Yaron Been
Automated Job Hunter: Upwork Opportunity Aggregator & AI-Powered Notifier

**Workflow Overview**
This n8n automation is a job discovery and notification tool designed to turn freelance job hunting into a seamless, intelligent process. By connecting Apify, OpenAI, Google Sheets, and Gmail, this workflow:

- **Discovers Job Opportunities**: automatically scrapes Upwork job listings, tracks recent freelance opportunities, and eliminates manual job searching.
- **Processes Data Intelligently**: filters and extracts key job details, structures the job information, and ensures comprehensive opportunity tracking.
- **Summarizes with AI**: generates concise job summaries, creates human-readable job digests, and provides quick, actionable insights.
- **Notifies Seamlessly**: automatically logs jobs to Google Sheets and sends personalized email digests for rapid opportunity assessment.

**Key Benefits**
- Full automation: zero-touch job discovery
- Smart filtering: targeted job opportunities
- Comprehensive tracking: detailed job market insights
- Multi-platform synchronization: seamless data flow

**Workflow Architecture**
- **Stage 1: Job Discovery**: **Scheduled Trigger** (daily job scanning), **Apify Integration** (Upwork job scraping), **Intelligent Filtering** (recent job postings, specific keywords, relevant opportunities).
- **Stage 2: Data Extraction**: comprehensive job metadata parsing, key information retrieval, structured data preparation.
- **Stage 3: AI Summarization**: OpenAI GPT processing, professional summary generation, contextual job insight creation.
- **Stage 4: Multi-Platform Distribution**: Google Sheets logging, Gmail integration, automated job digest delivery (a Sheets-append sketch follows this description).

**Potential Use Cases**
- **Freelancers**: opportunity tracking
- **Job Seekers**: automated job discovery
- **Recruitment Agencies**: market intelligence
- **Skill Development Professionals**: trend monitoring
- **Career Coaches**: client opportunity identification

**Setup Requirements**
- Apify: Upwork scraping actor, API token, configured scraping parameters
- OpenAI API: GPT model access, summarization configuration, API key management
- Google Sheets: connected Google account, prepared job tracking spreadsheet, appropriate sharing settings
- Gmail account: connected email, job digest configuration, appropriate sending permissions
- n8n installation: cloud or self-hosted instance, workflow configuration, API credential management

**Future Enhancement Suggestions**
- Advanced job matching algorithms
- Multi-platform job aggregation
- Customizable alert mechanisms
- Expanded job category tracking
- Machine learning job recommendations

**Technical Considerations**
- Implement robust error handling
- Use secure API authentication
- Maintain flexible data processing
- Ensure compliance with platform guidelines

**Ethical Guidelines**
- Respect job poster privacy
- Use data for legitimate job searching
- Maintain transparent information gathering
- Provide proper attribution

**Hashtag Performance Boost**
#FreelanceJobHunting #CareerAutomation #JobDiscovery #AIJobSearch #WorkflowAutomation #FreelanceTech #CareerIntelligence #JobMarketInsights #ProfessionalNetworking #TechJobSearch

**Workflow Visualization**
[Daily Trigger] → [Fetch Upwork Jobs] → [Format Job Fields] → [Log to Google Sheets] → [AI Summarization] → [Send Email Digest]

**Connect With Me**
Ready to revolutionize your job hunting strategy?
Email: Yaron@nofluff.online
YouTube: @YaronBeen
LinkedIn: Yaron Been

Transform your job search with intelligent, automated workflows!
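A small Python sketch of the Google Sheets logging stage, using the official Sheets API client. The token file, spreadsheet ID, tab name, and column layout are placeholders; the workflow itself performs this step with n8n's Google Sheets node.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/spreadsheets"]
)
sheets = build("sheets", "v4", credentials=creds)

def log_job(spreadsheet_id: str, job: dict) -> None:
    """Append one scraped Upwork job as a row in the tracking spreadsheet."""
    sheets.spreadsheets().values().append(
        spreadsheetId=spreadsheet_id,
        range="Jobs!A:E",                      # hypothetical tab and columns
        valueInputOption="USER_ENTERED",
        insertDataOption="INSERT_ROWS",
        body={"values": [[
            job.get("title"),
            job.get("url"),
            job.get("budget"),
            job.get("postedAt"),
            job.get("summary"),                # AI-generated summary
        ]]},
    ).execute()
```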
by Daniel Shashko
This workflow automates the daily monitoring of how an AI model (like ChatGPT) responds to specific queries relevant to your market. It identifies mentions of your brand and predefined competitors, logs detailed interactions in Google Sheets, and delivers a comprehensive email report.

**Main Use Cases**
- Monitor how your brand is mentioned by AI in response to relevant user queries.
- Track mentions of key competitors to understand the AI's comparative positioning.
- Gain insights into the AI's current knowledge and portrayal of your brand and market landscape.
- Automate daily intelligence gathering on AI-driven brand perception.

**How it works**
The workflow operates as a scheduled process, organized into these stages:

1. **Configuration & Scheduling**: Triggers daily (or can be run manually). Key variables are defined within the workflow: your brand name (e.g., "YourBrandName"), a list of queries to ask the AI, and a list of competitor names to track in responses.
2. **AI Querying**: For each predefined query, the workflow sends a request to the OpenAI ChatGPT API (via an HTTP Request node).
3. **Response Analysis**: Each AI response is processed by a Code node to check whether your brand name is mentioned (case-insensitive), identify any of the listed competitors mentioned (case-insensitive), and extract the core AI response content (limited to 500 characters for brevity in logs and reports). A sketch of this logic follows this description.
4. **Data Logging to Google Sheets**: Detailed results for each query, including timestamp, date, the query itself, query index, your brand name, the AI's response, whether your brand was mentioned, and any errors, are appended to a specified Google Sheet.
5. **Email Report Generation**: A comprehensive HTML email report is compiled. It summarizes the total queries processed, the number of times your brand was mentioned, total competitor mentions, and any errors encountered; lists each competitor and how many times it was mentioned; and includes a detailed table showing, for each query, whether your brand was mentioned and which competitors (if any) appeared in the AI's response.
6. **Automated Reporting**: The generated HTML email report is sent to the specified recipients, providing a daily snapshot of AI interactions.

**Summary Flow**
Schedule/Workflow Trigger → Initialize Brand, Queries, Competitors (in a Code node) → For each query: query the ChatGPT API → Process the AI response (check for brand and competitor mentions) → Log results to Google Sheets → Generate a consolidated HTML email report → Send the email notification

**Benefits**
- Fully automated daily monitoring of AI responses concerning your brand and competitors.
- Objective insights into how AI models represent your brand in user interactions.
- Actionable competitive intelligence from tracking competitor mentions.
- Centralized logging in Google Sheets for historical analysis and trend spotting.
- Easily customizable with your specific brand, queries, competitor list, and reporting recipients.
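A minimal Python sketch of the response-analysis step described in stage 3. Field names are illustrative; the workflow implements the equivalent logic inside an n8n Code node.

```python
def analyze_response(ai_response: str, brand: str, competitors: list[str]) -> dict:
    """Check one AI answer for brand and competitor mentions (case-insensitive)."""
    text = ai_response.lower()
    return {
        "brand_mentioned": brand.lower() in text,
        "competitors_mentioned": [c for c in competitors if c.lower() in text],
        "response_excerpt": ai_response[:500],   # trimmed for logs and reports
    }

# Example:
# analyze_response("Acme and Globex both offer...", "Acme", ["Globex", "Initech"])
# -> {"brand_mentioned": True, "competitors_mentioned": ["Globex"], "response_excerpt": "..."}
```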
by Sidetool
Hello there! This is a supporting workflow for an Airtable Base that handles recurring tasks. The objective of the workflow is to create tasks on a recurring basis, depending on the Airtable setup. You can access that Airtable template for complete context on Airtable Universe. The functionality of the workflow can be easily adapted to any data source. Feel free to contact us with any doubts or questions at http://sidetool.co. Use this as is, or adapt it to your existing Airtable Base, and embrace automated simplicity!
by Nasser
**For Who?**
- Content creators
- YouTube automation
- Marketing teams

**How it works**
1. Enter the ID of the YouTube channel to trigger the workflow when a new video is posted.
2. Apify scrapes the latest video from the channel.
3. The workflow waits until the dataset is completed in Apify, then fetches it (see the polling sketch below).
4. It checks whether metadata has already been generated; if not, it generates it with an LLM.
5. It formats all the generated data and updates the YouTube video.

YouTube Video Tutorial:

**SETUP**
- Setup Input (YouTube Channel): Go to the channel's page on YouTube and look at the URL of the page. The channel ID is the value that comes after channel/ in the URL. Add it after "?channel_id=". You can also use free tools to retrieve the channel ID.
- Setup Output (YouTube Video Update): Connect your YouTube account to your n8n instance via the Google Cloud Console. You can find tutorials by searching "youtube api OAuth" on Google.
- APIs: For the following third-party integrations, replace ==[YOUR_API_TOKEN]== with your API token, or connect your account via Client ID / Secret to your n8n instance:
  - Apify: https://docs.apify.com/api/v2/getting-started
  - YouTube: https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.youtube/?utm_source=n8n_app&utm_medium=node_settings_modal-credential_link&utm_campaign=n8n-nodes-base.youTube#templates-and-examples

More Workflows: https://n8n.io/creators/nasser/
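A rough Python sketch of step 3, polling an Apify run until its dataset is ready. The endpoints follow the public Apify API v2; the token placeholder and helper names are illustrative, and the workflow itself achieves this with Wait and HTTP Request nodes.

```python
import time
import requests

APIFY_TOKEN = "[YOUR_API_TOKEN]"   # placeholder, as in the template

def wait_for_run(run_id: str, poll_seconds: int = 15) -> str:
    """Poll an Apify actor run until it succeeds, then return its dataset ID."""
    while True:
        run = requests.get(
            f"https://api.apify.com/v2/actor-runs/{run_id}",
            params={"token": APIFY_TOKEN},
            timeout=30,
        ).json()["data"]
        if run["status"] == "SUCCEEDED":
            return run["defaultDatasetId"]
        if run["status"] in ("FAILED", "ABORTED", "TIMED-OUT"):
            raise RuntimeError(f"Apify run ended with status {run['status']}")
        time.sleep(poll_seconds)

def dataset_items(dataset_id: str) -> list[dict]:
    """Fetch the scraped video records once the run has finished."""
    return requests.get(
        f"https://api.apify.com/v2/datasets/{dataset_id}/items",
        params={"token": APIFY_TOKEN, "format": "json"},
        timeout=30,
    ).json()
```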
by Adam Janes
**How it works**
1. The workflow loads a list of test cases from a Google Sheet (previous results stored from an LLM).
2. For each test case, we execute a call to an LLM judge in parallel (using HTTP Request + Webhook nodes).
3. The judge uses the Input, Output, and Reference Answer fields from the spreadsheet to mark each LLM response as Pass/Fail (a sketch of such a judge call follows below).
4. The results are logged into a separate sheet in the same Sheets file.

**Set up steps**
1. Add your credentials for Google Sheets and OpenRouter (or replace the OpenRouter node with your favourite chat model).
2. Make a copy of the example Sheet and populate it with your own test data.
3. Run the workflow with the Execute Workflow button next to the Manual Trigger node.
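For illustration, a minimal Python sketch of a single judge call through OpenRouter's OpenAI-compatible endpoint. The model name, prompt wording, and the exact Pass/Fail convention are assumptions; the workflow makes the equivalent call from an HTTP Request node.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",     # placeholder
)

def judge(input_text: str, output_text: str, reference_answer: str) -> str:
    """Return 'Pass' or 'Fail' for one test case from the spreadsheet."""
    response = client.chat.completions.create(
        model="openai/gpt-4o",             # any OpenRouter chat model works here
        messages=[
            {
                "role": "system",
                "content": (
                    "You grade an LLM answer against a reference answer. "
                    "Reply with exactly one word: Pass or Fail."
                ),
            },
            {
                "role": "user",
                "content": (
                    f"Input:\n{input_text}\n\n"
                    f"Model output:\n{output_text}\n\n"
                    f"Reference answer:\n{reference_answer}"
                ),
            },
        ],
    )
    return response.choices[0].message.content.strip()
```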