by 長谷 真宏
Who is this for?

This template is perfect for agencies, consultancies, freelancers, and project-based teams who want to eliminate repetitive onboarding tasks. If you're tired of manually creating folders, Slack channels, and project pages every time a new client signs a contract, this automation will save you hours.

What this workflow does

When a new contract PDF is uploaded to a designated Google Drive folder, this workflow automatically:

1. Parses the filename to extract client name, project name, and contact email
2. Creates a project folder structure in Google Drive with organized subfolders
3. Creates a dedicated Slack channel for project communication
4. Sets up a Notion project page with initial kickoff tasks
5. Logs project details to a master Google Sheet for tracking
6. Drafts a personalized welcome email using OpenAI GPT-4o-mini
7. Notifies your team on Slack with all relevant links when complete

Setup steps

Time required: ~15 minutes

1. Configure OAuth credentials for Google Drive, Gmail, Google Sheets, Slack, and Notion
2. Add your OpenAI API key for AI-powered email drafting
3. Update the "Set Config Variables" node with your specific IDs:
   - Google Drive parent folder ID
   - Notion database ID
   - Google Sheet ID
   - Slack notification channel ID
4. Set up the trigger folder in Google Drive where contracts will be uploaded
5. Prepare your Google Sheet with columns: Client, Project Code, Notion Link, Slack Channel, Drive Folder

Requirements

- Google Workspace account (Drive, Gmail, Sheets)
- Slack workspace with bot permissions to create channels
- Notion workspace with API integration
- OpenAI API key

File naming convention

Upload PDF files using this format: `ClientName_ProjectName_email@example.com.pdf`

Example: `AcmeCorp_WebsiteRedesign_john@acme.com.pdf`

How to customize

- **Add more subfolders:** Duplicate the "Create Deliverables Subfolder" node
- **Customize the email prompt:** Edit the "AI Draft Welcome Email" node
- **Add more Notion properties:** Extend the "Create Notion Project Page" node
- **Change notification format:** Modify the "Notify Team on Slack" message
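The naming convention above can be parsed with a small snippet like the one an n8n Code node might run. This is an illustrative sketch, not the template's exact node; it assumes emails contain no underscores, per the convention:

```javascript
// Sketch: parse "ClientName_ProjectName_email@example.com.pdf" into
// the fields the workflow needs (hypothetical helper).
function parseContractFilename(filename) {
  const base = filename.replace(/\.pdf$/i, "");
  const parts = base.split("_");
  if (parts.length < 3) {
    throw new Error(`Unexpected filename format: ${filename}`);
  }
  // Everything after the second underscore is treated as the email,
  // since the convention reserves underscores as separators.
  const [client, project, ...rest] = parts;
  return { client, project, email: rest.join("_") };
}
```

If your clients' names themselves contain underscores, switch the convention to a different separator before relying on a split like this.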
by May Ramati Kroitero
Automated Job Hunt with Tavily — Setup & Run Guide

What this template does

Automatically searches for recent job postings (for example, "Software Engineering Intern"), extracts structured details from each posting using an AI agent + Tavily, bundles the results, and emails a single weekly digest.

Estimated setup time: ~30 minutes

1. Required credentials

Before you import or run the workflow, create/configure these credentials in your n8n instance:

- **OpenAI (Chat model)** — used by the OpenAI Chat Model and Message a model nodes. Add an OpenAI credential (name it e.g. "OpenAi account") and paste your OpenAI API key.
- **Tavily API** — used by the Search in Tavily node. Add a Tavily credential (name it e.g. "Tavily account") and add your Tavily API key.
- **Gmail (OAuth2)** — used by the Send a message node to deliver the digest email. Configure a Gmail OAuth2 credential and select it for the Gmail node (e.g. "Gmail account").

2. Node-by-node configuration (what to check/change)

**Schedule Trigger**
- Node name: Schedule Trigger
- Configure the interval: daily or weekly (example: weekly, trigger at 08:00).
- Note: This is the workflow start. Adjust to your preferred cadence.

**AI Agent**
- Node name: AI Agent
- Important: First step — set the agent's prompt / system message.

**Search in Tavily (Tavily Tool node)**
- Node name: Tavily
- Query: user-editable field (example default: "Roles posted this week for Software Engineering")
- Advice: keep the query under 400 chars; change it to target role/location keywords.
- Recommended options:
  - Search Depth: advanced (optional, better extraction)
  - Max Results: 15
  - Time Range: week (limit to the past week)
  - Include Raw Content: true (fetch full page content for better extraction)
  - Include Domains: indeed.com, glassdoor.com, linkedin.com — prioritize trusted sources

**Edit Fields / Set (bundle)**
- Node name: Edit Fields (Set)
- Purpose: Collect the agent output into one field (e.g., $json.output or Response) for downstream processing.
**Message a model (OpenAI formatting step)**
- Node name: Message a model
- Uses OpenAI (the openAiApi credential). This node can reformat or normalize the agent output into consistent blocks if needed.
- Use the same system rules you used for the agent (the prompt/system message earlier). You can also leave this minimal if the agent already outputs structured blocks.

**Code node (parsing & structuring)**
- Node name: Code
- Purpose: Split the agent/LLM text into separate job postings and extract fields with regex.

**Aggregate**
- Node name: Aggregate
- Mode: aggregateAllItemData (combines all parsed postings into a single data array so the Gmail node can loop over them)

**Gmail node (Send a message)**
- Node name: Send a message
- sendTo: set to your recipient(s) (e.g., your inbox)
- subject: e.g. "New Jobs for this week!"
- emailType: text (or html if you build HTML content)
- message (body): use an expression that loops through data and formats every posting.

3. How to test (quick steps)

1. Set credentials in n8n (OpenAI, Tavily, Gmail).
2. Run the Schedule Trigger manually (use "Execute Workflow" or trigger nodes manually).
3. Inspect the Search in Tavily node output — confirm it returns results.
4. Inspect the AI Agent and Message a model outputs — ensure formatted postings are produced and separated by --- END JOB POSTING ---.
5. Run the Code node — confirm it returns structured items with posting_number, job_title, requirements[], etc.
6. Check the Aggregate output: you should see a single item with a data array.
7. In the Gmail node, run a test send — confirm the email arrives as one combined message with all postings.

4. Troubleshooting tips

- Gmail body shows [Array: …]: Avoid dragging the array in raw — use an expression that maps data to formatted strings.
- Code node split error: Occurs when raw is undefined. Ensure the previous node returns message.content, or adjust the code to use $input.all() and join contents safely.
- Missing fields after parsing: Check that the LLM/agent output labels match the Code node's regex (e.g., "Job Title:"). If the labels differ, update the regex or the LLM formatting.

5. Customization ideas

- Filter by location or remote-only roles, or add keyword filters (seniority, stack).
- Send results to Google Sheets or Slack instead of, or in addition to, Gmail.
- Add an LLM summarization step to create a 1-line highlight per posting.
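The Code node's split-and-extract step and the Gmail body expression can be sketched as below. The field labels ("Job Title:", "Company:", "Requirements:") and the semicolon-separated requirements format are assumptions based on the delimiter and field names this guide mentions; match them to your actual agent prompt:

```javascript
// Sketch: split the LLM output on the delimiter and pull labeled
// fields with regex (adjust labels to your prompt's output format).
function parsePostings(raw) {
  return raw
    .split("--- END JOB POSTING ---")
    .map(block => block.trim())
    .filter(block => block.length > 0)
    .map((block, i) => {
      const grab = label => {
        const m = block.match(new RegExp(`${label}:\\s*(.+)`));
        return m ? m[1].trim() : null;
      };
      return {
        posting_number: i + 1,
        job_title: grab("Job Title"),
        company: grab("Company"),
        requirements: (grab("Requirements") || "")
          .split(";")
          .map(s => s.trim())
          .filter(Boolean),
      };
    });
}

// Sketch of the Gmail "message" expression: map the aggregated `data`
// array to one formatted string instead of dumping the raw array.
function formatDigest(data) {
  return data
    .map(job => [
      `#${job.posting_number} ${job.job_title} at ${job.company}`,
      `Requirements: ${job.requirements.join(", ")}`,
    ].join("\n"))
    .join("\n\n---\n\n");
}
```

In n8n, the `formatDigest` logic would live inside a `{{ ... }}` expression on the Gmail node's message field, operating on the Aggregate node's `data` array.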
by Marth
Automated Employee Recognition Bot with Slack + Google Sheets + Gmail

Description

Turn employee recognition into an automated system. This workflow celebrates great work instantly: it posts recognition messages on Slack, sends thank-you emails via Gmail, and updates your tracking sheet automatically. Your team feels appreciated. Your HR team saves hours. Everyone wins.

⚙️ How It Works

1. You add a new recognition in Google Sheets.
2. The bot automatically celebrates it in Slack.
3. The employee receives a thank-you email.
4. HR gets notified and the sheet updates itself.

🔧 Setup Steps

1️⃣ Prepare Your Google Sheet

Create a sheet called "Employee_Recognition_List" with these columns:

Name | Department | Reason | Date | Email | Status | EmailStatus

Then add one test row — for example, your own name — to see it work.

2️⃣ Connect Your Apps

Inside n8n:

- **Google Sheets:** Connect your Google account so the bot can read the sheet.
- **Slack:** Connect your Slack workspace to post messages in a channel (like #general).
- **Gmail:** Connect your Gmail account so the bot can send emails automatically.

3️⃣ (Optional) Add AI Personalization

If you want the messages to sound more natural, add an OpenAI node with this prompt:

> "Write a short, friendly recognition message for {{name}} from {{dept}} who was recognized for {{reason}}. Keep it under 2 sentences."

This makes your Slack and email messages feel human and genuine.

4️⃣ Turn It On

Once everything's connected:

1. Save your workflow
2. Set it to Active
3. Add a new row in your Google Sheet
4. The bot will instantly post on Slack and send a thank-you email 🎉
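If you skip the optional AI step, a simple templated message built from the sheet row's columns works too. A minimal sketch (the column names match the sheet described above; the wording is illustrative):

```javascript
// Sketch: non-AI fallback message built from a sheet row
// (Name, Department, Reason columns), usable for both the Slack
// post and the email body.
function recognitionMessage(row) {
  return `🎉 Shout-out to ${row.Name} from ${row.Department} — recognized for: ${row.Reason}. Thank you!`;
}
```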
by Bhuvanesh R
The competitive edge, delivered. This Customer Intelligence Engine simultaneously analyzes the web, Reddit, and X/Twitter to generate a professional, actionable executive briefing.

🎯 Problem Statement

Traditional market research for Customer Intelligence (CI) is manual, slow, and often relies on surface-level social media scraping or expensive external reports. Service companies, like HVAC providers, struggle to efficiently synthesize vast volumes of online feedback (Reddit discussions, real-time tweets, web articles) to accurately diagnose systemic service gaps (e.g., scheduling friction, poor automated systems). This inefficiency leads to delayed strategic responses and missed opportunities to invest in high-impact solutions like AI voice agents.

✨ Solution

This workflow deploys a sophisticated Multisource Intelligence Pipeline that runs on a scheduled or ad-hoc basis. It uses parallel processing to ingest data from three distinct source types (SERP API, Reddit, and X/Twitter), employs a zero-cost Hybrid Categorization method to semantically identify operational bottlenecks, and uses the Anthropic LLM to synthesize the findings into a clear, executive-ready strategic brief. The data is logged for historical analysis while the brief is dispatched for immediate action.

⚙️ How It Works (Multi-Step Execution)

1. Ingestion and Parallel Processing (The Data Fabric)

- **Trigger:** The workflow is initiated either on an ad-hoc basis via an n8n Form Trigger or on a schedule (Time Trigger).
- **Parallel Ingestion:** The workflow immediately splits into three parallel branches to fetch data simultaneously:
  - SERP API: Captures authoritative content and industry commentary (Strategic Context).
  - Reddit (Looping Structure): Fetches posts from multiple subreddits via an Aggregate Node workaround to get authentic user experiences (Qualitative Signal).
  - X/Twitter (HTTP Request): Bypasses standard rate limits to capture real-time social complaints (Sentiment Signal).

2. Analysis and Fusion (The Intelligence Layer)

- **Cleanup and Labeling (Function Nodes):** Each branch uses dedicated Function Nodes to filter noise (e.g., low-score posts) and normalize the data by adding a source tag (e.g., 'Reddit').
- **Merge:** A Merge Node (Append Mode) fuses all three parallel streams into a single, unified dataset.
- **Hybrid Categorization (Function Node):** A single Function Node applies the Hybrid Categorization Logic. This cost-free step semantically assigns a pain_point category (e.g., 'Call Hold/Availability') and a sentiment_score to every item, transforming raw text into labeled metrics.

3. Dispatch and Reporting (The Executive Output)

- **Aggregation and Split (Function Node):** The final Function Node calculates the total counts, deduplicates the final results, and generates the comprehensive summaryString.
- **Data Logging:** The aggregated counts and metrics are appended to **Google Sheets** for historical logging.
- **LLM Input Retrieval (Function Node):** A final Function Node retrieves the summary data using the $items() helper (the serial route workaround).
- **AI Briefing:** The **Message a model (Anthropic)** node receives the summaryString and uses a strict HTML System Prompt to synthesize the strategic brief, identifying the top pain points and suggesting AI features.
- **Delivery:** The **Gmail** node sends the final, professional HTML brief to the executive team.

🛠️ Setup Steps

Credentials

- **Anthropic:** Configure credentials for the Language Model (Claude) used in the Message a model node.
- **SERP API, Reddit, and X/Twitter:** Configure API keys/credentials for the data ingestion nodes.
- **Google Services:** Set up OAuth2 credentials for Google Sheets (for logging data) and Gmail (for email dispatch).

Configuration

- **Form Configuration:** If using the Form Trigger, ensure the Target Keywords and Target Subreddits are mapped correctly to the ingestion nodes.
- **Data Integrity:** Due to the serial route, ensure the Function (Get LLM Summary) node correctly retrieves the LLM_SUMMARY_HOLDER field from the preceding node's output memory.

✅ Benefits

- **Proactive CI & Strategy:** Shifts market research from manual, reactive browsing to a proactive, scheduled data diagnostic.
- **Cost Efficiency:** Utilizes a zero-cost Hybrid Categorization method (Function Node) for intent analysis, avoiding expensive per-item LLM token costs.
- **Actionable Output:** Delivers a fully synthesized, HTML-formatted executive brief, ready for immediate presentation and strategic sales positioning.
- **High Reliability:** Employs parallel ingestion, API workarounds, and serial routing to ensure the complex workflow runs consistently and without failure.
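The zero-cost Hybrid Categorization step described above can be sketched as a keyword-rule Function node. The categories, keyword lists, and scoring scale here are illustrative assumptions, not the template's actual rules:

```javascript
// Sketch of the Hybrid Categorization Function node: keyword rules
// assign a pain_point category and a crude sentiment_score without
// any LLM calls. Keywords and categories are illustrative.
const PAIN_POINTS = {
  "Call Hold/Availability": ["on hold", "no answer", "voicemail", "never called back"],
  "Scheduling Friction": ["reschedule", "no-show", "appointment window", "waited all day"],
  "Automated System Frustration": ["phone tree", "press 1", "robot", "ivr"],
};
const NEGATIVE = ["terrible", "worst", "frustrated", "never", "waited", "rude"];

function categorize(item) {
  const text = item.text.toLowerCase();
  let pain_point = "Other";
  for (const [category, keywords] of Object.entries(PAIN_POINTS)) {
    if (keywords.some(k => text.includes(k))) { pain_point = category; break; }
  }
  // Score from -1 (very negative) to 0 (neutral), based on keyword hits.
  const hits = NEGATIVE.filter(w => text.includes(w)).length;
  const sentiment_score = -Math.min(hits / 3, 1);
  return { ...item, pain_point, sentiment_score };
}
```

Because this is pure string matching, it runs on every merged item for free; the LLM only ever sees the aggregated summaryString, which is what keeps per-item token costs at zero.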
by Franz
🚀 AI Lead Generation and Follow-Up Template

📋 Overview

This n8n workflow template automates your lead generation and follow-up process using AI. It captures leads through a form, enriches them with company data, classifies them into different categories, and sends appropriate follow-up sequences automatically.

Key Features:
- 🤖 AI-powered lead classification (Demo-ready, Nurture, Drop)
- 📊 Automatic lead enrichment with company data
- 📧 Intelligent email responses and follow-up sequences
- 📅 Automated demo scheduling for qualified leads
- 📝 Complete lead logging in Google Sheets
- 💬 AI assistant for immediate query responses

🛠️ Prerequisites

Before setting up this workflow, ensure you have:
- n8n Instance: Self-hosted or cloud version
- OpenAI API Key: For AI-powered features
- Google Workspace Account with: Gmail access, Google Sheets, Google Calendar
- Basic understanding of your Ideal Customer Profile (ICP)

⚡ Quick Start Guide

Step 1: Import the Workflow
1. Copy the workflow JSON
2. Import into your n8n instance
3. The workflow will appear with all nodes connected

Step 2: Configure Credentials
You'll need to set up the following credentials:
- **OpenAI API**: For AI agents and classification
- **Gmail OAuth2**: For sending emails
- **Google Sheets OAuth2**: For lead logging
- **Google Calendar OAuth2**: For demo scheduling

Step 3: Create Your Lead Log Sheet
Create a Google Sheet with these columns: Date, Name, Email, Company, Job Title, Message, Number of Employees, Industry, Geography, Annual Revenue, Technology, Pain Points, Lead Classification

Step 4: Update Configuration Nodes
1. Replace Sheet ID: Update all Google Sheets nodes with your sheet ID
2. Update Email Templates: Customize all email content
3. Set Escalation Email: Replace "your-email@company.com" with your team's email
4. Configure ICP Criteria: Edit the "Define ICP and Lead Criteria" node

🎯 Lead Classification Setup

Define Your ICP (Ideal Customer Profile)

Edit the "Define ICP and Lead Criteria" node to set your criteria.

📌 ICP Criteria Example:
- Company Size: 50+ employees
- Industry: SaaS, Finance, Healthcare, Manufacturing
- Geography: North America, Europe
- Pain Points: Manual processes, compliance needs, scaling challenges
- Annual Revenue: $5M+

✅ Demo-Ready Criteria: High-intent prospects who meet multiple qualifying factors:
- Large company size (your threshold)
- Clear pain points mentioned
- Urgent timeline
- Budget authority indicated
- Specific solution requests

🌱 Nurture Criteria: Prospects with future potential:
- Meet basic size requirements
- In target industry
- General interest expressed
- Planning future implementation
- Exploring options

❌ Drop Criteria: Only drop leads that clearly don't fit:
- Outside target geography
- Wrong industry (B2C if you're B2B)
- Too small with no growth
- Already with a competitor
- Spam or test messages

📧 Email Customization

Customize Follow-Up Sequences:

Demo-Ready Sequence:
1. Immediate calendar invitation
2. Personalized demo confirmation
3. Meeting reminder (optional)

Nurture Sequence:
1. Welcome email with resources
2. Educational content (Day 2)
3. Webinar/event invitation (Day 3)
4. Demo offer (Day 4)

Drop Message:
- Polite acknowledgment
- Clear explanation
- Keep the door open for the future

🔧 Advanced Configuration

AI Answer Agent Setup:
- Update the system prompt with your company information
- Add common Q&A patterns
- Set escalation rules
- Configure language preferences

Lead Enrichment Options:
- Add API keys for additional data sources
- Configure enrichment fields
- Set data quality thresholds
- Enable duplicate detection

Calendar Integration:
- Set available meeting times
- Configure meeting duration
- Add buffer times
- Set timezone handling

📊 Monitoring and Optimization

Track Key Metrics:
- Lead volume by classification
- Response rates
- Demo conversion rates
- Time to first response
- Enrichment success rate

Optimization Tips:
- Regular Review: Check classification accuracy weekly
- A/B Testing: Test different email sequences
- Feedback Loop: Use outcomes to refine ICP criteria
- AI Training: Update prompts based on results

🎉 Best Practices

1. Start Simple: Begin with basic criteria and refine over time
2. Test Thoroughly: Use test leads before going live
3. Monitor Daily: Check logs for the first week
4. Iterate Quickly: Adjust based on results
5. Document Changes: Keep track of criteria updates

📈 Scaling Your Workflow

As your lead volume grows:
1. Add Sub-workflows: Separate complex processes
2. Implement Queuing: Handle high volumes
3. Add CRM Integration: Sync with your sales tools
4. Enable Analytics: Track detailed metrics
5. Set Up Alerts: Monitor for issues
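The classification logic the template delegates to the AI can be approximated deterministically, which is useful for sanity-checking the AI's outputs against your ICP. A sketch using the example criteria above (the thresholds and field names are illustrative assumptions):

```javascript
// Sketch: rule-of-thumb lead classification mirroring the ICP example
// (50+ employees, $5M+ revenue, target industries/regions). The
// template itself uses an AI classifier; this is a deterministic
// approximation for illustration and cross-checking.
const ICP = {
  industries: ["SaaS", "Finance", "Healthcare", "Manufacturing"],
  regions: ["North America", "Europe"],
  minEmployees: 50,
  minRevenue: 5_000_000,
};

function classifyLead(lead) {
  const inIndustry = ICP.industries.includes(lead.industry);
  const inRegion = ICP.regions.includes(lead.geography);
  if (!inIndustry || !inRegion) return "Drop";
  const bigEnough =
    lead.employees >= ICP.minEmployees && lead.revenue >= ICP.minRevenue;
  if (bigEnough && lead.painPoints && lead.urgent) return "Demo-ready";
  return "Nurture";
}
```

Comparing this rule-based label with the AI's label on logged leads is one concrete way to run the weekly classification-accuracy review suggested above.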
by Rakin Jakaria
How it works:

This project creates a personal AI knowledge assistant that operates through Telegram. The assistant can extract summaries from YouTube videos or online articles, store them in Google Sheets for later reference, and retrieve stored summaries when requested by the user.

Step-by-step:

1. **Google Sheets Trigger:** The workflow starts by detecting a new YouTube or article URL added to a dedicated sheet (Sheet2). It checks whether the link is already processed.
2. **Link Type Detection:** The system identifies if the URL is from YouTube or a standard article.
3. **Data Retrieval:** If it's YouTube, uses Apify to fetch the transcript. If it's an article, uses an HTTP Request node to fetch the webpage content.
4. **AI Summarization:** The transcript or article content is passed to **Google Gemini** for refined summarization.
5. **Google Sheets Storage:** The summary and title are appended to another sheet (Sheet1) for long-term storage, along with a "Stored" status update in Sheet2.
6. **Telegram Assistant:** A Telegram Trigger listens for messages from the user. The assistant searches stored summaries in Google Sheets. If a match is found, it returns the result to the user on Telegram; otherwise, it politely apologizes.
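The Link Type Detection step can be sketched as a small helper that routes each stored URL to the YouTube (Apify transcript) branch or the article (HTTP Request) branch. This is an illustrative sketch, not the template's exact node:

```javascript
// Sketch: decide which branch a URL should take, by hostname.
function detectLinkType(url) {
  try {
    const host = new URL(url).hostname.replace(/^www\./, "");
    if (host === "youtube.com" || host === "youtu.be" || host === "m.youtube.com") {
      return "youtube";
    }
    return "article";
  } catch {
    return "invalid"; // not a well-formed URL; skip or flag the row
  }
}
```

Matching on the parsed hostname rather than a raw substring check avoids misrouting article URLs that merely mention "youtube" in their path or query string.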
by Lakindu Siriwardana
📄 Automated Lease Renewal Offer by Email

✅ Features

- Automated lease offer generation using AI (Ollama model).
- Duplicate file check to avoid reprocessing the same customer.
- Personalized offer letter creation based on customer details from Supabase.
- PDF/text file conversion for formatted output.
- Automatic Google Drive management for storing and retrieving files.
- Email sending with the generated offer letter attached.
- Seamless integration with Supabase, Google Drive, Gmail, and an AI LLM.

⚙️ How It Works

1. Trigger: The workflow starts on form submission with customer details.
2. Customer Lookup: Searches Supabase for customer data and updates customer information if needed.
3. File Search & Duplication Check: Looks for existing lease offer files in Google Drive. If a duplicate is found, deletes the old file before proceeding.
4. AI Lease Offer Creation: Uses the LLM Chain (offerLetter) to generate a customized lease renewal letter.
5. File Conversion: Converts the AI-generated text into a downloadable file format.
6. Upload to Drive: Saves the new lease offer in Google Drive.
7. Email Preparation: Uses the Basic LLM Chain-email to draft the email body, then downloads the offer file from Drive and attaches it.
8. Email Sending: Sends the renewal offer email via Gmail to the customer.

🛠 Setup Steps

- Supabase Connection: Add Supabase credentials in n8n. Ensure a customers table exists with the relevant columns.

🔜 Future Steps

- Add a specific letter template (organization template).
- PDF offer letter.
by Anna Bui
🎯 Universal Meeting Transcript to LinkedIn Content

Automatically transform your meeting insights into engaging LinkedIn content with AI. Perfect for coaches, consultants, sales professionals, and content creators who want to share valuable insights from their meetings without the manual effort of content creation.

How it works

1. A calendar trigger detects when your coaching/meeting ends
2. Waits for meeting completion, then sends you a form via email
3. You provide the meeting transcript and specify post preferences
4. AI analyzes the transcript using your personal brand guidelines
5. Generates professional LinkedIn content based on real insights
6. Creates organized Google Docs with both the transcript and the final post
7. Sends you links to review and publish your content

How to use

1. Connect your Google Calendar and Gmail accounts
2. Update the calendar filter to match your meeting types
3. Customize the AI prompts with your brand voice and style
4. Replace email addresses with your own
5. Test with a sample meeting transcript

Requirements

- Google Calendar (for meeting detection)
- Gmail (for form delivery and notifications)
- Google Drive & Docs (for content storage)
- LangChain AI nodes (for content generation)

Good to know

- AI processing may incur costs based on your LangChain provider
- Works with any meeting platform - just copy/paste transcripts
- Can be adapted to use webhooks from recording tools like Fireflies.ai
- Memory nodes store your brand guidelines for consistent output

Happy Content Creating!
by vinci-king-01
How it works

This workflow automatically extracts data from invoice documents (PDFs and images) and processes them through a comprehensive validation and approval system.

Key Steps

1. Multi-Input Triggers - Accepts invoices via email attachments or direct file uploads through a webhook.
2. AI-Powered Extraction - Uses ScrapeGraphAI to extract structured data from invoice documents.
3. Data Cleaning & Validation - Processes and validates extracted data against business rules.
4. Approval Workflow - Routes invoices requiring approval through a multi-stage approval process.
5. System Integration - Automatically sends validated invoices to your accounting system.

Set up steps

Setup time: 10-15 minutes

1. Configure ScrapeGraphAI credentials - Add your ScrapeGraphAI API key for invoice data extraction.
2. Set up Telegram connection - Connect your Telegram account for approval notifications.
3. Configure email trigger - Set up an IMAP connection for processing emailed invoices.
4. Customize validation rules - Adjust business rules, amount thresholds, and vendor lists.
5. Set up accounting system integration - Configure the HTTP Request node with your accounting system's API endpoint.
6. Test the workflow - Upload a sample invoice to verify the extraction and approval process.

Features

- **Multi-format support**: PDF, PNG, JPG, JPEG, TIFF, BMP
- **Intelligent validation**: Business rules, duplicate detection, amount thresholds
- **Approval automation**: Multi-stage approval workflow with role-based routing
- **Data quality scoring**: Confidence levels and completeness analysis
- **Audit trail**: Complete processing history and metadata tracking
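The validation rules you customize in step 4 might look like the sketch below. The threshold, vendor list, and field names are illustrative assumptions, not the template's actual defaults:

```javascript
// Sketch: validate an extracted invoice against business rules
// (required fields, positive amount, approved-vendor check) and
// flag it for the approval branch above a threshold.
function validateInvoice(
  inv,
  opts = { approvalThreshold: 1000, vendors: ["Acme Supplies", "Globex"] }
) {
  const errors = [];
  if (!inv.vendor) errors.push("missing vendor");
  if (!inv.invoiceNumber) errors.push("missing invoice number");
  if (!(inv.amount > 0)) errors.push("non-positive amount");
  if (inv.vendor && !opts.vendors.includes(inv.vendor)) errors.push("unknown vendor");
  return {
    valid: errors.length === 0,
    needsApproval: inv.amount >= opts.approvalThreshold,
    errors,
  };
}
```

Routing on the returned `needsApproval` flag is what splits invoices between the auto-forward path and the multi-stage approval path.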
by vinci-king-01
How it works

Transform your business with intelligent deal monitoring and automated customer engagement! This AI-powered coupon aggregator continuously tracks competitor deals and creates personalized marketing campaigns that convert.

Key Steps

1. 24/7 Deal Monitoring - Automatically scans competitor websites daily for the best deals and offers
2. Smart Customer Segmentation - Uses AI to intelligently categorize and target your customer base
3. Personalized Offer Generation - Creates tailored coupon campaigns based on customer behavior and preferences
4. Automated Email Marketing - Sends targeted email campaigns with personalized deals to the right customers
5. Performance Analytics - Tracks campaign performance and provides detailed insights and reports
6. Daily Management Reports - Delivers comprehensive analytics to the management team every morning

Set up steps

Setup time: 10-15 minutes

1. Configure competitor monitoring - Add the target websites and deal sources you want to track
2. Set up customer database - Connect your customer data source for intelligent segmentation
3. Configure email integration - Connect your email service provider for automated campaigns
4. Customize deal criteria - Define what types of deals and offers to prioritize
5. Set up analytics tracking - Configure Google Sheets or a database for performance monitoring
6. Test automation flow - Run a test cycle to ensure all integrations work smoothly

Never miss a profitable deal opportunity - let AI handle the monitoring and targeting while you focus on growth!
by Anna Bui
This n8n template automatically syncs website visitors identified by RB2B into your Attio CRM, creating comprehensive contact records and associated sales deals for immediate follow-up. Perfect for sales teams who want to capture every website visitor as a potential lead without manual data entry!

Good to know

- RB2B identifies anonymous website visitors and sends structured data via Slack notifications
- The workflow prevents duplicate contacts by checking email addresses before creating new records
- All RB2B leads are automatically tagged with source tracking for easy identification

How it works

1. RB2B sends website visitor notifications to your designated Slack channel with visitor details
2. The workflow extracts structured data from Slack messages including name, email, company, LinkedIn, and location
3. It searches Attio CRM to check if the person already exists based on email address
4. For new visitors, it creates a complete contact record with all available information
5. For existing contacts, it updates their record and manages deal creation intelligently
6. Automatically creates sales deals tagged as "RB2B Website Visitor" for proper lead tracking

How to use

- Configure RB2B to send visitor notifications to a dedicated Slack channel
- The Slack trigger can be replaced with other triggers like webhooks if you prefer different notification methods
- Customize the deal naming conventions and stages to match your sales pipeline

Requirements

- RB2B account with Slack integration enabled
- Attio CRM account with API access
- Slack workspace with bot permissions for the designated RB2B channel

Customising this workflow

- Modify deal stages and values based on your sales process
- Add lead scoring based on company domain or visitor behavior patterns
- Integrate additional enrichment APIs to enhance contact data
- Set up automated email sequences or Slack notifications for high-value leads
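The extraction step in "How it works" can be sketched as regex pulls over the Slack message text. The "Label: value" line format is an assumption; adjust the labels to the exact layout RB2B posts in your channel:

```javascript
// Sketch: pull visitor fields out of an RB2B-style Slack message.
// Assumes one "Label: value" pair per line (hypothetical format).
function extractVisitor(message) {
  const grab = label => {
    const m = message.match(new RegExp(`${label}:\\s*(.+)`));
    return m ? m[1].trim() : null;
  };
  return {
    name: grab("Name"),
    email: grab("Email"),
    company: grab("Company"),
    linkedin: grab("LinkedIn"),
    location: grab("Location"),
    source: "RB2B Website Visitor", // tag used for the created deal
  };
}
```

The extracted `email` is the key the workflow then uses to search Attio and decide between the create-contact and update-contact branches.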
by Vishal Kumar
How it works

1. Trigger - The workflow runs when a GitLab Merge Request (MR) is created or updated.
2. Extract & Analyze - It retrieves the code diff and sends it to Claude AI or GPT-4o for risk assessment and issue detection.
3. Generate Report - The AI produces a structured summary with:
   - Risk levels
   - Identified issues
   - Recommendations
   - Test cases
4. Notify Developers - The report is:
   - Emailed to developers and QA teams
   - Posted as a comment on the GitLab MR

Setup Guide

1. Connect GitLab
   - Add GitLab API credentials
   - Select repositories to track
2. Configure AI Analysis
   - Enter an Anthropic (Claude) or OpenAI (GPT-4o) API key
3. Set Up Notifications
   - Add Gmail credentials
   - Update the email distribution list
4. Test & Automate
   - Create a test MR to verify analysis and email delivery

Key Benefits

- **Automated Code Review** – AI-driven risk assessment and recommendations
- **Security & Compliance** – Identifies vulnerabilities before code is merged
- **Integration with GitLab CI/CD** – Works within existing DevOps workflows
- **Improved Collaboration** – Keeps developers and QA teams informed

Developed by Quantana, an AI-powered automation and software development company.