by AI Sales Agent HQ
## What this workflow does

Automatically re-engages dormant leads by detecting trigger events (funding rounds, company news, leadership changes) and generating personalized outreach emails. The workflow identifies leads inactive for 90+ days, checks multiple data sources for re-engagement opportunities, and sends AI-generated email drafts to sales reps for approval.

## How it works

1. **Schedule** - Runs weekly (every Monday by default)
2. **CRM Data** - Pulls leads that have been inactive for 90+ days
3. **Trigger Detection** - Checks three sources in parallel:
   - Crunchbase for funding events
   - NewsAPI for company news
   - Hunter.io for leadership changes
4. **AI Email** - Claude generates a personalized re-engagement email based on the detected trigger
5. **Rep Notification** - Sends the complete lead profile and email draft to the assigned sales rep for review

## Setup steps

1. Replace "Load inactive leads (mock)" with your CRM integration (Salesforce, HubSpot, Pipedrive)
2. Add API credentials for trigger detection: NewsAPI (free at newsapi.org), Crunchbase (optional, paid), Hunter.io (free tier available)
3. Add your Anthropic API credentials for Claude
4. Add your Gmail OAuth credentials
5. Test with the "Test workflow manually" node before enabling the schedule

## API keys required

| Service | Purpose | Cost |
|---------|---------|------|
| NewsAPI | Company news detection | Free (100 req/day) |
| Crunchbase | Funding event detection | $99/mo (optional) |
| Hunter.io | Leadership change detection | Free tier available |
| Anthropic | AI email generation | Pay per use |
| Gmail | Send notifications | Free |
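When you swap in your own CRM, the 90-day inactivity check can live in a small Code node. This is a minimal sketch, assuming a `lastActivityDate` field; the actual field name depends on your CRM's schema.

```javascript
// Sketch of the 90-day dormancy filter. The `lastActivityDate`
// field name is an assumption — adapt it to your CRM's schema.
const DORMANT_DAYS = 90;

function isDormant(lead, now = new Date()) {
  const last = new Date(lead.lastActivityDate);
  const daysInactive = (now - last) / (1000 * 60 * 60 * 24);
  return daysInactive >= DORMANT_DAYS;
}

// Example: filter a batch of CRM leads against a fixed "today"
const leads = [
  { name: 'Acme Corp', lastActivityDate: '2024-01-05' },
  { name: 'Globex', lastActivityDate: '2024-05-20' },
];
const dormant = leads.filter((l) => isDormant(l, new Date('2024-06-01')));
```

In an n8n Code node you would apply the same filter to `$input.all()` and return only the dormant items downstream.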
by Fayzul Noor
## Description

This workflow is built for e-commerce store owners, customer support teams, and retail businesses who want to provide instant, intelligent email support without hiring additional staff. If you're tired of manually responding to customer inquiries, searching through product catalogs, and copying information into emails, this automation will transform your support process. It turns your inbox into a smart AI-powered support system that reads, understands, and responds to customer questions about your store products while you focus on growing your business.

## How it works / What it does

This n8n automation transforms how you handle customer email inquiries using AI and Retrieval-Augmented Generation (RAG). Here's a simple breakdown of how it works:

1. **Monitor your Gmail inbox** using the Gmail Trigger node, which checks every minute for new customer emails (excluding emails sent by you).
2. **Assess if a reply is needed** with an AI-powered classification step. The workflow uses GPT-4.1 with a structured JSON parser to determine whether an incoming email is a genuine customer inquiry about your men's clothing store that requires a response.
3. **Filter relevant emails** through the If Needs Reply node, which only passes emails that need attention to the AI Agent, preventing unnecessary processing.
4. **Generate intelligent responses** using an AI Agent powered by GPT-4.1-nano. The agent uses a friendly, professional tone and starts each email with "Dear" and ends with "Best regards" to maintain proper email etiquette.
5. **Search your knowledge base** with a Vector Store RAG tool connected to Pinecone. The AI Agent queries your men's clothing product database using OpenAI embeddings to find accurate, up-to-date information about prices, features, and product details.
6. **Send personalized replies automatically** through the Gmail node, which responds directly to the original email thread with clear, concise, and empathetic answers to customer questions.
Once everything is set up, the system runs on autopilot and provides 24/7 customer support without any manual intervention.

## How to set up

Follow these steps to get your AI-powered email support system running:

1. Import the JSON file into your n8n instance.
2. Add your API credentials: Gmail OAuth2 credentials for reading and sending emails, an OpenAI API key for the AI Agent and embeddings, and Pinecone API credentials for vector storage.
3. Set up your Pinecone vector database: create a Pinecone index, create a namespace, and upload your store data to the vector store.
4. Configure the Gmail Trigger node to monitor the correct email account.
5. Customize the AI Agent's system message to match your brand voice and support policies.
6. Activate the workflow to enable automatic monitoring and responses.

## Requirements

Before running the workflow, make sure you have the following:

- An n8n account or instance (self-hosted or n8n Cloud)
- A Gmail account for receiving and sending customer emails
- OpenAI API access for the AI Agent and embeddings (GPT-4.1 and GPT-4.1-nano models)
- A Pinecone account with a configured vector database containing your product information
- Your store data and product catalog prepared and uploaded to Pinecone

## How to customize the workflow

This workflow is flexible and can be customized to fit your business needs. Here's how you can tailor it:

- **Adjust the response style** by modifying the system message in the AI Agent node. You can make it more casual, formal, or brand-specific.
- **Add response length controls** by updating the system message instructions. Currently set to keep responses short and concise; you can adjust this for more detailed explanations.
- **Change the polling frequency** in the Gmail Trigger node. The default is every minute, but you can adjust it to check more or less frequently based on your email volume.
- **Filter specific types of emails** by modifying the filters in the Gmail Trigger and "Assess if message needs a reply" nodes to handle specific subjects, senders, or keywords.
- **Connect to different email platforms** by replacing the Gmail nodes with other email services like Outlook, IMAP, or customer support platforms.
- **Add human-in-the-loop approval** by inserting a webhook or notification node before the Gmail reply node, allowing manual review before sending responses.
- **Implement response tracking** by adding database nodes to log all AI-generated responses for quality control and training purposes.
- **Add multi-language support** by incorporating translation nodes or configuring the AI Agent to detect and respond in the customer's language.
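The "needs reply" gate depends on the model returning valid structured JSON, so it helps to handle malformed output defensively. A minimal sketch, assuming a `{ needsReply, reason }` shape; your structured parser's actual schema may differ:

```javascript
// Hypothetical parser for the classification step's JSON output.
// The { needsReply, reason } shape is an assumption for illustration.
function parseClassification(raw) {
  let result;
  try {
    result = JSON.parse(raw);
  } catch (err) {
    // Fail closed: if the model output is malformed, don't auto-reply.
    return { needsReply: false, reason: 'unparseable model output' };
  }
  return {
    needsReply: Boolean(result.needsReply),
    reason: result.reason || 'no reason given',
  };
}
```

Failing closed means a garbled classification routes the email to a human instead of triggering an unwanted automated reply.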
by Onur
## Template Description

> Stop manually reading every CV and copy-pasting data into a spreadsheet. This workflow acts as an AI recruiting assistant, automating your entire initial screening process. It captures applications from a public form, uses AI to read and understand PDF CVs, structures the candidate data, saves it to Google Sheets, and notifies all parties.

This template is designed to save HR professionals and small business owners countless hours, ensuring no applicant is missed and all data is consistently structured and stored.

## 🚀 What does this workflow do?

- Provides a public web form for candidates to submit their name, email, experience, and PDF CV.
- Automatically reads the text content from the uploaded PDF CV.
- Uses an AI Agent (OpenAI) to intelligently parse the CV text, extracting key data like contact info, work experience, education, skills, and more.
- **Writes a concise summary** of the CV, perfect for quick screening by HR.
- **Checks for duplicate applications** based on the candidate's email address.
- **Saves all structured applicant data** into a new row in a Google Sheet, creating a powerful candidate database.
- Sends an automated confirmation email to the applicant.
- Sends a new application alert with the CV summary to the recruiter.

## 🎯 Who is this for?

- **HR Departments & Recruiters:** Streamline your hiring pipeline and build a structured candidate database.
- **Small Business Owners:** Manage job applications professionally without dedicated HR software.
- **Hiring Managers:** Quickly get a summarized overview of each candidate without reading the full CV initially.

## ✨ Benefits

- **Massive Time Savings:** Drastically reduces the time spent on manual CV screening and data entry.
- **Structured Candidate Data:** Turns every CV into a consistently formatted row in a spreadsheet, making it easy to compare candidates.
- **Never Miss an Applicant:** Every submission is logged, and you're instantly notified.
- **Improved Candidate Experience:** Applicants receive an immediate confirmation that their submission was successful.
- **AI-Powered Summaries:** Get a quick, AI-generated summary of each CV delivered to your inbox.

## ⚙️ How it Works

1. **Form Submission:** A candidate fills out the n8n form and uploads their CV.
2. **PDF Extraction:** The workflow extracts the raw text from the PDF file.
3. **AI Analysis:** The text is sent to OpenAI with a prompt to structure all key information (experience, skills, etc.) into a JSON format.
4. **Duplicate Check:** The workflow checks your Google Sheet to see if the applicant's email already exists. If so, it stops.
5. **Save to Database:** If the applicant is new, their structured data is saved as a new row in Google Sheets.
6. **Send Notifications:** Two emails are sent simultaneously: a confirmation to the applicant and a notification with the CV summary to the recruiter.

## 📋 n8n Nodes Used

- Form Trigger
- Extract From File
- OpenAI
- Code (or JSON Parser)
- Google Sheets
- If
- Gmail

## 🔑 Prerequisites

- An active n8n instance.
- **OpenAI Account & API Key.**
- **Google Account** with access to Google Sheets and Gmail (OAuth2 credentials).
- **A Google Sheet** prepared with columns to store the applicant data (e.g., name, email, experience, skills, cv_summary, etc.).

## 🛠️ Setup

1. Import the workflow into your n8n instance.
2. **Configure Credentials:** Connect your credentials for OpenAI and Google (for Sheets & Gmail) in their respective nodes.
3. **Customize the Form:** In the "1. Applicant Submits Form" node, you can add or remove fields as needed.
4. **Activate the workflow.** Once active, copy the Production URL from the Form Trigger node and share it to receive applications.
5. **Set Your Email:** In the "8b. Send Notification..." (Gmail) node, change the "To" address to your own email address to receive alerts.
6. **Link Your Google Sheet:** In the "5. Check for Duplicate..." and "7. Save Applicant Data..." nodes, select your spreadsheet and sheet.
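The duplicate check is a Google Sheets lookup plus an If node in the workflow itself; its core comparison can be sketched in plain JavaScript. This is an illustrative sketch, assuming each sheet row exposes an `email` field:

```javascript
// Sketch of the duplicate-application check. In the workflow this is
// a Google Sheets lookup + If node; the `email` column name is an
// assumption about your sheet's headers.
function isDuplicate(applicantEmail, existingRows) {
  const normalized = applicantEmail.trim().toLowerCase();
  return existingRows.some(
    (row) => (row.email || '').trim().toLowerCase() === normalized
  );
}
```

Normalizing case and whitespace matters here: "Jane@Example.com " and "jane@example.com" are the same applicant and should not produce two rows.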
by go-surfe
## 🚀 Build Hyper-Targeted Prospecting Lists with Surfe & HubSpot

This template automatically discovers companies that match your Ideal Customer Profile (ICP), finds the right people inside those companies, and enriches them — ready to drop straight into HubSpot. Launch the workflow, sit back, and get a clean list of validated prospects in minutes.

## 1. ❓ What Problem Does This Solve?

Sourcing prospects that truly fit your ICP is slow and repetitive. You jump between databases, copy domains, hunt down decision-makers, and then still have to enrich emails and phone numbers one by one. This workflow replaces all that manual effort:

- It queries Surfe's database for companies that match your exact industry, size, revenue, and geography filters.
- It pulls the best-fit people inside each company and enriches them in bulk.
- It keeps only records with both a direct email and mobile phone, then syncs them to HubSpot automatically.

No spreadsheets, no copy-paste — just a fresh, qualified prospect list ready for outreach.

## 2. 🧰 Prerequisites

You'll need:

- A self-hosted or cloud instance of n8n
- A Surfe API key
- A HubSpot Private App Token with contact read/write scopes
- A Gmail account (OAuth2) for the completion notification
- The workflow JSON file linked above: N8N_FLOW_2__Building_Prospecting_Lists.json

## 3. 📌 Search ICP Companies Configuration — Fine-Tune Your Targeting

### 3.1 Editing the JSON

Every targeting rule lives inside the "🔍 Search ICP Companies" HTTP node. Open the node: Search ICP Companies → Parameters tab → JSON Body to edit the filters.
| Filter | JSON path | What it does | Example |
| --- | --- | --- | --- |
| industries | filters.industries | Narrow to specific verticals (case-sensitive strings) | ["Software","Apps","SaaS"] |
| employeeCount | filters.employeeCount.from / .to | Bound the headcount range | 1 / 35 |
| countries | filters.countries | 2-letter ISO codes | ["FR","DE"] |
| revenues | filters.revenues | Annual revenue brackets | ["1-10M"] |
| limit | limit | Companies per run | 20 |

### 3.2 Where to find allowed values

Surfe exposes a "🗂 Get Filters" endpoint that returns every accepted value for:

- industries
- employeeCounts
- revenues
- countries (always ISO-2 codes)

You can hit it with a simple GET /v1/people/search/filters request or browse the interactive docs here: https://developers.surfe.com/public-008-people-filters. For company-level searches, the same enumerations apply.

## 4. ⚙️ Setup Instructions

### 4.1 🔐 Create Your Credentials in n8n

#### 4.1.1 🚀 Surfe API

1. In your Surfe dashboard → Use Surfe API → copy your API key
2. Go to n8n → Credentials → Create Credential
3. Choose Credential Type: Bearer Auth
4. Name it something like "Surfe API Key"
5. Paste your API key into the Bearer Token field
6. Save

#### 4.1.2 📧 Gmail OAuth2 API

1. Go to n8n → Credentials
2. Create new credentials of type Gmail OAuth2 API
3. A pop-up window will appear where you can log in with the Google account linked to Gmail
4. Make sure you grant email send permissions when prompted

#### 4.1.3 🎯 HubSpot 🔓 Private App Token

1. Go to HubSpot → Settings → Integrations → Private Apps
2. Create an app with scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read
3. Save the app token
4. Go to n8n → Credentials → Create Credential → HubSpot App Token
5. Paste your app token

✅ You are now all set for the credentials.

### 4.2 📥 Import and Configure the n8n Workflow

Import the provided JSON workflow into n8n: create a new blank workflow, click the … menu in the top left, then choose Import from File.

#### 4.2.1 🔗 Link Nodes to Your Credentials

In the workflow, link your newly
created credentials to each node in this list:

- **Surfe HTTP nodes:** Authentication → Generic Credential Type, Generic Auth Type → Bearer Auth, then select the Bearer Auth credentials you created earlier
- **Gmail node:** Credentials to connect with → your Gmail account
- **HubSpot node:** Credentials to connect with → select your HubSpot credentials from the list

## 5. 🔄 How This n8n Workflow Works

1. **Manual Trigger** – Click Execute Workflow (or schedule it) to start.
2. **Search ICP Companies** – Surfe returns company domains that match your filter set.
3. **Prepare JSON Payload with Company Domains** – Formats the domain list for the next call.
4. **Search People in Companies** – Finds people inside each company.
5. **Prepare JSON Payload Enrichment Request** – Builds the bulk-enrichment request.
6. **Surfe Bulk Enrichments API** – Launches one enrichment job for the whole batch.
7. **Wait + polling loop** – Checks job status every 3 seconds until it's COMPLETED.
8. **Extract List of People** – Pulls the enriched contacts from Surfe's response.
9. **Filter: phone AND email** – Keeps only fully reachable prospects (email and mobile).
10. **HubSpot: Create or Update** – Inserts/updates each contact in HubSpot.
11. **Gmail** – Sends you a "Your ICP prospecting enrichment is done" email.

## 6. 🧩 Use Cases

- **Weekly prospect list refresh** – Generate 50 perfectly matched prospects every Monday morning.
- **Territory expansion** – Spin up a list of SMB software CEOs in a new country in minutes.
- **ABM prep** – Build multi-stakeholder buying-group lists for target accounts.
- **Campaign-specific lists** – Quickly assemble contacts for a limited-time product launch.

## 7. 🛠 Customization Ideas

- 🎯 **Refine filters for people** – Add seniorities or other filters in the "Prepare JSON Payload with Company Domains" node; see the Surfe Search People API docs: https://developers.surfe.com/public-009-search-people-v2
- ♻️ **Deduplicate** – Check HubSpot first to skip existing contacts.
- 🟢 **Slack alert** – Replace Gmail with a Slack notification.
- 📊 **Reporting** – Append enriched contacts to a Google Sheet for analytics.

## 8. ✅ Summary

Fire off the workflow, and n8n will find ICP-fit companies, pull key people, enrich direct contact data, and drop everything into HubSpot — all on autopilot. Prospecting lists, done for you.
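The "phone AND email" filter (step 9 above) is simple but worth sketching, since it decides which enriched contacts ever reach HubSpot. A minimal sketch, assuming `email` and `mobilePhone` field names in Surfe's enrichment response (the real response keys may differ):

```javascript
// Sketch of the reachability filter. Field names (email, mobilePhone)
// are assumptions about the enrichment response shape — adjust to
// match what the Surfe API actually returns.
function isFullyReachable(contact) {
  return Boolean(contact.email && contact.mobilePhone);
}

// Example batch: only contacts with BOTH a direct email and a mobile
// number survive the filter.
const enriched = [
  { name: 'A', email: 'a@example.com', mobilePhone: '+33612345678' },
  { name: 'B', email: 'b@example.com', mobilePhone: null },
  { name: 'C', email: '', mobilePhone: '+4915112345678' },
];
const reachable = enriched.filter(isFullyReachable);
```

Requiring both channels keeps the HubSpot list small but fully actionable for multi-touch outreach; relax the condition to `||` if email-only prospects are acceptable for you.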
by Veena Pandian
## Who is this for?

SEO managers, content marketers, bloggers, and growth teams who want to automatically catch declining content performance before it's too late — without manually checking Google Search Console every week.

## What this workflow does

This workflow runs weekly to compare your recent Google Search Console performance against a historical baseline. It identifies pages experiencing traffic decay at three severity levels, sends detailed reports via Slack and email, logs all data to a tracking sheet, and auto-generates prioritized fix tasks for your most critical pages.

## How it works

1. Weekly trigger fires every Monday at 8 AM.
2. Fetches two GSC date ranges in parallel — the last 7 days (recent) and the previous 28 days (baseline, normalized to weekly averages).
3. Compares per-page metrics including clicks, impressions, average position, and CTR.
4. Classifies each page into one of five signals:
   - CRITICAL_DECAY — clicks dropped 50%+ or position fell 5+ spots with 30%+ click loss
   - DECAYING — clicks dropped 30%+ or position fell 3+ spots
   - EARLY_DECAY — clicks dropped 15%+ or position fell 1.5+ spots
   - STABLE — no significant change
   - GROWING — clicks increased 20%+
5. Logs all results to a Decay Log Google Sheet tab for historical trending.
6. Builds a weekly report with summary counts, estimated clicks lost, and per-page breakdowns.
7. Sends the report to Slack and email simultaneously.
8. Auto-generates fix tasks for critical pages with specific recommendations (backlink audit, content refresh, CTR optimization, or technical investigation) and logs them to a Fix Tasks sheet tab.
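The five-signal classification can be expressed directly from the thresholds listed above. A sketch of the rule order as the "Compare Periods and Detect Decay" code node might implement it (the template's actual variable names may differ):

```javascript
// Classification rules using the documented thresholds. A positive
// posDrop means the page fell in the rankings (position number grew).
function classifySignal(clicksNow, clicksBefore, posNow, posBefore) {
  const clickChange = clicksBefore > 0
    ? ((clicksNow - clicksBefore) / clicksBefore) * 100
    : 0;
  const posDrop = posNow - posBefore;

  if (clickChange <= -50 || (posDrop >= 5 && clickChange <= -30)) {
    return 'CRITICAL_DECAY';
  }
  if (clickChange <= -30 || posDrop >= 3) return 'DECAYING';
  if (clickChange <= -15 || posDrop >= 1.5) return 'EARLY_DECAY';
  if (clickChange >= 20) return 'GROWING';
  return 'STABLE';
}
```

Note the order matters: checking the most severe signal first means a page that fell 5 spots with a 60% click loss is reported once as CRITICAL_DECAY, not three times at different severities.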
## Setup steps

1. Set environment variables in your n8n instance:
   - GSC_SITE_URL — your verified site URL (e.g., https://yoursite.com)
   - DECAY_SHEET_URL — URL of your Google Sheet for logging
2. Create a Google Sheet with two tabs:
   - Decay Log with headers: date, page_path, signal, clicks_now, clicks_before, click_change_pct, position_now, position_before, position_change, impressions_now, impression_change_pct, ctr_now
   - Fix Tasks with headers: created, priority, page_path, page_url, signal, click_change_pct, position_change, recommended_action
3. Connect Google Search Console OAuth2 credentials (your site must be verified in GSC).
4. Connect Google Sheets OAuth2 credentials.
5. Connect Slack OAuth2 credentials and configure your alert channel.
6. Configure email (SMTP) credentials and update the recipient email address in the "Email Weekly Report" node.
7. Activate the workflow.

## Requirements

- n8n instance (self-hosted or cloud)
- Google Search Console property with verified ownership
- Google Cloud project with Search Console API and Sheets API enabled
- Slack workspace with a bot configured
- SMTP email credentials (or swap in a Gmail node)

## How to customize

- **Decay thresholds** — Adjust the percentage and position-change cutoffs in the "Compare Periods and Detect Decay" code node to match your sensitivity needs.
- **Schedule** — Change from weekly to daily or bi-weekly in the trigger node.
- **Baseline period** — Modify the 28-day comparison window to 14 or 90 days.
- **Row limit** — Increase the rowLimit in the GSC API calls beyond 500 if you have a large site.
- **Fix task logic** — Enhance the remediation recommendations with AI-powered content analysis or integrate with project management tools (Notion, Asana, Trello).
- **Notifications** — Add Telegram, Discord, or Microsoft Teams alongside or instead of Slack.
by Dean Pike
## Client Form → Draft → Approve → Sign → Deliver, fully automated

This workflow automates the entire agreement lifecycle from client form submission to signed document delivery. It generates personalized agreements from templates, manages internal approvals, orchestrates e-signatures via Signwell, and delivers fully executed documents with complete audit trails in n8n Data Tables.

## Good to know

- Handles client data collection via JotForm with custom field mapping
- Automatically populates Google Doc templates with client-specific details
- Internal approval workflow with email-based confirmation
- Signwell integration for embedded e-signatures; test mode is enabled by default, so disable it for legally binding documents
- Complete lifecycle tracking in n8n Data Tables (draft → approval → sent → signed)
- Auto-cleanup: removes documents from Signwell after completion to save storage

## Who's it for

Service businesses, consultants, agencies, and freelancers who send agreements to clients regularly. Perfect for anyone wanting to avoid costly e-signature platforms with limited API and automation capabilities; Signwell has an affordable entry-level tier with generous API limits. If you're looking to eliminate manual document preparation, add an approval workflow, and track signatures while maintaining professional client communication, then this solution is a good fit.
## How it works

### Phase 1: Draft Creation

1. JotForm trigger captures the client submission (company name, address, contact details, position)
2. Standardizes form data and duplicates the Google Doc template with a custom filename
3. Replaces template variables with client information (company name, address, full name, position, dates)
4. Creates a clean document URL and logs the initial record to Data Tables
5. Emails the internal team with a draft review link and client details

### Phase 2: Approval & Preparation

1. Gmail monitors the inbox for an "Approved" reply email
2. Fetches the agreement record from Data Tables and marks it as approved
3. Downloads the Google Doc as a PDF and uploads it to a Drive folder
4. Grants temporary public sharing access (required for Signwell file import)
5. Creates a Signwell document with embedded signature fields and a signing URL
6. Emails the client a personalized signing link
7. Revokes public sharing access for security and updates Data Tables with Signwell details

### Phase 3: Signature & Delivery

1. Gmail monitors for the Signwell completion notification
2. Extracts the signed document download link from the notification email
3. Downloads the fully executed PDF from Signwell
4. Uploads it to the "Final Versions" folder in Google Drive
5. Updates Data Tables with completion status and final document URLs
6. Sends a confirmation email to the client with the signed PDF attached
7. Deletes the document from Signwell to free up storage

## Requirements

- JotForm account (free tier works)
- Gmail account with OAuth2 access
- Google Drive account (OAuth2)
- Google Docs account (OAuth2) with a draft agreement template
- Signwell account with API key
- n8n Data Tables (built-in, no external service needed)
- Google Drive folders: "Services Agreements - Drafts" and "Services Agreements - Final Versions"

## How to set up

1. Add credentials: JotForm API, Gmail OAuth2, Google Drive OAuth2, Google Docs OAuth2, Signwell API key
2. Create the JotForm: build a form with fields Company Name, Company Address (address field), Full Name (name field), Your Position/Job Title, and Email. In the "JotForm Trigger" node, select your form.
3. Create the Google Doc template: add the variables {{clientCompanyName}}, {{clientFullName}}, {{clientNamePosition}}, {{clientCompanyAddress}}, {{agreementDate1}}, {{agreementDate2}}. In the "Copy and Rename File" node, select your template document and update the folder ID to your "Drafts" folder.
4. Create the Data Table: name it "Services Agreements" with columns documentFileName, clientEmail, clientFullName, clientNamePosition, clientCompanyName, clientCompanyAddress, documentUrl, approvalStatus, sentDocumentPdfUrl, sentDate, signwellUrl, signwellDocID, docSigned, finalExecutedDocGDrive, finalExecutedDocSignwellUrl. In the "Insert Row" and all "Get/Update Row" nodes, select your Data Table.
5. Create Gmail labels: "_AGREEMENTS" with two nested sublabels, "Agreement-Approvals" and "Agreement-Completed", for filtering. In the "Check for Email Approval" node, select your approval label and update the internal email address. In the "Check Email for Completed Notification" node, select your completed label.
6. In the "Create Document in Signwell" node, update the API key and adjust the signature field coordinates for your document.
7. Set Signwell to live mode: change "test_mode": true to "test_mode": false when ready for production.
8. Activate the workflow.

## Customizing this workflow

- Change template variables: edit the "Update New File" node to add/remove fields (e.g., pricing, terms, scope of work)
- Modify the approval email: edit the "Share Email Draft" node to change the recipient, subject line, or message format
- Adjust Signwell fields: edit the "Create Document in Signwell" node to change signature/date field positions (x, y coordinates) to match your agreement template, and add any other fields you'd like
- Add an approval deadline: add a Wait node with a timeout after "Share Email Draft" to auto-remind for pending approvals
- Multi-signer support: modify the "Create Document in Signwell" recipients array to add multiple signers (e.g., both parties)
- Change storage folders: update the folder IDs in the "Upload PDF File" and "Upload Completed Doc" nodes
- Add Slack notifications: add
Slack nodes after key milestones (draft created, approved, signed)
- Custom client messaging: edit the "Send Prepared Agreement to Client" and "Send Client Completed Agreement PDF" nodes for personalized communication
- Add reminder logic: insert Wait + Send Email nodes between signing and completion to remind the client if they haven't signed within X days

## Quick Troubleshooting

- JotForm not triggering: verify the webhook is active in JotForm settings and the form ID matches the "JotForm Trigger" node
- Template variables not replacing: check that variable names in the template doc exactly match the {{variableName}} format (case-sensitive)
- Wrong internal email for approval: update the email address in the "Share Email Draft" node to your own email
- Approval email not detected: confirm the Gmail label "Agreement-Approvals" exists and the reply contains the exact word "Approved"
- Signwell document creation fails: verify the PDF has public sharing enabled before the API call AND the Signwell API key is valid in the "Create Document in Signwell" node
- Signature fields in the wrong position: adjust the x/y coordinates in the "Create Document in Signwell" node (test in the Signwell UI first to find the correct pixel positions)
- Completed document not downloading: check the Signwell completion email format; a Code node extracts the link via a regex pattern
- Data Tables errors: ensure documentFileName exactly matches between "Insert Row" and "Get/Update Row" operations
- Client emails not sending: re-authorize Gmail OAuth2 credentials and verify the sender name/address in the Gmail nodes
- Drive folder not found: update the folder IDs in the "Copy and Rename File", "Upload PDF File", and "Upload Completed Doc" nodes to your own folder IDs
- Signwell deletion fails: verify signwellDocID was correctly stored in Data Tables before deletion (check the "Update Row - Additional Doc Details" output)
- 401/403 API errors: re-authorize all OAuth2 credentials (Gmail, Google Drive, Google Docs)
- Test mode documents: change "test_mode": true to "test_mode": false in the "Create Document in Signwell" node for production signatures
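The troubleshooting note about the completion email mentions that a Code node pulls the download link out with a regex. A hypothetical sketch of that extraction; the real Signwell notification format may differ, so treat the pattern as a starting point to adapt:

```javascript
// Hypothetical link extraction from a notification email body.
// The regex just grabs the first https URL — if the real email
// contains several links, you'd anchor the pattern more tightly.
function extractDownloadLink(emailBody) {
  const match = emailBody.match(/https:\/\/[^\s"'<>]+/);
  return match ? match[0] : null;
}
```

Returning `null` (rather than throwing) lets a downstream If node route "link not found" cases to a manual-review branch instead of failing the whole run.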
## Sample Outputs

Agreement Drafts and Final folders/files in Google Drive

## File References

- Agreement Template (sample)
- Final Agreement Signed (sample)
by Seb
Stripe invoicing automation connected to your CRM — in this example, ClickUp. At the end of the flow, once your lead has been sent an invoice, you (or your team) receive an email notifying you of the newly sent invoice with all relevant details.

## How it works

- Monitors ClickUp task status → triggers the workflow when the status changes to "send invoice".
- Fetches task details from ClickUp, including customer name, email, and project cost.
- Creates a Stripe customer using the fetched information.
- Generates a Stripe invoice via HTTP request, including description, footer, and due date (calculated as a Unix timestamp).
- Adds invoice items automatically with correct amounts (converted to cents for Stripe).
- Sends the invoice to the customer automatically (manual or auto-charge option).
- Sends notification emails to team members with a link to the ClickUp task.

Works with other CRMs like Monday or HubSpot, not just ClickUp. Test mode is available in Stripe to validate the workflow without sending real invoices.

## Setup steps

- Connect your ClickUp account
- Connect your Stripe account via HTTP Request (shown in the YouTube video linked below)
- Connect your email account to n8n (Gmail, Outlook, etc.) for sending the emails to your team and the client

## Important

Put your Stripe account IN TEST/DEVELOPER MODE when testing and developing the automation, or set up an entirely separate account from your main Stripe account. This applies only up to the point where you want to send the invoice, as you cannot send an invoice while your Stripe account is in test/developer mode.

For a complete rundown on how to set this up, watch the full video tutorial here: https://youtu.be/vthK5I8x33k?si=W0Nreu403pDs-ud3

My LinkedIn: https://www.linkedin.com/in/seb-gardner-5b439a260/
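The two unit conversions mentioned above trip people up most often: Stripe expects the invoice due date as a Unix timestamp in seconds and amounts in the smallest currency unit (cents for USD). A sketch of both, as they might appear in an n8n Code node before the HTTP request:

```javascript
// Due date as a Unix timestamp in SECONDS (Stripe's convention),
// not the milliseconds JavaScript dates use natively.
function dueDateUnix(daysFromNow, now = new Date()) {
  const due = new Date(now.getTime() + daysFromNow * 24 * 60 * 60 * 1000);
  return Math.floor(due.getTime() / 1000);
}

// Amounts in cents. Math.round guards against floating-point noise
// (19.99 * 100 is 1998.9999... in IEEE 754).
function toCents(dollars) {
  return Math.round(dollars * 100);
}
```

So a $19.99 project cost becomes `1999`, and "due in 30 days" becomes a seconds-based epoch value you can drop straight into the invoice request body.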
by Cheng Siong Chin
## How It Works

Automates monthly revenue aggregation from multiple sources with intelligent tax forecasting using GPT-4 structured analysis. The workflow fetches revenue data from up to three distinct sources, consolidates the datasets into unified records, and applies an OpenAI GPT-4 model for predictive tax obligation forecasting with context awareness. The system generates formatted reports with structured forecast outputs, automatically sends comprehensive tax projections to agents via Gmail, and stores results in Google Sheets for audit trails. Designed for tax professionals, accounting firms, and finance teams requiring accurate predictive tax planning, cash flow forecasting, and proactive compliance strategy without manual calculations.

## Setup Steps

1. Configure an OpenAI API key for GPT-4 model access
2. Connect three revenue data sources with appropriate credentials
3. Map the data aggregation logic for multi-source consolidation
4. Define the structured output schema for forecast results
5. Set up Gmail for automated agent notification
6. Configure the Google Sheets destination

## Prerequisites

OpenAI API key with GPT-4 access, Gmail account, Google Sheets, credentials for three revenue data sources

## Use Cases

Monthly tax liability projections, quarterly estimated tax planning

## Customization

Adjust forecast model parameters, add additional revenue sources, modify email templates

## Benefits

Eliminates manual tax calculations, enables proactive tax planning, improves cash flow forecasting accuracy
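The multi-source consolidation step (merging up to three revenue feeds into unified monthly records) can be sketched as a simple merge keyed by month. The `{ month, amount }` record shape is an assumption for illustration; your sources will have their own schemas to map first:

```javascript
// Sketch of multi-source revenue consolidation, summing amounts
// per month across any number of feeds. Record shape is assumed.
function consolidate(...sources) {
  const byMonth = {};
  for (const source of sources) {
    for (const { month, amount } of source) {
      byMonth[month] = (byMonth[month] || 0) + amount;
    }
  }
  return byMonth;
}
```

The consolidated per-month totals are then what you would hand to the GPT-4 forecasting prompt, rather than three raw feeds.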
by yuta tokumitsu
## Automate intelligent customer support responses with AI and Slack

## How it works

1. Receive the request via webhook with the customer question
2. Analyze sentiment and detect urgency using JavaScript
3. Send urgent alerts to Slack for critical cases
4. Search the knowledge base and fetch conversation history from PostgreSQL
5. Generate an AI response with context-aware prompts
6. Route intelligently: auto-respond via email OR escalate to Slack
7. Log everything to Google Sheets and PostgreSQL for analytics

## Setup steps

1. Slack webhooks: replace YOUR_URGENT_WEBHOOK and YOUR_ESCALATION_WEBHOOK with your webhook URLs
2. Google Sheets: replace YOUR_SPREADSHEET_ID with your spreadsheet ID and authenticate
3. Email: configure SMTP/Gmail credentials in the email node
4. PostgreSQL (optional): create the support_conversations table or disable the DB nodes
5. Production: replace the mock AI nodes with OpenAI/Anthropic API nodes

## Key features

- Multi-language support (Japanese & English)
- Sentiment analysis with urgency detection
- Smart escalation routing
- Real-time Slack notifications
- Comprehensive analytics logging
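The urgency detection runs in plain JavaScript inside the workflow. A minimal keyword-based sketch covering both supported languages; the template's actual keyword lists and scoring are assumptions here:

```javascript
// Keyword-based urgency check (English + Japanese), as a stand-in for
// the workflow's JavaScript analysis node. The keyword list is an
// illustrative assumption — extend it for your support domain.
const URGENT_KEYWORDS = ['urgent', 'asap', 'immediately', 'outage', '至急', '緊急'];

function detectUrgency(message) {
  const text = message.toLowerCase();
  return URGENT_KEYWORDS.some((kw) => text.includes(kw));
}
```

A message flagged here would take the "urgent alerts to Slack" branch; everything else proceeds to the normal knowledge-base lookup and AI response path.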
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Keep your SEO performance on track with this automated SEO Watchlist Monitor! This workflow combines AI-powered strategy analysis with real-time search ranking checks to track keyword positions, identify content gaps, and alert you to critical ranking drops. Perfect for marketing teams ensuring search visibility and competitive intelligence across platforms. 🚀🔍

## What This Template Does

1️⃣ Triggers daily SEO intelligence checks to monitor keyword performance.
2️⃣ Configures target keywords, competitor domains, and geographic focus.
3️⃣ Validates the SEO configuration to ensure proper setup.
4️⃣ Uses AI to analyze keyword competitiveness and strategic opportunities.
5️⃣ Checks real-time search rankings using a Google Search scraper.
6️⃣ Detects critical ranking drops below position 10.
7️⃣ Saves SEO intelligence to Google Sheets for tracking.
8️⃣ Sends email alerts for urgent ranking issues.
9️⃣ Provides daily Slack summaries of SEO performance.
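The critical-drop detection (step 6️⃣) reduces to a simple threshold check over the scraped rankings. A sketch, assuming `keyword` and `rank` field names on each tracked row (your sheet headers may differ):

```javascript
// Sketch of the critical-drop check: flag any tracked keyword whose
// current rank has slipped below position 10 (i.e., rank number > 10).
function findCriticalDrops(rankings) {
  return rankings.filter((r) => r.rank > 10);
}

// Example: one keyword on page one, one that has dropped off it.
const rows = [
  { keyword: 'crm automation', rank: 4 },
  { keyword: 'ai seo audit', rank: 15 },
];
const critical = findCriticalDrops(rows);
```

Rows returned here would feed the urgent email alert branch, while all rows still flow to the Google Sheets log and the daily Slack summary.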
**Key Benefits**
✅ Monitors keyword rankings and competitor movements daily
✅ Identifies content gaps and strategic opportunities with AI analysis
✅ Alerts instantly on critical ranking drops for quick action
✅ Centralizes SEO intelligence in Google Sheets for team visibility
✅ Combines AI insights with real-time search data for comprehensive monitoring

**Features**
- Daily automated schedule for continuous monitoring
- AI-powered SEO strategy analysis and competitive intelligence
- Real-time search ranking checks using the Decodo scraper
- Critical alert system for ranking drops
- Google Sheets integration for data centralization
- Slack and Gmail notifications for team awareness
- Configuration validation and error logging
- Structured data parsing for consistent reporting

**Requirements**
- OpenAI API credentials for AI analysis
- Decodo API credentials for search scraping
- Google Sheets OAuth2 credentials with edit access
- Gmail OAuth2 credentials for email alerts
- Slack Bot Token with `chat:write` permission
- Environment variables for configuration settings

**Target Audience**
- SEO and digital marketing teams 🎯
- Content strategy and growth teams 📈
- Competitive intelligence professionals 🔍
- Marketing operations teams 🚀
- Agency account managers managing multiple clients 💼

**Step-by-Step Setup Instructions**
1️⃣ Connect OpenAI credentials for AI analysis capabilities
2️⃣ Set up Decodo API credentials for search scraping functionality
3️⃣ Configure Google Sheets with the required headers (Keyword, Rank, description, etc.)
4️⃣ Add Gmail and Slack credentials for alerting and notifications
5️⃣ Set your target keywords, competitors, and geographic focus in the configuration node
6️⃣ Configure the cron schedule to your desired monitoring frequency (daily by default)
7️⃣ Run once manually to verify all integrations and data flow
8️⃣ Activate for ongoing SEO performance tracking and alerting

✅ **Pro Tip:** Use coupon code "YARON" to get 23K requests for testing (in Decodo)
by Praneel S
⚠️ **Disclaimer:** This workflow uses WhatsApp, Google Calendar, and Gmail nodes that must be configured manually.

**Who's it for**
This workflow is built for professionals, teams, and automation enthusiasts who want to manage their Google Calendar and Gmail directly from WhatsApp, powered by an AI assistant using OpenAI GPT or Google Gemini. It lets users chat naturally through WhatsApp to schedule meetings, send emails, and check events, all without opening Gmail or Google Calendar.

**How it works**
1. The WhatsApp Trigger node captures incoming messages from users.
2. The AI Agent (powered by Gemini or GPT) interprets user queries and determines the best tool to use.
3. The Simple Memory node keeps context between messages, keyed by the user's phone number.
4. The Google Calendar nodes handle listing, creating, and updating events, and check your availability before scheduling.
5. The Gmail nodes handle sending emails and reading and summarizing recent messages.
6. The Date & Time node converts natural language like "next Monday at 3 PM" into proper ISO time format.
7. The assistant responds via Send WhatsApp Response, sending clear confirmations and replies.

**Features**
- Manage Gmail and Calendar entirely via WhatsApp.
- AI-powered understanding of natural-language commands.
- Integrated with Google Meet for automatic conferencing links.
- Short-term memory for context retention.
- Fully modular – swap Gemini with OpenAI GPT or any LLM.

**Setup Steps**
1. Configure the WhatsApp Cloud API via Meta for Developers.
2. Set up Google Calendar and Gmail OAuth2 credentials.
3. Add your Google API keys and calendar email.
4. Connect your OpenAI or Gemini model credentials.
Activate and test the workflow with messages like:
- "Schedule a meeting tomorrow at 5 PM."
- "Check my latest emails."
- "Send an email to alex@example.com about our project."

**Requirements**
- n8n instance (self-hosted or cloud)
- WhatsApp Business API (Meta Developer Account)
- Google Workspace or Gmail account
- OpenAI API key or Google Gemini API key
- Properly configured webhooks for the WhatsApp Trigger

**Example Prompts**
- "What's on my calendar this week?"
- "Email John to confirm our meeting."
- "When am I free tomorrow afternoon?"

**Customization**
- Replace Gemini with OpenAI GPT in the AI Agent node.
- Adjust memory length for longer or shorter conversations.
- Add Slack or Teams notification nodes.
- Modify the prompt personality or response tone.

**Credits**
Created by Praneel. For detailed setup help, visit praneel.tech/contact
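The per-user context described under "How it works" — the Simple Memory node keeping a short history keyed by phone number — can be sketched as follows. The window size and function names are illustrative assumptions, not the node's internals.

```javascript
// Sketch of short-term conversation memory keyed by phone number,
// similar in spirit to n8n's Simple Memory node. Window size is assumed.
const WINDOW = 6; // keep only the last 6 messages per user
const sessions = new Map();

function remember(phone, role, text) {
  const history = sessions.get(phone) ?? [];
  history.push({ role, text });
  if (history.length > WINDOW) history.shift(); // drop the oldest entry
  sessions.set(phone, history);
  return history;
}
```

A shorter window makes the assistant cheaper and more forgetful; a longer one preserves multi-turn context such as "move *that* meeting to Friday."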
by Habeeb Mohammed
**Who's it for**
This workflow transforms hours of manual video editing into an automated AI-powered pipeline. Perfect for anyone looking to repurpose long-form content into viral short-form clips. Ideal users include:

- **Content Creators** - YouTubers producing long-form videos who want to maximize reach by automatically generating TikTok, Reels, and Shorts from their content
- **Social Media Managers** - Agencies and freelancers handling multiple clients who need to scale clip production without hiring additional editors
- **Podcasters** - Audio and video podcast hosts wanting to create promotional clips highlighting the best moments from each episode
- **Video Editors** - Professional editors looking to automate repetitive clipping tasks and focus on creative decisions rather than technical execution
- **Marketing Teams** - B2B and B2C teams extracting key moments from webinars, product demos, tutorials, and educational content for social campaigns

Whether you're a solo creator or managing content at scale, this workflow saves 5-10 hours per video while maintaining professional-quality output.

**How it works**
This workflow combines AI analysis with professional video editing tools to automatically identify and produce viral-ready clips from any YouTube video.
The process flows through three main stages:

**Stage 1: Download and Analysis**
- Submit a YouTube URL through the built-in form trigger
- yt-dlp simultaneously downloads the video in the highest quality and extracts subtitles or auto-generated transcripts
- The transcript is chunked into 150-segment batches for optimal AI processing
- Each batch is analyzed by Gemini AI using specialized prompts that evaluate viral potential based on hooks, pacing, emotional peaks, and engagement triggers
- The AI identifies 3-5 high-quality moments per batch and assigns a virality score to each potential clip

**Stage 2: Clip Selection and Extraction**
- All AI-identified clips are merged and sorted by their virality scores
- The top 10 candidates are automatically selected for processing
- FFmpeg extracts each clip segment from the original video at precise timestamps
- Clips are processed sequentially to prevent system overload

**Stage 3: Professional Editing Pipeline**
Each clip enters a multi-stage editing subworkflow with automated operations:
- Smart 9:16 cropping that intelligently frames the subject for vertical platforms
- Precise trimming to remove dead air and optimize pacing
- Dynamic subtitle generation with sizing calculated from the video resolution
- Professional subtitle styling including bold text, high-contrast colors, strategic positioning, and text wrapping
- Subtitles burned directly into the video as permanent overlays

**Final Delivery:** The workflow processes clips with configurable wait times to match your system's capabilities. When all clips complete processing, you receive an email notification and find your social-ready clips in the `/data/clips/` directory, ready for upload to any platform.

**Requirements**
⚠️ **Self-hosted n8n only** - This workflow requires command-line access and cannot run on n8n Cloud due to its dependency on system-level tools.

System dependencies you must install:
- **FFmpeg** - Industry-standard video processing tool for trimming, cropping, and subtitle burning.
  Install on your n8n host system following this comprehensive guide. Most Linux systems can install via the package manager: `apt-get install ffmpeg` or `yum install ffmpeg`.
- **yt-dlp** - Advanced YouTube downloader that handles video and subtitle extraction. Follow the official installation instructions. Recommended: `pip install yt-dlp` or a direct binary download.
- **FFprobe** - Usually included with FFmpeg; used to detect video dimensions for dynamic subtitle sizing.

Credentials needed:
- **Google Gemini API account** - Powers the AI analysis for clip identification and editing instructions. Get your free API key; the free tier has generous limits.
- **Gmail OAuth2 credentials** - Enables email notifications when clips are ready. Set up through n8n's credential system.

Storage requirements:
- Ensure the `/data/clips/` directory exists with write permissions
- Plan for 2-3x the original video size in temporary storage during processing
- Final clips typically use 10-30% of the original video size

**How to set up**

**Step 1: Install system dependencies**
SSH into your n8n host and install the required tools. For Ubuntu/Debian systems, run `apt-get update`, then `apt-get install ffmpeg`, then `pip install yt-dlp`. Verify the installations with `ffmpeg -version` and `yt-dlp --version`.

**Step 2: Configure directory structure**
Create the clips output directory with proper permissions: `mkdir -p /data/clips` followed by `chmod 755 /data/clips`.

**Step 3: Import the workflow**
Download the workflow JSON and import it into your n8n instance. You'll see several sticky notes color-coded by stage: yellow for the description, blue for download/analysis, pink for editing operations, and green for clipping.

**Step 4: Set up credentials**
Navigate to the "viral clips identification" node and add your Google Gemini API credentials. The workflow uses the gemini-2.5-flash model for an optimal balance of speed and quality. Then configure Gmail OAuth2 in the "Send a message" node following n8n's authentication wizard.
**Step 5: Update email notification**
Open the "Send a message" node and replace habeebmohammedfaiz@gmail.com with your own email address.

**Step 6: Create the editing subworkflow**
The workflow references a separate subworkflow for the editing pipeline. Create a new workflow in n8n, copy all nodes from the "EDITING" section (between the Execute Workflow Trigger and the final output), and save it. Note the workflow ID from the URL.

**Step 7: Link the subworkflow**
In the main workflow, open the "Call subworkflow" node and update the workflow ID to match your newly created editing workflow.

**Step 8: Test with a short video**
Start with a 5-10 minute YouTube video for your first test. Use the manual trigger or form submission. Monitor the execution to ensure all nodes complete successfully and clips appear in `/data/clips/`.

**Step 9: Adjust performance settings**
Based on your system's performance during the test, modify the Wait node durations. Systems with 8GB+ RAM and modern CPUs can reduce wait times to 30 seconds. Limited systems should keep 60-second waits or increase them.

**How to customize the workflow**

**Adjust clip quantity and quality thresholds**
Open the "filter out top clips according to score" node. The code currently uses `.slice(0, 10)` to select the top 10 clips. Change this number to suit your needs: use `.slice(0, 5)` for only the best clips, or `.slice(0, 20)` for more options. You can also filter by score by adding `results.filter(c => c.score > 0.7)` before the slice operation, so that only clips with virality scores above 70% are included.

**Customize subtitle appearance**
Navigate to the "calculate relative subtitle size" node.
The JavaScript code defines several styling variables you can modify:
- `fontSize` - Currently calculated dynamically, but you can hardcode it: `const fontSize = 48;`
- `fontName` - Change from Arial to any system font: `const fontName = 'Impact';`
- `primaryColor` - Modify the text color using BGR hex format: `'&H00FF00&'` for green, `'&H0000FF&'` for red
- `borderColor` - Adjust the outline color for better contrast
- `outlineWidth` - Increase from 1 to 2 or 3 for thicker borders
- `marginV` - Control the vertical position (higher values move text up from the bottom)

**Modify AI analysis prompts**
In the "viral clips identification" node, edit the Gemini prompt to target specific content types. For educational content, add "Focus on key teaching moments and actionable tips." For entertainment, emphasize "Identify funny moments, reactions, and unexpected events." For podcast clips, specify "Extract controversial opinions, storytelling segments, and quotable statements."

**Change aspect ratios**
The workflow defaults to 9:16 for vertical video. To create horizontal clips for YouTube or other platforms, open the "Analyze the actual whole video" node and change the aspect ratio in the JSON schema from `"aspect_ratio": "9:16"` to `"aspect_ratio": "16:9"`. The AI will automatically adjust cropping coordinates accordingly.

**Enable audio normalization**
By default, audio normalization is disabled for faster processing. To enable it, open the "extract all actionable operations" node, find the `audio_normalize` task object, and change `enabled: false` to `enabled: true`. This ensures consistent volume levels across all clips but adds processing time.

**Add custom editing operations**
The editing pipeline is modular. You can add new operations such as:
- Color grading by inserting FFmpeg color filters
- Logo overlays by adding watermark commands
- Intro/outro sequences by concatenating video files
- Background music by mixing audio tracks

Add these as new task objects in the "extract all actionable operations" node, following the existing pattern.
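To make the subtitle variables concrete, here is a sketch of resolution-relative sizing in the spirit of the "calculate relative subtitle size" node. The scaling factor, minimum size, and thresholds are illustrative assumptions, not the node's actual values.

```javascript
// Sketch of dynamic subtitle styling: font size scales with resolution,
// with a floor so text stays readable on small videos. Constants are assumed.
function subtitleStyle(videoWidth, videoHeight) {
  // Scale font to ~5% of the shorter dimension, never below 24px.
  const base = Math.min(videoWidth, videoHeight);
  const fontSize = Math.max(24, Math.round(base * 0.05));
  return {
    fontSize,
    fontName: "Arial",
    primaryColor: "&HFFFFFF&",               // white text, ASS BGR hex format
    outlineWidth: fontSize > 40 ? 3 : 2,     // thicker border for larger text
    marginV: Math.round(videoHeight * 0.1),  // lift text 10% from the bottom
  };
}
```

Hardcoding `fontSize` as described above simply bypasses this calculation, which is why the node exposes it as a plain variable.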
**Customize notification content**
Open the "Send a message" node to modify the email subject or body text, or to add clip details. You can include clip metadata such as timestamps, scores, and descriptions using expressions like `{{ $json.hook }}` or `{{ $json.score }}`.

**Integrate with cloud storage**
Add nodes after clip generation to automatically upload finished clips to Google Drive, Dropbox, AWS S3, or any n8n-supported storage service. Use the Loop Over Items1 output to access the completed clip file paths.

**Schedule automated processing**
Replace the Form Trigger with a Schedule Trigger to automatically process videos from a spreadsheet or RSS feed. Combine with a Google Sheets integration to maintain a queue of videos to process overnight.
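The selection logic described under "Adjust clip quantity and quality thresholds" can be sketched as one small function combining the score filter and the `.slice` limit. The field names (`score`) match that section; the sample data and function name are made up for illustration.

```javascript
// Sketch of merging AI-identified clips, filtering by virality score,
// and keeping the top N — the filter/slice pattern described above.
function selectTopClips(results, { minScore = 0.7, limit = 10 } = {}) {
  return results
    .filter((c) => c.score > minScore) // drop low-virality candidates
    .sort((a, b) => b.score - a.score) // highest score first
    .slice(0, limit);                  // keep the top N
}
```

Raising `minScore` trades clip quantity for quality, while `limit` caps how many clips enter the (slow) FFmpeg editing pipeline.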