by Jitesh Dugar
**Jotform AI-Powered Loan Application & Pre-Approval Automation System**

Transform manual loan processing into same-day pre-approvals: 50% faster closings, a 90% reduction in manual review time, and automated underwriting decisions with AI-powered financial analysis and instant applicant notifications.

**What This Workflow Does**

Revolutionizes mortgage and loan processing with AI-driven financial analysis and automated decision workflows:

- 📝 **Digital Application Capture** - Jotform collects complete applicant data, income, employment, and loan details
- 🤖 **AI Financial Analysis** - GPT-4 calculates debt-to-income ratio, loan-to-value ratio, and approval likelihood
- 💳 **Automated Credit Assessment** - Instant credit score evaluation and payment history analysis
- 📊 **Risk Scoring** - AI assigns 1-100 risk scores based on multiple financial factors
- ✅ **Intelligent Routing** - Automatic pre-approval, conditional approval, or denial based on lending criteria
- 📧 **Instant Notifications** - Applicants receive approval letters within minutes of submission
- 👔 **Underwriter Alerts** - Pre-approved loans automatically route to loan officers with complete analysis
- 📋 **Document Tracking** - Required documents list generated based on application specifics
- 📅 **Closing Scheduling** - Approved loans trigger closing coordination workflows
- 📈 **Complete Audit Trail** - Every application logged with financial metrics and decision rationale

**Key Features**

- **AI Underwriting Analyst:** GPT-4 evaluates loan applications across 10+ financial dimensions, including debt ratios, risk assessment, and approval recommendations
- **Debt-to-Income Calculation:** Automatically calculates the DTI ratio and compares it against lending standards (43% threshold for qualified mortgages)
- **Loan-to-Value Analysis:** Evaluates down payment adequacy and property value against the loan amount requested
- **Credit Score Integration:** Simulated credit assessment (ready for real credit bureau API integration such as Experian, Equifax, or TransUnion)
- **Approval Likelihood Scoring:** AI rates approval probability as high/medium/low based on the complete financial profile
- **Risk Assessment:** A 1-100 risk score that considers income stability, debt levels, credit history, and employment status
- **Interest Rate Recommendations:** AI suggests appropriate rate ranges based on applicant qualifications
- **Conditional Approval Logic:** Identifies specific requirements needed for final approval (additional documentation, debt paydown, etc.)
- **Multi-Path Routing:** Different workflows for pre-approved (green path), conditional (yellow path), and denied (red path) applications
- **Monthly Payment Estimates:** AI calculates estimated mortgage payments including principal, interest, taxes, and insurance
- **Employment Verification Tracking:** Flags employment status and stability in the approval decision
- **Document Requirements Generator:** Custom list of required documents based on the applicant's situation and loan type
- **Underwriter Dashboard Integration:** Pre-approved applications automatically notify underwriters with a complete financial summary
- **Applicant Communication:** Professional, branded emails for every outcome (pre-approval, conditional, denial)
- **Alternative Options for Denials:** Denied applicants receive constructive guidance on improving their qualifications
- **Compliance Ready:** Decision rationale documented for regulatory compliance and audit requirements

**Perfect For**

- **Mortgage Lenders:** Banks and credit unions processing home loan applications (purchase, refinance, HELOC)
- **Commercial Lenders:** Business loan and commercial real estate financing institutions
- **Auto Finance Companies:** Car dealerships and auto loan providers needing instant credit decisions
- **Personal Loan Providers:** Fintech companies and online lenders offering consumer loans
- **Credit Unions:** Member-focused financial institutions streamlining loan approval processes
- **Mortgage Brokers:** Independent brokers managing applications for multiple lenders
- **Hard Money Lenders:** Alternative lenders with custom underwriting criteria
- **Student Loan Services:** Educational financing with income-based qualification

**What You'll Need**

Required integrations:

- **Jotform** - Loan application form (free tier works; Pro recommended for file uploads). Create your form for free on Jotform: https://www.jotform.com
- **OpenAI API** - GPT-4 for AI financial analysis and underwriting decisions (approximately 0.30-0.50 USD per application)
- **Gmail** - Automated notifications to applicants and underwriters
- **Google Sheets** - Loan application database and pipeline tracking

Optional integrations (recommended for production):

- **Credit Bureau APIs** - Experian, Equifax, or TransUnion for real credit pulls
- **Document Management** - DocuSign or HelloSign for e-signatures and document collection
- **Property Appraisal APIs** - Automated valuation models for property verification
- **Calendar Integration** - Calendly or Google Calendar for closing date scheduling
- **CRM Systems** - Salesforce or HubSpot for lead management and follow-up
- **Loan Origination Software (LOS)** - Encompass, Calyx, or BytePro integration

**Quick Start**

1. **Import Template** - Copy the JSON and import it into n8n.
2. **Add OpenAI Credentials** - Set up your OpenAI API key (GPT-4 required for accurate underwriting).
3. **Create the Jotform Loan Application** with these fields: Full Name (q3_fullName), Email (q4_email), Phone (q5_phone), Social Security Number (q6_ssn, encrypted field), Monthly Income (q7_monthlyIncome, number), Monthly Debts (q8_monthlyDebts, number - credit cards, car loans, student loans), Loan Amount Requested (q9_loanAmount, number), Down Payment (q10_downPayment, number), Property Value (q11_propertyValue, number), Employment Status (q12_employmentStatus, dropdown: Full-time, Part-time, Self-employed, Retired). Additional fields: Date of Birth, Address, Employer Name, Years at Job, Property Address.
4. **Configure Gmail** - Add Gmail OAuth2 credentials (the same credential works for all 4 Gmail nodes).
5. **Set up Google Sheets** - Create a spreadsheet with a "Loan_Applications" sheet and replace YOUR_GOOGLE_SHEET_ID in the workflow. 16 columns auto-populate: timestamp, applicationId, applicantName, email, phone, loanAmount, downPayment, monthlyIncome, monthlyDebts, creditScore, dtiRatio, ltvRatio, riskScore, approvalStatus, monthlyPayment, interestRate.
6. **Customize Approval Criteria (optional)** - Edit the "Check Approval Status" node: adjust the credit score minimum (default: 680), modify the DTI threshold (default: 43%), and set LTV requirements.
7. **Configure Credit Integration** - Replace the "Simulate Credit Check" node with a real credit bureau API, or keep the simulation for testing/demo purposes.
8. **Brand Email Templates** - Update company name, logo, and contact information; customize approval letter formatting; add compliance disclosures as required.
9. **Set the Underwriter Email** - Update the underwriter contact in the "Notify Underwriter" node and add CC recipients for the loan ops team.
10. **Test the Workflow** - Submit test applications with different scenarios: high income, low debt (should pre-approve); moderate income, high debt (should go conditional); low income, excessive debt (should deny).
11. **Compliance Review** - Have your legal/compliance team review the automated decision logic.
12. **Go Live** - Deploy the form on your website, share it with loan officers, and integrate it with marketing.

**Customization Options**

- **Loan Type Variations:** Customize for conventional, FHA, VA, USDA, jumbo, or commercial loans
- **Custom Underwriting Rules:** Adjust DTI limits, credit minimums, and LTV requirements per loan product
- **Manual Review Triggers:** Flag edge cases for manual underwriter review before automation
- **Document Upload Integration:** Add Jotform file upload fields for paystubs, tax returns, and bank statements
- **Income Verification APIs:** Integrate with Plaid, Finicity, or Argyle for automated income verification
- **Employment Verification:** Connect to The Work Number or other employment databases
- **Property Appraisal Automation:** Integrate AVMs (Automated Valuation Models) from CoreLogic or HouseCanary
- **Co-Borrower Support:** Add fields and logic for joint applications with multiple income sources
- **Business Loan Customization:** Modify for business financials (revenue, EBITDA, business credit scores)
- **Rate Shopping:** Integrate rate tables to provide real-time interest rate quotes
- **Pre-Qualification vs Pre-Approval:** Create a lighter version for soft-credit-pull pre-qualification
- **Conditional Approval Workflows:** Automated follow-up sequences for document collection
- **Closing Coordination:** Integrate with title companies, attorneys, and closing services
- **Regulatory Compliance:** Add TRID timeline tracking, adverse action notices, and HMDA reporting
- **Multi-Language Support:** Translate forms and emails into Spanish, Chinese, and other languages

**Expected Results**

- **Same-day pre-approval** - Applications processed in minutes vs 3-5 days of manual review
- **50% faster closings** - A streamlined process reduces time from application to closing
- **90% reduction in manual review time** - AI handles initial underwriting; humans only review exceptions
- **95% applicant satisfaction** - Instant decisions and clear communication improve the experience
- **75% reduction in incomplete applications** - Required fields force complete submissions
- **60% fewer applicant calls** - Automated status updates reduce "where's my application" inquiries
- **100% application tracking** - Complete audit trail from submission to final decision
- **40% increase in loan officer productivity** - Focus on high-value activities, not data entry
- **80% decrease in approval errors** - Consistent AI analysis eliminates human calculation mistakes
- **30% improvement in compliance** - Automated documentation and decision rationale for audits

**Pro Tips**

- **Test with Multiple Scenarios:** Submit applications with various income/debt combinations to validate that the routing logic works correctly
- **Adjust DTI Thresholds for Loan Type:** Conventional mortgages: 43% max. FHA loans: 50% max. Auto loans: 35-40% max. Personal loans: 40-45% max.
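The DTI/LTV math and the monthly-payment estimate described above can be sketched in the style of an n8n Code node. This is an illustrative sketch, not the template's actual node code: the input field names mirror the Jotform fields listed under Quick Start, and the amortization formula covers only the principal-and-interest portion (taxes and insurance would be added on top).

```javascript
// Assumed input shape mirrors the Jotform fields from Quick Start
// (monthlyIncome, monthlyDebts, loanAmount, propertyValue).
function analyzeApplication(app) {
  const dti = app.monthlyDebts / app.monthlyIncome; // debt-to-income ratio
  const ltv = app.loanAmount / app.propertyValue;   // loan-to-value ratio
  return {
    dtiPct: +(dti * 100).toFixed(1),  // as a percentage, 1 decimal place
    ltvPct: +(ltv * 100).toFixed(1),
    dtiPass: dti <= 0.43,             // 43% qualified-mortgage threshold
  };
}

// Standard amortization formula: M = P * r * (1+r)^n / ((1+r)^n - 1),
// where r is the monthly rate and n the number of monthly payments.
function monthlyPI(principal, annualRate, years) {
  const r = annualRate / 12;
  const n = years * 12;
  const growth = Math.pow(1 + r, n);
  return (principal * r * growth) / (growth - 1);
}
```

For example, $2,400 of monthly debts against $8,000 of income gives a 30% DTI (passes the 43% test), and a $300,000 loan at 6% over 30 years works out to roughly $1,799/month in principal and interest.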
- **Credit Score Tiers Matter:** Build rate sheets with score tiers (740+: prime, 680-739: near-prime, 620-679: subprime, below 620: denied or hard money)
- **Income Verification Priorities:** W-2 employees (easy), self-employed (complex), commission/bonus heavy (average 2 years), rental income (75% counts), gig economy (difficult)
- **Document Checklist Customization:** Vary required docs by loan type, amount, and risk profile to avoid over-documentation for low-risk loans
- **Conditional Approval vs Outright Denial:** When in doubt, use conditional - it gives applicants a path to approval and keeps them in the pipeline
- **Adverse Action Notices:** For denials, include specific reasons (per FCRA requirements) and instructions for disputing credit report errors
- **Pre-Qualification vs Pre-Approval:** Pre-qual uses a soft credit pull (no impact on score); pre-approval uses a hard pull (official decision)
- **Co-Borrower Logic:** When DTI is high, automatically suggest a co-borrower as an option to strengthen the application
- **Rate Lock Automation:** Pre-approved applications should include a rate lock expiration date (typically 30-60 days)
- **Property Appraisal Triggers:** Auto-order appraisals for pre-approved mortgage applications to keep the process moving
- **Underwriter Dashboard:** Build a Google Sheets dashboard with filters so underwriters can sort by approval status, loan amount, and date
- **Compliance Monitoring:** Regularly audit AI decisions to ensure no discriminatory patterns (disparate impact analysis)
- **Customer Service Integration:** Link application IDs to support tickets so agents can quickly pull up loan status
- **Marketing Attribution:** Track lead sources in the form to measure which marketing channels produce the best-quality applicants

**Learning Resources**

This workflow demonstrates advanced automation techniques:

- **AI Agents for Financial Analysis:** Multi-dimensional loan qualification using BANT-style underwriting criteria
- **Complex Conditional Logic:** Multi-path routing with nested IF conditions for approval/conditional/denial workflows
- **Financial Calculations:** Automated DTI, LTV, DSCR, and payment estimation algorithms
- **Risk Scoring Models:** Comprehensive risk assessment combining credit, income, debt, and employment factors
- **Decision Documentation:** Complete audit trail with AI reasoning for regulatory compliance
- **Email Customization:** Dynamic content generation based on approval outcomes and applicant situations
- **Data Pipeline Design:** Structured data flow from application through analysis to decision and notification
- **Simulation vs Production:** Credit check node designed for an easy swap from simulation to a real API integration
- **Parallel Processing:** Simultaneous logging and notification workflows for efficiency
- **Workflow Orchestration:** Coordination of multiple decision points and communication touchpoints

Questions or customization? The workflow includes detailed sticky notes explaining each analysis component and decision logic.

**Template Compatibility**

- ✅ n8n version 1.0+
- ✅ Works with n8n Cloud and Self-Hosted
- ✅ Production-ready for financial institutions
- ✅ Fully customizable for any loan type

**Compliance Note:** This template is designed for demonstration and automation purposes. Always consult legal counsel to ensure compliance with TILA, RESPA, ECOA, FCRA, and applicable state lending regulations before deploying in production.
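The three-path routing (pre-approved / conditional / denied) can be sketched as a plain function. The thresholds are the defaults named above (680 credit minimum, 43% DTI) plus an assumed 95% LTV cap and the 620 hard floor from the credit-tier pro tip; the template's actual "Check Approval Status" node may weigh these differently.

```javascript
// Illustrative routing sketch, not the template's node logic.
// Thresholds: 620 hard floor, 680 credit minimum, 43% DTI cap,
// and an assumed 95% LTV cap.
function routeApplication({ creditScore, dtiPct, ltvPct }) {
  if (creditScore < 620) return "denied"; // subprime cutoff from Pro Tips

  const issues = [];
  if (creditScore < 680) issues.push("credit score below minimum");
  if (dtiPct > 43) issues.push("DTI above 43%");
  if (ltvPct > 95) issues.push("down payment too small (LTV above 95%)");

  if (issues.length === 0) return "pre-approved";
  // A single fixable issue keeps the applicant in the pipeline,
  // matching the "when in doubt, use conditional" pro tip.
  return issues.length <= 1 ? "conditional" : "denied";
}
```

With this shape, the three Quick Start test scenarios land on the three expected paths, and the `issues` array doubles as the list of conditions for a conditional-approval letter.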
by Servify
This n8n template demonstrates how to build an autonomous AI assistant that handles real business tasks through natural conversation on Telegram. The example shows meeting scheduling with CRM lookup and calendar management, but the architecture supports any business automation you can imagine - simply add tools and the AI learns to use them automatically.

Use cases are many: try automating appointment scheduling, customer support tickets, invoice generation, lead qualification, email management, report generation, data entry, or task coordination!

**Good to know**

- OpenAI API costs are minimal at ~$0.001 per conversation with GPT-4o-mini.
- The AI agent makes autonomous decisions and can chain multiple tool calls to complete complex tasks.
- Conversation context is not persisted between sessions (this can be extended with a memory database).
- Calendar availability is checked for business hours (9 AM - 4 PM) by default.
- The workflow assumes contacts are stored in Google Sheets with Name and Email columns.
- This is production-ready code that can be deployed immediately for real business use.

**How it works**

1. The user sends a natural language message to the Telegram bot requesting a meeting.
2. The workflow extracts the message content, chat ID, and user information.
3. The CRM database, containing contact information, is loaded from Google Sheets.
4. The AI agent analyzes the request and autonomously decides which tools to use.
5. The AI searches the CRM for contacts, checks Google Calendar availability, and proposes 3 available time slots.
6. The user confirms their preferred time through a conversational reply.
7. Upon confirmation, the workflow creates a Google Calendar event with both parties invited.
8. A professional confirmation email is automatically sent via Gmail to the meeting attendee.

The entire multi-step process executes autonomously through simple conversation.

**How to use**

1. Set up a Google Sheet as your CRM with columns: Name, Email, Phone.
2. Create a Telegram bot via BotFather and get your bot token.
3. Import this workflow and connect your credentials (Telegram, OpenAI, Google Sheets, Calendar, Gmail).
4. Replace the placeholder IDs with your actual Google Sheet ID and Calendar ID in the workflow nodes.
5. Activate the workflow to start listening for Telegram messages.
6. Test with: "Schedule a meeting with [contact name] tomorrow at 2 PM".
7. Customize the AI Agent system prompt to match your scheduling preferences and timezone.

**Requirements**

- Telegram Bot Token (free from BotFather)
- OpenAI API account with GPT-4o-mini access
- Google Sheets OAuth2 credentials for CRM database access
- Google Calendar OAuth2 credentials for availability checking and event creation
- Gmail OAuth2 credentials for sending confirmation emails

**Customising this workflow**

- Add new tools (APIs, databases, services) and the AI automatically learns to use them - no code changes needed.
- Replace Telegram with Slack, WhatsApp, or SMS for different communication channels.
- Extend capabilities beyond scheduling: invoice generation, customer support, lead qualification, report generation.
- Connect external systems like HubSpot, Salesforce, QuickBooks, Asana, or custom APIs.
- Add conversation memory by integrating a vector database for context-aware multi-turn conversations.
- Implement approval workflows where the AI drafts actions for human review before execution.
- Build multiple specialized assistants with different tools and expertise for various business functions.
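The business-hours availability check described under "Good to know" could be sketched as follows. This is a hypothetical helper, not code from the workflow; it assumes the day's busy intervals arrive as `[startHour, endHour)` pairs (e.g., derived from Google Calendar events) and proposes up to three free one-hour slots between 9 AM and 4 PM.

```javascript
// Propose up to `maxSlots` free one-hour slots between 9:00 and 16:00,
// given the day's busy intervals as [startHour, endHour) pairs.
function proposeSlots(busy, maxSlots = 3) {
  const slots = [];
  for (let h = 9; h < 16 && slots.length < maxSlots; h++) {
    // A candidate hour [h, h+1) clashes if it overlaps any busy interval.
    const clashes = busy.some(([start, end]) => h < end && h + 1 > start);
    if (!clashes) slots.push(`${String(h).padStart(2, "0")}:00`);
  }
  return slots;
}
```

With meetings at 9-10 and 13-15, for example, the helper offers 10:00, 11:00, and 12:00 - the first three free hours inside the window.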
by vinci-king-01
**Patent Filing Tracker with Pushover and Notion**

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scans multiple patent databases on a weekly schedule, filters new filings relevant to selected technology domains, saves the findings to Notion, and pushes instant alerts to your mobile device via Pushover. It is ideal for R&D teams and patent attorneys who need up-to-date insights on emerging technology trends and competitor activity.

**Pre-conditions/Requirements**

Prerequisites:

- An n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Active Notion account with an integration created
- Pushover account (user key & application token)
- List of technology keywords / CPC codes to monitor

Required credentials:

- **ScrapeGraphAI API Key** - Enables web scraping of patent portals
- **Notion Credential** - Internal Integration Token with database write access
- **Pushover Credential** - App Token + User Key for push notifications

Additional setup requirements:

| Service | Needed Item | Where to obtain |
|---------|-------------|-----------------|
| USPTO, EPO, WIPO, etc. | Public URLs for search endpoints | Free/public |
| Notion | Database with properties: Title, Abstract, URL, Date | Create in Notion |
| Keyword List | Text file or environment variable PATENT_KEYWORDS | Define yourself |

**How it works**

Key steps:

1. **Schedule Trigger** - Fires every week (default Monday 08:00 UTC).
2. **Code (Prepare Queries)** - Builds search URLs for each keyword and data source.
3. **SplitInBatches** - Processes one query at a time to respect rate limits.
4. **ScrapeGraphAI** - Scrapes patent titles, abstracts, links, and publication dates.
5. **Code (Normalize & Deduplicate)** - Cleans data, converts dates, and removes already-logged patents.
6. **IF Node** - Checks whether new patents were found.
7. **Notion Node** - Inserts new patent entries into the specified database.
8. **Pushover Node** - Sends a concise alert summarizing the new filings.
9. **Sticky Notes** - Document configuration tips inside the workflow.

**Set up steps** (setup time: 10-15 minutes)

1. **Install ScrapeGraphAI:** In n8n, go to "Settings → Community Nodes" and install @n8n-nodes/scrapegraphai.
2. **Add credentials:** ScrapeGraphAI (paste your API key), Notion (add the internal integration token and select your database), and Pushover (provide your App Token and User Key).
3. **Configure keywords:** Open the first Code node and edit the keywords array (e.g., ["quantum computing", "Li-ion battery", "5G antenna"]).
4. **Point to data sources:** In the same Code node, adjust the sources array if you want to add or remove patent portals.
5. **Set the Notion database mapping:** In the Notion node, map properties (Name, Abstract, Link, Date) to incoming JSON fields.
6. **Adjust the schedule (optional):** Double-click the Schedule Trigger and change the CRON expression to your preferred interval.
7. **Test run:** Execute the workflow manually. Confirm that the Notion page is populated and a Pushover notification arrives.
8. **Activate:** Switch the workflow to "Active" to enable automatic weekly execution.

**Node Descriptions**

Core workflow nodes:

- **Schedule Trigger** - Defines the weekly execution time.
- **Code (Build Search URLs)** - Dynamically constructs patent search URLs.
- **SplitInBatches** - Sequentially feeds each query to the scraper.
- **ScrapeGraphAI** - Extracts patent metadata from HTML pages.
- **Code (Normalize Data)** - Formats dates, adds UUIDs, and checks for duplicates.
- **IF** - Determines whether new patents exist before proceeding.
- **Notion** - Writes new patent records to your Notion database.
- **Pushover** - Sends real-time mobile/desktop notifications.

Data flow: Schedule Trigger → Code (Build Search URLs) → SplitInBatches → ScrapeGraphAI → Code (Normalize Data) → IF → Notion & Pushover

**Customization Examples**

Change the notification message (note that the template literals need backticks):

```javascript
// Inside the Pushover node "Message" field
return {
  message: `📜 ${items[0].json.count} new patent(s) detected on ${new Date().toDateString()}`,
  title: '🆕 Patent Alert',
  url: items[0].json.firstPatentUrl,
  url_title: 'Open first patent'
};
```

Add a Slack notification instead of Pushover:

```javascript
// Replace the Pushover node with a Slack node
{
  text: `${$json.count} new patents published:\n${$json.list.join('\n')}`,
  channel: '#patent-updates'
}
```

**Data Output Format**

The workflow outputs structured JSON data:

```json
{
  "title": "Quantum Computing Device",
  "abstract": "A novel qubit architecture that ...",
  "url": "https://patents.example.com/US20240012345A1",
  "publicationDate": "2024-06-01",
  "source": "USPTO",
  "keywordsMatched": ["quantum computing"]
}
```

**Troubleshooting**

Common issues:

- **No data returned** - Verify that the search URLs are still valid and the ScrapeGraphAI selector matches the current page structure.
- **Duplicate entries in Notion** - Ensure the "Normalize Data" code correctly checks for existing URLs or IDs before inserting.

Performance tips:

- Limit the number of keywords, or schedule the workflow during off-peak hours, to reduce API throttling.
- Enable caching inside ScrapeGraphAI (if available) to minimize repeated requests.

Pro tips:

- Use environment variables (e.g., {{ $env.PATENT_KEYWORDS }}) to manage keyword lists without editing nodes.
- Chain an additional "HTTP Request → ML Model" step to auto-classify patents by CPC codes.
- Create a Notion view filtered to publicationDate within the past 30 days for quick scanning.
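The "Build Search URLs" and "Normalize & Deduplicate" Code nodes can be sketched as two plain functions. The URL bases below are placeholders (real USPTO/EPO endpoints differ), and using the patent URL as the dedup key is an assumption based on the troubleshooting note above.

```javascript
// Build one search URL per (keyword, source) pair, as the
// "Build Search URLs" Code node does. Placeholder endpoints only.
function buildQueries(keywords, sources) {
  return sources.flatMap((src) =>
    keywords.map((kw) => ({
      source: src.name,
      keyword: kw,
      url: src.base + encodeURIComponent(kw),
    }))
  );
}

// Drop patents already logged (and in-batch repeats), keyed on URL,
// as in the "Normalize & Deduplicate" step.
function dedupe(patents, loggedUrls) {
  const seen = new Set(loggedUrls);
  return patents.filter((p) => !seen.has(p.url) && seen.add(p.url));
}
```

For example, two keywords against one source yield two query objects, and any patent whose URL already appears in the Notion log (or earlier in the same batch) is filtered out before insertion.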
by Juan Carlos Cavero Gracia
This automation workflow is designed for e-commerce businesses, digital marketers, and entrepreneurs who need to create high-quality promotional content for their products quickly and efficiently. From a single product image and description, the system automatically generates 4 promotional carousel-style images, perfect for social media, advertising campaigns, or web catalogs.

Note: This workflow uses the Gemini 2.5 Flash API for image generation, imgbb for image storage, and upload-post.com for automatic Instagram, TikTok, Facebook, and YouTube publishing.

**Who Is This For?**

- **E-commerce Owners:** Transform basic product photos into professional promotional content featuring real people using products in authentic situations.
- **Digital Marketers & Agencies:** Generate multiple advertising content variations for Facebook Ads, Instagram Stories, and digital marketing campaigns.
- **Small Businesses & Entrepreneurs:** Create professional promotional material without expensive photo shoots or graphic designers.
- **Social Media Managers:** Produce engaging and authentic content that drives engagement and conversions across all social platforms.

**What Problem Does This Workflow Solve?**

Creating quality promotional content requires time, resources, and design skills. This workflow addresses those challenges:

- **Automatic Carousel Generation:** Converts a single product photo into 4 promotional images featuring people using the product naturally.
- **Authentic & Engaging Content:** Generates images showing real product usage, increasing credibility and conversions.
- **Integrated Promotional Text:** Automatically includes visible offers, benefits, and calls-to-action in the images.
- **Social Media Optimization:** Produces vertical 9:16 format images, perfect for Instagram, TikTok, and Facebook Stories.
- **Automatic Publishing:** Optionally publishes the complete carousel directly to Instagram with AI-generated optimized descriptions.
**How It Works**

1. **Product Upload:** Upload a product image and provide a detailed description through the web form.
2. **Smart Analysis:** The AI agent analyzes the product and creates a storyboard of 4 different promotional images.
3. **Image Generation:** Gemini 2.5 Flash generates 4 variations showing people using the product in authentic contexts.
4. **Automatic Processing:** Images are automatically processed, optimized, and stored in imgbb.
5. **Promotional Description:** GPT-4 generates an attractive, social-media-optimized description based on the created images.
6. **Optional Publishing:** The system can automatically publish the complete carousel to Instagram.

**Setup**

1. **fal.ai credentials:** Sign up at fal.ai and add your API token to the Gemini 2.5 Flash nodes.
2. **imgbb API:** Create an account at imgbb.com, get your API key, and configure it in the "Set APIs Vars" node.
3. **Upload-Post (optional):** For automatic Instagram publishing, register your account at upload-post.com, connect your Instagram business account, and configure the credentials in the "Upload Post" node.
4. **OpenAI API:** Configure your OpenAI API key for promotional description generation.

**Requirements**

- **Accounts:** n8n, fal.ai, imgbb.com, OpenAI, upload-post.com (optional), Instagram business (optional).
- **API Keys:** fal.ai token, imgbb API key, OpenAI API key, upload-post.com credentials.
- **Image Format:** Any standard image format (JPG, PNG, WebP) of the product to promote.
**Features**

- **Advanced Generative AI:** Uses Gemini 2.5 Flash to create realistic images of people using products.
- **Smart Storyboard:** Automatically creates 4 different concepts to maximize engagement.
- **Integrated Promotional Text:** Includes offers, benefits, and CTAs directly in the images.
- **Optimized Format:** Generates vertical 9:16 images perfect for social media.
- **Parallel Processing:** Generates all 4 images simultaneously for maximum efficiency.
- **Automatic Publishing:** Option to publish directly to Instagram with optimized descriptions.

Use this template to transform basic product photos into complete promotional campaigns, saving time and resources while generating high-quality content that converts visitors into customers.
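Storing a generated image in imgbb comes down to a single HTTP POST. The helper below builds the request an HTTP Request node would send; the endpoint and the `key`/`image`/`name` parameters follow imgbb's public v1 upload API, but treat the details as assumptions and verify them against imgbb's documentation before relying on this sketch.

```javascript
// Build the request for imgbb's v1 upload endpoint: the API key goes in
// the query string, and the base64-encoded image travels as form data.
// (Parameter names assumed from imgbb's public upload API.)
function buildImgbbRequest(apiKey, base64Image, name) {
  const params = new URLSearchParams({ image: base64Image, name });
  return {
    method: "POST",
    url: `https://api.imgbb.com/1/upload?key=${encodeURIComponent(apiKey)}`,
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: params.toString(),
  };
}
```

In n8n you would feed these fields into an HTTP Request node (or `this.helpers.httpRequest` in a Code node) and read the hosted image URL back from the JSON response for the carousel publishing step.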
by WeblineIndia
**AI-Powered Smart Deal Close Prediction and Salesforce CRM Auto-Update Workflow**

This workflow acts as an automated, intelligent sales operations assistant. It continuously monitors your Salesforce account for newly updated opportunities, compares them against your historical win data, and uses a powerful AI (Groq Llama-3) to predict realistic close dates and win probabilities. If the AI is highly confident in its prediction, it automatically updates the deal in Salesforce. If the AI is uncertain, it emails a manager to review the deal manually. Everything is neatly logged in a Google Spreadsheet for easy tracking.

**Quick Implementation Steps**

1. **Connect credentials:** Authenticate your Salesforce, Groq, Gmail, and Google Sheets accounts within n8n.
2. **Prepare the audit sheet:** Create a new Google Sheet and copy its Document ID into the two Google Sheets nodes.
3. **Set the schedule:** Adjust the Schedule Trigger to run at your preferred interval (the default is optimized for frequent checks).
4. **Activate:** Turn on the workflow and watch your pipeline automatically clean itself.

**What It Does**

First, the workflow wakes up on a set schedule and looks for two things in Salesforce: a small batch of your recently won deals (to understand what success looks like) and any open opportunities that were modified recently. It filters these to ensure it only spends time on active deals that actually have a dollar amount attached.

Next, it acts like a data scientist. It grabs the recent task history for each deal and calculates custom metrics, such as how fast the deal is moving, how long it has been open, and a "Risk Score" based on user engagement. All this data is packaged up and securely sent to a Groq LLM agent. The AI acts as a seasoned sales strategist, weighing these factors to predict a realistic timeline and the actual chance of winning the deal.

Finally, the workflow makes a smart decision based on the AI's confidence score.
If the AI is 70% or more confident in its assessment, it goes straight into Salesforce and updates the target close date to keep your pipeline accurate. If the confidence is lower, it sends a formatted email via Gmail to alert a sales manager that a deal needs human attention. Regardless of the path taken, every prediction and action is logged into a Google Sheet for your RevOps team to review.

**Who It's For**

- **Sales Managers & Directors** who want an unbiased, data-driven view of when deals will actually close, rather than relying on gut feelings.
- **Revenue Operations (RevOps)** teams who need accurate pipeline data and want to automate the tedious process of "pipeline scrubbing."
- **CRM Administrators** who want to reduce the administrative burden on sales reps by automatically updating stagnant close dates.

**Requirements**

To use this workflow, you will need an n8n account with the following active accounts:

- **Salesforce:** API access enabled to read opportunities and tasks and to update opportunities.
- **Groq:** An API key to access the Llama-3.3-70b AI model.
- **Gmail:** To send the low-confidence alerts.
- **Google Workspace / Sheets:** To maintain the automated audit logs.

**How It Works & Set Up**

1. **App authentication:** Before doing anything, ensure you have added your credentials for Salesforce (OAuth2), Groq (API Key), Gmail (OAuth2), and Google Sheets (OAuth2) in your n8n environment.
2. **Configure the Google Sheet:** You need a destination for the audit logs. Create a new Google Sheet with these exact column headers in the first row: timestamp, opportunity_id, opportunity_name, stage_name, current_amount, risk_score, risk_label, predicted_close_date, predicted_win_probability, confidence_score, reasoning, next_best_action, action_taken, status. Then open both Google Sheets nodes ("Log Auto-Update Success" and "Log Pending Review") and replace the Document ID with the ID of your newly created sheet.
3. **Timing and lookback setup:** The workflow uses a "Set Lookback Timeframe" node to only grab deals modified in the last 5 minutes. If you change your Run Schedule to run every hour, you must also update the code in the "Set Lookback Timeframe" node to look back 60 minutes instead of 5, so you don't miss any deals.
4. **Review the AI prompt:** Open the "AI Deal Timeline Predictor" LangChain node and review the System Message. If your company has specific sales stages or unique risk factors, type them directly into the prompt to make the AI's predictions even smarter for your business.

**How To Customize Nodes**

- **Adjusting the confidence threshold:** Open the "check confidence score" IF node. It is currently set to 70. If you want the AI to be more aggressive with automatic updates, lower this number; if you want more manual reviews, raise it to 80 or 90.
- **Modifying risk calculations:** The "Calculate Deal Risk & Velocity" Code node contains JavaScript that assigns risk based on how long a deal has been open and how many tasks are associated with it. Tweak the numbers in this code to better fit your typical sales cycle length.
- **Changing the alert system:** If you don't use Gmail, you can easily delete the Gmail node and replace it with a Slack or Microsoft Teams node to send the review alerts directly to a sales channel.

**Add-ons**

You can easily extend this workflow to do even more:

- **Push AI advice to CRM:** Add another Salesforce update node to push the AI's next_best_action directly into a custom field on the Opportunity, giving the sales rep instant coaching.
- **Urgent SMS alerts:** Connect a Twilio node alongside the Gmail node to text the VP of Sales if a massive deal (e.g., over $100k) receives a high risk score.
- **Bi-weekly summary:** Create a separate simple workflow that reads the Google Sheet every Friday and emails a summary of all AI predictions to the executive team.
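The two Code nodes called out above ("Set Lookback Timeframe" and "Calculate Deal Risk & Velocity") can be sketched as follows. The lookback helper restates the 5-minute logic described in the setup; the risk scoring is an illustrative stand-in with made-up weights, since the template's exact formula isn't shown here - tune it as the customization note suggests.

```javascript
// "Set Lookback Timeframe" sketch: the ISO timestamp N minutes ago,
// used to filter Salesforce opportunities on LastModifiedDate.
// Keep `minutes` in sync with the Schedule Trigger interval.
function lookbackIso(minutes, now = new Date()) {
  return new Date(now.getTime() - minutes * 60 * 1000).toISOString();
}

// Illustrative risk score (0-100): older deals with little task
// activity score higher. Weights here are assumptions, not the
// template's actual "Calculate Deal Risk & Velocity" logic.
function riskScore(daysOpen, taskCount) {
  let score = Math.min(50, daysOpen); // age component, capped at 50
  score += taskCount < 3 ? 30 : 0;    // low-engagement penalty
  return Math.min(100, score);
}
```

For example, running every 5 minutes, the query filter becomes `LastModifiedDate >= lookbackIso(5)`; a 90-day-old deal with a single logged task would score 80, flagging it for the AI as high risk.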
Use Case Examples
Automated Pipeline Scrubbing: automatically push out the close dates of neglected deals to the next quarter, ensuring the current quarter's forecast remains mathematically realistic without nagging sales reps.
Early Warning System for Stalled Deals: instantly alert managers when a high-value opportunity shows a sudden drop in engagement or task activity, allowing leadership to step in before the deal is lost.
Data-Driven Sales Coaching: use the AI's generated reasoning and recommended next steps to help junior account executives figure out how to unblock a complex negotiation.
Historical Win-Rate Benchmarking: compare the current active pipeline against what actually won in the past, giving RevOps a clear picture of whether current pipeline quality is better or worse than the previous quarter.
Enforcing CRM Hygiene: identify and flag opportunities that show a 90% probability but haven't had a single phone call or email logged in three weeks.

Troubleshooting Guide
| Issue | Possible Cause | Solution |
| :--- | :--- | :--- |
| Workflow isn't processing any deals | Schedule and lookback timeframes don't match, or no deals were modified recently. | Ensure the minutes in the Schedule node match the subtraction in the "Set Lookback Timeframe" node. |
| "Invalid JSON returned from AI" error | The LLM ignored instructions and added extra conversational text (like "Here is your data:"). | The workflow already has a "Parse AI Output" cleanup node. If it still fails, adjust the Groq prompt to strictly enforce JSON-only responses. |
| Google Sheets node fails to write data | The Google Sheet ID is missing or the column headers in your sheet do not perfectly match the node. | Verify the Document ID and ensure the headers in your sheet exactly match the 14 fields listed in the setup instructions above. |
| Salesforce API limit errors | Fetching too much data too frequently.
| Increase the interval on your Schedule trigger (e.g., run every 30 minutes instead of 5) to reduce API calls. |
| AI close dates are completely wrong | The AI lacks context about your specific average sales cycle length. | Edit the AI's System Message prompt to tell it your average sales cycle (e.g., "Our standard enterprise deal takes 90 days to close"). |

Need Help?
Building dynamic, AI-driven automation workflows can transform your business, but getting the data logic perfectly tuned sometimes requires an expert touch. If you need help setting up this workflow, customizing the JavaScript risk scoring, integrating it with a different CRM, or building more advanced automation solutions tailored to your unique operations, we are here for you. Reach out to our n8n workflow developers at WeblineIndia to get expert assistance and start maximizing the value of your business process automations today!
by Zain Ali
🧠 RAG with Full Gmail History + Real-Time Email Updates in RAG using OpenAI & Qdrant

> Summary: This workflow listens for new Gmail messages, extracts and cleans email content, generates embeddings via OpenAI, stores them in a Qdrant vector database, and then enables a Retrieval-Augmented-Generation (RAG) agent to answer user queries against those stored emails. It's designed for teams or bots that need conversational access to past emails.

🧑🤝🧑 Who's it for
**Support teams** who want to surface past customer emails in chatbots or help-desk portals.
**Sales ops** that need AI-powered summaries and quick lookup of email histories.
**Developers** building RAG agents over email archives.

⚙️ How it works / What it does
1. Trigger: the Gmail Trigger polls every minute for new messages.
2. Fetch & Clean: Get Mail Data pulls full message metadata and body; a Code node normalizes the body (removes line breaks, collapses spaces).
3. Embed & Store: the Embeddings OpenAI node computes vector embeddings, and the Qdrant Vector Store inserts embeddings plus metadata into the emails_history collection.
4. Batch Processing: SplitInBatches handles large inbox loads in chunks of 50.
5. RAG Interaction: when a chat message is received, the RAG Agent uses the Qdrant Email Vector Store as a tool to retrieve relevant email snippets before responding.
6. Memory: a Simple Memory buffer ensures the agent retains recent context.

🛠️ How to set up
1. n8n Instance: deploy n8n (self-hosted or via Coolify/Docker).
2. Credentials: create an OAuth2 credential in n8n for Gmail (with Gmail API scopes), and add your OpenAI API key in n8n credentials.
3. Qdrant: stand up a Qdrant instance (self-hosted or Qdrant Cloud) and note your host, port, and API key (if any).
4. Import Workflow: in n8n, go to Workflows → Import and paste the workflow JSON. Ensure each credential reference (Gmail & OpenAI) matches your n8n credential IDs.
5. Test: click Execute Workflow or send a test email to your Gmail, then monitor the n8n logs: you should see new points in Qdrant and RAG responses.
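The body-cleaning step above is simple enough to sketch. The following is an assumed implementation of what that Code node does (remove line breaks, collapse spaces), not the template's exact code:

```javascript
// Assumed shape of the Code node that normalizes an email body before
// embedding: strip line breaks, collapse runs of whitespace, trim edges.
function cleanEmailBody(body) {
  return body
    .replace(/[\r\n]+/g, ' ') // remove line breaks
    .replace(/\s{2,}/g, ' ')  // collapse repeated spaces
    .trim();
}
```

Normalizing whitespace this way keeps the embedding input compact and avoids wasting tokens on layout artifacts from HTML or quoted-reply formatting.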
📋 Requirements
**n8n** (self-hosted or Cloud)
**Gmail API** enabled on a Google Cloud project
**OpenAI API** access (with Embedding & Chat endpoints)
**Qdrant** (self-hosted or cloud) with a collection named emails_history

🎨 How to customize the workflow
**Change Collection Name:** update the qdrantCollection.value in all Qdrant nodes if you prefer a different collection.
**Adjust Polling Frequency:** in the Gmail Trigger node, switch from everyMinute to everyFiveMinutes or a webhook-style trigger.
**Metadata Tags:** in Enhanced Default Data Loader, tweak the metadataValues to tag by folder, label, or sender domain.
**Batch Size:** in SplitInBatches, change batchSize to suit your inbox volume.
**RAG Agent Prompt:** customize the systemMessage in the RAG Agent node to set the assistant's tone, instruct on date handling, or add additional tools.
**Additional Tools:** chain other n8n nodes (e.g., Slack, Discord) after the RAG Agent to broadcast AI answers to team channels.
by Avkash Kakdiya
How it works
This workflow automatically generates personalized follow-up messages for leads or customers after key interactions (e.g., demos, sales calls). It enriches contact details from HubSpot (or optionally Monday.com), uses AI to draft a professional follow-up email, and distributes it across multiple communication channels (Slack, Telegram, Teams) as reminders for the sales team.

Step-by-step
1. Trigger & Input
Schedule Trigger – runs automatically at a defined interval (e.g., daily).
Set Sample Data – captures the contact's name, email, and context from the last interaction (e.g., "had a product demo yesterday and showed strong interest").
2. Contact Enrichment
HubSpot Contact Lookup – searches HubSpot CRM by email to confirm or enrich contact details.
Monday.com Contact Fetch (optional) – can pull additional CRM details if enabled.
3. AI Message Generation
AI Language Model (OpenAI) – provides the underlying engine for message creation.
Generate Follow-Up Message – drafts a short, professional, and friendly follow-up email that references the previous interaction context, suggests clear next steps (call, resources, etc.), and ends with a standardized signature block for consistency.
4. Multi-Channel Communication
Slack Reminder – posts the generated message as a reminder in the sales team's Slack channel.
Telegram Reminder – sends the follow-up draft to a Telegram chat.
Teams Reminder – shares the same message in a Microsoft Teams channel.

Benefits
Personalized Outreach at Scale – AI ensures each follow-up feels tailored and professional.
Context-Aware Messaging – pulls in CRM details and past interactions for relevance.
Cross-Platform Delivery – distributes reminders via Slack, Teams, and Telegram so no follow-up is missed.
Time-Saving for Sales Teams – eliminates manual drafting of repetitive follow-up emails.
Consistent Branding – ensures every message includes a unified signature block.
by explorium
Inbound Agent - AI-Powered Lead Qualification with Product Usage Intelligence
This n8n workflow automatically qualifies and scores inbound leads by combining their product usage patterns with deep company intelligence. The workflow pulls new leads from your CRM, analyzes which API endpoints they've been testing, enriches them with firmographic data, and generates comprehensive qualification reports with personalized talking points, giving your sales team everything they need to prioritize and convert high-quality leads.

DEMO Template Demo

Credentials Required
To use this workflow, set up the following credentials in your n8n environment:

Salesforce
**Type:** OAuth2 or Username/Password
**Used for:** pulling lead reports and creating follow-up tasks
Alternative CRM options: HubSpot, Zoho, Pipedrive
Get credentials at Salesforce Setup

Databricks (or Analytics Platform)
**Type:** HTTP Request with Bearer Token
**Header:** Authorization
**Value:** Bearer YOUR_DATABRICKS_TOKEN
**Used for:** querying product usage and API endpoint data
Alternative options: Datadog, Mixpanel, Amplitude, custom data warehouse

Explorium API
**Type:** Generic Header Auth
**Header:** Authorization
**Value:** Bearer YOUR_API_KEY
**Used for:** business matching and firmographic enrichment
Get your API key at Explorium Dashboard

Explorium MCP
**Type:** HTTP Header Auth
**Used for:** real-time company intelligence and supplemental research
Connect to: https://mcp.explorium.ai/mcp

Anthropic API
**Type:** API Key
**Used for:** AI-powered lead qualification and analysis
Get your API key at Anthropic Console

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

Workflow Overview
Node 1: When clicking 'Execute workflow'
Manual trigger that initiates the lead qualification process.
**Type:** Manual Trigger
**Purpose:** on-demand execution for testing or manual runs
Alternative trigger options:
**Schedule Trigger:** run automatically (hourly, daily, weekly)
**Webhook:** trigger on CRM updates or new lead events
**CRM Trigger:** real-time activation when leads are created

Node 2: GET SF Report
Pulls lead data from a pre-configured Salesforce report.
**Method:** GET
**Endpoint:** Salesforce Analytics Reports API
**Authentication:** Salesforce OAuth2
Returns: raw Salesforce report data including lead contact information, company names, lead source and status, created dates, and custom fields.
CRM alternatives: this node can be replaced with HubSpot, Zoho, or any CRM's reporting API.

Node 3: Extract Records
Parses the Salesforce report structure and extracts individual lead records.
Extraction logic: navigates the report's factMap['T!T'].rows structure and maps data cells to named fields.

Node 4: Extract Tenant Names
Prepares tenant identifiers for usage data queries.
Purpose: formats tenant names as SQL-compatible strings for the Databricks query.
Output: comma-separated, quoted list: 'tenant1', 'tenant2', 'tenant3'

Node 5: Query Databricks
Queries your analytics platform to retrieve API usage data for each lead.
**Method:** POST
**Endpoint:** /api/2.0/sql/statements
**Authentication:** Bearer token in headers
**Warehouse ID:** your Databricks cluster ID
Platform alternatives:
**Datadog:** query logs via the Logs API
**Mixpanel:** Event Segmentation API
**Amplitude:** Behavioral Cohorts API
**Custom warehouse:** PostgreSQL, Snowflake, BigQuery queries

Node 6: Split Out
Splits the Databricks result array into individual items for processing.
**Field:** result.data_array
**Purpose:** transform a single response with multiple rows into separate items

Node 7: Rename Keys
Normalizes column names from the database query to readable field names.
Mapping: 0 → TenantNames, 1 → endpoints, 2 → endpointsNum

Node 8: Extract Business Names
Prepares company names for Explorium enrichment.
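Node 4's tenant-list formatting can be illustrated with a short helper. This is an assumed sketch; escaping embedded quotes is an extra precaution not described above:

```javascript
// Sketch of "Extract Tenant Names": turn an array of tenant identifiers
// into the quoted, comma-separated list the Databricks SQL query expects,
// e.g. for a WHERE tenant_name IN (...) clause.
function toSqlList(tenants) {
  return tenants
    .map(t => `'${String(t).replace(/'/g, "''")}'`) // quote and escape
    .join(', ');
}
```

For example, `toSqlList(['tenant1', 'tenant2', 'tenant3'])` yields `'tenant1', 'tenant2', 'tenant3'`, ready to interpolate into the query's `IN (...)` clause.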
Node 9: Loop Over Items
Iterates through each company for individual enrichment.

Node 10: Explorium API: Match Businesses
Matches company names to Explorium's business entity database.
**Method:** POST
**Endpoint:** /v1/businesses/match
**Authentication:** Header Auth (Bearer token)
Returns: business_id (unique Explorium identifier), matched_businesses (array of potential matches), and match confidence scores.

Node 11: Explorium API: Firmographics
Enriches matched businesses with comprehensive company data.
**Method:** POST
**Endpoint:** /v1/businesses/firmographics/bulk_enrich
**Authentication:** Header Auth (Bearer token)
Returns: company name, website, and description; industry categories (NAICS, SIC, LinkedIn); size (employee count range, revenue range); location (headquarters address, city, region, country); company age and founding information; social profiles (LinkedIn, Twitter); logo and branding assets.

Node 12: Merge
Combines API usage data with firmographic enrichment data.

Node 13: Organize Data as Items
Structures merged data into clean, standardized lead objects.
Data organization: maps API usage by tenant name, maps enrichment data by company name, combines both with the original lead information, and creates a complete lead profile for analysis.

Node 14: Loop Over Items1
Iterates through each qualified lead for AI analysis.
**Batch size:** 1 (analyzes leads individually)
**Purpose:** generate personalized qualification reports

Node 15: Get many accounts1
Fetches the associated Salesforce account for context.
**Resource:** Account
**Operation:** Get All
**Filter:** match by company name
**Limit:** 1 record
Purpose: link lead qualification back to the Salesforce account for task creation.

Node 16: AI Agent
Analyzes each lead to generate comprehensive qualification reports.
Input data: lead contact information, API usage patterns (which endpoints were tested), firmographic data (company profile), and lead source and status.

Analysis process:
Evaluates lead quality based on usage, company fit, and signals.
Identifies which Explorium APIs the lead explored.
Assesses company size, industry, and potential value.
Detects quality signals (legitimate company email, active usage) and red flags.
Determines the optimal sales approach and timing.
Connected to Explorium MCP for supplemental company research if needed.

Output: a structured qualification report with:
**Lead Score:** High Priority, Medium Priority, Low Priority, or Nurture
**Quick Summary:** executive overview of lead potential
**API Usage Analysis:** endpoints used, usage insights, potential use case
**Company Profile:** overview, fit assessment, potential value
**Quality Signals:** positive indicators and concerns
**Recommended Actions:** next steps, timing, and approach
**Talking Points:** personalized conversation starters based on actual API usage

Node 18: Clean Outputs
Formats the AI qualification report for Salesforce task creation.

Node 19: Update Salesforce Records
Creates follow-up tasks in Salesforce with qualification intelligence.
**Resource:** Task
**Operation:** Create
**Authentication:** Salesforce OAuth2
Alternative output options:
**HubSpot:** create tasks or update deal stages
**Outreach/SalesLoft:** add to sequences with custom messaging
**Slack:** send qualification reports to sales channels
**Email:** send reports to account owners
**Google Sheets:** log qualified leads for tracking

Workflow Flow Summary
Trigger: manual execution or scheduled run
Pull Leads: fetch new/updated leads from the Salesforce report
Extract: parse lead records and tenant identifiers
Query Usage: retrieve API endpoint usage data from the analytics platform
Prepare: format data for enrichment
Match: identify companies in the Explorium database
Enrich: pull comprehensive firmographic data
Merge: combine usage patterns with company intelligence
Organize: structure complete lead profiles
Analyze: AI evaluates each lead with quality scoring
Format: structure qualification reports for the CRM
Create Tasks: automatically populate Salesforce with actionable intelligence

This workflow eliminates manual lead research and qualification, automatically analyzing product engagement patterns alongside company fit to help sales teams prioritize and personalize their outreach to the highest-value inbound leads.
Customization Options

Flexible Triggers
Replace the manual trigger with:
**Schedule:** run hourly/daily to continuously qualify new leads
**Webhook:** real-time qualification when leads are created
**CRM Trigger:** activate on specific lead status changes

Analytics Platform Integration
The Databricks query can be adapted for:
**Datadog:** query application logs and events
**Mixpanel:** analyze user behavior and feature adoption
**Amplitude:** track product engagement metrics
**Custom databases:** PostgreSQL, MySQL, Snowflake, BigQuery

CRM Flexibility
Works with multiple CRMs:
**Salesforce:** full integration (pull reports, create tasks)
**HubSpot:** contact properties and deal updates
**Zoho:** lead enrichment and task creation
**Pipedrive:** deal qualification and activity creation

Enrichment Depth
Add more Explorium endpoints:
**Technographics:** tech stack and product usage
**News & Events:** recent company announcements
**Funding Data:** investment rounds and financial events
**Hiring Signals:** job postings and growth indicators

Output Destinations
Route qualification reports to:
**CRM updates:** Salesforce, HubSpot (update lead scores/fields)
**Task creation:** any CRM task/activity system
**Team notifications:** Slack, Microsoft Teams, email
**Sales tools:** Outreach or SalesLoft sequences
**Reporting:** Google Sheets, Data Studio dashboards

AI Model Options
Swap AI providers:
Default: Anthropic Claude (Sonnet 4)
Alternatives: OpenAI GPT-4, Google Gemini

Setup Notes
**Salesforce report configuration:** create a report with the required fields (name, email, company, tenant ID) and use its API endpoint.
**Tenant identification:** ensure your product usage data includes identifiers that link to CRM leads.
**Usage data query:** customize the SQL query to match your database schema and table structure.
**MCP configuration:** the Explorium MCP requires Header Auth; configure credentials properly.
**Lead scoring logic:** adjust the AI system prompts to match your ideal customer profile and qualification criteria.
**Task assignment:**
Configure Salesforce task assignment rules or add logic to route tasks to specific sales reps.

This workflow acts as an intelligent lead qualification system that combines behavioral signals (what they're testing) with firmographic fit (who they are) to give sales teams actionable intelligence for every inbound lead.
by Intuz
This n8n template from Intuz provides a complete end-to-end content factory to automate the entire lifecycle of creating and publishing AI-driven videos. It transforms a single text prompt into a fully scripted, visually rich video with AI-generated images and voiceovers, then distributes it across multiple social media platforms.

Who's this workflow for?
Content Creators & YouTubers
Social Media Managers & Agencies
Digital Marketers & Entrepreneurs
Brands looking to scale video content production

Objective
Generate viral video scripts with Gemini AI (via LangChain).
Break scripts into structured scenes (hooks, retention, CTA).
Create image prompts and high-quality background visuals automatically.
Store all scenes, prompts, images, and metadata in Airtable.
Handle file formatting, uploads, and naming automatically.
Add error handling and retry logic for smooth execution.
Upload to social media platforms.

How it works
1. AI Script Generation: The workflow starts with a single command. A powerful Google Gemini AI model, acting as a "Content Brain," generates a complete, viral video script with a title, description, and multiple scenes.
2. Content Management in Airtable: The entire script is broken down and saved systematically into an Airtable base, which acts as the central content management system (CMS) for the video production pipeline.
3. AI Image Generation: The workflow loops through each scene in Airtable. It uses an AI agent to enhance the image prompts and sends them to an image generation API (like Pollinations.ai) to create a unique, high-quality image for each scene. These images are then automatically uploaded back into Airtable.
4. Automated Video Rendering: Once all images are ready, the workflow gathers the script text and the corresponding image URLs from Airtable and sends them to Creatomate. Creatomate uses a pre-defined template to assemble the final video, complete with AI-generated voiceovers for each scene.
5.
Multi-Platform Publishing: After a brief wait for the video to render, the workflow retrieves the final video file and automatically publishes it across your connected social media channels, including YouTube and Instagram.

Key Requirements to Use This Template
Before you start, please ensure you have the following accounts and assets ready:
1. n8n Instance & Required Nodes: An active n8n account (Cloud or self-hosted). This workflow relies on the official n8n LangChain integration (@n8n/n8n-nodes-langchain). If you are using a self-hosted version of n8n, please ensure this package is installed on your instance.
2. AI & Video Accounts:
Google Gemini AI Account: A Google Cloud account with the Vertex AI API enabled and an API key.
Creatomate Account: Creatomate is a platform for generating videos. You'll need an account for video rendering, your API key, and a pre-designed video template ID.
3. Content & Social Media Accounts:
Airtable Account: An Airtable base set up to manage the video content. Using the complementary Airtable template is highly recommended.
YouTube Account: A YouTube channel with API access enabled via the Google Cloud Console.
Upload-Post.com Account: An account for posting to other platforms like Instagram.

Workflow Steps
1. ▶️ Trigger (Manual/Auto): Start the workflow manually or via schedule.
2. 🧠 Content Brain (Gemini + LangChain): A role-trained viral-strategist prompt generates 6 scene scripts following Hook → Retention → Value → CTA, and applies viral content rules (engagement triggers, curiosity gaps, shareable moments).
3. 📑 Structured Output Parser: Enforces a JSON schema: video_id, video_title, description, scenes[] → scene_number, text, image_prompt.
4. 📄 Airtable – Store Scenes: Each scene is stored with Video ID, Title, Description; Scene Number & Text; Image Prompt & Generated Image link.
5. 🤖 AI Agent – Image Prompt Creator: Transforms scene text into optimized image prompts using structured rules.
6. 🎨 Pollinations AI – Image Generation: Generates vertical 9:16 visuals with these parameters: Model: flux; Resolution: 1080x1920; Steps: 12; Guidance Scale: 3.5; Safety Checker: enabled; Options: seed=42, nologo=true.
7. 📂 File Handling & Conversion: Assigns filenames automatically (e.g., images_001.png) and converts the binary image to base64 for Airtable storage.
8. 📤 Airtable Upload – Store Images: Attaches generated visuals directly into the Generated Image field.
9. ⚡ Switch & Error Handling: Branches for: ✅ Success → continue; ❌ Failed → stop with an error message; ⏳ Processing → wait/retry.
10. Social Media Upload: To YouTube via the YouTube API (per the official documentation) and to Instagram via the Upload-Post API.

Setup Instructions
1. AI Configuration: In the Google Gemini Chat Model nodes, connect your Google Gemini API account. In the Content Brain node, you can customize the core prompt to change the video's niche, style, or topic.
2. Airtable Setup (Crucial): Connect your Airtable account in the Airtable nodes. Set up your Airtable base using the provided template, or ensure your base has identical table and field names. Update the Base ID and Table IDs in the Airtable nodes. Airtable Schema:
3. Video Rendering Setup (Creatomate): In the Video Rendering - Creatomate node, add your Creatomate API key to the header authorization. In the Template for Creatomate node, replace the template_id with the ID of your own video template from your Creatomate account.
4. Social Media Connections: In the Upload on YouTube node, connect your YouTube account via OAuth2. In the Upload on Instagram node, replace the API key in the header authorization with your key from Upload-Post.com.
5. Execute the Workflow: Click "Execute workflow" to kick off your automated video content factory.

Connect with us
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
For custom workflow automation, click here: Get Started
by Madame AI
Automate social media content aggregation to a Telegram channel

This n8n template automatically aggregates and analyzes key updates from your social media platforms' home pages, delivering them as curated posts to a Telegram channel. This workflow is perfect for digital marketers, brand managers, data analysts, and busy people who want to monitor real-time trends and competitor activity without manual effort.

How it works
The workflow is triggered automatically on a schedule to aggregate the latest social media posts.
A series of If and Wait nodes monitor the data processing job until the full data is ready.
An AI Agent, powered by Google Gemini, refines the content by summarizing posts and removing duplicates.
An If node checks for an image in the post to decide whether a photo or a text message should be sent.
Finally, the curated posts are sent to your Telegram channel as rich media messages.

How to use
Set up the BrowserAct template: in your BrowserAct account, set up the "Twitter/X Content Aggregation" template.
Set up credentials: add your credentials for BrowserAct (in the Run node), Google Gemini (in the Agent node), and Telegram (in the Send node).
Add the workflow ID: change the workflow_id value inside the HTTP Request in the Run node to match the one from your BrowserAct workflow.
Activate the workflow: to enable the automated schedule, simply activate the workflow.

Requirements
**BrowserAct** API account
**BrowserAct "Twitter/X Content Aggregation"** template
**Gemini** account
**Telegram** credentials

Customizing this workflow
This workflow provides a powerful foundation for social media monitoring. You could:
Replace the Telegram node with an email or Slack node to send notifications to a different platform.
Add more detailed prompts to the AI Agent for more specific analysis or summarization.
Customize the BrowserAct workflow to fit your needs.

Need Help?
How to Find Your BrowserAct API Key & Workflow ID
How to Connect n8n to BrowserAct
How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase
Automate Your Social Media: Get All X/Twitter Updates Directly in Telegram!
by isaWOW
Description
Automate LinkedIn organization page posting with precise time scheduling and Google Drive media management. The workflow runs during business hours, processes approved posts scheduled for today, waits until the exact time, publishes to LinkedIn, and updates the tracking sheet—perfect for maintaining a consistent LinkedIn presence.

What this workflow does
This workflow automates LinkedIn organization page posting with precise timing control and a centralized Google Sheets content calendar. It runs four times daily (9:45 AM, 10:45 AM, 11:45 AM, 12:45 PM) and reads your Google Sheet to find posts marked with Approval Status = "Good" and Platform = "LinkedIn" that are scheduled for today. Unlike batch processing workflows, this processes ONE post per run to prevent duplicate scheduling. Once it finds a post, it marks it as "Scheduled" in your sheet, then uses a Wait node to pause the workflow execution until the exact scheduled time (with automatic timezone conversion from Eastern to India time). At the scheduled moment, it either downloads an image from Google Drive and publishes a Creative Post, or publishes an Article link, depending on the Post Type. After successfully publishing to your LinkedIn organization page, it updates your Google Sheet with the live post URL and changes the Approval Status to "Published", creating a complete audit trail with precise timing control.

Perfect for social media managers maintaining a consistent LinkedIn presence, marketing agencies scheduling client LinkedIn content with approval checkpoints, content creators batch-planning professional posts, and teams needing collaborative LinkedIn calendars with exact time control.

Key features
Precise time scheduling with Wait node: unlike immediate publishing, this workflow uses n8n's Wait node to pause execution until the exact scheduled time—ensuring posts publish at 10:00 AM sharp, not 9:45 AM when the workflow runs.
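The "scheduled for today" check at the heart of this workflow could be implemented roughly like this in a Code node; this is an assumed sketch, with the caveat that the real node must account for the server's timezone rather than UTC:

```javascript
// Assumed sketch of the date filter: keep only rows whose "Scheduled On"
// value (e.g. "2025-10-30 10:00") falls on today's date, ignoring the time.
// Note: toISOString() yields the UTC date; a production version would need
// to derive "today" in the timezone the sheet's dates are written in.
function isScheduledToday(scheduledOn, today = new Date()) {
  const datePart = String(scheduledOn).split(' ')[0]; // "2025-10-30"
  const todayStr = today.toISOString().slice(0, 10);  // "YYYY-MM-DD"
  return datePart === todayStr;
}
```

Splitting on the space cleanly separates the date from the 24-hour time, so posts for any time slot today pass through while future-dated rows wait for a later run.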
Timezone conversion included: automatically converts "Scheduled On" times from Eastern Time (the content creator's timezone) to India Time (the server timezone) using DateTime.fromFormat—no manual calculation needed.
One post per run: processes only the first pending post each time the workflow runs, preventing duplicate scheduling if multiple posts share the same time slot—ensures clean execution and tracking.
Dual post type support: handles both Creative Posts (image posts downloaded from Google Drive) and Articles (link posts with an article URL)—automatically routes based on the Post Type column.
LinkedIn organization posting: posts directly to your LinkedIn organization/company page (not a personal profile) using the LinkedIn Community Management API with proper OAuth authentication.
Status progression tracking: a three-stage workflow: Good (approved) → Scheduled (waiting for time) → Published (live on LinkedIn)—always know what's queued vs. what's live.
Google Sheets content calendar: manage LinkedIn posts in a familiar spreadsheet with columns for Scheduled On, Platform, Post Type, Caption, Media URL, and Approval Status—no complex tools needed.
Google Drive media integration: stores images in Google Drive (centralized storage), then automatically downloads them when publishing Creative Posts—supports shared drives and private files.
Post URL tracking: after publishing, updates the Google Sheet with the live LinkedIn post URL (https://www.linkedin.com/feed/update/{urn})—enables easy performance tracking and reporting.
Runs during business hours only: the schedule trigger fires at :45 minutes (9 AM–12 PM) to catch morning posts—it won't run at night or on weekends unless you modify the cron.

How it works
1. Hourly trigger during business hours
A cron trigger runs at 9:45 AM, 10:45 AM, 11:45 AM, and 12:45 PM every day. The :45 timing gives a 15-minute buffer before the hour to schedule posts for :00, :15, :30, or :45 times.
2.
Load LinkedIn credentials
The workflow reads a separate ".env" sheet in your Google Sheets document containing:
**LinkedIn Organization ID:** your company page's unique ID (e.g., 56420402)
Other LinkedIn-specific configuration
This centralizes credentials so multiple workflows can share the same settings.
3. Fetch approved LinkedIn posts
The workflow reads your main "Post URL" sheet and applies two filters:
**Approval Status = "Good":** only processes approved posts
**Platform = "LinkedIn":** filters out Facebook, Instagram, etc.
This returns all approved LinkedIn posts regardless of date.
4. Filter posts scheduled for today
A Code node compares the "Scheduled On" column value against today's date (it ignores the time and checks only the date part). Only posts scheduled for today pass through.
Supported date format: "2025-10-30 10:00" (YYYY-MM-DD HH:MM); time uses the 24-hour format.
5. Process first post only
Critical difference from the Facebook workflow: a Code node extracts only the FIRST item from the filtered posts.
Why only one post?
Prevents duplicate scheduling if the workflow runs multiple times.
The Wait node works on single execution paths.
Ensures precise timing per post.
The next run will pick up the next pending post.
6. Route by post type
A Switch node checks the Post Type column:
**Output 0 (Creative Post):** posts with images (routes to Branch A)
**Output 1 (Article):** posts with article links (routes to Branch B)

Branch A: Creative Post with Image
7a. Mark as scheduled in sheet
Before waiting, the workflow updates the Google Sheet:
**Approval Status:** "Scheduled"
This prevents the same post from being picked up again in the next hourly run.
8a. Wait until scheduled time
Most critical node: the Wait node pauses workflow execution until the exact scheduled time.
Timezone conversion logic:

    DateTime.fromFormat(
      $('Route by Post Type').item.json['Scheduled On'],
      'yyyy-MM-dd HH:mm',              // matches the sheet's "2025-10-30 14:00" format
      { zone: 'America/New_York' }     // input timezone (Eastern)
    )
      .setZone('Asia/Kolkata')         // server timezone (India)
      .toFormat("yyyy-MM-dd'T'HH:mm:ss")

Example:
Sheet value: "2025-10-30 14:00" (Eastern Time)
Converted to: "2025-10-30T23:30:00" (India Time)
The workflow resumes at exactly 11:30 PM India time = 2:00 PM Eastern.
9a. Prepare post data
An Aggregate node (keeps the data structure intact for the next nodes).
10a. Download image from Google Drive
Uses the Media URL (a Google Drive sharing link) to download the image file. Supports:
Direct Google Drive file URLs
Shared Drive files
Public or private files (as long as the OAuth account has access)
The image is downloaded as binary data.
11a. Publish creative post to LinkedIn
Uses n8n's LinkedIn node with these settings:
**Authentication:** communityManagement (LinkedIn Community Management API)
**Post as:** organization
**Organization:** 56420402 (your company page ID)
**Text:** caption from the Google Sheet
**Share media category:** IMAGE
**Binary data:** the downloaded image
The LinkedIn API returns a URN (a unique post identifier).
12a. Save post URL & mark published
Constructs the LinkedIn post URL: https://www.linkedin.com/feed/update/{urn}
Then updates the Google Sheet row:
**Approval Status:** "Published"
**Post URL:** the constructed LinkedIn URL

Branch B: Article Post with Link
7b. Mark article as scheduled
Updates the Google Sheet: Approval Status = "Scheduled"
8b. Wait until article scheduled time
Same Wait node logic as Creative Posts—pauses until the exact scheduled time with timezone conversion.
9b. Prepare article data
An Aggregate node.
10b.
### 10b. Publish article link to LinkedIn

Uses the LinkedIn node with these settings:

- **Authentication:** communityManagement
- **Post as:** organization
- **Organization:** 56420402
- **Text:** caption from the Google Sheet
- **Share media category:** ARTICLE
- **Original URL:** Media URL (the article link)

LinkedIn scrapes the article URL and creates a rich preview card.

### 11b. Save article URL & mark published

Same as Creative Posts—constructs the post URL and updates the sheet with "Published" status.

## Setup requirements

Tools you'll need:

- Active n8n instance (self-hosted or n8n Cloud)
- Google Sheets with OAuth access
- Google Drive with OAuth access
- LinkedIn organization/company page (not a personal profile)
- LinkedIn Community Management API credentials

Estimated setup time: 35–40 minutes

## Configuration steps

### 1. Set up the LinkedIn Community Management API

1. Go to the LinkedIn Developer Console
2. Create an app (or use an existing one)
3. Add the "Community Management API" product
4. Request access for your organization page
5. Under Auth → OAuth 2.0 settings, add the redirect URL: `https://your-n8n-instance.com/rest/oauth2-credential/callback`
6. Note your Client ID and Client Secret
7. In n8n: Credentials → Add credential → LinkedIn Community Management OAuth2 API
8. Complete the OAuth flow and select your organization page

### 2. Find your LinkedIn Organization ID

Method 1 (URL):

1. Go to your LinkedIn company page
2. The URL will be: `https://www.linkedin.com/company/{company-name}/`
3. View the page source and search for "organizationId"
4. Copy the numeric ID (e.g., 56420402)

Method 2 (API):

1. Use the LinkedIn API endpoint: `/v2/organizationalEntityAcls?q=roleAssignee`
2. Find your organization in the response
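For Method 2, the numeric ID can be pulled out of the API response programmatically. A minimal sketch, assuming each element in the response carries an organization URN (the sample payload and helper name are illustrative):

```javascript
// Sketch: extract numeric organization IDs from a
// /v2/organizationalEntityAcls?q=roleAssignee response (Method 2 above).
function organizationIds(response) {
  return (response.elements || [])
    .map(el => /^urn:li:organization:(\d+)$/.exec(el.organization || ''))
    .filter(Boolean)
    .map(match => Number(match[1]));
}

// Illustrative sample response -- your own call returns your pages.
const sample = {
  elements: [{ organization: 'urn:li:organization:56420402', role: 'ADMINISTRATOR' }],
};
const ids = organizationIds(sample);
```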
### 3. Set up Google Sheets

Create two sheets in one Google Sheets document:

Sheet 1: ".env" (credentials)

| LinkedIn Organization ID |
|---|
| 56420402 |

Sheet 2: "Post URL" (content calendar)

| Scheduled On | Platform | Post Type | Caption | Media URL | Approval Status | Post URL | row_number |
|---|---|---|---|---|---|---|---|
| 2025-10-30 10:00 | LinkedIn | Creative Post | Excited to announce our new product! | https://drive.google.com/file/d/xxx | Good | | 1 |
| 2025-10-30 14:00 | LinkedIn | Article | Check out our latest blog post | https://blog.example.com/post | Good | | 2 |

Important column details:

- **Scheduled On:** format YYYY-MM-DD HH:MM (24-hour, Eastern Time)
- **Platform:** must be "LinkedIn" (case-sensitive)
- **Post Type:** "Creative Post" (with image) or "Article" (with link)
- **Caption:** post text (LinkedIn supports up to 3,000 characters)
- **Media URL:** Google Drive URL for a Creative Post, article URL for an Article
- **Approval Status:** "Good" (publish), "Pending" (hold), "Rejected" (skip)
- **Post URL:** leave empty (auto-filled after publishing)
- **row_number:** auto-generated by Google Sheets

### 4. Connect Google Sheets OAuth

1. In n8n: Credentials → Add credential → Google Sheets OAuth2 API
2. Complete OAuth authentication
3. Update these nodes with your sheet URL:
   - "Load LinkedIn Organization Credentials" → .env sheet
   - "Fetch Approved LinkedIn Posts" → Post URL sheet
   - "Mark as Scheduled in Sheet" → Post URL sheet
   - "Save Post URL & Mark Published" → Post URL sheet
   - "Mark Article as Scheduled" → Post URL sheet
   - "Save Article URL & Mark Published" → Post URL sheet

### 5. Connect Google Drive OAuth

1. In n8n: Credentials → Add credential → Google Drive OAuth2 API
2. Complete OAuth authentication
3. Open the "Download Image from Google Drive" node
4. Select your Google Drive credential

### 6. Update the LinkedIn Organization ID

Open these nodes and replace "56420402" with your organization ID:

- "Publish Creative Post to LinkedIn"
- "Publish Article Link to LinkedIn"
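A common failure mode noted in the Troubleshooting section is a shortened or malformed Media URL in the sheet. One quick sanity check for Creative Post rows is whether a Drive file ID can be extracted from the URL. A sketch with an illustrative helper name and a made-up file ID:

```javascript
// Sketch: check a Media URL is a full Google Drive link by extracting its FILE_ID.
// (Helper name and regex are illustrative; the Download node accepts the full URL.)
function driveFileId(mediaUrl) {
  const match = /drive\.google\.com\/file\/d\/([^/?#]+)/.exec(mediaUrl || '');
  return match ? match[1] : null; // null -> not a full /file/d/ URL (e.g., shortened link)
}

const ok = driveFileId('https://drive.google.com/file/d/1AbCdEfGhIjKl/view'); // made-up ID
const bad = driveFileId('https://drive.google.com/open?id=xyz'); // wrong link format
```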
### 7. Adjust timezone (if needed)

If your content calendar uses a different timezone than Eastern:

- Open the "Wait Until Scheduled Time" node
- Change `{ zone: 'America/New_York' }` to your timezone
- Common values: 'America/Los_Angeles' (Pacific), 'UTC', 'Europe/London'

If your n8n server is not on India Time:

- Change `.setZone('Asia/Kolkata')` to your server's timezone

### 8. Test the workflow

1. Add a test post scheduled for 5 minutes from now
2. Set Platform = "LinkedIn", Post Type = "Creative Post", Approval Status = "Good"
3. Manually trigger the workflow (or wait for the next hourly run)
4. Verify:
   - Sheet updated to "Scheduled"
   - Workflow execution shows as "Waiting" in n8n
   - At the scheduled time, the post publishes to LinkedIn
   - Sheet updated to "Published" with the URL

### 9. Activate the workflow

1. Toggle the workflow to Active
2. The workflow will now run automatically 4 times daily
3. Check your LinkedIn page to verify posts are publishing correctly

## Use cases

**Social media managers:** Schedule 15–20 LinkedIn posts per week from one Google Sheet. Team members add content, you approve, and the workflow handles precise timing and publishing—no manual LinkedIn.com logins.

**B2B marketing teams:** Maintain a consistent LinkedIn company page presence with thought leadership articles, product updates, and team highlights. Schedule weeks in advance and let automation publish at optimal times.

**Content creators:** Batch-create LinkedIn content on Mondays and schedule it throughout the week with precise timing. Focus on creation, not distribution—the workflow handles publishing.

**Agencies managing client pages:** One Google Sheet per client, separate workflows per organization ID. A centralized content calendar with an approval workflow before posting to client pages.

**Recruiting teams:** Schedule hiring posts, culture updates, and employee spotlights to maintain an active company presence. Track all post URLs for performance analysis.
**Personal brands using company pages:** If you manage a LinkedIn company page for your personal brand or business, schedule promotional content, case studies, and announcements with professional timing control.

## Customization options

### Process multiple posts per run

Change the "Process First Post Only" node logic:

- Current: returns only item 0
- Modified: return all items (use a Loop node to process them sequentially)
- Note: Wait nodes won't work inside loops—consider using scheduled_publish_time if the LinkedIn API supports it

### Change scheduling frequency

Edit the "Run Every Hour" cron expression:

- Current: `45 9-12 * * *` (hourly at :45, 9:45 AM to 12:45 PM)
- All day: `45 * * * *` (every hour at :45)
- Business hours extended: `45 9-17 * * 1-5` (9 AM–5 PM, Monday–Friday)
- Twice daily: `0 9,15 * * *` (9:00 AM and 3:00 PM)

### Add video post support

LinkedIn supports video posts via the Community Management API:

1. Add "Post Type" = "Video"
2. Download the video from Google Drive (instead of an image)
3. Change the share media category to VIDEO
4. Upload the video to LinkedIn's media upload endpoint first, then create the post

### Support personal profiles

If you want to post to personal profiles (not an organization):

- Change authentication from "communityManagement" to "oAuth2"
- Change "Post as" from "organization" to "person"
- Use LinkedIn OAuth2 API credentials (not Community Management)

### Add Slack notifications

After the publishing nodes, add:

- **Slack node** to send a confirmation
- **Format:** "Published LinkedIn post: [URL] at [time]"
- **Include:** a post caption preview for context

### Multi-organization support

Modify the .env sheet to support multiple company pages:

| Organization Name | LinkedIn Organization ID |
|---|---|
| Main Brand | 56420402 |
| Sub Brand | 78901234 |

Add an "Organization Name" column to the Post URL sheet, then filter and route by organization.

## Troubleshooting

### Posts not publishing

- **OAuth expired:** Re-authenticate the LinkedIn Community Management API credentials in n8n.
- **Organization permissions:** Verify your LinkedIn account has an admin/content creator role on the organization page.
- **API access:** Ensure your LinkedIn app has the Community Management API product added and approved.
- **Organization ID wrong:** Double-check the ID matches your actual company page.

### Wait node not working

- **Scheduled time in the past:** The n8n Wait node requires future times. If "Scheduled On" is in the past when the workflow runs, it fails. Ensure posts are scheduled for future times only.
- **Timezone mismatch:** If posts publish at the wrong times, verify the timezone conversion is correct (Eastern → your server timezone).
- **DateTime format error:** Ensure "Scheduled On" uses exactly the "YYYY-MM-DD HH:MM" format, with a space between date and time.

### Images not downloading from Google Drive

- **OAuth expired:** Re-authenticate the Google Drive credentials.
- **File permissions:** Ensure the Google account connected to n8n has "Viewer" access to the Drive files.
- **Sharing link format:** The Media URL must be a full Google Drive URL (https://drive.google.com/file/d/FILE_ID/view), not a shortened link.

### Multiple posts with the same time

- **Current limitation:** This workflow processes ONE post per run. If multiple posts share the same scheduled time, only the first will publish.
- **Solution:** Stagger times by 1 minute (10:00, 10:01, 10:02) or modify the workflow to process multiple posts.

### Sheet not updating

- **OAuth expired:** Re-authenticate the Google Sheets credentials.
- **Sheet name mismatch:** Verify the sheet tab names are exactly "Post URL" and ".env" (case-sensitive).
- **row_number missing:** Ensure the sheet has a row_number column auto-generated by the formula: =ROW()-1

### Article previews not showing

- **URL not accessible:** LinkedIn needs to scrape the article URL. Ensure it's publicly accessible (not behind a login or paywall).
- **No Open Graph tags:** Article pages need Open Graph meta tags (og:title, og:description, og:image) for LinkedIn to generate previews.
- **LinkedIn cache:** Sometimes LinkedIn caches old previews.
Use the LinkedIn Post Inspector to refresh the cache.

## Resources

- n8n documentation
- LinkedIn Community Management API
- LinkedIn OAuth Guide
- Google Sheets API
- Google Drive API
- n8n Wait node
- n8n LinkedIn node

## Support

Need help or custom development?

📧 Email: info@isawow.com
🌐 Website: https://isawow.com/
by Bhuvanesh R
# Your Cold Email is Now Researched: this pipeline finds specific bottlenecks on prospect websites and instantly crafts an irresistible pitch

## 🎯 Problem Statement

Traditional high-volume cold email outreach is stuck on generic personalization (e.g., "Love your website!"). Sales teams, especially those selling high-value AI Receptionists, struggle to efficiently find the one Unique Operational Hook (like manual scheduling dependency or high call volume) needed to make the pitch relevant. This forces reliance on expensive, slow manual research, leading to low reply rates and inefficient spending on bulk outreach tools.

## ✨ Solution

This workflow deploys a resilient Dual-AI Personalization Pipeline that runs on a batch basis. It uses the Filter (Qualified Leads) node as a cost-saving Quality Gate to avoid processing bad leads. It executes a Targeted Deep Dive on successful leads, using GPT-4 for analytical insight extraction and Claude Sonnet for coherent, human-like copy generation. The entire process outputs campaign-ready data directly to Google Sheets and sends a critical QA draft via Gmail.

## ⚙️ How It Works (Multi-Step Execution)

### 1. Ingestion and Cost Control (The Quality Gate)

- **Trigger and Ingestion:** The workflow starts via a **Manual Trigger**, pulling leads directly from **Get All Leads (Google Sheets)**.
- **Cost Filtering:** The **Filter (Qualified Leads)** node removes leads that lack a working email or website URL.
- **Execution Isolation:** The **Loop Over Leads** node initiates individual processing. The **Capture Lead Data (Set)** node immediately captures and locks down the original lead context for stability throughout the loop.
- **Hybrid Scraping:** The **Scrape Site (HTTP Request)** and **Extract Text & Links (HTML)** nodes execute the **Hybrid Scraping** strategy, simultaneously capturing **website text** and **external links**.
- **Data Shaping & Status:** The **Filter Social & Status (Code)** node is the control center.
  It filters links, bundles the context, and, critically, assigns a **status** of 'Success' or 'Scrape Fail'.
- **Cost Control Branch:** The **If (IF node)** checks this status. Items with 'Scrape Fail' bypass all AI steps (saving **100% of AI token costs**) and jump directly to **Log Final Result**. Successful items proceed to the AI core.

### 2. Dual-AI Coherence & Dispatch (The Executive Output)

- **Analytical Synthesis:** The **Summarize Website (OpenAI)** node uses **GPT-4** to synthesize the full context and extract the **Unique Operational Hook** (e.g., manual booking overhead).
- **Coherent Copy Generation:** The **Generate Subject & Body (Anthropic)** node uses the **Claude Sonnet** model to generate the subject and the multi-line body, guaranteeing **coherence** by creating both simultaneously in a single JSON output.
- **Final Parsing:** The **Parse AI Output (Code)** node reliably strips markdown wrappers and extracts the clean **subject** and **body** strings.
- **Final Delivery:** The data is logged via **Log Final Result (Google Sheets)**, and the completed email is sent to the user via **Create a draft (Gmail)** for final Quality Assurance before sending.

## 🛠️ Setup Steps

Before running the workflow, ensure these credentials and data structures are correctly configured:

### Credentials

- **Anthropic:** Configure credentials for the language model (Claude Sonnet).
- **OpenAI:** Configure credentials for the analytical model (GPT-4/GPT-4o).
- **Google Services:** Set up OAuth2 credentials for **Google Sheets** (input/output) and **Gmail** (draft QA and completion alert).

### Configuration

- **Google Sheet Setup:** Your input sheet must include the columns **email**, **website_url**, and an empty **Icebreaker** column for initial filtering.
- **HTTP URL:** Verify that the **Scrape Site** node's URL parameter pulls the website URL from the stabilized data structure: `={{ $json.website_url }}`.
- **AI Prompts:** Ensure the Anthropic prompt contains your current Irresistible Sales Offer and the required nested JSON output structure.
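A minimal sketch of what the Parse AI Output (Code) node does, assuming the model returns a single JSON object that may be wrapped in a markdown fence (the helper name and sample output are illustrative):

```javascript
// Sketch: strip an optional ```json ... ``` wrapper and pull out subject/body.
// Assumes the model was prompted to return {"subject": "...", "body": "..."}.
function parseAiOutput(raw) {
  const cleaned = raw
    .replace(/^\s*```(?:json)?\s*/i, '') // leading markdown fence, if present
    .replace(/\s*```\s*$/, '')           // trailing fence
    .trim();
  const parsed = JSON.parse(cleaned);
  return { subject: parsed.subject, body: parsed.body };
}

// Example model output wrapped in a fence (illustrative copy):
const raw = '```json\n{"subject": "Cut your call volume", "body": "Hi Sam,\\nSaw your booking page..."}\n```';
const email = parseAiOutput(raw);
```

Generating subject and body in one JSON object, then parsing them together, is what keeps the two halves of the email aligned on the same insight.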
## ✅ Benefits

- **Coherence Guarantee:** A single **Anthropic** node generates both the subject and body, guaranteeing the message is perfectly aligned and hits the same unique insight.
- **Maximum Cost Control:** The **IF node** prevents spending tokens on bad or broken websites, making the campaign highly **budget-efficient**.
- **Deep Personalization:** Combines **website text** and **social media links**, creating an icebreaker that implies thorough, manual research.
- **High Reliability:** Uses robust **Code nodes** for data structuring and parsing, ensuring the workflow runs consistently under real-world conditions without crashing.
- **Zero-Risk QA:** The final **Gmail (Create a draft)** step ensures human review of the generated copy before any cold emails are sent out.
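As a closing illustration, the Filter Social & Status (Code) node's role described in the How It Works section can be sketched as follows; the social-domain list, text-length threshold, and field names are assumptions, not the exact workflow code:

```javascript
// Sketch: keep social-profile links from the scraped page and flag scrape success.
// Domain list and the 100-character threshold are illustrative assumptions.
const SOCIAL_DOMAINS = ['linkedin.com', 'twitter.com', 'facebook.com', 'instagram.com'];

function shapeLead(lead, siteText, links) {
  const socialLinks = (links || []).filter(url =>
    SOCIAL_DOMAINS.some(domain => url.includes(domain)));
  const scraped = Boolean(siteText && siteText.trim().length > 100);
  return {
    ...lead,                              // preserve the captured lead context
    site_text: siteText || '',
    social_links: socialLinks,
    status: scraped ? 'Success' : 'Scrape Fail', // drives the IF node's cost-control branch
  };
}

// Example: a lead whose page scraped successfully and exposes one social profile.
const shaped = shapeLead(
  { email: 'sam@example.com', website_url: 'https://example.com' },
  'x'.repeat(500),
  ['https://linkedin.com/company/example', 'https://example.com/about']
);
```

Items shaped this way flow into the IF node, where a 'Scrape Fail' status skips both AI calls entirely.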