by Influencers Club
**How it works:** Get multi-platform social data for SaaS clients from their email addresses and send personalized communications to onboard them as organic creators, partners, and ambassadors. A step-by-step workflow to enrich customer emails with multi-platform social profiles (Instagram, TikTok, YouTube, Twitter, OnlyFans, Twitch, and more), analytics, and metrics using the influencers.club API, then send tailored outreach to activate them as creators.

**Set up:**
- HubSpot (can be swapped for any CRM like Salesforce, Attio, or a database)
- Influencers.club
- Gmail
- SendGrid (can be swapped for any programmatic email sender like Mailgun)
by Ruth Aju
**Who it's for**
SaaS founders and developers who want to automate their customer onboarding experience from payment to welcome email, without any manual work.

**How it works**
1. A Stripe Trigger listens for successful payment events.
2. The payment amount is converted and used to identify the subscription tier.
3. Customer details are extracted from the Stripe payload.
4. The AI Agent queries Pinecone to retrieve the correct plan details and generates a personalised HTML welcome email with the renewal date calculated automatically.
5. The email is parsed and sent via Gmail.
6. Customer details and subscription info are logged to Google Sheets for renewal tracking.

**Set up steps**
- Connect your Stripe account and point it to listen for checkout.session.completed events.
- Store your tier information as chunks in Pinecone.
- Add your OpenAI credentials for the AI Agent and Embeddings nodes.
- Connect Gmail as your sending account.
- Create a Google Sheet with columns: Name, Email, Amount, Tier, Renewal Date, Status.

**Requirements**
- Stripe account
- Pinecone account with tier knowledge base uploaded
- OpenAI account
- Gmail account
- Google Sheets
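The tier-identification and renewal-date steps could look like this in an n8n Code node. The tier boundaries and billing interval here are illustrative assumptions, not the template's actual pricing:

```javascript
// Hypothetical sketch: map a Stripe amount (in cents) to a subscription
// tier and compute the renewal date. Tier amounts are assumptions --
// replace them with your own pricing.
const TIERS = [
  { name: 'Basic', amount: 900 },
  { name: 'Pro', amount: 2900 },
  { name: 'Enterprise', amount: 9900 },
];

function identifyTier(amountInCents) {
  const tier = TIERS.find(t => t.amount === amountInCents);
  return tier ? tier.name : 'Unknown';
}

function renewalDate(paidAt, months = 1) {
  // Assumes monthly billing; pass months = 12 for annual plans.
  const d = new Date(paidAt);
  d.setMonth(d.getMonth() + months);
  return d.toISOString().slice(0, 10); // YYYY-MM-DD
}
```

The returned tier name and date can then be passed to the AI Agent prompt and the Google Sheets row.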
by John Alejandro Silva
Rizz AI: The Multimodal Dating Assistant 💘

Rizz AI is not just a chatbot; it's a full-featured, AI-powered CRM for your dating life. Built entirely in n8n, this workflow turns Telegram into a powerful "Wingman" that helps you craft the perfect reply, remember details about your matches, and optimize your dating strategy using Google Gemini 1.5 Pro.

🔥 Key Features
- 👁️ **Multimodal Vision:** Send a screenshot of a Tinder/Hinge profile or a WhatsApp chat, and the AI will analyze the text, subtext, and vibe to give you tactical advice.
- 🗣️ **Audio Analysis:** Forward voice notes, and the AI will transcribe and analyze the tone to tell you if they are interested.
- 🧠 **Long-Term Memory:** Remembers details about specific matches (e.g., "Sofia likes sushi") so you don't ask the same thing twice.
- 📊 **Lead Management (CRM):** Automatically tracks matching stage, interest level, and next steps in Google Sheets.
- 🎨 **Personalized Style:** Adapts advice to your specific "Rizz Style" (e.g., Mystery, Direct, Funny) defined in your profile.

🛠️ How It Works
1. Ingest: You send text, audio, or images to your private Telegram Bot.
2. Process: The workflow routes the input through Gemini Vision (for images) or Whisper/Gemini (for audio).
3. Retrieve: It queries your Google Sheet to see if this person is a new lead or an existing match.
4. Reason: The AI Agent (with tools) decides the best move: suggesting a reply, logging a red flag, or scheduling a date.
5. Respond: You receive 3 draft options to copy-paste directly into your dating app.

📋 Setup Instructions
1. Google Sheets (Database): Make a copy of the Rizz AI Database Template. Share/connect your Google Drive credential in n8n. Update the Sheet ID in the Get Rizzler Profile and other Sheet nodes.
2. Telegram Bot: Talk to @BotFather on Telegram to create a new bot. Copy the API Token into the Telegram Trigger and Send Message nodes.
3. Google Gemini: Get a free API key from Google AI Studio. Connect it to the Google Gemini Chat Model node.

💡 Need Assistance?
If you’d like help customizing or extending this workflow, feel free to reach out: 📧 Email: johnsilva11031@gmail.com 🔗 LinkedIn: John Alejandro Silva Rodríguez
by Victor Manuel Lagunas Franco
Turn any topic into a ready-to-study Anki deck. This workflow generates vocabulary flashcards with AI images and native pronunciation, then sends the .apkg file straight to your inbox.

**What it does**
1. You fill out a simple form (topic, languages, difficulty).
2. GPT-4 creates vocabulary with translations, readings, and example sentences.
3. DALL-E 3 generates a unique image for each word.
4. ElevenLabs adds native pronunciation audio (word + example).
5. Everything gets packaged into a real .apkg file.
6. The deck lands in your email, ready to import into Anki.
7. A backup copy saves to Google Sheets.

**Why I built this**
I was spending hours making flashcards by hand for language learning. Finding images, recording audio, formatting everything for Anki... it took forever. This workflow does all of that in about 3 minutes.

**Setup (~15 min)**
- Install npm packages: jszip and sql.js
- Add OpenAI credentials (for GPT-4 + DALL-E)
- Add ElevenLabs credentials
- Connect Gmail and Google Sheets via OAuth
- Update OPENAI_API_KEY in the DALL-E code node
- Update the Spreadsheet ID in the Sheets node

**Features**
- 20 languages supported
- 7 image styles (minimal icons, kawaii, realistic, watercolor, pixel art...)
- 6 difficulty levels (A1 to C2)
- Optional reverse cards (target→native AND native→target)
- Works on Anki desktop and mobile
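The optional reverse-card feature could be generated roughly like this; the field names and note shape are assumptions for illustration, not the template's exact schema:

```javascript
// Hedged sketch: each vocabulary entry yields a target->native note,
// plus a native->target note when reverse mode is enabled.
function buildNotes(entries, { reverse = false } = {}) {
  const notes = [];
  for (const e of entries) {
    notes.push({ front: e.word, back: e.translation, example: e.example });
    if (reverse) {
      notes.push({ front: e.translation, back: e.word, example: e.example });
    }
  }
  return notes;
}
```

The resulting note list would then be written into the Anki deck database before jszip packages it as an .apkg.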
by Cheng Siong Chin
**How It Works**
This workflow automates enterprise claims cost leakage detection by identifying overpayments, policy deviations, and pricing inconsistencies across claims data. It supports claims operations, finance, and audit teams by providing continuous, AI-driven monitoring without manual review. Claims data is ingested through parallel HTTP requests, including claim history, policy details, pricing rules, and enrichment data. Historical claim patterns feed calculator-based risk scoring to flag potential leakage scenarios. All data streams are consolidated and analyzed using GPT-4 with structured outputs to detect anomalies, quantify leakage risk, and recommend corrective adjustments. The workflow generates claim-level findings and routes outcomes by severity: high-risk leakage triggers immediate email and Slack alerts, while lower-risk issues are compiled into periodic audit and recovery reports.

**Setup Steps**
- Configure HTTP nodes with your claims data source APIs
- Add OpenAI API key to the Chat Model node for AI analysis
- Connect Gmail account and set the leadership distribution list
- Integrate Slack workspace and configure the audit team channel
- Adjust Schedule node timing for preferred monitoring frequency

**Prerequisites:** OpenAI API key, claims data source API access, policy and pricing data credentials

**Use Cases:** Insurance and claims operations teams detecting overpayments and recovering leaked costs

**Customization:** Modify risk scoring formulas in Calculator nodes for industry-specific metrics

**Benefits:** Transforms hours of manual claims review into automated minutes-long cycles
by Cheng Siong Chin
**How It Works**
This automated disaster response workflow streamlines emergency management by monitoring multiple alert sources and coordinating property protection teams. Designed for property managers, insurance companies, and emergency response organizations, it solves the critical challenge of rapidly identifying at-risk properties and deploying resources during disasters. The system continuously monitors weather, seismic, and flood alerts from authoritative sources. When threats are detected, it cross-references property databases to identify affected locations, calculates insurance exposure, and generates damage assessments using OpenAI's GPT-4. Teams receive automated maintenance schedules while property owners and insurers get instant email notifications with comprehensive reports. This eliminates manual monitoring, reduces response time from hours to minutes, and ensures no vulnerable properties are overlooked during emergencies.

**Setup Steps**
- Configure alert fetch nodes with weather/seismic/flood API endpoints
- Connect property database credentials (specify database type)
- Add OpenAI API key for GPT-4 damage assessments
- Set up Gmail/SMTP credentials for owner and insurer notifications
- Customize insurance calculation formulas and team scheduling logic

**Prerequisites:** Weather/seismic/flood alert API access, property database (SQL/Sheets/Airtable)

**Use Cases:** Insurance companies automating claims preparation, property management firms protecting rental portfolios

**Customization:** Modify alert source APIs, adjust damage assessment prompts

**Benefits:** Reduces emergency response time by 90%, eliminates manual alert monitoring
by Țugui Dragoș
This workflow is a complete, production-ready solution for recovering abandoned carts in Shopify stores using a multi-channel, multi-touch approach. It automates personalized follow-ups via Email, SMS, and WhatsApp, tracks every customer interaction for multi-touch attribution, and enables advanced retargeting and analytics.

**Key features:**
- Multi-step, timed recovery sequence (Email → SMS → Email → WhatsApp)
- Customer segmentation (new, returning, VIP) and A/B testing
- Dynamic discounting and personalized messaging
- Touchpoint logging to Google Sheets for attribution analysis
- Facebook Custom Audience retargeting for unrecovered carts
- Slack notifications for high-value cart recoveries

**What does this workflow do?**
1. Listens for abandoned cart events from Shopify (or any e-commerce platform) via webhook.
2. Normalizes and enriches cart data by fetching full cart details and customer purchase history.
3. Predicts the likely reason for abandonment (e.g., price sensitivity, checkout complexity, technical issues) using rule-based logic.
4. Segments the customer (new, returning, VIP), assigns an A/B test group, and generates a personalized discount and checkout URL.
5. Runs a timed, multi-channel recovery sequence:
   - 1 hour after abandonment: Checks if the order is completed. If not, sends a personalized Email #1 and logs the touchpoint.
   - 4 hours after abandonment: Checks again. If not recovered, sends an SMS with a discount code and logs the touchpoint.
   - 24 hours after abandonment: Checks again. If not recovered, sends Email #2 (with social proof/urgency) and logs the touchpoint.
   - 48 hours after abandonment: Final check. If not recovered, sends a WhatsApp reminder and logs the touchpoint.
6. If the cart is still not recovered: Hashes customer identifiers and adds them to a Facebook Custom Audience for retargeting.
7. Logs every touchpoint (email, SMS, WhatsApp) to a Google Sheet for multi-touch attribution analysis.
8. Sends a Slack notification if a high-value cart is recovered.
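The segmentation and discount step might look like this inside an n8n Code node. The segment thresholds, discount tiers, and URL shape below are illustrative assumptions, not the template's exact logic:

```javascript
// Hedged sketch of the Personalization Engine: segment the customer,
// assign an A/B group, and build a discounted checkout URL.
function personalize(cart) {
  // Thresholds are assumptions; tune them to your store's economics.
  const segment =
    cart.lifetimeSpend > 1000 ? 'vip' :
    cart.orderCount > 0 ? 'returning' : 'new';
  const abGroup = cart.id % 2 === 0 ? 'A' : 'B';
  const discount = { vip: 15, returning: 10, new: 5 }[segment];
  const code = `COMEBACK${discount}`;
  // Hypothetical store domain for illustration only.
  const url = `https://example-store.myshopify.com/checkout?cart=${cart.id}&discount=${code}`;
  return { segment, abGroup, discount, code, url };
}
```

The output feeds the email, SMS, and WhatsApp message templates in the later steps of the sequence.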
**Why is this workflow useful?**
- **Boosts recovery rates:** By using multiple channels and personalized timing, you maximize the chance of recovering lost sales.
- **Improves attribution:** Every customer interaction is logged, so you can analyze which channels and messages drive conversions.
- **Enables advanced retargeting:** Unrecovered carts are automatically added to a Facebook Custom Audience for paid retargeting.
- **Saves time:** Fully automated, with easy configuration for your store, messaging, and analytics.
- **Scalable and extensible:** Easily adapt the sequence, add more channels, or integrate with other tools.

**How to install and configure**

1. Prerequisites
- n8n instance (v2.0.2+ recommended)
- Shopify store with API access
- Accounts and API credentials for: SendGrid (email), Twilio (SMS), WhatsApp Business API, Google Sheets (service account), Facebook Graph API (for Custom Audiences), Slack (for notifications)

2. Setup steps
- Import the workflow into your n8n instance.
- Configure the “Workflow Configuration” node: set your Shopify domain, API URLs, Google Sheets ID, and high-value threshold.
- Connect all required credentials in the respective nodes: Shopify, SendGrid, Twilio, WhatsApp, Google Sheets, Facebook Graph API, Slack.
- Create a Google Sheet named “Touchpoints” with columns: cart_id, customer_id, touchpoint_type, timestamp, channel, ab_group.
- Set up the webhook in your Shopify store (or e-commerce platform) to trigger the workflow on cart abandonment.
- Test the workflow with a sample abandoned cart event to ensure emails, SMS, WhatsApp, and logging work as expected.
- Enable the workflow in n8n for live operation.

**Node-by-node breakdown**
- **Abandoned Cart Webhook:** Receives abandoned cart events.
- **Workflow Configuration:** Stores global settings (API URLs, Shopify domain, Google Sheets ID, high-value threshold).
- **Normalize Cart Data:** Cleans and standardizes incoming cart data.
- **Fetch Cart Details / Fetch Customer History:** Enriches data with full cart and customer info.
- **Predict Abandonment Reason:** Uses business logic to guess why the cart was abandoned.
- **Personalization Engine:** Segments the customer, assigns an A/B group, calculates the discount, and builds the checkout URL.
- **Customer Segment Check / Device Type Check:** Applies routing logic for personalized messaging.
- **Wait / Check Order Status / Generate & Send Messages:** Timed sequence for Email, SMS, and WhatsApp, with order status checks at each step.
- **Log Touchpoint (Google Sheets):** Records every message sent for attribution.
- **Attribution Merge:** Combines all touchpoints into a single journey for analysis.
- **Hash Customer Data for Facebook / Add to Retargeting Audience:** Adds unrecovered carts to a Facebook Custom Audience.
- **Check Cart Value Threshold / Notify High-Value Recovery:** Sends Slack alerts for high-value recoveries.

**Customization tips**
- Adjust wait times and message content to fit your brand and audience.
- Add or remove channels (e.g., push notifications, phone calls) as needed.
- Expand the Google Sheet for deeper analytics (e.g., add UTM parameters, campaign IDs).
- Integrate with your CRM or analytics platform for end-to-end tracking.

**Troubleshooting**
- Make sure all API credentials are set and tested.
- Check Google Sheets permissions for the service account.
- Test each channel (email, SMS, WhatsApp) individually before going live.
- Review the workflow execution logs in n8n for errors or failed steps.
by Zain Khan
**Categories:** Business Automation, E-commerce, Intelligence, AI

This workflow automates high-frequency price tracking across e-commerce platforms. It combines the data-handling power of the Decodo node with the intelligence of Google Gemini to eliminate manual price checks. It is for businesses seeking real-time market intelligence.

**Benefits**
- Total Automation: Handles data sourcing and email notifications without human help.
- Intelligent Extraction: Uses AI to analyze the full page content.
- Precision Alerting: Triggers notifications when a product's price meets or falls below the "Desired Price."
- Scalable Architecture: Processes large batches of products.

**How It Works**
1. Scheduled Data Retrieval: The Schedule Trigger pulls a list of URLs and target prices from Google Sheets.
2. Raw Data Processing: Data flows through a Decodo node.
3. Full-Body Extraction: The workflow captures the entire body of the webpage.
4. AI-Driven Analysis: An AI Agent, powered by Google Gemini, analyzes the text to identify the product name and price.
5. Regex Data Cleaning: A JavaScript node uses regular expressions to sanitize the AI's response.
6. Smart Comparison & Alerting: An If node compares the live price against the "Desired Price." If the condition is met, an automated alert is sent via Gmail.

**Requirements**
- n8n Instance
- Google Account
- Google Gemini API Key
- Decodo Credentials

**How to Use**
1. Set up your spreadsheet: Create a Google Sheet with columns for the product link and "Desired price."
2. Authenticate nodes: Connect your Google Sheets, Gmail, and Gemini credentials within n8n.
3. Configure parameters: Ensure the If node correctly references the "Desired price" column from your Google Sheet output.
4. Deploy: Activate the workflow. The system will now run automatically, monitoring the list and notifying you of deals.

**Business Use Cases**
- Retail Arbitrage Agencies: Spot price drops on supplier sites to maximize profit margins.
- Competitor Intelligence: Monitor rival pricing strategies.
- Procurement Departments: Automate the "buy" signal for raw materials when they hit a specific price point.
- E-commerce Managers: Track MAP (Minimum Advertised Price) compliance.

**Revenue Potential**
- Increased Margins: Buy inventory at the lowest prices.
- Market Leadership: React faster than competitors to market-wide price shifts.
- Service Offering: Provide "Price Watch" services for e-commerce clients.

**Difficulty Level:** Intermediate
**Estimated Setup Time:** 40 min
**Monthly Operating Cost:** Low (based on Gemini API tokens)
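The regex-cleaning step described above could be sketched like this in the JavaScript node; the response shapes it handles are assumptions about how Gemini phrases its answer:

```javascript
// Hedged sketch of the Regex Data Cleaning node: pull a numeric price
// out of the AI's free-text answer so the If node can compare it.
function extractPrice(aiText) {
  // Strip thousands separators, then take the first number that looks
  // like a price (e.g. "$1,299.99" -> 1299.99).
  const match = aiText.replace(/,/g, '').match(/(\d+(?:\.\d{1,2})?)/);
  return match ? parseFloat(match[1]) : null;
}
```

Returning null on a failed match lets the workflow skip the comparison instead of alerting on garbage data.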
by Bakdaulet Abdikhan
Analyze Meta ads with Gemini and Google Sheets

Stop manually exporting CSVs and start automating your marketing insights. This workflow is designed for marketing agencies, freelancers, and media buyers who want to keep a daily pulse on their Meta (Facebook/Instagram) Ads performance without logging into Ads Manager. It doesn't just scrape data; it uses Google Gemini AI to act as a virtual data analyst. It reviews your campaigns, identifies winning/losing creatives, and writes strategic suggestions for both your agency team and your clients.

🚀 What this workflow does
1. Extracts Data: Wakes up every morning (6:00 AM) to fetch yesterday's ad and campaign performance from the Facebook Graph API.
2. Cleans & Filters: Automatically ignores paused or zero-spend campaigns to keep your reports clean.
3. Structuring: Uses a Code node to group ads intelligently under their respective ad sets and campaigns.
4. AI Analysis: Sends the structured data to Google Gemini. The AI analyzes CTR, CPC, and spend to identify the "Best Performing Ad" and "Worst Performing Ad" per ad set.
5. Reporting: Saves raw campaign data, raw ad data, and AI-generated insights (client & agency suggestions) to dedicated Google Sheets tabs.
6. Error Handling: If anything breaks (e.g., API token expiry), it instantly sends you an alert via Gmail with the error details.

💡 Key Features
- **Zero-Spend Filter:** Keeps your spreadsheet tidy by excluding inactive ads.
- **Hierarchical Data Processing:** Groups data logically so the AI understands the context of your tests.
- **Dual-Perspective Insights:** The AI generates two types of advice. For the client: simple, performance-based updates. For the agency: technical optimization tips (e.g., "Pause Ad B, Scale Ad A").
- **Robust Error Monitoring:** Includes a dedicated error workflow to notify you of failures.

🛠️ Prerequisites
To use this template, you will need:
- **Meta/Facebook Developer App:** A System User Access Token with ads_read permission.
- **Google Cloud Console Project:** Enabled APIs for Google Sheets, Gmail, and Vertex AI (Gemini).
- **Google Sheet:** A sheet with three tabs: Campaigns, Ads, and AI_Insights.

📝 Setup Instructions
1. Configure Credentials: Connect your Facebook Graph API and Google accounts in n8n.
2. Set Configuration Node: Open the "Set Configuration" node and paste your Ad Account ID and email address for error alerts.
3. Link Google Sheet: Open the three Google Sheets nodes and select your spreadsheet file.
4. Activate: Turn on the workflow and let it run daily!

Need help setting this up or want a custom automation for your agency? I specialize in building agentic workflows for consultants and agencies. 📧 Contact me: bakdaulet.mph@gmail.com
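The hierarchical grouping step (the Code node in "Structuring") might be sketched as follows. The field names are assumptions about the flattened Graph API insights rows, not the template's exact payload:

```javascript
// Hedged sketch: group flat ad rows under their ad set and campaign so
// the AI can compare creatives within the same test context.
function groupAds(rows) {
  const campaigns = {};
  for (const row of rows) {
    if (!campaigns[row.campaign_id]) {
      campaigns[row.campaign_id] = { name: row.campaign_name, adSets: {} };
    }
    const campaign = campaigns[row.campaign_id];
    if (!campaign.adSets[row.adset_id]) {
      campaign.adSets[row.adset_id] = { name: row.adset_name, ads: [] };
    }
    campaign.adSets[row.adset_id].ads.push({
      id: row.ad_id,
      name: row.ad_name,
      ctr: row.ctr,
      cpc: row.cpc,
      spend: row.spend,
    });
  }
  return campaigns;
}
```

The nested structure is what lets Gemini reason about "best ad per ad set" rather than comparing unrelated creatives.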
by Khairul Muhtadin
Streamline M&A due diligence with AI. This n8n workflow automatically parses financial documents using LlamaIndex, embeds data into Pinecone, and generates comprehensive, AI-driven reports with GPT-5-mini, saving hours of manual review and ensuring consistent, data-backed insights.

**Why Use This Workflow?**
- Time Savings: Reduces manual document review and report generation from days to minutes.
- Cost Reduction: Minimizes reliance on expensive human analysts for initial data extraction and summary.
- Error Prevention: AI-driven analysis ensures consistent data extraction, reducing human error and oversight.
- Scalability: Effortlessly processes multiple documents and deals in parallel, scaling with your business needs.

**Ideal For**
- **Investment Analysts & Private Equity Firms:** Quickly evaluate target companies by automating the extraction of key financials, risks, and business models from deal documents.
- **M&A Advisors:** Conduct preliminary due diligence efficiently, generating comprehensive overview reports for clients without extensive manual effort.
- **Financial Professionals:** Accelerate research and analysis of company filings, investor presentations, and market reports for critical decision-making.

**How It Works**
1. Trigger: A webhook receives multiple due diligence documents (PDFs, DOCX, XLSX) along with associated metadata.
2. Document Processing & Cache Check: Files are split individually. The workflow first checks Pinecone to see if the deal's documents have been processed before (cache hit). If so, it skips parsing and embedding.
3. Data Extraction (LlamaIndex): For new deals, each document is sent to LlamaIndex for advanced parsing, extracting structured text content.
4. Vectorization & Storage: The parsed text is then converted into numerical vector embeddings using OpenAI and stored in Pinecone, our vector database, with relevant metadata.
5. AI-Powered Analysis (Langchain Agent): An n8n Langchain Agent, acting as a "Senior Investment Analyst," leverages GPT-5-mini to query Pinecone multiple times for specific information (e.g., company profile, financials, risks, business model). It synthesizes these findings into a structured JSON output.
6. Report Generation: The structured AI output is transformed into an HTML report, then converted into a professional PDF document.
7. Secure Storage & Delivery: The final PDF due diligence report is uploaded to an S3 bucket, and a public URL is returned via the initial webhook, providing instant access.

**Setup Guide**

Prerequisites

| Requirement | Type | Purpose |
| :---------- | :--- | :------ |
| n8n instance | Essential | Workflow execution platform |
| LlamaIndex API Key | Essential | For robust document parsing and text extraction |
| OpenAI API Key | Essential | For creating text embeddings and powering the GPT-5-mini AI agent |
| Pinecone API Key | Essential | For storing and retrieving vector embeddings |
| AWS S3 Account | Essential | For secure storage of generated PDF reports |

Installation Steps
1. Import the JSON file to your n8n instance.
2. Configure credentials:
   - LlamaIndex: Create an "HTTP Header Auth" credential with x-api-key in the header and your LlamaIndex API key as the value.
   - OpenAI: Create an "OpenAI API" credential with your OpenAI API key. Ensure the credential name is "Sumopod" or update the workflow nodes accordingly.
   - Pinecone: Create a "Pinecone API" credential with your Pinecone API key and environment. Ensure the credential name is "w3khmuhtadin" or update the workflow nodes accordingly.
   - AWS S3: Create an "AWS S3" credential with your Access Key ID and Secret Access Key.
3. Update environment-specific values: In the "Upload to S3" node, ensure the bucketName is set to your desired S3 bucket.
In the "Create Public URL" node, update the baseUrl variable to match your S3 bucket's public access URL or CDN if applicable (e.g., https://your-s3-bucket-name.s3.amazonaws.com).
4. Customize settings: Review the prompt in the "Analyze" (Langchain Agent) node to adjust the AI's persona or required queries if needed.
5. Test execution: Send sample documents (PDF, DOCX, XLSX) to the webhook URL (/webhook/dd-ai) to verify all connections and processing steps work as expected.

**Technical Details**

Core Nodes

| Node | Purpose | Key Configuration |
| :--- | :------ | :---------------- |
| Webhook | Initiates workflow with document uploads | Path: dd-ai, HTTP Method: POST |
| Split Multi-File (Code) | Splits binary files, generates unique deal ID | Parses filenames from body or binary, creates dealId from sorted names. |
| Parse Document via LlamaIndex | Extracts structured text from various document types | URL: https://api.cloud.llamaindex.ai/api/v1/parsing/upload, Authentication: HTTP Header Auth with x-api-key. |
| Monitor Document Processing | Polls LlamaIndex for parsing status | URL: https://api.cloud.llamaindex.ai/api/v1/parsing/job/{{ $json.id }}, Authentication: HTTP Header Auth. |
| Insert to Pinecone | Stores vector embeddings in Pinecone | Mode: insert, Pinecone Index: poc, Pinecone Namespace: dealId. |
| Data Retrieval (Pinecone) | Enables AI agent to search due diligence documents | Mode: retrieve-as-tool, Pinecone Index: poc, Pinecone Namespace: {{ $json.dealId }}, topK: 100. |
| Analyze (Langchain Agent) | Orchestrates AI analysis using specific queries | Prompt Type: define, detailed role and 6 mandatory Pinecone queries, Model: gpt-5-mini, Output Parser: Parser. |
| Generate PDF (Puppeteer) | Converts HTML report to a professional PDF | Script Code: await $page.pdf(...) with A4 format, margins, and 60s timeout. |
| Upload to S3 | Stores final PDF reports securely | Bucket Name: poc, File Name: {{ $json.fileName }}, Credentials: AWS S3. |
| If (Check Namespace Exists) | Implements caching logic | Checks stats.namespaces[dealId].vectorCount > 0 to determine cache hit/miss. |

**Workflow Logic**
The workflow begins by accepting multiple files via a webhook. It intelligently checks if the specific "deal" (identified by a unique ID generated from filenames) has already had its documents processed and embedded in Pinecone. This cache mechanism prevents redundant processing, saving time and API costs. If a cache miss occurs, documents are parsed by LlamaIndex, their content vectorized by OpenAI, and stored in a Pinecone namespace unique to the deal.

For analysis, a Langchain Agent, powered by GPT-5-mini, is instructed with a specific persona and a mandatory sequence of Pinecone queries (e.g., company overview, financials, risks). It uses the Data Retrieval tool to interact with Pinecone, synthesizing information from the stored embeddings. The AI's output is then structured by a dedicated parser, transformed into a human-readable HTML report, and converted into a PDF. Finally, this comprehensive report is uploaded to AWS S3, and a public access URL is provided as a response.

**Customization Options**

Basic Adjustments:
- **AI Prompt Refinement:** Modify the Prompt field in the "Analyze" (Langchain Agent) node to adjust the AI's persona, introduce new mandatory queries, or change reporting style.
- **Output Schema:** Update the JSON schema in the "Parser" (Langchain Output Parser Structured) node to include additional fields or change the structure of the AI's output.

Advanced Enhancements:
- **Integration with CRM/Dataroom:** Add nodes to automatically fetch documents from or update status in a CRM (e.g., Salesforce, HubSpot) or a virtual data room (e.g., CapLinked, Datasite).
- **Conditional Analysis:** Implement logic to trigger different analysis paths or generate different report sections based on document content or deal parameters.
- **Notification System:** Integrate with Slack, Microsoft Teams, or email to send notifications upon report generation or specific risk identification.

**Use Case Examples**

Scenario 1: Private Equity Firm Evaluating a Target Company
- Challenge: A private equity firm receives dozens of due diligence documents (financials, CIM, management presentations) for a potential acquisition, needing a rapid initial assessment.
- Solution: The workflow ingests all documents, automatically parses them, and an AI agent synthesizes key company information, financial summaries (revenue history, margins), and identified risks into a structured report within minutes.
- Result: The firm's analysts gain an immediate, comprehensive overview, enabling faster screening and more focused deep-dive questions, significantly accelerating the deal cycle.

Scenario 2: M&A Advisor Conducting Preliminary Due Diligence
- Challenge: An M&A advisory firm needs to provide clients with a quick, consistent, and standardized preliminary due diligence report across multiple prospects.
- Solution: Advisors upload relevant prospect documents to the workflow. The AI-powered system automatically extracts core business model details, investment thesis highlights, and customer concentration analysis, along with key financials.
- Result: The firm can generate standardized, high-quality preliminary reports efficiently, ensuring consistency across all client engagements and freeing up senior staff for strategic analysis.

Created by: Khmuhtadin
Category: AI | Tags: Due Diligence, AI, Automation, M&A, LlamaIndex, Pinecone, GPT-5-mini, Document Processing

Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
by Avkash Kakdiya
**How it works**
This workflow runs on a daily schedule to analyze all Closed–Lost deals from your CRM and uncover the true reason behind each loss. It uses AI to classify the primary loss category, generate a confidence-backed explanation, and then create a realistic re-engagement strategy for every deal. All insights are consolidated into leadership-ready email and Slack summaries. Every analyzed deal and revival plan is logged for long-term tracking and audits.

**Step-by-step**

Trigger and fetch lost deals
- Schedule Trigger: Runs the workflow automatically at a defined time.
- Get many deals: Fetches all deal records from the CRM.
- If: Filters only deals marked as Closed–Lost.
- Edit Fields: Standardizes key deal attributes like amount, industry, owner, and loss reason.

Analyze loss reasons and generate revival strategies
- Brief Explanation Creator: Uses AI to identify the primary loss category with confidence.
- Code in JavaScript: Parses and normalizes AI loss analysis output.
- Merge: Combines deal data with loss insights.
- Feedback Creator: Generates a practical re-engagement strategy for each lost deal.
- Code in JavaScript7: Parses and safeguards revival strategy outputs.
- Merge4: Merges deal details, loss analysis, and revival strategy into one final dataset.

Report, notify, and store results
- Code in JavaScript11: Builds a consolidated HTML summary email.
- Send a message4: Sends the summary to stakeholders via email.
- Code in JavaScript12: Creates a structured Slack summary.
- Send a message1: Delivers insights to a Slack channel.
- Code in JavaScript10: Reconstructs final data with delivery status.
- Append or update row in sheet: Logs all results into Google Sheets for audit and tracking.

**Why use this?**
- Turns lost deals into actionable learning instead of static CRM records
- Gives sales teams clear, realistic re-engagement plans without manual analysis
- Provides leadership with concise, decision-ready summaries
- Creates a historical database of loss reasons and revival outcomes
- Improves pipeline recovery while enforcing consistent sales intelligence
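The parse-and-safeguard step (the "Code in JavaScript" node after the AI call) could be approximated like this. The category names and JSON shape are assumptions about the model's output, not the node's verbatim code:

```javascript
// Hedged sketch: parse the AI's loss analysis and fall back to a safe
// default when the model returns malformed or fenced JSON.
function parseLossAnalysis(raw) {
  const fallback = { category: 'Unclassified', confidence: 0, explanation: '' };
  try {
    // Models often wrap JSON in markdown code fences; strip them first.
    const cleaned = raw.replace(/`{3}(?:json)?/gi, '').trim();
    const parsed = JSON.parse(cleaned);
    return {
      category: parsed.category || fallback.category,
      // Clamp confidence into [0, 1] so downstream reports stay sane.
      confidence: Math.min(Math.max(Number(parsed.confidence) || 0, 0), 1),
      explanation: parsed.explanation || '',
    };
  } catch (err) {
    return fallback;
  }
}
```

Falling back to a default object instead of throwing keeps one malformed AI response from aborting the whole daily batch.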
by sato rio
This workflow automates the initial screening process for new job applications, freeing up your recruitment team to focus on qualified candidates. It receives applications from a webhook, uses OpenAI (GPT-4) to analyze resumes for skill and culture fit, generates interview questions, logs the results to Google Sheets, sends interview invitations via Gmail, and notifies your team on Slack.

🚀 Who is this for?
- **HR and Recruitment Teams** looking to automate repetitive screening tasks.
- **Hiring Managers** who want a consistent, data-driven first pass on applicants.
- **Startups and SMBs** aiming to build an efficient, scalable hiring pipeline without a large HR team.

💡 How it works
1. Receive Application: The workflow triggers when a new application is submitted via a webhook from your job board or application form.
2. Extract & Analyze: It downloads the resume/CV, extracts the text, and sends it to OpenAI (GPT-4) with a custom prompt.
3. Score & Generate: The AI scores the candidate on skill match and culture fit, provides a summary, and generates tailored interview questions based on their experience.
4. Log Data: The evaluation scores, AI summary, and candidate information are appended to a new row in a Google Sheet for tracking.
5. Schedule Interview: A personalized email is sent to the candidate via Gmail with a link to schedule their interview.
6. Notify Team: A summary card with the AI evaluation and links to the full report is posted in a Slack channel to keep the hiring team informed.

⚙️ How to set up
1. Configure Credentials: Set up your credentials for OpenAI, Google (for both Sheets and Gmail), and Slack in n8n.
2. Webhook URL: Copy the "Production URL" from the "Webhook: New Application" node and set it as the destination in your job board's webhook settings (e.g., Greenhouse, Lever, Ashby, or a web form).
3. Google Sheet: Create a Google Sheet to track applicants. Update the "G Sheets: Save Evaluation" node with your Spreadsheet ID and Sheet Name.
Ensure the columns in your sheet match the data you want to save.
4. Customize Prompts & Email: Modify the prompts in the two OpenAI nodes to match your company's values and the specific job requirements. Update the Gmail node with your email content and the logic for your scheduling link (e.g., Calendly, SavvyCal).

📋 Requirements
- An n8n instance (Cloud or self-hosted).
- An OpenAI API key.
- A Google account for Google Sheets and Gmail.
- A Slack workspace.
- A job application source capable of sending webhooks.