by Jameson Kanakulya
**Template Overview**

This comprehensive n8n workflow automates the complete process of generating professional interior design moodboards from concept to client delivery. Users submit a design brief through a form, and the system automatically generates 12 AI-powered images, compiles them into a beautifully formatted two-page PDF moodboard, and emails the final deliverable.

**Key Features:**
- Form-based design brief submission
- AI-powered image prompt generation (12 detailed prompts per project)
- Automated image generation via the Hugging Face API
- Nextcloud cloud storage with public URL sharing
- Professional two-page HTML/PDF moodboard creation
- Automated email delivery with PDF attachment

**Technologies Used:** Claude Sonnet 4 (OpenRouter), Google Gemini 2.5 Pro, Hugging Face FLUX.1-schnell, Nextcloud, Gotenberg PDF Service, Gmail

**Self-Hosted Requirements**

This template requires the following self-hosted or third-party services:
- **Nextcloud Instance** - For cloud file storage and public URL generation
- **Gotenberg PDF Service** - For HTML-to-PDF conversion (can be self-hosted via Docker)
- **OpenRouter API Access** - For the Claude Sonnet 4 AI agent
- **Google Gemini API Access** - For secondary AI processing
- **Hugging Face API Access** - For FLUX.1-schnell image generation
- **Gmail Account** - For email delivery (or any SMTP service)

**Setup Instructions**

**Step 1: Configure API Credentials**

OpenRouter (Claude Sonnet 4)
- Sign up at openrouter.ai
- Generate an API key
- Add credentials to the "OpenRouter Chat Model" node

Google Gemini API
- Visit Google AI Studio
- Create an API key
- Add credentials to the "Google Gemini Chat Model1" node

Hugging Face API
- Register at huggingface.co
- Generate an access token from Settings → Access Tokens
- Add credentials to the "Image Generator" node using HTTP Header Auth
- Header name: Authorization, Value: Bearer YOUR_TOKEN

Nextcloud
- Set up a Nextcloud instance or use a hosted provider
- Generate an app password from Settings → Security
- Configure credentials in all Nextcloud nodes:
"Create a folder" "Upload Image" "Share a file" Gotenberg PDF Service Self-host using Docker: docker run --rm -p 3000:3000 gotenberg/gotenberg:8 Or use a hosted instance Update the URL in the "PDF creator" node Configure HTTP Basic Auth credentials if required Gmail Enable 2-Factor Authentication on your Google account Generate an App Password from Google Account settings Add OAuth2 credentials to the "Send PDF" node Step 2: Customize Workflow Settings Email Extractor Node Review the stripPlus variable (default: true) This removes "+tag" portions from email addresses for folder naming Nextcloud Folder Structure Default path: /moodboard/{username}/ Modify in "Create a folder" node if needed Image Generator Settings Model: FLUX.1-schnell (fast generation, good quality) Adjust model in "Image Generator" node if needed Alternative models: FLUX.1-dev, Stable Diffusion XL PDF Generation Settings Default timeout: 360 seconds Page size: A4 (210mm Γ 297mm) Adjust in "PDF creator" node headers if needed Step 3: Test the Workflow Activate the Form Open the "Moodboard Form" node Copy the webhook URL Access the form in your browser Submit a Test Request Fill in the form fields: Title: Short, descriptive name (e.g., "Modern Minimalist Bedroom") Description: Detailed design brief with colors, materials, mood, lighting Email: Your test email address Submit and monitor workflow execution Verify Each Stage Check Nextcloud folder creation Monitor image generation progress (12 images) Review HTML moodboard generation Confirm PDF creation Check email delivery Step 4: Configure Form Embedding (Optional) Embed the form on your website: <iframe src="YOUR_N8N_FORM_WEBHOOK_URL" width="100%" height="800" frameborder="0"> </iframe> π Workflow Structure 1. Form Input & Data Extraction Moodboard Form** - Collects project title, description, and user email Email Extractor** - Extracts username from email for folder organization 2. 
Storage Setup
- **Create a Folder** - Creates a personalized Nextcloud directory using the email username

3. AI Concept Generation
- **Conceptualization Agent** (Claude Sonnet 4) - Analyzes the design brief and generates 12 detailed image prompts (300-500 words each)
- Images 1-11: Individual design elements (furniture, materials, details, styling)
- Image 12: Comprehensive 3D rendered view integrating all elements

4. Image Processing Loop
- **Concept Splitter** - Separates the 12 prompts into individual items
- **Loop Over Items** - Processes each prompt sequentially:
  - Image Generator - Sends the prompt to the Hugging Face FLUX.1-schnell API
  - Upload Image - Stores the generated image in the Nextcloud folder
  - Share a File - Creates a public shareable URL
  - Set Image Title and URL - Formats data for aggregation

5. URL Collection
- **URL Aggregate** - Combines all 12 public image URLs
- **Clean URLs** - Extracts and formats URLs into a structured list

6. Moodboard Compilation
- **Moodboard Generator Agent** (Google Gemini 2.5 Pro) - Creates a professional two-page HTML document:
  - Page 1: Visual moodboard with all 12 images (Image #12 prominently featured, 2-3x larger)
  - Page 2: Administrative summary with design overview, color palette, materials, and project details

7. PDF Generation & Delivery
- **Binary Converter** - Transforms the HTML into base64-encoded binary format
- **PDF Creator** - Converts the HTML to a print-ready PDF via the Gotenberg service
- **Send PDF** - Emails the final moodboard PDF to the user

**Node Descriptions**

**Moodboard Form** - Collects moodboard generation requests for any design topic. Users input a title, detailed description (colors, materials, patterns, textures, lighting), and email address for delivery.

**Email Extractor** - Extracts the username portion from email addresses, optionally stripping "+tags" for clean folder naming and user identification.

**Create a Folder** - Creates a dedicated Nextcloud folder using the extracted email username, organizing moodboard outputs by user.
**Conceptualization Agent** - AI agent that analyzes design briefs to generate 12 detailed image prompts (300-500 words each). Performs conceptual analysis of styles, colors, materials, and spatial requirements, outputting structured JSON.

**Concept Splitter** - Splits the 12 generated image prompts into individual items for sequential processing through the image generation pipeline.

**Loop Over Items** - Processes each prompt sequentially, generating images, uploading to Nextcloud, and creating public URLs.

**Image Generator** - Sends detailed prompts to the Hugging Face FLUX.1-schnell API for AI-powered image generation, transforming written design concepts into high-quality visuals.

**Upload Image** - Uploads each generated moodboard image to the user's Nextcloud folder with appropriate naming conventions.

**Share a File** - Creates publicly shareable Nextcloud links for each uploaded image, enabling external viewing without authentication.

**Set Image Title and URL** - Formats image metadata (title and URL) for downstream aggregation.

**URL Aggregate** - Combines all 12 image URLs into a single consolidated output for moodboard compilation.

**Clean URLs** - Extracts and formats URLs from the aggregated data into a clean, structured list with a count.

**Moodboard Generator Agent** - Transforms design concepts into professional two-page HTML moodboards. Analyzes project details and the 12 image URLs, selecting an appropriate visual style. Creates an artistic Page 1 with Image #12 as the hero element, and a comprehensive Page 2 with design documentation.

**Binary Converter** - Prepares the HTML for PDF conversion by transforming it into binary format with proper encoding and filename ("index.html") for Gotenberg compatibility.

**PDF Creator** - Converts the HTML moodboard into a print-ready PDF with proper A4 dimensions, page breaks, and high-quality image resolution.

**Send PDF** - Emails the finalized PDF moodboard to the user's submitted email address with project details and the PDF attachment.
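The Binary Converter logic can be sketched as a small helper like an n8n Code node would run. This is a minimal illustration, not the template's exact code; the `html` input and the `data` binary key are assumptions about the data shape:

```javascript
// Sketch of the Binary Converter: wrap generated HTML as base64 binary data
// named "index.html", which Gotenberg's HTML conversion route expects.
function toGotenbergItem(html) {
  return {
    json: { fileName: 'index.html' },
    binary: {
      data: {
        // n8n stores binary payloads as base64-encoded strings
        data: Buffer.from(html, 'utf8').toString('base64'),
        mimeType: 'text/html',
        fileName: 'index.html',
      },
    },
  };
}

// Inside an n8n Code node this would roughly be:
// return [toGotenbergItem($input.first().json.html)];
```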
**Customization Options**

Design Styles: The Moodboard Generator Agent automatically selects from 10 layout styles: Modern Sectional Grid, Material Board Collage, Editorial Magazine, Clean Minimalist, Split-Screen Dramatic, Asymmetric Feature, Centered Showcase, Modular Block System, Organic Flow, Layered Depth.

Image Generation Models: Replace FLUX.1-schnell with alternatives in the "Image Generator" node:
- black-forest-labs/FLUX.1-dev - Higher quality, slower
- stabilityai/stable-diffusion-xl-base-1.0 - Classic SDXL

Email Templates: Customize the email message in the "Send PDF" node to include brand messaging, next steps, support contact information, or pricing information.

**Troubleshooting**

Images Not Generating
- Verify the Hugging Face API token is valid
- Check API rate limits and quotas
- Increase the timeout in the "Image Generator" node (default: unlimited)

PDF Generation Fails
- Ensure the Gotenberg service is accessible
- Verify the HTML output contains all 12 image URLs
- Check timeout settings (default: 360s)
- Review Gotenberg logs for specific errors

Nextcloud Upload Errors
- Confirm folder creation succeeded
- Verify Nextcloud credentials and permissions
- Check available storage space
- Ensure WebDAV is enabled

Email Not Received
- Verify Gmail OAuth2 credentials
- Check spam/junk folders
- Confirm the email address is valid
- Review Gmail API quotas

**Performance Notes**
- **Average execution time**: 5-8 minutes (depends on image generation)
- **Image generation**: ~20-30 seconds per image (12 images = 4-6 minutes)
- **PDF generation**: ~30-60 seconds
- **Total data processed**: ~15-25 MB per workflow execution

**Security Considerations**
- Store all API keys in n8n credentials (never hardcode)
- Use environment variables for sensitive configuration
- Implement rate limiting on the form webhook
- Consider adding a CAPTCHA to prevent abuse
- Regularly rotate API keys and passwords
- Use HTTPS for all external communications

**License & Attribution**

This template is provided as-is for the n8n community. Feel free to modify and adapt it to your needs.
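Outside n8n, the "Image Generator" call can be approximated with a plain HTTP request. A minimal sketch, assuming the standard Hugging Face Inference API endpoint convention for the model; verify the URL and token against your account before relying on it:

```javascript
// Sketch of the request shape the "Image Generator" node sends to Hugging Face.
// The endpoint path follows the public Inference API convention (an assumption).
function buildFluxRequest(prompt, hfToken) {
  return {
    url: 'https://api-inference.huggingface.co/models/black-forest-labs/FLUX.1-schnell',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${hfToken}`, // HTTP Header Auth as configured above
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ inputs: prompt }),
    },
  };
}

// Usage (network call not executed here):
// const { url, options } = buildFluxRequest('A minimalist bedroom, warm oak...', process.env.HF_TOKEN);
// const imageBytes = await fetch(url, options).then(r => r.arrayBuffer());
```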
AI Models Used:
- Claude Sonnet 4 (Anthropic via OpenRouter)
- Google Gemini 2.5 Pro (Google)
- FLUX.1-schnell (Black Forest Labs via Hugging Face)

**Support & Contributions**

For questions or improvements, please reach out through the n8n community forum or submit issues/PRs to enhance this template.

Created by: Jameson Kanakulya
Template Version: 1.0
Last Updated: November 2025
by Oneclick AI Squad
This workflow ingests property document packages submitted via webhook or monitored cloud storage, extracts text from each file, runs Claude AI to verify legal compliance, detect missing or expired documents, and flag invalid clauses, and produces a structured validation report with remediation guidance.

How it works
- Trigger: Webhook submission or Google Drive folder watch
- Intake & Register: Logs the submission, assigns a case ID, normalises metadata
- Download Documents: Fetches each file from Drive / S3 / URL
- Extract Text: Reads PDF/DOCX content via a parser node
- Classify Document Type: Identifies contract, title, disclosure, certificate, etc.
- AI Legal Compliance Check: Claude AI validates each document against jurisdiction rules
- Aggregate Validation Results: Merges per-document findings into a case report
- Check Required Doc Checklist: Detects missing mandatory documents
- Route by Compliance Status: Branches on PASS / FAIL / REQUIRES_REVIEW
- Notify Submitter: Email with the full validation report and remediation steps
- Alert Legal Team on Slack: Flags FAIL or critical issues to the legal channel
- Create Audit Record: Writes the full report to the Google Sheets compliance log
- Generate PDF Report: Stores the formatted report back to Drive
- Return Validation Response: Sends a structured JSON result to the caller

Setup Steps
1. Import the workflow into n8n
2. Configure credentials:
   - Anthropic API (Claude AI for legal document analysis)
   - Google Drive OAuth (document intake and report storage)
   - Google Sheets OAuth (compliance audit log)
   - Slack OAuth (legal team alerts)
   - SendGrid / SMTP (submitter notification emails)
3. Set your Google Drive folder IDs for intake and output
4. Configure jurisdiction rules in the AI prompt node
5. Set the mandatory document checklist in the Checklist node
6. Activate the workflow

Sample Webhook Payload

```json
{
  "caseId": "CASE-2025-0871",
  "submitterEmail": "agent@realty.com",
  "propertyAddress": "42 Oak Street, Sydney NSW 2000",
  "transactionType": "sale",
  "jurisdiction": "NSW",
  "documents": [
    { "name": "Contract of Sale", "type": "contract", "driveFileId": "1aBcDeFgHiJkL" },
    { "name": "Title Search", "type": "title", "driveFileId": "2mNoPqRsTuVwX" }
  ]
}
```

Document Types Supported
- Contract of Sale / Purchase Agreement
- Certificate of Title / Title Search
- Vendor Disclosure Statement
- Section 32 / Vendor Statement
- Building & Pest Inspection Report
- Strata Report / Body Corporate Docs
- Zoning Certificate / Planning Certificate
- Land Tax Certificate
- Mortgage / Discharge of Mortgage
- Lease Agreement / Tenancy Documents
- Council Rates Notice
- Smoke Alarm / Safety Certificates

Features
- Multi-document batch validation per submission
- Jurisdiction-aware compliance rules (AU/UK/US configurable)
- Missing document detection against the mandatory checklist
- Expiry date validation for time-sensitive certificates
- Critical clause and red-flag detection
- Automated remediation guidance per issue
- Full audit trail in Google Sheets
- PDF validation report stored to Drive

Explore More Automation: Contact us to design AI-powered lead nurturing, content engagement, and multi-platform reply workflows tailored to your growth strategy.
by Davide
This workflow automates the process of receiving a post-call audio file and transcription from ElevenLabs, processing them, and generating a financial risk report.

Key Advantages
1. End-to-End Automation: The workflow fully automates the process from raw input (audio/transcript) to final delivery (email report), eliminating manual intervention.
2. AI-Powered Decision Making: It leverages language models to analyze qualitative interview responses, convert them into quantitative scores, and produce consistent, objective evaluations.
3. Structured Data Extraction: Automatically extracts critical business information, reducing human error and ensuring standardized outputs.
4. Scalability: The webhook-based architecture allows the system to handle large volumes of interviews in parallel without additional effort.
5. Modular & Extensible Design: Each step (audio processing, extraction, scoring, reporting) is modular, making it easy to replace models, add new analysis layers, or integrate additional services.
6. Professional Output Generation: Generates clean, ready-to-send HTML reports compatible with email clients, improving communication with stakeholders.
7. Data Traceability & Storage: Audio files are stored in Google Drive, ensuring auditability and easy retrieval of the original data.
8. Consistency & Standardization: The evaluation logic ensures that all interviews are assessed using the same criteria, reducing subjective bias.

How it works

Receiving and Routing Data: The workflow starts with a Webhook that listens for incoming data from ElevenLabs. A Switch node then routes the data based on the body.type field:
- Post Call Audio: If the type is post_call_audio, the workflow processes the audio.
- Post Call Transcription: If the type is post_call_transcription, the workflow processes the transcription.

Audio Processing Path: For an audio file, a Code node extracts the Base64 audio data and the conversation_id from the webhook payload.
It converts the Base64 string into a binary audio buffer (MP3). This binary data is then passed to a Google Drive node, which uploads the file to a specified folder (the user's root folder). Transcription Processing Path: For a transcription, a Set node extracts the transcript array from the payload. A subsequent Code node processes this array, combining all messages from the conversation into a single, readable full text string, prefixed by the speaker's role. Data Enrichment and Analysis: The full transcript text is then used by two nodes in parallel: Information Extractor: This LangChain node uses an OpenAI model (gpt-5-mini) to extract structured data from the text, specifically the company_name, the CEO's name, the address, and the vat_number. Calculate Rating: This LangChain node uses another OpenAI model to perform a quantitative evaluation. It follows a provided system prompt to assign a numerical score, a final verdict (POSITIVE/NEUTRAL/NEGATIVE), and a reason based on the interviewee's responses. Its output is parsed by a Structured Output Parser to ensure it is valid JSON. Report Generation and Delivery: The outputs from the Information Extractor and Calculate Rating nodes are merged into a single data object. This object is passed to the Financial Report Generator, a final LangChain node that acts as a professional analyst. Using the merged data (company details, score, verdict, etc.), it generates a polished, formatted HTML email body. Finally, a Gmail node sends this HTML report as an email to the specified recipient. Set up steps Configure Credentials: OpenAI: Set up an OpenAI API credential for the three language model nodes. Ensure it has access to the gpt-5-mini model. Google Drive: Configure OAuth2 credentials for the "Upload audio" node to allow file uploads. Gmail: Set up OAuth2 credentials for the "Send report" node. Configure Webhook: Note the webhook ID and path. 
This URL must be configured in ElevenLabs to send post-call data to this n8n instance. Update Node Parameters: Google Drive: Modify the "Upload audio" node if the target folder (folderId) is not the root. Information Extractor: The extraction attributes (company, name, address, VAT) are pre-configured. No changes are needed unless the target data fields change. Gmail: Update the Gmail node with the recipient email address (xxx@xxx.com) and verify the email subject line formatting. Activate Workflow: Once all credentials and parameters are set, toggle the workflow from active: false to active: true in the n8n editor to start listening for webhook calls. Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
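The transcript-combining Code node described in "How it works" can be sketched roughly as follows. The `role` and `message` field names are assumptions about the ElevenLabs transcript array; verify them against your actual webhook payload:

```javascript
// Sketch of the Code node that flattens the transcript array into one
// readable string, each turn prefixed by the speaker's role.
function buildFullText(transcript) {
  return transcript
    .map(turn => `${turn.role}: ${turn.message}`)
    .join('\n');
}

const transcript = [
  { role: 'agent', message: 'How has revenue developed this year?' },
  { role: 'user', message: 'We grew about 12% year over year.' },
];
const fullText = buildFullText(transcript);
// fullText now feeds the Information Extractor and Calculate Rating nodes
```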
by Dean Pike
LinkedIn URL → Scrape → Match → Screen → Decide, all automated

This workflow automatically processes candidate LinkedIn profiles shared via Telegram, intelligently matches them to job descriptions, performs AI-powered screening analysis, and sends actionable summaries to your team in Telegram.

Good to know
- Handles LinkedIn profile scraping via the Apify API (extracts full profile data including experience, education, skills)
- Built-in spam prevention: limits users to 3 LinkedIn profile submissions
- Two-stage JD matching: prioritizes the role mentioned in the candidate's Telegram message, falls back to LinkedIn profile analysis if needed
- Uses the Google Gemini API for AI screening (generous free tier and rate limits, typically enough to avoid paying for API requests - check the latest pricing at Google AI Pricing and the rate limits documentation)
- Automatic polling mechanism checks Apify extraction status up to 10 times (15-second intervals)
- Complete audit trail logged in Google Sheets with unique submission IDs

Who's it for: Hiring teams and recruiters who want to streamline first-round screening for candidates who share LinkedIn profiles directly. Perfect for companies accepting applications via messaging platforms (Telegram, WhatsApp, etc.), especially useful for tech-savvy audiences and remote/global hiring.
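The spam-prevention check above amounts to counting a user's prior rows in the tracking sheet. A hypothetical minimal version (the real workflow uses the "Get All Rows Matching Telegram Username" and "Spam Check" nodes; the column name is taken from the sheet setup below):

```javascript
// Sketch: allow a submission only if the user has fewer than
// `maxSubmissions` prior rows logged in the Google Sheet.
function isUnderSubmissionLimit(rows, username, maxSubmissions = 3) {
  const count = rows.filter(r => r['Telegram Username'] === username).length;
  return count < maxSubmissions;
}

const rows = [
  { 'Telegram Username': 'alice' },
  { 'Telegram Username': 'alice' },
  { 'Telegram Username': 'bob' },
];
// alice has 2 prior submissions, so one more is still allowed;
// after a third logged row she would be blocked
```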
How it works Telegram bot receives message containing LinkedIn profile URL from candidate Validates URL format and checks spam prevention (max 3 submissions per Telegram username) Sends confirmation message to candidate and notifies internal talent team via Telegram group Extracts clean LinkedIn URL and initiates Apify scraping job Polls Apify API up to 10 times (15-second intervals) until profile extraction completes AI agent matches candidate to best-fit job description by analyzing Telegram message context first (if candidate mentioned a role), or LinkedIn profile content as fallback (selects up to 3 potential JD matches) If multiple JDs matched, second AI agent selects the single best fit based on detailed profile analysis AI recruiter agent analyzes LinkedIn profile against selected JD and generates structured screening report (strengths, weaknesses, risk/reward factors, overall fit score 0-10 with justification) Logs complete analysis to Google Sheets tracker with unique submission ID Sends formatted summary to Telegram group with candidate details, matched JD, and overall fit score Requirements Telegram Bot Token (Create bot via @BotFather) Apify account with API token (Sign up for free tier) Google Drive account (OAuth2) Google Sheets account (OAuth2) Google Gemini API key (Get free key here) Google Drive folder for Job Descriptions (as PDFs or Google Docs) Telegram group for internal talent team notifications How to set up Create Telegram bot and internal Telegram chat group with new bot: Message @BotFather on Telegram Send /newbot and follow instructions to create your bot Save the API token provided Create Telegram group chat and invite your new bot + invite the @GetIDs bot Note down the group chat ID (How to get group chat ID) Setup Apify: Sign up at Apify Get your API token from Settings Note: Free tier includes sufficient scraping credits for testing and production ($0.01 per successful LinkedIn profile enriched, a free monthly limit of $5.00) - 
LinkedIn profile scraper "actor" details Create Google Sheet: Create new sheet named "LinkedIn Profile AI Candidate Screening" Add columns: Submission ID, Date, LinkedIn Profile URL, First Name, Last Name, Email (if known), Telegram Username, Strengths, Weaknesses, Risk Factor, Reward Factor, JD Match, Overall Fit, Justification Copy the spreadsheet ID from URL Setup Google Drive folder: Create folder named "Job Descriptions" Upload your JD files (PDFs or Google Docs) with clear, descriptive filenames Copy the folder ID from URL Configure workflow nodes: In "Receive Telegram Msg to Recruiter Bot" node: Add Telegram API credentials In "Extract LinkedIn Profile Information" node: Replace YOUR_APIFY_API_TOKEN with your Apify token In "Check LinkedIn Profile Extraction Status" node: Replace YOUR_APIFY_API_TOKEN with your Apify token In "Get Fully Extracted LinkedIn Profile Data" node: Replace YOUR_APIFY_API_TOKEN with your Apify token In "Access JD Files" node: Update folder ID to your "Job Descriptions" folder In "Get All Rows Matching Telegram Username" node: Select your Google Sheet In "Add Candidate Analysis in GSheet" node: Select your Google Sheet and verify column mappings In "Send Msg to Internal Talent Group" node: Update chat ID to your Telegram group chat ID In "Send Review Completed Msg to Talent Group" node: Update chat ID and Google Sheet URL Add your company description: In "JD Matching Agent" system message: Replace company description with your details In "Detailed JD Matching Agent" system message: Replace company description with your details In "Recruiter Scoring Agent" system message: Update company description Test the workflow: Send a LinkedIn profile URL to your bot from Telegram Monitor execution to ensure all nodes run successfully Check Google Sheets for logged results Activate workflow Customizing this workflow Change spam limits: Edit "Spam Check: Sent <4 LinkedIn Profiles?" 
node to adjust maximum submissions (currently 3) Adjust polling attempts: Edit "Checked 10x for LinkedIn Profile Data?" node to change maximum polling attempts (currently 10) or modify wait time in "Wait for LinkedIn Profile" node (currently 15 seconds) Change JD matching logic: Edit "JD Matching Agent" node prompt to adjust how LinkedIn profiles are matched to roles (e.g., weight current role vs. overall experience) Modify screening criteria: Edit "Recruiter Scoring Agent" node system message to focus on specific qualities (culture fit, leadership potential, technical depth, industry experience, etc.) Add more messaging platforms: Add nodes to support WhatsApp, Discord, or other messaging platforms using similar URL-based triggers Customize Telegram messages: Edit notification nodes to change formatting, add emojis, or include additional candidate data Auto-proceed logic: Add IF node after screening to auto-proceed candidates with fit score above threshold (e.g., 8+/10) and trigger different notification paths Add candidate responses: Connect nodes to automatically message candidates back via Telegram (confirmation, rejection, interview invite) Add interview scheduling: For approved candidates, send Telegram message with Cal.com or Calendly link so they can book their interview Enrich with additional data: Add nodes to cross-reference candidate data with other sources (GitHub, Twitter/X, company websites) Multi-language support: Add translation nodes to support candidates submitting profiles in different languages Add human approval step: Create buttons in Telegram group messages for instant Approve/Reject decisions that update Google Sheets Pro tip: Add your Telegram bot to your company's careers page with instructions like: "Want fast-track screening? 
Share your LinkedIn profile with our AI recruiter: @YourBotName" Troubleshooting Telegram bot not responding: Ensure bot token is correct in "Receive Telegram Msg to Recruiter Bot" node, and users have sent /start to your bot at least once "LinkedIn profile URL invalid" error: Check that candidates are sending full URLs in format https://www.linkedin.com/in/username (not shortened links or text without URL) Apify extraction failing: Verify Apify API token is correctly set in all three HTTP Request nodes ("Extract LinkedIn Profile Information", "Check LinkedIn Profile Extraction Status", "Get Fully Extracted LinkedIn Profile Data") LinkedIn extraction timeout: Increase polling attempts in "Checked 10x for LinkedIn Profile Data?" node (currently 10) or increase wait time in "Wait for LinkedIn Profile" node (currently 15 seconds) Spam check blocking valid users: Check "Get All Rows Matching Telegram Username" node is pointing to correct Google Sheet, and adjust limit in "Spam Check: Sent <4 LinkedIn Profiles?" node if needed JD matching returns no results: Check "Access JD Files" node folder ID points to your Job Descriptions folder, and JD files are named clearly (e.g., "Marketing Director JD.pdf") JD matching is not relevant for my company: Update the "Company Description" in the System Messages in all three AI agent nodes ("JD Matching Agent", "Detailed JD Matching Agent", "Recruiter Scoring Agent") "Can't find matching JD": Ensure candidate's Telegram message mentions role name OR their LinkedIn profile clearly indicates relevant experience for available JDs Google Sheets errors: Verify sheet name is "LinkedIn Profile AI Candidate Screening" and column headers exactly match workflow expectations (Submission ID, Date, LinkedIn Profile URL, First Name, Last Name, etc.) 
Telegram group notifications not appearing: Verify chat ID is correct in "Send Msg to Internal Talent Group" and "Send Review Completed Msg to Talent Group" nodes (use negative number for group chats, e.g., -4954246611) Missing candidate data in Google Sheets: LinkedIn profile may be incomplete - verify Apify successfully extracted data by checking "Get Fully Extracted LinkedIn Profile Data" node output Loop counter not working: Check "Restore Loop Counter" code node references correct node names ("Checked 10x for LinkedIn Profile Data?" and "Initialize Loop Counter to Poll for Completion") 401/403 API errors: Re-authorize all OAuth2 credentials (Google Drive, Google Sheets) and verify Apify and Telegram API tokens are valid AI analysis quality issues: Edit system prompts in "JD Matching Agent", "Detailed JD Matching Agent", and "Recruiter Scoring Agent" nodes to refine screening criteria and provide more context about your hiring needs Gemini API rate limit errors: Check your usage at Google AI Studio and consider upgrading to paid tier if exceeding free tier limits (see rate limits documentation) Sample Outputs Google Sheets - LinkedIn AI Candidate Screening - sample Telegram messages between AI recruiter bot and job applicant Telegram messages from AI recruiter bot in internal group chat
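The clean-URL extraction step ("Extracts clean LinkedIn URL") can be sketched with a regular expression. This is an illustrative pattern, not the template's exact one:

```javascript
// Sketch: pull a canonical linkedin.com/in/ profile URL out of a
// free-form Telegram message, dropping any trailing query string.
function extractLinkedInUrl(message) {
  const match = message.match(/https?:\/\/(?:www\.)?linkedin\.com\/in\/[A-Za-z0-9\-_%]+/);
  return match ? match[0] : null;
}

extractLinkedInUrl('Applying for Marketing Director: https://www.linkedin.com/in/jane-doe?utm=x');
// → 'https://www.linkedin.com/in/jane-doe' (the "?utm=x" part is excluded by the pattern)
extractLinkedInUrl('no link here'); // → null, which triggers the "URL invalid" reply
```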
by Rahul Joshi
Description

Automatically generate and distribute detailed End-of-Day (EOD) reports combining task progress from ClickUp and opportunity data from GoHighLevel. This workflow uses AI to analyze daily performance, summarize key metrics, identify blockers, and deliver polished reports directly to Slack, Email, and Google Drive.

What This Template Does
- Triggers automatically every weekday at 6:00 PM (Mon-Fri).
- Fetches all completed ClickUp tasks and won GoHighLevel opportunities for the day.
- Merges and transforms both datasets into a unified structure.
- Uses Azure OpenAI GPT-4 to analyze performance and generate structured summaries.
- Formats three output versions: Slack (Markdown), Email (HTML), and Google Drive (Text).
- Routes and sends reports automatically to connected channels.
- Uploads the generated text report to Google Drive with timestamped filenames.

Key Benefits
- Saves time by automating daily performance reporting.
- Unifies task and deal data into a single AI-generated summary.
- Provides real-time visibility into productivity and outcomes.
- Delivers beautifully formatted, channel-specific reports.
- Maintains historical reports in Google Drive for reference.
- Helps managers identify wins, blockers, and next steps quickly.

Features
- Automated scheduling via cron (Mon-Fri, 6 PM).
- ClickUp task and GHL opportunity integration for daily data sync.
- AI-powered analysis for contextual, actionable summaries.
- Dynamic formatting for Slack, Email, and Drive outputs.
- Parallel routing for simultaneous delivery across platforms.
- No manual steps: runs fully hands-free after setup.

Requirements
- ClickUp OAuth2 credentials for task retrieval.
- GoHighLevel OAuth2 credentials for deal data.
- Azure OpenAI GPT-4 API credentials.
- Slack Bot credentials for message posting.
- SMTP (Gmail/Outlook) credentials for email reports.
- Google Drive OAuth2 credentials for report upload.
Target Audience
- Sales, marketing, and operations teams tracking daily performance.
- Project managers monitoring team productivity and blockers.
- Client success teams summarizing EOD outcomes for leadership.
- Business automation teams seeking end-of-day visibility.

Step-by-Step Setup Instructions
1. Connect ClickUp, GoHighLevel, Slack, Gmail/SMTP, and Google Drive credentials.
2. Set your team, space, folder, and list IDs in the ClickUp node.
3. Update your Slack channel ID in the Slack node configuration.
4. Configure your email sender and recipients in the email node.
5. (Optional) Modify the cron expression for different reporting times.
6. Test the workflow manually once, then activate it for automated EOD execution.
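The merge/transform step can be sketched as a small mapping function. Field names such as `name`, `assignee`, and `monetaryValue` are illustrative assumptions about the ClickUp and GoHighLevel responses, not verified API shapes:

```javascript
// Sketch: normalize completed tasks and won deals into one unified list
// so the AI summarizer receives a single consistent structure.
function unifyDailyData(clickupTasks, ghlOpportunities) {
  const tasks = clickupTasks.map(t => ({
    source: 'clickup',
    title: t.name,                                   // assumed ClickUp field
    detail: `Completed by ${t.assignee ?? 'unassigned'}`,
  }));
  const deals = ghlOpportunities.map(o => ({
    source: 'gohighlevel',
    title: o.name,                                   // assumed GHL field
    detail: `Won, value ${o.monetaryValue ?? 0}`,
  }));
  return [...tasks, ...deals];
}

const unified = unifyDailyData(
  [{ name: 'Ship landing page', assignee: 'Dana' }],
  [{ name: 'Acme retainer', monetaryValue: 5000 }],
);
// unified has 2 entries, each tagged with its source system
```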
by Sona Labs
Sona-Powered AI Sales Research & Personalized Email Automation

Overview

Automatically research B2B leads and generate personalized outreach emails by reading prospects from Google Sheets, enriching with company data from Sona Enrich, analyzing insights with AI, and creating custom emails, so you can scale personalized outreach to target accounts. You'll be able to automatically enrich company data for target accounts, use AI to identify pain points and opportunities, generate personalized email copy, and sync everything back to your sheet with ready-to-send Gmail compose links.

What This Workflow Does
- Smart Lead Processing - Reads leads from Google Sheets and filters unprocessed contacts
- Deep Company Intelligence - Enriches each lead using Sona's API (industry, tech stack, revenue, employee count, social profiles)
- AI-Powered Research - GPT-4 analyzes company data to identify pain points, growth opportunities, and personalization hooks
- Email Generation - Creates 120-150 word personalized emails with curiosity-driven subject lines
- Automated Sync - Updates Google Sheets with research insights and one-click Gmail compose links

Key Features
- **Structured AI Output** - Consistent, high-quality research and copy generation
- **Zero Manual Work** - Processes 20-50 leads per hour completely hands-free
- **Gmail Integration** - Pre-filled send links for instant outreach
- **Progress Tracking** - Real-time status updates in Google Sheets

Perfect For
- Sales teams doing cold outreach
- SDRs needing personalized emails at scale
- Agencies managing client prospecting
- Founders building their pipeline

What You'll Need

1. Sona API Key
- Get yours at sonalabs.com
- Provides company data enrichment
- Add to the HTTP Request node header: x-api-key: YOUR_KEY

2. OpenAI API Key
- Get from platform.openai.com
- Uses GPT-4.1-mini for research and email generation
- Add credentials in n8n

3.
Google Sheets Setup

Create a spreadsheet with these columns:

- **Input columns:** Website Domain, Company Name, Contact Name, Email, Industry
- **Status column:** Research Status (leave empty for new leads)
- **Auto-populated:** Pain Points, Key Insight, Email Subject, Email Body, Send Email Link, Generated Date, Sent Status

Google Sheets API

- Enable it in the Google Cloud Console
- Set up OAuth2 with the spreadsheets permission
- Add your spreadsheet ID to the workflow nodes

Setup Instructions

1. Import the workflow into n8n
2. Add credentials: Sona API key (HTTP Request node), OpenAI API credentials, Google Sheets OAuth2
3. Update the spreadsheet ID in all Google Sheets nodes
4. Customize the AI prompts (optional) to match your offering
5. Test with 2-3 leads before running the full list
6. Execute the workflow - it processes leads automatically in batches

Expected Output

Each processed lead gets:

- **Pain points** (3-5 identified challenges)
- **Growth opportunities** (2-3 actionable insights)
- **Personalization hooks** (3-4 talking points)
- **Email subject line** (max 8 words, curiosity-driven)
- **Email body** (120-150 words, consultative tone)
- **Gmail compose link** (one-click to send)
- **Fit score** (High/Medium/Low)

Processing time: 30-60 seconds per lead

How It Works

Step 1: Data Input & Filtering - Reads all leads from Google Sheets and filters out already-processed leads (those with a value in the "Research Status" column).
Step 2: Company Data Enrichment

- Updates status to "Pending" in Google Sheets
- Searches the Sona database using domain or email
- A 5-tier smart matching algorithm finds the best company match
- Retrieves firmographic data and technology stack

Step 3: AI Company Research

GPT-4.1-mini analyzes company data to generate:

- Specific pain points based on industry, size, and tech stack
- Growth opportunities and market positioning
- Personalization hooks from the company description
- A recommended outreach tone and CTA
- A one-liner insight for the email opening

Step 4: Personalized Email Generation

The AI crafts a cold email following best practices:

- Curiosity-driven subject line (max 8 words)
- Opens with a personalization hook showing research
- References ONE specific pain point
- Focuses on tangible outcomes (not product features)
- Natural CTA without being pushy
- Professional but conversational tone

Step 5: Data Output & Loop

- Formats all data for Google Sheets
- Creates a Gmail compose link with pre-filled content
- Updates the sheet with complete results
- Sets status to "Completed"
- Waits 2 seconds, then processes the next lead

Pro Tips

- **Start small:** Test with 5-10 leads to validate personalization quality
- **Review first emails:** Adjust the AI prompts if the tone needs calibration
- **Clean your data:** Better input domains = better Sona matches
- **Monitor fit scores:** Focus manual review on High/Medium fits
- **Use the status column:** Easily re-run the workflow for new leads only
- **Connect your CRM:** Use webhooks to push data to Salesforce/HubSpot

Use Cases

- Sales Team Automation - Process 100+ leads overnight with personalized research and emails ready by morning.
- Agency Client Work - Deliver custom prospecting campaigns with unique emails for each client's target accounts.
- Founder Outreach - Build pipeline systematically with AI-researched, personalized emails at scale.
- SDR Productivity - Give SDRs pre-researched talking points and draft emails to speed up their workflow 10x.
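The Gmail compose link built in Step 5 can be sketched as below. In the workflow this logic would live in an n8n Code node rather than standalone Python, and the exact field names are assumptions based on the sheet columns described above:

```python
from urllib.parse import quote

def gmail_compose_link(to: str, subject: str, body: str) -> str:
    """Build a pre-filled Gmail compose URL (the one-click send link)."""
    base = "https://mail.google.com/mail/?view=cm&fs=1"
    # quote() percent-encodes the recipient, subject, and body so the
    # generated email survives being passed as URL query parameters.
    return f"{base}&to={quote(to)}&su={quote(subject)}&body={quote(body)}"

link = gmail_compose_link(
    "jane@acme.com",
    "Quick question about Acme's onboarding",
    "Hi Jane,\n\nNoticed Acme recently expanded its team...",
)
print(link)
```

Opening the resulting URL in a browser where Gmail is signed in lands on a compose window with recipient, subject, and body already filled, which is what makes the sheet's "Send Email Link" column one-click.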
Expected Results

- **Email personalization:** 10x better than templates
- **Time saved:** 5-10 minutes per lead → 30 seconds automated
- **Response rates:** 2-3x higher with AI-researched insights
- **Scalability:** Process 50-100 leads per day hands-free

Customization Options

- **Change AI model:** Swap GPT-4.1-mini for GPT-4 or other models
- **Adjust email length:** Modify the prompt to generate shorter/longer emails
- **Add more enrichment:** Chain additional API calls (Clearbit, Apollo, etc.)
- **Multi-language:** Update the prompts for outreach in other languages
- **Custom tone:** Adjust the system prompts for industry-specific voice
- **Webhook triggers:** Replace the manual trigger with scheduled runs or form submissions

Troubleshooting

- No Sona data found? Verify the API key is correct and check the domain format (remove http://, trailing slashes). The fallback uses the first search result if there is no exact match.
- AI output not formatted correctly? The Structured Output Parser ensures valid JSON; check your OpenAI API key and model availability.
- Google Sheets not updating? Verify the OAuth2 credentials are connected, check that the spreadsheet ID matches your sheet, and ensure column names match exactly (case-sensitive).
- Rate limits? Sona: 3-second delay between requests (built-in). OpenAI: adjust the batch size or add longer waits. Google Sheets: no limit for standard usage.

Template Information

- **Category:** Sales & Marketing
- **Difficulty:** Intermediate
- **Setup Time:** 5-10 minutes
- **Run Time:** 30-60 seconds per lead
- **Cost:** Pay-per-use (Sona API + OpenAI tokens)
- **Updated:** December 2025
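The domain-cleanup advice in the troubleshooting notes ("remove http://, trailing slashes") can be sketched as a small helper. This preprocessing step is an illustrative assumption, not code shipped with the template:

```python
def clean_domain(raw: str) -> str:
    """Normalize a website value into a bare domain for enrichment lookups."""
    domain = raw.strip().lower()
    # Strip the URL scheme if present.
    for prefix in ("https://", "http://"):
        if domain.startswith(prefix):
            domain = domain[len(prefix):]
    # Drop a leading www. subdomain.
    if domain.startswith("www."):
        domain = domain[4:]
    # Discard any path, which also removes a trailing slash.
    return domain.split("/")[0]

print(clean_domain("https://www.Example.com/about/"))  # → example.com
```

Running values from the Website Domain column through a cleanup like this before the Sona lookup improves match rates, since "better input domains = better Sona matches."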
by Avkash Kakdiya
How it works

The workflow detects incoming job-application emails, extracts resumes, and parses them for AI analysis. It evaluates each candidate against three open roles and assigns a fit score with structured reasoning. Low-scoring applicants are stored for review, while strong candidates move into an automated scheduling flow. The system checks availability on the next business day, books the slot, sends a confirmation email, and records all details in Airtable.

Step-by-step

Detect and collect job-application data

- **Gmail Trigger1** - Monitors the inbox for all new emails.
- **Message a model2** - Classifies whether the email is a job application.
- **If2** - Continues only when the AI result is YES.
- **Get a message1** - Fetches the full message and attachments.
- **Upload file1** - Uploads the resume to Google Drive.
- **Extract from File1** - Converts the PDF resume into text.

Analyze the resume and evaluate fit

- **Available Positions1** - Defines the three open roles.
- **Message a model3** - Produces recommended role, fit score, strengths, gaps, skills, and reasoning.
- **If3** - Routes candidates based on fit_score ≥ 8.
- **Create a record3** - Stores lower-scoring applicants in Airtable.
- **Get Next Business Day1** - Calculates the schedule window for qualified candidates.

Check availability on the next business day

- **AI Agent1** - Orchestrates the availability search using calendar nodes.
- **Get Events1** - Retrieves events for the target day.
- **Check Availability1** - Evaluates free 1-hour slots.
- **OpenAI Chat Model2** - Reasoning engine for the agent.
- **Structured Output Parser1** - Returns clean JSON with start_time and end_time.
- **OpenAI Chat Model3** - Supports structured parsing.

Schedule the interview and notify the candidate

- **Create an event1** - Books the interview in Google Calendar.
- **Send a message1** - Sends an HTML confirmation email to the candidate.
- **Create a record2** - Saves the shortlisted candidate and interview data in Airtable.

Why use this?
- Removes manual screening by automating email intake and resume parsing.
- Ensures consistent AI-based role matching and scoring.
- Books interviews automatically using real calendar availability.
- Keeps all applicant and scheduling data organized in Airtable.
- Provides a fully hands-off, end-to-end hiring pipeline.
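The "Get Next Business Day1" step can be sketched like this. The weekend-skipping logic is an assumption; the actual node may additionally account for holidays:

```python
from datetime import date, timedelta

def next_business_day(today: date) -> date:
    """Return the next weekday (Mon-Fri), skipping Saturday and Sunday."""
    day = today + timedelta(days=1)
    while day.weekday() >= 5:  # weekday(): 5 = Saturday, 6 = Sunday
        day += timedelta(days=1)
    return day

print(next_business_day(date(2025, 1, 3)))  # Friday → 2025-01-06 (Monday)
```

The returned date becomes the target day for the calendar agent, which then searches that day's events for a free 1-hour interview slot.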
by Cheng Siong Chin
How It Works

This workflow automates end-to-end candidate evaluation for HR teams and recruiters overwhelmed by high-volume hiring. Designed for talent acquisition professionals, hiring managers, and HR operations, it solves the challenge of manually screening resumes, validating qualifications, and coordinating interview feedback across multiple stakeholders. The system triggers on new applications, extracts CV content, prepares structured candidate data, and deploys specialized AI agents for comprehensive evaluation: the Signal Agent validates credentials, the CV Verification Agent confirms qualifications, the Trust Assessment Agent evaluates cultural fit, and the Experience Agent analyzes career trajectory. The Orchestrator Agent synthesizes insights, checks validation results, and routes decisions: sending approval emails for qualified candidates, rejection notices for mismatches, and logging all outcomes to Google Calendar and Sheets. By automating screening with multi-dimensional AI analysis, organizations reduce time-to-hire by 70%, eliminate bias, ensure consistent evaluation criteria, and free recruiters to focus on relationship-building with top talent.

Setup Steps

1. Connect the webhook/form trigger to your applicant tracking system or career portal
2. Configure the CV extraction node with document parsing API credentials
3. Add OpenAI API keys to all AI agent nodes
4. Define evaluation criteria in each agent's prompt
5. Link Gmail credentials for the approval and rejection email templates
6. Connect the Google Calendar API for interview scheduling automation

Prerequisites: ATS integration or career portal webhook access, OpenAI API account

Use Cases: high-volume recruitment screening, technical role qualification validation

Customization: modify agent prompts for role-specific criteria, adjust scoring thresholds for pass/fail decisions

Benefits: reduces screening time by 70%, eliminates unconscious bias through standardized evaluation
by Ranjan Dailata
Who this is for

This workflow is designed for teams that collect feedback or survey responses via Jotform and want to automatically:

- Analyze the sentiment (positive, neutral, negative) of each response.
- Extract key topics and keywords from qualitative text.
- Generate AI summaries and structured insights.
- Store results in Google Sheets and n8n DataTables for easy reporting and analysis.

Use Cases

- Customer experience analysis
- Market research & survey analysis
- Product feedback clustering
- Support ticket prioritization
- AI-powered blog or insight generation from feedback

What this workflow does

This n8n automation connects Jotform, Google Gemini, and Google Sheets to turn raw responses into structured insights with sentiment, topics, and keywords.

Pipeline Overview

Jotform → Webhook → Gemini (Topics + Keywords) → Gemini (Sentiment) → Output Parser → Merge → Google Sheets

Jotform Trigger

- Captures each new submission from your Jotform (e.g., a feedback or survey form).
- Extracts raw fields ($json.body.pretty) such as name, email, and response text.

Format Form Data (Code Node)

- Converts the Jotform JSON structure into a clean string for AI input.
- Ensures the text is readable and consistent for Gemini.

Topics & Keyword Extraction (Google Gemini + Output Parser)

Goal: Identify the main themes and important keywords from responses.

```json
{
  "topics": [
    {
      "topic": "Product Features",
      "summary": "Users request more automation templates.",
      "keywords": ["AI templates", "automation", "workflow"],
      "sentiment": "positive",
      "importance_score": 0.87
    }
  ],
  "global_keywords": ["AI automation", "developer tools"],
  "insights": ["Developers desire more creative, ready-to-use AI templates."],
  "generated_at": "2025-10-08T10:30:00Z"
}
```

Sentiment Analyzer (Google Gemini + Output Parser)

Goal: Evaluate overall emotional tone and priority.
```json
{
  "customer_name": "Ranjan Dailata",
  "customer_email": "ranjancse@gmail.com",
  "feedback_text": "Please build more interesting AI automation templates.",
  "sentiment": "positive",
  "confidence_score": 0.92,
  "key_phrases": ["AI automation templates", "developer enablement"],
  "summary": "Customer requests more AI automation templates to boost developer productivity.",
  "alert_priority": "medium",
  "timestamp": "2025-10-08T10:30:00Z"
}
```

Merge + Aggregate

- Combines the topic/keyword extraction and sentiment output into a single structured dataset.
- Aggregates both results for unified reporting.

Persist Results (Google Sheets)

- Writes the combined output into your connected Google Sheet.
- Two columns are recommended: feedback_analysis (sentiment + summary JSON) and topics_keywords (extracted topics + keywords JSON).
- Enables easy visualization, filtering, and reporting.

Visualization (Optional)

Add Sticky Notes or a logo image node in your workflow to visually describe sections (e.g., "Sentiment Analysis", "Topic Extraction") or to embed a brand logo.

Example AI Output (Combined)

```json
{
  "feedback_analysis": {
    "customer_name": "Ranjan Dailata",
    "sentiment": "positive",
    "summary": "User appreciates current templates and suggests building more advanced AI automations.",
    "key_phrases": ["AI automation", "developer templates"]
  },
  "topics_keywords": {
    "topics": [
      {
        "topic": "AI Template Expansion",
        "keywords": ["AI automation", "workflow templates"],
        "sentiment": "positive",
        "importance_score": 0.9
      }
    ],
    "global_keywords": ["automation", "AI development"]
  }
}
```

Setup Instructions

Prerequisite: if you are new to Jotform, sign up first. For demonstration purposes, this example uses Jotform's prebuilt New Customer Registration Form, but any form submission will work.

Step 0: Local n8n (Optional)

If you are using a local n8n instance, set up ngrok with: ngrok http 5678

Use the generated public URL as the base of your Webhook URL for the Jotform integration.
Step 1: Configure the Webhook

- Copy the Webhook URL generated by n8n (e.g., /webhook-test/f3c34cda-d603-4923-883b-500576200322). You can copy the URL by double-clicking the Webhook node.
- If you are running the workflow from your local machine, replace the base URL with the ngrok URL from Step 0.
- In Jotform, go to your form → Settings → Integrations → Webhooks and paste this URL.
- Now every new form submission will trigger the n8n workflow.

Step 2: Connect Google Gemini

- Create a Google Gemini API credential in n8n.
- Select the model models/gemini-2.0-flash-exp.

Step 3: Create Data Storage

Create a DataTable named JotformFeedbackInsights with columns:

- feedback_analysis (string)
- topics_keywords (string)

Step 4: Connect Google Sheets

- Add credentials under Google Sheets OAuth2.
- Link to your feedback tracking sheet.

Step 5: Test the Workflow

- Submit a form via Jotform.
- Check the results: the AI nodes return structured JSON, and the Google Sheet updates with new records.

Customization Tips

Change the Prompt - You can modify the topic extraction prompt to highlight specific themes:

You are a research analyst. Extract main topics, keywords, and actionable insights from this feedback: {{ $json.body }}

Extend the Output Schema - Add more fields like:

```json
{
  "suggested_blog_title": "",
  "tone": "",
  "recommendations": []
}
```

Then update your DataTable or Sheets schema accordingly.

Integration Ideas

- Send sentiment alerts to Slack for high-priority feedback.
- Push insights into Notion, Airtable, or HubSpot.
- Generate weekly reports summarizing trends across all submissions.

Summary

This workflow turns raw Jotform submissions into actionable insights using Google Gemini AI: extracting topics, keywords, and sentiment while automatically logging everything to Google Sheets.
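The "Format Form Data" Code node described above can be sketched as follows. In n8n this would be JavaScript inside a Code node, and the shape of the pretty field is an assumption based on typical Jotform payloads:

```python
def format_form_data(body: dict) -> str:
    """Flatten Jotform's submission payload into a clean string for the AI prompt."""
    # Jotform's "pretty" field is a single comma-separated string of
    # label:value pairs, e.g. "Name:Jane, Email:jane@x.com, Feedback:..."
    pretty = body.get("pretty", "")
    lines = [part.strip() for part in pretty.split(",") if part.strip()]
    # One field per line keeps the prompt readable and consistent for Gemini.
    return "\n".join(lines)

sample = {"pretty": "Name:Jane Doe, Email:jane@x.com, Feedback:More AI templates please"}
print(format_form_data(sample))
```

A real implementation would need to handle commas inside answer text (e.g. by parsing the raw answers object instead of the pretty string), but this shows the normalization the node performs before the Gemini calls.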
by Gracewell
Who Is This For?

This workflow is designed for educators, universities, examination departments, and EdTech institutions that need a faster, smarter, and standardized way to prepare exam question papers.

What Problem Does This Solve?

Creating balanced, outcome-based question papers can take hours or even days of manual effort. Faculty often struggle to:

- Ensure syllabus coverage across units
- Maintain Bloom's Taxonomy alignment
- Keep a consistent difficulty balance
- Format papers in institution-specific templates

How it works

This workflow automatically generates an exam question paper based on syllabus topics submitted via a form and sends it to the entered email address. Here's the flow in simple steps:

1. Form Submission - A student or faculty member fills out a form with the subject code, syllabus topics, and their email.
2. AI Question Generation - The workflow passes the syllabus to AI agents (Part A with 2 marks, Part B with 13 marks, and Part C with 14 marks) to create question sets. The marks and the number of questions generated can be customized as needed.
3. Merging Questions - All AI-generated questions are combined into a single structured document.
4. Format into HTML - The questions are formatted into a clean HTML exam paper (this can also be extended to PDF).
5. Send by Email - The formatted exam paper is sent to the user's email (with the option to CC/BCC).

Set up steps

1. Connect Accounts - Connect your OpenAI (or other LLM) credentials for AI-powered question generation, and your Gmail (or preferred email service) to send emails.
2. Prepare Form - Create an n8n form trigger with the required fields: subject with code; syllabus for Unit 1, 2, 3...; email to receive the paper.
3. Customize Question Generation - Modify the AI prompts for Parts A, B, and C to fit your syllabus style (e.g., 2-mark, 13-mark, 14-mark).
4. Format the Exam Paper - Adjust the HTML template to match your institution's exam paper layout.
5. Test & Deploy - Submit a test form entry.
Check the received email to ensure the formatting looks good, then deploy the workflow to production for real usage. Need help customizing? Contact me on LinkedIn.
by Milo Bravo
Email Sentiment Router for Event Sales Leads

Who is this for?

Event organizers, conference managers, and sales teams drowning in sponsor/exhibitor/partner emails who need zero-drop leads plus real-time pipeline analytics.

Key nodes: Gmail Trigger, Google Gemini (2x), Data Table, Google Sheets, Send Email, Slack
Category: Sales / AI / Event Management
Level: Advanced
Credits: Milo Bravo (BRaiA Labs)

What problem is this workflow solving?

Email overload kills event revenue:

- 200+ weekly sponsor/partner emails go unread
- No sentiment/intent analysis means missed hot leads
- Manual routing wastes 10+ hours/week
- Zero visibility into pipeline trends

This workflow auto-classifies, routes, and analyzes every inbound lead.

What this workflow does

1. The Gmail Trigger monitors the event inbox for new emails
2. Gemini #1 scores sentiment (Positive/Neutral/Negative)
3. Gemini #2 extracts topic, intent, urgency, organization, and budget signals
4. Logs to the email_analytics Data Table + Google Sheets
5. Routes intelligently:
   - Positive → Hot Lead email + Slack #hot-leads (2h SLA)
   - Neutral → Nurture email + Slack #follow-ups (24-48h)
   - Negative → Insights + Slack #insights
6. A Looker Studio dashboard auto-updates from Sheets

Setup (5 minutes):

1. Gmail OAuth2 (event inbox)
2. Google Gemini API key
3. Slack OAuth2 + channels (#hot-leads, #follow-ups, #insights)
4. Create the email_analytics Data Table and paste the Table ID
5. Update recipient emails (placeholders in the Send Email nodes)
6. Test with the Evaluation Dataset before going live

How to customize:

- Add keywords for your niche (conferences, webinars, trade shows)
- Adjust sentiment thresholds or routing rules
- Swap Slack for Teams or a CRM (HubSpot/Salesforce)
- Scale: multi-inbox + team routing

ROI:

- 100% lead capture (zero drops)
- 5x faster response (2h → 2min)
- 20% conversion lift from sentiment prioritization
- Pipeline dashboard = data-driven sales strategy

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: event sales leads, email sentiment analysis, Gmail AI routing, Google Gemini
sales automation, conference sponsor leads, event pipeline analytics, sales lead qualification, sales dashboard
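The routing rules above can be sketched as a simple dispatch table. The channel names and SLAs come from the setup list; the function shape and the default-to-nurture fallback are assumptions:

```python
def route_lead(sentiment: str) -> dict:
    """Map a Gemini sentiment label to the email template, Slack channel, and SLA."""
    routes = {
        "Positive": {"email": "hot_lead", "slack": "#hot-leads",  "sla": "2h"},
        "Neutral":  {"email": "nurture",  "slack": "#follow-ups", "sla": "24-48h"},
        "Negative": {"email": None,       "slack": "#insights",   "sla": None},
    }
    # Unrecognized labels fall back to the nurture path rather than dropping the lead.
    return routes.get(sentiment, routes["Neutral"])

print(route_lead("Positive")["slack"])  # → #hot-leads
```

In n8n this mapping is what the Switch/IF branching after the Gemini nodes implements; the fallback branch is what guarantees the "zero drops" claim.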
by Yassin Zehar
Description

This workflow continuously validates data quality using rules stored in Notion, runs anomaly checks against your SQL database, generates AI-powered diagnostics, and alerts your team only when real issues occur. Notion holds all data quality rules (source, field, condition, severity). n8n reads them on a schedule, converts them into live SQL queries, and aggregates anomalies into a global run summary. The workflow then scores data health, creates a Notion run record, optionally opens a Jira issue, and sends a Slack/email alert including AI-generated root cause analysis and recommended fixes.

Target users

Perfect for DataOps, analytics, product data, BI, and compliance teams, as well as teams running ETL/ELT pipelines or responsible for platform reliability.

How it works

1. Notion - Rules Database: each entry defines a check (table, field, condition, severity).
2. n8n - Dynamic Query Execution: rules are converted into SQL and checked automatically.
3. Summary Engine: aggregates anomalies and computes a data quality score.
4. AI Diagnostic Layer: root cause analysis plus a recommended fix plan.
5. Incident Handling: Notion run page plus optional Slack/email/Jira escalation. Silent exit when there is no anomaly = zero noise.

Setup Instructions

1. Create two Notion databases:
   - Data Quality Rules - source / field / rule / severity / owner
   - Data Quality Runs - run_id / timestamp / score / anomalies / trend / AI summary / recommendation
2. Connect your SQL database (Postgres / Supabase / Redshift, etc.)
3. Add OpenAI credentials for AI analysis
4. Connect Slack + Gmail + Jira for incident alerts
5. Set your execution schedule (daily/weekly)

Expected outcomes

Fully automated, rule-based data quality monitoring with minimal maintenance and zero manual checking. When everything is healthy, runs remain silent. When data breaks, the team is notified instantly, with context, root cause insight, and a structured remediation output.
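Step 2 of the flow (converting rules into live SQL) can be sketched like this. The rule-record fields mirror the Notion schema described in the setup instructions; the query shape is an illustrative assumption:

```python
def rule_to_sql(rule: dict) -> str:
    """Turn a Notion data-quality rule into a query counting rows that violate it."""
    # Any row where the rule's condition does NOT hold is an anomaly.
    return (
        f"SELECT COUNT(*) AS anomalies "
        f"FROM {rule['source']} "
        f"WHERE NOT ({rule['condition']})"
    )

rule = {"source": "orders", "field": "amount", "condition": "amount >= 0", "severity": "high"}
print(rule_to_sql(rule))
# → SELECT COUNT(*) AS anomalies FROM orders WHERE NOT (amount >= 0)
```

Each rule's anomaly count, weighted by its severity, is what the summary engine would aggregate into the global data quality score. A production version should validate or whitelist the source and condition fields rather than interpolating them directly, since the rules database is effectively user input.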
Tutorial video

Watch the YouTube tutorial video.

About me

I'm Yassin, a Project & Product Manager scaling tech products with data-driven project management. Feel free to connect with me on LinkedIn.