by OwenLee
In the social and behavioral sciences (e.g., psychology, sociology, economics, management), researchers and students often need to normalize academic paper metadata and extract variables before any literature review or meta-analysis. This workflow automates the busywork. Using an LLM, it processes CSV/XLSX/XLS files (exported from WoS, Scopus, EndNote, Zotero, or your own spreadsheets) into normalized metadata and extracted variables, and writes a neat table to Google Sheets.

Example Google Sheet: click me

Who is this for?

- Undergraduate and graduate students or researchers in soft-science fields (psychology, sociology, economics, business)
- People who don't have time to read full papers and need quick overviews
- Anyone who wants to automate academic paper metadata normalization and variable extraction to speed up a literature review

How it works

- Upload an academic paper file (CSV/XLSX/XLS) in chat.
- The workflow creates a Google Sheets spreadsheet with two tabs: Checkpoint and FinalResult.
- A structured-output LLM normalizes core metadata (title, abstract, authors, publication date, source) from the uploaded file and writes it to Checkpoint; a Gmail notification is sent when finished.
- A second structured-output LLM uses the metadata above to extract variables (Independent Variable, Dependent Variable) and writes them to FinalResult; you'll get a second Gmail notification when done.

How to set up

Credentials

- **Google Sheets OAuth2** (read/write)
- **Gmail OAuth2** (send notifications)
- **Google Gemini** (or any LLM you prefer)

Quick start

1. Connect Google Sheets, Gmail, and Gemini (or your LLM) credentials.
2. Open File Upload Trigger, upload your CSV/XLSX/XLS file, and type a name in chat (used as the Google Sheets spreadsheet title).
3. Watch your inbox for status emails and open the Google Sheets spreadsheet to review Checkpoint and FinalResult.
Customization

- Journal lists: Edit the Journal Rank Classifier code node to add/remove titles. The default list is for business/management journals; swap it for a list from your own field.
- Notifications: Replace Gmail with Slack, Teams, or any channel you prefer.
- LLM outputs: Need different metadata or extracted data? Edit the LLM's system prompt and Structured Output Parser.

Note

- Make sure your file includes abstracts. If the academic paper data you upload doesn't contain an abstract, the extracted results will be far less useful.
- CSV yields no items? Encoding mismatches can break the workflow. If this happens, convert the CSV to .xls or .xlsx and try again.

Help

Contact: owenlzyxg@gmail.com
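For reference, the Journal Rank Classifier mentioned above boils down to a title-to-rank lookup. A minimal sketch of such a code node follows; the journal titles and rank labels are illustrative placeholders, not the template's actual list:

```javascript
// Minimal sketch of a Journal Rank Classifier code node.
// The titles and rank labels below are illustrative placeholders -
// replace them with the ranked journal list for your own field.
const JOURNAL_RANKS = {
  "academy of management journal": "A*",
  "journal of applied psychology": "A*",
  "journal of business research": "A",
};

function classifyJournal(sourceTitle) {
  const key = (sourceTitle || "").trim().toLowerCase();
  // Unknown journals fall through to "Unranked" instead of failing the run.
  return JOURNAL_RANKS[key] || "Unranked";
}
```

Matching is case-insensitive on the normalized source title, so exported metadata with inconsistent capitalization still resolves.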
by Mohamed Abubakkar
WORKFLOW OVERVIEW

This workflow is an AI-powered business intelligence agent designed for founders and business owners. It automatically collects key business metrics, calculates performance KPIs, applies decision logic, uses AI reasoning, and sends clear, actionable notifications, without dashboards or manual reports.

Key Features:

- Aggregates multiple data sources (MSSQL, Google Analytics, Google Sheets)
- Calculates critical KPIs: ROAS, CAC, Revenue & User Growth
- Applies rule-based decision logic for business risk and opportunity detection
- AI-powered reasoning: summarizes insights and recommends actions
- Multi-channel notifications: Email, WhatsApp, Slack, Telegram
- Fully automated daily execution via Cron trigger
- Enterprise-ready: error handling, structured data, KPI validation

Setup & Requirements:

- API access to data sources (MSSQL, Google Analytics, Google Sheets)
- OpenAI or Google Gemini API for AI reasoning
- Messaging integration: Gmail, Twilio (WhatsApp), Slack, Telegram

Workflow Flow:

1. Cron Trigger: runs daily at a chosen time
2. Data Collection: revenue, users, marketing spend, website analytics
3. Merge Node: combines all data sources
4. Function Node: consolidates into a single JSON object
5. KPI Calculation: calculates ROAS, CAC, growth rates
6. Business Logic Engine: identifies risks and opportunities
7. AI Reasoning Agent: summarizes insights, suggests actions
8. Notification Formatter: builds founder-friendly message
9. Notification Delivery: sends via WhatsApp, Email, Slack, or Telegram

Example Data

The data below is gathered from the different channels.
`{ "revenue": 4290, "registeredUsers": 20, "totalUsers": 3, "adSpend": 800 }`

The workflow applies rule-based logic to detect potential risks or opportunities:

`{ "ROAS": 5.36, "CAC": 40, "agentStatus": "normal", "agentPriority": "low", "insights": ["Marketing campaigns are performing very well"] }`

Workflow Highlights

- Fully automated, runs daily without human intervention
- Integrates multiple business data sources
- Converts raw data into KPIs for actionable insight
- Applies both rule-based logic and AI reasoning
- Generates concise, human-friendly notifications
- Sends notifications to different channels
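The KPI step that turns the first JSON object above into the second can be sketched as follows. The arithmetic matches the example data (ROAS = revenue / adSpend, CAC = adSpend / registeredUsers); the rounding and the status threshold are illustrative assumptions, not values taken from the template:

```javascript
// Sketch of the KPI Calculation step: derives ROAS and CAC from the merged data.
// Two-decimal rounding and the ROAS >= 3 status threshold are assumptions.
function calculateKpis({ revenue, registeredUsers, adSpend }) {
  const ROAS = Number((revenue / adSpend).toFixed(2));        // return on ad spend
  const CAC = Number((adSpend / registeredUsers).toFixed(2)); // customer acquisition cost
  return {
    ROAS,
    CAC,
    agentStatus: ROAS >= 3 ? "normal" : "attention",
    agentPriority: ROAS >= 3 ? "low" : "high",
  };
}
```

With the sample input (revenue 4290, adSpend 800, registeredUsers 20) this yields ROAS 5.36 and CAC 40, matching the example output above.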
by SOLOVIEVA ANNA
Overview

This workflow turns audio attachments you receive by Gmail into Japanese transcripts and structured AI summaries, then saves everything to Google Drive and Google Sheets while notifying you via Gmail and Slack. Every time an email with a voice recording arrives, the audio is stored in a dated folder, fully transcribed in Japanese, summarized into clear meeting-style points, and logged so you can quickly review and search later.

Audio Email to Japanese Transcript with AI Summary & Multi-Channel Notification

Who this is for

- People who get voice memos or meeting recordings as email attachments
- Teams that want clear Japanese transcripts plus action-item summaries from calls
- Anyone who wants audio notes automatically archived and searchable in Drive/Sheets

How it works

- Trigger: New Gmail with audio attachment. A Gmail Trigger watches your inbox, downloads attachments for each new email, and passes them into the workflow.
- Split & filter attachments. A Code node splits the email into one item per attachment and normalizes the binary data to binary.data. A Filter node keeps only audio files (mp3, wav, m4a, ogg) and discards everything else.
- Create date-based Drive folder & upload audio. A Code node builds a YYYY/MM folder path from the current date. A Google Drive node creates that folder (if it doesn't exist) under your chosen parent folder. A Merge node combines folder info with file info, and the audio file is uploaded into that folder so all recordings are organized by year/month.
- Transcribe audio to Japanese text. An HTTP Request node calls the OpenAI Audio Transcriptions API (gpt-4o-transcribe) with the audio file. The prompt tells the model to produce a verbatim Japanese transcript (no summarization, no guessing), returned as plain text.
- Generate structured AI summary. The transcript is sent to an OpenAI Chat node (gpt-4o), which outputs JSON with:
  - title: short Japanese title for the recording
  - points: key discussion points (array)
  - decisions: decisions made (array)
  - actionItems: action items with owner/deadline (array)
  A Set node then formats this JSON into a Markdown summary (summaryContent) with sections for 要点 / 決定事項 / アクションアイテム.
- Save transcript & summary files to Drive. The transcript text is converted into a .txt file and uploaded to the same YYYY/MM folder. The Markdown summary is converted into a .md file (e.g. xxx_summary.md) and uploaded as well. Each file is then shared in Drive so you have accessible web links to both transcript and summary.
- Log to Google Sheets. A Code node collects the email subject, file name, full transcript, formatted summary, and Drive links into one JSON object. A Google Sheets node appends a new row with timestamp, subject, summary, transcript, and link so you get a running log of all processed audios.
- Notify via Gmail & Slack. Finally, the workflow:
  - Sends a Gmail message back to the original sender with the meeting summary and links
  - Posts a Slack notification in your chosen channel, including subject, file name, summary text, and Drive link

How to set up

1. Connect your Gmail, Google Drive, Google Sheets, Slack, and OpenAI credentials in the respective nodes.
2. In the Gmail Trigger, narrow the scope if needed (e.g. specific label, sender, or inbox).
3. In the Drive nodes, set the parent folder where you want the YYYY/MM subfolders to be created.
4. In the Google Sheets node, point to your own spreadsheet and sheet name.
5. In the Slack node, select the channel where reminders should be posted.
6. Make sure your OpenAI credentials have access to both audio transcription and chat endpoints.

Customization ideas

- Filter by sender, subject keyword, or label so only certain emails are processed.
- Change the folder structure (e.g. ProjectName/YYYY/MM or YYYY/MM/DD) in the folder-path Code node.
- Adjust the transcription prompt (e.g. allow light punctuation clean-up, use another language).
- Modify the summary format or add extra fields (e.g. meeting participants, project name) in the AI prompt and Markdown template.
- Send notifications to other tools: add branches for Notion, LINE, Teams, or additional Slack channels.
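The date-based folder-path Code node described above only needs zero-padded year/month logic; a minimal sketch (swap the template string for ProjectName/YYYY/MM or YYYY/MM/DD as suggested):

```javascript
// Sketch of the folder-path Code node: builds a "YYYY/MM" path from a date.
function buildFolderPath(date = new Date()) {
  const year = date.getFullYear();
  const month = String(date.getMonth() + 1).padStart(2, "0"); // 1-12 -> "01".."12"
  return `${year}/${month}`;
}
```

The zero-padding matters: without it, folders sort as 1, 10, 11, 2 instead of chronologically.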
by Rahul Joshi
Description

This workflow is designed to evaluate newly added CVs for Diversity, Equity, and Inclusion (DEI) eligibility. It automatically ingests CVs from Google Drive, extracts key fields, analyzes them with Azure OpenAI, logs structured DEI outcomes in Google Sheets, and sends a concise DEI-focused summary email to the hiring manager. The entire flow prioritizes consistent, auditable DEI checks and controlled logic paths.

What This Template Does

- Watches Google Drive for new CV files to trigger DEI evaluation.
- Downloads and extracts text/structured fields from PDF CVs.
- Assesses DEI eligibility using Azure OpenAI, following defined criteria and prompts.
- Appends DEI results (eligible/not eligible, rationale, confidence) to Google Sheets for tracking.
- Generates and sends a DEI-focused summary email to the hiring manager for review.

Key Benefits

- Standardized DEI screening to support equitable hiring decisions.
- Centralized, structured logging in Sheets for transparency and audits.
- Automated DEI summaries for faster, consistent manager review.
- Reliable routing with true/false logic to enforce DEI evaluation steps.

Features

- Google Drive trigger (fileCreated) for CV intake tied to DEI checks.
- PDF extraction mapped to fields relevant for DEI evaluation.
- Azure OpenAI Chat Model prompts tuned for DEI criteria and rationale.
- Google Sheets append with eligibility status, notes, and timestamps.
- Email node that delivers DEI summaries and next-step guidance.
- Logic branching (true/false) to control DEI evaluation and notifications.

Requirements

- n8n instance (cloud or self-hosted).
- Google Drive access to the CV intake folder.
- Google Sheets access for DEI results logging.
- Azure OpenAI access and configured prompts reflecting DEI criteria.
- Email node credentials to send DEI summaries to managers.

Step-by-Step Setup Instructions

1. Connect Google Drive and select the CV folder for the fileCreated trigger.
2. Configure the Download CV and Extract From PDF nodes to capture fields needed for DEI checks.
3. Add Azure OpenAI credentials and set DEI-specific prompts (criteria, rationale, confidence).
4. Connect Google Sheets and select the target sheet; map columns for status, rationale, and timestamps.
5. Configure the Email to Manager node with a DEI-focused subject and template.
6. Test with sample CVs, verify sheet entries and email content, then enable the workflow.

DEI-Focused Best Practices

- Clarify DEI criteria and document them in your prompt and sheet schema.
- Avoid including sensitive PII in emails; store only necessary fields for DEI decisions.
- Use n8n Credentials; never hardcode API keys or private data.
- Maintain an audit trail (timestamps, model version, prompt version, decision rationale).
- Periodically review prompts and sheet schema to align with policy updates.
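The true/false branching on the AI verdict can be sketched as follows. The JSON field names (eligible, rationale, confidence) are assumptions inferred from the outcomes the template logs to Sheets, not the workflow's exact schema:

```javascript
// Sketch of the true/false routing after the Azure OpenAI evaluation step.
// Field names (eligible, rationale, confidence) are assumed, mirroring the
// eligible/not-eligible, rationale, and confidence values logged to Sheets.
function routeDeiResult(aiResult) {
  const { eligible, rationale, confidence } = aiResult;
  return {
    branch: eligible === true ? "true" : "false", // IF-node style output
    sheetRow: {
      status: eligible ? "eligible" : "not eligible",
      rationale,
      confidence,
      timestamp: new Date().toISOString(),
    },
  };
}
```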
by Julian Kaiser
Automatically Scrape Make.com Job Board with GPT-5-mini Summaries & Email Digest

Overview

Who is this for? Make.com consultants, automation specialists, and freelancers who want to catch new client opportunities without manually checking the forum.

What problem does it solve? Scrolling through forum posts to find jobs wastes time. This automation finds new postings, uses AI to summarize what clients need, and emails you a clean digest.

How it works: Runs on schedule → scrapes the Make.com professional services forum → filters jobs from last 7 days → AI summarizes each posting → sends formatted email digest.

Use Cases

- Freelancers: Get daily job alerts without forum browsing, respond to opportunities faster
- Agencies: Keep sales teams informed of potential clients needing Make.com expertise
- Job Seekers: Track contract and full-time positions requiring Make.com skills

Detailed Workflow

1. Scraping: HTTP module pulls HTML from the Make.com forum job board
2. Parsing: Extracts job titles, dates, authors, and thread links
3. Filtering: Only jobs posted within the last 7 days pass through (configurable)
4. AI Processing: GPT-5-mini analyzes each post to extract project type, key requirements, complexity level, and budget/timeline (if mentioned)
5. Email Generation: Aggregates summaries into an organized HTML email with direct links
6. Delivery: Sends via SMTP to your inbox

Setup Steps

Time: ~10 minutes

Requirements:
- OpenRouter API key (get one here)
- SMTP credentials (Gmail, SendGrid, etc.)

Steps:
1. Import template
2. Add OpenRouter API key in "OpenRouter Chat Model" node
3. Configure SMTP settings in "Send email" node
4. Update recipient email address
5. Set schedule (recommended: daily at 8 AM)
6. Run test to verify

Customization Tips

- Change date range: Modify filter from 7 days to X days: {{now - X days}}
- Keyword filtering: Add filter module to only show jobs mentioning "API", "Shopify", etc.
- AI detail level: Edit prompt for shorter/longer summaries
- Multiple recipients: Add comma-separated emails in Send Email node
- Different AI model: Switch to Gemini or Claude in OpenRouter settings
- Team notifications: Add Slack/Discord webhook instead of email
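The last-7-days filter described above is a simple date cutoff over the parsed posts; a sketch, with an illustrative post shape ({ title, postedAt }) that may differ from the template's actual parsed fields:

```javascript
// Sketch of the configurable "last N days" filter on parsed forum posts.
// The post shape ({ title, postedAt }) is illustrative.
function filterRecentJobs(posts, days = 7, now = new Date()) {
  const cutoff = now.getTime() - days * 24 * 60 * 60 * 1000;
  return posts.filter((post) => new Date(post.postedAt).getTime() >= cutoff);
}
```

Passing `now` explicitly keeps the function testable; in the workflow it would default to the run time.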
by Paul Karrmann
LinkedIn Inbox Triage (Gmail Label to Notion + Slack)

This n8n template demonstrates how to use AI to triage LinkedIn emails in your Gmail inbox, so you only see the messages worth your time. It filters out automated noise, scores sales likelihood, drafts quick replies for real conversations, stores everything in Notion, and sends you a Slack DM for items you should answer quickly.

Good to know

- This workflow sends email content to an LLM. Do not use it with sensitive mailboxes unless you are comfortable with that.
- Cost depends on your model choice and token usage. The body is currently limited to 4000 characters to control spend.
- If you want a shorter run window, adjust the receivedAfter filter.

How it works

1. Runs on a daily schedule.
2. Pulls emails from Gmail using a label you define (example: LinkedIn).
3. Applies two filters: keeps only invitations and messages; removes common automated notifications.
4. Fetches the full email body for better classification.
5. Sends the message to an AI agent that returns strict structured JSON:
   - action (reply_quick, review, ignore, block)
   - relevancy_score (0 to 100)
   - sales_likelihood (0 to 1)
   - summary
   - optional reply_draft
6. Applies a quality gate to keep high-signal messages.
7. Writes the output to a Notion database as a ticket.
8. Sends a Slack DM only for items marked reply_quick.

How to use

1. Create a Gmail label that captures LinkedIn emails, then add the label id to the Gmail node.
2. Create a Notion database with fields matching the Notion node mapping.
3. Connect your OpenAI, Gmail, Notion, and Slack credentials in n8n.
4. Run once manually to verify mapping, then enable the workflow.

Requirements

- Gmail account
- OpenAI API credentials (or compatible model node)
- Notion database
- Slack account

Customising this workflow

- Make it more aggressive by increasing the sales threshold or raising the relevancy cutoff.
- Add more filter phrases for your own LinkedIn email language.
- Swap Slack DM for a channel post, or send a daily digest instead of per-message alerts.
- Add a redaction step before the AI node if you want to remove signatures or quoted replies.
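The quality gate and reply_quick routing on the agent's structured JSON can be sketched as follows. The relevancy cutoff of 60 is an illustrative assumption; the template's actual threshold (the "relevancy cutoff" you can raise) may differ:

```javascript
// Sketch of the quality gate over the agent's structured JSON output.
// The relevancy cutoff (60) is an assumed default - raise it for a stricter gate.
function triageEmail(result) {
  const highSignal =
    result.action !== "ignore" &&
    result.action !== "block" &&
    result.relevancy_score >= 60;
  return {
    keep: highSignal,                                      // write to Notion only if true
    notifySlack: highSignal && result.action === "reply_quick", // DM only for quick replies
  };
}
```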
by Connor Provines
Community Node Disclaimer

This template uses the Apify LinkedIn Profile Scraper, which is a community node only available in self-hosted n8n installations. The LinkedIn scraping step is optional and can be removed for n8n Cloud compatibility.

Who's it for

Sales and marketing teams processing 20+ leads daily who need to eliminate manual research and focus reps on hot prospects. Perfect for B2B companies wanting to qualify inbound leads at scale using AI-powered enrichment and scoring.

What it does

This workflow automates lead qualification by enriching email addresses with firmographic data from People Data Labs, researching individuals and companies using Perplexity AI, scoring leads against your ICP criteria with Claude, and routing them to appropriate channels. Hot leads (8-10 score) get instant Slack alerts with personalized email drafts. Warm leads (5-7) go to a digest channel. Cold leads (0-4) log to your CRM only. Processing takes 30-60 seconds per lead versus 20 minutes of manual research, costing $0.08-0.15 per lead.

How it works

The webhook receives an email address and optional name. Multiple enrichment sources run in parallel: PDL fetches contact and firmographic data, Perplexity researches the individual's recent activity and company developments, and optionally Apify scrapes their LinkedIn profile. All data merges into a complete profile. Claude AI scores the lead against your ICP rules stored in Google Docs, calculating points for company fit, title fit, buying signals, and timing. Based on the total score, leads route to three tiers with different handling. Hot leads trigger immediate Slack alerts and generate personalized email drafts using Gemini. All qualified leads optionally sync to your CRM.
Requirements

- People Data Labs API (or Apollo/Clearbit alternative)
- Perplexity API
- Anthropic Claude API
- Google Docs for ICP rules
- Slack workspace
- Gmail account
- Optional: Apify for LinkedIn scraping (self-hosted only)
- Optional: HubSpot or other CRM

Set up steps

1. Configure the webhook

In the Webhook node, set your webhook path (default is "lead-intake"). Send POST requests with this JSON format:

    { "email": "lead@company.com", "name": "Optional Name" }

2. Add API credentials securely

- People Data Labs: In the PDL Enrich node, click "Credential for Header Auth" → Create new credential → Add header name X-Api-Key with your PDL API key as the value. This uses n8n's credential management instead of hardcoding keys.
- Perplexity: In both Individual Research and Company Research nodes, add your Perplexity API credentials.
- Anthropic: In the Anthropic Chat Model node, add your Claude API credentials.
- Slack: In both Slack nodes, set up OAuth2 and select your target channels. Hot and warm leads can route to different channels.
- Gmail: In the Send Hot Lead Email node, configure OAuth2 credentials.
- Google Docs: In the ICP & Use Case node, replace the documentURL with your Google Doc containing ICP scoring rules, then add OAuth2 credentials.
- Optional - Apify: In the LinkedIn Profile Scraper node, add your Apify OAuth2 credentials from https://apify.com/curious_coder/linkedin-profile-scraper
- Optional - HubSpot: Enable the Upsert to HubSpot CRM node and add your credentials. Customize the customPropertiesValues array to match your fields.

3. Create your ICP rules document

Create a Google Doc with this structure:

COMPANY FIT (0-3 points):
- Company size: 50-500 employees = 3 points
- Industry: SaaS/Technology = 3 points
- Geography: North America = 3 points

TITLE FIT (0-3 points):
- VP/C-level = 3 points
- Director = 2 points
- Manager = 1 point

BUYING SIGNALS (0-2 points):
- Recent funding = 2 points
- New executive = 1 point

TIMING (0-2 points):
- Urgent need = 2 points

Copy the URL and paste it in the ICP & Use Case node's documentURL parameter.

4. Test the workflow

Activate the workflow and send a test webhook. Monitor the execution to verify enrichment sources return data, AI scoring completes, routing works correctly, and notifications send to the right channels.

How to customize

- Swap enrichment sources: Replace the PDL Enrich node with Apollo or Clearbit HTTP Request nodes. Update the Merge Enrichment Data node to parse the new response format.
- Adjust scoring thresholds: In the AI Agent node prompt, change the score ranges (currently 8-10 = hot, 5-7 = warm, 0-4 = cold) and add custom scoring factors like technology stack match or budget authority.
- Change routing: In the Route by Score node, add new output conditions for additional tiers like VIP, or modify existing thresholds.
- Different notifications: Replace Slack nodes with Gmail or add Twilio nodes for SMS. Update the formatting nodes to create appropriate message templates.
- Use different AI models: Swap the Anthropic Chat Model with OpenAI for GPT-4, or replace the Gemini formatting nodes with Claude for consistency.
- Remove LinkedIn scraping: Delete the LinkedIn Profile Scraper node and adjust Merge All Sources to accept 4 inputs instead of 5 for n8n Cloud compatibility.
- Connect different CRMs: Replace the HubSpot node with Salesforce, Pipedrive, or other CRM nodes. Update the Format for CRM node's field mappings to match your CRM's structure.
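The scoring the Claude agent performs against a rules document like the one above can be expressed in plain code for a simplified subset of the rules; the input field names (companySize, titleLevel, recentFunding, urgentNeed) are illustrative, and the tier cutoffs follow the template's 8-10 / 5-7 / 0-4 ranges:

```javascript
// Sketch of ICP scoring and tier routing for a simplified subset of the rules.
// Input field names are illustrative; tier cutoffs match the template (8-10 hot,
// 5-7 warm, 0-4 cold).
function scoreLead(lead) {
  let score = 0;
  if (lead.companySize >= 50 && lead.companySize <= 500) score += 3; // company fit
  if (["VP", "C-level"].includes(lead.titleLevel)) score += 3;       // title fit
  else if (lead.titleLevel === "Director") score += 2;
  else if (lead.titleLevel === "Manager") score += 1;
  if (lead.recentFunding) score += 2;                                // buying signal
  if (lead.urgentNeed) score += 2;                                   // timing
  const tier = score >= 8 ? "hot" : score >= 5 ? "warm" : "cold";
  return { score, tier };
}
```

In the real workflow this judgment lives in the AI Agent prompt, which lets the rules stay in a Google Doc that non-technical teammates can edit.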
by Rahul Joshi
Description

Automate your weekly cross-platform social media analytics workflow with AI-powered insights. This system retrieves real-time Twitter (X) and Facebook data, validates and merges the metrics, formats them via custom JavaScript, generates a visual HTML summary with GPT-4o, stores structured analytics in Notion, and broadcasts key results through Gmail and Slack, all in one seamless flow. Perfect for marketing, social media, and growth teams tracking weekly engagement trends.

What This Template Does

1. Starts on manual execution to fetch the latest performance data.
2. Collects live metrics from both Twitter (X API) and Facebook Graph API.
3. Merges API responses into one unified dataset for analysis.
4. Validates data completeness before processing; logs missing or invalid data to Google Sheets.
5. Uses JavaScript to normalize data into clean JSON structures for AI analysis.
6. Leverages Azure OpenAI GPT-4o to generate a professional HTML analytics report.
7. Updates Notion's "Growth Chart" database with historical metrics for record-keeping.
8. Sends the HTML report via Gmail to the marketing or analytics team.
9. Posts a summarized Slack message highlighting key insights and platform comparisons.

Key Benefits

- Eliminates manual social media reporting with full automation.
- Ensures clean, validated data before report generation.
- Delivers visually engaging HTML performance summaries.
- Centralizes analytics storage in Notion for trend tracking.
- Keeps teams aligned with instant Slack and Gmail updates.

Features

- Dual-platform analytics integration (Twitter X + Facebook Graph).
- Custom JavaScript node for data normalization and mapping.
- GPT-4o model integration for HTML report generation.
- Real-time error logging to Google Sheets for transparency.
- Notion database update for structured performance tracking.
- Slack notifications with emoji-rich summaries and insights.
- Gmail automation for formatted weekly performance emails.
- Fully modular: easy to scale to other social platforms.

Requirements

- Twitter OAuth2 API credentials for fetching X metrics.
- Facebook Graph API credentials for retrieving page data.
- Azure OpenAI credentials for GPT-4o AI report generation.
- Notion API credentials with write access to "Growth Chart."
- Slack Bot Token with chat:write permission for updates.
- Google Sheets OAuth2 credentials for error logs.
- Gmail OAuth2 credentials to send HTML reports.

Environment Variables

- TWITTER_API_KEY
- FACEBOOK_GRAPH_TOKEN
- AZURE_OPENAI_KEY
- NOTION_GROWTH_DB_ID
- SLACK_ALERT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID
- GMAIL_MARKETING_RECIPIENTS

Target Audience

- Marketing and growth teams analyzing engagement trends.
- Social media managers tracking cross-channel performance.
- Data and insights teams needing AI-based summaries.
- Brand strategists and content teams monitoring audience health.
- Agencies and operations teams automating weekly reporting.

Step-by-Step Setup Instructions

1. Connect all required API credentials (Twitter, Facebook, Azure OpenAI, Notion, Gmail, Slack, Sheets).
2. Replace the username and page IDs in the HTTP Request nodes for your brand handles.
3. Verify the JavaScript node output structure for correct field mapping.
4. Configure the Azure GPT-4o prompt with your preferred tone and formatting.
5. Link your Notion database and confirm property names match (followers, likes, username).
6. Add recipient email(s) in the Gmail node.
7. Specify your Slack channel ID for automated alerts.
8. Test-run the workflow manually to validate end-to-end execution.
9. Activate or schedule the workflow for regular weekly reporting.
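The custom JavaScript normalization step can be sketched as follows. The Twitter and Facebook field names are assumptions about the two API payloads, not the template's exact mapping; the point is collapsing two differently shaped responses into one clean JSON object:

```javascript
// Sketch of the JavaScript normalization node: merges Twitter and Facebook
// API responses into a single clean JSON object for the AI report.
// Payload field names are illustrative assumptions about the two APIs.
function normalizeMetrics(twitter, facebook) {
  return {
    twitter: {
      username: twitter.username,
      followers: Number(twitter.public_metrics?.followers_count ?? 0),
      likes: Number(twitter.public_metrics?.like_count ?? 0),
    },
    facebook: {
      pageName: facebook.name,
      followers: Number(facebook.followers_count ?? 0),
      likes: Number(facebook.fan_count ?? 0),
    },
    collectedAt: new Date().toISOString(),
  };
}
```

Defaulting missing counts to 0 keeps the downstream validation step simple: it only has to flag zeros, not handle undefined.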
by Davide
This workflow automates the creation, assignment, tracking, and monitoring of tasks (issues) inside a Paperclip system using AI and external integrations.

View this Youtube Video Tutorial to set up your Paperclip instance for FREE and get an API Key (subtitles in English).

Key Advantages

1. Full Automation of Task Lifecycle
   - The workflow handles everything: task intake → assignment → tracking → completion notification
   - No manual intervention is required
2. AI-Powered Task Assignment
   - Using an LLM, tasks are assigned intelligently based on context
   - Reduces human decision-making errors
   - Scales easily with more agents
3. Centralized Tracking with Google Sheets
   - Acts as a lightweight database
   - Easy to audit, monitor, and share
   - Provides historical tracking of tasks
4. Real-Time Monitoring & Alerts
   - Scheduled checks ensure tasks are constantly monitored
   - Instant email notifications when tasks are completed
   - Improves responsiveness and visibility
5. Modular & Scalable Architecture
   - Each block (Webhook, AI, API, Sheets, Email) is independent
   - Easy to extend (e.g., Slack alerts, dashboards, analytics)
   - Can integrate with other systems without redesigning everything
6. Efficient Resource Utilization
   - Batch processing (Split in Batches) avoids overload
   - Scheduled execution reduces unnecessary API calls
7. Seamless API Integration
   - Connects Paperclip, OpenAI, Google Sheets, and Gmail
   - Demonstrates strong interoperability across services

How it works

This workflow automates the assignment and tracking of issues/tasks to AI agents (called "Paperclip agents") and monitors their completion. There are two main flows.

Issue creation WF (triggered via Webhook or Manual):
1. Receives a task with title and issue via webhook
2. Fetches the company ID from the Paperclip API
3. Retrieves all available Paperclip agents for that company
4. Normalizes agent data (id, name, title)
5. Uses GPT-5-mini to intelligently assign the task to the most appropriate agent
6. Creates a new issue in Paperclip with the assigned agent
7. Logs the issue to a Google Sheet with metadata (date, ID, title, issue, assigned agent)

Completion monitoring WF (runs every 10 minutes via Schedule Trigger):
1. Fetches all open issues (where the COMPLETED column is empty) from Google Sheets
2. Loops through each open issue
3. Checks the current status of each issue in the Paperclip API
4. If the status is "completed", sends a Gmail alert and updates the COMPLETED column in Sheets with the completion timestamp

Set up steps

API Credentials:
- Configure the httpBearerAuth credential with your Paperclip API key
- Set up the openAiApi credential
- Configure the gmailOAuth2 credential for sending completion alerts
- Set up the googleSheetsOAuth2Api credential for Sheets access

Google Sheets Setup:
- Clone this Sheet
- The sheet must contain the columns: DATE, TITLE, ISSUE, ASSIGN, ID, COMPLETED
- Share the sheet with the service account or OAuth account used in credentials

Paperclip API Configuration:
- Replace all https://paperclip.xxx.xxx URLs with your actual Paperclip instance URL
- Verify the /api/agents/me, /api/companies/{id}/agents, and /api/issues/{id} endpoints are accessible

Workflow Settings:
- The webhook path is auto-generated; copy it for external calls
- Update the Gmail recipient from xxx@xxx.xxx to your target email address
- Adjust the schedule trigger interval (currently 10 minutes) as needed

Testing:
- Activate the workflow
- Use the Manual Trigger or send a POST request to the webhook URL with a payload containing title and issue fields
- Monitor execution logs to verify agent assignment and issue creation

Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
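The completion-monitoring pass reduces to two filters and a timestamped update; a sketch, where the row and status shapes are illustrative rather than the template's exact node output:

```javascript
// Sketch of the completion-monitoring pass: selects open rows (empty COMPLETED
// column) and marks the ones the Paperclip API reports as completed.
// statusById stands in for the per-issue API lookup; shapes are illustrative.
function updateCompletedRows(sheetRows, statusById, now = new Date()) {
  return sheetRows
    .filter((row) => !row.COMPLETED)                  // only still-open issues
    .filter((row) => statusById[row.ID] === "completed")
    .map((row) => ({ ...row, COMPLETED: now.toISOString() })); // timestamped update
}
```

Each returned row would trigger one Gmail alert and one Sheets update in the actual workflow.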
by Nirav Gajera
AI Real Estate Lead Qualifier: Typeform to Airtable with Smart Email Routing

Automatically qualify property leads, score them with AI, save to Airtable, and send personalised emails, all in seconds.

Description

Every time a prospect submits your Typeform property inquiry, this workflow kicks in automatically. It extracts their details, runs them through an AI lead scoring engine, saves the record to Airtable, and sends a personalised email: a priority response for hot leads, a nurture email for everyone else. No manual review. No missed leads. No delayed follow-ups. This is built for real estate agencies, leasing companies, and property managers who receive inquiries through Typeform and want instant, intelligent responses without hiring extra staff.

Key Features

- **Instant lead capture**: triggers the moment Typeform receives a submission
- **AI lead scoring**: Google Gemini classifies every lead as High, Medium, or Low automatically
- **Smart email routing**: High leads get a priority response, others get a nurture sequence
- **Airtable CRM sync**: every lead saved with full details + AI assessment
- **Robust AI parsing**: 4-layer fallback system ensures AI output is always usable
- **Personalised emails**: name, property type, location, budget pulled into every email automatically
- **No-code maintenance**: update scoring rules in the prompt, email copy in the nodes

How It Works

1. Prospect submits Typeform inquiry
2. Webhook receives the form payload
3. Extract Typeform Fields: name, email, phone, property type, purpose, location, budget, requirements
4. AI Lead Qualifier (Google Gemini): lead_score (High / Medium / Low), intent, timeline, notes
5. Parse AI Output (4-layer fallback)
6. Save to Airtable CRM
7. High Lead? (IF check): YES → Priority Email (contact within 2 hours); NO → Nurture Email (contact within 1-2 days)

AI Lead Scoring Rules

The AI automatically classifies leads based on these criteria:

| Score | Criteria |
| :--- | :--- |
| High | Has budget + specific location + clear purpose (investment or near-term buying) |
| Medium | Partial information, needs follow-up to qualify further |
| Low | Vague inquiry, missing budget or location, early exploration |

The AI also extracts:

- **Intent**: what the lead is trying to achieve
- **Timeline**: how urgently they need a property
- **Notes**: assessment summary for your sales team

Email Templates

High Lead Email

- Subject: [Name], we have properties matching your needs!
- Dark blue header for a premium feel
- Full inquiry summary (property, purpose, location, budget, timeline)
- AI assessment notes included
- Promise: senior agent contacts within 2 hours

Nurture Email (Medium / Low)

- Subject: Thanks for reaching out, [Name]!
- Grey header, warm and professional
- Inquiry summary (property, location, budget)
- Preparation tips while they wait
- Promise: contact within 1-2 business days

Setup Requirements

1. Typeform Setup

Create a Typeform with these fields and note their field ref IDs from the Typeform API:

| Field | Type | Required |
| :--- | :--- | :---: |
| Full Name | Short text | Yes |
| Phone Number | Phone | Yes |
| Email Address | Email | Yes |
| Property Type | Multiple choice | Yes |
| Purpose | Multiple choice | Yes |
| Preferred Location | Short text | Yes |
| Budget Range | Multiple choice | Yes |
| Requirements | Long text | Yes |
| Consent | Yes/No | Optional |

Connect the Typeform webhook:

- Typeform Dashboard → Connect → Webhooks
- URL: your n8n webhook URL (Production URL from the Webhook node)
- Method: POST
Update Field Refs In the Extract Typeform Fields Code node, update the REF_* constants to match your actual Typeform field reference IDs: const REF_NAME = 'your-field-ref-here'; const REF_EMAIL = 'your-field-ref-here'; const REF_PHONE = 'your-field-ref-here'; // ... etc Find your field refs from the Typeform API or by logging a test webhook payload. 3. Airtable Setup Create an Airtable base with a table containing these fields: | Field Name | Field Type | | :--- | :--- | | Full Name | Single line text | | Email Address | Email | | Mobile Phone Number | Phone | | Property Type | Single line text | | Purpose | Single line text | | Preferred Location | Single line text | | Budget Range | Single line text | | Requirements | Long text | | Submit Date | Single line text | | Lead Score | Single line text | | Intent | Single line text | | Timeline | Single line text | | Notes | Long text | Update the Save to Airtable node with your Base ID and Table ID. 4. Credentials Required | Credential | Used for | Free? | | :--- | :--- | :--- | | Google Gemini (PaLM) API | AI lead scoring | Free tier available | | Airtable Personal Access Token | CRM save | Free | | SMTP | Sending emails | Depends on provider | 5. Email Configuration In both email nodes, update: fromEmail โ your sending address SMTP credential โ your email provider details Recommended SMTP providers for testing: Mailtrap (sandbox), Gmail, SendGrid โ๏ธ Workflow Nodes | Node | Type | Purpose | | :--- | :--- | :--- | | Webhook | Webhook | Receives Typeform POST payload | | Extract Typeform Fields | Code | Parses answers by field ref ID | | AI Lead Qualifier | AI Agent | Scores lead using Gemini | | Google Gemini Chat Model | LLM | AI model for scoring | | Parse AI Output | Code | Extracts JSON with 4-layer fallback | | Save to Airtable | Airtable | Creates CRM record | | High Lead? 
| IF | Routes by lead score | | Priority Email | Email Send | Sends to High leads | | Nurture Email | Email Send | Sends to Medium/Low leads | ๐ง Customisation Change scoring criteria: Edit the prompt in the AI Lead Qualifier node. Add your own rules โ e.g. score higher if budget exceeds a threshold, or if the purpose is investment. Add more email tiers: Add a third IF branch for Low leads with a different nurture sequence, or add a Slack/WhatsApp alert for High leads. Use a different form: The Extract Typeform Fields node uses Typeform's field.ref system. Replace it with a Google Forms or Jotform parser by adjusting how you read the incoming webhook body. Change the AI model: Replace the Google Gemini node with Claude, OpenAI, or any other LLM Chat Model node โ no other changes needed. Add lead deduplication: Before saving to Airtable, add a search step to check if the email already exists and skip or update instead of creating a duplicate. ๐ก Robustness Features The Parse AI Output node uses a 4-layer fallback to ensure the workflow never breaks due to AI formatting issues: Direct JSON parse โ if Gemini returns clean JSON Strip markdown fences โ if Gemini wraps in Regex JSON extraction โ if there's extra text around the JSON Field-by-field regex โ if JSON is malformed, extracts each field individually Default fallback โ if all else fails, sets lead_score: Medium This means the workflow continues and saves the record even if the AI returns unexpected output. ๐ Sample Airtable Record Full Name: Sarah Johnson Email Address: sarah@example.com Mobile Phone: +91 9876543210 Property Type: 2BHK Apartment Purpose: Investment Preferred Location: Bandra, Mumbai Budget Range: โน80L - โน1.2Cr Requirements: Parking, gym, sea view Submit Date: 2026-03-18T10:30:00Z Lead Score: High Intent: Serious buyer looking for investment property with strong rental yield Timeline: Within 3 months Notes: High-intent lead with clear budget and location. Has specific amenity requirements. 
Recommend immediate callback. ๐ฆ Requirements Summary n8n (cloud or self-hosted) Typeform account (any paid plan for webhooks) Airtable account (free tier works) Google AI Studio account for Gemini API key (free tier available) SMTP email account ๐ก Enhancement Ideas WhatsApp alert** โ send instant WhatsApp to sales team for every High lead via Twilio Calendar booking link** โ include a Calendly link in the High lead email Lead follow-up reminder** โ add a Wait node + reminder email if no reply in 48 hours Slack notification** โ ping your team channel instantly for High leads Lead scoring dashboard** โ connect Airtable to a dashboard tool for weekly reports Built with n8n ยท Google Gemini AI ยท Typeform ยท Airtable ยท SMTP
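As a concrete illustration of the Robustness Features above, the 4-layer fallback can be sketched as a single function. This is a minimal sketch, not the exact contents of the Parse AI Output node; the function name `parseAiOutput` and the handling details are illustrative.

```javascript
// Sketch of the 4-layer fallback chain (illustrative, not the shipped node code).
// Input: raw text from the Gemini node. Output: an object with
// lead_score, intent, timeline, notes — never throws.
function parseAiOutput(raw) {
  const fallback = { lead_score: 'Medium', intent: '', timeline: '', notes: '' };
  if (!raw) return fallback;

  // Layer 1: direct JSON parse — Gemini returned clean JSON.
  try { return { ...fallback, ...JSON.parse(raw) }; } catch (e) {}

  // Layer 2: strip markdown code fences, then parse again.
  const unfenced = raw.replace(/```(?:json)?/gi, '').trim();
  try { return { ...fallback, ...JSON.parse(unfenced) }; } catch (e) {}

  // Layer 3: regex out the first {...} block in case prose surrounds the JSON.
  const match = raw.match(/\{[\s\S]*\}/);
  if (match) {
    try { return { ...fallback, ...JSON.parse(match[0]) }; } catch (e) {}
  }

  // Layer 4: field-by-field regex for malformed JSON.
  const result = { ...fallback };
  for (const field of ['lead_score', 'intent', 'timeline', 'notes']) {
    const m = raw.match(new RegExp(`"${field}"\\s*:\\s*"([^"]*)"`));
    if (m) result[field] = m[1];
  }
  // Default fallback: if nothing matched, lead_score stays 'Medium'.
  return result;
}
```

In an n8n Code node, the last line would be something like `return [{ json: parseAiOutput($json.output) }]`, so a downstream IF node can always read `lead_score`.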
by Oneclick AI Squad
Simplify financial oversight with this automated n8n workflow. Triggered daily, it fetches cash flow and expense data from a Google Sheet, analyzes inflows and outflows, validates records, and generates a comprehensive daily report. The workflow sends multi-channel notifications via email and Slack, ensuring finance professionals stay updated with real-time financial insights.

## Key Features

- Daily automation keeps cash flow tracking current.
- Analyzes inflows and outflows for actionable insights.
- Multi-channel alerts enhance team visibility.
- Logs maintain a detailed record in Google Sheets.

## Workflow Process

1. The **Every Day** node triggers a daily check at a set time.
2. **Get Cash Flow Data** retrieves financial data from a Google Sheet.
3. **Analyze Inflows & Outflows** processes the data to identify trends and totals.
4. **Validate Records** ensures all entries are complete and accurate.
5. If records are valid, the workflow branches to:
   - **Send Email Daily Report** to finance team members.
   - **Send Slack Alert** to notify the team instantly.
6. **Logs to Sheet** appends the summary data to a Google Sheet for tracking.

## Setup Instructions

1. Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
2. Set the daily trigger time (e.g., 9:00 AM IST) in the **Every Day** node.
3. Test the workflow by adding sample cash flow data and verifying the reports.
4. Adjust the analysis parameters as needed for your specific financial metrics.

## Prerequisites

- Google Sheets OAuth2 credentials
- Gmail API key for email reports
- Slack bot token (with the `chat:write` permission)
- Structured financial data in a Google Sheet

**Google Sheet structure:** create a sheet with these columns: Date, Cash Inflow, Cash Outflow, Category, Notes, Updated At.

## Modification Options

- Customize the **Analyze Inflows & Outflows** node to include custom financial ratios.
- Adjust the **Validate Records** filter to flag anomalies or missing data.
- Modify the email and Slack templates with branded formatting.
- Integrate with accounting tools (e.g., Xero) for live data feeds.
- Set different trigger times to align with your financial review schedule.

Discover more workflows → Get in touch with us
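To make the Analyze Inflows & Outflows step concrete, here is a minimal sketch of what that Code node could compute from the sheet rows. The function name `analyzeCashFlow` is illustrative; the column names match the Google Sheet structure described above, but the shipped node's exact logic may differ.

```javascript
// Sketch of an Analyze Inflows & Outflows Code node (illustrative).
// Expects one object per sheet row using the column names above.
function analyzeCashFlow(rows) {
  let totalIn = 0;
  let totalOut = 0;
  const byCategory = {};

  for (const row of rows) {
    const inflow = Number(row['Cash Inflow']) || 0;   // tolerate blank cells
    const outflow = Number(row['Cash Outflow']) || 0;
    totalIn += inflow;
    totalOut += outflow;
    const cat = row['Category'] || 'Uncategorized';
    byCategory[cat] = (byCategory[cat] || 0) + inflow - outflow;
  }

  return {
    totalInflow: totalIn,
    totalOutflow: totalOut,
    netCashFlow: totalIn - totalOut,   // the headline number for the daily report
    byCategory,                        // per-category net, for trend analysis
  };
}

// In n8n, the Code node would emit the summary as a single item:
// return [{ json: analyzeCashFlow($input.all().map(i => i.json)) }];
```

The returned object feeds naturally into the email and Slack templates, and custom ratios (e.g. burn rate) can be added to the return value without touching the rest of the workflow.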
by Oneclick AI Squad
This is a production-ready, end-to-end workflow that automatically compares hotel prices across multiple booking platforms and delivers beautiful email reports to users. Unlike basic building blocks, this workflow is a complete solution ready to deploy.

## What Makes This Production-Ready

**Complete end-to-end automation**

- **Input:** natural-language queries via webhook
- **Processing:** multi-platform scraping and comparison
- **Output:** professional email reports plus analytics
- **Feedback:** real-time webhook responses

**Advanced features**

- Natural-language processing for flexible queries
- Parallel scraping from multiple platforms
- Analytics tracking with Google Sheets integration
- Beautiful HTML email reports
- Error handling and graceful degradation
- Webhook responses for real-time feedback

**Business value**

- **For travel agencies:** an instant price-comparison service for clients
- **For hotels:** competitive pricing intelligence
- **For travelers:** save time and money with automated research

## Setup Instructions

### Step 1: Import Workflow

1. Copy the workflow JSON from the artifact.
2. In n8n, go to Workflows → Import from File/URL.
3. Paste the JSON and click Import.

### Step 2: Configure Credentials

**A. SMTP email (required)**

Settings → Credentials → Add Credential → SMTP

- Host: smtp.gmail.com (for Gmail)
- Port: 587
- User: your-email@gmail.com
- Password: your-app-password (not your regular password!)

Gmail setup:

1. Enable 2FA on your Google Account.
2. Generate an App Password: https://myaccount.google.com/apppasswords
3. Use the generated password in n8n.

**B. Google Sheets (optional, for analytics)**

Settings → Credentials → Add Credential → Google Sheets OAuth2, then follow the OAuth flow to connect your Google account.

Sheet setup:

1. Create a new Google Sheet.
2. Name the first sheet "Analytics".
3. Add headers: timestamp, query, hotel, city, checkIn, checkOut, bestPrice, platform, totalResults, userEmail
4. Copy the Sheet ID from the URL and paste it into the "Save to Google Sheets" node.

### Step 3: Set Up Scraping Service

You need to create a scraping API that the workflow calls. Here are your options:

**Option A: Use your existing Python script**

Create a simple Flask API wrapper:

```python
# api_wrapper.py
from flask import Flask, request, jsonify
import subprocess
import json

app = Flask(__name__)

@app.route('/scrape/<platform>', methods=['POST'])
def scrape(platform):
    data = request.json
    query = f"{data['checkIn']} to {data['checkOut']}, {data['hotel']}, {data['city']}"
    try:
        result = subprocess.run(
            ['python3', 'price_scrap_2.py', query, platform],
            capture_output=True, text=True, timeout=30
        )
        # Parse your script output here; this assumes price_scrap_2.py
        # prints a JSON object containing price data and a booking URL.
        output = json.loads(result.stdout)
        return jsonify({
            'price': output['price'],
            'currency': 'USD',
            'roomType': 'Standard Room',
            'url': output['url'],
            'availability': True
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

Deploy:

```bash
pip install flask
python api_wrapper.py
```

Then update the n8n HTTP Request nodes:

- URL: http://your-server-ip:5000/scrape/booking
- URL: http://your-server-ip:5000/scrape/agoda
- URL: http://your-server-ip:5000/scrape/expedia

**Option B: Use third-party scraping services**

Recommended services:

- **ScraperAPI** (scraperapi.com) – $49/month for 100k requests
- **Bright Data** (brightdata.com) – pay as you go
- **Apify** (apify.com) – has pre-built hotel scrapers

Example with ScraperAPI:

```
// In the HTTP Request node
URL: http://api.scraperapi.com
Query Parameters:
  api_key: YOUR_API_KEY
  url: https://booking.com/search?hotel={{$json.hotelName}}...
```
**Option C: Use the n8n SSH node (like your original)**

Keep your SSH approach but improve it:

1. Replace the HTTP Request nodes with SSH nodes.
2. Point them at your server with the Python script.
3. Ensure error handling and timeouts.

```
// SSH node configuration
Host: your-server-ip
Command: python3 /path/to/price_scrap_2.py "{{$json.hotelName}}" "{{$json.city}}" "{{$json.checkInISO}}" "{{$json.checkOutISO}}" "booking"
```

### Step 4: Activate Webhook

1. Click on the "Webhook - Receive Request" node.
2. Click "Listen for Test Event".
3. Copy the webhook URL (e.g., https://your-n8n.com/webhook/hotel-price-check).
4. Test with this curl command:

```bash
curl -X POST https://your-n8n.com/webhook/hotel-price-check \
  -H "Content-Type: application/json" \
  -d '{
    "message": "I want to check Marriott Hotel in Singapore from 15th March to 18th March",
    "email": "user@example.com",
    "name": "John Doe"
  }'
```

### Step 5: Activate Workflow

Toggle the workflow to Active. The webhook is now live and ready to receive requests.

## Usage Examples

**Example 1: basic query**

```json
{
  "message": "Hilton Hotel in Dubai from 20th December to 23rd December",
  "email": "traveler@email.com",
  "name": "Sarah"
}
```

**Example 2: flexible format**

```json
{
  "message": "I need prices for Taj Hotel, Mumbai. Check-in: 5th January, Check-out: 8th January",
  "email": "customer@email.com"
}
```

**Example 3: short format**

```json
{
  "message": "Hyatt Singapore March 10 to March 13",
  "email": "user@email.com"
}
```

## Customization Options

**1. Add more booking platforms**

1. Duplicate an existing "Scrape" node.
2. Update the platform parameter.
3. Connect it to "Aggregate & Compare".
4. Update the aggregation logic to include the new platform.

**2. Change the email template**

Edit the "Format Email Report" node's JavaScript:

- Modify the HTML structure
- Change the colors (currently a purple gradient)
- Add your company logo
- Include terms and conditions

**3. Add SMS notifications**

Using Twilio:

1. Add a new node: Twilio → Send SMS.
2. Connect it after "Aggregate & Compare".
3. Format: `Best deal: ${hotel} at ${platform} for ${price}`

**4. Add Slack integration**

1. Add a Slack node after "Aggregate & Compare".
2. Send to a #travel-deals channel.
3. Include quick booking links.

**5. Implement caching**

Add Redis or n8n's built-in cache:

```javascript
// Before scraping, check the cache
const cacheKey = `${hotelName}-${city}-${checkIn}-${checkOut}`;
const cached = await $cache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < 3600000) {
  return cached.data; // use the 1-hour cache
}
```

## Analytics & Monitoring

**Google Sheets dashboard**

The workflow automatically logs to Google Sheets. Create a dashboard that tracks:

- Total searches per day/week
- Most searched hotels
- Most searched cities
- Average price ranges
- Platform with the best prices (frequency)
- User engagement (repeat users)

Example sheet formulas:

```
// Total searches today
=COUNTIF(A:A, TODAY())

// Most popular hotel
=INDEX(C:C, MODE(MATCH(C:C, C:C, 0)))

// Average best price
=AVERAGE(G:G)
```

**Set up alerts**

Add a node after "Aggregate & Compare":

```javascript
// Alert if prices are unusually high
if (bestDeal.price > avgPrice * 1.5) {
  // Send an alert to the admin
  return [{ json: { alert: true, message: `High prices detected for ${hotelName}` } }];
}
```

## Error Handling

The workflow includes comprehensive error handling:

1. **Missing information** – if the user doesn't provide a hotel, city, or dates, it responds with a helpful prompt.
2. **Scraping failures** – if all platforms fail, it sends a "No results" email with suggestions.
3. **Partial results** – if some platforms work, it shows the available results and notes the errors.
4. **Email delivery issues** – uses `continueOnFail: true` to prevent workflow crashes.

## Security Best Practices

**1. Rate limiting**

Add rate limiting to prevent abuse:

```javascript
// In the Parse & Validate node
const userEmail = $json.email;
const recentSearches = await $cache.get(`searches:${userEmail}`);
if (recentSearches && recentSearches.length > 10) {
  return [{ json: { status: 'rate_limited', response: 'Too many requests. Please try again in 1 hour.' } }];
}
```

**2. Input validation**

Already implemented: validates hotel names, cities, and dates.

**3. Email verification**

Add email verification before first use:

```javascript
// Send a verification code
const code = Math.random().toString(36).substring(7);
await $sendEmail({
  to: userEmail,
  subject: 'Verify your email',
  body: `Your code: ${code}`
});
```

**4. API key protection**

Never expose scraping API keys in responses or logs.

## Deployment Options

**Option 1: n8n Cloud (easiest)**

1. Sign up at n8n.cloud
2. Import the workflow
3. Configure credentials
4. Activate

Pros: no maintenance, automatic updates. Cons: monthly cost.

**Option 2: Self-hosted (most control)**

Using Docker:

```bash
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```

Using npm:

```bash
npm install -g n8n
n8n start
```

Pros: free, full control. Cons: you manage updates.

**Option 3: Cloud platforms**

- Railway.app (recommended for beginners)
- DigitalOcean App Platform
- AWS ECS
- Google Cloud Run

## Scaling Recommendations

**For under 100 searches/day**

- The current setup is perfect
- Use n8n Cloud Starter or a small VPS

**For 100–1000 searches/day**

- Add Redis caching (1-hour cache)
- Use a queue system for scraping
- Upgrade to n8n Cloud Pro

**For 1000+ searches/day**

- Implement a job queue (Bull/Redis)
- Use a dedicated scraping service
- Load-balance multiple n8n instances
- Consider a microservices architecture

## Troubleshooting

**Issue: webhook not responding**

- Check that the workflow is Active
- Verify the webhook URL is correct
- Check the n8n logs: Settings → Log Streaming

**Issue: no prices returned**

- Test the scraping endpoints individually
- Check whether the hotel name matches exactly
- Verify the dates are in the future
- Try different date ranges

**Issue: emails not sending**

- Verify the SMTP credentials
- Check the "less secure apps" setting (Gmail)
- Use an App Password instead of your regular password
- Check the spam folder

**Issue: slow response times**

- Enable parallel scraping (already configured)
- Add timeout limits (30 seconds recommended)
- Implement caching
- Use a faster scraping service
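Several steps above hinge on the "Aggregate & Compare" node, so here is a minimal sketch of that logic. This is an assumption-laden illustration, not the shipped node: the function name `aggregateAndCompare` and the result shape `{ platform, price, currency, url }` (or `{ platform, error }` on failure) are hypothetical conventions for the per-platform scraper outputs.

```javascript
// Sketch of an Aggregate & Compare step (illustrative).
// Takes one result object per platform; failed platforms carry an `error` field.
function aggregateAndCompare(results) {
  const ok = results.filter(r => !r.error && typeof r.price === 'number');
  const failed = results.filter(r => r.error).map(r => r.platform);

  // All platforms failed: triggers the "No results" email path.
  if (ok.length === 0) {
    return { status: 'no_results', failedPlatforms: failed };
  }

  const sorted = [...ok].sort((a, b) => a.price - b.price);
  return {
    status: 'ok',
    bestDeal: sorted[0],       // cheapest platform wins
    allPrices: sorted,         // full comparison table for the email report
    failedPlatforms: failed,   // partial results: noted in the report
  };
}
```

Because failed platforms are carried through rather than dropped silently, the email report can show partial results with a note, which is the "graceful degradation" behavior described in the error-handling section.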