by Rajeet Nair
Overview

This workflow automates invoice processing directly from your email inbox. It captures invoice attachments, extracts structured data using OCR and AI, validates totals, and securely processes invoices. If issues are detected, it routes them for human review. Approved invoices are recorded in QuickBooks and logged for auditing.

How It Works

- Email Trigger: Monitors a Gmail label and downloads invoice attachments automatically.
- Configuration & Metadata: Sets the OCR API and thresholds, and captures invoice details like vendor, timestamp, and file hash.
- File Processing: Routes PDFs to a text extractor and images to an OCR API.
- AI Data Extraction: AI converts raw text into structured invoice data with confidence scores.
- Validation: Recalculates totals and verifies subtotal, tax, and final amounts.
- Data Privacy: Masks sensitive fields like PAN, GST, and bank account details.
- Review Decision: Flags invoices for review if validation fails or confidence is low.
- Human Review (if needed): Sends a Slack alert and waits for approval before proceeding.
- Accounting Integration: Creates a bill in QuickBooks for approved invoices.
- Audit Logging: Stores processing details in Google Sheets for traceability.
- Notifications: Sends a success message after processing is complete.
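The validation step can be sketched in a Code node like this. This is a minimal illustration, not the template's actual code: the field names (lineItems, subtotal, tax, total) and the tolerance default are assumptions you would adapt to your extraction schema.

```javascript
// Hypothetical sketch of the Validation step: recalculate the invoice total
// from line items and compare it to the extracted totals within a tolerance.
// Field names here are illustrative, not the template's actual schema.
function validateInvoice(invoice, tolerance = 0.01) {
  const subtotal = invoice.lineItems.reduce(
    (sum, li) => sum + li.quantity * li.unitPrice,
    0
  );
  const expectedTotal = subtotal + invoice.tax;
  // Both the subtotal and the final amount must match within tolerance.
  const ok =
    Math.abs(subtotal - invoice.subtotal) <= tolerance &&
    Math.abs(expectedTotal - invoice.total) <= tolerance;
  return { ok, expectedTotal: Number(expectedTotal.toFixed(2)) };
}
```

An invoice failing this check would then take the Review Decision branch to human review.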
Setup Instructions

1. Connect the Gmail trigger and configure the invoice label (e.g., "AP Inbox").
2. Add the OCR API endpoint in the configuration node.
3. Connect OpenAI credentials for data extraction.
4. Connect Slack for alerts and notifications.
5. Connect QuickBooks for bill creation.
6. Add the Google Sheets ID for audit logging.
7. Configure the validation tolerance and confidence threshold.
8. Test with sample invoice emails.
9. Activate the workflow.

Use Cases

- Automating accounts payable workflows
- Reducing manual invoice data entry
- Validating invoices before accounting entry
- Handling invoice approvals with human-in-the-loop
- Maintaining audit logs for compliance

Requirements

- Gmail account (with label setup)
- OpenAI API credentials
- OCR API (e.g., OCR.space or Google Vision)
- Slack workspace
- QuickBooks account
- Google Sheets (for audit logs)
- n8n instance with file handling enabled

Notes

- You can adjust validation tolerance and confidence thresholds based on your needs.
- Extend the validation logic in the Code node for stricter financial checks.
- Replace QuickBooks or Google Sheets with your preferred tools if needed.
- Ensure secure handling of sensitive financial data in production environments.
by OwenLee
In the social and behavioral sciences (e.g., psychology, sociology, economics, management), researchers and students often need to normalize academic paper metadata and extract variables before any literature review or meta-analysis. This workflow automates that busywork. Using an LLM, it processes CSV/XLSX/XLS files (exported from WoS, Scopus, EndNote, Zotero, or your own spreadsheets) into normalized metadata and extracted variables, and writes a neat table to Google Sheets. An example Google Sheet is linked in the template.

Who is this for?

- Undergraduate and graduate students or researchers in soft-science fields (psychology, sociology, economics, business)
- People who don't have time to read full papers and need quick overviews
- Anyone who wants to automate academic paper metadata normalization and variable extraction to speed up a literature review

How it works

1. Upload an academic paper file (CSV/XLSX/XLS) in chat.
2. The workflow creates a Google Sheets spreadsheet with two tabs: Checkpoint and FinalResult.
3. A structured-output LLM normalizes core metadata (title, abstract, authors, publication date, source) from the uploaded file and writes it to Checkpoint; a Gmail notification is sent when finished.
4. A second structured-output LLM uses the metadata above to extract variables (Independent Variable, Dependent Variable) and writes them to FinalResult; you'll get a second Gmail notification when done.

How to set up

Credentials:

- Google Sheets OAuth2 (read/write)
- Gmail OAuth2 (send notifications)
- Google Gemini (or any LLM you prefer)

Quick start:

1. Connect Google Sheets, Gmail, and Gemini (or your LLM) credentials.
2. Open File Upload Trigger, upload your CSV/XLSX/XLS file, and type a name in chat (used as the Google Sheets spreadsheet title).
3. Watch your inbox for status emails and open the Google Sheets spreadsheet to review Checkpoint and FinalResult.
Customization

- Journal lists: Edit the Journal Rank Classifier code node to add/remove titles. The default list is for business/management journals; swap it for a list from your own field.
- Notifications: Replace Gmail with Slack, Teams, or any channel you prefer.
- LLM outputs: Need different metadata or extracted data? Edit the LLM's system prompt and Structured Output Parser.

Notes

- Make sure your file includes abstracts. If the academic paper data you upload doesn't contain an abstract, the extracted results will be far less useful.
- CSV yields no items? Encoding mismatches can break the workflow. If this happens, convert the CSV to .xls or .xlsx and try again.

Help

Contact: owenlzyxg@gmail.com
by Cheng Siong Chin
How It Works

This workflow automates end-to-end AI-driven content moderation for platforms managing user-generated content, including marketplaces, communities, and enterprise systems. It is designed for product, trust & safety, and governance teams seeking scalable, policy-aligned enforcement without subjective scoring.

The workflow validates structured review, goal, and feedback data using a Performance Signal Agent that standardizes moderation signals and removes ambiguity. A Governance Agent then orchestrates policy enforcement, eligibility checks, escalation logic, and audit preparation. Content enters via webhook, is classified, validated, and routed by action type (approve, flag, escalate). Enforcement logic determines whether to store clean content, flag violations, or trigger escalation emails and team notifications. All actions are logged for traceability and compliance.

This template solves inconsistent moderation decisions, lack of structured governance controls, and manual escalation overhead by embedding deterministic checkpoints, structured outputs, and audit-ready logging into a single automated pipeline.

Setup Steps

1. Connect OpenAI API credentials for the AI agents.
2. Configure Google Sheets or a database for logging.
3. Connect Gmail for escalation emails.
4. Define moderation policies and routing rules.
5. Activate the webhook and test with sample content.

Prerequisites

n8n account, OpenAI API key, Google Sheets or DB access, Gmail credentials, defined moderation policies.

Use Cases

Marketplace listing moderation, enterprise HR review screening.

Customization

Adjust policy rules, add risk scoring, integrate Slack instead of Gmail.

Benefits

Improves moderation accuracy, reduces manual review, enforces governance consistency.
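The routing by action type described above can be sketched as a small dispatch function. This is an assumption-laden illustration: the `action` field name and the branch labels are hypothetical, and the real template uses n8n Switch/IF nodes rather than a single function.

```javascript
// Hedged sketch of routing by action type (approve, flag, escalate).
// Field and branch names are illustrative, not the template's actual node names.
function routeContent(decision) {
  switch (decision.action) {
    case "approve":
      return { branch: "store", notify: false };   // store clean content
    case "flag":
      return { branch: "flagLog", notify: false }; // log the violation
    case "escalate":
      return { branch: "email", notify: true };    // escalation email + team alert
    default:
      return { branch: "review", notify: true };   // unknown action -> human review
  }
}
```

Keeping a deterministic default branch for unknown actions is what makes the pipeline audit-friendly: nothing silently falls through.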
by Mohamed Abubakkar
WORKFLOW OVERVIEW

This workflow is an AI-powered business intelligence agent designed for founders and business owners. It automatically collects key business metrics, calculates performance KPIs, applies decision logic, uses AI reasoning, and sends clear, actionable notifications, all without dashboards or manual reports.

Key Features:

- Aggregates multiple data sources (MSSQL, Google Analytics, Google Sheets)
- Calculates critical KPIs: ROAS, CAC, revenue and user growth
- Applies rule-based decision logic for business risk and opportunity detection
- AI-powered reasoning: summarizes insights and recommends actions
- Multi-channel notifications: Email, WhatsApp, Slack, Telegram
- Fully automated daily execution via Cron trigger
- Enterprise-ready: error handling, structured data, KPI validation

Setup & Requirements:

- API access to data sources (MSSQL, Google Analytics, Google Sheets)
- OpenAI or Google Gemini API for AI reasoning
- Messaging integration: Gmail, Twilio (WhatsApp), Slack, Telegram

Workflow Flow:

1. Cron Trigger: runs daily at a chosen time
2. Data Collection: revenue, users, marketing spend, website analytics
3. Merge Node: combines all data sources
4. Function Node: consolidates everything into a single JSON object
5. KPI Calculation: calculates ROAS, CAC, growth rates
6. Business Logic Engine: identifies risks and opportunities
7. AI Reasoning Agent: summarizes insights, suggests actions
8. Notification Formatter: builds a founder-friendly message
9. Notification Delivery: sends via WhatsApp, Email, Slack, or Telegram

Example Data

The data below is gathered from the different channels.
`{ "revenue": 4290, "registeredUsers": 20, "totalUsers": 3, "adSpend": 800 }`

The workflow applies rule-based logic to detect potential risks or opportunities:

`{ "ROAS": 5.36, "CAC": 40, "agentStatus": "normal", "agentPriority": "low", "insights": ["Marketing campaigns are performing very well"] }`

Workflow Highlights

- Fully automated, runs daily without human intervention
- Integrates multiple business data sources
- Converts raw data into KPIs for actionable insight
- Applies both rule-based logic and AI reasoning
- Generates concise, human-friendly notifications
- Sends notifications to different channels
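The KPI Calculation step can be sketched from the example payloads above: ROAS is revenue divided by ad spend, and CAC is ad spend divided by newly registered users. The function name and rounding are assumptions; the input/output fields mirror the sample JSON.

```javascript
// Sketch of the KPI Calculation node, matching the example data above:
// ROAS = revenue / adSpend, CAC = adSpend / registeredUsers, rounded to 2 dp.
function calculateKpis({ revenue, registeredUsers, adSpend }) {
  const ROAS = Math.round((revenue / adSpend) * 100) / 100;
  const CAC = Math.round((adSpend / registeredUsers) * 100) / 100;
  return { ROAS, CAC };
}
```

Running it on the sample input (revenue 4290, adSpend 800, registeredUsers 20) reproduces the ROAS of 5.36 and CAC of 40 shown in the example output.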
by SOLOVIEVA ANNA
Overview

This workflow turns audio attachments you receive by Gmail into Japanese transcripts and structured AI summaries, then saves everything to Google Drive and Google Sheets while notifying you via Gmail and Slack. Every time an email with a voice recording arrives, the audio is stored in a dated folder, fully transcribed in Japanese, summarized into clear meeting-style points, and logged so you can quickly review and search later.

Audio Email to Japanese Transcript with AI Summary & Multi-Channel Notification

Who this is for

- People who get voice memos or meeting recordings as email attachments
- Teams that want clear Japanese transcripts plus action-item summaries from calls
- Anyone who wants audio notes automatically archived and searchable in Drive/Sheets

How it works

Trigger: New Gmail with audio attachment
A Gmail Trigger watches your inbox, downloads attachments for each new email, and passes them into the workflow.

Split & filter attachments
A Code node splits the email into one item per attachment and normalizes the binary data to binary.data. A Filter node keeps only audio files (mp3, wav, m4a, ogg) and discards everything else.

Create date-based Drive folder & upload audio
A Code node builds a YYYY/MM folder path from the current date. A Google Drive node creates that folder (if it doesn't exist) under your chosen parent folder. A Merge node combines folder info with file info, and the audio file is uploaded into that folder so all recordings are organized by year/month.

Transcribe audio to Japanese text
An HTTP Request node calls the OpenAI Audio Transcriptions API (gpt-4o-transcribe) with the audio file. The prompt tells the model to produce a verbatim Japanese transcript (no summarization, no guessing), returned as plain text.
Generate structured AI summary
The transcript is sent to an OpenAI Chat node (gpt-4o), which outputs JSON with:
- title: a short Japanese title for the recording
- points: key discussion points (array)
- decisions: decisions made (array)
- actionItems: action items with owner/deadline (array)
A Set node then formats this JSON into a Markdown summary (summaryContent) with sections for 要点 / 決定事項 / アクションアイテム.

Save transcript & summary files to Drive
The transcript text is converted into a .txt file and uploaded to the same YYYY/MM folder. The Markdown summary is converted into a .md file (e.g. xxx_summary.md) and uploaded as well. Each file is then shared in Drive so you have accessible web links to both transcript and summary.

Log to Google Sheets
A Code node collects the email subject, file name, full transcript, formatted summary, and Drive links into one JSON object. A Google Sheets node appends a new row with timestamp, subject, summary, transcript, and link so you get a running log of all processed audios.

Notify via Gmail & Slack
Finally, the workflow sends a Gmail message back to the original sender with the meeting summary and links, and posts a Slack notification in your chosen channel, including subject, file name, summary text, and Drive link.

How to set up

1. Connect your Gmail, Google Drive, Google Sheets, Slack, and OpenAI credentials in the respective nodes.
2. In the Gmail Trigger, narrow the scope if needed (e.g. specific label, sender, or inbox).
3. In the Drive nodes, set the parent folder where you want the YYYY/MM subfolders to be created.
4. In the Google Sheets node, point to your own spreadsheet and sheet name.
5. In the Slack node, select the channel where notifications should be posted.
6. Make sure your OpenAI credentials have access to both the audio transcription and chat endpoints.

Customization ideas

- Filter by sender, subject keyword, or label so only certain emails are processed.
- Change the folder structure (e.g. ProjectName/YYYY/MM or YYYY/MM/DD) in the folder-path Code node.
- Adjust the transcription prompt (e.g. allow light punctuation clean-up, use another language).
- Modify the summary format or add extra fields (e.g. meeting participants, project name) in the AI prompt and Markdown template.
- Send notifications to other tools: add branches for Notion, LINE, Teams, or additional Slack channels.
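The folder-path Code node described above can be sketched as follows. The function name is illustrative; the injectable date parameter is an addition for testability, not part of the original node.

```javascript
// Minimal sketch of the folder-path Code node: build a YYYY/MM path from
// the current date. Pass a Date explicitly when testing.
function buildFolderPath(date = new Date()) {
  const yyyy = date.getFullYear();
  const mm = String(date.getMonth() + 1).padStart(2, "0"); // zero-padded month
  return `${yyyy}/${mm}`;
}
```

To customize to ProjectName/YYYY/MM or YYYY/MM/DD, prepend the project name or append a zero-padded day in the same way.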
by Rahul Joshi
Description

This workflow is designed to evaluate newly added CVs for Diversity, Equity, and Inclusion (DEI) eligibility. It automatically ingests CVs from Google Drive, extracts key fields, analyzes them with Azure OpenAI, logs structured DEI outcomes in Google Sheets, and sends a concise DEI-focused summary email to the hiring manager. The entire flow prioritizes consistent, auditable DEI checks and controlled logic paths.

What This Template Does

- Watches Google Drive for new CV files to trigger DEI evaluation.
- Downloads and extracts text/structured fields from PDF CVs.
- Assesses DEI eligibility using Azure OpenAI, following defined criteria and prompts.
- Appends DEI results (eligible/not eligible, rationale, confidence) to Google Sheets for tracking.
- Generates and sends a DEI-focused summary email to the hiring manager for review.

Key Benefits

- Standardized DEI screening to support equitable hiring decisions.
- Centralized, structured logging in Sheets for transparency and audits.
- Automated DEI summaries for faster, consistent manager review.
- Reliable routing with true/false logic to enforce DEI evaluation steps.

Features

- Google Drive trigger (fileCreated) for CV intake tied to DEI checks.
- PDF extraction mapped to fields relevant for DEI evaluation.
- Azure OpenAI Chat Model prompts tuned for DEI criteria and rationale.
- Google Sheets append with eligibility status, notes, and timestamps.
- Email node that delivers DEI summaries and next-step guidance.
- Logic branching (true/false) to control DEI evaluation and notifications.

Requirements

- n8n instance (cloud or self-hosted).
- Google Drive access to the CV intake folder.
- Google Sheets access for DEI results logging.
- Azure OpenAI access and configured prompts reflecting DEI criteria.
- Email node credentials to send DEI summaries to managers.

Step-by-Step Setup Instructions

Connect Google Drive and select the CV folder for the fileCreated trigger.
Configure the Download CV and Extract From PDF nodes to capture the fields needed for DEI checks.
Add Azure OpenAI credentials and set DEI-specific prompts (criteria, rationale, confidence).
Connect Google Sheets and select the target sheet; map columns for status, rationale, and timestamps.
Configure the Email to Manager node with a DEI-focused subject and template.
Test with sample CVs, verify the sheet entries and email content, then enable the workflow.

DEI-Focused Best Practices

- Clarify DEI criteria and document them in your prompt and sheet schema.
- Avoid including sensitive PII in emails; store only the fields necessary for DEI decisions.
- Use n8n Credentials; never hardcode API keys or private data.
- Maintain an audit trail (timestamps, model version, prompt version, decision rationale).
- Periodically review prompts and sheet schema to align with policy updates.
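The true/false routing after the AI assessment can be sketched like this. The field names (eligible, rationale, confidence) follow the description above, but the actual node output and the confidence threshold are assumptions you would set to match your prompts.

```javascript
// Hedged sketch of the post-assessment routing: build the Sheets audit row
// and flag low-confidence results for manual review. Field names and the
// 0.7 threshold are assumptions, not the template's actual values.
function routeDeiResult(result, confidenceThreshold = 0.7) {
  const row = {
    status: result.eligible ? "eligible" : "not eligible",
    rationale: result.rationale,
    confidence: result.confidence,
    timestamp: new Date().toISOString(), // audit-trail timestamp
  };
  const needsManualReview = result.confidence < confidenceThreshold;
  return { row, needsManualReview };
}
```

Logging the confidence alongside the decision is what makes the audit trail useful when prompts or criteria change later.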
by Julian Kaiser
Automatically Scrape Make.com Job Board with GPT-5-mini Summaries & Email Digest

Overview

Who is this for? Make.com consultants, automation specialists, and freelancers who want to catch new client opportunities without manually checking the forum.

What problem does it solve? Scrolling through forum posts to find jobs wastes time. This automation finds new postings, uses AI to summarize what clients need, and emails you a clean digest.

How it works: Runs on a schedule, scrapes the Make.com professional services forum, filters jobs from the last 7 days, has AI summarize each posting, and sends a formatted email digest.

Use Cases

- Freelancers: Get daily job alerts without forum browsing, respond to opportunities faster
- Agencies: Keep sales teams informed of potential clients needing Make.com expertise
- Job Seekers: Track contract and full-time positions requiring Make.com skills

Detailed Workflow

1. Scraping: HTTP module pulls HTML from the Make.com forum job board
2. Parsing: Extracts job titles, dates, authors, and thread links
3. Filtering: Only jobs posted within the last 7 days pass through (configurable)
4. AI Processing: GPT-5-mini analyzes each post to extract project type, key requirements, complexity level, and budget/timeline (if mentioned)
5. Email Generation: Aggregates summaries into an organized HTML email with direct links
6. Delivery: Sends via SMTP to your inbox

Setup Steps

Time: ~10 minutes

Requirements:

- OpenRouter API key (get one here)
- SMTP credentials (Gmail, SendGrid, etc.)

Steps:

1. Import the template
2. Add your OpenRouter API key in the "OpenRouter Chat Model" node
3. Configure SMTP settings in the "Send email" node
4. Update the recipient email address
5. Set the schedule (recommended: daily at 8 AM)
6. Run a test to verify

Customization Tips

- Change date range: Modify the filter from 7 days to X days: {{now - X days}}
- Keyword filtering: Add a filter module to only show jobs mentioning "API", "Shopify", etc.
- AI detail level: Edit the prompt for shorter/longer summaries
- Multiple recipients: Add comma-separated emails in the Send Email node
- Different AI model: Switch to Gemini or Claude in the OpenRouter settings
- Team notifications: Add a Slack/Discord webhook instead of email
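The 7-day filtering step can be sketched as below. The `postedAt` field name is an assumption for the parsed thread date, and the injectable `now` parameter is an addition for testability.

```javascript
// Minimal sketch of the date filter: keep only posts from the last N days.
// `postedAt` is an assumed field name for the parsed posting date.
function isRecent(postedAt, days = 7, now = new Date()) {
  const ageMs = now - new Date(postedAt); // Date subtraction yields milliseconds
  return ageMs >= 0 && ageMs <= days * 24 * 60 * 60 * 1000;
}
```

Changing the `days` default is the single-line equivalent of the "Change date range" tip above.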
by Diego Alejandro Parrás
AI job application tracker and interview prep assistant

Categories: AI, Productivity, Career

Transform your job search from chaos to clarity. This workflow automatically tracks every application, researches companies, and prepares you for interviews with AI-generated materials, all saved to a visual Notion pipeline.

Benefits

- Never lose track of applications: Every job gets logged automatically with status tracking
- Walk into interviews prepared: AI generates likely questions, talking points, and company insights
- Save 2-3 hours per application: Research and prep that took hours now happens in seconds
- Automated follow-up reminders: Get notified when it's time to send that follow-up email

How It Works

1. Submit an application via form (paste the job URL) or forward your confirmation email
2. AI extracts job details: company, role, requirements, salary, location
3. Interview prep is generated: likely questions, suggested talking points, questions to ask
4. Everything saves to Notion: a visual pipeline with follow-up dates
5. Daily reminders: a Slack notification for applications needing follow-up

Required Setup

Notion Database Structure

Create a Notion database with these properties:

| Property Name | Type | Purpose |
|---------------|------|---------|
| Company | Title | Company name |
| Role | Text | Job title |
| Status | Select | Applied, Interviewing, Offer, Rejected, Ghosted |
| Applied Date | Date | When you applied |
| Salary Range | Text | Compensation info |
| Job URL | URL | Link to posting |
| Location | Text | City/Remote |
| Interview Prep | Text | AI-generated prep materials |
| Follow Up Date | Date | When to follow up |
| Requirements | Text | Key job requirements |
| Notes | Text | Your personal notes |

Credentials Needed

- OpenAI API: For job extraction and interview prep generation
- Notion: Connected to your job tracker database
- Gmail (optional): For email forwarding and confirmations
- Slack (optional): For follow-up reminders

Use Cases

- Active job seekers: Track 20+ applications without spreadsheet chaos
- Career changers: Get AI help understanding new industry requirements
- Recent graduates: Build interview confidence with generated prep materials
- Passive searchers: Keep a running list with minimal effort

Set Up Steps

1. Import the workflow into your n8n instance
2. Create the Notion database with the structure above (or duplicate the template)
3. Connect OpenAI credentials (an API key with GPT-5 access is recommended)
4. Connect Notion credentials and select your job tracker database
5. Configure the Gmail trigger (optional): set a filter for forwarded job emails
6. Set up the Slack webhook (optional): choose a channel for reminders
7. Test with a sample job posting: paste a LinkedIn or company careers page URL

Customization Tips

- Edit the interview prep prompt to mention your background/experience
- Adjust the follow-up reminder interval (default: 7 days)
- Add additional research sources (LinkedIn, Crunchbase) for richer company data
- Connect to a calendar to block interview prep time automatically

Technical Notes

- Uses Jina AI Reader for web scraping (free tier available)
- GPT-5-mini recommended for cost efficiency
- Notion text fields are limited to 2000 characters (full prep saved)
- Daily check runs at 9 AM (configurable)

Difficulty Level: Intermediate
Estimated Setup Time: 30-45 minutes
Monthly Operating Cost: ~$2-5 (based on 50 applications/month with GPT-5-mini)
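The follow-up date calculation (default: 7 days after applying) can be sketched as below. The function name is hypothetical; the date math mirrors what a Code node computing the Notion "Follow Up Date" property would do.

```javascript
// Sketch of the follow-up date calculation: applied date + interval,
// returned as a YYYY-MM-DD string (UTC) for the Notion Date property.
function followUpDate(appliedDate, intervalDays = 7) {
  const d = new Date(appliedDate);
  d.setUTCDate(d.getUTCDate() + intervalDays); // rolls over months correctly
  return d.toISOString().slice(0, 10);
}
```

Adjusting `intervalDays` is the customization tip above ("Adjust the follow-up reminder interval").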
by Rahul Joshi
Description

Automate your weekly cross-platform social media analytics workflow with AI-powered insights. This system retrieves real-time Twitter (X) and Facebook data, validates and merges the metrics, formats them via custom JavaScript, generates a visual HTML summary with GPT-4o, stores structured analytics in Notion, and broadcasts key results through Gmail and Slack, all in one seamless flow. Perfect for marketing, social media, and growth teams tracking weekly engagement trends.

What This Template Does

1. Starts on manual execution to fetch the latest performance data.
2. Collects live metrics from both the Twitter (X) API and the Facebook Graph API.
3. Merges API responses into one unified dataset for analysis.
4. Validates data completeness before processing; logs missing or invalid data to Google Sheets.
5. Uses JavaScript to normalize the data into clean JSON structures for AI analysis.
6. Leverages Azure OpenAI GPT-4o to generate a professional HTML analytics report.
7. Updates Notion's "Growth Chart" database with historical metrics for record-keeping.
8. Sends the HTML report via Gmail to the marketing or analytics team.
9. Posts a summarized Slack message highlighting key insights and platform comparisons.

Key Benefits

- Eliminates manual social media reporting with full automation.
- Ensures clean, validated data before report generation.
- Delivers visually engaging HTML performance summaries.
- Centralizes analytics storage in Notion for trend tracking.
- Keeps teams aligned with instant Slack and Gmail updates.

Features

- Dual-platform analytics integration (Twitter X + Facebook Graph).
- Custom JavaScript node for data normalization and mapping.
- GPT-4o model integration for HTML report generation.
- Real-time error logging to Google Sheets for transparency.
- Notion database update for structured performance tracking.
- Slack notifications with emoji-rich summaries and insights.
- Gmail automation for formatted weekly performance emails.
- Fully modular: easy to scale to other social platforms.

Requirements

- Twitter OAuth2 API credentials for fetching X metrics.
- Facebook Graph API credentials for retrieving page data.
- Azure OpenAI credentials for GPT-4o AI report generation.
- Notion API credentials with write access to "Growth Chart."
- Slack Bot Token with chat:write permission for updates.
- Google Sheets OAuth2 credentials for error logs.
- Gmail OAuth2 credentials to send HTML reports.

Environment Variables

- TWITTER_API_KEY
- FACEBOOK_GRAPH_TOKEN
- AZURE_OPENAI_KEY
- NOTION_GROWTH_DB_ID
- SLACK_ALERT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID
- GMAIL_MARKETING_RECIPIENTS

Target Audience

- Marketing and growth teams analyzing engagement trends.
- Social media managers tracking cross-channel performance.
- Data and insights teams needing AI-based summaries.
- Brand strategists and content teams monitoring audience health.
- Agencies and operations teams automating weekly reporting.

Step-by-Step Setup Instructions

1. Connect all required API credentials (Twitter, Facebook, Azure OpenAI, Notion, Gmail, Slack, Sheets).
2. Replace the username and page IDs in the HTTP Request nodes with your brand handles.
3. Verify the JavaScript node output structure for correct field mapping.
4. Configure the Azure GPT-4o prompt with your preferred tone and formatting.
5. Link your Notion database and confirm that the property names match (followers, likes, username).
6. Add the recipient email(s) in the Gmail node.
7. Specify your Slack channel ID for automated alerts.
8. Test-run the workflow manually to validate end-to-end execution.
9. Activate or schedule the workflow for regular weekly reporting.
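The validation step (check data completeness before the AI stage, log failures to Sheets) can be sketched as below. The required field names come from the Notion mapping mentioned in the setup steps (followers, likes, username); the function itself is an illustrative stand-in for the template's IF/Code nodes.

```javascript
// Hedged sketch of the data-completeness check: confirm the merged dataset
// has the fields the report needs; anything missing is returned so it can
// be logged to the Google Sheets error log.
function validateMetrics(merged) {
  const required = ["followers", "likes", "username"];
  const missing = required.filter(
    (f) => merged[f] === undefined || merged[f] === null
  );
  return { valid: missing.length === 0, missing };
}
```

Routing on `valid` gives the two branches described above: proceed to the GPT-4o report, or append the `missing` list to the error-log sheet.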
by Oneclick AI Squad
Simplify financial oversight with this automated n8n workflow. Triggered daily, it fetches cash flow and expense data from a Google Sheet, analyzes inflows and outflows, validates records, and generates a comprehensive daily report. The workflow sends multi-channel notifications via email and Slack, ensuring finance professionals stay updated with real-time financial insights.

Key Features

- Daily automation keeps cash flow tracking current.
- Analyzes inflows and outflows for actionable insights.
- Multi-channel alerts enhance team visibility.
- Logs maintain a detailed record in Google Sheets.

Workflow Process

1. The Every Day node triggers a daily check at a set time.
2. Get Cash Flow Data retrieves financial data from a Google Sheet.
3. Analyze Inflows & Outflows processes the data to identify trends and totals.
4. Validate Records ensures all entries are complete and accurate.
5. If the records are valid, the flow branches to Sends Email Daily Report (to finance team members) and Send Slack Alert (to notify the team instantly).
6. Logs to Sheet appends the summary data to a Google Sheet for tracking.

Setup Instructions

1. Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
2. Set the daily trigger time (e.g., 9:00 AM IST) in the "Every Day" node.
3. Test the workflow by adding sample cash flow data and verifying the reports.
4. Adjust the analysis parameters as needed for specific financial metrics.

Prerequisites

- Google Sheets OAuth2 credentials
- Gmail API key for email reports
- Slack Bot Token (with chat:write permission)
- Structured financial data in a Google Sheet

Google Sheet Structure

Create a sheet with these columns: Date, Cash Inflow, Cash Outflow, Category, Notes, Updated At.

Modification Options

- Customize the "Analyze Inflows & Outflows" node to include custom financial ratios.
- Adjust the "Validate Records" filter to flag anomalies or missing data.
- Modify the email and Slack templates with branded formatting.
- Integrate with accounting tools (e.g., Xero) for live data feeds.
- Set different trigger times to align with your financial review schedule.

Discover more workflows | Get in touch with us
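The "Analyze Inflows & Outflows" step can be sketched over rows shaped like the sheet columns above. This is a minimal illustration; the real node presumably also computes trends and per-category breakdowns.

```javascript
// Minimal sketch of the Analyze Inflows & Outflows step over sheet rows
// with the "Cash Inflow" / "Cash Outflow" columns described above.
function analyzeCashFlow(rows) {
  const inflow = rows.reduce((s, r) => s + Number(r["Cash Inflow"] || 0), 0);
  const outflow = rows.reduce((s, r) => s + Number(r["Cash Outflow"] || 0), 0);
  return { inflow, outflow, net: inflow - outflow }; // net = daily cash position
}
```

Extending the returned object with custom ratios (e.g. outflow/inflow) is the "custom financial ratios" modification mentioned above.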
by Oneclick AI Squad
This is a production-ready, end-to-end workflow that automatically compares hotel prices across multiple booking platforms and delivers beautiful email reports to users. Unlike basic building blocks, this workflow is a complete solution ready to deploy.

What Makes This Production-Ready

Complete End-to-End Automation

- Input: Natural language queries via webhook
- Processing: Multi-platform scraping & comparison
- Output: Professional email reports + analytics
- Feedback: Real-time webhook responses

Advanced Features

- Natural language processing for flexible queries
- Parallel scraping from multiple platforms
- Analytics tracking with Google Sheets integration
- Beautiful HTML email reports
- Error handling and graceful degradation
- Webhook responses for real-time feedback

Business Value

- For Travel Agencies: Instant price comparison service for clients
- For Hotels: Competitive pricing intelligence
- For Travelers: Save time and money with automated research

Setup Instructions

Step 1: Import Workflow

1. Copy the workflow JSON from the artifact
2. In n8n, go to Workflows → Import from File/URL
3. Paste the JSON and click Import

Step 2: Configure Credentials

A. SMTP Email (Required)

Settings → Credentials → Add Credential → SMTP

- Host: smtp.gmail.com (for Gmail)
- Port: 587
- User: your-email@gmail.com
- Password: your-app-password (not your regular password!)

Gmail setup:

1. Enable 2FA on your Google Account
2. Generate an App Password: https://myaccount.google.com/apppasswords
3. Use the generated password in n8n

B. Google Sheets (Optional - for analytics)

Settings → Credentials → Add Credential → Google Sheets OAuth2, then follow the OAuth flow to connect your Google account.

Sheet setup:

1. Create a new Google Sheet
2. Name the first sheet "Analytics"
3. Add headers: timestamp, query, hotel, city, checkIn, checkOut, bestPrice, platform, totalResults, userEmail
4. Copy the Sheet ID from the URL and paste it in the "Save to Google Sheets" node

Step 3: Set Up Scraping Service

You need to create a scraping API that the workflow calls. Here are your options:

Option A: Use Your Existing Python Script

Create a simple Flask API wrapper (the price-parsing variables below are placeholders you fill in from your script's output):

```python
# api_wrapper.py
from flask import Flask, request, jsonify
import subprocess

app = Flask(__name__)

@app.route('/scrape/<platform>', methods=['POST'])
def scrape(platform):
    data = request.json
    query = f"{data['checkIn']} to {data['checkOut']}, {data['hotel']}, {data['city']}"
    try:
        result = subprocess.run(
            ['python3', 'price_scrap_2.py', query, platform],
            capture_output=True, text=True, timeout=30
        )
        # Parse your script output here
        output = result.stdout
        # Assuming your script returns price data
        return jsonify({
            'price': extracted_price,   # parsed from `output`
            'currency': 'USD',
            'roomType': 'Standard Room',
            'url': booking_url,         # parsed from `output`
            'availability': True
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

Deploy:

```
pip install flask
python api_wrapper.py
```

Update the n8n HTTP Request nodes:

- URL: http://your-server-ip:5000/scrape/booking
- URL: http://your-server-ip:5000/scrape/agoda
- URL: http://your-server-ip:5000/scrape/expedia

Option B: Use Third-Party Scraping Services

Recommended services:

- ScraperAPI (scraperapi.com) - $49/month for 100k requests
- Bright Data (brightdata.com) - pay as you go
- Apify (apify.com) - has pre-built hotel scrapers

Example with ScraperAPI:

```
// In the HTTP Request node
URL: http://api.scraperapi.com
Query Parameters:
  api_key: YOUR_API_KEY
  url: https://booking.com/search?hotel={{$json.hotelName}}...
```
### Option C: Use n8n SSH Node (Like Your Original)

Keep your SSH approach but improve it:
1. Replace HTTP Request nodes with SSH nodes
2. Point to your server with the Python script
3. Ensure error handling and timeouts

```
// SSH Node Configuration
Host: your-server-ip
Command: python3 /path/to/price_scrap_2.py "{{$json.hotelName}}" "{{$json.city}}" "{{$json.checkInISO}}" "{{$json.checkOutISO}}" "booking"
```

## Step 4: Activate Webhook

1. Click on the "Webhook - Receive Request" node
2. Click "Listen for Test Event"
3. Copy the webhook URL (e.g., `https://your-n8n.com/webhook/hotel-price-check`)
4. Test with this curl command:

```bash
curl -X POST https://your-n8n.com/webhook/hotel-price-check \
  -H "Content-Type: application/json" \
  -d '{
    "message": "I want to check Marriott Hotel in Singapore from 15th March to 18th March",
    "email": "user@example.com",
    "name": "John Doe"
  }'
```

## Step 5: Activate Workflow

1. Toggle the workflow to Active
2. The webhook is now live and ready to receive requests

## Usage Examples

**Example 1: Basic Query**

```json
{
  "message": "Hilton Hotel in Dubai from 20th December to 23rd December",
  "email": "traveler@email.com",
  "name": "Sarah"
}
```

**Example 2: Flexible Format**

```json
{
  "message": "I need prices for Taj Hotel, Mumbai. Check-in: 5th January, Check-out: 8th January",
  "email": "customer@email.com"
}
```

**Example 3: Short Format**

```json
{
  "message": "Hyatt Singapore March 10 to March 13",
  "email": "user@email.com"
}
```

## Customization Options

### 1. Add More Booking Platforms

Steps:
1. Duplicate an existing "Scrape" node
2. Update the platform parameter
3. Connect it to "Aggregate & Compare"
4. Update the aggregation logic to include the new platform

### 2. Change Email Template

Edit the "Format Email Report" node's JavaScript:
- Modify the HTML structure
- Change colors (currently a purple gradient)
- Add your company logo
- Include terms and conditions

### 3. Add SMS Notifications

Using Twilio:
1. Add a new node: Twilio → Send SMS
2. Connect it after "Aggregate & Compare"
3. Format: `"Best deal: ${hotel} at ${platform} for ${price}"`

### 4. Add Slack Integration

1. Add a Slack node after "Aggregate & Compare"
2. Send to a #travel-deals channel
3. Include quick booking links

### 5. Implement Caching

Add Redis or n8n's built-in cache:

```javascript
// Before scraping, check cache
const cacheKey = `${hotelName}-${city}-${checkIn}-${checkOut}`;
const cached = await $cache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < 3600000) {
  return cached.data; // Use 1-hour cache
}
```

## Analytics & Monitoring

### Google Sheets Dashboard

The workflow automatically logs to Google Sheets. Create a dashboard with these metrics:
- Total searches per day/week
- Most searched hotels
- Most searched cities
- Average price ranges
- Platform with best prices (frequency)
- User engagement (repeat users)

Example sheet formulas:

```
// Total searches today
=COUNTIF(A:A, TODAY())

// Most popular hotel
=INDEX(C:C, MODE(MATCH(C:C, C:C, 0)))

// Average best price
=AVERAGE(G:G)
```

### Set Up Alerts

Add a node after "Aggregate & Compare":

```javascript
// Alert if prices are unusually high
if (bestDeal.price > avgPrice * 1.5) {
  // Send alert to admin
  return [{
    json: {
      alert: true,
      message: `High prices detected for ${hotelName}`
    }
  }];
}
```

## Error Handling

The workflow includes comprehensive error handling:

1. **Missing Information** - If the user doesn't provide hotel/city/dates, responds with a helpful prompt
2. **Scraping Failures** - If all platforms fail, sends a "No results" email with suggestions
3. **Partial Results** - If some platforms work, shows available results and notes errors
4. **Email Delivery Issues** - Uses `continueOnFail: true` to prevent workflow crashes

## Security Best Practices

### 1. Rate Limiting

Add rate limiting to prevent abuse:

```javascript
// In Parse & Validate node
const userEmail = $json.email;
const recentSearches = await $cache.get(`searches:${userEmail}`);
if (recentSearches && recentSearches.length > 10) {
  return [{
    json: {
      status: 'rate_limited',
      response: 'Too many requests. Please try again in 1 hour.'
    }
  }];
}
```

### 2. Input Validation

Already implemented - validates hotel names, cities, and dates.

### 3. Email Verification

Add email verification before first use:

```javascript
// Send verification code
const code = Math.random().toString(36).substring(7);
await $sendEmail({
  to: userEmail,
  subject: 'Verify your email',
  body: `Your code: ${code}`
});
```

### 4. API Key Protection

Never expose scraping API keys in responses or logs.

## Deployment Options

### Option 1: n8n Cloud (Easiest)

1. Sign up at n8n.cloud
2. Import the workflow
3. Configure credentials
4. Activate

Pros: no maintenance, automatic updates. Cons: monthly cost.

### Option 2: Self-Hosted (Most Control)

Using Docker:

```bash
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```

Using npm:

```bash
npm install -g n8n
n8n start
```

Pros: free, full control. Cons: you manage updates.

### Option 3: Cloud Platforms

- Railway.app (recommended for beginners)
- DigitalOcean App Platform
- AWS ECS
- Google Cloud Run

## Scaling Recommendations

**For < 100 searches/day:**
- Current setup is perfect
- Use n8n Cloud Starter or a small VPS

**For 100-1000 searches/day:**
- Add Redis caching (1-hour cache)
- Use a queue system for scraping
- Upgrade to n8n Cloud Pro

**For 1000+ searches/day:**
- Implement a job queue (Bull/Redis)
- Use a dedicated scraping service
- Load balance multiple n8n instances
- Consider a microservices architecture

## Troubleshooting

**Issue: Webhook not responding**
- Check that the workflow is Active
- Verify the webhook URL is correct
- Check n8n logs: Settings → Log Streaming

**Issue: No prices returned**
- Test scraping endpoints individually
- Check if the hotel name matches exactly
- Verify dates are in the future
- Try different date ranges

**Issue: Emails not sending**
- Verify SMTP credentials
- Check the "less secure apps" setting (Gmail)
- Use an App Password instead of a regular password
- Check the spam folder

**Issue: Slow response times**
- Enable parallel scraping (already configured)
- Add timeout limits (30 seconds recommended)
- Implement caching
- Use a faster scraping service
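The missing-information check described under Error Handling can be sketched as a small guard in the Parse & Validate node. This is an illustrative sketch, not the workflow's actual parser; the parsed field names (`hotelName`, `checkInISO`, etc.) are assumptions matching the SSH node configuration above.

```javascript
// Minimal sketch of the "Missing Information" guard: if hotel, city, or
// dates are absent, respond with a helpful prompt instead of scraping.
// Field names are assumptions based on the node configuration above.
function validateRequest(parsed) {
  const missing = [];
  if (!parsed.hotelName) missing.push('hotel name');
  if (!parsed.city) missing.push('city');
  if (!parsed.checkInISO) missing.push('check-in date');
  if (!parsed.checkOutISO) missing.push('check-out date');
  if (missing.length > 0) {
    return {
      status: 'incomplete',
      response: `Please provide: ${missing.join(', ')}. ` +
        'Example: "Marriott Hotel in Singapore from 15th March to 18th March"',
    };
  }
  // ISO date strings compare correctly as plain strings
  if (parsed.checkOutISO <= parsed.checkInISO) {
    return { status: 'invalid', response: 'Check-out must be after check-in.' };
  }
  return { status: 'ok' };
}
```

Returning a structured `status` lets the downstream IF node branch to the "helpful prompt" email without throwing.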
by Oneclick AI Squad
This enterprise-grade n8n workflow automates the entire event planning lifecycle, from client briefs to final reports, using Claude AI, real-time financial data, and smart integrations. It converts raw client data into optimized, insight-driven event plans with cost savings, risk management, and automatic reporting, all with zero manual work.

## Key Features

- **Multi-source data fusion** from Google Sheets (ClientBriefs, BudgetEstimates, ActualCosts, VendorDatabase)
- **AI-powered orchestration** using **Claude 3.5 Sonnet** for event plan optimization
- **Automatic ROI and variance analysis** with cost-saving insights
- **Vendor intelligence**: ranks suppliers by cost, rating, and reliability
- **Risk engine**: computes event risk (probability × impact)
- **Auto-approval logic** for safe, high-ROI events
- **Multi-channel delivery:** Slack + Email + Google Sheets
- **Audit-ready:** full JSON plan + execution logs
- **Scalable triggers:** webhook or daily schedule

## Workflow Process

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | Orchestrate Trigger | Runs daily at 7 AM or via webhook (`/event-orchestrate`) |
| 2 | Read Client Brief | Loads event metadata from the ClientBriefs sheet |
| 3 | Read Budget Estimates | Fetches estimated budgets and vendor data |
| 4 | Read Actual Costs | Loads live cost data for comparison |
| 5 | Read Vendor Database | Pulls vendor pricing, reliability, and rating |
| 6 | Fuse All Data | Merges data into a unified dataset |
| 7 | Data Fusion Engine | Calculates totals, variances, and validates inputs |
| 8 | AI Orchestration Engine | Sends structured prompt to Claude AI for analysis |
| 9 | Parse & Finalize | Extracts JSON, computes ROI, risks, and savings |
| 10 | Save Orchestrated Plan | Updates OrchestratedPlans sheet with results |
| 11 | Team Sync | Sends status & summary to Slack |
| 12 | Executive Report | Emails final interactive plan to event planner |

## Setup Instructions

### 1. Import Workflow

1. Open n8n → Workflows → Import from Clipboard
2. Paste the JSON workflow

### 2. Configure Credentials

| Integration | Details |
| ----------- | ------- |
| Google Sheets | Service account with spreadsheet access |
| Claude AI | Anthropic API key for `claude-3-5-sonnet-20241022` |
| Slack | Webhook or OAuth app |
| Email | SMTP or Gmail OAuth credentials |

### 3. Update Spreadsheet IDs

Ensure your Google Sheets include:
- ClientBriefs
- BudgetEstimates
- ActualCosts
- VendorDatabase
- OrchestratedPlans

### 4. Set Triggers

- **Webhook:** `/webhook/event-orchestrate`
- **Schedule:** daily at 7:00 AM

### 5. Run a Test

Use manual execution to confirm:
- Sheet updates
- Slack notifications
- Email delivery

## Google Sheets Structure

**ClientBriefs**

| eventId | clientName | eventType | attendees | budget | eventDate | plannerEmail | spreadsheetId | teamChannel | priority |
| ------- | ---------- | --------- | --------- | ------ | --------- | ------------ | ------------- | ----------- | -------- |
| EVT-2025-001 | Acme Corp | Conference | 200 | 75000 | 2025-06-15 | sarah@acme.com | 1A... | #event-orchestration | High |

**BudgetEstimates**

| category | item | budgetAmount | estimatedCost | vendor |
| -------- | ---- | ------------ | ------------- | ------ |
| Venue | Grand Ballroom | 20000 | 22500 | Luxe Events |

**ActualCosts**

| category | actualCost |
| -------- | ---------- |
| Venue | 23000 |

**VendorDatabase**

| vendorName | category | avgCost | rating | reliability |
| ---------- | -------- | ------- | ------ | ----------- |
| Luxe Events | Venue | 21000 | 4.8 | High |

**OrchestratedPlans**

Automatically filled with: `eventId`, `savings`, `roi`, `riskLevel`, `status`, `summary`, `fullPlan` (JSON)

## System Requirements

| Requirement | Version/Access |
| ----------- | -------------- |
| n8n | v1.50+ (LangChain supported) |
| Claude AI API | `claude-3-5-sonnet-20241022` |
| Google Sheets API | `https://www.googleapis.com/auth/spreadsheets` |
| Slack Webhook | Required for notifications |
| Email Service | SMTP, Gmail, or SendGrid |

## Optional Enhancements

- Add PDF export for management reports
- Connect Google Calendar for event scheduling
- Integrate CRM (HubSpot / Salesforce) for client updates
- Add interactive Slack buttons for approvals
- Export results to Notion or Airtable
- Enable multi-event batch orchestration
- Add forecasting from past data trends

**Result:** A single automated system that plans, analyzes, and reports events, with full AI intelligence and zero manual work.

Explore More AI Workflows: https://www.oneclickitsolution.com/contact-us/
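The variance and risk calculations performed by the Data Fusion Engine and Parse & Finalize steps (7 and 9 in the process table) can be sketched as below. The formulas, thresholds, and field names are illustrative assumptions based on the sheet columns above; the actual logic lives in the workflow's Code nodes.

```javascript
// Sketch of the variance math (step 7), assuming rows shaped like the
// BudgetEstimates and ActualCosts sheets above.
function analyzeBudget(estimates, actuals) {
  const estimated = estimates.reduce((sum, row) => sum + row.estimatedCost, 0);
  const actual = actuals.reduce((sum, row) => sum + row.actualCost, 0);
  const variance = actual - estimated;             // positive = over budget
  const variancePct = estimated ? (variance / estimated) * 100 : 0;
  return { estimated, actual, variance, variancePct };
}

// Sketch of the risk engine: risk = probability × impact, as the
// feature list describes. Thresholds here are assumptions.
function riskScore(probability, impact) {
  return probability * impact;
}

function riskLevel(score) {
  if (score >= 3) return 'High';
  if (score >= 1.5) return 'Medium';
  return 'Low';
}
```

For the sample rows above (estimated 22500, actual 23000), this yields a variance of 500, roughly 2.2% over budget, which the auto-approval logic could treat as within tolerance.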