by Daniel Shashko
This workflow automates the process of monitoring multiple RSS feeds, intelligently identifying new articles, maintaining a record of processed content, and delivering timely notifications to a designated Slack channel. It leverages AI to ensure only truly new and relevant articles are dispatched, preventing duplicate alerts and information overload. 🚀

## Main Use Cases

- **Automated News Aggregation:** Continuously monitor industry news, competitor updates, or specific topics from various RSS feeds. 📈
- **Content Curation:** Filter and deliver only new, unprocessed articles to a team or personal Slack channel. 🎯
- **Duplicate Prevention:** Maintain a persistent record of seen articles to avoid redundant notifications. 🛡️
- **Enhanced Information Delivery:** Provide a streamlined and intelligent way to stay updated without manual checking. 📧

## How it works

The workflow operates in distinct, interconnected phases to ensure efficient and intelligent article delivery:

### 1. RSS Feed Data Acquisition 📥

- **Initiation:** The workflow is manually triggered to begin the process. 🖱️
- **RSS Link Retrieval:** It connects to a Baserow database to fetch a list of configured RSS feed URLs. 🔗
- **Individual Feed Processing:** Each RSS feed URL is then processed independently. 🔄
- **Content Fetching & Parsing:** An HTTP Request node downloads the raw XML content of each RSS feed, which is then parsed into a structured JSON format for easy manipulation. 📄➡️🌳

### 2. Historical Data Management 📚

- **Seen Articles Retrieval:** Concurrently, the workflow queries another Baserow table to retrieve a comprehensive list of article GUIDs or links that have been previously processed and notified. This forms the basis for duplicate detection. 🔍

### 3. Intelligent Article Filtering with AI 🧠

- **Data Structuring for AI:** A Code node prepares the newly fetched articles and the list of already-seen articles into a specific JSON structure required by the AI Agent. 🏗️
- **AI-Powered Filtering:** An AI Agent, powered by an OpenAI Chat Model and supported by a Simple Memory component, receives this structured data. It is precisely prompted to compare the new articles against the historical "seen" list and return only those articles that are genuinely new and unprocessed. 🤖
- **Output Validation:** A Structured Output Parser ensures that the AI Agent's response adheres to a predefined JSON schema, guaranteeing data integrity for subsequent steps. ✅
- **JSON Cleaning:** A final Code node takes the AI's raw JSON string output, parses it, and formats it into individual n8n items, ready for notification and storage (see the sketch at the end of this listing). 🧹

### 4. Notification & Record Keeping 🔔

- **Persistent Record:** For each newly identified article, its link is saved to the Baserow "seen products" table, marking it as processed and preventing future duplicate notifications. 💾
- **Slack Notification:** The details of the new article (title, content, link) are then formatted and sent as a rich message to a specified Slack channel, providing real-time updates. 💬

**Summary Flow:** Manual Trigger → RSS Link Retrieval (Baserow) → HTTP Request → XML Parsing | Seen Articles Retrieval (Baserow) → Data Structuring (Code) → AI-Powered Filtering (AI Agent, OpenAI, Memory, Parser) → JSON Cleaning (Code) → Save Seen Articles (Baserow) → Slack Notification 🎉

## Benefits

- **Fully Automated:** Eliminates manual checking of RSS feeds and Slack notifications. ⏱️
- **Intelligent Filtering:** Leverages AI to accurately identify and deliver only new content, avoiding duplicates. 💡
- **Centralized Data Management:** Utilizes Baserow for robust storage of RSS feed configurations and processed article history. 🗄️
- **Real-time Alerts:** Delivers timely updates directly to your team or personal Slack channel. ⚡
- **Scalable & Customizable:** Easily adaptable to monitor various RSS feeds and integrate with different Baserow tables and Slack channels. ⚙️

## Setup Requirements

- **Baserow API Key:** Required for accessing and updating your Baserow databases. 🔑
- **OpenAI API Key:** Necessary for the AI Agent to function. 🤖
- **Slack Credentials:** Either a Slack OAuth token (recommended for full features) or a Webhook URL for sending messages. 🗣️
- **Baserow Table Configuration:**
  - A table with an `rssLink` column to store your RSS feed URLs.
  - A table with a `Nom` column to store the links of processed articles.

For any questions or further assistance, feel free to connect with me on LinkedIn: https://www.linkedin.com/in/daniel-shashko/
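To illustrate the "JSON Cleaning" step described above, here is a minimal sketch of what such a Code node could look like, assuming the AI Agent's reply lands in an `output` field as a JSON string with an `articles` array (both field names are assumptions, not the template's confirmed schema):

```javascript
// Hypothetical "JSON Cleaning" Code node (Run Once for All Items):
// parse the agent's raw JSON string and emit one n8n item per new article.
const out = [];
for (const item of $input.all()) {
  const raw = item.json.output; // assumed field holding the agent's JSON string
  let parsed;
  try {
    parsed = typeof raw === 'string' ? JSON.parse(raw) : raw;
  } catch (err) {
    parsed = { articles: [] }; // skip unparseable output instead of failing the run
  }
  for (const article of parsed.articles ?? []) {
    out.push({ json: article });
  }
}
return out;
```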
by Massimo Bensi
Automate Google News Digests with AI & Gmail Approval Workflow in n8n

## Overview

This n8n automation template shows how to automatically collect and format daily Google News digests on your chosen topics, send them to your Gmail inbox for quick approval, and—if declined—generate the next set of curated news articles until you approve.

⚠️ **Disclaimer:** This workflow template uses community nodes and works only on n8n self-hosted instances.

## Use case

Streamline content curation for social media, newsletters, or blog posts by scheduling an AI-formatted Google News workflow that saves time in finding trending stories.

## How to use

- ⏰ Schedule the automation to run at your preferred time.
- 🔎 Fetch the latest trending Google News on your selected topic with the SerpApi integration.
- 🤖 Send articles in batches of 10 to an AI content formatter that generates clean HTML output.
- 📧 Receive an approval email in your Gmail inbox with the AI-formatted news digest.
- ❌ Decline the digest to trigger the next batch of 10 curated news articles until you approve.
- 📊 Workflow logic uses Airtable counters and a custom Code node to manage batching (see the sketch at the end of this listing).

## Setup instructions

1. Connect your SerpApi, Airtable, OpenAI, and Gmail accounts.
2. In the Gmail node, set the variable `$env.EMAIL_ADDRESS_ME` or replace the "To" field with your email.
3. In Airtable, create a free-tier base with two columns: `WorkflowID` and `Counter`. The workflow will manage row creation and deletion automatically.
4. Define your news topic or keyword in the SerpApi "Search Query (q)" field.
5. Run the workflow and check your Gmail inbox for your curated AI-powered news digest.

## Requirements

- Airtable account
- Gmail account
- SerpApi account
- OpenAI account

## Customising this workflow

- ⏱ Adjust the schedule in the "Schedule Trigger" node for daily, weekly, or custom timing.
- 🔑 Enter your niche news keyword in the "Search Query (q)" field of the SerpApi node.
- 📦 Change the batch size (default 10) inside the Code node "Extract Details."
- 🎨 Personalize the Gmail approval email template inside the AI Agent node "Prepare Content Review Email" for branding or formatting preferences.
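To make the batching logic concrete, here is a minimal sketch of what the "Extract Details" step could do, assuming the fetched articles arrive as an `articles` array and the Airtable counter as a `Counter` field (both names are assumptions, not the template's exact schema):

```javascript
// Hypothetical batching sketch: use the Airtable-backed counter to select
// the next batch of 10 articles each time a digest is declined.
const BATCH_SIZE = 10; // template default; change here to adjust batch size
const data = $input.first().json;
const articles = data.articles ?? [];
const counter = Number(data.Counter) || 0; // batches already sent
const start = counter * BATCH_SIZE;
return articles
  .slice(start, start + BATCH_SIZE)
  .map(article => ({ json: { ...article, batch: counter + 1 } }));
```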
by Wessel Bulte
Automatically Back Up Your n8n Workflows to OneDrive

This workflow automates the backup of your self-hosted n8n instance by exporting all workflows and saving them as individual `.json` files to a designated OneDrive folder. Each file is timestamped for easy versioning and audit tracking. After a successful backup, the workflow optionally cleans up old backup files and sends a confirmation email to notify you that the process completed.

## How it works

1. Uses the HTTP Request node to fetch all workflows via the `/rest/workflows` API.
2. Iterates through each workflow using SplitInBatches.
3. Converts each workflow to a `.json` file using Set and Function nodes (see the sketch at the end of this listing).
4. Uploads each file to a target Microsoft OneDrive folder using OAuth2.
5. Deletes old backup files from OneDrive after upload, with a configurable retention period.
6. Sends an email notification once all backups have completed successfully.

## Setup instructions

1. Enter your n8n base URL and authentication details in the HTTP Request node.
2. Set up Microsoft OneDrive OAuth2 credentials for the cloud upload.
3. Configure the Email node with SMTP credentials to receive the backup confirmation.
4. (Optional) Adjust the file retention logic to keep backups for a set duration.
5. Add a Cron trigger to schedule the workflow automatically (e.g., daily or weekly).

👉 Sticky notes inside the workflow explain each step for easy setup.

## Need help?

🔗 LinkedIn – Wessel Bulte
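For orientation, here is a minimal sketch of the conversion step, assuming each incoming item is one workflow object returned by `/rest/workflows` (the template's actual Set/Function node setup may differ):

```javascript
// Hypothetical Code node: turn each workflow into a timestamped .json
// binary file that the OneDrive upload node can consume.
const stamp = new Date().toISOString().replace(/[:.]/g, '-');
const results = [];
for (const item of $input.all()) {
  const wf = item.json;
  const fileName = `${wf.name ?? 'workflow'}_${stamp}.json`;
  const buffer = Buffer.from(JSON.stringify(wf, null, 2), 'utf8');
  results.push({
    json: { fileName },
    binary: {
      data: await this.helpers.prepareBinaryData(buffer, fileName, 'application/json'),
    },
  });
}
return results;
```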
by Shady Ahmed
## 📌 Description

Automate your course enrollment process with this workflow that handles student submissions, evaluates eligibility, and sends acceptance or rejection emails — all without manual effort. It's perfect for instructors managing multi-week technical courses who want to streamline onboarding and communication.

## ⚙️ How It Works

- 📥 Captures student registration data via an n8n Form Trigger
- 📊 Evaluates responses (e.g., checks programming background & availability)
- 📤 Sends automated, personalized acceptance or rejection emails based on criteria
- 📝 Logs submission outcomes for review
- 📨 Optionally stores records in Google Sheets, Airtable, or a database (customizable)

## 🛠️ Set Up Steps

1. 🔗 Connect Gmail (or your preferred email service)
2. ✅ Add your course filtering logic to the decision node (simple JSON rules; see the sketch at the end of this listing)
3. 📄 Customize email templates (plain or HTML)
4. 🧪 Test the flow with sample submissions

⏱️ **Setup Time:** 10–15 minutes (depending on integrations)

## 🔐 Notes

- No hardcoded API keys used – all credentials must be set up using the n8n credential system
- Sticky notes inside the workflow provide detailed setup and customization tips
- Easily extendable to add payment links, WhatsApp alerts, or CRM integration
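As a starting point for the decision logic, here is a minimal sketch of an eligibility check. The form field names (`programming_background`, `hours_per_week`) and the thresholds are invented placeholders you would swap for your own form and criteria:

```javascript
// Hypothetical eligibility rules for the decision node; field names and
// thresholds are placeholders, not the template's actual values.
const answers = $input.first().json;
const hasBackground = answers.programming_background !== 'None';
const hasTime = Number(answers.hours_per_week) >= 5;
return [{
  json: {
    ...answers,
    accepted: hasBackground && hasTime, // route acceptance vs. rejection email on this flag
  },
}];
```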
by Rahul Joshi
## Description

Keep your product and project teams perfectly aligned by automatically syncing task dependencies between Jira and Monday.com. This workflow ensures real-time visibility into cross-platform blockers and dependencies, allowing smoother delivery across multiple teams and tools. 🔄📅💼

## What This Template Does

1. Trigger the workflow on a schedule or manual run.
2. Fetch project tasks and dependencies from Jira.
3. Retrieve matching items from Monday.com based on linked project IDs or issue keys.
4. Compare dependencies between both systems (see the sketch at the end of this listing).
5. Identify mismatched or missing dependencies across platforms.
6. Send summarized reports to Slack or Gmail for team visibility.
7. Optionally update Monday.com or Jira items with dependency status tags.

## Key Benefits

✅ Maintain alignment across multiple projects and teams.
✅ Detect and resolve dependency conflicts before they cause delays.
✅ Automate visibility — no more manual cross-checking.
✅ Simplify multi-tool management for product and engineering leads.

## Features

- Integration between Jira Cloud and the Monday.com API
- Cross-dependency comparison logic
- Scheduled or manual execution
- Slack/Gmail notifications for updates or conflicts
- Custom mapping for project and issue identifiers

## Requirements

- Jira Cloud account with API credentials
- Monday.com API key or OAuth2 token
- Optional: Slack or Gmail credentials for notifications
- n8n instance (cloud or self-hosted)

## Target Audience

- Product and Project Managers coordinating across tools 🧩
- Engineering Leads overseeing multi-platform sprints ⚙️
- PMOs managing dependencies across cross-functional teams 📊
- Operations teams aiming for unified delivery visibility 📈

## Step-by-Step Setup Instructions

1. Connect your Jira and Monday.com credentials in n8n.
2. Map project identifiers or keys between Jira and Monday.com.
3. (Optional) Configure Slack or Gmail for daily status alerts.
4. Adjust the cron expression to match your monitoring schedule.
5. Run the workflow once manually to validate mappings.
6. Activate the workflow for ongoing dependency tracking. ✅
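Step 4 is the heart of the workflow. Here is a minimal sketch of what the comparison could look like in a Code node, assuming Jira dependencies arrive as issue-key pairs and Monday.com dependencies have already been mapped to the same shape (all field names here are assumptions for illustration):

```javascript
// Hypothetical dependency comparison: find Jira dependency links that have
// no counterpart on the Monday.com board.
const data = $input.first().json;
const jiraDeps = data.jiraDependencies ?? [];     // e.g. [{ from: 'PROJ-1', to: 'PROJ-2' }]
const mondayDeps = data.mondayDependencies ?? []; // same shape, mapped from board columns

const key = d => `${d.from}->${d.to}`;
const mondaySet = new Set(mondayDeps.map(key));

// Anything present in Jira but absent from Monday.com is a mismatch to report
const missingInMonday = jiraDeps.filter(d => !mondaySet.has(key(d)));
return [{ json: { missingInMonday, checked: jiraDeps.length } }];
```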
by Yang
## Who's it for

This template is perfect for SEO writers, niche bloggers, and content marketers who want to generate high-quality blog posts from a single keyword without spending hours on research and writing. If you often find yourself stuck at the research stage or manually drafting blog content, this workflow automates the entire process from topic discovery to publication.

## What it does

The workflow takes a keyword, performs a Google search using Dumpling AI, analyzes the top-ranking pages and People Also Ask (PAA) questions, and then uses GPT-4 to generate a detailed blog post based on the most valuable question. The blog draft is sent for approval via email, and once approved, it's automatically published to WordPress.

Here's what happens step by step:

1. Receives a keyword through a simple form
2. Uses Dumpling AI to perform a Google search and extract:
   - Top 2 organic search results
   - People Also Ask (PAA) questions and answers
   - Top related searches
3. Filters for insightful PAA questions
4. Sends the data to GPT-4 to generate a blog post in JSON format
5. Emails the draft blog post for manual review and approval
6. If approved, publishes the post automatically to WordPress

## How it works

- **Form Trigger:** Captures the keyword input
- **Dumpling AI:** Searches Google and extracts SEO data including top results, PAA, and related searches
- **Code Node:** Processes the raw search data into a structured format for GPT-4 (see the sketch at the end of this listing)
- **Filter Node:** Checks if PAA questions are available
- **GPT-4:** Chooses a strong PAA question and writes the blog post
- **Gmail:** Sends the draft blog post to your inbox for review
- **Approval Node:** Waits for manual approval
- **WordPress:** Publishes the approved post automatically

## Requirements

- ✅ Dumpling AI API key stored securely as credentials
- ✅ OpenAI GPT-4 credentials
- ✅ Gmail account with OAuth2 connected to n8n
- ✅ WordPress account with API credentials configured

## How to customize

- Edit the GPT-4 prompt to control the blog structure, tone, or style
- Add extra filters to select specific types of PAA questions (e.g., how-to, guides)
- Change the review recipient email in the Gmail node
- Add additional formatting or SEO optimization steps before publishing
- Integrate with Notion, Airtable, or Slack to log or notify team members after publication

> This workflow turns a single keyword into a fully researched, GPT-4 generated, and auto-published blog post — helping you scale content creation efficiently while maintaining quality.
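To show the shape of the Code node's job, here is a minimal sketch that condenses a search payload into a compact structure for the GPT-4 prompt. The field names follow common search-API shapes and are assumptions, not Dumpling AI's confirmed schema:

```javascript
// Hypothetical Code node: condense the raw search payload into a compact
// structure for the GPT-4 prompt. All field names are illustrative.
const data = $input.first().json;
const structured = {
  keyword: data.keyword,
  topResults: (data.organic_results ?? []).slice(0, 2).map(r => ({
    title: r.title,
    snippet: r.snippet,
    url: r.link,
  })),
  paaQuestions: (data.people_also_ask ?? []).map(q => ({
    question: q.question,
    answer: q.answer,
  })),
  relatedSearches: (data.related_searches ?? []).map(s => s.query),
};
return [{ json: structured }];
```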
by Stephan Koning
## Who it's for

Construction and renovation businesses that need to generate detailed quotes from customer inquiries—plasterers, painters, contractors, renovation specialists, or any construction service provider handling quote requests through online forms.

## What it does

Automatically transforms JotForm submissions into professional, itemized construction quotes with complete CRM tracking—no subscription needed (saving €200–500/year). When a customer fills out your project request form (specifying wall/ceiling areas, finish types, ceiling heights, wet areas, prep work), the workflow extracts measurements, normalizes service selections, applies intelligent pricing rules from your Supabase catalog, calculates line items with material and labor costs plus proper VAT handling, stores everything in a structured CRM pipeline (customer → project deal → estimate), and generates a branded HTML email ready for delivery.

This self-hosted pricing engine replaces paid invoicing software for quote generation, saving thousands over time while eliminating manual takeoffs: quote preparation drops from 30–60 minutes to under 30 seconds.

## How it works

- **Stage 1:** JotForm webhook triggers → Parser extracts project data (m² measurements, service types, property details) → Normalize Dutch construction terms to database values → Save the raw submission for an audit trail
- **Stage 2:** Upsert customer record (idempotent on email) → Create project deal → Link to form submission
- **Stage 3:** Fetch active pricing rules → Calculate line items based on square meters, service type (smooth plaster vs. decorative), ceiling height premiums, property status (new build vs. renovation), and wet area requirements → Apply conditional logic (high ceilings = price multiplier, prep work charges, finish level) → Group duplicate items → Save estimate header + individual lines (see the sketch at the end of this listing)
- **Stage 4:** Query an optimized view (single call, all data) → Generate a professional HTML email with logo, itemized services table (description, m², unit price, totals), VAT breakdown, CTA buttons, and legal disclaimer

## Setup requirements

- **Supabase account** (free tier sufficient) - Database for CRM + pricing catalog
- **JotForm account** (free tier works) - Form builder with webhook support
- **Email service** - Gmail, SendGrid, or similar (add your own email node)

## How to set up

**1. Database setup (2 minutes):**
- Run this workflow's "SQL Generator" node to output the complete schema
- Copy the output → paste it in the Supabase SQL Editor → click Run
- This creates 9 tables + 1 optimized view + sample construction services (plastering €21–32/m², painting €12–15/m², ornamental work, ceiling finishes)

**2. Credentials:**
- Add Supabase credentials to n8n (Project URL + Service Role Key from Supabase Settings → API)
- No JotForm credentials needed (uses a webhook)

**3. JotForm webhook:**
- Clone the demo construction form: [JotForm Stucco Planet demo](https://form.jotform.com/252844786304060)
- Form fields: property type, postcode, services needed, wall/ceiling m², finish level, ornament quantities, molding meters, wet areas, ceiling heights, prep removal, start date, customer contact
- Settings → Integrations → Webhooks → add your n8n webhook URL
- Test with a preview submission

**4. Customize email:**
- Update company info in the "Generate Email HTML" node (logo, business address, contact details, Chamber of Commerce number, VAT number)
- Adjust colors/branding in the HTML template
- Available in Dutch and English versions

## How to customize

Add your own construction services by editing the `price_catalog` table in Supabase (no code changes):

```sql
INSERT INTO price_catalog (item_code, name, unit_price, vat_rate, unit_type)
VALUES ('DRYWALL_INSTALL', 'Drywall Installation', 18.50, 9, 'm²');
```
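Stage 3's conditional pricing can be pictured as a small calculation over the fetched catalog rows. Here is a minimal sketch with assumed rule and submission field names and an assumed multiplier value; the real rules live in the Supabase `price_catalog` table and are configurable there:

```javascript
// Hypothetical Stage 3 line-item calculation. Field names and the
// multiplier are illustrative assumptions, not the template's exact values.
const sub = $input.first().json;      // parsed form submission
const rules = sub.pricingRules ?? []; // active rows fetched from Supabase

const HIGH_CEILING_MULTIPLIER = 1.15; // assumed premium for ceilings above 3 m
const lines = [];
for (const rule of rules) {
  const area = Number(sub[rule.measurement_field]) || 0; // e.g. wall m²
  if (area === 0) continue; // service not requested on this submission
  let unitPrice = Number(rule.unit_price);
  if (Number(sub.ceilingHeight) > 3) unitPrice *= HIGH_CEILING_MULTIPLIER;
  const net = area * unitPrice;
  lines.push({
    json: {
      item_code: rule.item_code,
      quantity: area,
      unit_price: unitPrice,
      net,
      vat: net * (rule.vat_rate / 100),
    },
  });
}
return lines;
```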
by Oneclick AI Squad
This automated n8n workflow streamlines the process of receiving, processing, and delivering patient-friendly lab reports with precautionary advice.

## 🏆 Minimal But Complete Design

**Node Flow:**

- 📧 **Email Trigger** → Monitors inbox for lab reports
- 📄 **PDF Extract** → Processes attachments & extracts content
- 🤖 **AI Simplify** → Converts medical jargon to simple language
- ✨ **Format Response** → Creates a beautiful patient-friendly layout (see the sketch at the end of this listing)
- 📤 **Send Report** → Delivers the simplified report via email

## 🚀 Key Features

✅ **Automatic Processing**
- Monitors email for lab report PDFs
- Extracts content from attachments
- No manual intervention needed

✅ **AI-Powered Simplification**
- Converts complex medical terms to plain English
- Explains what each test result means
- Adds ✅/⚠️ indicators for normal/abnormal results

✅ **Patient-Friendly Output**
- Professional HTML email formatting
- Clear sections: Summary, Results, Precautions
- Includes next steps and follow-up advice

✅ **Built-in Safety**
- Always includes medical disclaimers
- Encourages consulting healthcare providers
- Handles edge cases with fallbacks

## 🛠️ Setup Requirements

**APIs needed:**
- **IMAP Email** (Gmail, Outlook, etc.)
- **Ollama AI Model** (local medical AI)
- **SMTP Email** (sending service)

**Quick configuration:**
1. Import the JSON into n8n
2. Set up email credentials (IMAP + SMTP)
3. Configure the Ollama medical model
4. Test with a sample lab report

## 📋 Sample Output

🩺 Your Lab Report - Simplified

✅ CHOLESTEROL: 180 mg/dL - Normal! Good job maintaining healthy levels.

⚠️ BLOOD SUGAR: 126 mg/dL - Slightly high. Normal is under 100. Consider reducing sugar intake.

🔬 VITAMIN D: 25 ng/mL - Low. You may need supplements. Ask your doctor.

📋 PRECAUTIONS:
• Eat more fruits and vegetables
• Exercise 30 minutes daily
• Schedule follow-up in 3 months
• Watch for: excessive thirst, fatigue
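To show how the ✅/⚠️ indicators could be attached, here is a minimal sketch of a "Format Response" step. The result shape (`name`, `value`, `unit`, `low`, `high`) is an assumption for illustration; the template's actual AI output may be structured differently:

```javascript
// Hypothetical "Format Response" Code node: flag each result against an
// assumed reference range before building the patient-friendly email.
const results = $input.first().json.results ?? []; // assumed shape: [{ name, value, unit, low, high }]
const lines = results.map(r => {
  const abnormal = r.value < r.low || r.value > r.high;
  const status = abnormal ? '⚠️' : '✅';
  return `${status} ${r.name}: ${r.value} ${r.unit}`;
});
return [{ json: { summary: lines.join('\n') } }];
```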
by Vlad Arbatov
## Summary

Send a number to your Telegram bot (e.g., `2`) and get a neatly formatted digest of all Gmail newsletters received since that date. Each email is summarized by an LLM into concise topics, merged into a single Telegram message, automatically split into chunks to fit Telegram limits, and safely formatted as HTML.

## What this workflow does

- Triggers on your Telegram message containing a number of days, e.g., 1, 2, 7
- Fetches all Gmail messages since that date using a custom search query, optionally filtered by senders
- Retrieves and decodes each email's HTML, subject, sender name, and date
- Prompts an LLM (GPT-4.1-mini) to produce a consistent JSON summary of topics per email
- Merges topics from all emails into a single digest
- Builds a readable, enumerated message (with bold titles)
- Splits it into 3,500-char parts and sanitizes Markdown into Telegram-safe HTML (see the chunking sketch at the end of this listing)
- Sends the digest to your Telegram chat with previews disabled

## Apps and credentials

- Gmail OAuth2: Gmail account
- Telegram: Telegram account (bot)
- OpenAI: OpenAI account

## Typical use cases

- Personal or team daily/weekly newsletter digests in Telegram
- Curated feeds from selected senders compiled on demand
- Lightweight knowledge briefings without leaving Telegram

## How it works (node-by-node)

1. **Telegram Trigger:** Waits for your message (e.g., "2"). Chat ID is restricted to your Telegram ID for safety.
2. **Get days (Code):** Takes the numeric `daysAgo` from the Telegram message text and computes `YYYY/MM/DD` for Gmail's `after:` filter.
3. **Get many messages (Gmail → getAll, returnAll: true):** Uses a custom `q` filter like `=(from:@.com) OR (from:@.com) OR (from:@.com -"__") after:{{ $json.dateString }}` and returns a list of message IDs.
4. **Loop Over Items (Split in Batches):** Iterates through each message ID.
5. **Get a message (Gmail → get):** Retrieves the full message/payload for the current email.
6. **Get message data (Code):** Extracts HTML from Gmail's payload (body/parts), normalizes the sender to just the name, formats the date as `DD.MM.YYYY`, and passes `html`, `subject`, `from`, `date` forward.
7. **Clean (Code):** Converts `DD.MM.YYYY` → `MM.DD` (for prompt brevity) and passes `html`, `subject`, `from`, `date` to the LLM.
8. **Message a model (OpenAI, model: gpt-4.1-mini, JSON output):** The prompt instructs the model to produce JSON `{ "topics": [ { "title", "descr", "subject", "from", "date" } ] }`, split multi-news blocks into separate topics, combine or ignore specific blocks for particular senders (placeholders `__`), and keep the subject untranslated while other values are in the `__` language. Injects subject/from/date/html from the current email.
9. **Loop Over Items (continues):** After all iterations complete, the aggregated per-email results are available.
10. **Merge (Code):** Flattens the `topics` arrays from all processed emails into one combined topics list.
11. **Create TG message (Code):** Renders an enumerated list: title (bold), short description, original subject, "From — Date".
12. **Split (Code):** Splits into 3,500-character chunks to stay below Telegram's 4,096 limit with HTML overhead.
13. **Sanitize (Code):** Escapes `&`, `<`, `>`, fixes unbalanced `*` and `_`, and converts basic Markdown markers to Telegram HTML.
14. **Send a message (Telegram):** Sends each part with `parse_mode=HTML`, previews disabled.

## Node map

| Node | Type | Purpose |
|---|---|---|
| Telegram Trigger | Trigger | Receive daysAgo command from Telegram |
| Get days | Code | Compute Gmail after:YYYY/MM/DD from daysAgo |
| Get many messages | Gmail (getAll) | Search emails since date with custom from: filters |
| Loop Over Items | Split in Batches | Iterate messages one-by-one |
| Get a message | Gmail (get) | Fetch full message payload |
| Get message data | Code | Extract HTML/subject/from/date; normalize sender and date |
| Clean | Code | Reformat date and forward fields to LLM |
| Message a model | OpenAI | Summarize email into JSON topics |
| Merge | Code | Merge topics from all emails |
| Create TG message | Code | Build human-friendly digest text |
| Split | Code | Chunk into 3,500-char parts |
| Sanitize | Code | Escape HTML and map Markdown to Telegram HTML |
| Send a message | Telegram | Deliver digest to Telegram chat |

## Before you start

- Create a Telegram bot and get its token (via @BotFather)
- Get your Telegram user ID to restrict access
- Connect Gmail OAuth2 in n8n
- Add your OpenAI API key
- Import the provided workflow JSON into n8n

## Setup instructions

**1) Telegram**
- Telegram Trigger node: `additionalFields.chatIds` = your Telegram user ID
- Send a message node: `chatId` = your Telegram user ID, `parse_mode` = HTML, `disable_web_page_preview` = true

**2) Gmail**
- Connect a Gmail OAuth2 credential (Gmail account)
- In Get many messages, adjust `filters.q` to your senders and rules. Example: `=(from:news@publisher.com) OR (from:briefs@media.com -"promo") after:{{ $json.dateString }}`
- If needed, add `label:` or `category:` filters

**3) OpenAI**
- Message a model: model gpt-4.1-mini (can swap to gpt-4o-mini or your preferred model)
- Update the prompt placeholders: `__` language → your target language; `__` sender rules → your special cases (combine blocks, ignore sections)

**4) Safety and formatting**
- Keep `parse_mode=HTML` in Telegram
- The Sanitize node is designed for `<b>` and `<i>` only; avoid other HTML tags
- The Split node uses 3,500 chars per part to stay safely under Telegram limits

## How to use

- In Telegram, send a number indicating "days ago"
- Example: `2` → will query Gmail after the date 2 days ago
- The workflow compiles and returns a digest in your chat
- Rerun anytime with a new number

## Customization ideas

- Labels instead of global search: `q = label:Newsletters after:{{ $json.dateString }}`
- Time window control: add `before:` or exact date ranges
- Different language: set the `__` language in the LLM prompt
- Model choice: swap to cheaper/faster models if volume is high
- Chunk size: adjust from 3,500 to your needs
- Formatting: tweak Create TG message to include links parsed from HTML (if you add an HTML parser step)

## Limits and notes

- Telegram messages are limited to ~4,096 characters; we chunk to 3,500 per part
- Gmail `after:` uses YYYY/MM/DD and Google's interpretation of dates; your n8n server time influences the computed date
- LLM usage incurs cost and latency proportional to email size and count
- HTML extraction is robust for typical Gmail structures but may need tweaks for exotic MIME layouts

## Privacy and safety

- Emails are sent to OpenAI for summarization—ensure that's acceptable for your data policies
- The Telegram Trigger restricts chat access; keep your `chatIds` locked down
- Avoid sending raw HTML to Telegram; rely on the Sanitize node

## Sample output format (Telegram)

1. **Bold topic title**
   One-sentence description
   Original Subject Line → Sender Name — DD.MM.YYYY
2. **Next topic title**
   ...

## Tips and troubleshooting

- Got empty digests? Check Gmail `filters.q` and make sure there really are emails after the computed date
- Model errors or empty JSON? Lower prompt complexity or switch model
- HTML formatting issues in Telegram? Ensure `parse_mode=HTML` and keep only `<b>`, `<i>`
- Long messages not fully delivered? Reduce chunk size from 3,500

## Tags

gmail, telegram, openai, llm, newsletters, digest, summarization, automation

## Changelog

- v1: Initial release with sender filters, topic merging, Telegram HTML sanitization, and on-demand time window via Telegram message
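To make the chunking concrete, here is a minimal sketch of the "Split" step, assuming the composed digest arrives in a `message` field (the field name is an assumption). Cutting on line boundaries keeps Telegram HTML tags like `<b>...</b>` from being sliced in half:

```javascript
// Hypothetical "Split" Code node: break the digest into parts of at most
// 3,500 characters, cutting on line boundaries.
const LIMIT = 3500; // leaves headroom under Telegram's 4,096-char cap
const text = $input.first().json.message; // assumed field from "Create TG message"

const parts = [];
let current = '';
for (const line of text.split('\n')) {
  // (assumes no single line exceeds LIMIT)
  if (current && current.length + line.length + 1 > LIMIT) {
    parts.push(current);
    current = line;
  } else {
    current = current ? `${current}\n${line}` : line;
  }
}
if (current) parts.push(current);

// One n8n item per Telegram message part
return parts.map((part, i) => ({ json: { part, index: i + 1, total: parts.length } }));
```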
by shae
## How it works

This AI Customer Success Risk Prediction workflow revolutionizes customer retention by predicting churn risk 30-90 days before it happens. Here's the high-level flow:

Daily Data Collection → AI Multi-Signal Analysis → Risk Scoring & Prediction → Smart Risk Routing → AI-Generated Personalized Interventions → CRM Updates & Team Alerts

The system automatically gathers data from your product analytics, support system, billing platform, and email tools, then uses GPT-4 to analyze patterns and predict which customers are at risk. It creates personalized intervention strategies and routes them based on urgency level (see the routing sketch at the end of this listing).

## Set up steps

**Time to set up:** approximately 45 minutes

**Prerequisites:** active accounts with your analytics platform, support system, billing provider, CRM, and AI provider

**Step 1: Import & Configure Workflow (5 minutes)**
1. Import the workflow JSON into your n8n instance
2. Review the 3 comprehensive sticky notes for context
3. Understand the AI analysis logic and intervention strategies

**Step 2: Set Environment Variables (10 minutes)**

Configure these critical variables:
- `ANALYTICS_API_URL` and `ANALYTICS_API_KEY`
- `HIGH_RISK_SLACK_CHANNEL` (for critical alerts)
- `CS_TEAM_EMAIL` (intervention sender)
- `CRM_BASE_URL` and `CALENDAR_BOOKING_URL`

**Step 3: Configure API Credentials (20 minutes)**

Set up secure credential connections for:
- OpenAI/Anthropic API (AI analysis engine)
- Analytics platform (Mixpanel/Amplitude/GA)
- Support system (Zendesk/Intercom)
- Billing platform (Stripe/Chargebee)
- HubSpot CRM (risk data storage)
- Slack API (team notifications)
- SMTP/SendGrid (email delivery)

**Step 4: Customize AI Prompts & Risk Thresholds (8 minutes)**
1. Review and adjust the AI analysis prompts for your business
2. Modify risk score thresholds (Critical 90+, High 70-89, Medium 40-69)
3. Customize intervention email templates and tone
4. Set your specific risk factors (usage patterns, support indicators)

**Step 5: Test & Activate (2 minutes)**
1. Run a test execution with sample customer data
2. Verify the AI analysis generates appropriate risk scores
3. Check that interventions are routed correctly
4. Activate the daily cron schedule
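Using the template's default thresholds from Step 4, the smart risk routing can be sketched as a small tiering step. The `riskScore` field name and the per-tier actions in the comments are assumptions for illustration:

```javascript
// Hypothetical risk-routing Code node using the template's default thresholds.
const customer = $input.first().json;
const score = Number(customer.riskScore) || 0; // assumed output of the AI analysis

let tier;
if (score >= 90) tier = 'critical';      // e.g. immediate Slack alert + CSM outreach
else if (score >= 70) tier = 'high';     // e.g. personalized intervention email
else if (score >= 40) tier = 'medium';   // e.g. nurture sequence
else tier = 'low';                       // no action needed

// Downstream Switch/IF nodes can branch on this tier
return [{ json: { ...customer, tier } }];
```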
by n8n Automation Expert | Template Creator | 2+ Years Experience
🎯 Smart Job Hunter Pro - AI-Powered Multi-Platform Job Automation

Transform your job search with this comprehensive n8n workflow that automatically searches, analyzes, and applies to relevant positions across multiple job platforms. Perfect for developers, engineers, and tech professionals looking to streamline their job hunting process.

## ✨ Key Features

- 🔄 **Multi-Platform Job Search:** Simultaneously searches the Jooble, JobStreet, Indeed, and WhatJobs APIs
- 🤖 **AI-Powered Job Analysis:** Uses Google Gemini AI to analyze job compatibility and generate tailored cover letters
- 📊 **Smart Scoring System:** Automatically scores job matches based on your skills and requirements
- 📝 **Auto-Apply Threshold:** Only applies to jobs above your specified compatibility score (see the filter sketch below)
- 📋 **Notion Integration:** Automatically tracks applications in an organized Notion database
- 💬 **Telegram Notifications:** Real-time alerts for high-match job opportunities
- ☁️ **Google Drive Storage:** Saves personalized cover letters for each application
- ⚠️ **Error Handling:** Comprehensive error tracking with Telegram notifications
- ⏰ **Automated Scheduling:** Runs every 8 hours to find fresh opportunities

## 🛠 What This Workflow Does

1. **Scheduled Search:** Automatically searches multiple job platforms every 8 hours
2. **Data Normalization:** Standardizes job data from different API sources
3. **AI Analysis:** Gemini AI evaluates each job posting against your skills profile
4. **Smart Filtering:** Only processes jobs above your compatibility threshold (default: 75%)
5. **Application Tracking:** Creates detailed records in Notion with match scores and status
6. **Instant Alerts:** Sends Telegram notifications for promising opportunities
7. **Cover Letter Generation:** AI creates personalized cover letters for each position
8. **Document Management:** Automatically saves all cover letters to Google Drive

## 🔧 Required Integrations

- **Job APIs:** Jooble API, WhatJobs API (JobStreet & Indeed use web scraping)
- **AI Service:** Google Gemini API for job analysis
- **Productivity:** Notion database for application tracking
- **Communication:** Telegram bot for notifications
- **Storage:** Google Drive for cover letter management

## 💡 Perfect For

- **Software Developers** seeking JavaScript, React, Node.js positions
- **Full-Stack Engineers** wanting automated job discovery
- **Tech Professionals** needing organized application tracking
- **Remote Workers** searching across multiple platforms
- **Career Changers** looking for systematic job hunting

## 🎛 Customizable Variables

- **Job Keywords:** Define your target roles and skills
- **Location & Radius:** Set geographic search parameters
- **Auto-Apply Threshold:** Control compatibility score requirements
- **Results Limit:** Adjust the number of jobs per platform
- **Schedule Frequency:** Modify search intervals

## 📈 Benefits

- **Save 10+ hours weekly** on manual job searching
- **Never miss opportunities** with automated monitoring
- **Professional application tracking** with detailed analytics
- **Personalized cover letters** for every application
- **Instant notifications** for high-match positions
- **Complete audit trail** of all job search activities

## 🚀 Getting Started

1. Import the workflow into your n8n instance
2. Configure API credentials for all job platforms
3. Set up the Notion database with the provided template structure
4. Create a Telegram bot and a Google Drive folder
5. Customize the job search parameters for your profile
6. Activate the workflow and start receiving opportunities!
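The auto-apply threshold boils down to a one-step filter. Here is a minimal sketch, assuming the AI analysis stores its score in a `matchScore` field (an assumption for illustration, not the workflow's confirmed field name):

```javascript
// Hypothetical compatibility filter: pass only jobs scored at or above
// the auto-apply threshold (template default: 75).
const THRESHOLD = 75;
return $input.all().filter(item => (item.json.matchScore ?? 0) >= THRESHOLD);
```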
## 📝 Additional Notes

- Uses placeholder credentials for security (`{{PLACEHOLDER_API_KEY}}`)
- Comprehensive error handling prevents workflow failures
- Includes detailed setup instructions via sticky notes
- Optimized for the Indonesian job market (JobStreet.co.id)
- Easily adaptable to other regions and job types

Perfect for developers, engineers, and automation enthusiasts who want to leverage AI and n8n's power to dominate their job search process! 🚀
by Meak
Local Lead Finder + Cold Email Sender (Form → Apify → AI → Gmail + Google Sheets)

Fill a short form with business type, location, and how many leads you want. This workflow finds local businesses, grabs a valid email from each website, writes a cold email in your chosen style, sends it, and logs everything to Google Sheets.

## Benefits

- Simple form input (business type, location, lead count, email style)
- Finds local businesses with Apify and filters only those with a website
- Scrapes one best email address from each site
- Writes cold emails with AI (style: Friendly / Professional / Simple)
- Sends via Gmail and updates Google Sheets with status + send time

## How It Works

1. **Form Submit:** Enter business type, location, lead number, email style.
2. **Find Leads (Apify):** Search places in the target location (up to your lead count).
3. **Filter:** Keep only results that include a website (see the sketch at the end of this listing).
4. **Extract Email (AI):** Visit the website and pull one valid email address.
5. **Save Lead (Sheets):** Add company name, category, website, phone, address, email.
6. **Generate Email (AI):** Create subject + body using your selected style.
7. **Send (Gmail):** Email the scraped address; retry-safe.
8. **Log Result (Sheets):** Mark "Cold Mail Status = ✅" and add the send time.
9. **Batching/Wait:** Process leads one by one with a short wait to avoid limits.

## Who Is This For

- Agencies doing local outreach
- Freelancers offering lead gen services
- SMBs testing cold email in a city or niche

## Setup

1. Connect the built-in Form Trigger (use the provided fields)
2. Add the Apify actor endpoint + token
3. Add Google Gemini (for email extraction) and OpenAI (for email writing) keys
4. Connect Gmail OAuth2 to send emails
5. Connect Google Sheets (Spreadsheet ID + Sheet1)

## ROI & Monetization

- Spin up targeted campaigns in minutes (no manual research)
- Sell as a local lead-gen + outreach package ($500–$2k/campaign)
- Reuse the form for any niche and city to scale quickly

## Strategy Insights

In the full walkthrough, I show how to:

- Tune the Apify search for better niches and categories
- Improve email extraction prompts for higher-quality addresses
- Adjust templates for short, compliant cold emails
- Add fallbacks (no email → skip or save for manual review)

## Check Out My Channel

For more AI outreach workflows that get real results, check out my YouTube channel, where I share the exact setups I use to win clients and scale to $20k+ monthly revenue.
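As referenced in the "How It Works" list, here is a minimal sketch of the website filter, assuming the Apify actor returns each place with a `website` field (typical for place-scraper actors, but verify against your actor's actual output):

```javascript
// Hypothetical filter step: keep only Apify results with a usable website URL.
return $input.all().filter(item => {
  const site = item.json.website; // assumed field name from the Apify actor
  return typeof site === 'string' && /^https?:\/\//.test(site);
});
```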