by Dr. Firas
💥 Automated Google Maps Business Lead Scraping (Email, Phone, Website) using Apify

🧠 AI-Powered Business Prospecting Workflow (Google Maps + Email Enrichment)

**Who is this for?**
This workflow is designed for entrepreneurs, sales teams, marketers, and agencies who want to automate lead discovery and build qualified business contact lists — without manual searching or copying data. It's perfect for anyone seeking an AI-driven prospecting assistant that saves time, centralizes business data, and stays fully compliant with GDPR.

**What problem is this workflow solving?**
Manually searching for potential clients, copying their details, and qualifying them takes hours — and often leads to messy spreadsheets. This workflow automates the process by:
- Gathering publicly available business information from Google Maps
- Enriching that data with AI-powered summaries and contact insights
- Compiling it into a clean, ready-to-use Google Sheet database

This means you can focus on closing deals, not collecting data.

**What this workflow does**
This automation identifies, analyzes, and organizes business opportunities in just a few steps:
1. **Telegram Trigger** → Send a message specifying your business type, number of leads, and Google Maps URL (see the parsing sketch below).
2. **Apify Integration** → Fetches business information from Google Maps (public data).
3. **Duplicate Removal** → Ensures clean, non-redundant results.
4. **AI Summarization (GPT-4)** → Generates concise business summaries for better understanding.
5. **Email Extraction (GPT-4)** → Finds and extracts professional contact emails from company websites.
6. **Google Sheets Integration** → Automatically stores results (name, category, location, phone, email, etc.) in a structured sheet.
7. **Telegram Notification** → Confirms when all businesses are processed.

All data is handled ethically and transparently — only from public sources and without any unsolicited contact.

**Setup**
- **Telegram**: Create a Telegram bot via BotFather, then copy the API token into the Telegram Trigger node credentials.
- **Apify**: Create an account on Apify, get your API token, and connect it to the "Run Google Maps Scraper" node.
- **Google Sheets**: Connect your Google account under the "Google Maps Database" node, then specify the target spreadsheet and worksheet name.
- **OpenAI**: Add your OpenAI API key to the AI nodes ("Company Summary Info" and "Extract Business Email").
- **Test**: Send a Telegram message like: `restaurants, 5, https://www.google.com/maps/search/restaurants+in+Paris`

**How to customize this workflow to your needs**
- **Change the search region or business type** by modifying the Telegram input message format.
- **Adjust the number of leads** via the `maxCrawledPlacesPerSearch` parameter in Apify.
- **Add filters or enrichments** (e.g., websites with social links, review counts, or opening hours).
- **Customize AI summaries** by tweaking the prompt inside the "Company Summary Info" node.
- **Integrate CRM tools** like HubSpot or Pipedrive by adding a connector after the Google Sheets node.

⚙️ **Expected Outcome**
✅ A clean, enriched, and ready-to-use Google Sheet of businesses with:
- Name, category, address, and city
- Phone number and website
- AI-generated business summary
- Extracted professional email (if available)

✅ Telegram confirmation once all businesses are processed
✅ Fully automated, scalable, and GDPR-compliant prospecting workflow

💡 This workflow provides a transparent, ethical way to streamline your B2B lead research while staying compliant with privacy and anti-spam regulations.
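To illustrate the Telegram input step, here is a minimal sketch of a Code node ("Run Once for All Items") placed right after the Telegram Trigger. It splits a message such as `restaurants, 5, https://www.google.com/maps/search/restaurants+in+Paris` into the values the Apify node needs; the field names `businessType` and `mapsUrl` are illustrative, not necessarily the ones used inside the template.

```javascript
// Sketch only: parse "<business type>, <number of leads>, <Google Maps URL>"
const text = $input.first().json.message?.text ?? '';
const [businessType, leadCount, mapsUrl] = text.split(',').map(s => s.trim());

if (!businessType || !mapsUrl) {
  throw new Error('Expected format: "<business type>, <number of leads>, <Google Maps URL>"');
}

return [{
  json: {
    businessType,
    maxCrawledPlacesPerSearch: Number(leadCount) || 5, // fed to the Apify actor input
    mapsUrl,
  },
}];
```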
🎥 Watch This Tutorial

👋 Need help or want to customize this?
📩 Contact: LinkedIn | 📺 YouTube: @DRFIRASS | 🚀 Workshops: Mes Ateliers n8n | 📄 Documentation: Notion Guide
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Accelerate your research analysis with this Automated Research Intelligence System! This workflow uses AI and web scraping to analyze research papers and articles, extracting key insights, validating content quality, and generating comprehensive research documents. Perfect for research teams, academics, and AI enthusiasts staying current with the latest developments in artificial intelligence and machine learning.

**What This Template Does**
- Triggers via form submission for on-demand research URL analysis.
- Validates URL accessibility and prepares for processing.
- Uses the Decodo scraper to extract research content from target URLs.
- Analyzes research papers with AI for comprehensive understanding.
- Validates summaries for accuracy, completeness, and relevance.
- Generates key insights and actionable takeaways from research.
- Creates professional Google Docs with formatted research summaries.
- Evaluates research quality with an AI-powered rating system.
- Saves all research to Google Sheets for historical tracking.
- Sends Slack alerts for high-quality research findings (9+ rating).

**Key Benefits**
- Automated research analysis saves hours of manual reading time
- AI-powered insights extraction from complex research papers
- Quality validation ensures accurate and relevant summaries
- Centralized research database for team collaboration
- Real-time alerts for breakthrough research findings
- Professional documentation automatically generated

**Features**
- Form-based trigger for easy research submission
- URL validation and accessibility checking
- AI-powered research analysis and summarization
- Decodo web scraping for reliable content extraction
- Multi-stage validation for accuracy and relevance
- Automated Google Docs report generation
- Quality assessment with structured rating system
- Google Sheets integration for research tracking
- Slack notifications for premium research findings
- Quality threshold filtering for optimal results

**Requirements**
- Decodo API credentials for research scraping
- OpenAI API credentials for AI analysis
- Google Docs OAuth2 credentials for document creation
- Google Sheets OAuth2 credentials with edit access
- Slack Bot Token with chat:write permission
- Environment variables for configuration settings

**Target Audience**
- AI research teams and data scientists
- Academic researchers and university labs
- Machine learning engineers and developers
- Technology innovation teams
- Research and development departments
- Content creators in the AI/ML space

**Step-by-Step Setup Instructions**
1. Connect Decodo API credentials for research scraping functionality
2. Set up OpenAI credentials for AI analysis and quality assessment
3. Configure Google Docs for automated research document generation
4. Add Google Sheets credentials for research tracking and history
5. Set up Slack credentials for high-quality research alerts
6. Customize quality thresholds for the research rating (default: 6+ for processing, 9+ for alerts; see the sketch below)
7. Test with sample research URLs to verify analysis and formatting
8. Deploy the form for team access to research analysis requests
9. Monitor the research database for trends and insights

Pro Tip: Use coupon code "YARON" to get 23K requests for testing

This workflow transforms complex research into actionable intelligence with automated analysis, quality validation, and professional documentation!
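As a reference for setup step 6, here is a minimal sketch of how the two thresholds could be applied in a Code node before the document and alert branches. The field name `qualityRating` and the flag `sendSlackAlert` are illustrative assumptions, not the template's actual names.

```javascript
// Sketch only: keep items rated 6+, flag items rated 9+ for the Slack branch
const PROCESS_THRESHOLD = 6; // minimum rating to create docs and log to Sheets
const ALERT_THRESHOLD = 9;   // minimum rating to also send a Slack alert

return $input.all()
  .filter(item => (item.json.qualityRating ?? 0) >= PROCESS_THRESHOLD)
  .map(item => ({
    json: {
      ...item.json,
      sendSlackAlert: item.json.qualityRating >= ALERT_THRESHOLD,
    },
  }));
```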
by Joe Marotta
**What This Flow Does**
An automated stock portfolio analysis system that performs comprehensive fundamental and technical analysis of your portfolio holdings on a scheduled basis, with intelligent follow-up capabilities.

**How It Works**
Two-Phase Analysis System:

Monday Analysis (main weekly analysis)
- Reads your stock holdings from Google Sheets
- Performs deep fundamental analysis using Claude AI with web search
- Conducts technical analysis with current market data
- Combines both analyses into final buy/sell/hold recommendations
- Emails you a comprehensive analysis report

Wednesday Follow-up (interactive refinement)
- Sends a midweek check-in email asking for additional input
- If you reply with documents, questions, or market observations, runs a supplemental analysis incorporating your feedback
- Updates recommendations based on new information and market changes
- Delivers the refined analysis via email

**Key Features**
- Fractional share support - handles both whole and fractional stock positions (see the sketch below)
- Web-enabled AI analysis - Claude AI searches current market data, news, and earnings
- Dual-analyst approach - separate fundamental and technical analysis for comprehensive coverage
- Interactive feedback loop - Wednesday follow-ups allow you to guide the analysis
- Professional email reports - formatted HTML emails with actionable recommendations

**Setup Steps**
1. Google Sheets: Duplicate the provided template and fill it with your investment information
2. Gmail OAuth: Connect your Gmail account for sending reports
3. Anthropic API: Add Claude AI credentials for analysis
4. Replace placeholders: Update YOUR_EMAIL@gmail.com, YOUR_GOOGLE_SHEETS_ID, and the webhook IDs
5. Schedule configuration: Currently set for Monday 12pm EST analysis and Wednesday 12pm EST follow-up

**Use Case**
Perfect for part-time investors who want systematic, AI-powered analysis of their portfolio with the flexibility to provide additional context and refinements throughout the week.
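For the fractional-share handling mentioned above, a minimal sketch of reading holdings rows from the sheet could look like the following. The column names `Ticker`, `Shares`, and `Cost Basis` are illustrative; match them to whatever the template's sheet actually uses.

```javascript
// Sketch only: normalize sheet rows, allowing fractional share counts like 2.375
return $input.all().map(({ json }) => {
  const shares = parseFloat(json.Shares);            // fractional values are fine
  const costBasis = parseFloat(json['Cost Basis']);  // per-share cost
  return {
    json: {
      ticker: String(json.Ticker).toUpperCase(),
      shares,
      costBasis,
      totalCost: +(shares * costBasis).toFixed(2),   // passed along to the analyst prompts
    },
  };
});
```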
by Cheng Siong Chin
**How It Works**
Scheduled triggers initiate automated contract reviews. The system fetches documents from cloud storage and email, then uses AI to extract key terms, obligations, and compliance requirements. Multi-model parsing identifies gaps, inconsistencies, and potential risks. A scoring engine evaluates severity and routes alerts to the appropriate channels (see the sketch below). The workflow then updates the CLM system and produces audit-ready documentation for tracking and governance.

**Setup Instructions**
- Storage: Configure access to your Google Drive or webhook-based document repository.
- Email: Connect Gmail to automatically ingest contract-related emails.
- AI Extraction: Add the OpenAI API key and define extraction prompts for obligations and terms.
- CLM System: Enter credentials for your contract lifecycle management platform.
- Alerts: Set up Google Sheets logging and connect dashboard endpoints for risk and compliance alerts.

**Prerequisites**
Cloud storage access; Gmail credentials; OpenAI API key; CLM system credentials; document processing license

**Use Cases**
Contract renewal tracking; compliance audits; risk management; vendor agreement reviews; regulatory adherence monitoring

**Customization**
Adjust risk thresholds; modify extraction rules; add Slack notifications; extend compliance frameworks

**Benefits**
Reduces review time by 80%; catches compliance gaps; automates audit trails
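A minimal sketch of the scoring-and-routing idea, assuming the AI extraction step returns a `findings` array with a severity label. The weights, thresholds, and route names below are illustrative, not the template's actual values.

```javascript
// Sketch only: aggregate finding severities into a score and pick an alert route
const WEIGHTS = { low: 1, medium: 3, high: 5 };

return $input.all().map(({ json }) => {
  const findings = json.findings ?? []; // gaps, inconsistencies, risks
  const score = findings.reduce((sum, f) => sum + (WEIGHTS[f.severity] ?? 0), 0);

  let route = 'log-only';
  if (score >= 10) route = 'alert-compliance';   // dashboard + compliance channel
  else if (score >= 5) route = 'notify-owner';   // contract owner only

  return { json: { ...json, riskScore: score, route } };
});
```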
by moosa
n8n Wizard 🪄 – Your personal AI assistant inside WhatsApp This workflow transforms WhatsApp into a powerful personal AI using n8n + Green-API. Send text or voice messages — the assistant understands intent and handles daily tasks automatically. Key features 💰 Expense & income tracking — record spending, view summaries & category breakdowns (Google Sheets, append-only) ✓ Google Tasks management — create, list, update, delete tasks & reminders 🐦 Post to X/Twitter — write and publish single tweets or short threads 📧 Gmail search & summaries — find recent/unread emails by sender, label, keyword (read-only) 🌐 Real-time answers — current weather, news, exchange rates, facts via web search 🧮 Quick calculations — math, percentages, currency conversions 🎤 Full voice support — incoming voice messages transcribed (Whisper), replies can be spoken (TTS) How it works Green-API webhook receives message (text or audio) Voice → transcribed automatically Main intelligent router agent selects one sub-agent/tool Action executed → result sent back as text or voice (if input was voice) Setup requirements Green-API instance (webhook + send endpoints) OpenAI API key (chat, Whisper, TTS) Google Sheets, Google Tasks, Twitter/X, Gmail (read scope), SerpAPI credentials Strict routing rules prevent misuse — no deletions, no guessing values, one tool per clear intent. Start commanding: “spent 3200 on groceries”, “remind dentist tomorrow”, “tweet: loving n8n!”, “weather in Lahore now”
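To make the "one tool per clear intent" rule concrete, here is a minimal sketch of deterministic routing. In the actual template this decision is made by the router agent's prompt rather than by code, and the keywords and tool names below are purely illustrative.

```javascript
// Sketch only: map an incoming message to exactly one tool, or fall back to chat
const text = ($input.first().json.messageText ?? '').toLowerCase();

const routes = [
  { tool: 'expense_tracker', test: /\bspent\b|\bincome\b/ },
  { tool: 'google_tasks',    test: /\bremind\b|\btask\b/ },
  { tool: 'post_tweet',      test: /^tweet:/ },
  { tool: 'gmail_search',    test: /\bemail\b|\binbox\b|\bunread\b/ },
  { tool: 'web_search',      test: /\bweather\b|\bnews\b|\brate\b/ },
];

const match = routes.find(r => r.test.test(text));
return [{ json: { tool: match ? match.tool : 'fallback_chat', originalText: text } }];
```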
by Cheng Siong Chin
How It Works This workflow automates end-to-end carbon emissions monitoring, strategy optimisation, and ESG reporting using a multi-agent AI supervisor architecture in n8n. Designed for sustainability managers, ESG teams, and operations leads, it eliminates the manual effort of tracking emissions, evaluating reduction strategies, and producing compliance reports. Data enters via scheduled pulls and real-time webhooks, then merges into a unified feed processed by a Carbon Supervisor Agent. Sub-agents handle monitoring, optimisation, policy enforcement, and ESG reporting. Approved strategies are auto-executed or routed for human sign-off. Outputs are consolidated and pushed to Slack, Google Sheets, and email, keeping all stakeholders informed. The workflow closes the loop from raw sensor data to actionable ESG dashboards with minimal human intervention. Setup Steps Connect scheduled trigger and webhook nodes to your emissions data sources. Add credentials for Slack (bot token), Gmail (OAuth2), and Google Sheets (service account). Configure the Carbon Supervisor Agent with your preferred LLM (OpenAI or compatible). Set approval thresholds in the Check Approval Required node. Map Google Sheets document ID for ESG report and KPI dashboard nodes. Prerequisites OpenAI or compatible LLM API key Slack bot token Gmail OAuth2 credentials Google Sheets service account Use Cases Corporate sustainability teams automating monthly ESG reporting Customisation Swap LLM models per agent for cost or accuracy trade-offs Benefits Eliminates manual emissions data aggregation and report generation
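A minimal sketch of the "merge into a unified feed" step described above, assuming scheduled pulls and webhook payloads both arrive as items at the same point in the workflow. All field names are illustrative assumptions about the incoming emissions data.

```javascript
// Sketch only: normalize scheduled and webhook emissions records into one shape
return $input.all().map(({ json }) => ({
  json: {
    source: json.sensorId ? 'webhook' : 'scheduled-pull',
    site: json.site ?? json.facility ?? 'unknown',
    scope: json.scope ?? 'scope2',                       // GHG Protocol scope, if provided
    co2eTonnes: Number(json.co2eTonnes ?? json.emissions ?? 0),
    recordedAt: json.timestamp ?? new Date().toISOString(),
  },
}));
```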
by Cheng Siong Chin
How It Works This workflow automates AI decision governance by tracing, assessing, and auditing automated decisions for risk and compliance. Designed for AI governance officers, compliance teams, and regulated industries, it addresses the critical need for explainability and accountability in AI-driven decisions. A schedule trigger initiates a simulated decision request, which is processed by a Decision Trace Agent to extract metadata. A Governance Agent then delegates to Risk Assessment and Compliance Checker sub-agents. Decisions are routed by risk level—high-risk cases trigger Slack alerts and are stored separately—while all outcomes are merged into a governance report sent via email, with full audit trail and explainability report stored for regulatory review. Setup Steps Set schedule trigger interval to match governance audit frequency. Add OpenAI API credentials to all OpenAI Model nodes. Configure Slack credentials and set high-risk alert channel. Add Gmail/SMTP credentials to Send Governance Report node. Replace simulated decision request with live AI system webhook. Prerequisites Slack workspace with bot token Gmail or SMTP credentials Google Sheets or database for audit storage Use Cases Regulatory compliance auditing for AI-driven loan or insurance decisions Automated fairness and bias detection in HR or admissions systems Customization Swap simulated input with live AI system API or decision log feed Add sub-agents for fairness, bias, or sector-specific compliance checks Benefits Automates end-to-end AI decision auditing on a schedule Ensures high-risk decisions are flagged and stored instantly
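A minimal sketch of the risk-level routing, assuming the Risk Assessment sub-agent returns a `riskLevel` field. The flag names and Slack channel are illustrative, not the template's own.

```javascript
// Sketch only: flag high-risk decisions so downstream IF nodes can alert and store them separately
return $input.all().map(({ json }) => {
  const isHighRisk = (json.riskLevel ?? 'low') === 'high';
  return {
    json: {
      ...json,
      isHighRisk,                                          // an IF node can branch on this flag
      alertChannel: isHighRisk ? '#ai-governance-alerts' : null,
      storeSeparately: isHighRisk,                         // high-risk cases kept in their own store
    },
  };
});
```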
by Pake.AI
Overview This workflow converts a single topic into a full blog article through a structured multi-step process. Instead of generating everything in one pass, it breaks the task into clear stages to produce cleaner structure, better SEO consistency, and more predictable output quality. How this workflow differs from asking ChatGPT directly It does not produce an article in one step. It separates the process into two focused stages: outline generation and paragraph expansion. This approach gives you more control over tone, SEO, structure, and keyword placement. How it works 1. Generate outline The workflow sends your topic to an AI Agent. It returns a structured outline based on the topic, desired depth, language, and keyword focus. 2. Expand each subtopic The workflow loops through each outline item. Every subtopic is expanded into a detailed, SEO-friendly paragraph. Output is consistent and optimized for readability. 3. Produce final outputs Combines all expanded sections into: A clean JSON object A Markdown version ready for blogs or CMS The JSON includes: Title HTML content Markdown content You can send this directly to REST APIs such as WordPress, Notion, or documentation platforms. Content is validated for readability and typically scores well in tools like Yoast SEO. Uses GPT-4o Mini by default, with average token usage between 2000 and 3000 depending on outline size. Use cases Auto-generate long-form articles for blogs or content marketing. Turn Instagram or short-form scripts into complete SEO articles. Create documentation or educational content using consistent templates. Setup steps 1. Prepare credentials Add your OpenAI API Key inside n8n’s credential manager. 2. Adjust input parameters Topic or main idea Number of outline items Language Primary keyword Tone or writing style (optional) 3. Customize the workflow Switch the model if you want higher quality or lower token usage. Modify the prompt for the outline or paragraph generator to match your writing style. Add additional nodes if you want to auto-upload the final article to WordPress, Notion, or any API. 4. Run the workflow Enter your topic Execute the workflow Retrieve both JSON and Markdown outputs for immediate publishing If you need help expanding this into a full content pipeline or want to integrate it with other automation systems, feel free to customize further.
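To show what the final assembly step can look like, here is a minimal sketch that combines the expanded sections into the title/HTML/Markdown object described above. It assumes each section arrives as `{ heading, paragraph }`; the field names and the deliberately simple HTML conversion are illustrative.

```javascript
// Sketch only: merge expanded sections into one JSON object with title, html, and markdown
const sections = $input.all().map(i => i.json);
const title = sections[0]?.articleTitle ?? 'Untitled article';

const markdown = [
  `# ${title}`,
  ...sections.map(s => `## ${s.heading}\n\n${s.paragraph}`),
].join('\n\n');

const html = [
  `<h1>${title}</h1>`,
  ...sections.map(s => `<h2>${s.heading}</h2><p>${s.paragraph}</p>`),
].join('\n');

return [{ json: { title, html, markdown } }];
```

The resulting item can be posted as-is to a REST endpoint such as a WordPress or Notion node.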
by vinci-king-01
**Medical Research Tracker with Email and Pipedrive**

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scans authoritative healthcare policy websites for new research, bills, or regulatory changes, stores relevant findings in Pipedrive, and immediately notifies key stakeholders via email. It is ideal for healthcare administrators and policy analysts who need to stay ahead of emerging legislation or guidance that could impact clinical operations, compliance, and strategy.

**Pre-conditions / Requirements**

Prerequisites
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Pipedrive account and API token
- SMTP credentials (or native n8n Email credentials) for sending alerts
- List of target URLs or RSS feeds from government or healthcare policy organizations
- Basic familiarity with n8n credential setup

Required Credentials

| Service | Credential Name | Purpose |
|---------|-----------------|---------|
| ScrapeGraphAI | API Key | Perform web scraping |
| Pipedrive | API Token | Create / update deals & notes |
| Email (SMTP/Nodemailer) | SMTP creds | Send alert emails |

Environment Variables (optional)

| Variable | Example Value | Description |
|----------|---------------|-------------|
| N8N_DEFAULT_EMAIL_FROM | policy-bot@yourorg.com | Default sender for the Email Send node |
| POLICY_KEYWORDS | telehealth, Medicare, HIPAA | Comma-separated keywords for filtering |

**How it works**

Key Steps:
- **Manual Trigger**: Kick-starts the workflow or schedules it via cron.
- **Set → URL List**: Defines the list of healthcare policy pages or RSS feeds to scrape.
- **Split In Batches**: Iterates through each URL so scraping happens sequentially.
- **ScrapeGraphAI**: Extracts headlines, publication dates, and links.
- **Code (Filter & Normalize)**: Removes duplicates, standardizes the JSON structure, and applies keyword filters.
- **HTTP Request**: Optionally enriches data with summary content using external APIs (e.g., OpenAI, SummarizeBot).
- **If Node**: Checks if the policy item is new (not already logged in Pipedrive).
- **Pipedrive**: Creates a new deal or note for tracking and collaboration.
- **Email Send**: Sends an alert to compliance or leadership teams with the policy summary.
- **Sticky Note**: Provides inline documentation inside the workflow.

**Set up steps**

Setup Time: 15–20 minutes

1. Install ScrapeGraphAI: In n8n, go to "Settings → Community Nodes" and install n8n-nodes-scrapegraphai.
2. Create Credentials:
   a. Pipedrive → "API Token" from your Pipedrive settings → add in n8n.
   b. ScrapeGraphAI → obtain API key → add as credential.
   c. Email SMTP → configure sender details in n8n.
3. Import Workflow: Copy the JSON template into n8n ("Import from clipboard").
4. Update URL List: Open the initial Set node and replace placeholder URLs with the sites you monitor (e.g., cms.gov, nih.gov, who.int, state health departments).
5. Define Keywords (optional):
   a. Open the Code node "Filter & Normalize".
   b. Adjust the `const keywords = [...]` array to match the topics you care about (see the sketch below).
6. Test Run: Trigger manually and verify that:
   - Scraped items appear in the execution logs.
   - New deals/notes show up in Pipedrive.
   - The alert email lands in your inbox.
7. Schedule: Add a Cron node (e.g., every 6 hours) in place of the Manual Trigger for automated execution.

**Node Descriptions**

Core Workflow Nodes:
- **Manual Trigger** – Launches the workflow on demand.
- **Set – URL List** – Holds an array of target policy URLs/RSS feeds.
- **Split In Batches** – Processes each URL one at a time to avoid rate limiting.
- **ScrapeGraphAI** – Scrapes page content and parses structured data.
- **Code – Filter & Normalize** – Cleans results, removes duplicates, applies the keyword filter.
- **HTTP Request – Summarize** – Calls a summarization API (optional).
- **If – Duplicate Check** – Queries Pipedrive to see if the policy item already exists.
- **Pipedrive (Deal/Note)** – Logs a new deal or adds a note with policy details.
- **Email Send – Alert** – Notifies subscribed stakeholders.
- **Sticky Note** – Embedded instructions inside the canvas.

Data Flow:
Manual Trigger → Set (URLs) → Split In Batches → ScrapeGraphAI → Code (Filter) → If (Duplicate?) → Pipedrive → Email Send

**Customization Examples**

1. Add Slack notifications

```javascript
// Insert after Email Send
{
  "node": "Slack",
  "parameters": {
    "channel": "#policy-alerts",
    "text": `New policy update: ${$json["title"]} - ${$json["url"]}`
  }
}
```

2. Use a different CRM (HubSpot)

```javascript
// Replace Pipedrive node config
{
  "resource": "deal",
  "operation": "create",
  "title": $json["title"],
  "properties": {
    "dealstage": "appointmentscheduled",
    "description": $json["summary"]
  }
}
```

**Data Output Format**

The workflow outputs structured JSON data:

```json
{
  "title": "Telehealth Expansion Act of 2024",
  "date": "2024-05-30",
  "url": "https://www.congress.gov/bill/118th-congress-house-bill/1234",
  "summary": "This bill proposes expanding Medicare reimbursement for telehealth services...",
  "source": "congress.gov",
  "status": "new"
}
```

**Troubleshooting**

Common Issues
- Empty Scrape Results – Check if the target site uses JavaScript rendering; ScrapeGraphAI may need a headless browser option enabled.
- Duplicate Deals in Pipedrive – Ensure the "If Duplicate?" node compares a unique field (e.g., URL or title) before creating a new deal.

Performance Tips
- Limit batch size to avoid API rate limits.
- Cache or store the last scraped timestamp to skip unchanged pages.

Pro Tips:
- Combine this workflow with an n8n "Cron" or "Webhook" trigger for fully automated monitoring.
- Use environment variables for keywords and email recipients to avoid editing nodes each time.
- Leverage Pipedrive's automations to notify additional teams (e.g., legal) when high-priority items are logged.
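For setup step 5, here is a minimal sketch of the "Filter & Normalize" Code node, assuming items arrive with fields matching the Data Output Format shown above. The keyword list is hard-coded for illustration; you could instead read it from the POLICY_KEYWORDS environment variable if environment access is enabled in your instance.

```javascript
// Sketch only: drop blanks and duplicate URLs, then keep items matching any keyword
const keywords = ['telehealth', 'medicare', 'hipaa'];

const seen = new Set();
const out = [];

for (const { json } of $input.all()) {
  const title = (json.title ?? '').trim();
  const url = (json.url ?? '').trim();
  if (!title || !url || seen.has(url)) continue;             // remove blanks and duplicates
  const haystack = `${title} ${json.summary ?? ''}`.toLowerCase();
  if (!keywords.some(k => haystack.includes(k))) continue;   // keyword filter
  seen.add(url);
  out.push({ json: { title, url, date: json.date ?? null, source: json.source ?? null, status: 'new' } });
}

return out;
```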
by Chad M. Crowell
**How it works**
This workflow automatically scans AWS accounts for orphaned resources (unattached EBS volumes, old snapshots >90 days, unassociated Elastic IPs) that waste money. It calculates cost impact, validates compliance tags, and sends multi-channel alerts via Slack, Email, and Google Sheets audit logs.

Key Features:
- 🔍 Multi-region scanning with parallel execution
- 💰 Monthly/annual cost calculation with risk scoring
- 📊 Professional HTML reports with charts and tables
- 🏷️ Tag compliance validation (SOC2/ISO27001/HIPAA)
- ✅ Conditional alerting (only alerts when resources found)
- 📈 Google Sheets audit trail for trend analysis

What gets detected:
- Unattached EBS volumes ($0.10/GB/month waste)
- Snapshots older than 90 days ($0.05/GB/month)
- Unassociated Elastic IPs ($3.60/month each)

Typical savings: $50-10K/month depending on account size (see the cost-calculation sketch below).

**Set up steps**

Prerequisites

AWS Configuration:
1. Create IAM user `n8n-resource-scanner` with these permissions:
   - ec2:DescribeVolumes
   - ec2:DescribeSnapshots
   - ec2:DescribeAddresses
   - ec2:DescribeInstances
   - lambda:InvokeFunction
2. Deploy Lambda function `aws-orphaned-resource-scanner` (Node.js 18+)
3. Add EC2 read-only permissions to the Lambda execution role
4. Generate AWS Access Key + Secret Key

Lambda Function Code: See sticky notes in the workflow for the complete implementation using @aws-sdk/client-ec2

Credentials Required:
- AWS IAM (Access Key + Secret)
- Slack (OAuth2 or Webhook)
- Gmail (OAuth2)
- Google Sheets (OAuth2)

Configuration:
1. Initialize Config Node: Update these settings:
   - awsRegions: Your AWS regions (default: us-east-1)
   - emailRecipients: FinOps team emails
   - slackChannel: Alert channel (e.g., #cloud-ops)
   - requiredTags: Compliance tags to validate
   - snapshotAgeDays: Age threshold (default: 90)
2. Set Region Variables: Choose regions to scan
3. Lambda Function: Deploy the function with the provided code (see workflow sticky notes)
4. Google Sheet: Create a spreadsheet with headers: Scan Date | Region | Resource Type | Resource ID | Monthly Cost | Compliance | etc.
5. Credentials: Connect all four credential types in n8n
6. Schedule: Enable "Weekly Scan Trigger" (default: Mondays 8 AM UTC)

Testing:
1. Click "Execute Workflow" to run a manual test
2. Verify the Lambda invokes successfully
3. Check that the Slack alert appears
4. Confirm the email with the HTML report is received
5. Validate that Google Sheets logging works

**Customization Options**
- **Multi-region:** Add regions in "Initialize Config"
- **Alert thresholds:** Modify cost/age thresholds
- **Additional resource types:** Extend the Lambda function
- **Custom tags:** Update the required tags list
- **Schedule frequency:** Adjust the cron trigger

**Use Cases**
- **FinOps Teams:** Automated cloud waste detection and cost reporting
- **Cloud Operations:** Weekly compliance and governance audits
- **DevOps:** Resource cleanup automation and alerting
- **Security/Compliance:** Tag validation for SOC2/ISO27001/HIPAA
- **Executive Reporting:** Monthly cost optimization metrics

**Resources**
- AWS IAM Best Practices
- Lambda Function Code
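The complete Lambda implementation lives in the workflow's sticky notes; as a reference, here is a minimal sketch of the unattached-EBS-volume scan and the cost math quoted above, using the AWS SDK v3 client named in the description. It assumes a Node.js 18 Lambda with an ES-module handler (e.g., `index.mjs`), and the per-GB rate is the ballpark figure from the description, not live pricing.

```javascript
// Sketch only: list unattached EBS volumes in one region and estimate monthly waste
import { EC2Client, DescribeVolumesCommand } from "@aws-sdk/client-ec2";

const EBS_RATE_PER_GB_MONTH = 0.10; // approximate rate used in the description

export const handler = async ({ region = "us-east-1" } = {}) => {
  const ec2 = new EC2Client({ region });
  const { Volumes = [] } = await ec2.send(new DescribeVolumesCommand({
    Filters: [{ Name: "status", Values: ["available"] }], // "available" = not attached
  }));

  const findings = Volumes.map(v => ({
    resourceType: "ebs-volume",
    resourceId: v.VolumeId,
    sizeGiB: v.Size,
    monthlyCost: +(v.Size * EBS_RATE_PER_GB_MONTH).toFixed(2),
    tags: Object.fromEntries((v.Tags ?? []).map(t => [t.Key, t.Value])),
  }));

  return {
    region,
    findings,
    totalMonthlyCost: +findings.reduce((sum, f) => sum + f.monthlyCost, 0).toFixed(2),
  };
};
```

The full function described in the workflow extends the same pattern to old snapshots and unassociated Elastic IPs.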
by Rakin Jakaria
**Who this is for**
This workflow is for digital marketing agencies or sales teams who want to automatically find business leads based on industry & location, gather their contact details, and send personalized cold emails — all from one form submission.

**What this workflow does**
This workflow starts every time someone submits the Lead Machine Form. It then:
1. **Scrapes business data** (company name, website, phone, address, category) using **Apify**, based on business type & location.
2. **Extracts the best email address** from each business website using **Google Gemini AI**.
3. **Stores valid leads** in **Google Sheets**.
4. **Generates cold email content** (subject + body) with AI based on your preferred tone (Friendly, Professional, Simple).
5. **Sends the cold email** via Gmail.
6. **Updates the sheet** with send status & timestamp.

**Setup**
To set this workflow up:
1. Form Trigger – Customize the "Lead Machine" form fields if needed (Business Type, Location, Lead Number, Email Style).
2. Apify API – Add your Apify Actor Endpoint URL in the HTTP Request node.
3. Google Gemini – Add credentials for extracting email addresses.
4. Google Sheets – Connect your sheet for storing leads & email status.
5. OpenAI – Add your credentials for cold email generation.
6. Gmail – Connect your Gmail account for sending cold emails.

**How to customize this workflow to your needs**
- Change the AI email prompt to reflect your brand's voice and offer.
- Add filters to only target leads that meet specific criteria (e.g., website must exist, email must be verified); see the sketch below.
- Modify the Google Sheets structure to track extra info like "Follow-up Date" or "Lead Source".
- Switch Gmail to another email provider if preferred.
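For the filtering customization mentioned above, a minimal sketch of a Code node placed after the email-extraction step might look like this. The field names `website` and `email` are illustrative assumptions about the lead items, and the regex is only a plausibility check, not full verification.

```javascript
// Sketch only: keep leads that have a website and a plausible email address
const emailPattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

return $input.all().filter(({ json }) =>
  Boolean(json.website) &&                // needed to personalize the cold email
  emailPattern.test(json.email ?? '')     // drop leads without a usable address
);
```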
by Koyanagi Naoyuki
## Who’s it for
This workflow is designed for Japanese-speaking professionals and learners who want to efficiently stay up to date with practical productivity, lifehack, and efficiency-related insights from Japanese content platforms such as blogs and media sites. It is particularly suitable for individuals who prefer actionable, real-world tips—such as workflow improvements, time management strategies, and daily efficiency techniques—rather than abstract motivational content. This workflow is also ideal for users who want to consume information not only as text but also as audio, enabling hands-free learning during commuting or multitasking.

## What it does
This workflow automatically collects, evaluates, summarizes, and delivers high-quality Japanese productivity and lifehack articles from multiple RSS sources such as Roomie, Lifehacker, and note. It retrieves articles published within the last 24 hours, merges multiple RSS feeds, and filters them for relevance. Then, AI evaluates the articles based on practical usefulness, reproducibility, and clarity, selecting only the top 10 most valuable pieces.

For each selected article, the workflow:
- Fetches the full article content
- Cleans and extracts readable text
- Generates structured summaries in Japanese, including:
  - Summary
  - Target audience
  - Use cases
  - Expected effects

Additionally, the workflow converts the summarized content into natural, audio-friendly narration using AI, allowing users to listen to the information like a daily news briefing. Finally, it sends text summaries to Slack for easy daily consumption, while optionally saving audio outputs to Google Drive for later access and playback.

## How it works
1. A scheduled trigger starts the workflow automatically at a specified time.
2. RSS feeds from multiple sources (Roomie, Lifehacker, note hashtags) are fetched.
3. All articles are merged into a single dataset.
4. Articles are filtered to include only those published within the last 24 hours (see the sketch below).
5. Article data is normalized into a structured format for AI processing.
6. Gemini AI evaluates and ranks articles based on practical value and selects the top 10.
7. Each article link is accessed and the HTML content is retrieved.
8. HTML is cleaned and converted into readable text data.
9. OpenAI generates structured summaries in Japanese.
10. Another AI step converts summaries into natural, spoken-style Japanese scripts.
11. The scripts are converted into audio files (text-to-speech) and stored in Google Drive.
12. The summarized text results are formatted and delivered to Slack.

## Requirements
- Google Gemini API credentials
- OpenAI API credentials
- Slack OAuth2 credentials
- (Optional) Google Drive credentials for audio file storage
- A Slack channel for receiving notifications

## How to set up
Set up your API credentials within n8n, configure the Slack destination channel, and activate the workflow. If audio generation is enabled, ensure your OpenAI API has access to text-to-speech features and optionally connect Google Drive for file storage. You can also adjust scheduling settings and AI prompts to match your preferences.
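As a reference for step 4 above, here is a minimal sketch of the 24-hour filter in a Code node. It assumes the merged RSS items expose an `isoDate` or `pubDate` field, as the standard n8n RSS Read node typically does; widen `WINDOW_MS` if you change the filtering window.

```javascript
// Sketch only: keep articles published within the last 24 hours
const WINDOW_MS = 24 * 60 * 60 * 1000; // e.g., use 48 * 60 * 60 * 1000 for a 48-hour window
const cutoff = Date.now() - WINDOW_MS;

return $input.all().filter(item => {
  const published = new Date(item.json.isoDate ?? item.json.pubDate ?? 0).getTime();
  return published >= cutoff;
});
```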
## How to customize the workflow
You can customize this workflow by:
- Adding or replacing RSS sources (e.g., specific Japanese productivity blogs or niche communities)
- Adjusting the filtering window (e.g., last 48 hours instead of 24 hours)
- Modifying AI scoring criteria (e.g., prioritize beginner-friendly or advanced content)
- Changing the summary structure (e.g., adding "key takeaway" or "action steps")
- Customizing the Slack message format for readability
- Enabling or disabling audio generation
- Changing narration tone or style for audio output
- Switching the output language if needed