by Edson Encinas
🐟 Phishing URL Reputation Checker with VirusTotal

This n8n template helps you automatically analyze URLs for phishing and malicious activity using VirusTotal’s multi-engine threat intelligence platform. It validates incoming URLs, submits them for scanning, polls for results, classifies risk, and logs verdicts for monitoring and incident response workflows.

Use cases include security automation, SOC alerting, phishing triage pipelines, chatbot URL validation, and email security enrichment. This template is ideal for blue teams, security engineers, and automation builders who want real-time URL reputation checks without building a full security pipeline from scratch.

💡 Good to know
- VirusTotal enforces API rate limits. For high-volume environments, consider increasing polling intervals or rotating API keys.
- The workflow defangs suspicious and malicious URLs to prevent accidental clicks during investigation.
- This template uses asynchronous polling because VirusTotal scans are not always immediately available.

⚙️ How it works
1. A webhook receives a URL from an API, form, chatbot, or automation trigger.
2. The URL is normalized and validated to ensure proper formatting.
3. Valid URLs are submitted to VirusTotal for reputation scanning.
4. The workflow polls VirusTotal until the analysis is completed or retries are exhausted.
5. Detection statistics are extracted and evaluated using threshold-based phishing logic.
6. URLs classified as suspicious or phishing are defanged for safe handling.
7. Results are returned and optionally logged to Google Sheets for auditing and tracking.

🧑‍💻 How to use
Trigger the workflow using the webhook and send JSON like:

{ "url": "example.com" }

Replace the webhook with other triggers such as email ingestion, Slack bots, or security tooling. Review the phishing verdict and risk level in the webhook response or Google Sheets log.
📋 Requirements
- VirusTotal API key (configured using HTTP Header Auth credentials)
- Google Sheets account for logging scan results

🧩 Customizing this workflow
- Send Slack, Microsoft Teams, or email alerts when the verdict is not SAFE.
- Extend the workflow with additional threat intelligence sources for stronger detection.
- Store scan results in databases like Airtable, PostgreSQL, or MySQL instead of Google Sheets for scalable logging and analytics.
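The defanging step described above can be sketched as a small Code-node function. The function name and the exact replacement convention (`http` → `hxxp`, `.` → `[.]`) are illustrative assumptions, not the template's actual code:

```javascript
// Defang a URL so it cannot be clicked accidentally during triage.
// Common SOC convention: "http" -> "hxxp" and every dot -> "[.]".
function defangUrl(url) {
  return url
    .replace(/^http/i, 'hxxp') // http -> hxxp, https -> hxxps
    .replace(/\./g, '[.]');    // neutralize every dot
}

// defangUrl('https://evil.example.com/login')
//   -> 'hxxps://evil[.]example[.]com/login'
```

The result is safe to paste into Slack alerts or Google Sheets logs without creating a live link.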
by WeblineIndia
Shopify New Orders Auto-Sync to Google Sheets & AI Analysis

This workflow automatically fetches new Shopify orders on a schedule, detects only newly created orders, enriches them using AI, and stores clean, structured data in Google Sheets for reporting and tracking. It runs on a schedule, fetches recent Shopify orders, checks which ones are new using a stored sync timestamp, analyzes each new order with AI, and saves the final results into Google Sheets.

You receive:
- **Automatic order sync without webhooks**
- **Duplicate-free processing using last sync tracking**
- **AI-generated order insights (category, priority, notes)**
- **Clean Google Sheets storage for reporting**

Ideal for teams that want reliable order tracking, analysis, and reporting without relying on Shopify webhooks.

Quick Start – Implementation Steps
1. Connect your Shopify API credentials.
2. Set the Schedule Trigger (e.g., every 15 minutes).
3. Connect your Google Sheets account and select a sheet.
4. Configure the AI node for order analysis.
5. Activate the workflow — automated order sync begins.

What It Does
This workflow automates Shopify order processing:
- Runs on a defined schedule.
- Loads the last successful sync time from workflow static data.
- Fetches Shopify orders created after that time.
- Identifies only new orders to prevent duplicates.
- Extracts key order details: order ID & number, product name & quantity, prices & currency, shipping country.
- Uses AI to analyze each order and generate: product category, priority level, internal review notes.
- Formats the final data into clean fields.
- Saves each new order as a row in Google Sheets.
- Updates the last sync timestamp after successful processing.
Who’s It For
This workflow is ideal for:
- E-commerce operations teams
- Finance & accounting teams
- Store owners using Shopify
- Teams tracking orders in spreadsheets
- Businesses without Shopify webhook access
- Anyone needing automated order backups

Requirements to Use This Workflow
To run this workflow, you will need:
- **n8n instance** (self-hosted or cloud)
- **Shopify Admin API access**
- **Google Sheets account**
- Optional: AI credentials (OpenAI or compatible)
- Basic understanding of Shopify order data

How It Works
1. Scheduled Trigger – Workflow runs at fixed intervals.
2. Load Last Sync – Reads the last processed timestamp.
3. Fetch Shopify Orders – Retrieves recent orders.
4. New Order Detection – Filters only unprocessed orders.
5. AI Order Analysis – Adds category, priority, and notes.
6. Prepare Final Data – Cleans and structures fields.
7. Save to Google Sheets – Appends a new row per order.
8. Update Sync Time – Stores the latest successful run time.

If no new orders exist, the workflow exits safely without errors.

Setup Steps
1. Import the workflow JSON into n8n.
2. Open the Schedule Trigger and set the desired frequency.
3. Configure the Shopify node with your store credentials.
4. Connect the Google Sheets node and map columns correctly.
5. Review the AI prompt for customization.
6. Run the workflow once manually to initialize sync data.
7. Activate the workflow.

How To Customize Nodes
- Change Sync Frequency: adjust the Schedule Trigger to run every few minutes, hourly, or daily.
- Customize AI Analysis: modify the AI prompt to change priority rules, add risk or fraud flags, or generate internal comments.
- Extend Google Sheets Fields: you can add customer email, payment status, fulfillment status, and AI confidence score.

Add-Ons (Optional Enhancements)
You can extend this workflow to:
- Send Slack or email notifications
- Generate daily summary reports
- Detect high-value orders
- Add fraud or risk scoring
- Store data in a database instead of Sheets
- Support multiple Shopify stores
- Create dashboards from Google Sheets

Use Case Examples
1. Operations Reporting – Track daily orders in a shared spreadsheet.
2. Finance & Accounting – Maintain an independent record of all sales.
3. Order Review – Use AI notes to quickly understand important orders.
4. Backup & Auditing – Keep an external copy of Shopify order data.

Troubleshooting Guide
| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No orders fetched | lastSync too recent | Reset static data |
| Duplicate orders | Sync update missing | Ensure last sync node runs |
| AI output not parsed | Invalid JSON | Adjust AI prompt |
| Sheet not updating | Column mismatch | Match headers exactly |
| Workflow stops early | No new data | Enable “Always Output Data” |

Need Help?
If you need help extending this workflow by adding alerts, dashboards, advanced AI logic, or scaling for production, our n8n automation developers at WeblineIndia are happy to assist with advanced n8n automation and Shopify integrations.
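The last-sync detection described above can be sketched as an n8n Code-node snippet. The field name `lastSync` and the use of workflow static data are assumptions based on the description (in a real Code node the store would come from `$getWorkflowStaticData('global')`; here it is mocked so the logic is self-contained):

```javascript
// Filter Shopify orders down to those created after the stored sync time,
// then advance the watermark so the next run skips what we just processed.
const staticData = { lastSync: '2024-01-01T00:00:00Z' }; // mock of $getWorkflowStaticData('global')

function detectNewOrders(orders) {
  const last = new Date(staticData.lastSync || 0);
  const fresh = orders.filter((o) => new Date(o.created_at) > last);
  if (fresh.length > 0) {
    // ISO-8601 timestamps sort lexicographically, so the last one is the newest.
    staticData.lastSync = fresh.map((o) => o.created_at).sort().pop();
  }
  return fresh;
}
```

Because the watermark is only advanced after a successful pass, a failed run re-processes the same window instead of silently dropping orders.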
by Avkash Kakdiya
How it works
This workflow captures new form submissions, cleans the data, and stores it securely before applying AI-based lead qualification. It analyzes each message to assign a category, score, and priority. Based on this classification, leads are routed to the appropriate response path. The system ensures every lead is saved first, then enriched and responded to automatically.

Step-by-step

**Capture and clean lead data**
- Typeform Trigger – Listens for new submissions in real time.
- Sanitize Lead Data – Cleans names, formats emails, and extracts the domain.

**Store lead in database**
- Create a record – Saves lead details in Airtable with status “New”.

**Analyze and enrich with AI**
- AI Lead Analyzer – Uses OpenAI to classify, score, and prioritize leads.
- Merge – Combines original lead data with AI-generated insights.

**Route and respond automatically**
- Route by Category – Directs leads based on AI classification.
- Send a message – Sends a tailored email for sales inquiries.
- Send a message1 – Sends a confirmation email for support requests.
- Send a message2 – Sends a response for partnership inquiries.
- Send a message3 – Sends a fallback response for other categories.
- Send a message4 – Sends a Discord alert for high-priority leads.

Why use this?
- Ensures no lead is lost by storing data before processing
- Automatically prioritizes high-value opportunities using AI
- Reduces manual lead qualification and response time
- Provides personalized responses based on intent
- Enables real-time team alerts for important leads
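The Sanitize Lead Data step can be sketched like this. The input field names (`name`, `email`, `message`) are assumptions about the Typeform payload, not the template's actual mapping:

```javascript
// Normalize a raw form submission before it is stored in Airtable.
function sanitizeLead(raw) {
  const name = (raw.name || '')
    .trim()
    .replace(/\s+/g, ' ')                 // collapse repeated whitespace
    .toLowerCase()
    .replace(/\b\w/g, (c) => c.toUpperCase()); // Title Case
  const email = (raw.email || '').trim().toLowerCase();
  const domain = email.includes('@') ? email.split('@')[1] : '';
  return { name, email, domain, message: (raw.message || '').trim() };
}
```

Extracting the domain up front makes later enrichment and routing (e.g., flagging free-mail domains) a simple string check.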
by Rahul Joshi
📊 Description
Most learning newsletters send the same email to everyone. This workflow does the opposite: every subscriber gets a completely different email every single day, personalized to their topic interest, learning phase, and how far they've come in their journey. Built specifically for women's skill development programs, coaching platforms, and edtech creators who want to deliver real personalized learning at scale without manually writing hundreds of emails. You manage the content library, the AI handles the personalization, and the workflow handles the delivery, every morning at 9AM without you touching anything.

What This Workflow Does
⏰ Triggers every morning at 9AM automatically
📋 Fetches all subscribers from Google Sheets with their name, email, topic interest, and subscription date
🧮 Calculates each subscriber's day number, week number, and learning phase (Beginner, Intermediate, or Advanced) based on how long they've been subscribed
📚 Fetches your full content library and scores every lesson against each subscriber's topic interest and phase
🎯 Picks the best matching lesson per subscriber, falling back to day-index rotation if no strong match is found
🔁 Loops through each subscriber one at a time to ensure every email is individually generated
🤖 Sends each subscriber's profile and matched lesson to GPT-4o, which generates a fully personalized 3-paragraph lesson explanation, actionable task, key takeaways, and a motivational quote from a woman leader in their field
📧 Builds a beautiful branded HTML email and sends it via Gmail
📝 Logs every delivery to a SendLog sheet with date, name, lesson title, phase, category, and AI snippet

Key Benefits
✅ Every subscriber gets a unique email — no generic blasts
✅ Learning phase auto-advances as subscribers stay longer
✅ GPT-4o adapts lesson tone and depth to Beginner, Intermediate, or Advanced
✅ Motivational quotes always come from women leaders in the relevant field
✅ Full delivery log in Google Sheets for tracking and analytics
✅ Works for any skill category — coding, finance, leadership, marketing, and more

Features
- Cron-based daily trigger at 9AM
- Automatic learning phase calculation per subscriber
- Content scoring and matching engine
- Day-index fallback rotation for unmatched subscribers
- GPT-4o lesson personalization with phase-aware prompting
- Woman leader motivational quotes per field
- Branded HTML email template with inline CSS
- Dynamic subject line per subscriber
- Gmail delivery with individual personalization
- Full SendLog tracking in Google Sheets
- Loop-based processing — one subscriber at a time for accuracy

Requirements
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- A configured Google Sheet with 3 sheets: Subscribers, ContentLibrary, SendLog

Setup Steps
1. Copy the Google Sheet template and grab your Sheet ID.
2. Paste the Sheet ID into all Google Sheets nodes.
3. Add your Google Sheets OAuth2 credentials.
4. Add your OpenAI API key to the GPT-4o node.
5. Add your Gmail OAuth2 credentials.
6. Populate the Subscribers sheet with your learners.
7. Populate the ContentLibrary sheet with your lessons — at least 5-10 per category.
8. Run the workflow manually once to test with your first subscriber.
9. Confirm the HTML email looks correct in your inbox.

Target Audience
🎓 Women's skill development platforms and bootcamps
📧 Edtech creators running personalized learning newsletters
💼 Career coaches who want to deliver daily value to their community
🤖 Automation agencies building AI-powered email learning systems for clients
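The day/week/phase calculation described above can be sketched as follows. The 30- and 90-day phase cutoffs are illustrative assumptions; the template may use different thresholds:

```javascript
// Derive day number, week number, and learning phase from the subscription date.
function learningPhase(subscribedAt, today = new Date()) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const dayNumber = Math.floor((today - new Date(subscribedAt)) / msPerDay) + 1;
  const weekNumber = Math.ceil(dayNumber / 7);
  // Assumed thresholds: first 30 days Beginner, up to day 90 Intermediate, then Advanced.
  const phase = dayNumber <= 30 ? 'Beginner' : dayNumber <= 90 ? 'Intermediate' : 'Advanced';
  return { dayNumber, weekNumber, phase };
}
```

Because the phase is derived from the subscription date on every run, it "auto-advances" with no state to maintain per subscriber.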
by oka hironobu
Who is this for
Customer support teams and operations managers who receive support requests via email and need automated triage. Works well for small to mid-size teams using Notion as their ticket tracker.

What this workflow does
This workflow watches a Gmail inbox for incoming support emails. Each email is analyzed by Gemini AI to determine its category (billing, technical, feature request, or general), priority level, and a suggested response draft. A new page is created in a Notion database with all classified fields. Critical tickets trigger an immediate Slack alert to the on-call team, while all tickets get a summary notification.

Setup
1. Add a Gmail OAuth2 credential and configure label or address filters.
2. Add a Google Gemini API credential for email classification.
3. Add a Notion API credential and create a database with columns: Title, Category, Priority, Status, Summary.
4. Add a Slack OAuth2 credential and set your alerts channel.

Requirements
- Gmail account with OAuth2 access
- Google Gemini API key
- Notion workspace with API integration enabled
- Slack workspace with OAuth2 app

How to customize
- Edit the AI prompt in "Classify ticket with AI" to add more categories or adjust priority rules.
- Change the critical priority condition in "Is critical priority" to include high-priority tickets.
- Replace Notion with Airtable or Google Sheets for a different ticket backend.
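Since the Gemini reply feeds structured Notion fields, a defensive parsing step between the AI node and the Notion node helps. This is a sketch under assumptions — the JSON field names (`category`, `priority`, `draft`) and the fallback values are hypothetical, not the template's actual schema:

```javascript
// Parse the model's JSON reply; fall back to safe defaults if it is malformed,
// so a bad completion is triaged to a human instead of crashing the run.
const CATEGORIES = ['billing', 'technical', 'feature request', 'general'];
const PRIORITIES = ['low', 'medium', 'high', 'critical'];

function parseClassification(raw) {
  let data;
  try {
    data = JSON.parse(raw);
  } catch (e) {
    return { category: 'general', priority: 'medium', draft: '', parseError: true };
  }
  return {
    category: CATEGORIES.includes(data.category) ? data.category : 'general',
    priority: PRIORITIES.includes(data.priority) ? data.priority : 'medium',
    draft: typeof data.draft === 'string' ? data.draft : '',
    parseError: false,
  };
}
```

Validating against a closed list of categories also keeps the "Is critical priority" branch from ever seeing an unexpected value.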
by WeblineIndia
(Wealth Management) Client Question → Instant Answer Assistant (n8n + Google Sheets + AI + API)

This workflow allows users to ask portfolio-related questions in a simple format (C001: your question). It validates the input, fetches client data and holdings from Google Sheets, retrieves live market prices via API, calculates portfolio performance, and generates a short AI-powered response.

Quick Implementation Steps
1. Connect Google Sheets (clients, holdings, interaction_logs)
2. Configure the Get Live Prices API endpoint
3. Add credentials for Generate AI Answer (Google Gemini)
4. Ensure input format: C001: your question
5. Run test cases (valid + invalid inputs)

What It Does
This workflow acts as an intelligent financial assistant that responds to client portfolio queries in real time. It starts by receiving a chat message through the When chat message received node and processes it using Parse Client Message to extract the client ID and question. Once validated, it retrieves client details using Get Client Profile and portfolio holdings via Get Client Holdings. It then fetches live stock prices through the Get Live Prices API and merges all data using Merge Portfolio Data to compute metrics like invested value, current value, profit/loss, and returns. Finally, the workflow builds a structured prompt in Build AI Prompt and generates a concise response using Generate AI Answer, ensuring the reply is accurate, controlled, and based only on available data.

Who Can Use This Workflow
- Financial advisors managing multiple client portfolios
- Wealth management platforms
- Fintech developers building AI-driven assistants
- Anyone looking to automate portfolio Q&A workflows

Requirements
- n8n (self-hosted or cloud)
- Google Sheets account with: clients sheet, holdings sheet, interaction_logs sheet
- Live price API endpoint (used in Get Live Prices)
- Google Gemini API credentials (used in Generate AI Answer)

How It Works & Setup Guide

1. Trigger & Input Parsing
- **When chat message received** receives input
- **Parse Client Message** extracts: client_id, question
- **IF Valid Input?** validates the format
- If invalid: **Build Invalid Input Response**, **Log Invalid Input**, **Return Invalid Response**

2. Client & Holdings Lookup
- **Get Client Profile** fetches client details
- **IF Client Found?** ensures existence
- **Get Client Holdings** retrieves holdings
- **Prepare Symbols** extracts stock symbols
- **IF Holdings Found?** validates data
- Failure handling: **Build Client Not Found Response**, **Build No Holdings Response**

3. Market Data Fetching
- **Get Live Prices** calls the external API
- **Normalize Price Response** standardizes output
- **IF Price API Worked?** validates API success
- Failure handling: **Build API Failed Response**

4. Portfolio Calculation
**Merge Portfolio Data** computes: invested amount, current value, P&L, return %, best performer, weakest performer, missing prices

5. Market Context (Optional Enhancement)
**Get Market Context** and **Attach Market Context** add optional insights like Nifty/Sensex movement and market tone.

6. AI Response Generation
- **Build AI Prompt** creates a structured prompt
- **Generate AI Answer** generates the response
- **Extract AI Answer** extracts a clean reply

7. Final Output
The final response includes: client ID, question, AI reply, status, timestamp

How To Customize Nodes
- **Parse Client Message** – Modify input format rules
- **Get Live Prices** – Replace with another API (Alpha Vantage, Twelve Data, etc.)
- **Merge Portfolio Data** – Add more financial metrics (CAGR, allocation %, etc.)
- **Build AI Prompt** – Change tone (formal, advisory, aggressive)
- **Generate AI Answer** – Replace Gemini with Hugging Face / OpenAI models

Add-Ons (Enhancements)
- Add Slack notifications for responses
- Save AI replies to CRM
- Add email delivery for clients
- Implement caching for the price API
- Add retry logic for API failures
- Support multi-client batch processing

Use Case Examples
- Client asks: “C001: How is my portfolio performing?”
- Advisor checks: “C002: Which stock is my top performer?”
- User queries: “C003: Why is my portfolio down?”
- Daily automated portfolio summary generation
- Integration with a chatbot for real-time advisory

There can be many more use cases depending on how this workflow is extended.

Troubleshooting Guide
| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Invalid input error | Wrong format | Use C001: your question |
| Client not found | Missing in sheet | Check clients sheet |
| No holdings found | Empty data | Verify holdings sheet |
| API failed | Endpoint issue | Check API URL or timeout |
| AI reply empty | Model issue | Verify API credentials |
| Incorrect calculations | Missing price data | Check API response |

🤝 Need Help?
If you need help setting up this workflow, customizing nodes, or building advanced automation solutions, feel free to reach out. For professional support, custom workflow development, or enterprise-grade automation, contact our n8n workflow developers at WeblineIndia. We help businesses build scalable and intelligent automation systems tailored to their needs.
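The Merge Portfolio Data computation can be sketched like this. The holdings field names (`symbol`, `qty`, `avg_price`) are assumptions about the holdings sheet, not the template's actual columns:

```javascript
// Combine holdings with live prices into the metrics listed above:
// invested amount, current value, P&L, return %, best/weakest performer, missing prices.
function mergePortfolio(holdings, prices) {
  let invested = 0;
  let current = 0;
  const missingPrices = [];
  const perStock = [];
  for (const h of holdings) {
    const price = prices[h.symbol];
    if (price == null) {
      missingPrices.push(h.symbol); // excluded from totals so ratios stay consistent
      continue;
    }
    const cost = h.qty * h.avg_price;
    const value = h.qty * price;
    invested += cost;
    current += value;
    perStock.push({ symbol: h.symbol, returnPct: ((value - cost) / cost) * 100 });
  }
  perStock.sort((a, b) => b.returnPct - a.returnPct);
  return {
    invested,
    current,
    pnl: current - invested,
    returnPct: invested ? ((current - invested) / invested) * 100 : 0,
    best: perStock.length ? perStock[0].symbol : null,
    weakest: perStock.length ? perStock[perStock.length - 1].symbol : null,
    missingPrices,
  };
}
```

Listing the symbols with missing prices lets the AI prompt disclose that the totals exclude them, which keeps the reply "based only on available data".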
by Pake.AI
Overview
This workflow converts a single topic into a full blog article through a structured multi-step process. Instead of generating everything in one pass, it breaks the task into clear stages to produce cleaner structure, better SEO consistency, and more predictable output quality.

How this workflow differs from asking ChatGPT directly
It does not produce an article in one step. It separates the process into two focused stages: outline generation and paragraph expansion. This approach gives you more control over tone, SEO, structure, and keyword placement.

How it works
1. Generate outline – The workflow sends your topic to an AI Agent, which returns a structured outline based on the topic, desired depth, language, and keyword focus.
2. Expand each subtopic – The workflow loops through each outline item. Every subtopic is expanded into a detailed, SEO-friendly paragraph. Output is consistent and optimized for readability.
3. Produce final outputs – Combines all expanded sections into a clean JSON object and a Markdown version ready for blogs or CMS.

The JSON includes: title, HTML content, Markdown content. You can send this directly to REST APIs such as WordPress, Notion, or documentation platforms. Content is validated for readability and typically scores well in tools like Yoast SEO. Uses GPT-4o Mini by default, with average token usage between 2000 and 3000 depending on outline size.

Use cases
- Auto-generate long-form articles for blogs or content marketing.
- Turn Instagram or short-form scripts into complete SEO articles.
- Create documentation or educational content using consistent templates.

Setup steps
1. Prepare credentials – Add your OpenAI API Key inside n8n’s credential manager.
2. Adjust input parameters – Topic or main idea, number of outline items, language, primary keyword, tone or writing style (optional).
3. Customize the workflow – Switch the model if you want higher quality or lower token usage. Modify the prompt for the outline or paragraph generator to match your writing style. Add additional nodes if you want to auto-upload the final article to WordPress, Notion, or any API.
4. Run the workflow – Enter your topic, execute the workflow, and retrieve both JSON and Markdown outputs for immediate publishing.

If you need help expanding this into a full content pipeline or want to integrate it with other automation systems, feel free to customize further.
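The final combine step can be sketched like this. The per-section shape `{ heading, paragraph }` is an assumption about what the expansion loop emits, not the template's actual item structure:

```javascript
// Merge expanded sections into the Markdown and JSON outputs described above.
function assembleArticle(title, sections) {
  const markdown = [
    `# ${title}`,
    ...sections.map((s) => `## ${s.heading}\n\n${s.paragraph}`),
  ].join('\n\n');
  const html = [
    `<h1>${title}</h1>`,
    ...sections.map((s) => `<h2>${s.heading}</h2><p>${s.paragraph}</p>`),
  ].join('');
  return { title, markdown, html };
}
```

Building Markdown and HTML from the same section array keeps the two outputs in lockstep, so a CMS upload and a documentation export never drift apart.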
by vinci-king-01
Medical Research Tracker with Email and Pipedrive

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scans authoritative healthcare policy websites for new research, bills, or regulatory changes, stores relevant findings in Pipedrive, and immediately notifies key stakeholders via email. It is ideal for healthcare administrators and policy analysts who need to stay ahead of emerging legislation or guidance that could impact clinical operations, compliance, and strategy.

Pre-conditions/Requirements

Prerequisites
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Pipedrive account and API token
- SMTP credentials (or native n8n Email credentials) for sending alerts
- List of target URLs or RSS feeds from government or healthcare policy organizations
- Basic familiarity with n8n credential setup

Required Credentials
| Service | Credential Name | Purpose |
|---------|-----------------|---------|
| ScrapeGraphAI | API Key | Perform web scraping |
| Pipedrive | API Token | Create / update deals & notes |
| Email (SMTP/Nodemailer) | SMTP creds | Send alert emails |

Environment Variables (optional)
| Variable | Example Value | Description |
|----------|---------------|-------------|
| N8N_DEFAULT_EMAIL_FROM | policy-bot@yourorg.com | Default sender for Email Send node |
| POLICY_KEYWORDS | telehealth, Medicare, HIPAA | Comma-separated keywords for filtering |

How it works

Key Steps:
1. **Manual Trigger** – Kick-starts the workflow or schedules it via cron.
2. **Set → URL List** – Defines the list of healthcare policy pages or RSS feeds to scrape.
3. **Split In Batches** – Iterates through each URL so scraping happens sequentially.
4. **ScrapeGraphAI** – Extracts headlines, publication dates, and links.
5. **Code (Filter & Normalize)** – Removes duplicates, standardizes JSON structure, and applies keyword filters.
6. **HTTP Request** – Optionally enriches data with summary content using external APIs (e.g., OpenAI, SummarizeBot).
7. **If Node** – Checks if the policy item is new (not already logged in Pipedrive).
8. **Pipedrive** – Creates a new deal or note for tracking and collaboration.
9. **Email Send** – Sends an alert to compliance or leadership teams with the policy summary.
10. **Sticky Note** – Provides inline documentation inside the workflow.

Set up steps
Setup Time: 15–20 minutes
1. Install ScrapeGraphAI: In n8n, go to “Settings → Community Nodes” and install n8n-nodes-scrapegraphai.
2. Create Credentials:
   a. Pipedrive → “API Token” from your Pipedrive settings → add in n8n.
   b. ScrapeGraphAI → obtain API key → add as credential.
   c. Email SMTP → configure sender details in n8n.
3. Import Workflow: Copy the JSON template into n8n (“Import from clipboard”).
4. Update URL List: Open the initial Set node and replace placeholder URLs with the sites you monitor (e.g., cms.gov, nih.gov, who.int, state health departments).
5. Define Keywords (optional):
   a. Open the Code node “Filter & Normalize”.
   b. Adjust the const keywords = [...] array to match topics you care about.
6. Test Run: Trigger manually; verify that scraped items appear in the execution logs, new deals/notes show up in Pipedrive, and the alert email lands in your inbox.
7. Schedule: Add a Cron node (e.g., every 6 hours) in place of the Manual Trigger for automated execution.
Node Descriptions

Core Workflow Nodes:
- **Manual Trigger** – Launches the workflow on demand.
- **Set – URL List** – Holds an array of target policy URLs/RSS feeds.
- **Split In Batches** – Processes each URL one at a time to avoid rate limiting.
- **ScrapeGraphAI** – Scrapes page content and parses structured data.
- **Code – Filter & Normalize** – Cleans results, removes duplicates, applies the keyword filter.
- **HTTP Request – Summarize** – Calls a summarization API (optional).
- **If – Duplicate Check** – Queries Pipedrive to see if the policy item already exists.
- **Pipedrive (Deal/Note)** – Logs a new deal or adds a note with policy details.
- **Email Send – Alert** – Notifies subscribed stakeholders.
- **Sticky Note** – Embedded instructions inside the canvas.

Data Flow:
Manual Trigger → Set (URLs) → Split In Batches → ScrapeGraphAI → Code (Filter) → If (Duplicate?) → Pipedrive → Email Send

Customization Examples

1. Add Slack notifications

// Insert after Email Send
{
  "node": "Slack",
  "parameters": {
    "channel": "#policy-alerts",
    "text": `New policy update: ${$json["title"]} - ${$json["url"]}`
  }
}

2. Use a different CRM (HubSpot)

// Replace Pipedrive node config
{
  "resource": "deal",
  "operation": "create",
  "title": $json["title"],
  "properties": {
    "dealstage": "appointmentscheduled",
    "description": $json["summary"]
  }
}

Data Output Format
The workflow outputs structured JSON data:

{
  "title": "Telehealth Expansion Act of 2024",
  "date": "2024-05-30",
  "url": "https://www.congress.gov/bill/118th-congress-house-bill/1234",
  "summary": "This bill proposes expanding Medicare reimbursement for telehealth services...",
  "source": "congress.gov",
  "status": "new"
}

Troubleshooting

Common Issues
- Empty Scrape Results – Check if the target site uses JavaScript rendering; ScrapeGraphAI may need a headless browser option enabled.
- Duplicate Deals in Pipedrive – Ensure the “If Duplicate?” node compares a unique field (e.g., URL or title) before creating a new deal.

Performance Tips
- Limit batch size to avoid API rate limits.
- Cache or store the last scraped timestamp to skip unchanged pages.

Pro Tips:
- Combine this workflow with an n8n “Cron” or “Webhook” trigger for fully automated monitoring.
- Use environment variables for keywords and email recipients to avoid editing nodes each time.
- Leverage Pipedrive’s automations to notify additional teams (e.g., legal) when high-priority items are logged.
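The Filter & Normalize Code node can be sketched like this. The keyword list mirrors the `POLICY_KEYWORDS` example above, and the item field names (`title`, `url`, `summary`, `date`) are assumptions about the ScrapeGraphAI output:

```javascript
// Deduplicate scraped items by URL and keep only keyword matches.
const keywords = ['telehealth', 'medicare', 'hipaa']; // mirror POLICY_KEYWORDS

function filterAndNormalize(items) {
  const seen = new Set();
  return items
    .filter((it) => {
      if (!it.url || seen.has(it.url)) return false; // drop duplicates
      seen.add(it.url);
      const text = `${it.title} ${it.summary || ''}`.toLowerCase();
      return keywords.some((k) => text.includes(k)); // keep keyword hits only
    })
    .map((it) => ({
      title: it.title.trim(),
      url: it.url,
      date: it.date || null,
      status: 'new', // matches the Data Output Format above
    }));
}
```

Deduplicating on URL here reduces how often the downstream "If – Duplicate Check" has to query Pipedrive.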
by Chad M. Crowell
How it works
This workflow automatically scans AWS accounts for orphaned resources (unattached EBS volumes, snapshots older than 90 days, unassociated Elastic IPs) that waste money. It calculates cost impact, validates compliance tags, and sends multi-channel alerts via Slack, Email, and Google Sheets audit logs.

Key Features:
🔍 Multi-region scanning with parallel execution
💰 Monthly/annual cost calculation with risk scoring
📊 Professional HTML reports with charts and tables
🏷️ Tag compliance validation (SOC2/ISO27001/HIPAA)
✅ Conditional alerting (only alerts when resources are found)
📈 Google Sheets audit trail for trend analysis

What gets detected:
- Unattached EBS volumes ($0.10/GB/month waste)
- Snapshots older than 90 days ($0.05/GB/month)
- Unassociated Elastic IPs ($3.60/month each)

Typical savings: $50-10K/month depending on account size

Set up steps

Prerequisites
AWS Configuration:
1. Create an IAM user n8n-resource-scanner with these permissions: ec2:DescribeVolumes, ec2:DescribeSnapshots, ec2:DescribeAddresses, ec2:DescribeInstances, lambda:InvokeFunction
2. Deploy the Lambda function aws-orphaned-resource-scanner (Node.js 18+)
3. Add EC2 read-only permissions to the Lambda execution role
4. Generate an AWS Access Key + Secret Key

Lambda Function Code: See sticky notes in the workflow for the complete implementation using @aws-sdk/client-ec2

Credentials Required:
- AWS IAM (Access Key + Secret)
- Slack (OAuth2 or Webhook)
- Gmail (OAuth2)
- Google Sheets (OAuth2)

Configuration
1. Initialize Config Node – update these settings:
   - awsRegions: Your AWS regions (default: us-east-1)
   - emailRecipients: FinOps team emails
   - slackChannel: Alert channel (e.g., #cloud-ops)
   - requiredTags: Compliance tags to validate
   - snapshotAgeDays: Age threshold (default: 90)
2. Set Region Variables – choose regions to scan
3. Lambda Function – deploy the function with the provided code (see workflow sticky notes)
4. Google Sheet – create a spreadsheet with headers: Scan Date | Region | Resource Type | Resource ID | Monthly Cost | Compliance | etc.
5. Credentials – connect all four credential types in n8n
6. Schedule – enable the "Weekly Scan Trigger" (default: Mondays 8 AM UTC)

Testing
1. Click "Execute Workflow" to run a manual test
2. Verify the Lambda invokes successfully
3. Check that the Slack alert appears
4. Confirm the email with the HTML report is received
5. Validate that Google Sheets logging works

Customization Options
- **Multi-region:** Add regions in "Initialize Config"
- **Alert thresholds:** Modify cost/age thresholds
- **Additional resource types:** Extend the Lambda function
- **Custom tags:** Update the required tags list
- **Schedule frequency:** Adjust the cron trigger

Use Cases
- **FinOps Teams:** Automated cloud waste detection and cost reporting
- **Cloud Operations:** Weekly compliance and governance audits
- **DevOps:** Resource cleanup automation and alerting
- **Security/Compliance:** Tag validation for SOC2/ISO27001/HIPAA
- **Executive Reporting:** Monthly cost optimization metrics

Resources
- AWS IAM Best Practices
- Lambda Function Code
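The cost math behind the figures quoted above can be sketched as follows. The per-unit rates are the ones stated in the description; actual AWS pricing varies by region and volume type:

```javascript
// Monthly and annual waste estimate using the rates from the description:
// EBS $0.10/GB/month, snapshots $0.05/GB/month, Elastic IPs $3.60/month each.
const RATES = { ebsPerGb: 0.10, snapshotPerGb: 0.05, elasticIp: 3.60 };

function monthlyWaste(findings) {
  const ebs = findings.unattachedVolumesGb * RATES.ebsPerGb;
  const snaps = findings.oldSnapshotsGb * RATES.snapshotPerGb;
  const eips = findings.unassociatedEips * RATES.elasticIp;
  const monthly = ebs + snaps + eips;
  return { ebs, snaps, eips, monthly, annual: monthly * 12 };
}
```

For example, 100 GB of unattached volumes, 200 GB of old snapshots, and 2 idle Elastic IPs come to roughly $27.20/month.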
by Yaron Been
Overview Watch target companies for C-level and VP hiring signals, then send AI-personalized outreach emails when leadership roles are posted. This workflow reads a list of target company domains from Google Sheets, checks each one for leadership-level job openings via the PredictLeads Job Openings API, enriches matching companies with additional company data, and uses OpenAI to generate a personalized outreach email referencing the specific leadership hire. The email is sent automatically through Gmail. How it works A schedule trigger runs the workflow daily at 8:00 AM. The workflow reads target account domains from Google Sheets. It loops through each company and fetches job openings from PredictLeads. It filters for leadership roles such as CRO, CMO, CTO, VP, Head of, Chief, and Director. If leadership roles are found, it enriches the company with PredictLeads company data such as industry, size, and location. It builds a structured prompt combining company context and the detected leadership roles. It sends the prompt to OpenAI to generate a personalized outreach email. It sends the AI-generated email through Gmail with a tailored subject line. It loops back to process the next company. Setup Create a Google Sheet with these columns: domain company_name Connect your Gmail account using OAuth2 for sending outreach emails. Add your OpenAI API key in the Generate Outreach Email HTTP Request node. Add your PredictLeads API credentials using the X-Api-Key and X-Api-Token headers. Requirements Google Sheets OAuth2 credentials Gmail OAuth2 credentials OpenAI API account using gpt-4o-mini PredictLeads API account: https://docs.predictleads.com Notes The leadership role filter uses regex matching for roles such as CRO, CMO, CTO, VP, Vice President, Head of, and Chief. You can customize this as needed. The AI prompt instructs OpenAI to write concise emails with a maximum of 150 words, referencing the specific leadership hire. 
PredictLeads Job Openings and Company API docs: https://docs.predictleads.com
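The leadership-role filter described in the Notes can be approximated with a regex like the one below. The template's actual pattern may differ, so treat this as an assumed sketch covering only the roles the description names:

```javascript
// Hypothetical sketch of the leadership-role filter; the template's
// actual regex may differ. Alternatives mirror the roles listed above.
const LEADERSHIP_RE = /\b(CRO|CMO|CTO|VP|Vice President|Head of|Chief|Director)\b/i;

function isLeadershipRole(title) {
  return LEADERSHIP_RE.test(title);
}

const openings = ['VP of Sales', 'Software Engineer', 'Head of Marketing', 'Chief Revenue Officer'];
const leadershipOpenings = openings.filter(isLeadershipRole);
// leadershipOpenings → ['VP of Sales', 'Head of Marketing', 'Chief Revenue Officer']
```

To customize the filter, add or remove alternatives in the regex (for example, `Founder` or `Principal`) before deploying.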
by Rahul Joshi
📊 Description

Most period tracking apps tell you when your period is coming. This workflow goes further: it tracks every phase of every subscriber's unique cycle, sends the right email at exactly the right time, and delivers GPT-4o powered wellness coaching every week, tailored to where each woman is in her cycle.

Built for women's health platforms, wellness coaches, femtech creators, and community builders who want to deliver genuinely useful cycle-aware health support at scale without building a custom app.

What This Workflow Does
- 📝 Subscribers fill in a simple form: name, email, last period date, and cycle length
- 🧮 Instantly calculates all key cycle dates: next period, ovulation day, fertile window start and end, and PMS window start
- 📧 Sends a personalized welcome email with their complete cycle overview
- 🕐 Runs every morning at 8 AM, checking all active subscribers
- 🔍 Detects which phase event is happening today for each subscriber
- 📬 Sends the right phase-specific reminder email on exactly the right day:
  - 3 days before period: preparation tips
  - Period start day: comfort and self-care tips
  - Ovulation day: fertility awareness and energy tips
  - PMS window start: mood, energy, and boundary tips
- 🔒 Duplicate send prevention ensures each email type is sent only once per cycle per subscriber
- 📝 Updates each subscriber's last-email-sent record after every send
- 📊 Logs every delivery to the Send Log sheet with date, phase, cycle day, and email type
- 💜 Every Sunday, generates a personalized weekly wellness digest for every subscriber using GPT-4o, based on their current cycle phase, with energy, nutrition, movement, and mindset tips

Key Benefits
- ✅ Fully automated: set up once and it runs forever
- ✅ Every subscriber gets emails timed to their unique cycle, not a generic schedule
- ✅ 4 different phase-specific reminder emails with tailored content and colors
- ✅ GPT-4o generates unique wellness tips per phase every week, never repetitive
- ✅ Duplicate send prevention: no subscriber ever gets the same email twice in one cycle
- ✅ Auto-recalculates cycle dates on period start for continuous tracking
- ✅ Full send log for tracking delivery history and engagement patterns

How It Works

SW1 — Subscriber Intake & Cycle Calculator

Subscribers open the form and enter their name, email, last period start date, and average cycle length. The workflow immediately calculates all key dates using standard cycle arithmetic: the next period is the last period plus the cycle length, ovulation is the next period minus 14 days, the fertile window opens 5 days before ovulation and closes 1 day after, and the PMS window starts 5 days before the next period. All dates are saved to the Subscribers sheet, and a branded welcome email is sent instantly, showing the subscriber their complete cycle overview with all dates laid out clearly.

SW2 — Daily Cycle Monitor & Smart Reminders

Every morning at 8 AM, the workflow reads all active subscribers and calculates where each one is in their cycle today. It checks whether today matches any of the 4 key trigger dates: 3 days before period, period start, ovulation day, or PMS start. If there is a match, it builds the appropriate phase-specific HTML email with tailored tips, colors, and messaging, and sends it via Gmail. Before sending, it checks the last-email-sent field to prevent duplicate sends within the same cycle. After every send, it updates the subscriber record and logs the delivery to the Send Log sheet.

SW3 — Weekly Wellness Digest

Every Sunday at 9 AM, the workflow reads all active subscribers and calculates each one's current cycle phase: Menstrual, Follicular, Fertile, Ovulation, or PMS. It builds a personalized prompt for each subscriber, including their name, phase, cycle day, and days until the next period, and sends it to GPT-4o. The AI generates phase-specific tips across 5 categories (energy management, nutrition, movement, mindset, and what to expect this week) plus a weekly affirmation. The response is assembled into a branded HTML email whose header color and emoji adapt automatically to the current phase. Every send is logged to the Send Log sheet.

Features
- n8n Form intake: no external form tool needed
- Automatic cycle date calculation from last period and cycle length
- 4 phase-specific trigger emails with unique content per phase
- Duplicate send prevention per cycle per subscriber
- Phase detection engine covering all 5 cycle phases
- GPT-4o weekly wellness coaching per phase
- Phase-adaptive email colors and emojis
- 5 wellness categories per digest: energy, nutrition, movement, mindset, and what to expect
- Weekly affirmation generated per phase
- Full delivery logging to the Send Log sheet
- Active subscriber filtering: easy to pause or deactivate users

Requirements
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection
- A configured Google Sheet with 2 sheets: Subscribers and Send Log

Setup Steps
1. Create a Google Sheet called "Period Health Tracker" with 2 sheets: Subscribers and Send Log
2. Paste your Sheet ID into all Google Sheets nodes
3. Connect your Google Sheets OAuth2 credentials
4. Add your OpenAI API key to the GPT-4o node
5. Connect your Gmail OAuth2 credentials

Target Audience
- 🌸 Women's health and wellness platforms delivering cycle-aware content
- 💼 Femtech creators building automated health tracking without a custom app
- 🧘 Wellness coaches who want to deliver personalized cycle coaching at scale
- 🤖 Automation agencies building health and wellness products for women's communities
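The SW1 date arithmetic described above maps directly onto a short Code-node snippet. This is a sketch under stated assumptions: the field names and date format are illustrative, not the template's actual sheet columns.

```javascript
// Sketch of the SW1 cycle-date calculation described above.
// Field names are assumptions; the template's columns may differ.
// UTC date math avoids DST-related off-by-one errors.
function addDays(isoDate, days) {
  const d = new Date(isoDate + 'T00:00:00Z');
  d.setUTCDate(d.getUTCDate() + days);
  return d.toISOString().slice(0, 10);
}

function calculateCycleDates(lastPeriod, cycleLength) {
  const nextPeriod = addDays(lastPeriod, cycleLength); // last period + cycle length
  const ovulation = addDays(nextPeriod, -14);          // next period - 14 days
  return {
    nextPeriod,
    ovulation,
    fertileStart: addDays(ovulation, -5),              // opens 5 days before ovulation
    fertileEnd: addDays(ovulation, 1),                 // closes 1 day after
    pmsStart: addDays(nextPeriod, -5),                 // 5 days before next period
  };
}

const dates = calculateCycleDates('2024-03-01', 28);
// dates.nextPeriod → '2024-03-29', dates.ovulation → '2024-03-15'
```

SW2's daily check then reduces to comparing today's date against these precomputed trigger dates for each active subscriber.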
by yuta tokumitsu
AI-Powered Japanese Social Media Content Generator with Quality Control

🎯 Who it's for

Marketing teams and social media managers in Japan who want to automate content creation while maintaining high quality standards and cultural appropriateness. Perfect for businesses that need a consistent Japanese-language social media presence with built-in compliance checks.

📝 What it does

This workflow creates an intelligent content generation system that:
- Generates culturally aware Japanese Twitter posts using GPT-4
- Automatically scores content quality across 5 dimensions (engagement, SEO, brand voice, readability, CTA)
- Performs sentiment analysis and risk detection for controversial topics
- Routes content intelligently: auto-posts high-quality/low-risk content, flags medium-risk content for approval, and rejects high-risk content
- Includes an auto-improvement loop that refines content up to 3 times if quality scores fall below 70
- Provides weekly performance analytics and recommendations

🔧 How it works

Daily Content Generation Flow:
1. A schedule trigger runs on weekday mornings at 9 AM
2. Fetches Japanese cultural context (seasons, holidays, business events)
3. Analyzes brand voice from the past 30 days of posts
4. Generates 3 Twitter post variations with GPT-4
5. Scores each post on quality metrics (100-point scale)
6. Routes low-scoring content into the auto-improvement loop
7. Checks for controversy, cultural sensitivity, and sentiment in a risk analysis step
8. Decision routing: auto-approve and post, send for manual approval, or reject

Approval Workflow:
- Pending posts trigger approval emails
- A webhook receives approval/rejection/edit actions
- Approved posts are published to Twitter and archived in Notion

Weekly Analytics:
- A Monday morning trigger analyzes the past week's performance
- GPT-4 generates an insights report with best practices
- An email with recommendations is sent to the team

⚙️ Requirements

APIs & Credentials:
- OpenAI API (GPT-4 access)
- Twitter API v2 with OAuth 2.0
- Notion API (database for content storage)
- Email sending service (SMTP or SendGrid)

Setup:
1. Create a Notion database with columns: Content, Hashtags, Quality Score, Risk Level, Status, Engagement
2. Configure OpenAI API credentials with HTTP Header Auth
3. Set up Twitter OAuth 2.0 credentials
4. Configure the email service for approval notifications

🎨 How to customize

Adjust Quality Thresholds:
- Modify the quality scoring criteria in the "AI Quality Scoring" node
- Change the auto-approval threshold (currently 70+ points)

Content Generation:
- Edit the GPT-4 prompts in the "Generate Content with GPT-4" node to match your brand tone
- Adjust temperature settings for more or less creative content
- Modify the number of posts generated per run

Risk Detection:
- Customize risk factors in the "Sentiment & Risk Analysis" node
- Add industry-specific compliance checks

Brand Voice Learning:
- Adjust the lookback period in "Get Past 30 Days Posts" (currently 30 days)
- Modify the brand voice analysis logic in the "Analyze Brand Voice" node

Scheduling:
- Change the cron expressions for daily content generation and weekly reports
- Add additional triggers for special campaigns

⚠️ Important Notes
- This workflow uses Japanese-language prompts; modify the system prompts if using it for other languages
- Ensure compliance with Twitter's API rate limits and automation policies
- Review auto-posted content regularly to validate AI quality assessments
- The workflow stores all generated content in Notion for audit trails
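The decision routing described above (auto-approve, manual approval, or reject) can be sketched as a small function. The 70-point threshold comes from the description; the risk-level labels are assumptions about how the "Sentiment & Risk Analysis" node tags content:

```javascript
// Hypothetical sketch of the decision-routing logic.
// The 70-point threshold is stated in the description;
// the riskLevel labels ('low' | 'medium' | 'high') are assumptions.
function routeContent(post) {
  if (post.riskLevel === 'high') return 'reject';
  if (post.qualityScore >= 70 && post.riskLevel === 'low') return 'auto-post';
  return 'manual-approval';
}

routeContent({ qualityScore: 85, riskLevel: 'low' });    // → 'auto-post'
routeContent({ qualityScore: 85, riskLevel: 'medium' }); // → 'manual-approval'
routeContent({ qualityScore: 40, riskLevel: 'high' });   // → 'reject'
```

Raising the threshold or tightening the risk condition here is the single place to make the workflow more conservative about what gets posted without review.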