by Oneclick AI Squad
This workflow automatically sends timely medication reminders to patients after a prescription is marked as sent in the system. It reads the medication schedule from prescription data, schedules reminders for each dosage time, and delivers notifications via WhatsApp, SMS, or email. All reminders are tracked and logged, ensuring patients stay on track with their treatment while providing healthcare providers with acknowledgment records.

📋 Simple Google Sheets Structure

**"Prescriptions" Sheet** – Required columns:
- prescription_id
- patient_name
- patient_phone
- patient_email
- medication
- dosage
- times_per_day (1, 2, 3, or 4)
- duration_days (7, 14, 30, etc.)
- start_date (YYYY-MM-DD)
- prescription_status (set to "sent")
- reminders_created (auto-updated to "yes")

**"Reminders" Sheet** (auto-created): Simple tracking of all scheduled reminders

🔧 Workflow Components (Only 10 Nodes!)

Part 1: Schedule Creation
- **Watch Sheet** → Monitors for "sent" prescriptions
- **Filter New** → Only processes unscheduled prescriptions
- **Create Schedule** → Generates reminder times automatically
- **Save Reminders** → Stores schedule in sheet
- **Mark Processed** → Prevents duplicate scheduling

Part 2: Send Reminders
- **Cron Timer** → Checks every 10 minutes
- **Get Reminders** → Retrieves all scheduled reminders
- **Find Due** → Identifies reminders due now
- **Send Messages** → WhatsApp + Email simultaneously
- **Mark Sent** → Updates status to prevent duplicates

⚙️ Simple Setup

Replace these values:
- YOUR_GOOGLE_SHEET_ID
- YOUR_WHATSAPP_PHONE_NUMBER_ID
- Email sender address

Add credentials:
- Google Sheets API
- WhatsApp API
- SMTP for email

Sample data:
- prescription_id: RX001
- patient_name: John Doe
- patient_phone: +1234567890
- patient_email: john@email.com
- medication: Amoxicillin 500mg
- dosage: 1 tablet
- times_per_day: 3
- duration_days: 7
- start_date: 2025-01-15
- prescription_status: sent
- reminders_created: no

📱 Default Schedule
- **1x daily:** 9:00 AM
- **2x daily:** 9:00 AM, 9:00 PM
- **3x daily:** 8:00 AM, 2:00 PM, 8:00 PM
- **4x daily:** 8:00 AM, 12:00 PM, 4:00 PM, 8:00 PM
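The logic of the "Create Schedule" node can be sketched as a small function run inside an n8n Code node. This is a hypothetical sketch (the template's actual node code isn't shown here); the field names follow the "Prescriptions" sheet columns and the default times listed above:

```javascript
// Default dosage times keyed by times_per_day, matching the Default Schedule.
const DEFAULT_TIMES = {
  1: ["09:00"],
  2: ["09:00", "21:00"],
  3: ["08:00", "14:00", "20:00"],
  4: ["08:00", "12:00", "16:00", "20:00"],
};

function createSchedule(rx) {
  const times = DEFAULT_TIMES[rx.times_per_day] || DEFAULT_TIMES[1];
  const start = new Date(rx.start_date + "T00:00:00Z");
  const reminders = [];
  for (let day = 0; day < rx.duration_days; day++) {
    // One calendar date per treatment day, in YYYY-MM-DD form.
    const date = new Date(start.getTime() + day * 86400000)
      .toISOString()
      .slice(0, 10);
    for (const time of times) {
      reminders.push({
        prescription_id: rx.prescription_id,
        date,
        time,
        status: "pending",
      });
    }
  }
  return reminders;
}
```

Run against the sample data (3x daily for 7 days starting 2025-01-15), this produces 21 reminder rows ready for the "Save Reminders" step.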
by Warren Gates
What it does

An AI model identifies, describes, and gives an estimated value range for a personal property item based on pictures of the item.

How it works

One or more images, and an optional description, are posted to an n8n webhook. Note that the images should be of the same item; submitting multiple images of different items will result in unpredictable output. The input is validated. The images and optional description are sent to an AI model, currently set to Anthropic's Claude Sonnet 4.6, for analysis. The model identifies the item in the image(s) and returns various properties, including name, description, and high and low value estimates. The AI model's output, along with the original images, is saved in a Notion database.

Requirements
- An AI provider account, such as OpenAI, Anthropic, or Gemini. As configured, the template uses Anthropic, but any image-capable model should work.
- A Notion account with a compatible database.

How to use
- Configure authentication for your AI provider and the webhook header authentication.
- iOS users can download a shortcut here that allows you to submit images from your photo library using the share sheet. You'll need to update the shortcut with your webhook URL and webhook header authentication secret. Both are set in the 'Get contents of' shortcut node.
- When submitting images, the Description input can be used to provide clarifying information, such as a model number, to help the AI model correctly identify the item.
- Configure a Notion database with the appropriate columns. A starter template is available here.
- Configure Notion authentication in n8n and check that column mappings are correct in the Notion node.

Customization
- Edit the system message to fine-tune output requirements.
- Use a form trigger to accept images and descriptions from desktop/laptop systems.
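The "input is validated" step could look something like the function below, run in an n8n Code node before the AI call. It is a minimal sketch under assumed field names (`images` with a `mimeType` per item, optional `description`); the real webhook payload shape depends on how you submit:

```javascript
function validateInput(body) {
  const errors = [];
  if (!Array.isArray(body.images) || body.images.length === 0) {
    errors.push("At least one image is required.");
  } else {
    for (const img of body.images) {
      // Reject non-image attachments early, before spending AI tokens.
      if (!img.mimeType || !img.mimeType.startsWith("image/")) {
        errors.push("Each attachment must be an image.");
        break;
      }
    }
  }
  if (body.description !== undefined && typeof body.description !== "string") {
    errors.push("Description must be a string.");
  }
  return { valid: errors.length === 0, errors };
}
```

Invalid requests can then be short-circuited with an If node and an error response from the webhook.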
by Hardikkumar
This workflow is the AI analysis and alerting engine for a complete social media monitoring system. It's designed to work with data scraped from X (formerly Twitter) using a tool like the Apify Tweet Scraper, which logs the data into a Google Sheet. The workflow then automatically analyzes new tweets with Google Gemini and sends tailored alerts to Slack.

How it works

This workflow automates the analysis and reporting part of your social media monitoring:
- **Tweet Hunting:** It finds tweets for the query entered in the Set node and passes the data to the Google Sheet.
- **Fetches New Tweets:** It gets all new rows from your Google Sheet that haven't been processed yet (it looks for "Notmarked" in the 'action taken' column).
- **Prepares for AI:** It combines the data from all new tweets into a single, clean prompt for the AI to analyze.
- **AI Analysis with Gemini:** It sends the compiled data to Google Gemini, asking for a full summary report and a separate, machine-readable JSON list of any urgent items.
- **Splits the Response:** The workflow intelligently separates the AI's text summary from the JSON data for urgent alerts.
- **Sends Notifications:** The high-level summary is sent to a general Slack channel (e.g., #brand-alerts). Each urgent item is sent as a separate, detailed alert to a high-priority Slack channel (e.g., #urgent).

Set up steps

It should take about 5-10 minutes to get this workflow running.
- Prerequisite - Data Source: Ensure you have a Google Sheet being populated with tweet data. For a complete automation, you can set up a new Google Sheet with the same structure for saving the tweet data and run the Tweet Scraper on a schedule.
- Configure Credentials: Make sure you have credentials set up in your n8n instance for Google Sheets, Google Gemini (PaLM) API, and Slack.
- Google Sheets Node ("Get row(s) in sheet"): Select your Google Sheet containing the tweet data. Choose the specific sheet name from the dropdown.
Ensure your sheet has a column named action taken so the filter works correctly. Google Gemini Chat Model Node: Select your Google Gemini credential from the dropdown. Slack Nodes ("Send a message" & "Send a message1"): In the first Slack node, choose the channel for the summary report. In the second Slack node, choose the channel for urgent alerts. Save and Activate: Once configured, save your workflow and turn it on!
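The "Splits the Response" step could be implemented in a Code node roughly as follows. This is a sketch under one assumption not stated in the template: that the Gemini prompt asks for the prose summary first, followed by an `URGENT_JSON:` marker and the machine-readable list (your actual delimiter may differ):

```javascript
function splitResponse(text) {
  // The prompt is assumed to ask Gemini to append urgent items
  // after a literal "URGENT_JSON:" marker.
  const marker = "URGENT_JSON:";
  const idx = text.indexOf(marker);
  if (idx === -1) return { summary: text.trim(), urgentItems: [] };
  const summary = text.slice(0, idx).trim();
  const urgentItems = JSON.parse(text.slice(idx + marker.length).trim());
  return { summary, urgentItems };
}
```

The `summary` then feeds the general Slack node, while each element of `urgentItems` becomes a separate item for the high-priority channel.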
by go-surfe
🚀 What this template does

Automatically finds and enriches key contacts in a deal’s buying group by combining the company domain from the HubSpot deal with the buying group criteria you define (departments, seniorities, countries, job titles). It then pushes these contacts into HubSpot and emails your team a clean summary with direct HubSpot links—so no decision-maker falls through the cracks.

Before starting, make sure you have:
- Buying Group Criteria Excel – Contains two sheets: Buying group reference values (reference list) and Your Buying Group Criterias (where you define your filters). You’ll import the Excel file into Google Sheets during setup.

❓ What Problem Does This Solve?

When a new opportunity/deal is created, sales teams often miss adjacent decision-makers (e.g., VP Sales, Head of Marketing). This template searches for those people, enriches their contact data, adds/updates them in HubSpot, and notifies your team with a one-glance table.

🧰 Prerequisites

To use this template, you’ll need:
- A self-hosted or cloud instance of n8n
- A Surfe API Key (Bearer token for People Search & Bulk Enrich)
- A Google Sheets account (OAuth2 or service account) with access to your criteria sheet
- A HubSpot developer account (for the HubSpot Deal Trigger)
- A standard HubSpot account (where your deals, contacts, and companies live)
- A Gmail account to send the enrichment summary email
- The workflow JSON file (included with this tutorial)
- Buying Group Criteria Excel (included with this tutorial)

📌 Your input (Google Sheets)

This workflow uses a Google Sheet with two tabs:
- Buying group reference values – A read-only reference list of all available options for the departments and seniorities columns. You’ll use this list to choose your search filters.
- Your Buying Group Criterias – The sheet where you define the actual filters used in the workflow.

⚠️ Before you start: Import the provided Excel file into your Google Sheets account so both tabs appear exactly as in the template.
How to fill in the tab “Your Buying Group Criterias”:
- departments (Column A) → Select one or more values from the reference tab. Only rows containing a value will be used in the search.
- seniorities (Column B) → Select one or more values from the reference tab. Only rows containing a value will be used in the search.
- countries (Column C) → Enter any ISO Alpha-2 country codes (e.g., fr, gb, de). This is a free-text filter.
- jobTitles (Column D) → Enter any job title keywords you want to search for (e.g., CTO, Head of Marketing). This is also a free-text filter.

The workflow will read the filled cells from each column, clean duplicates, and pass them to the Surfe People Search API.

⚙️ Setup Instructions

4.1 🔐 Create Your Credentials in n8n

4.1.1 📊 Google Sheets OAuth2 API
- Go to n8n → Credentials
- Create new credentials: Type: Google Sheets OAuth2 API
- A pop-up will open where you can log in to the Google account from which you will read the Google Sheets

4.1.2 📧 Gmail OAuth2 API
- Go to n8n → Credentials
- Create new credentials: Type: Gmail OAuth2 API
- A pop-up window will appear where you can log in with the Google account that is linked to Gmail
- Make sure you grant email send permissions when prompted

4.1.3 🚀 Surfe API
- In your Surfe dashboard → Use Surfe Api → copy your API key
- Go to n8n → Credentials → Create Credential
- Choose Credential Type: Bearer Auth
- Name it something like SURFE API Key
- Paste your API key into the Bearer Token
- Save

4.1.4 🎯 HubSpot OAuth2 API
- Go to n8n → Credentials → Create Credential → HubSpot OAuth2 API
- Make sure to select your standard HubSpot account where your companies, deals, and contacts are, and not the hubspot-developers-xxx.com account
- Done ✅

4.1.5 🔓 HubSpot Private App Token
- Go to HubSpot → Settings → Integrations → Private Apps
- Create an app with scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read
- Save the App token
- Go to n8n → Credentials → Create Credential →
HubSpot App Token
- Paste your App Token

4.1.6 🎯 HubSpot Developer API

To use the HubSpot Trigger node, you need to set up the HubSpot Developer API credential. To configure this credential, you'll need a HubSpot developer account and:
- A Client ID: Generated once you create a public app.
- A Client Secret: Generated once you create a public app.
- A Developer API Key: Generated from your Developer Apps dashboard.
- An App ID: Generated once you create a public app.

To create the public app and set up the credential:
- Log into your HubSpot app developer account.
- Select Apps from the main navigation bar.
- Select Get HubSpot API key. You may need to select the option to Show key.
- Copy the key and enter it in n8n as the Developer API Key.
- Still on the HubSpot Apps page, select Create app.
- On the App Info tab, add an App name, Description, Logo, and any support contact info you want to provide. Anyone encountering the app would see these.
- Open the Auth tab.
- Copy the App ID and enter it in n8n.
- Copy the Client ID and enter it in n8n.
- Copy the Client Secret and enter it in n8n.
- In the Scopes section, select Add new scope.
- Add all the scopes listed in Required scopes for HubSpot Trigger node to your app.
- Select Update.
- Copy the n8n OAuth Redirect URL and enter it as the Redirect URL in your HubSpot app.
- Select Create app to finish creating the HubSpot app.

Refer to the HubSpot Public Apps documentation for more detailed instructions.
✅ You are now all set with the credentials.

4.2 📥 Import and Configure the n8n Workflow

Import the provided JSON workflow into n8n:
- Create a New Blank Workflow
- Click the … on the top left
- Import from File

4.2.1 🔗 Link Nodes to Your Credentials

In the workflow, link your newly created credentials to each node in this list:
- Google Sheets → Credentials to connect with → Google Sheets Account
- Gmail Node → Credentials to connect with → Gmail account
- HubSpot: Create or Update → Credentials to connect with → HubSpot App Token Account
- HubSpot Get Company → Credentials to connect with → HubSpot App Token Account
- HubSpot get deal → Credentials to connect with → HubSpot App Token Account
- HubSpot Trigger → Credentials to connect with → HubSpot Developer account
- HTTP Node GET deal associated companies from HUBSPOT → Credential Type → HubSpot OAuth2 API
- Surfe HTTP nodes: Authentication → Generic Credential Type, Generic Auth Type → Bearer Auth, Bearer Auth → Select the credentials you created before

4.2.2 🔧 Additional Setup for the node Google Sheets READ CRITERIAS
- Paste the URL of your Google Sheet in Document → By URL
- Select the sheet Your Buying Group Criterias

🔄 How This n8n Workflow Works
- A new deal is created in HubSpot, which triggers the workflow.
- The workflow retrieves the company domain linked to that deal.
- It reads the buying group criteria from your Google Sheet (departments, seniorities, countries, job titles).
- These criteria are combined with the company domain to create a search payload for Surfe’s People Search API (limited to 200 people per run).
- Matching contacts are then sent to Surfe’s Bulk Enrichment API to retrieve emails, phone numbers, and other details.
- n8n polls Surfe until the enrichment job is complete.
- Enriched contact data is extracted and filtered so that only contacts with at least one valid email or phone number remain.
- These contacts are created or updated in HubSpot.
Finally, a Gmail summary email is sent to your team with a clean table of the new or updated contacts and direct links to view them in HubSpot.

🧩 Use Cases

Net-new deal created → instantly surface the rest of the buying group and enrich contacts.

🛠 Customization Ideas
- 🔁 Add retry logic for failed Surfe enrichment jobs
- 📤 Log enriched contacts into a Google Sheet or Airtable for auditing
- 📊 Extend the flow to generate a basic summary report of enriched vs rejected contacts
- ⏳ Trigger the enrichment not only on deal creation but also at a specific deal stage change
- 📧 Send the summary email to multiple recipients or a team mailing list

✅ Summary

This template automates buying-group discovery and enrichment from a new HubSpot deal, writes enriched contacts back to HubSpot, and emails a neat table to your team—so reps focus on outreach, not admin. Import it, connect credentials, point it at your criteria sheet, and let Surfe do the rest.
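The step that reads the criteria columns, cleans duplicates, and builds the Surfe People Search payload can be sketched as below. The payload field names (`companies.domains`, `people.departments`, etc.) are assumptions for illustration — verify them against the Surfe People Search API documentation before use:

```javascript
function buildSearchPayload(rows, companyDomain) {
  // Collect non-empty, de-duplicated values from one sheet column.
  const pick = (key) =>
    [...new Set(rows.map((r) => (r[key] || "").trim()).filter(Boolean))];

  return {
    companies: { domains: [companyDomain] },
    people: {
      departments: pick("departments"),
      seniorities: pick("seniorities"),
      countries: pick("countries"),
      jobTitles: pick("jobTitles"),
    },
    limit: 200, // matches the 200-people-per-run cap described above
  };
}
```

The result can be passed directly as the JSON body of the Surfe HTTP Request node.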
by Jose Castillo
Track your daily mood in one tap and receive automated AI summaries of your emotional trends every week and month. Perfect for self-reflection, wellness tracking, or personal analytics. This workflow logs moods sent through a webhook (/mood) into Data Tables, analyzes them weekly and monthly with OpenAI (GPT-4o), and emails you clear summaries and actionable recommendations via Gmail. ⚙️ How It Works Webhook – Mood → Collects new entries (🙂, 😐, or 😩) plus an optional note. Set Mood Data → Adds date, hour, and note fields automatically. Insert Mood Row → Stores each record in a Data Table. Weekly Schedule (Sunday 20:00) → Aggregates the last 7 days and sends a summarized report. Monthly Schedule (Day 1 at 08:00) → Aggregates the last 30 days for a deeper AI analysis. OpenAI Analysis → Generates insights, patterns, and 3 actionable recommendations. Gmail → Sends the full report (chart + AI text) to your inbox. 📊 Example Auto-Email Weekly Mood Summary (last 7 days) 🙂 5 ██████████ 😐 2 ████ 😩 0 Average: 1.7 (Positive 🙂) AI Insights: You’re trending upward this week — notes show that exercise days improved mood. Try keeping short walks mid-week to stabilize energy. 🧩 Requirements n8n Data Tables enabled OpenAI credential (GPT-4o or GPT-4 Turbo) Gmail OAuth2 credential to send summaries 🔧 Setup Instructions Connect your credentials: Add your own OpenAI and Gmail OAuth2 credentials. Set your Data Table ID: Open the Insert Mood Row node and enter your own Data Table ID. Without this, new moods won’t be stored. Replace the email placeholder: In the Gmail nodes, replace your.email@example.com with your actual address. Deploy and run: Send a test POST request to /mood (e.g. { "mood": "🙂", "note": "productive day" }) to log your first entry. ⚠️ Before activating the workflow, ensure you have configured the Data Table ID in the “Insert Mood Row” node. 🧠 AI Analysis Interprets mood patterns using GPT-4o. Highlights trends, potential triggers, and suggests 3 specific actions. 
Runs automatically every week and month. 🔒 Security No personal data is exposed outside your n8n instance. Always remove or anonymize credential references before sharing publicly. 💡 Ideal For Personal mood journaling and AI feedback Therapists tracking client progress Productivity or self-quantification projects 🗒️ Sticky Notes Guide 🟡 Mood Logging Webhook POST /mood receives mood + optional note. ⚠️ Configure your own Data Table ID in the “Insert Mood Row” node before running. 🟢 Weekly Summary Runs every Sunday 20:00 → aggregates last 7 days → generates AI insights + emails report. 🔵 Monthly Summary Runs on Day 1 at 08:00 → aggregates last 30 days → creates monthly reflection. 🟣 AI Analysis Uses OpenAI GPT-4o to interpret trends and recommend actions. 🟠 Email Delivery Sends formatted summaries to your inbox automatically.
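The weekly aggregation behind the example email can be sketched as a Code-node function. The 2/1/0 scoring is inferred from the sample output (5×🙂 + 2×😐 over 7 entries gives the quoted 1.7 average) and is an assumption, not confirmed template code:

```javascript
// Assumed scoring: 🙂 = 2, 😐 = 1, 😩 = 0.
const MOOD_SCORE = { "🙂": 2, "😐": 1, "😩": 0 };

function summarizeMoods(entries) {
  const counts = { "🙂": 0, "😐": 0, "😩": 0 };
  let total = 0;
  for (const e of entries) {
    if (e.mood in counts) {
      counts[e.mood]++;
      total += MOOD_SCORE[e.mood];
    }
  }
  const n = counts["🙂"] + counts["😐"] + counts["😩"];
  const average = n ? +(total / n).toFixed(1) : 0;
  // Two blocks per count reproduces the bar style in the example email.
  const bars = Object.entries(counts)
    .map(([mood, c]) => `${mood} ${c} ${"█".repeat(c * 2)}`)
    .join("\n");
  return { counts, average, bars };
}
```

The `bars` string and `average` can be interpolated into the Gmail body alongside the GPT-4o insights.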
by Mirai
This n8n template automates targeted lead discovery, AI-driven data structuring, and personalized cold-email sending at controlled intervals. It’s ideal for sales teams, founders, and agencies that want to scale outreach without losing personalization. Good to know Can run on an interval (e.g., every 10 minutes) to fetch and process new leads. Requires API keys for OpenAI (content + parsing) and Apify (lead discovery). Emails are sent one-by-one with delays (the Wait node) to reduce spam risk. Lead data is written to Google Sheets—we recommend separate sheets for leads with and without emails. Works with Gmail, Outlook, or your own SMTP—just plug in your credentials. How it works Form Trigger (START) A form collects: Job Title, Company Size, Keywords, Location. Apollo URL Generator (GPT) The model turns the form fields into a precise Apollo search URL. Run Apify (Actor) Apify fetches contacts/companies that match your preferences for downstream processing. Limit Caps how many records are prepared per run (e.g., max 5). Parse Lead Data (GPT) Extracts key fields (full name, email, title, LinkedIn, company, company links). Synthesizes a short 2–3 sentence sales-ready summary for each lead. Sorting (If) Splits leads into with email vs. without email. With email → main sheet + email pathway Without email → a separate sheet for later enrichment Email Magic (GPT) Uses the parsed data to personalize your fixed email template for each lead (keeps structure/intent, swaps in the right details). Sending Emails (Loop + Wait + Sender) Loop Over sends messages individually. Wait inserts a pause between sends (fully configurable). Delivery via Gmail or SMTP (custom domain / Outlook). Confirmation After the loop finishes, a Gmail node sends a “campaign complete” confirmation. How to use Enable the workflow and open the start form. Enter preferences: job title, company size, keywords, location. 
Add credentials: OpenAI (for parsing + email generation) Apify (Bearer token in Run Apify) Google (Sheets + optionally Gmail) SMTP/Outlook (if not using Gmail) Set limits (the Limit node) and send interval (the Wait node). Choose sheets for leads with/without email. Run—the workflow will fetch leads, prepare emails, and send them with spacing. Requirements OpenAI API key Apify API token (access to the chosen Actor) Google Sheets for storage Gmail or SMTP/Outlook credentials for sending An operational n8n instance Customising this workflow Email template: Edit the text in “Creating a email” while preserving placeholders. Segmentation: Add more conditions (role, industry, country) and route to different templates/sheets. Follow-ups: Add a second loop that reads statuses and sends timed reminders. Data enrichment: Insert additional APIs before “Parse Lead Data.” Anti-spam: Increase Wait duration, rotate senders, vary subject lines. Reporting: Add a “send status” sheet and an error log. Security & compliance tips Store API keys in n8n Credentials, not plain-text nodes. Respect GDPR/opt-out—track source and first-contact date in your sheet. Start with a small batch, validate deliverability, then scale up. In short Automated lead capture → AI cleaning + summary → personalized emails → spaced sending → completion notice. Scalable, customizable, and ready to plug into your preferred sender and template.
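The "Sorting (If)" split into leads with and without an email could be expressed as a simple function (a sketch — the template uses an n8n If node, and the `email` field name is assumed from the parsed lead data):

```javascript
function sortLeads(leads) {
  const withEmail = [];
  const withoutEmail = [];
  for (const lead of leads) {
    const email = (lead.email || "").trim();
    // A bare "@" check is a deliberately loose filter; leads that fail it
    // go to the enrichment sheet rather than being dropped.
    if (email.includes("@")) withEmail.push(lead);
    else withoutEmail.push(lead);
  }
  return { withEmail, withoutEmail };
}
```

`withEmail` flows to the main sheet and the email pathway; `withoutEmail` lands in the separate sheet for later enrichment.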
by Rajeet Nair
📖 Description 🔹 How it works This workflow uses AI (Mistral LLM + Pollinations.ai) to generate high-quality visual content for social media campaigns. It automates the process from brand/campaign input to final image upload, ensuring consistency and relevance. Input Brand & Campaign Data Retrieves brand profile and campaign goals from Google Drive. Cleans and merges the data into a structured JSON format. Campaign Goal Generation AI summarizes campaign goals, audience, success metrics, and keywords. Produces a clear campaign goal summary for content planning. Image Prompt Generation AI creates 5 detailed image prompts reflecting the campaign story. Includes 1 caption and 4–6 relevant hashtags. Image Creation Pollinations.ai generates images based on the AI prompts. Each image is renamed systematically (photo1 → photo5). Post-Processing & Upload All images are merged into a single item. Workflow uploads the final output to Google Drive for campaign use. ⚙️ Set up steps Connect Credentials Add Google Drive and Mistral API credentials in n8n. Configure Google Drive Input Nodes Set fileId for brand profile and campaign goals. Customize AI Prompts Sticky notes explain AI nodes for goal summary and image prompt generation. Optionally modify tone, keywords, or target audience for brand-specific campaigns. Check Image Output Nodes Ensure Pollinations.ai HTTP request nodes are active. Verify renaming code nodes for proper photo sequence. Activate Workflow Test workflow manually to ensure images are generated and uploaded correctly. 🔹 Data Handling & Output This workflow pulls brand profile and campaign goal data from Google Drive. Data is processed into structured JSON, including: Brand Profile: name, mission, vision, values, services, tone, keywords, contact info. Campaign Goal: primary goal, focus, success metrics, target audience, core message. Supports population of multiple campaigns or brands dynamically. 
JSON output can be used downstream for image prompt generation, reporting, or analytics. All processing is automated, with clear nodes for extraction, parsing, and merging. Pollinations.ai is a free, open-source text and image generation API: no signups or API keys are required, and it prioritizes your privacy with zero data storage and completely anonymous usage. ⚡ Result: A fully automated AI-to-image workflow that transforms campaign goals into ready-to-use social media visuals, saving time and maintaining brand consistency.
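The renaming code nodes that produce the photo1 → photo5 sequence could look roughly like this (a sketch; the actual item shape depends on the Pollinations.ai HTTP response):

```javascript
function renamePhotos(items) {
  // Give each generated image a sequential, predictable file name
  // so downstream merge/upload nodes can rely on the ordering.
  return items.map((item, i) => ({
    ...item,
    fileName: `photo${i + 1}.png`,
  }));
}
```

With five generated images, this yields photo1.png through photo5.png before the merge-and-upload step.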
by Parth Pansuriya
AI Meeting Summary Generator with Google Docs Integration Who’s it for Teams that record meetings and want fast, clear summaries without manual note-taking. Managers who need action items extracted automatically. Anyone using Google Drive + Google Docs as their central workspace. How it works / What it does This workflow automates meeting documentation: Watches a Google Drive folder for new audio/video meeting files. Downloads the file and transcribes speech into text using Gemini AI. Summarizes transcripts into Key Discussions and Action Items. Creates or updates a Google Doc with the formatted summary (title, bullets, checkmarks, styling). Sends final output to Docs with bold headings, bullets, and spacing for readability. How to set up Add your Google Drive Trigger to monitor a folder. Connect Gemini AI to handle transcription + summarization. Configure the Google Docs Tool to create/update your summary documents. (Optional) Use the Code Node + Docs API to apply bullet/checkmark formatting. Requirements Google Drive OAuth2 – for monitoring & downloading files Google Docs OAuth2 – for creating and updating documents Google Gemini API – for transcription + AI-powered summarization How to customize the workflow Change the Google Drive folder to monitor a different workspace. Edit the system prompt in the Summarizer to tweak summary style (e.g., more detail, decisions only, etc.). Modify the Code Node formatting rules (bullets, checkmarks, bold text). Add integrations (e.g., Slack, Email, Notion) to send summaries beyond Google Docs.
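The optional Code Node formatting step could be sketched as below. The summary field names (`title`, `keyDiscussions`, `actionItems`) are assumptions matching the sections described above, not the template's actual schema:

```javascript
function formatSummary(summary) {
  // Render the AI summary as plain text with bullets and checkmarks,
  // ready to insert into a Google Doc (bold/styling would be applied
  // separately via Docs API batchUpdate requests).
  const lines = [summary.title, "", "Key Discussions:"];
  for (const point of summary.keyDiscussions) lines.push(`• ${point}`);
  lines.push("", "Action Items:");
  for (const item of summary.actionItems) lines.push(`✔ ${item}`);
  return lines.join("\n");
}
```

The returned string becomes the document body; heading styles and bold runs can then be layered on with Docs API formatting requests.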
by Robert Breen
This n8n workflow template creates an intelligent data analysis chatbot that can answer questions about data stored in Google Sheets using OpenAI's GPT-5 Mini model. The system automatically analyzes your spreadsheet data and provides insights through natural language conversations.

What This Workflow Does
- **Chat Interface**: Provides a conversational interface for asking questions about your data
- **Smart Data Analysis**: Uses AI to understand column structures and data relationships
- **Google Sheets Integration**: Connects directly to your Google Sheets data
- **Memory Buffer**: Maintains conversation context for follow-up questions
- **Automated Column Detection**: Automatically identifies and describes your data columns

🚀 Try It Out!

1. Set Up OpenAI Connection
- Visit the OpenAI API Keys page to get your API key.
- Go to OpenAI Billing and add funds to your billing account.
- Copy your API key into your OpenAI credentials in n8n (or your chosen platform).

2. Prepare Your Google Sheet
- Connect your data in Google Sheets; data must follow this format: Sample Marketing Data.
- **First row** contains column names.
- Data should be in rows 2–100.
- Log in using OAuth, then select your workbook and sheet.

3. Ask Questions of Your Data

You can ask natural language questions to analyze your marketing data, such as:
- **Total spend** across all campaigns
- **Spend for Paid Search only**
- **Month-over-month changes** in ad spend
- **Top-performing campaigns** by conversion rate
- **Cost per lead** for each channel

📬 Need Help or Want to Customize This?
📧 rbreen@ynteractive.com
🔗 LinkedIn
🔗 n8n Automation Experts
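The "Automated Column Detection" idea can be sketched as a Code-node function that inspects the sheet rows (as n8n's Google Sheets node returns them, one object per row keyed by the header names). The type heuristic is an assumption for illustration:

```javascript
function describeColumns(rows) {
  if (rows.length === 0) return [];
  return Object.keys(rows[0]).map((name) => {
    // First non-empty value in the column serves as a sample.
    const sample = rows.map((r) => r[name]).find((v) => v !== "" && v != null);
    const numeric =
      typeof sample === "number" ||
      (typeof sample === "string" && sample !== "" && !isNaN(Number(sample)));
    return { name, type: numeric ? "numeric" : "text", sample };
  });
}
```

The resulting column descriptions can be injected into the AI system prompt so the model knows what it can aggregate and compare.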
by Oneclick AI Squad
This automated n8n workflow enables an AI-powered movie recommendation system on WhatsApp. Users send messages like "I want to watch a horror movie" or "Where can I watch the Jumanji movie?" The workflow uses AI to interpret the request, searches relevant APIs (e.g., TMDb, JustWatch), and replies with movie recommendations or streaming platform availability via WhatsApp.

Fundamental Aspects
- **WhatsApp Webhook Trigger**: Initiates the workflow when a WhatsApp message is received.
- **Analyze WhatsApp Message**: Uses AI (e.g., Ollama Model) to interpret the user's intent and extract the request type.
- **Check Request Type**: Determines if the request is for a movie genre or a specific movie title.
- **Check Where Request**: Identifies if the request includes a "where to watch" query.
- **Extract Movie Title**: Extracts the movie title from the message if specified.
- **Extract Genre**: Identifies the movie genre from the message if specified.
- **Search Specific Movie Title**: Queries an API (e.g., TMDb) for details about a specific movie.
- **Search Movies by Genre**: Queries an API (e.g., TMDb) for movies matching the genre.
- **Get Streaming Availability**: Queries an API (e.g., JustWatch) for streaming platforms.
- **Format Streaming Response**: Prepares the response with streaming platform details.
- **Format Genre Recommendations**: Prepares the response with genre-based movie recommendations.
- **Prepare WhatsApp Message**: Formats the final response for WhatsApp.
- **Send WhatsApp Response**: Sends the recommendation or streaming info back to the user via WhatsApp.

Setup Instructions
- Import the Workflow into n8n: Download the workflow JSON and import it via the n8n interface.
- Configure API Credentials: Set up WhatsApp Business API credentials with a valid phone number and token. Configure a TMDb API key (e.g., https://api.themoviedb.org). Configure a JustWatch API key (e.g., https://api.watchmode.com). Set up AI model credentials (e.g., Ollama Model).
- Run the Workflow: Activate the webhook trigger and test with a WhatsApp message.
- Verify Responses: Check WhatsApp for accurate movie recommendations or streaming info.
- Adjust Parameters: Fine-tune API endpoints or the AI model as needed.

Features
- **AI Interpretation**: Uses AI to analyze user intents (genre or movie title).
- **API Integration**: Searches TMDb for movie details and JustWatch for streaming availability.
- **Real-Time Responses**: Sends instant replies via WhatsApp.
- **Custom Recommendations**: Provides genre-based or specific movie recommendations.

Technical Dependencies
- **WhatsApp Business API**: For receiving and sending messages.
- **TMDb API**: For movie details and genre searches.
- **JustWatch API**: For streaming availability.
- **Ollama Model**: For AI-based message analysis.
- **n8n**: For workflow automation and integration.

Customization Possibilities
- **Add More APIs**: Integrate additional movie databases (e.g., IMDb).
- **Enhance AI**: Train the Ollama Model for better intent recognition.
- **Support More Languages**: Add multilingual support for WhatsApp responses.
- **Add Email Alerts**: Include email notifications for admin monitoring.
- **Customize Responses**: Adjust the format of recommendations or streaming info.
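The routing done by the "Check Request Type" and "Check Where Request" nodes can be illustrated with a simple keyword fallback (a hypothetical sketch — the template delegates this classification to the AI model, and the keyword lists here are illustrative):

```javascript
function classifyRequest(message) {
  const text = message.toLowerCase();
  // "Where to watch" queries take priority over genre/title detection.
  if (/where (can|do) i watch|streaming|available on/.test(text)) {
    return "where_to_watch";
  }
  // A small illustrative genre list; extend as needed.
  if (/\b(horror|comedy|action|drama|thriller|romance|sci-fi)\b/.test(text)) {
    return "genre";
  }
  return "title";
}
```

"genre" routes to Search Movies by Genre, "title" to Search Specific Movie Title, and "where_to_watch" to Get Streaming Availability.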
by Yang
Who’s it for This template is perfect for SEO writers, niche bloggers, and content marketers who want to generate high-quality blog posts from a single keyword without spending hours on research and writing. If you often find yourself stuck at the research stage or manually drafting blog content, this workflow automates the entire process from topic discovery to publication. What it does The workflow takes a keyword, performs a Google search using Dumpling AI, analyzes the top-ranking pages and People Also Ask (PAA) questions, and then uses GPT-4 to generate a detailed blog post based on the most valuable question. The blog draft is sent for approval via email, and once approved, it’s automatically published to WordPress. Here’s what happens step by step: Receives a keyword through a simple form Uses Dumpling AI to perform a Google search and extract: Top 2 organic search results People Also Ask (PAA) questions and answers Top related searches Filters for insightful PAA questions Sends the data to GPT-4 to generate a blog post in JSON format Emails the draft blog post for manual review and approval If approved, publishes the post automatically to WordPress How it works Form Trigger: Captures the keyword input Dumpling AI: Searches Google and extracts SEO data including top results, PAA, and related searches Code Node: Processes the raw search data into a structured format for GPT-4 Filter Node: Checks if PAA questions are available GPT-4: Chooses a strong PAA question and writes the blog post Gmail: Sends the draft blog post to your inbox for review Approval Node: Waits for manual approval WordPress: Publishes the approved post automatically Requirements ✅ Dumpling AI API key stored securely as credentials ✅ OpenAI GPT-4 credentials ✅ Gmail account with OAuth2 connected to n8n ✅ WordPress account with API credentials configured How to customize Edit the GPT-4 prompt to control the blog structure, tone, or style Add extra filters to select specific types of PAA 
questions (e.g., how-to, guides) Change the review recipient email in the Gmail node Add additional formatting or SEO optimization steps before publishing Integrate with Notion, Airtable, or Slack to log or notify team members after publication > This workflow turns a single keyword into a fully researched, GPT-4 generated, and auto-published blog post — helping you scale content creation efficiently while maintaining quality.
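The Code Node that processes the raw Dumpling AI search data into a structured format for GPT-4 could look roughly like this. The response field names (`organic`, `peopleAlsoAsk`, `relatedSearches`) and the minimum-length filter are assumptions — check them against the actual Dumpling AI response:

```javascript
function extractPaa(searchResult) {
  // Keep only substantive PAA questions (very short ones are rarely useful).
  const paa = (searchResult.peopleAlsoAsk || [])
    .filter((q) => q.question && q.question.length > 15)
    .map((q) => ({ question: q.question, answer: q.answer || "" }));

  return {
    hasPaa: paa.length > 0, // feeds the Filter node check
    topResults: (searchResult.organic || []).slice(0, 2),
    paa,
    relatedSearches: searchResult.relatedSearches || [],
  };
}
```

The `hasPaa` flag drives the Filter node, and the rest of the object becomes the GPT-4 prompt context.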
by Trung Tran
Chat-Based AWS IAM Policy Generator with OpenAI Agent

> Chat-driven workflow that lets IT and DevOps teams generate custom AWS IAM policies via AI, automatically apply them to AWS, and send an email notification with policy details.

👤 Who’s it for

This workflow is designed for:
- **Cloud Engineers / DevOps** who need to quickly generate and apply **custom IAM policies** in AWS.
- **IT Support / Security teams** who want to create IAM policies through a **chat-based interface** without manually writing JSON.
- Teams that want automatic notifications (via email) once new policies are created.

⚙️ How it works / What it does
- Trigger → Workflow starts when a chat message is received.
- IAM Policy Creator Agent → Uses OpenAI to interpret user requirements (e.g., service, actions, region) and generate a valid IAM policy JSON following AWS best practices.
- IAM Policy HTTP Request → Sends the generated policy to the AWS IAM CreatePolicy API.
- Email Notification → Once AWS responds with a CreatePolicyResponse, an email is sent with policy details (name, ARN, ID, timestamps, etc.) using n8n mapping.

Result: The user can chat with the AI agent, create a policy, and receive an email confirmation with full details.

🛠 How to set up
- Chat Trigger Node: Configure the When chat message received node to connect your preferred chat channel (Slack, MS Teams, Telegram, etc.).
- IAM Policy Creator Agent: Add OpenAI Chat Model as the LLM. Use a system prompt that enforces AWS IAM JSON best practices (least privilege, correct JSON structure). Connect Memory (Simple Memory) and Structured Output Parser to ensure consistent JSON output.
- IAM Policy HTTP Request: Set method: POST, URL: https://iam.amazonaws.com/, and add authentication using AWS Signature v4 (Access Key + Secret Key). Body: Action=CreatePolicy, PolicyName={{ $json.CreatePolicyResponse.CreatePolicyResult.Policy.PolicyName }}, PolicyDocument={{ $json.policyDocument }}, Version=2010-05-08
- Email for tracking

📋 Requirements

n8n instance (self-hosted or cloud).
AWS IAM user/role with permission to iam:CreatePolicy. AWS Access Key + Secret Key (for SigV4 signing in the HTTP request). OpenAI API key (for the Chat Model). Email server credentials (SMTP or provider integration).

🎨 How to customize the workflow
- **Restrict services/actions** → Adjust the IAM Policy Creator Agent system prompt to limit what services/policies can be generated.
- **Notification channels** → Replace the email node with Slack, MS Teams, or PagerDuty to alert other teams.
- **Tagging policies** → Modify the HTTP request to include Tags when creating policies in AWS.
- **Human-readable timestamps** → Add a Function or Set node to convert CreateDate and UpdateDate from Unix epoch to ISO datetime before sending emails.
- **Approval step** → Insert a manual approval node before sending the policy to AWS for compliance workflows.
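The "human-readable timestamps" customization above can be done with a few lines in a Function node. This is a minimal sketch assuming CreateDate/UpdateDate arrive as Unix epoch seconds, as they do in the IAM CreatePolicy response:

```javascript
function humanizePolicyDates(policy) {
  // Convert epoch seconds to ISO-8601 strings for the email body.
  const toIso = (epoch) => new Date(Number(epoch) * 1000).toISOString();
  return {
    ...policy,
    CreateDate: toIso(policy.CreateDate),
    UpdateDate: toIso(policy.UpdateDate),
  };
}
```

Place this between the HTTP Request node and the email node so the mapped fields are already readable.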