by Anshul Chauhan
Deploy a Multi-Tool AI Assistant on WhatsApp with Google Gemini

Deploy a true AI assistant on WhatsApp. This n8n workflow uses a hierarchical agent structure to not only handle conversations but also manage your emails and calendar directly from your chat, all powered by Google Gemini.

**Key Features**
- **Powered by Google Gemini:** Utilizes the advanced capabilities of Google's Gemini models for understanding complex commands and generating natural, human-like responses.
- **Intelligent Task Delegation (Hierarchical Agents):** Features a central **Personal Agent** that understands the user's intent and intelligently delegates tasks to specialized sub-agents for email, calendar, or general chat.
- **Full Email & Calendar Management:** Connects directly to your Google Workspace to send emails, create drafts, apply labels, create/update/delete calendar events, check your availability, and more.
- **Context-Aware Conversations:** Employs memory at multiple levels, allowing the assistant to remember the context of your requests for a coherent and intuitive user experience.
- **Seamless WhatsApp Integration:** Connects directly with the WhatsApp Business API to send and receive messages, engaging users on one of the world's most popular messaging platforms.
- **Easy to Deploy & Customize:** Get your assistant running with minimal configuration and easily extend its capabilities by adding new tools or modifying the prompts of the existing agents.

**How It Works**
The workflow uses an agent-based model to process incoming messages:
1. The Whatsapp Trigger node listens for and receives new messages sent to your WhatsApp Business number.
2. The message is passed to the main **Personal Agent** (the manager agent), which analyzes it to understand the user's intent (e.g., "send an email," "check my schedule," or just "hello").
3. Based on the intent, it routes the task to the appropriate sub-agent: the Email Tool, the Calendar Tool, or the general Chatbot Model (a small routing sketch appears at the end of this section).
4. The selected sub-agent executes the task using its own dedicated tools (e.g., the Email Tool uses Gmail nodes to send a message).
5. The result or response from the sub-agent is passed back to the Send message (WhatsApp) node, which delivers the reply to the user.

**Prerequisites**
- An active n8n instance.
- A Meta Business Account and a configured Meta App with the "WhatsApp Business" product added.
- A Google Gemini API Key.
- A Google Account with pre-configured OAuth2 credentials in n8n for Gmail and Google Calendar.

**Step-by-Step Setup Guide**
1. **Configure WhatsApp Credentials:** In your n8n instance, add new "WhatsApp Business" credentials. You will need a Permanent Access Token and a Phone Number ID from your Meta App's "WhatsApp > API Setup" dashboard.
2. **Set Up the WhatsApp Trigger:** Open the Whatsapp Trigger node and, in the "Webhook URL" section, copy the Test URL. Go to your Meta App's dashboard under "WhatsApp > Configuration", click "Edit" in the Webhooks section, and paste the n8n Test URL into the Callback URL field. Create and enter a Verify Token (a simple password of your choice), then enter this same token in the Whatsapp Trigger node in n8n. Subscribe to the messages webhook event. Once verified, copy the Production URL from n8n and paste it into the same Callback URL field in the Meta dashboard.
3. **Configure the Google Gemini Nodes:** Add your Google Gemini API Key to the credentials for all the Google Gemini Chat Model nodes. This includes the ones in the Chatbot Model, Email Tool, and Calendar Tool.
4. **Configure the Google Tools (Email & Calendar):**
   - **Email Tool:** Open the group of nodes labeled **Email Tool**. For every Gmail node (Send Email, Create Draft, Get Labels, etc.), select your pre-configured Google OAuth2 credential.
   - **Calendar Tool:** Open the group of nodes labeled **Calendar Tool**. For every Google Calendar node (Create Event, Get all event, etc.), select your pre-configured Google OAuth2 credential.
5. **Activate and Test:** Save and activate the workflow, then send a message to your configured WhatsApp Business number.
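For illustration, here is a minimal sketch of the kind of intent routing the Personal Agent performs before delegating to a sub-agent. The keyword rules, function name, and the `SubAgent` type are hypothetical — in the actual workflow this decision is made by the Gemini-powered agent, not by hard-coded rules.

```typescript
// Minimal sketch of intent-based delegation (illustrative only; the real
// workflow lets the Gemini-powered Personal Agent decide which tool to call).
type SubAgent = "emailTool" | "calendarTool" | "chatbotModel";

function routeIntent(message: string): SubAgent {
  const text = message.toLowerCase();
  // Hypothetical keyword heuristics standing in for the LLM's intent detection.
  if (/\b(email|mail|draft|inbox|label)\b/.test(text)) return "emailTool";
  if (/\b(calendar|schedule|meeting|event|availability)\b/.test(text)) return "calendarTool";
  return "chatbotModel";
}

console.log(routeIntent("Can you check my schedule for tomorrow?")); // calendarTool
console.log(routeIntent("Send an email to the design team"));        // emailTool
console.log(routeIntent("Hello!"));                                  // chatbotModel
```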
by Daniel
Transform any website into a custom logo in seconds with AI-powered analysis—no design skills required!

**📋 What This Template Does**
This workflow receives a website URL via webhook, captures a screenshot and fetches the page content, then leverages OpenAI to craft an optimized prompt based on the site's visuals and text. Finally, Google Gemini generates a professional logo image, which is returned as a binary response for immediate use.
- Automates screenshot capture and content scraping for comprehensive site analysis
- Intelligently generates tailored logo prompts using multimodal AI
- Produces high-quality, context-aware logos with Gemini's image generation
- Delivers the logo directly via webhook response

**🔧 Prerequisites**
- n8n self-hosted or cloud instance with webhook support
- ScreenshotOne account for website screenshots
- OpenAI account with API access
- Google AI Studio account for the Gemini API

**🔑 Required Credentials**
- **ScreenshotOne API Setup:** Sign up at screenshotone.com and navigate to Dashboard → API Keys. Generate a new access key with screenshot permissions. In the workflow, replace "[Your ScreenshotOne Access Key]" in the "Capture Website Screenshot" node with your key (no n8n credential needed—it's an HTTP query param).
- **OpenAI API Setup:** Log in to platform.openai.com → API Keys and create a new secret key with chat completions access. Add it to n8n as an "OpenAI API" credential and assign it to the "OpenAI Prompt Generator" node.
- **Google Gemini API Setup:** Go to aistudio.google.com/app/apikey and create a new API key (free tier available). Add it to n8n as a "Google PaLM API" credential and assign it to the "Generate Logo Image" node.

**⚙️ Configuration Steps**
1. Import the workflow JSON into your n8n instance.
2. Assign the required credentials to the OpenAI and Google Gemini nodes.
3. Replace the placeholder API key in the "Capture Website Screenshot" node's query parameters.
4. Activate the workflow to enable the webhook.
5. Test by sending a POST request to the webhook URL with the JSON body `{"websiteUrl": "https://example.com"}` (see the sketch below).

**🎯 Use Cases**
- **Marketing teams prototyping brand assets:** Quickly generate logo variations for client websites during pitches, saving hours on manual design.
- **Web developers building portfolios:** Auto-create matching logos for new sites to enhance visual consistency in demos.
- **Freelance designers iterating ideas:** Analyze competitor sites to inspire custom logos without starting from scratch.
- **Educational projects on AI design:** Teach students how multimodal AI combines text and images for creative outputs.

**⚠️ Troubleshooting**
- **Screenshot fails (timeout/error):** Increase the "timeout" param to 120s or check URL accessibility; verify your API key and quotas at screenshotone.com.
- **Prompt generation empty:** Ensure the OpenAI credential has sufficient quota; test the node in isolation with a simple query.
- **Logo image blank or low-quality:** Refine the prompt in "Generate Logo Prompt" with more specifics (e.g., add style keywords); check Gemini API limits.
- **Webhook not triggering:** Confirm the POST method and JSON body format; view execution logs for payload details.
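As a quick test client, the sketch below sends the documented POST request and saves the binary logo returned by the webhook. The webhook URL is a placeholder and the output filename is an assumption; the exact response format depends on how the final webhook response node is configured.

```typescript
// Test client for the logo webhook (Node 18+, global fetch).
// The URL is a placeholder; replace it with your n8n webhook URL.
import { writeFile } from "node:fs/promises";

async function requestLogo(websiteUrl: string): Promise<void> {
  const res = await fetch("https://your-n8n-instance/webhook/logo-generator", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ websiteUrl }),
  });
  if (!res.ok) throw new Error(`Webhook returned ${res.status}`);

  // The workflow returns the generated logo as binary data.
  const bytes = Buffer.from(await res.arrayBuffer());
  await writeFile("logo.png", bytes); // assumed PNG output
  console.log(`Saved ${bytes.length} bytes to logo.png`);
}

requestLogo("https://example.com").catch(console.error);
```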
by noda
**Overview**
Auto-translate YouTube uploads to Japanese and post to Slack (DeepL + Slack).

**Who’s it for**
Marketing or community teams that follow English-speaking creators but share updates with Japanese audiences; language learners who want Japanese summaries of newly released videos; internal comms teams curating industry channels for a Japanese workspace.

**What it does**
This workflow detects new YouTube uploads, retrieves full metadata, translates the title and description into Japanese using DeepL, and posts a formatted message to a Slack channel. It also skips non-English titles to avoid unnecessary translation.

**How it works**
・RSS watches a channel for new items.
・The YouTube API fetches the full snippet (title/description).
・Text is combined into a single payload and sent to DeepL (see the sketch below).
・The translated result plus the original metadata is merged and posted to Slack.

**Requirements**
・YouTube OAuth (for reliable snippet retrieval)
・DeepL API key (Free or Pro)
・Slack OAuth

**How to set up**
・Duplicate this template.
・Open the Config (Set) node and fill in YT_CHANNEL_ID, TARGET_LANG, SLACK_CHANNEL.
・Connect credentials for YouTube, DeepL, and Slack (don’t hardcode API keys in HTTP nodes).
・Click Execute workflow and verify one sample post.

**How to customize**
・Change TARGET_LANG to any language supported by DeepL.
・Add filters (exclude Shorts, skip videos under N characters).
・Switch to Slack Blocks for richer formatting or thread replies.
・Add a fallback translator or retry logic on HTTP errors.

**Notes & limits**
DeepL Free and Pro have different endpoints, quotas, and monthly character limits. YouTube and Slack also enforce rate limits. Keep credentials in n8n’s credential store; do not commit keys into templates, and rotate keys if you accidentally expose them.
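For reference, here is a minimal sketch of the DeepL translate call the HTTP step performs. It assumes the Free endpoint (api-free.deepl.com) and a DEEPL_API_KEY environment variable; in the workflow itself the key should live in n8n's credential store, not in code.

```typescript
// Minimal DeepL v2 translate call (Free endpoint; Pro uses api.deepl.com).
// Assumes DEEPL_API_KEY is set in the environment.
interface DeepLResponse {
  translations: { detected_source_language: string; text: string }[];
}

async function translate(text: string, targetLang = "JA"): Promise<string> {
  const res = await fetch("https://api-free.deepl.com/v2/translate", {
    method: "POST",
    headers: {
      Authorization: `DeepL-Auth-Key ${process.env.DEEPL_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text: [text], target_lang: targetLang }),
  });
  if (!res.ok) throw new Error(`DeepL returned ${res.status}`);
  const data = (await res.json()) as DeepLResponse;
  return data.translations[0].text;
}

translate("New upload: How to build automations with n8n").then(console.log);
```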
by phil
This workflow is your all-in-one AI Content Strategist, designed to generate comprehensive, data-driven content briefs by analyzing top-ranking competitors. It operates through a simple chat interface: you provide a target keyword, and the workflow automates the entire research process.

First, it scrapes the top 10 Google search results using the Bright Data SERP API. Then, for each of those results, it performs a deep dive, using the Bright Data Web Unblocker to reliably extract the full content from each page, bypassing anti-bot measures. Finally, all the gathered data—titles, headings, word counts, and page summaries—is synthesized by a Large Language Model (LLM) to produce a strategic content plan. This plan identifies search intent, core topics, and crucial content gaps, giving you a clear roadmap to outrank the competition.

This template is indispensable for SEO specialists, content marketers, and digital agencies looking to scale their content production with strategies that are proven to work.

**Why Use This AI Content Strategist Workflow?**
- **Data-Driven Insights:** Base your content strategy on what is actually ranking on Google, not guesswork.
- **Automated Competitive Analysis:** Instantly understand the structure, length, and key themes of the top-performing articles for any keyword.
- **Strategic Gap Detection:** The AI analysis highlights poorly covered topics and missed opportunities, allowing you to create content that provides unique value.
- **Massive Time Savings:** Condenses hours of manual research into a fully automated process that runs in minutes.

**How It Works**
1. **Chat Interaction Begins:** The workflow is initiated via a chat UI. The user enters a target keyword to start the analysis.
2. **Google SERP Scraping (Bright Data):** The "Google SERP" node uses Bright Data's SERP API to fetch the top 10 organic results, providing the URLs for the next stage.
3. **Individual Page Scraping (Bright Data):** The workflow loops through each URL. The "Access and extract data" node uses the Bright Data Web Unblocker to ensure successful and complete HTML scraping of every competitor's page.
4. **Content Extraction & Aggregation:** A series of Code nodes clean the raw HTML and extract structured data (title, meta description, headings, word count); a sketch of this step follows below. The Aggregate node then compiles the data from all 10 pages into a single dataset.
5. **AI Synthesis (OpenRouter):** The "Analysis" node sends the entire compiled dataset to an LLM via OpenRouter. The AI performs a holistic analysis to identify search intent, must-cover topics, and differentiation opportunities.
6. **Strategic Brief Generation:** The "Format Output" node takes the AI's structured JSON analysis and transforms it into a clean, human-readable Markdown report, which is then delivered back to the user in the chat interface.

**🔑 Prerequisites**
To use this workflow, you will need active accounts with both Bright Data (for web scraping) and OpenRouter (for AI model access).

**Setting Up Your Credentials**
- **Bright Data Account:** Sign up for a free trial account on their website. Inside your Bright Data dashboard, activate both the SERP API and the Web Unblocker products to create the necessary Zones. In n8n, navigate to the Credentials section, add a new "Brightdata API" credential, and enter your API key. In the workflow, select your newly created credential in both the "Google SERP" node and the "Access and extract data from a specific URL" node.
- **OpenRouter Account:** Sign up for an account at OpenRouter.ai.
Navigate to your account settings to find your API Key. In n8n, go to Credentials, add a new "OpenRouter API" credential, and paste your key. In the workflow, select this credential in all three "OpenRouter Chat Model" nodes.

Phil | Inforeole 🇫🇷 Contact us to automate your processes.
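The Code-node extraction in step 4 could look roughly like the sketch below — a simplified, regex-based version that pulls the title, meta description, headings, and a word count from raw HTML. The actual Code nodes in the template may differ, and a production version would normally use a proper HTML parser.

```typescript
// Simplified sketch of the "Content Extraction" step: pull structured data
// out of a scraped HTML page. Regex-based for brevity; a production version
// would use an HTML parser.
interface PageData {
  title: string;
  metaDescription: string;
  headings: string[];
  wordCount: number;
}

function extractPageData(html: string): PageData {
  const strip = (s: string) => s.replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();

  const title = strip(html.match(/<title[^>]*>([\s\S]*?)<\/title>/i)?.[1] ?? "");
  const metaDescription =
    html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i)?.[1] ?? "";
  const headings = Array.from(html.matchAll(/<h[1-3][^>]*>([\s\S]*?)<\/h[1-3]>/gi)).map((m) => strip(m[1]));

  // Rough word count over the visible text of the page.
  const bodyText = strip(html.replace(/<script[\s\S]*?<\/script>|<style[\s\S]*?<\/style>/gi, " "));
  const wordCount = bodyText ? bodyText.split(" ").length : 0;

  return { title, metaDescription, headings, wordCount };
}

const sample = "<html><head><title>SEO Guide</title></head><body><h1>Intro</h1><p>Hello world</p></body></html>";
console.log(extractPageData(sample));
```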
by David Olusola
📧 Auto-Send AI Follow-Up Emails to Zoom Attendees

This workflow automatically emails personalized follow-ups to every Zoom meeting participant once the meeting ends.

**⚙️ How It Works**
1. Zoom Webhook → Captures the meeting.ended event plus the participant list.
2. Normalize Data → Extracts names, emails, and the transcript (if available); see the sketch below.
3. AI (GPT-4) → Drafts short, professional follow-up emails.
4. Gmail → Sends a thank-you + recap email to each participant.

**🛠️ Setup Steps**
1. **Zoom App:** Enable the meeting.ended event, include participant email/name in the webhook payload, and paste the workflow webhook URL.
2. **Gmail:** Connect Gmail OAuth in n8n. Emails are sent automatically per participant.
3. **OpenAI:** Add your OpenAI API key. The workflow uses GPT-4 for personalized drafting.

**📊 Example Output**

Email Subject: Follow-Up: Marketing Strategy Session

Email Body:

Hi Sarah,

Thank you for joining our Marketing Strategy Session today.

Key points we discussed:
- Campaign launch next Monday
- Budget allocation approved
- Need design assets by Thursday

Next steps: I'll follow up with the creative team and share the updated timeline.

Best,
David

⚡ With this workflow, every attendee feels valued and aligned after each meeting.
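A rough sketch of the Normalize Data step is shown below. The payload shape is an assumption — it presumes the Zoom app is configured (as described above) to include participant names and emails in the meeting.ended webhook body — so field names like `payload.object.participants` are illustrative rather than a guaranteed Zoom schema.

```typescript
// Illustrative normalization of a Zoom meeting.ended webhook payload.
// The exact payload shape depends on your Zoom app configuration; the
// field names used here (payload.object.participants, etc.) are assumptions.
interface ZoomWebhookBody {
  event: string;
  payload: {
    object: {
      topic: string;
      participants?: { user_name: string; email: string }[];
    };
  };
}

interface Attendee {
  name: string;
  email: string;
  meetingTopic: string;
}

function normalize(body: ZoomWebhookBody): Attendee[] {
  if (body.event !== "meeting.ended") return [];
  const { topic, participants = [] } = body.payload.object;
  return participants
    .filter((p) => p.email) // skip entries without an email address
    .map((p) => ({ name: p.user_name, email: p.email, meetingTopic: topic }));
}

console.log(
  normalize({
    event: "meeting.ended",
    payload: {
      object: {
        topic: "Marketing Strategy Session",
        participants: [{ user_name: "Sarah", email: "sarah@example.com" }],
      },
    },
  })
);
```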
by Gabriel Santos
This workflow helps HR teams run smoother monthly Q&A sessions with employees.

**Who’s it for**
HR teams and managers who want to centralize employee questions, avoid duplicates, and keep meetings focused.

**How it works**
- Employees submit questions through a styled form.
- Questions are stored in a database.
- HR selects a date range to review collected questions.
- An AI Agent deduplicates and clusters similar questions, then generates a meeting script in Markdown format.
- The Agent automatically creates a Google Calendar event (with a Google Meet link) on the last Friday of the current month at 16:00–17:00 (see the sketch below for one way to compute that date).
- The script is returned as a downloadable .txt file for HR to guide the session.

**Requirements**
- MySQL (or a compatible DB) for storing questions
- Google Calendar credentials
- OpenAI (or another supported LLM provider)

**How to customize**
- Adjust the meeting day/time in the Set node expressions
- Change the database/table name in the MySQL nodes
- Modify the clustering logic in the AI Agent prompt
- Replace the form styling with your company’s branding

This template ensures no repeated questions, keeps HR better prepared with a structured script, and automates meeting scheduling in just one click.
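For reference, the sketch below shows one way to compute "the last Friday of the current month at 16:00–17:00", the logic the Set node expressions encode. It is a standalone illustration, not the workflow's exact expression.

```typescript
// Compute the last Friday of the current month, 16:00–17:00 local time.
// Standalone illustration of the scheduling logic used by the Set node.
function lastFridayOfMonth(reference: Date = new Date()): { start: Date; end: Date } {
  // Day 0 of next month is the last day of the current month.
  const lastDay = new Date(reference.getFullYear(), reference.getMonth() + 1, 0);
  // getDay(): 0 = Sunday ... 5 = Friday; step back to the most recent Friday.
  const offset = (lastDay.getDay() - 5 + 7) % 7;
  const friday = new Date(lastDay);
  friday.setDate(lastDay.getDate() - offset);

  const start = new Date(friday);
  start.setHours(16, 0, 0, 0);
  const end = new Date(friday);
  end.setHours(17, 0, 0, 0);
  return { start, end };
}

const { start, end } = lastFridayOfMonth();
console.log(start.toISOString(), "→", end.toISOString());
```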
by Razvan Bara
**How it works**
This trip weather forecasting workflow is event-driven: it starts when a calendar event is created or updated and provides timely weather alerts and forecasts tailored to your travel dates and locations. Overall, it efficiently integrates calendar travel plans with real-time and updated weather intelligence for travel preparedness and peace of mind.

**From the creator**
If you’re jetting off frequently, bouncing between time zones, juggling meetings, and squeezing every drop of life out of travel, you need this flow. This ain’t your grandma’s weather app. It’s a bulletproof system that scans your calendar, mines your trips, and delivers laser-targeted weather intel and urgent alerts, right when you need it. **No more surprises.** **No more scrambling.** Just real-time weather mastery that saves your schedule. You’re not just traveling: you’re dominating. This flow makes sure the only thing you worry about is your next move, not whether the weather’s gonna ruin it. Time to upgrade from a tourist to a boss.

**Step-by-step**
- 📅 **Google Calendar Triggers (Event Created/Updated):** The workflow starts immediately upon creation or update of any calendar event, enabling real-time detection of new or changed travel plans.
- ✈ **Identify Trips:** Filters these calendar events to detect travel-related trips by matching keywords such as "trip," "flight," or "vacation" in titles or descriptions.
- 📍 **Extract Locations:** Parses each trip event’s details to extract the start and end dates and the trip destination from the summary/description/location fields.
- 🌐 **Build interrogation URL:** Constructs a Visual Crossing API request URL dynamically based on the extracted trip location and dates, including daily forecasts and alerts (see the sketch below). It fetches the detailed weather forecast and alert data for the trip location and duration right after detecting the event, then formats the raw weather data into a readable summary 🌤️🌪🌀 including temperatures, precipitation probabilities, conditions, and any severe weather alerts.
- 📲📧 **Send Forecast:** Sends the forecast summary with alerts via Telegram to keep the user informed instantly.
- ⌛ **One day before the trip:** Pauses the workflow until exactly one day before the trip start date, so a second fetch happens when more accurate or updated weather data is available, and the updated forecast is sent.

**Optional**
You can replace the Telegram node with email, WhatsApp, Slack, or SMS notifications, or add multiple notification nodes to receive them across all desired channels.
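The sketch below shows roughly how the "Build interrogation URL" step can assemble a Visual Crossing Timeline API request from the extracted location and dates. The endpoint path follows Visual Crossing's Timeline API pattern, but treat the exact query parameters (include, unitGroup) as assumptions to verify against the API docs and the workflow's HTTP node.

```typescript
// Assemble a Visual Crossing Timeline API URL for a trip's location and dates.
// Query parameters are illustrative; check them against the Visual Crossing docs.
function buildForecastUrl(
  location: string,
  startDate: string, // "YYYY-MM-DD"
  endDate: string,   // "YYYY-MM-DD"
  apiKey: string
): string {
  const base = "https://weather.visualcrossing.com/VisualCrossingWebServices/rest/services/timeline";
  const params = new URLSearchParams({
    key: apiKey,
    unitGroup: "metric",
    include: "days,alerts", // daily forecasts plus severe weather alerts
    contentType: "json",
  });
  return `${base}/${encodeURIComponent(location)}/${startDate}/${endDate}?${params}`;
}

console.log(buildForecastUrl("Lisbon,Portugal", "2025-07-10", "2025-07-14", "YOUR_API_KEY"));
```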
by András Farkas
E.ON W1000 → n8n → Home Assistant (Spook) “Integration”

This workflow processes emails from the E.ON portal containing 15-minute +A / -A (import/export) data and daily 1.8.0 / 2.8.0 meter readings. It extracts the required columns from the attached XLSX file, groups the 15-minute values by hour (a small grouping sketch appears at the end of this section), then:
- updates the Spook/Recorder statistics under the IDs sensor.grid_energy_import and sensor.grid_energy_export, and
- sets the current meter readings for the entities input_number.grid_import_meter and input_number.grid_export_meter.

> You may need to modify the workflow if there are changes in how E.ON sends scheduled exports. If the exported data format changes, please report it on GitHub!

**Requirements**
- n8n (cloud or self-hosted)
  - HACS addon available here: Rbillon59/home-assistant-addons
  - Official n8n Docker Compose template
  - Simplified n8n Docker Compose template available on GitHub
- (For Gmail) Gmail API authentication (OAuth2) with read-only email access to the account receiving the messages. Setup guide available here.
- (For IMAP) IMAP provider credentials
- Home Assistant access via a Long-Lived Access Token or API key. Setup guide available here.
- Spook integration. Documentation and installation guide available here.

**E.ON Portal Setup**
Create a scheduled export on the E.ON portal with the following parameters:
1. Under the Remote Meter Reading menu, click on the "+ new scheduled export setting" button.
2. Specify POD identifier(s): choose one of the PODs you want to query.
3. Specify meter variables: select the following:
   - +A Active Power Consumption
   - -A Active Power Feed-In
   - DP_1-1:1.8.0*0 Active Power Consumption Daily Reading
   - DP_1-1:2.8.0*0 Active Power Feed-In Daily Reading
4. Export sending frequency: daily
5. Days of historical data to include: 7 days recommended, to backfill missed data.
6. Email subject: by default, use [EON-W1000]. If processing multiple PODs with the workflow, give each a unique identifier.

**Home Assistant Preparation**
Create the following input_number entities in configuration.yaml or via Helpers:

```yaml
input_number:
  grid_import_meter:
    name: grid_import_meter
    mode: box
    initial: 0
    min: 0
    max: 9999999999
    step: 0.001
    unit_of_measurement: kWh
  grid_export_meter:
    name: grid_export_meter
    mode: box
    initial: 0
    min: 0
    max: 9999999999
    step: 0.001
    unit_of_measurement: kWh
```

> If you name the entities differently, make sure to reflect these changes in the workflow.

Create the following template sensor entities in configuration.yaml or via Helpers:

```yaml
template:
  - sensor:
      - name: "grid_energy_import"
        state: "{{ states('input_number.grid_import_meter') | float(0) }}"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
      - name: "grid_energy_export"
        state: "{{ states('input_number.grid_export_meter') | float(0) }}"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
```

> If you name the entities differently, make sure to reflect these changes in the workflow.

**n8n import and authentication**
Importing the workflow: In n8n → Workflows → Import from File/Clipboard → paste the JSON. Select the downloaded JSON and paste it into a new workflow using Ctrl+V.
**Set up n8n Credentials**
Credentials must be configured in the Home Assistant and Gmail nodes. The setup process is described in the Requirements section.
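To illustrate the "groups the 15-minute values by hour" step mentioned above, here is a standalone sketch that sums quarter-hour kWh readings into hourly totals. The row shape ({ timestamp, kwh }) is an assumption about what the XLSX extraction yields, not the workflow's exact data structure.

```typescript
// Sum 15-minute kWh readings into hourly totals.
// The input row shape is assumed; adapt field names to the XLSX extraction.
interface QuarterHourReading {
  timestamp: string; // e.g. "2025-01-15T10:15:00"
  kwh: number;
}

function groupByHour(readings: QuarterHourReading[]): Map<string, number> {
  const hourly = new Map<string, number>();
  for (const { timestamp, kwh } of readings) {
    const hourKey = timestamp.slice(0, 13) + ":00"; // "YYYY-MM-DDTHH:00"
    hourly.set(hourKey, (hourly.get(hourKey) ?? 0) + kwh);
  }
  return hourly;
}

const sample: QuarterHourReading[] = [
  { timestamp: "2025-01-15T10:00:00", kwh: 0.12 },
  { timestamp: "2025-01-15T10:15:00", kwh: 0.08 },
  { timestamp: "2025-01-15T10:30:00", kwh: 0.10 },
  { timestamp: "2025-01-15T10:45:00", kwh: 0.09 },
];
console.log(groupByHour(sample)); // Map { "2025-01-15T10:00" => ≈0.39 }
```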
by Robert Schröder
Portrait Photo Upscaler Workflow

**Overview**
Automated workflow that retrieves portrait photos from Airtable, upscales them using AI, and stores the enhanced images in Google Drive with an organized folder structure.

**Features**
- **Automated Folder Creation:** Creates timestamped folders in Google Drive
- **AI-Powered Upscaling:** Uses Replicate's Real-ESRGAN for 2x image enhancement
- **Batch Processing:** Handles multiple images efficiently with loop processing
- **Error Handling:** Continues processing even if individual images fail
- **Airtable Integration:** Retrieves images from specified database records

**Prerequisites**

Required Credentials
- **Google Drive OAuth2 API:** For folder creation and file uploads
- **Airtable Token API:** For accessing image records
- **Replicate HTTP Header Auth:** For the AI upscaling service

Airtable Setup
- Column name: PortraitFotoAuswahl
- Column type: Attachment field containing image files
- Required: Valid Airtable Base ID and Table ID

**Workflow Steps**
1. Manual Trigger: Initiates the workflow execution
2. Create Folder: Generates a new Google Drive folder with a custom name
3. Get Airtable Record: Retrieves the specified record containing portrait images
4. Extract URLs: Processes attachment URLs from the Airtable field
5. Loop Processing: Iterates through each image for individual processing
6. AI Upscaling: Enhances images using Replicate's Real-ESRGAN (2x scale); see the request sketch below
7. Download Results: Retrieves processed images from Replicate
8. Upload to Drive: Stores enhanced images in the created folder

**Configuration**

Required Inputs
- **Folder Name:** Custom name for the Google Drive folder
- **Record ID:** Airtable record identifier containing images
- **Base ID:** (configurable)
- **Table ID:** (configurable)

Upscaling Settings
- **Scale Factor:** 2x (configurable)
- **Face Enhancement:** Disabled (configurable)
- **Model:** Real-ESRGAN v1.3

**Technical Details**

Node Configuration
- **Error Handling:** Continue on individual failures
- **Response Format:** File binary for image processing
- **Naming Convention:** LoRa{timestamp}.png
- **Batch Processing:** Automatic item splitting

API Endpoints
- **Replicate:** https://api.replicate.com/v1/predictions
- **Model Version:** nightmareai/real-esrgan:f121d640bd286e1fdc67f9799164c1d5be36ff74576ee11c803ae5b665dd46aa

**Use Cases**
- Portrait photography enhancement
- Batch image processing for portfolios
- Automated content preparation workflows
- Quality improvement for archived images

**Output**
- Enhanced images stored in Google Drive
- Organized folder structure with timestamps
- Preserved original filenames with processed variants
- Failed processes continue without stopping the workflow

**Template Benefits**
- **Scalable:** Processes unlimited images in a single execution
- **Reliable:** Built-in error handling and continuation logic
- **Organized:** Automatic folder management and file naming
- **Professional:** High-quality AI upscaling for commercial use
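For reference, here is a sketch of the Replicate prediction request the AI Upscaling step issues, using the endpoint and model version listed above. The input parameter names (image, scale, face_enhance) and the authorization scheme should be verified against Replicate's current API documentation; the image URL and token are placeholders.

```typescript
// Sketch of the Replicate prediction request for Real-ESRGAN 2x upscaling.
// Endpoint and model version come from the template; input field names and
// the auth header format should be checked against Replicate's API docs.
async function createUpscalePrediction(imageUrl: string, apiToken: string) {
  const res = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      version: "f121d640bd286e1fdc67f9799164c1d5be36ff74576ee11c803ae5b665dd46aa",
      input: { image: imageUrl, scale: 2, face_enhance: false },
    }),
  });
  if (!res.ok) throw new Error(`Replicate returned ${res.status}`);
  return res.json(); // contains an id and status to poll until "succeeded"
}

createUpscalePrediction("https://example.com/portrait.jpg", "YOUR_REPLICATE_TOKEN")
  .then((prediction) => console.log(prediction))
  .catch(console.error);
```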
by Miha
This n8n template turns a small, targeted HubSpot list into tailored outreach. It scans each contact’s recent Gmail conversations, builds a lightweight persona with AI (tone, goals, pain points, decision style), then drafts a concise sales email aligned to your offer—saved to Gmail as a reviewable draft. Perfect for SDRs and founders who want personalization at scale without writing from scratch.

This template was originally created by Jim Le.

**How it works**
- **Manual trigger** starts a controlled run.
- **HubSpot search** pulls a focused list of contacts (e.g., hs_buying_role = DECISION_MAKER).
- **Batch loop** processes contacts one by one.
- **Gmail fetch** grabs up to 20 recent threads from each contact.
- **AI persona extraction** (Information Extractor + Gemini) analyzes messages to capture: decision-making style, communication preferences, goals/motivations, pain points, work style, personality traits, buying behavior, values, and market awareness (one possible output shape is sketched below).
- **Variables node** sets core fields (first name, last name, email) and the **offer** you want to pitch.
- **AI email generation** (Gemini) mirrors the contact’s tone and priorities; outputs a **subject** and an **HTML body**.
- **Gmail draft** is created for the contact so a rep can skim, tweak, and send.

**How to use**
1. Connect HubSpot on the “Get Contacts” node and refine the filter to your segment.
2. Connect Gmail on both the read and draft nodes (same account recommended).
3. Add a Gemini key to both Gemini nodes.
4. In Variables, update product_to_sell with your offer and confirm the contact field mappings.
5. (Optional) Tweak the persona attributes or the email prompt for tone/length/CTA.
6. Click Test workflow.
7. Review drafts in Gmail, edit if needed, then send.

**Requirements**
- **HubSpot** (OAuth2) for contact targeting
- **Gmail** (read + draft)
- **Google Gemini** (API key) for persona + copy generation

**Notes & customization**
- **Tighter targeting:** Change the HubSpot filter (e.g., industry, territory, lifecycle stage) to keep the list small and measurable.
- **Richer inputs:** Increase the Gmail limit or include received/sent filters to capture more context (mind rate limits).
- **Brand voice:** Add a short style guide to the email generator’s system prompt (e.g., sentence length, jargon rules, sign-off).
- **Offer variants:** Replace product_to_sell per segment, or branch by industry to load different value props.
- **Compliance & privacy:** Limit stored outputs to essentials; avoid copying sensitive content from threads verbatim.
- **Auto-send option:** Swap the draft step for “send email” if you want hands-off delivery for known segments.
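The Information Extractor step produces a structured persona; one possible shape for that output is sketched below as a TypeScript interface with an example record. The field names are assumptions derived from the attribute list above, not the template's exact schema.

```typescript
// One possible shape for the extracted persona. Field names are assumptions
// based on the attributes listed above, not the template's exact schema.
interface ContactPersona {
  decisionMakingStyle: string;
  communicationPreferences: string;
  goalsMotivations: string[];
  painPoints: string[];
  workStyle: string;
  personalityTraits: string[];
  buyingBehavior: string;
  values: string[];
  marketAwareness: string;
}

const example: ContactPersona = {
  decisionMakingStyle: "Data-driven, prefers short proofs of ROI",
  communicationPreferences: "Concise emails, bullet points, no jargon",
  goalsMotivations: ["Reduce onboarding time", "Hit Q3 pipeline target"],
  painPoints: ["Manual reporting", "Tool sprawl"],
  workStyle: "Asynchronous, delegates execution details",
  personalityTraits: ["Pragmatic", "Skeptical of hype"],
  buyingBehavior: "Evaluates 2–3 vendors, negotiates on annual terms",
  values: ["Transparency", "Vendor responsiveness"],
  marketAwareness: "Knows the main competitors, follows industry newsletters",
};

console.log(JSON.stringify(example, null, 2));
```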
by rana tamure
Google Email Ice Breaker Workflow

**Description**
This n8n workflow automates the creation of personalized cold emails for dental clinics to promote an AI chatbot service. It retrieves verified email data from a Google Sheet, generates tailored email subject lines and bodies using OpenAI’s GPT-4o-mini model, processes the output, and updates the Google Sheet with the results. Designed for dental clinics or marketers, it streamlines outreach by crafting engaging, seemingly hand-researched emails that drive appointment bookings through an AI chatbot integration.

**Key Features**
- **Data-Driven Outreach:** Pulls verified emails, company names, descriptions, and websites from a Google Sheet to create targeted emails.
- **AI-Powered Email Generation:** Uses OpenAI’s GPT-4o-mini to craft concise, persuasive, and personalized cold email content.
- **Personalization:** Shortens company names and locations (e.g., "San Fran" instead of "San Francisco") and references specific business details for a tailored feel.
- **Batch Processing:** Handles multiple prospects efficiently using a looping mechanism.
- **Google Sheet Integration:** Updates the sheet with generated email subjects, bodies, and a status marker for tracking.
- **Customizable Prompts:** Easily modify the AI prompt to adapt the tone or service offering for different industries.

**Requirements**
- Credentials: Google Sheets OAuth2 API (for data access) and OpenAI API (for email generation).
- Setup: Configure a Google Sheet with columns for "EMAIL verified", "companyName", "description", "website", "category", "email subject", "body", and "email created". Ensure the sheet is accessible via your Google account.
- Dependencies: No external packages required; uses n8n’s built-in Google Sheets, OpenAI, and Code nodes.

**How It Works**
1. **Trigger & Input:** Starts manually (e.g., via "Test workflow") and retrieves data from a Google Sheet, filtering for rows where "category" is "Good" and "email created" is "no".
2. **Batch Processing:** Loops over the filtered rows to process each prospect individually.
3. **Email Generation:** OpenAI generates a JSON output with a subject and body, personalized using the prospect’s company name, description, and website.
4. **Content Processing:** A Code node cleans and parses the AI output, extracting the subject and body (see the sketch below).
5. **Sheet Update:** Updates the Google Sheet with the generated subject and body, and sets "email created" to "yes".
6. **Looping:** Continues processing until all prospects are handled.

**Benefits**
- **Time Efficiency:** Automates cold email creation, reducing manual effort from hours to minutes.
- **Personalized Outreach:** Crafts emails that feel deeply researched, increasing engagement rates.
- **Scalability:** Processes large lists of prospects in batches, ideal for high-volume campaigns.
- **Tracking:** Updates the Google Sheet to track which emails have been generated.
- **Versatility:** Adaptable to other industries by modifying the AI prompt or Google Sheet structure.

**Potential Customizations**
- **Prompt Adjustments:** Tweak the OpenAI prompt to target different services (e.g., marketing tools, SaaS products) or industries.
- **Filter Modifications:** Change the Google Sheet filters to target specific prospect categories or regions.
- **Output Expansion:** Add nodes to send emails directly or integrate with CRMs like HubSpot.
- **Notifications:** Include email or Slack notifications when the workflow completes.
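Step 4's cleanup-and-parse logic might look like this sketch: strip any Markdown code fences around the model's JSON, parse it, and fall back gracefully if parsing fails. The actual Code node contents in the template may differ.

```typescript
// Sketch of the "Content Processing" step: clean the LLM output and
// extract { subject, body }. The template's actual Code node may differ.
interface GeneratedEmail {
  subject: string;
  body: string;
}

function parseEmailOutput(raw: string): GeneratedEmail {
  // Remove Markdown code fences (```json ... ```) the model sometimes adds.
  const cleaned = raw.replace(/```(?:json)?/gi, "").trim();
  try {
    const parsed = JSON.parse(cleaned) as Partial<GeneratedEmail>;
    return {
      subject: (parsed.subject ?? "").trim(),
      body: (parsed.body ?? "").trim(),
    };
  } catch {
    // Fallback: keep the raw text as the body so the row is still usable.
    return { subject: "", body: cleaned };
  }
}

const sample = '```json\n{"subject": "Quick idea for Bright Smiles Dental", "body": "Hi Dr. Lee, ..."}\n```';
console.log(parseEmailOutput(sample));
```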
by Matt Chong
Automatically Rename Gmail Attachments with AI and Save to Google Drive

**Who is this for?**
This workflow is perfect for anyone who regularly receives important email attachments like reports, invoices, or PDFs and wants them:
- Renamed using clean, AI-generated filenames
- Automatically saved to a specific Google Drive folder
- Neatly organized without manual work

It is ideal for freelancers, business owners, accountants, and productivity enthusiasts.

**What does it solve?**
Manually naming and organizing email attachments takes time and often leads to messy files. This workflow solves that by:
- Automatically downloading unread Gmail attachments
- Using AI to understand the content and generate clean, consistent filenames
- Saving the renamed files to your chosen Google Drive folder
- Marking emails as read after processing

No more confusing filenames like "Attachment 1.pdf".

**How it works**
1. The workflow runs on a scheduled interval (every hour by default).
2. It checks Gmail for any unread emails with attachments.
3. For each email, it:
   - Downloads the attachments
   - Extracts and reads the PDF content
   - Uses AI to generate a new filename in the format YYYYMMDD-keyword-summary.pdf (see the sketch below)
   - Saves the file to Google Drive with the new name
   - Marks the email as read to avoid duplicates

**How to set up?**
1. Connect these accounts in your n8n credentials:
   - Gmail (OAuth2)
   - Google Drive (OAuth2)
   - OpenAI (API key)
2. Update the folder URL in the Google Drive node to your target folder.
3. Optional: adjust the trigger interval if you want it to run more or less often.

**How to customize this workflow to your needs**
- Change the AI prompt to create different naming rules, such as including the sender or topic.
- Dynamically set Drive folders based on the email sender or subject.
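To make the naming convention concrete, the sketch below builds a YYYYMMDD-keyword-summary.pdf filename from a date and an AI-suggested keyword and summary, sanitizing both for safe filenames. The sanitization rules are an assumption; in the template the name is produced by the AI prompt itself.

```typescript
// Build a filename in the YYYYMMDD-keyword-summary.pdf format described above.
// Sanitization rules here are illustrative; the template lets the AI prompt
// produce the name directly.
function buildFilename(date: Date, keyword: string, summary: string): string {
  const yyyymmdd = [
    date.getFullYear(),
    String(date.getMonth() + 1).padStart(2, "0"),
    String(date.getDate()).padStart(2, "0"),
  ].join("");

  const slug = (s: string) =>
    s.toLowerCase().replace(/[^a-z0-9]+/g, "-").replace(/^-+|-+$/g, "");

  return `${yyyymmdd}-${slug(keyword)}-${slug(summary)}.pdf`;
}

console.log(buildFilename(new Date(2025, 2, 14), "Invoice", "Acme Corp March hosting"));
// → 20250314-invoice-acme-corp-march-hosting.pdf
```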