by Robert Breen
Use this template to upload an image, run a first-pass OpenAI Vision analysis, then re-attach the original file (binary/base64) to the next step using a Merge node. The pattern ensures your downstream AI Agent (or any node) can access both the original file (`data`) and the first analysis result (`content`) at the same time.

**✅ What this template does**
- **Collects an image file** via **Form Trigger** (binary field labeled `data`)
- **Analyzes the image** with **OpenAI Vision** (GPT-4o) using **base64** input
- **Merges** the original upload and the analysis result (combine by position) so the next node has **both**
- **Re-analyzes/uses** the image alongside the first analysis in an **AI Agent** step

**🧩 How it works (node by node)**
1. **Form Trigger** – Presents a simple upload form and emits a binary/base64 field named `data`.
2. **Analyze image (OpenAI Vision)** – Reads the same `data` field as base64 and runs image analysis with GPT-4o. The node outputs a text `content` field (first-pass analysis).
3. **Merge (combine by position)** – Combines the two branches so the next node receives both the original upload (`data`) and the analysis (`content`) on the same item.
4. **AI Agent** – Receives `data` + `content` together. The prompt includes the original image (`=data`) and the first analysis (`{{$json.content}}`) to compare or refine results.
5. **OpenAI Chat Model** – Provides the language model for the Agent (wired as `ai_languageModel`).

**🛠️ Setup instructions (from the JSON)**
> Keep it simple: mirror these settings and you're good to go.

1) **Form Trigger** (`n8n-nodes-base.formTrigger`)
- **Path:** `d6f874ec-6cb3-46c7-8507-bd647c2484f0` (you can change this)
- **Form Title:** Image Document Upload
- **Form Description:** Upload a image document for AI analysis
- **Form Fields:** Label: `data`, Type: `file`
- **Output:** emits a binary/base64 field named `data`.
2) **Analyze image** (`@n8n/n8n-nodes-langchain.openAi`)
- **Resource:** image
- **Operation:** analyze
- **Model:** gpt-4o
- **Text:** `=data` (use the uploaded file field)
- **Input Type:** base64
- **Credentials:** OpenAI (use your stored **OpenAI API** credential)

3) **Merge** (`n8n-nodes-base.merge`)
- **Mode:** combine
- **Combine By:** combineByPosition
- Connect Form Trigger → Merge (input 2)
- Connect Analyze image → Merge (input 1)
- This ensures the original file (`data`) and the analysis (`content`) line up on the same item.

4) **AI Agent** (`@n8n/n8n-nodes-langchain.agent`)
- **Prompt Type:** define
- **System Message:** analyze the image again and see if you get the same result.
- **Receives:** merged item containing `data` + `content`.

5) **OpenAI Chat Model** (`@n8n/n8n-nodes-langchain.lmChatOpenAi`)
- **Model:** gpt-4.1-mini
- **Wiring:** connect as `ai_languageModel` to the **AI Agent**
- **Credentials:** same OpenAI credential as above

> **Security note:** store API keys in Credentials (do not hardcode keys in nodes).

**🧠 Why "combine by position" fixes the binary issue**
Some downstream nodes lose access to the original binary once a branch processes it. By merging the original branch (with `data`) and the analysis branch (with `content`) by position, you restore a single item with both fields, so the next step can use the image again while referencing the earlier analysis.

**🧪 Test tips**
- Upload a JPG/PNG and execute the workflow from the Form Trigger preview.
- Confirm the Merge output contains both `data` (binary/base64) and `content` (text).
- In the AI Agent, log or return both fields to verify availability.

**🔧 Customize**
- Swap GPT-4o for another Vision-capable model if needed.
- Extend the AI Agent to extract structured fields (e.g., objects detected, text, brand cues).
- Add a Router after Merge to branch into storage (S3, GDrive) or notifications (Slack, Email).
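Conceptually, combine-by-position zips the two branches index by index. The sketch below is illustrative (not n8n's source code) and shows why the merged item carries both the binary `data` and the JSON `content`:

```javascript
// Illustrative sketch of "combine by position": item i of the
// original-upload branch is merged with item i of the analysis branch,
// so one item ends up holding both the binary and the analysis text.
function combineByPosition(branchA, branchB) {
  const length = Math.min(branchA.length, branchB.length);
  const merged = [];
  for (let i = 0; i < length; i++) {
    merged.push({
      json: { ...branchA[i].json, ...branchB[i].json },       // analysis text (content)
      binary: { ...branchA[i].binary, ...branchB[i].binary }, // original upload (data)
    });
  }
  return merged;
}

// Example: one uploaded image merged with one analysis result.
const uploads = [{ json: {}, binary: { data: { mimeType: "image/png" } } }];
const analyses = [{ json: { content: "A red bicycle." }, binary: {} }];
console.log(combineByPosition(uploads, analyses));
```

If either branch splits one upload into several items, the positional pairing breaks, which is why the workflow keeps both branches one-item-per-upload.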
**📝 Requirements**
- n8n (cloud or self-hosted) with web UI access
- **OpenAI** credential configured (Vision support)

**🩹 Troubleshooting**
- **Binary missing downstream?** Ensure **Merge** receives **both** branches and is set to `combineByPosition`.
- **Wrong field name?** The **Form Trigger** upload field must be labeled `data` to match node expressions.
- **Model errors?** Verify your **OpenAI** credential and that the chosen model supports **image analysis**.

**💬 Sticky Note (included in the workflow)**
> "Use Binary Field after next step" – This workflow demonstrates how to preserve and reuse an uploaded file (binary/base64) after a downstream step by using a Merge node (combineByPosition). A user uploads an image via Form Trigger → the image is analyzed with OpenAI Vision → results are merged back with the original upload so the next AI Agent step can access both the original file (data) and the first analysis (content) at the same time.

**📬 Contact**
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
by Wessel Bulte
**Generate Weekly n8n Execution Report and Email Summary**

**How it works**
- Automatically runs every 7 days to pull all n8n workflow executions from the past week
- Merges execution data with workflow information to provide context
- Generates a professional HTML report with execution statistics (errors, successes, waiting status)
- Sends the formatted report with Outlook or Gmail

**Set up steps**
1. **Configure the n8n API credential**
   - Go to your n8n instance → Settings → API
   - Create a new API token with read access to workflows and executions
   - In this workflow, add a new "n8n" credential and paste your API token
   - This credential is used by two nodes: "Get all Workflows" and "Get all previous executions"
2. **Connect email services**
   - Configure your Outlook credential in the "Send a message outlook" node
   - Configure your Gmail credential in the "Send a message gmail" node
   - Set your preferred email recipients in both nodes
3. **Adjust the schedule (optional)**
   - By default, the workflow runs every 7 days
   - Edit the "Schedule Trigger" node to change the interval if needed

**Key features**
- Tracks workflow execution status and runtime metrics
- Calculates average and total runtime for each status type
- Provides a visual HTML report with color-coded status indicators
- Dual email delivery (Outlook + Gmail options)
- Requires only n8n API credentials (no external API keys needed)

**Need help?**
🔗 LinkedIn – Wessel Bulte
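The runtime-metrics step described above can be sketched as a small Code-node function. The field names (`status`, `startedAt`, `stoppedAt`) mirror the n8n executions API, but treat this as an assumption-laden sketch rather than the workflow's exact code:

```javascript
// Group executions by status and compute total and average runtime.
function runtimeByStatus(executions) {
  const stats = {};
  for (const ex of executions) {
    const runtimeMs = new Date(ex.stoppedAt) - new Date(ex.startedAt);
    const s = (stats[ex.status] ??= { count: 0, totalMs: 0 });
    s.count += 1;
    s.totalMs += runtimeMs;
  }
  for (const s of Object.values(stats)) s.avgMs = s.totalMs / s.count;
  return stats;
}

// Sample data with two successes (2s, 4s) and one error (1s).
const sample = [
  { status: "success", startedAt: "2024-01-01T00:00:00Z", stoppedAt: "2024-01-01T00:00:02Z" },
  { status: "success", startedAt: "2024-01-01T01:00:00Z", stoppedAt: "2024-01-01T01:00:04Z" },
  { status: "error",   startedAt: "2024-01-01T02:00:00Z", stoppedAt: "2024-01-01T02:00:01Z" },
];
console.log(runtimeByStatus(sample));
```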
by David Olusola
**🧠 Workflow Summary**
Automates lead management by:
- 🔗 **Webhook Trigger:** Captures form data from your website.
- 🧼 **Code Node:** Standardizes the data format.
- 📄 **Google Sheets:** Appends a new row with lead info.
- 🔔 **Slack Notification:** Alerts your team instantly.

**⚙️ Quick Setup**
1. **Import Workflow** – In n8n, go to Workflows → + New → Import from JSON.
2. **Add Credentials**
   - **Google Sheets:** Use OAuth2 to connect your account.
   - **Slack:** Create a Slack App → add bot scopes → connect via OAuth2.
3. **Google Sheet Prep** – Create a sheet with these columns in row 1:
   - Full Name
   - Email Address
   - Business Name
   - Project Intent/Needs
   - Project Timeline
   - Budget Range
   - Received At
4. **Configure Nodes**
   - **Webhook:** Use the generated URL in your form settings.
   - **Code:** Cleans and maps form fields.
   - **Google Sheets:** Set to Append; map fields using expressions like `={{ $json.email }}`.
   - **Slack:** Choose your channel and send a templated lead alert message with form data.
5. **Activate & Test** – Click Activate, send a test POST to the Webhook, and confirm: new row in Sheet ✅, Slack alert sent ✅.

🎯 Use this automation to capture leads, log data, and notify your team, all hands-free.
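A minimal sketch of what the standardizing Code node might do, assuming hypothetical incoming field names (`full_name`, `email`, etc. are not taken from the actual form): trim strings, normalize the email, and stamp "Received At" so the object maps cleanly onto the sheet columns listed above.

```javascript
// Hypothetical Code-node logic: clean raw webhook fields into the
// exact column names used by the Google Sheet.
function standardizeLead(body) {
  return {
    "Full Name": (body.full_name ?? "").trim(),
    "Email Address": (body.email ?? "").trim().toLowerCase(),
    "Business Name": (body.business ?? "").trim(),
    "Project Intent/Needs": (body.needs ?? "").trim(),
    "Project Timeline": (body.timeline ?? "").trim(),
    "Budget Range": (body.budget ?? "").trim(),
    "Received At": new Date().toISOString(),
  };
}

// In n8n the payload would arrive via $json.body; here we call the
// function directly with sample data.
console.log(standardizeLead({ full_name: "  Ada Lovelace ", email: "ADA@Example.com" }));
```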
by Gabriel Santos
This workflow streamlines how employees request equipment/items and how those requests reach the Procurement team. It validates the employee by enrollment number, detects whether a manager exists, and then either requests approval (if the requester has a manager) or routes the request directly to Procurement (if the requester is a manager). All messages are written in a professional tone using an LLM, and emails are sent via Gmail with a built-in approve/deny step for managers.

**Who's it for**
HR/IT/Operations teams that handle equipment requests, need a lightweight approval flow, and want clean, professional emails without manual drafting.

**How it works**
1. An employee submits their enrollment number.
2. The workflow fetches the employee (and manager, if any).
3. The employee describes the requested item(s).
4. If a manager exists → an approval email (double opt-in) is sent to the manager.
   - Approved → notify the employee and forward a polished request to Procurement.
   - Denied → notify the employee.
5. If the requester is a manager → skip approval and send directly to Procurement.
6. End pages confirm the outcome.

**Requirements**
- MySQL (or compatible DB) with an `employees` table (`id`, `name`, `email`, `enrollment_number`, `manager`).
- Gmail credentials (OAuth) in n8n.
- LLM provider (OpenAI or compatible) for message polishing.

**How to customize**
- Replace the Procurement NoOp nodes with your email, helpdesk, or ERP integration.
- Adjust email copy and tone in the LLM prompt nodes.
- Add tracking IDs, SLA text, or CCs for auditing.
- Style the forms with your brand (CSS blocks provided).
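The branching decision above reduces to one check on the `manager` column of the `employees` table. A tiny sketch (the exact IF-node expression in the workflow may differ):

```javascript
// An employee row with a manager goes through approval; a row with an
// empty manager field is treated as a manager and routed straight to
// Procurement.
function routeRequest(employee) {
  return employee.manager ? "approval" : "procurement";
}

console.log(routeRequest({ name: "Alice", manager: "Bob" })); // approval
console.log(routeRequest({ name: "Bob", manager: null }));    // procurement
```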
by Rakin Jakaria
**Use cases**
Use cases are many: automate resume screening, candidate scoring, and interview communication in one seamless pipeline. Perfect for HR teams hiring at scale, startups that need quick filtering of applicants, or enterprises like Samsung running multiple roles at once.

**Good to know**
- At time of writing, each Gemini request is billed per token. See Gemini Pricing for the latest info.
- The workflow automatically sends acceptance or rejection emails to candidates, so be sure to configure your Gmail account and email templates carefully.

**How it works**
1. **Form Submission:** Applicants fill out a custom form with their name, email, job role (Executive Assistant, IT Specialist, or Manager), and resume (PDF).
2. **Resume Processing:** The PDF resume is extracted into text using the Extract from File node.
3. **AI Evaluation:** Gemini-powered AI reviews the resume against the job role and generates:
   - A score (0–10)
   - A status (Accepted/Rejected)
   - A personalized email (acceptance or rejection)
4. **Information Extraction:** The AI output is structured into fields: Score, Status, Mail Subject, and Mail Body.
5. **Email Sending:** The candidate automatically receives their personalized result via Gmail.
6. **Record Keeping:** All candidate details (Name, Job, Score, Status, Email, Email Status) are stored in Google Sheets for tracking.

**How to use**
Share the generated form link with applicants. When they submit, the system handles scoring and sends an email instantly. HR teams can review all results directly in Google Sheets.

**Requirements**
- Google Gemini API key (for resume evaluation)
- Gmail account with OAuth2 (for sending acceptance/rejection emails)
- Google Sheets (for candidate tracking)
- n8n form node (for application collection)

**Customising this workflow**
- Add more job positions to the form dropdown.
- Adjust the acceptance threshold (e.g., accept at 8/10 instead of 7/10).
- Modify email templates for a more formal or branded tone.
- Extend the pipeline to trigger a Calendly invite for accepted candidates.
- Integrate with Slack or Teams to notify HR when a candidate is accepted.
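The acceptance-threshold customization can be sketched as follows. The default threshold of 7 and the field names are assumptions based on the customization note, not values read from the workflow JSON:

```javascript
// Map a 0-10 resume score to an Accepted/Rejected status, with the
// threshold exposed so it can be tuned (e.g. raised from 7 to 8).
function evaluate(score, threshold = 7) {
  const status = score >= threshold ? "Accepted" : "Rejected";
  return { Score: score, Status: status };
}

console.log(evaluate(8));     // Accepted at the assumed default threshold of 7
console.log(evaluate(8, 9));  // Rejected after raising the threshold to 9
```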
by WhySoSerious
**📬 Plex Recently Added → Responsive Email Newsletter (Tautulli Alternative)**

**What it is**
This workflow automatically creates a weekly Plex newsletter that highlights recently added Movies & TV Shows. It's designed to be mobile-friendly and Gmail/iOS Mail compatible, making it easy to share Plex updates with friends, family, or your community.

**How it works**
- ⏰ Runs on a weekly schedule (customizable).
- 🎬 Fetches recently added Movies & TV Shows from the Tautulli API.
- 📰 Builds a responsive HTML newsletter that works in Gmail, iOS Mail, and most clients.
- 📧 Sends one personalized email per recipient via SMTP.
- 🗒️ Every node has a Sticky Note explaining setup and purpose.

**How to set up**
1. Replace the placeholders in the nodes with your own details:
   - YOUR_TAUTULLI_URL
   - YOUR_API_KEY
   - YOUR_PLEX_TOKEN
   - YOUR_PLEX_SERVER_ID
2. Update the recipient list in Prepare Emails for Recipients.
3. Add your SMTP credentials in Send Newsletter Emails.
4. (Optional) Customize the HTML/CSS in Generate HTML Newsletter.

**Requirements**
- Plex Media Server with Tautulli installed.
- SMTP account (Gmail, custom domain, etc.).

**Customization**
- Change the schedule to daily/weekly as needed.
- Edit the HTML template for your own branding.
- Extend with additional nodes (Discord, Slack, etc.).

⚡ Workflow overview:
`⏰ Schedule Trigger → 🎬 Fetch Movies → 📺 Fetch TV → 🔗 Merge → 📰 Build HTML → 📧 Prepare Recipients → 📤 Send → ✅ Finish`
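Tautulli exposes a single `/api/v2` endpoint, and the fetch nodes above call its `get_recently_added` command. A sketch of the URL those nodes build, keeping the workflow's placeholders (the exact parameters the workflow passes, such as the item count, are assumptions):

```javascript
// Build a Tautulli get_recently_added request URL.
function tautulliUrl(baseUrl, apiKey, mediaType, count = 10) {
  const params = new URLSearchParams({
    apikey: apiKey,
    cmd: "get_recently_added",
    media_type: mediaType, // "movie" or "show"
    count: String(count),
  });
  return `${baseUrl}/api/v2?${params}`;
}

console.log(tautulliUrl("http://YOUR_TAUTULLI_URL:8181", "YOUR_API_KEY", "movie"));
```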
by Avkash Kakdiya
**How it works**
This workflow automates the generation of ad-ready product images by combining product and influencer photos with AI styling. It runs on a scheduled trigger, fetches data from Google Sheets, and retrieves product and influencer images from Google Drive. The images are processed with OpenAI and OpenRouter to generate enhanced visuals, which are then saved back to Google Drive. Finally, the result is logged into Google Sheets with a ready-to-publish status.

**Step-by-step**

1. **Trigger & data preparation**
- **Schedule Trigger** – Runs the workflow automatically on a set schedule.
- **Google Sheets (Get the Raw)** – Retrieves today's product and model URLs.
- **Google Drive (Download Product Image)** – Downloads the product image.
- **Google Drive (Download Influencer Image)** – Downloads the influencer image.
- **Extract from File (Binary → Base64)** – Converts both product and model images for AI processing.

2. **AI analysis & image generation**
- **OpenAI (Analyze Image)** – Creates an ad-focused visual description (lighting, mood, styling).
- **HTTP Request (OpenRouter Gemini)** – Generates an AI-enhanced image combining product + influencer.
- **Code Node (Cleanup)** – Cleans the base64 output to remove extra prefixes.
- **Convert to File** – Transforms the AI output into a proper image file.

3. **Save & update**
- **Google Drive (Upload Image)** – Uploads the generated ad image to the target folder.
- **Google Sheets (Append/Update Row)** – Stores the Drive link and updates the publish status.

**Why use this?**
- Automates the entire ad-image creation process without manual design work.
- Ensures product visuals are consistent, styled, and ad-ready.
- Saves final creatives in Google Drive for easy access and sharing.
- Keeps campaign tracking organized by updating Google Sheets automatically.
- Scales daily ad production efficiently for multiple products or campaigns.
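The cleanup Code node mentioned in step 2 likely strips a data-URI prefix so Convert to File receives raw base64. A plausible sketch (the exact prefix the model returns is an assumption):

```javascript
// Strip a data-URI prefix such as "data:image/png;base64," so only
// raw base64 remains for the Convert to File node.
function cleanBase64(output) {
  return output.replace(/^data:image\/[a-z+]+;base64,/, "").trim();
}

console.log(cleanBase64("data:image/png;base64,iVBORw0KGgo="));
// raw base64 without the data-URI prefix
```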
by Mohamed Abubakkar
**How it works**
This workflow fully automates the reconciliation process between your Local Database transactions and Payment Gateway transactions. It compares both data sources, identifies mismatches, categorizes discrepancies, logs them into Google Sheets, generates a final summary, and sends an automated reconciliation report to your finance team. This ensures accurate, consistent, and error-free financial reporting without manual work.

**Key features**
- Automatic data extraction from two Google Sheets
- Transaction comparison with result categorization
- Duplicate detection
- Real-time discrepancy logging
- Summary generation and storage
- Automated email reporting
- Zero manual effort required

**Setup steps**
1. **Connect required credentials** – You must connect the following:
   - Google Sheets (Service Account or OAuth)
   - Email SMTP (Gmail or custom)
2. **Replace default values** – Update the workflow with:
   - Your Google Sheet IDs
   - Your tab/sheet names
   - SMTP email, sender, and recipient
   - Optional: custom domain or business branding
3. **Customize the email template** – Modify the subject, message body, or formatting style based on your reporting standards.
4. **Adjust the trigger** – You may choose:
   - Manual Trigger
   - Cron Trigger for daily/weekly reconciliation
   - Webhook Trigger integrated with your system

**Detailed process flow**
1. **Fetch Local & Payment Gateway transactions** – The workflow reads all transaction records from the Local Database sheet and the Payment Gateway sheet.
2. **Compare both transaction sets** – Using the compare operation, the workflow splits the results into:
   a. Valid transactions
   b. Invalid transactions
   c. Amount-difference transactions
   d. Missing transactions
3. **Duplicate transaction detection** – The workflow scans local transactions to detect duplicates and logs them separately.
4. **Log transactions** – Each category is appended to its respective Google Sheet: DuplicateData, AmountDifference, DataNotInsert, Reconciliation Summary, RealData.
5. **Count all categories** – The workflow counts the number of valid, invalid, missing, and amount-mismatch transactions, then appends the final summary row to a dedicated Reconciliation Summary sheet.
6. **Send the final email report** – The finance team receives an email with the final summary report.

**Final output**
At the end of the workflow, you get:
- Fully categorized reconciliation logs
- A complete summary stored in Sheets
- An email report
- Clean, audit-ready data
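The comparison step above can be sketched as matching records by transaction id and bucketing mismatches. Field names (`id`, `amount`) are assumptions about the sheet columns, not taken from the workflow:

```javascript
// Match local transactions against gateway transactions by id and
// categorize each as valid, amount-difference, or missing.
function reconcile(localTxns, gatewayTxns) {
  const gatewayById = new Map(gatewayTxns.map(t => [t.id, t]));
  const result = { valid: [], amountDifference: [], missing: [] };
  for (const txn of localTxns) {
    const match = gatewayById.get(txn.id);
    if (!match) result.missing.push(txn);
    else if (match.amount !== txn.amount) result.amountDifference.push({ txn, match });
    else result.valid.push(txn);
  }
  return result;
}

const local = [{ id: "T1", amount: 100 }, { id: "T2", amount: 50 }, { id: "T3", amount: 75 }];
const gateway = [{ id: "T1", amount: 100 }, { id: "T2", amount: 55 }];
console.log(reconcile(local, gateway));
// T1 valid, T2 amount difference, T3 missing from the gateway
```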
by Supira Inc.
**How it works**
This workflow automatically classifies incoming Gmail messages into categories such as High Priority, Inquiry, and Finance/Billing, and then generates professional draft replies using GPT-4. By combining Gmail integration with AI-powered text generation, the workflow helps business owners and freelancers reduce the time spent managing emails while ensuring that important messages are handled quickly and consistently.

When a new email arrives, the workflow:
1. Triggers via Gmail.
2. Uses an AI classifier to categorize the message.
3. Applies the appropriate Gmail label.
4. Passes the email body to GPT-4 to generate a tailored draft reply.
5. Saves the draft in Gmail, ready for review and sending.

**Requirements**
- A Gmail account with API access enabled.
- An OpenAI API key with GPT-4 access.
- An n8n account or self-hosted instance.

**Setup instructions**
1. Import this workflow into your n8n instance.
2. Under Credentials, connect your Gmail account and OpenAI API key.
3. Replace the placeholder YOUR_LABEL_ID_XXX values with your Gmail label IDs (obtainable via Gmail → List Labels).
4. Execute the workflow and check that draft replies are generated in your Gmail account.

**Customization**
- Add or edit categories to fit your business needs (e.g., "Sales Leads" or "Support").
- Adjust the GPT-4 prompts inside each "Generate Draft" node to match your preferred tone and style.
- Combine with other workflows (e.g., CRM integration, Slack alerts) for a complete email automation system.

This template is especially useful for small businesses and freelancers who want to save time, improve response speed, and maintain professional communication without manually writing every reply.
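The YOUR_LABEL_ID_XXX placeholders stand for a category → Gmail label-ID mapping. A hypothetical sketch of that mapping (the placeholder values below are intentionally left as-is; substitute your own label IDs):

```javascript
// Map each classifier category to the Gmail label that should be
// applied. Replace each placeholder with a real label ID.
const labelIdByCategory = {
  "High Priority":   "YOUR_LABEL_ID_XXX",
  "Inquiry":         "YOUR_LABEL_ID_XXX",
  "Finance/Billing": "YOUR_LABEL_ID_XXX",
};

function labelFor(category) {
  return labelIdByCategory[category] ?? null; // unknown categories get no label
}

console.log(labelFor("Inquiry"));
```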
by SIÁN Agency
**Who is this for**
Real estate investors comparing markets across cities, agencies generating market reports for clients, property consultants doing due diligence, or analysts tracking price trends in Southern European property markets.

**What this workflow does**
Every Monday at 8am, this workflow scrapes property listings from multiple Idealista markets, calculates key statistics, builds an HTML comparison report, emails it to you, and logs data to Google Sheets for long-term trend tracking.

1. The Schedule Trigger fires every Monday at 8am.
2. Two Idealista Scraper nodes fetch Madrid and Barcelona listings in parallel via API-based extraction (never breaks).
3. Code nodes calculate per-market statistics: average/median price, price range, price per m², average size, average rooms.
4. The Merge node combines both market analyses into one dataset.
5. A Code node builds a professionally formatted HTML comparison table.
6. The report is emailed via Gmail and weekly stats are logged to Google Sheets.

Idealista has no official API and no built-in market analytics. This workflow turns raw listing data into actionable market intelligence, automatically, every week.
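The per-market statistics in step 3 can be sketched as below. Listing field names (`price`, `size`) are assumptions about the scraper output, and rooms are omitted for brevity:

```javascript
// Compute average/median price, price range, price per m², and
// average size for one market's listings.
function marketStats(listings) {
  const avg = xs => xs.reduce((s, x) => s + x, 0) / xs.length;
  const prices = listings.map(l => l.price).sort((a, b) => a - b);
  const mid = Math.floor(prices.length / 2);
  const median = prices.length % 2
    ? prices[mid]
    : (prices[mid - 1] + prices[mid]) / 2;
  return {
    avgPrice: avg(prices),
    medianPrice: median,
    minPrice: prices[0],
    maxPrice: prices[prices.length - 1],
    avgPricePerM2: avg(listings.map(l => l.price / l.size)),
    avgSize: avg(listings.map(l => l.size)),
  };
}

const sample = [
  { price: 200000, size: 80 },
  { price: 300000, size: 100 },
  { price: 400000, size: 120 },
];
console.log(marketStats(sample));
```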
**Setup**
1. Install n8n-nodes-idealista-scraper via Settings > Community Nodes (self-hosted n8n only).
2. Add your Apify API credential (get token).
3. Add your Gmail credential (OAuth2).
4. Create a Google Sheet with a tab named "MarketHistory".
5. Update the email recipient in the Gmail node.
6. Activate the workflow.

**Requirements**
- Self-hosted n8n instance (community node not available on n8n Cloud)
- Apify account with API token
- Gmail account with OAuth2 credential configured in n8n
- Google Sheets account with OAuth2 credential configured in n8n

**How to customize this workflow**
- Add more cities by duplicating a Scraper + Analysis pair (Valencia, Rome, Lisbon, Milan).
- Switch the operation from sale to rent to analyze rental markets.
- Add price filters to focus on specific segments (luxury above 1M EUR, budget below 200K EUR).
- Calculate rental yield by scraping both sale and rent for the same area.
- Add an IF node after the analysis to send alerts when the average price drops below a threshold.

Cost: $0.50/week (2 markets × 3 pages × 40 properties each)
by Jason Krol
Using the power and ease of Telegram, send a simple text or audio message to a bot with a request to add a new task to your Notion Tasks database.

**How it works**
ChatGPT is used to transcribe the audio or text message, parse it, and determine the title to add as a new Notion task. You can optionally include a "do date" as well, and ChatGPT will include that when creating the task. Once complete, you will receive a simple confirmation message back.

**Minimal setup required**
Just follow n8n's instructions on how to connect to Telegram and create your own chatbot, provide the chat ID in the 2 Telegram nodes, and you're finished! A few optional settings include tweaking the ChatGPT system prompt (unnecessary) and the timezone for your Notion task(s).
by clearcue.ai
**Who's it for**
This workflow is for marketers, founders, and content strategists who want to identify business opportunities by analyzing Reddit discussions. It's ideal for B2B, SaaS, and tech professionals looking for fresh LinkedIn post ideas or trend insights.

**How it works / What it does**
This workflow automatically:
1. Fetches Reddit posts & comments based on a selected subreddit and keyword.
2. Extracts pain points & insights using OpenAI (ChatGPT) to identify key frustrations and trends.
3. Generates LinkedIn post ideas with headlines, hooks, and CTAs tailored for professional audiences.
4. Saves all results into Google Sheets for easy tracking, editing, and sharing.

It uses AI to turn unstructured Reddit conversations into actionable content marketing opportunities.

**How to set up**
1. Clone this workflow in your n8n instance.
2. Configure credentials:
   - Reddit OAuth2 (for fetching posts & comments)
   - OpenAI API key (no hardcoding; use credentials in n8n)
   - Google Sheets OAuth2 (for output)
3. Run the workflow or trigger it using the built-in Form Trigger (provide a subreddit & keyword).
4. Check the generated Google Sheet for analyzed insights and post suggestions.

**Requirements**
- n8n (self-hosted or cloud)
- Reddit account with API credentials
- OpenAI API key (GPT-4o recommended)
- Google Sheets account

**How to customize the workflow**
- Change the AI prompt to adjust the tone or depth of insights.
- Add filtering logic to target posts with higher engagement.
- Modify the Google Sheets output schema to include custom fields.
- Extend it with Slack/Email notifications to instantly share top insights.
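The suggested engagement filter can be sketched as below. The field names follow the Reddit API's post listing (`score`, `num_comments`); the thresholds are illustrative defaults, not values from the workflow:

```javascript
// Keep only posts with enough upvotes and discussion to be worth
// feeding into the AI analysis step.
function filterByEngagement(posts, minScore = 50, minComments = 10) {
  return posts.filter(p => p.score >= minScore && p.num_comments >= minComments);
}

const posts = [
  { title: "Our CRM keeps losing leads", score: 120, num_comments: 45 },
  { title: "Low-effort meme", score: 12, num_comments: 2 },
];
console.log(filterByEngagement(posts)); // keeps only the first post
```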