by Maximiliano Rojas-Delgado
**Scale Your Creative Strategy: 100+ Creative Ads from 1 Image using Fal.AI's Nano Banana Model and GPT-5.1**

This workflow turns a single reference image into up to 100 high-performing ad variations using Fal.AI's Nano Banana model and GPT-5.1. Simply upload your inspiration to a form, and watch as unique, campaign-ready images appear in your Google Drive.

**Why use this?**
- **Scale Instantly:** Turn one idea into a full campaign without manual design work.
- **Smart Variations:** Uses GPT-5.1 to analyze your reference and rewrite prompts based on proven creative recipes (Metaphors, Pointers, Comparisons).
- **Cost Effective:** Generates 100 variations for approximately **$4.60** (vs. hundreds of dollars with manual design).

**How does it work?**
1. You submit the form with your reference image, company name, and selected "Creative Recipe."
2. GPT-5.1 analyzes your image deeply ("Describe the image extremely comprehensively...").
3. It rewrites the prompt into multiple unique variations based on your chosen strategy (e.g., "The Metaphor Bake").
4. Fal.AI generates the images using the fast and efficient Nano Banana model.
5. The results are saved directly to your Google Drive output folder.

**What do you need?**
- **Google Drive credentials:** To upload your reference and save the results.
- **OpenAI API key:** For GPT-5.1 analysis and prompt engineering.
- **Fal.AI API key:** For generating the images (ensure you have credits).
- **Google Drive folders:** One for inputs and one for outputs.

**💰 Estimated Costs**
- **Fal.AI:** ~$4.00 for 100 images ($0.04/image).
- **GPT-5.1:** ~$0.60 for 100 descriptions (input: $1.25/1M tokens; output: $10.00/1M tokens).
- **Total:** ~$4.60 per 100-ad campaign.

**Important: The Prompt Strategy**
This automation doesn't just copy your image. It uses a specific instruction: "Describe the image extremely comprehensively. Don't leave anything behind." This ensures GPT-5.1 captures every visual detail—lighting, composition, and mood—before rewriting it into new creative angles.

That's it! Just fill in the form and let the automation build your ad campaign. If you have any questions, contact me on X: @maxrojasdelgado.
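The cost math above can be reproduced with a small sketch. The per-image rate and token prices come from the estimates in this description; the per-prompt token counts are illustrative guesses chosen to land near the quoted ~$0.60 GPT-5.1 total.

```python
# Back-of-envelope check of the quoted campaign cost. Rates come from the
# description above; token counts per prompt are illustrative assumptions.
FAL_COST_PER_IMAGE = 0.04   # USD per Nano Banana generation
GPT_INPUT_PER_M = 1.25      # USD per 1M input tokens
GPT_OUTPUT_PER_M = 10.00    # USD per 1M output tokens

def campaign_cost(n_images, in_tokens=1200, out_tokens=450):
    """Estimated USD cost for n_images variations (token counts per prompt)."""
    image_cost = n_images * FAL_COST_PER_IMAGE
    gpt_cost = n_images * (in_tokens * GPT_INPUT_PER_M +
                           out_tokens * GPT_OUTPUT_PER_M) / 1_000_000
    return round(image_cost + gpt_cost, 2)

print(campaign_cost(100))  # → 4.6
```

Actual spend will vary with prompt length and current Fal.AI/OpenAI pricing, so treat this as an order-of-magnitude check rather than a quote.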
by Rahul Joshi
Description: Automatically classify invoices by industry (Retail, Manufacturing, or EdTech) using GPT-4o-powered AI parsing in this intelligent n8n automation template. Designed for teams managing high-volume billing data, this workflow fetches invoices from Google Drive, extracts PDF text, classifies each document using AI, and automatically moves files to the correct folder based on the predicted industry.

This smart auto-sorting system turns your invoice processing into a zero-touch AI workflow—ideal for finance teams, document processing agencies, and operations managers dealing with multi-client or multi-industry invoicing pipelines.

**What This Template Does (Step-by-Step)**
- 📂 **Google Drive Search** - Scans a designated folder (e.g., "Incoming Invoices") and collects all PDF files available for classification.
- ⬇️ **Download & Extract PDF Text** - Downloads each file using the Google Drive API and extracts invoice text from PDFs using the "Extract from File" node.
- 🔁 **Batch Handling** - Loops through each invoice using the SplitInBatches node, ensuring each document is processed one at a time.
- 🧠 **GPT-4o Mini via LangChain Agent** - Sends the extracted invoice content to GPT-4o, which classifies the document into one of Retail, Manufacturing, or EdTech and returns clean, structured output.
- 🔀 **Smart Switch Logic** - Evaluates the classification result and routes the invoice to the correct folder based on its predicted industry.
- 📁 **Auto-Move Files** - Uses the Google Drive API to move files into industry-specific folders: Retail → Folder A, Manufacturing → Folder B, EdTech → Folder C.

**Required Integrations:**
✅ Google Drive (OAuth2 authentication)
✅ Azure OpenAI (GPT-4o or compatible model)
✅ LangChain agent setup in n8n

**Best For:**
🧾 Finance teams classifying vendor or client invoices
🏭 Companies handling multi-industry procurement
🧠 AI automation agencies building custom document sorters
🗂️ Back-office automation for Google Drive file workflows

**Key Benefits:**
💡 No manual labeling required — AI classifies based on content
📦 Automatically moves files to clean, organized folders
🔄 Works in batch mode for bulk invoice handling
💬 Simple prompt customization for other classification types
🧠 GPT-4o-powered classification ensures high accuracy
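The classify-then-route step can be sketched as follows. This is a hypothetical illustration, not the template's actual code: the folder IDs are placeholders, and in the real workflow the LangChain agent and Google Drive nodes do this work.

```python
# Hypothetical sketch of the classification prompt and Switch routing; the
# folder IDs are placeholders. The real workflow uses the LangChain agent
# and Google Drive nodes rather than plain Python.
INDUSTRIES = ("Retail", "Manufacturing", "EdTech")
FOLDERS = {"Retail": "folder_a_id",
           "Manufacturing": "folder_b_id",
           "EdTech": "folder_c_id"}

def build_prompt(invoice_text):
    """Ask the model for exactly one label so the Switch node can route on it."""
    return ("Classify this invoice into exactly one industry: "
            + ", ".join(INDUSTRIES) + ". Reply with the industry name only.\n\n"
            + invoice_text)

def route(classification):
    """Mimic the Switch node: map the model's label to a Drive folder ID."""
    label = classification.strip()
    if label not in FOLDERS:
        raise ValueError(f"unexpected label: {label!r}")
    return FOLDERS[label]

print(route("Retail"))  # → folder_a_id
```

Constraining the model to reply with the label only is what keeps the Switch logic simple; anything unexpected falls through to an error you can catch.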
by Yashraj singh sisodiya
**Bakery Data Analytics Workflow Explanation**

**Aim**
The aim of the Bakery Data Analytics Workflow is to automate the analysis of bakery sales and stock data stored in Google Sheets. It lets bakery owners or managers interact with an AI assistant via chat and receive clear, concise, actionable insights about their business performance without manually reviewing spreadsheets.

**Goal**
- Enable users to query bakery sales and stock data through a chat interface.
- Use an AI Agent to interpret user queries and fetch the required data.
- Retrieve relevant sales/stock figures from a Google Sheets dataset.
- Generate insights in plain English, with short summaries, highlights, or breakdowns.
- Maintain conversation context so users can ask follow-up questions naturally.

This ensures bakery owners can make quick, informed decisions about sales trends, inventory shortages, or product performance with minimal manual effort.

**Requirements**
- **n8n Platform** - The automation platform hosting the workflow.
- **When chat message received (Trigger)** - Captures user input via chat and initiates workflow execution.
- **AI Agent** - The central reasoning engine. Interprets queries, decides when to fetch data, and returns professional responses as short, structured insights (bullets, tables, or compact summaries).
- **Simple Memory** - Stores short-term conversation history and maintains context across multiple user queries.
- **Retrieve bakery data (Google Sheets)** - Connects to a linked Google Sheets file and fetches sales/stock data (e.g., daily totals, item performance). Data source: Bakery Google Sheet.
- **Azure OpenAI Chat Model** - The backend language model powering the AI Agent. Provides natural language understanding and generates concise responses.

**Credentials**
- Google Sheets OAuth2 account (for accessing bakery data).
- Azure OpenAI API account (for AI-driven reasoning and conversation).

**Input** - A user question/query via chat (e.g., "What was the best-selling pastry last week?").
**Output** - Compact, conversational insights (totals, highlights, trends) delivered via chat.

**API Usage**
- **Google Sheets API** - Used by the Retrieve bakery data node. Fetches structured data (sales, stock, dates) from the bakery dataset and gives the AI Agent real-time data access.
- **Azure OpenAI API** - Used by the Azure OpenAI Chat Model node. Powers natural conversation and ensures responses are plain English, concise, and business-focused, following the AI Agent's rules to avoid assumptions and provide actionable insights only when asked.

**Workflow Summary**
The Bakery Data Analytics Workflow automates bakery performance analysis by:
1. Triggering on chat message input.
2. Passing the query to the AI Agent for interpretation.
3. Using Simple Memory to track context across the conversation.
4. Fetching relevant data from Google Sheets when needed.
5. Leveraging Azure OpenAI to generate structured, professional responses.

This creates an interactive AI-powered assistant for bakery data, enabling quick insights into sales and inventory trends without manually combing through spreadsheets.
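To make the "compact insights" concrete, here is an illustrative sketch of the kind of aggregation the AI Agent performs once the Google Sheets node returns rows. The column names and sample rows are assumptions for illustration, not the template's actual schema.

```python
# Illustrative sketch of agent-style aggregation over sheet rows; column
# names (date, item, units_sold) are assumptions, not the real schema.
from collections import defaultdict

rows = [
    {"date": "2024-06-01", "item": "Croissant", "units_sold": 40},
    {"date": "2024-06-01", "item": "Baguette", "units_sold": 25},
    {"date": "2024-06-02", "item": "Croissant", "units_sold": 35},
]

def best_seller(rows):
    """Total units per item and report the top performer in plain English."""
    totals = defaultdict(int)
    for r in rows:
        totals[r["item"]] += r["units_sold"]
    item, units = max(totals.items(), key=lambda kv: kv[1])
    return f"Best seller: {item} ({units} units)"

print(best_seller(rows))  # → Best seller: Croissant (75 units)
```

In the workflow itself, the language model does this reasoning over the fetched rows and phrases the answer conversationally; the sketch just shows the shape of the computation.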
by Rahul Joshi
Description: Deliver instant answers and automate customer support on WhatsApp with this intelligent n8n workflow template! The system routes incoming messages using keyword-based logic and provides dynamic, AI-powered responses for greetings, FAQs, and complex queries—ensuring your customers always get the right reply without manual effort.

This automation is designed for businesses, service providers, and support teams who want to streamline WhatsApp engagement, reduce manual workload, and provide consistent, conversational answers that scale with demand.

**What This Template Does (Step-by-Step):**
- 📲 **Capture Incoming WhatsApp Messages** - Triggers on every new message received via the WhatsApp API.
- 🔄 **Keyword-Based Routing** - Sequential IF conditions check for predefined keywords (e.g., "hi", "pricing", "support").
- 💬 **Send Tailored Keyword Responses** - Returns fast, pre-written responses for greetings, FAQs, and common scenarios.
- 🤖 **AI-Powered Fallback with OpenAI Chat Model** - For advanced or unrecognized queries, the workflow generates context-aware, conversational answers using AI.
- 🚀 **Deliver Automated Replies in Real Time** - Replies are sent back to WhatsApp instantly for seamless customer communication.
- 📊 **Optional: Conversation Logging** - Extend the template to log chats in Notion, Airtable, or your CRM for tracking and insights.

**Perfect For:**
- Customer support teams handling repetitive queries
- Businesses wanting instant replies for FAQs & greetings
- Service providers delivering personalized, scalable engagement
- Anyone looking to combine rule-based automation with AI intelligence

**Built With:**
- WhatsApp API (message triggers & replies)
- n8n IF node (keyword routing)
- OpenAI Chat Model (AI fallback for complex queries)
- Extendable storage (Notion, Google Sheets, Airtable, etc.)

**Key Benefits:**
✅ Faster, automated customer support on WhatsApp
🔍 Accurate, human-like replies for complex questions
🧠 Hybrid system: keyword rules + AI intelligence
📒 Centralized chat logging for insights (optional)
🛠 100% no-code and customizable in n8n
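The hybrid keyword-plus-AI routing described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the keywords and canned replies are placeholders, and `ai_fallback` stands in for the OpenAI Chat Model node.

```python
import re

# Minimal sketch of keyword routing with an AI fallback; keywords, replies,
# and ai_fallback are placeholders for the workflow's IF and OpenAI nodes.
CANNED = {
    "hi": "Hello! How can we help you today?",
    "pricing": "Our plans start at $10/month. Reply PLANS for details.",
    "support": "Please describe your issue and an agent will follow up.",
}

def reply(message, ai_fallback=lambda m: f"[AI answer for: {m}]"):
    text = message.lower()
    for keyword, answer in CANNED.items():            # sequential IF checks
        if re.search(rf"\b{re.escape(keyword)}\b", text):
            return answer
    return ai_fallback(message)                       # unrecognized → AI model

print(reply("Hi there"))
print(reply("Do you integrate with Shopify?"))
```

The word-boundary match (`\b`) avoids false hits when a keyword happens to appear inside a longer word, which is worth replicating when you configure the IF conditions.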
by Adrian Bent
Part two of the Indeed Job Scraper, Filter, and Enrichment workflow. This workflow takes the scraped and filtered Indeed job listings (collected via Apify and stored in Google Sheets) and generates a customized, five-line email icebreaker that implies the rest of the email is personalized. Personalized IJSFE (Indeed Job Scraper For Enrichment). ++ I am an engineering student, so I love my acronyms.

**Benefits**
- **Instant Icebreaker Generation** - Converts hours of research, copywriting, and personalization into seconds, automatically.
- **Live Integration** - Generate and send personalized icebreakers whenever, wherever.
- **Virtually Complete Automation** - From research into the company and job description to a personalized response, this workflow does it in a click.
- **Professional Presentation** - Because the chatbot has context about the company and the job listing, it generates an icebreaker that makes the reader think real research was done.

**How It Works**
1. **Google Sheets Search:** The Google Sheets node fetches all rows where the icebreaker column is empty. Each returned item is a row containing information about the company and the job listing.
2. **AI Personalization:** Uses sophisticated GPT-4 prompting to convert the job posting and company information into a customized, five-line personalized email icebreaker, applying a consistent, casual tone to read as human-written.
3. **Database Update:** Updates all fetched rows, changing only the icebreaker column with the new personalized icebreaker. Each item is returned as a row containing the company info, the job listing, and the icebreaker.

**Required Template Setup**

Google Sheets Template: Create a Google Sheet with these columns:
- jobUrl - Unique identifier for job listings
- title - Position title
- descriptionText - Description of the job listing
- hiringDemand/isHighVolumeHiring - Are they hiring at high volume?
- hiringDemand/isUrgentHire - Are they hiring with high urgency?
- isRemote - Is this job remote?
- jobType/0 - Job type: in person, remote, part-time, etc.
- companyCeo/name - CEO name collected from Tavily's search
- icebreaker - AI-generated icebreaker for each job listing
- scrapedCeo - CEO name collected from the Apify scraper
- email - Email listed on the job listing
- companyName - Name of the company that posted the job
- companyDescription - Description of the company that posted the job
- companyLinks/corporateWebsite - Website of the company that posted the job
- companyNumEmployees - Number of employees the company listed
- location/country - Location where the job takes place
- salary/salaryText - Salary on the job listing

**Setup Instructions**

Google Sheets Search & Update Setup:
1. Create a new Google Sheet with these column headers in the first row (name the sheet whatever you please).
2. Connect your Google Sheets OAuth credentials in n8n.
3. Update the document ID in the workflow nodes.

The search logic in the first Google Sheets node relies on the ID column for icebreaker generation, so this structure is essential for the workflow to function correctly. Feel free to reach out for additional help or clarification at my Gmail: terflix45@gmail.com, and I'll get back to you as soon as I can.

AI Icebreaker Generation Setup:
1. Configure the OpenAI API for sophisticated proposal writing.
2. Implement example-based training with input/output pairs for more specific output.
3. Set up JSON formatting for structure (personally, I find JSON easier to handle as an output).

Setup Steps:
1. **Search & Fetch Rows Setup** - Create a Google Sheets database with the provided column structure, connect Google Sheets OAuth credentials, and configure the filter on the get-rows node to include only rows with an empty icebreaker column.
2. **Set up AI Personalization** - Add OpenAI API credentials for personalized icebreaker generation and customize the AI prompts for your specific niche, requirements, or interests.
3. **Update Google Sheets Setup** - Map all items to their respective columns based on the row number. All fields in the update-sheets node should have the same values as the first Sheets node, with the icebreaker field taking the ChatGPT output as its value.
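The fetch-filter-update pattern used here can be sketched as follows. The sample rows are made up, and `generate()` is a stand-in for the GPT-4 call; the point is that only rows with an empty icebreaker are selected, and only that one column is written back.

```python
# Sketch of the fetch → generate → write-back pattern. generate() stands in
# for the GPT-4 call; the sample rows below are made up for illustration.
def rows_needing_icebreaker(rows):
    """Mirror the first Sheets node's filter: icebreaker column is empty."""
    return [r for r in rows if not r.get("icebreaker")]

def enrich(rows, generate=lambda r: f"Loved what {r['companyName']} is doing..."):
    for r in rows_needing_icebreaker(rows):
        r["icebreaker"] = generate(r)   # update only the icebreaker column
    return rows

sheet = [
    {"jobUrl": "https://indeed.com/job/1", "companyName": "Acme", "icebreaker": ""},
    {"jobUrl": "https://indeed.com/job/2", "companyName": "Globex", "icebreaker": "done"},
]
enrich(sheet)
print(sheet[0]["icebreaker"])
```

Rows that already have an icebreaker pass through untouched, which is what makes the workflow safe to re-run against the same sheet.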
by vinci-king-01
**Deep Research Agent with AI Analysis and Multi-Source Data Collection**

🎯 **Target Audience**
- Market researchers and analysts
- Business intelligence teams
- Academic researchers and students
- Content creators and journalists
- Product managers conducting market research
- Consultants performing competitive analysis
- Data scientists gathering research data
- Marketing teams analyzing industry trends

🚀 **Problem Statement**
Manual research processes are time-consuming, inconsistent, and often miss critical information from multiple sources. This template solves the challenge of automating comprehensive research across web, news, and academic sources while providing AI-powered analysis and actionable insights.

🔧 **How it Works**
This workflow automatically conducts deep research on any topic using AI-powered web scraping, collects data from multiple source types, and provides comprehensive analysis with actionable insights.

Key Components:
1. **Webhook Trigger** - Receives research requests and initiates the automated research process
2. **Research Configuration Processor** - Validates and processes research parameters and generates search queries
3. **Multi-Source AI Scraping** - Uses ScrapeGraphAI to collect data from web, news, and academic sources
4. **Data Processing Engine** - Combines and structures data from all sources for analysis
5. **AI Research Analyst** - Uses GPT-4 to provide comprehensive analysis and insights
6. **Data Storage** - Stores all research findings in Google Sheets for historical tracking
7. **Response System** - Returns structured research results via webhook response

📊 **Google Sheets Column Specifications**
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| sessionId | String | Unique research session identifier | "research_1703123456789" |
| query | String | Research query that was executed | "artificial intelligence trends" |
| timestamp | DateTime | When the research was conducted | "2024-01-15T10:30:00Z" |
| analysis | Text | AI-generated comprehensive analysis | "Executive Summary: AI trends show..." |
| totalSources | Number | Total number of sources analyzed | 15 |

🛠️ **Setup Instructions**
Estimated setup time: 20-25 minutes

Prerequisites:
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- OpenAI API account and credentials
- Google Sheets account with API access

Step-by-Step Configuration:
1. **Install Community Nodes** - Install the required community node: `npm install n8n-nodes-scrapegraphai`
2. **Configure ScrapeGraphAI Credentials** - Navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials, enter your API key from the ScrapeGraphAI dashboard, and test the connection to ensure it's working.
3. **Set up OpenAI Credentials** - Add OpenAI API credentials, enter your API key from the OpenAI dashboard, ensure you have access to the GPT-4 model, and test the connection to verify API access.
4. **Set up Google Sheets Connection** - Add Google Sheets OAuth2 credentials, grant the necessary permissions for spreadsheet access, create a new spreadsheet for research data, and configure the sheet name (default: "Research_Data").
5. **Configure Research Parameters** - Update the webhook endpoint URL, customize the default research parameters in the configuration processor, set the search query generation logic, and configure research depth levels (basic, detailed, comprehensive).
6. **Test the Workflow** - Send a test webhook request with research parameters, verify data collection from all source types, check Google Sheets for proper data storage, and validate the AI analysis output quality.

🔄 **Workflow Customization Options**
- **Modify Research Sources:** Add or remove source types (web, news, academic); customize search queries for specific industries; adjust source credibility scoring algorithms; implement custom data extraction patterns.
- **Extend Analysis Capabilities:** Add industry-specific analysis frameworks; implement comparative analysis between sources; create custom insight generation rules; add sentiment analysis for news sources.
- **Customize Data Storage:** Add more detailed metadata tracking; implement research versioning and history; create multiple sheet tabs for different research types; add data export capabilities.
- **Output Customization:** Create custom response formats; add research summary generation; implement citation and source tracking; create executive dashboard integration.

📈 **Use Cases**
- **Market Research:** Comprehensive industry and competitor analysis
- **Academic Research:** Literature reviews and citation gathering
- **Content Creation:** Research for articles, reports, and presentations
- **Business Intelligence:** Strategic decision-making support
- **Product Development:** Market validation and trend analysis
- **Investment Research:** Due diligence and market analysis

🚨 **Important Notes**
- Respect website terms of service and robots.txt files
- Implement appropriate delays between requests to avoid rate limiting
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and compliance requirements
- Validate research findings from multiple sources

🔧 **Troubleshooting**
Common issues:
- ScrapeGraphAI connection errors: verify API key and account status
- OpenAI API errors: check API key and model access permissions
- Google Sheets permission errors: check OAuth2 scope and permissions
- Research data quality issues: review the search query generation logic
- Rate limiting: adjust request frequency and implement delays
- Webhook response errors: check response format and content

Support resources:
- ScrapeGraphAI documentation and API reference
- OpenAI API documentation and model specifications
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
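A test webhook request for this workflow might be built as below. The field names (topic, depth, sources) are assumptions about what the configuration processor expects; check them against your own webhook's parameter names before using this.

```python
import json

# Hedged example of the webhook request body that kicks off a research run.
# Field names (topic, depth, sources) are assumptions for illustration.
def build_research_request(topic, depth="detailed",
                           sources=("web", "news", "academic")):
    assert depth in ("basic", "detailed", "comprehensive")
    return json.dumps({"topic": topic, "depth": depth, "sources": list(sources)})

payload = build_research_request("artificial intelligence trends")
print(payload)
# POST this JSON body to your n8n webhook URL with
# Content-Type: application/json to trigger the workflow.
```

Validating `depth` against the three documented levels client-side saves a round trip when the configuration processor would reject the request anyway.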
by tanaypant
Get daily SMS updates to tell you if you should wear a sweater
by Harshil Agrawal
This workflow allows you to create a customer and an invoice, and send the invoice to the customer. QuickBooks node: Creates a new customer in QuickBooks. QuickBooks1 node: Creates an invoice for the customer created in the previous node. QuickBooks2 node: Sends the invoice created in the previous node to the customer.
by Harshil Agrawal
This workflow allows you to create a collection and create, update, and get a bookmark in Raindrop. Raindrop node: This node will create a new collection in Raindrop. If you already have a collection, you can skip this node. Raindrop1 node: This node will create a new bookmark and add it to a collection. Raindrop2 node: This node will update the bookmark that we created in the previous node. Raindrop3 node: This node will return the information about the bookmark that we created earlier.
by Shahrear
🧾 **Image Extraction Pipeline (Google Drive + VLM Run + n8n)**

⚙️ **What This Workflow Does**
This workflow automates the process of extracting images from documents uploaded to Google Drive using the VLM Run Execute Agent, then downloads the extracted images and saves them to a designated Drive folder.

🧩 **Requirements**
- Google Drive OAuth2 credentials
- VLM Run API credentials with Execute Agent access
- A reachable n8n webhook URL (e.g., /image-extract-via-agent)

⚡ **Quick Setup**
1. Configure Google Drive OAuth2 and create an upload folder and a folder for saving extracted images.
2. Install the verified VLM Run node by searching for "VLM Run" in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.

⚙️ **How It Works**
1. **Monitor Uploads** - The workflow watches a specific Google Drive folder for new file uploads (e.g., receipts, reports, or PDFs).
2. **Download File** - When a file is created, it's automatically downloaded in binary form.
3. **Extract Images (VLM Run)** - The file is sent to the VLM Run Execute Agent, which analyzes the document and returns extracted image URLs via its callback.
4. **Receive Image Links (Webhook)** - The workflow's Webhook node listens for the agent's response containing the extracted image URLs.
5. **Split & Download** - The Split Out node processes each extracted link, and the HTTP Request node downloads each image.
6. **Save Image** - Each image is uploaded to your chosen Google Drive folder for storage or further processing.

💡 **Why Use This Workflow**
Manual image extraction from PDFs and scanned files is repetitive and error-prone. This pipeline automates it using VLM Run, a vision-language AI service that:
- Understands document layout and structure
- Handles multi-page and mixed-content files
- Extracts accurate image data with minimal setup
- Works with both images and PDFs

For example, the callback output contains URLs to the extracted images: `{ "image_urls": ["https://vlm.run/api/files/img1.jpg", "https://vlm.run/api/files/img2.jpg"] }`

🧠 **Perfect For**
- Extracting photos or receipts from multi-page PDFs
- Archiving embedded images from reports or invoices
- Preparing image datasets for labeling or ML model training

🛠️ **How to Customize**
You can extend this workflow by:
- Adding naming conventions or folder structures based on upload type
- Integrating Slack/email notifications when extraction completes
- Logging metadata (file name, timestamp, source) to Google Sheets or a database
- Chaining with classification or OCR workflows using VLM Run's other agents

⚠️ **Community Node Disclaimer**
This workflow uses a community node (VLM Run) that may need additional permissions and custom setup.
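The Split & Download step over that callback payload can be sketched as follows; the payload mirrors the example response above, and the filename derivation is one reasonable way to name the saved files.

```python
from pathlib import PurePosixPath
from urllib.parse import urlparse

# Sketch of the Split Out + HTTP Request steps: iterate over image_urls and
# derive a filename for each download. Payload mirrors the example callback.
callback = {"image_urls": [
    "https://vlm.run/api/files/img1.jpg",
    "https://vlm.run/api/files/img2.jpg",
]}

def filenames(payload):
    """One output item per URL, as the Split Out node produces."""
    return [PurePosixPath(urlparse(u).path).name for u in payload["image_urls"]]

print(filenames(callback))  # → ['img1.jpg', 'img2.jpg']
```

In the workflow itself, each URL is fetched by an HTTP Request node and the binary response is uploaded to the output Drive folder under the derived name.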
by Oneclick AI Squad
This n8n workflow automates the end-to-end proof-of-delivery (POD) process for logistics operations. It ingests POD data via webhook—including driver signatures, delivery photos, and GPS coordinates—performs AI-driven verification for package integrity and authenticity, updates ERP systems with delivery status, triggers automated invoicing for verified cases, and handles disputes by creating evidence-backed tickets and alerting teams. Designed for seamless integration, it minimizes errors in billing and reconciliation while accelerating resolution for mismatches.

**Benefits**
- **Reduced Manual Effort:** Automates verification and status updates, cutting processing time from hours to minutes.
- **Enhanced Accuracy:** AI analysis detects damage, location discrepancies, and signature fraud with high confidence scores, preventing billing disputes.
- **Faster Revenue Cycle:** Instant invoicing for verified deliveries improves cash flow and reduces DSO (Days Sales Outstanding).
- **Proactive Dispute Management:** Generates high-priority tickets with linked evidence, enabling quicker resolutions and lower escalation costs.
- **Audit-Ready Traceability:** Logs all decisions, AI outputs, and actions for compliance with logistics standards like ISO 9001.
- **Scalability:** Handles high-volume deliveries without proportional staff increases, supporting growth in e-commerce fulfillment.

**Useful for Which Industries**
- **Logistics & Supply Chain:** Ideal for 3PL providers, freight forwarders, and courier services managing last-mile deliveries.
- **E-Commerce & Retail:** Supports platforms like Amazon or Shopify sellers verifying customer receipts and automating returns.
- **Manufacturing & Distribution:** Streamlines B2B shipments with ERP integrations for just-in-time inventory.
- **Pharmaceuticals & Healthcare:** Ensures tamper-evident deliveries with photo verification for cold-chain compliance.
- **Food & Beverage:** Tracks perishable goods with damage detection to maintain quality assurance.

**Workflow Process**
1. **Webhook Intake:** Receives the POD submission (driver ID, signature image, delivery photo, recipient, GPS) via POST/GET.
2. **Input Validation:** Checks for required fields; branches to the error path if incomplete.
3. **Parallel AI Verification:**
   - AI Vision (OpenAI GPT-4): Analyzes the photo for package condition, location match, and damage.
   - Signature Validation: AI checks legitimacy, handwritten authenticity, and completeness.
4. **Merge & Decide:** Consolidates results with confidence scoring; routes to verified (true) or dispute (false).
5. **Verified Path:**
   - Update ERP: POSTs status, timestamps, and coordinates to the delivery system.
   - Trigger Invoicing: Generates a billable invoice with a POD reference via the billing API.
   - Success Response: Returns confirmation to the caller.
6. **Dispute Path:**
   - Create Ticket: POSTs a high-priority support ticket with evidence (images, scores).
   - Alert Team: Notifies the dispute team via email/Slack with an issue summary and ticket link.
   - Dispute Response: Returns status and next steps to the caller.
7. **Error Handling:** Returns detailed feedback for invalid inputs.

**Setup Instructions**
1. Import Workflow: Paste the JSON into n8n via Workflows → Import from Clipboard.
2. Configure Webhook: Set the URL for POD submissions (e.g., from mobile apps); test with sample POST data.
3. AI Setup: Add your OpenAI API key to the vision/signature nodes; specify the GPT-4 model.
4. Integrate Systems: Update the ERP/billing URLs and auth in the update/trigger nodes (e.g., https://your-erp.com/api).
5. Dispute Config: Link the support API (e.g., Zendesk) and notification service (e.g., Slack webhook).
6. Threshold Tuning: Adjust confidence scores in the decision node (e.g., >85% for auto-approve).
7. Test Run: Execute manually with valid and invalid POD samples; verify ERP updates and ticket creation.

**Prerequisites**
- n8n instance (v1.50+) with webhook and HTTP Request nodes enabled.
- OpenAI API access for GPT-4 vision (image analysis credits required).
- ERP/billing APIs with POST endpoints and authentication (e.g., OAuth tokens).
- Support ticketing system (e.g., Zendesk, Jira) for dispute creation.
- Secure image storage (e.g., AWS S3) for POD uploads.
- Basic API testing tools (e.g., Postman) for endpoint validation.

**Modification Options**
- Add OCR for recipient name extraction from photos in the validation step.
- Integrate geofencing APIs for automated location alerts in AI vision.
- Support multi-signature PODs for group deliveries by expanding the parallel branches.
- Add partial invoicing logic for mixed verified/disputed items.
- Incorporate blockchain for immutable POD records in high-value shipments.
- Extend alerts to SMS via Twilio for on-the-road driver notifications.
- Build an analytics export to Google Sheets for delivery success rates.
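The merge-and-decide step can be sketched as below. The 0.85 threshold mirrors the >85% tuning example in the setup instructions; treating the lower of the two AI scores as the overall confidence is an assumption, not necessarily what the template's decision node does.

```python
# Sketch of the merge-and-decide step: combine the two AI verification
# scores and pick a branch. The weakest-link rule is an assumption.
def decide(photo_score, signature_score, threshold=0.85):
    confidence = min(photo_score, signature_score)  # weakest-link rule
    return "verified" if confidence >= threshold else "dispute"

print(decide(0.93, 0.97))  # → verified: update ERP, trigger invoicing
print(decide(0.93, 0.60))  # → dispute: create ticket, alert team
```

Taking the minimum means a confident photo check cannot mask a doubtful signature; a weighted average would be the laxer alternative if disputes are too frequent.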
by Aitor | 1Node
This workflow connects JotForm submissions to Vapi AI, triggering a personalized outbound call via an AI voice assistant immediately after a user submits your form.

**Requirements**

JotForm:
- A JotForm account
- JotForm API credentials enabled in n8n
- A published JotForm form with a phone number field

Vapi:
- A Vapi account with credit
- A connected phone number for making calls
- An assistant created and ready for outbound calls
- Your Vapi API key

**Workflow Steps**
1. **JotForm Trigger** - Starts the workflow when a new form submission is received.
2. **Information Extractor** - Formats the phone number with a +, country code, and full number (e.g., +391234567890) for compatibility with Vapi.
3. **Set Fields for Vapi** - Configure these fields:
   - phone_number_id: ID of the Vapi number used for the call
   - assistant_id: ID of the Vapi assistant
   - api_key: Your Vapi API key
4. **Start Outbound Vapi Call** - Sends a POST request to https://api.vapi.ai/call with the formatted phone number, all required Vapi fields, and any additional info mapped from the form for personalization.

**Customization**
- **Add more form fields:** Include extra data (such as name or appointment time) and add it to the Vapi payload.
- **Conditional logic:** Use n8n filter nodes to control if/when calls are made.
- **Dynamic assistant selection:** Route submissions to different assistants or numbers based on user responses.

**Notes**
- Ensure phone numbers are formatted correctly in the extractor node to prevent call errors.
- Any field from your form can be passed in the API payload and used in the assistant's script.

**Need Help?**
For additional resources or help, visit 1 Node.
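The phone formatting and outbound-call request described above can be sketched as follows. The endpoint is from this description; the camelCase body fields (phoneNumberId, assistantId, customer.number) follow Vapi's commonly documented schema but should be verified against the current Vapi API reference, and the IDs are placeholders.

```python
import json

# Hedged sketch of the extractor's formatting and the call request body.
# Body field names are assumptions to verify against Vapi's API docs.
def format_phone(raw, country_code="39"):
    """Strip non-digits, then prefix + and country code (assumes a national number)."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return f"+{country_code}{digits}"

def call_body(phone, phone_number_id, assistant_id):
    return json.dumps({
        "phoneNumberId": phone_number_id,
        "assistantId": assistant_id,
        "customer": {"number": phone},
    })

body = call_body(format_phone("123 456 7890"), "pn_123", "asst_456")
print(body)
# Sent as POST https://api.vapi.ai/call with header
# Authorization: Bearer <your Vapi api_key>
```

Note the formatter assumes the submitted number lacks a country code; if your form already collects E.164 numbers, pass them through unchanged instead.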