by Automate With Marc
# Automatic Personalized Sales Follow-Up with GPT-5, Pinecone, and Tavily Research

## Description
Never let a lead go cold. This workflow automatically sends personalized follow-up emails to every inbound inquiry. It combines GPT-5, Pinecone Vector DB, and Tavily Research to craft responses that align with your brand’s best practices, tone, and the latest product updates. With embedded research tools, every response is both timely and relevant—helping your sales team convert more leads without manual effort.

👉 Watch step-by-step builds of workflows like these on: www.youtube.com/@automatewithmarc

## How It Works
- **Form Trigger** – Captures inbound lead details (name, company, email, and message).
- **AI Sales Agent (GPT-5)** – Researches the lead’s business and problem statement, referencing Pinecone for your brand guidelines and product updates. Uses Tavily research for real-time enrichment.
- **Structured Output Parser** – Ensures the subject line and email body are formatted cleanly in JSON (a sketch of the expected shape appears at the end of this template).
- **Send Follow-Up Email (Gmail Node)** – Delivers a polished, ready-to-go follow-up directly to the lead.
- **Simple Memory** – Maintains context across follow-ups for more natural conversations.

## Why Sales Teams Will Love It
- ⏱ **Faster responses** — every lead gets an immediate, high-quality reply.
- 📝 **On-brand every time** — Pinecone ensures tone matches your playbook.
- 🌍 **Research-driven** — Tavily enriches responses with fresh, relevant context.
- 📈 **Higher conversions** — timely, personalized outreach drives more meetings.
- 🤖 **Hands-off automation** — sales reps focus on closing, not chasing.

## Setup Instructions
1. **Form Trigger** – Configure your inbound form to capture lead details (name, email, company, message). Connect it to this workflow.
2. **Pinecone Setup** – Create a Pinecone index and embed your brand guidelines, sales playbook, and product updates. Update the Pinecone Vector Store node with your index name.
3. **Tavily Setup** – Add your Tavily API key to the Tavily Research node.
4. **OpenAI Setup** – Add your OpenAI API key to the GPT-5 Chat Model node. Adjust the system prompt inside the AI Agent to reflect your company’s style and tone.
5. **Gmail Node** – Connect your Gmail account to the Send Follow-Up Email node. Update sender details if you want the emails to come from a shared inbox or a rep’s personal account.

## Customization
- **Tone of Voice** – Modify the system prompt inside the AI Agent to be more professional, casual, or industry-specific.
- **Scheduling Links** – Replace the default Calendly link with your own booking tool.
- **Form Fields** – Add or remove fields depending on the information you collect (e.g., budget, role, region).

## Requirements
- Gmail account (for sending follow-up emails)
- OpenAI API key (GPT-5)
- Pinecone account (for storing/retrieving guidelines + updates)
- Tavily API key (for online research enrichment)
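For reference, here is a minimal sketch of the kind of JSON the Structured Output Parser is expected to enforce before the Gmail node runs. The `subject` and `body` field names are assumptions, so match them to the schema you actually define in the parser node.

```js
// Minimal sketch of the parsed agent output; "subject" and "body" are
// assumed field names, not the template's guaranteed schema.
const raw = '{"subject":"Quick follow-up on your inquiry","body":"Hi Jane, thanks for reaching out..."}';

const parsed = JSON.parse(raw);
if (typeof parsed.subject !== 'string' || typeof parsed.body !== 'string') {
  throw new Error('Agent output did not match the expected schema');
}

// Downstream, the Gmail node can then reference these fields as
// {{ $json.subject }} and {{ $json.body }}.
console.log(parsed.subject);
```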
by Sk developer
This workflow fetches free Udemy courses hourly via the Udemy Coupons and Courses API on RapidAPI, filters them, and updates a Google Sheet. It sends alerts on errors for smooth monitoring.

## Node-by-Node Explanation
- **Schedule Trigger**: Runs the workflow every hour automatically.
- **Fetch Udemy Coupons**: Sends a POST request to the Udemy Coupons and Courses API on RapidAPI to get featured courses.
- **Check API Success**: Verifies whether the API response is successful and routes accordingly.
- **Filter Free Courses**: Selects only courses with a sale_price of zero (free courses).
- **Send Error Notification**: Emails the admin if the API fetch fails, for quick action.
- **Sync Courses to Google Sheet**: Appends or updates the filtered free courses in Google Sheets.

## Google Sheets Columns
id, name, price, sale_price, image, lectures, views, rating, language, category, subcategory, slug, store, sale_start

## Google Sheets Setup & Configuration Steps
1. **Create Google Sheet**: Create or open a Google Sheet where you want to sync courses.
2. **Set Headers**: Add column headers matching the synced fields (id, name, price, etc.).
3. **Enable Google Sheets API**: In the Google Cloud Console, enable the Google Sheets API for your project.
4. **Create Service Account**: In the Google Cloud Console, create a Service Account with editor access.
5. **Download Credentials**: Download the JSON credentials file for the service account.
6. **Share Sheet**: Share your Google Sheet with the Service Account email (found in the JSON file).
7. **Configure n8n Google Sheets Node**: Use the service account credentials, set the operation to “Append or Update”, and provide the sheet URL and sheet name or gid.
8. **Match Columns**: Map the course fields to your sheet columns and set id as the unique key for updates.

## How to Obtain a RapidAPI Key & Set Up the API Request
1. **Sign up/Log in**: Visit the RapidAPI Udemy Coupons and Courses API page and create an account or log in.
2. **Subscribe to the API**: Subscribe to the Udemy Coupons and Courses API plan (free or paid).
3. **Get the API Key**: Navigate to your dashboard and copy your x-rapidapi-key.
4. **Configure the HTTP Request node**:
   - Set the method to POST.
   - URL: https://udemy-coupons-and-courses.p.rapidapi.com/featured.php
   - Add headers: x-rapidapi-host: udemy-coupons-and-courses.p.rapidapi.com and x-rapidapi-key: your copied API key.
   - Set the content type to multipart/form-data.
   - Add the body parameter page=1 (or as needed).
5. **Test the API**: Run the node to ensure the API responds with data before continuing workflow setup (a hedged request sketch appears at the end of this template).

## Use Cases & Benefits
- Automates daily updates of free Udemy courses in your sheet using the Udemy Coupons and Courses API on RapidAPI.
- Saves manual effort in tracking coupons and deals.
- Enables quick error alerts to maintain data accuracy.
- Ideal for course aggregators, affiliate marketers, or learning platforms needing fresh course data.

## Who This Workflow Is For
- Content curators and edtech platforms tracking free courses.
- Affiliate marketers promoting Udemy deals.
- Anyone needing real-time access to updated free Udemy coupons.
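As a sanity check outside n8n, the request above can be reproduced in a few lines of Node (18+, which ships global fetch and FormData). The response field names used here (`courses`, `sale_price`) are assumptions; inspect a real response before mapping your sheet columns.

```js
// Hedged sketch of the RapidAPI request described above.
const form = new FormData();
form.append('page', '1');

const res = await fetch('https://udemy-coupons-and-courses.p.rapidapi.com/featured.php', {
  method: 'POST',
  headers: {
    'x-rapidapi-host': 'udemy-coupons-and-courses.p.rapidapi.com',
    'x-rapidapi-key': process.env.RAPIDAPI_KEY, // your copied key
    // Content-Type is set automatically by fetch (multipart/form-data with boundary)
  },
  body: form,
});

const data = await res.json();
// Mirrors the "Filter Free Courses" node: keep only sale_price === 0.
// "courses" as the payload key is an assumption.
const freeCourses = (data.courses ?? []).filter((c) => Number(c.sale_price) === 0);
console.log(`Found ${freeCourses.length} free courses`);
```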
by Msaid Mohamed el hadi
## Overview
This workflow automates the discovery, extraction, enrichment, and storage of business information from Google Maps search queries using AI tools, scrapers, and Google Sheets.

It is ideal for:
- Lead generation agencies
- Local business researchers
- Digital marketing firms
- Automation & outreach specialists

## 🔧 Tools & APIs Used
- **Google Maps Search** (via HTTP)
- **Custom JavaScript Parsing**
- **URL Filtering & De-duplication**
- **Google Sheets** (Read/Write)
- **APIFY Actor** for business scraping
- **LangChain AI Agent** (OpenRouter - Gemini 2.5)
- **n8n Built-in Logic** (Loops, Conditions, Aggregators)

## 🧠 Workflow Summary
1. **Trigger** – The automation starts on a schedule (every hour).
2. **Read Queries from Google Sheet** – Loads unprocessed keywords from a Google Sheet tab named keywords.
3. **Loop Through Keywords** – Each keyword is used to search Google Maps for relevant businesses.
4. **Extract URLs** – JavaScript parses the HTML to find all external website URLs in the search results.
5. **Clean URLs** – Filters out irrelevant domains (e.g., Google-owned, example.com, etc.) and removes duplicates (a cleaning sketch appears at the end of this template).
6. **Loop Through URLs** – For each URL:
   - Checks if it already exists in the Google Sheet (to prevent duplication).
   - Calls the APIFY Actor to extract full business data.
   - Optionally uses the AI Agent (Gemini) to provide detailed insight on the business, including services, about, market position, weaknesses, AI suggestions, etc.
   - Converts the AI result (text) to a structured JSON object.
7. **Save to Google Sheet** – Adds all extracted and AI-enriched business information to a separate tab (Sheet1).
8. **Mark Queries as Processed** – Updates the original row in keywords to avoid reprocessing.

## 🗃️ Output Fields Saved
The following information is saved per business:
- Business Name, Website, Email, Phone
- Address, City, Postal Code, Country, Coordinates
- Category, Subcategory, Services
- About Us, Opening Hours, Social Media Links
- Legal Links (Privacy, Terms)
- Logo, Languages, Keywords
- AI-Generated Description
- Google Maps URL

## 📈 Use Cases
- Build a prospect database for B2B cold outreach.
- Extract local SEO insights per business.
- Feed CRMs or analytics systems with enriched business profiles.
- Automate market research for regional opportunity detection.

## 📩 Want a Similar Workflow?
If you’d like a custom AI-powered automation like this for your business or agency, feel free to contact me:
📧 msaidwolfltd@gmail.com
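The "Clean URLs" step can be sketched as below. The blocklist is illustrative, not the template's exact list; extend it with whatever domains pollute your results.

```js
// Minimal sketch of URL filtering and de-duplication (assumed blocklist).
const BLOCKED = ['google.com', 'gstatic.com', 'googleusercontent.com', 'example.com'];

function cleanUrls(urls) {
  const seen = new Set();
  return urls.filter((raw) => {
    try {
      const { hostname } = new URL(raw);
      if (BLOCKED.some((d) => hostname === d || hostname.endsWith('.' + d))) return false;
      // De-duplicate by hostname so each business site is processed once
      if (seen.has(hostname)) return false;
      seen.add(hostname);
      return true;
    } catch {
      return false; // drop malformed URLs
    }
  });
}

console.log(cleanUrls([
  'https://maps.google.com/place/x',
  'https://acme-plumbing.de/',
  'https://acme-plumbing.de/contact',
])); // → ['https://acme-plumbing.de/']
```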
by Sulieman Said
## How it Works
This workflow automates the process of discovering companies in different cities, extracting their contact data, and storing it in Airtable.

1. **City Loop (Airtable → Google Maps API)**
   - Reads a list of cities from Airtable.
   - Uses each city combined with a search term (e.g., SEO Agency, Berlin) to query Google Maps.
   - Marks processed cities as “checked” to allow safe restarts if interrupted.
2. **Business Discovery & Deduplication**
   - Searches for businesses via Google Maps Text Search (a hedged request sketch appears at the end of this template).
   - Checks Airtable to avoid scraping the same company multiple times.
   - Fetches detailed info for each business via the Google Maps Place Details API.
3. **Impressum Extraction (Website → HTML Parsing)**
   - Builds an Impressum page URL for each business.
   - Requests the HTML and cleans out ads, headers, footers, etc.
   - Extracts relevant contact info using an AI extractor (OpenAI node).
4. **Contact Information Extraction** – Pulls out:
   - Decision Maker (name + position in one string, if available).
   - Email address (must be valid, containing @).
   - Phone number (international format if possible).
   - Filters out incomplete results (e.g., empty email).
5. **Database Storage** – Writes company data back into Airtable: company name, address, website, email, phone number, decision maker (name + position), and the search term & city used.

## Setup Steps
1. **Prerequisites**
   - Google Maps API key with access to the Places API (Text Search + Place Details).
   - Airtable base with at least two tables: Cities (with columns: ID, City, Country, Status) and Companies (for scraped results).
   - OpenAI API key (for decision maker + contact extraction).
2. **Authentication**
   - Configure your Airtable API credentials in n8n.
   - Set up HTTP Query Auth with your Google Maps API key.
   - Add your OpenAI API key in the OpenAI Chat node.
3. **Configuration**
   - In the Airtable “Cities” table, list all cities you want to scrape.
   - Define your search term in the “Execute Workflow” node (e.g., SEO Agency).
   - Adjust the batch sizes and wait intervals if you want faster/slower scraping (the Google API has strict rate limits).
4. **Execution**
   - Start manually or from another workflow.
   - The workflow scrapes all companies in each city step by step.
   - It can be safely stopped and resumed — cities already marked as processed will be skipped.
5. **Results** – An enriched company dataset stored in Airtable, ready for CRM import, lead generation, or further automation.

## Tips & Notes
- Always respect GDPR and local laws when handling scraped data.
- The workflow is modular → you can swap Airtable for Google Sheets, Notion, or a database of your choice.
- Add custom filters to limit results (e.g., only companies with websites).
- Use sticky notes inside the workflow to understand each step (mandatory for template publishing).
- **Keep an eye on Google Places API costs** — queries are billed after the free quota. If you are still within the first 2 months of the Google Cloud Developer free trial, you can benefit from free credits.

Questions or custom requests? 📩 suliemansaid.business@gmail.com
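The Text Search → Place Details sequence performed by the HTTP nodes looks roughly like this (legacy Places API endpoints; the `fields` selection is an assumption, and trimming it is how you control billing).

```js
// Hedged sketch of the two Places API calls the workflow chains together.
const KEY = process.env.GOOGLE_MAPS_API_KEY;

async function findBusinesses(term, city) {
  const q = new URLSearchParams({ query: `${term}, ${city}`, key: KEY });
  const search = await fetch(`https://maps.googleapis.com/maps/api/place/textsearch/json?${q}`)
    .then((r) => r.json());

  const details = [];
  for (const place of search.results ?? []) {
    const p = new URLSearchParams({
      place_id: place.place_id,
      fields: 'name,formatted_address,website,international_phone_number', // assumed selection
      key: KEY,
    });
    const d = await fetch(`https://maps.googleapis.com/maps/api/place/details/json?${p}`)
      .then((r) => r.json());
    details.push(d.result);
  }
  return details;
}

findBusinesses('SEO Agency', 'Berlin').then(console.log);
```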
by Cheng Siong Chin
## Introduction
Automate peer review assignment and grading with AI-powered evaluation. Designed for educators managing collaborative assessments efficiently.

## How It Works
A webhook receives assignments and distributes them, AI generates review rubrics, reviewers are emailed, responses are collected, scores are calculated, results are stored, reports are emailed, dashboards are updated, and analytics are posted to Slack.

## Workflow Template
Webhook → Store Assignment → Distribute → Generate Review Rubric → Notify Slack → Email Reviewers → Prepare Response → Calculate Score → Store Results → Check Status → Generate Report → Email Report → Update Dashboard → Analytics → Post to Slack → Respond to Webhook

## Workflow Steps
1. **Receive & Store**: Webhook captures assignments and stores the data.
2. **Distribute & Generate**: Assigns peer reviewers; AI creates rubrics.
3. **Notify & Email**: Alerts via Slack, sends review requests.
4. **Collect & Score**: Gathers responses and calculates peer scores (a scoring sketch appears at the end of this template).
5. **Report & Update**: Generates reports, emails results, updates the dashboard.
6. **Analyze & Alert**: Posts analytics to Slack and confirms completion.

## Setup Instructions
1. **Webhook & Storage**: Configure the endpoint, set up the database.
2. **AI Configuration**: Add your OpenAI key, customize the rubric prompts.
3. **Communication**: Connect Gmail and Slack credentials.
4. **Dashboard**: Link your analytics platform and configure metrics.

## Prerequisites
- OpenAI API key
- Gmail account
- Slack workspace
- Database or storage system
- Dashboard tool

## Use Cases
- University peer review assignments
- Corporate training evaluations
- Research paper assessments

## Customization
- Multi-round review cycles
- Custom scoring algorithms
- LMS integration (Canvas, Moodle)

## Benefits
- Eliminates manual distribution
- Ensures consistent evaluation
- Provides instant feedback and analytics
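A minimal sketch of what a "Calculate Score" step might do, assuming each submission receives several rubric-based reviews. The criterion names and the flat averaging are assumptions; the template explicitly supports custom scoring algorithms.

```js
// Assumed shape: each review carries per-criterion scores on a 1–5 scale.
function calculatePeerScore(reviews) {
  if (reviews.length === 0) return null;
  const perReview = reviews.map((r) => {
    // Average the criterion scores within a single review
    const values = Object.values(r.criteria);
    return values.reduce((a, b) => a + b, 0) / values.length;
  });
  // Average across reviewers for the final peer score
  return perReview.reduce((a, b) => a + b, 0) / perReview.length;
}

console.log(calculatePeerScore([
  { reviewer: 'A', criteria: { clarity: 4, rigor: 5, originality: 3 } },
  { reviewer: 'B', criteria: { clarity: 5, rigor: 4, originality: 4 } },
])); // → 4.1666…
```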
by Jose Cuartas
# Sync Gmail emails to PostgreSQL with S3 attachment storage
An automated Gmail email processing system.

## Who's it for
Businesses and individuals who need to:
- Archive email communications in a searchable database
- Back up email attachments to cloud storage
- Analyze email patterns and communication data
- Comply with data retention policies
- Integrate emails with other business systems

## What it does
This workflow automatically captures, processes, and stores Gmail emails in a PostgreSQL database while uploading file attachments to S3/MinIO storage. It handles both individual emails (via Gmail Trigger) and bulk processing (via Schedule Trigger).

Key features:
- Dual processing: real-time individual emails + scheduled bulk retrieval
- Complete email metadata extraction (sender, recipients, labels, timestamps)
- HTML to plain text conversion for searchable content
- Binary attachment processing with metadata extraction
- Organized S3/MinIO file storage structure
- UPSERT database operations to prevent duplicates

## How it works
1. **Email Capture**: The Gmail Trigger detects new emails; the Schedule Trigger fetches bulk emails from the last hour.
2. **Parallel Processing**: Emails with attachments go through binary processing; others go directly to transformation.
3. **Attachment Handling**: Extract metadata, upload to S3/MinIO, create database references.
4. **Data Transformation**: Convert the Gmail API format to the PostgreSQL structure.
5. **Storage**: UPSERT emails into the database with linked attachment information.

## Requirements
Credentials needed:
- Gmail OAuth2 (gmail.readonly scope)
- PostgreSQL database connection
- S3/MinIO storage credentials

Database setup: Run the provided SQL schema to create the messages table with JSONB fields for flexible data storage (a hedged sketch appears at the end of this template).

## How to set up
1. **Gmail OAuth2**: Enable the Gmail API in the Google Cloud Console and create OAuth2 credentials.
2. **PostgreSQL**: Create the database and run the SQL schema provided in the setup sticky note.
3. **S3/MinIO**: Create a bucket named "gmail-attachments" with proper upload permissions.
4. **Configure**: Update authenticatedUserEmail in the transform scripts to your email.
5. **Test**: Start with a single email before enabling bulk processing.

## How to customize
- **Email filters**: Modify the Gmail queries (in:sent, in:inbox) to target specific emails.
- **Storage structure**: Change the S3 file path format in the Upload node.
- **Processing schedule**: Adjust trigger frequencies based on email volume.
- **Database fields**: Extend the PostgreSQL schema for additional metadata.
- **Attachment types**: Add file type filtering in the binary processing logic.

Note: This workflow processes emails from the last hour to avoid overwhelming the system. Adjust timeframes based on your email volume and processing needs.
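For orientation, here is a hedged sketch of the database layer using the `pg` client. The column names and table shape below are assumptions; the SQL schema shipped in the workflow's setup sticky note is the source of truth.

```js
// Hedged sketch: JSONB columns for flexible metadata plus an UPSERT keyed on
// the Gmail message id, so reruns of the bulk trigger cannot create duplicates.
import pg from 'pg';

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });

await pool.query(`
  CREATE TABLE IF NOT EXISTS messages (
    gmail_id    TEXT PRIMARY KEY,   -- Gmail message id, used as the UPSERT key
    subject     TEXT,
    body_text   TEXT,               -- HTML converted to searchable plain text
    headers     JSONB,              -- sender, recipients, labels, timestamps
    attachments JSONB,              -- references to files uploaded to S3/MinIO
    received_at TIMESTAMPTZ
  );
`);

await pool.query(
  `INSERT INTO messages (gmail_id, subject, body_text, headers, attachments, received_at)
   VALUES ($1, $2, $3, $4, $5, $6)
   ON CONFLICT (gmail_id) DO UPDATE
     SET subject = EXCLUDED.subject, body_text = EXCLUDED.body_text,
         headers = EXCLUDED.headers, attachments = EXCLUDED.attachments`,
  ['18c2a0f...', 'Invoice #42', 'Hi team...', {}, [], new Date()]
);
```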
by Muhammad Bello
## Description
This n8n template demonstrates how to turn raw YouTube comments into research-backed content ideas complete with hooks and outlines.

Use cases include:
- Quickly mining a competitor’s audience for video ideas.
- Generating hooks and outlines for your own channel’s comments.
- Validating content opportunities with live audience feedback.

## Good to know
- Apify is used to scrape YouTube comments (requires an API token).
- GPT-4.1-mini is used for both filtering and content generation.
- Tavily provides fresh research to ground the AI’s responses.
- All outputs are stored in Google Sheets, making it easy to manage and track ideas.

## How it works
1. **Trigger** – Paste a YouTube URL into the chat trigger.
2. **Scrape Comments** – Apify fetches all comments and metadata.
3. **Filter** – GPT-4.1-mini decides if each comment could inspire a content idea.
4. **Store** – Comments and “Yes/No” decisions are appended to Google Sheets.
5. **Research & Enrich** – For “Yes” comments, Tavily provides context, and GPT generates a topic, hook, and outline.
6. **Update Sheet** – The same row in Google Sheets is updated with the enriched fields.

## Google Sheets Setup
Your Google Sheet should include these columns (in this order):

id | text | author | likes | isIdea | topic | research | hook | outline

- **id** – unique identifier for each comment
- **text** – the full YouTube comment
- **author** – commenter’s name/handle
- **likes** – number of likes on the comment
- **isIdea** – “Yes” or “No” depending on the GPT filter
- **topic** – extracted video topic
- **research** – 300–500 word background from Tavily
- **hook** – engaging opening sentence for a video
- **outline** – structured video outline

## Setup Steps
1. Connect your Apify, OpenAI, Tavily, and Google Sheets credentials in n8n.
2. Point the Google Sheets nodes to your own document and ensure the above headers exist.
3. Replace the sample API keys with your own, stored in n8n Credentials.

Time to set up: ~15–25 minutes for a first-time n8n user (less if you already have credentials handy).

## Customizing this workflow
- **Filter logic** – Loosen the GPT filter to allow borderline ideas, or tighten it to only accept the best ones (a filter sketch appears at the end of this template).
- **Research depth** – Change Tavily’s search depth (e.g., depth: basic vs depth: advanced) to control how detailed the background research is.
- **Notification channels** – Send new “Yes” ideas directly to **Slack** (#content-ideas), **Notion** (your content board), or **Email** (notify the content manager instantly).
- **Alternative outputs** – Instead of hooks/outlines, generate:
  - A script draft for YouTube Shorts.
  - Blog post angles based on the same audience comments.
  - A poll question for community engagement.
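The Yes/No filter step can be approximated with a direct call to the OpenAI Chat Completions API. The prompt wording here is an assumption; rewriting it is exactly how you loosen or tighten the filter as described above.

```js
// Hedged sketch of the "Filter" step feeding the isIdea column.
async function isIdea(comment) {
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4.1-mini',
      messages: [
        // Assumed prompt; edit to accept borderline ideas or only the best ones
        { role: 'system', content: 'Answer only "Yes" or "No": could this YouTube comment inspire a video idea?' },
        { role: 'user', content: comment },
      ],
      max_tokens: 3,
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim(); // "Yes" or "No"
}

console.log(await isIdea('Can you do a deep dive on n8n error handling?'));
```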
by Oneclick AI Squad
This n8n workflow monitors and alerts you about new construction projects in specified areas, helping you track competing builders and identify business opportunities. The system automatically searches multiple data sources and sends detailed email reports with upcoming projects.

## Good to know
- Email parsing accuracy depends on the consistency of request formats - use the provided template for best results.
- The workflow includes fallback mock data for demonstration when external APIs are unavailable.
- Government data sources may have rate limits - the workflow includes proper error handling.
- Results are filtered to show only upcoming/recent projects (within 3 months).

## How it works
1. **Email Trigger** – Detects new email requests with "Construction Alert Request" in the subject line.
2. **Check Email Subject** – Validates that the email contains the correct trigger phrase.
3. **Extract Location Info** – Parses the email body to extract area, city, state, and zip code information.
4. **Search Government Data** – Queries government databases for public construction projects and permits.
5. **Search Construction Sites** – Searches construction industry databases for private projects.
6. **Process Construction Data** – Combines and filters results from both sources, removing duplicates.
7. **Wait For Data** – Waits until the combined and filtered results are ready before proceeding.
8. **Check If Projects Found** – Determines whether to send a results report or a no-results notification.
9. **Generate Email Report** – Creates a professional HTML email with project details and summaries.
10. **Send Alert Email** – Delivers the construction project report to the requester.
11. **Send No Results Email** – Notifies the requester when no projects are found in the specified area.

The workflow also includes a Schedule Trigger that can run automatically on weekdays at 9 AM for regular monitoring.

## Email Format Examples

Input email format:

```text
To: alerts@yourcompany.com
Subject: Construction Alert Request

Area: Downtown Chicago
City: Chicago
State: IL
Zip: 60601

Additional notes: Looking for commercial projects over $1M
```

Alternative format:

```text
To: alerts@yourcompany.com
Subject: Construction Alert Request

Please search for construction projects in Miami, FL 33101
Focus on residential and mixed-use developments.
```

Output email example:

```text
Subject: 🏗️ Construction Alert: 8 Projects Found in Downtown Chicago

🏗️ Construction Project Alert Report
Search Area: Downtown Chicago
Report Generated: August 4, 2024, 2:30 PM

📊 Summary
Total Projects Found: 8
Search Query: Downtown Chicago IL construction permits

🔍 Upcoming Construction Projects

New Commercial Complex - Downtown Chicago
📍 Location: Downtown Chicago | 📅 Start Date: March 2024 | 🏢 Type: Mixed Development
Description: Mixed-use commercial and residential development
Source: Local Planning Department

Office Building Construction - Chicago
📍 Location: Chicago, IL | 📅 Start Date: April 2024 | 🏢 Type: Commercial
Description: 5-story office building with retail space
Source: Building Permits

[Additional projects...]
```
```text
💡 Next Steps
• Review each project for potential competition
• Contact project owners for partnership opportunities
• Monitor progress and timeline changes
• Update your competitive analysis
```

## How to use

Setup instructions:
1. Import the workflow into your n8n instance.
2. Configure email credentials: set up IMAP credentials for receiving emails and SMTP credentials for sending alerts.
3. Test the workflow with a sample email.
4. Set up scheduling (optional) for automated daily checks.

Sending alert requests:
1. Send an email to your configured address.
2. Use "Construction Alert Request" in the subject line.
3. Include location details in the email body.
4. Receive detailed project reports within minutes.

## Requirements
- **n8n instance** (cloud or self-hosted)
- **Email account** with IMAP/SMTP access
- **Internet connection** for API calls to construction databases
- **Valid email addresses** for sending and receiving alerts

## API Integration Code Examples

Government data API integration:

```js
// Example API call to the USA.gov jobs API. Note: fetch has no "params"
// option, so the query string is built with URLSearchParams.
const searchGovernmentProjects = async (location) => {
  const params = new URLSearchParams({
    keyword: 'construction permit',
    location_name: location,
    size: 20,
  });
  const response = await fetch(`https://api.usa.gov/jobs/search.json?${params}`, {
    method: 'GET',
    headers: { 'Content-Type': 'application/json' },
  });
  return await response.json();
};
```

Construction industry API integration:

```js
// Example API call to construction databases
const searchConstructionProjects = async (area) => {
  const params = new URLSearchParams({
    q: `${area} construction projects`,
    type: 'projects',
    limit: 15,
  });
  const response = await fetch(`https://www.construction.com/api/search?${params}`, {
    method: 'GET',
    headers: {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept': 'application/json',
    },
  });
  return await response.json();
};
```

Email processing function:

```js
// Extract location from email content
const extractLocationInfo = (emailBody) => {
  const lines = emailBody.split('\n');
  let area = '', city = '', state = '', zipcode = '';
  for (const line of lines) {
    if (line.toLowerCase().includes('area:')) area = line.split(':')[1]?.trim() ?? '';
    if (line.toLowerCase().includes('city:')) city = line.split(':')[1]?.trim() ?? '';
    if (line.toLowerCase().includes('state:')) state = line.split(':')[1]?.trim() ?? '';
    if (line.toLowerCase().includes('zip:')) zipcode = line.split(':')[1]?.trim() ?? '';
  }
  return { area, city, state, zipcode };
};
```

## Customizing this workflow

Adding new data sources:
1. Add HTTP Request nodes for additional APIs.
2. Update the Process Construction Data node to handle the new data formats.
3. Modify the search parameters based on API requirements.

Enhanced email parsing:

```js
// Custom email parsing for different formats
const parseEmailContent = (emailBody) => {
  // Regex patterns for different email formats
  const patterns = {
    address: /(\d+\s+[\w\s]+,\s*[\w\s]+,\s*[A-Z]{2}\s*\d{5})/,
    coordinates: /(-?\d+\.\d+),\s*(-?\d+\.\d+)/,
    zipcode: /\b\d{5}(-\d{4})?\b/,
  };
  // Extract using multiple patterns
  // Implementation details...
};
```

Custom alert conditions: modify the Check If Projects Found node to filter by:
- Project value/budget
- Project type (residential, commercial, etc.)
- Distance from your location
- Timeline criteria

Advanced scheduling:

```js
// Set up multiple schedule triggers for different areas
const scheduleConfigs = [
  { area: 'Downtown', cron: '0 9 * * 1-5' },   // Weekdays 9 AM
  { area: 'Suburbs', cron: '0 14 * * 1,3,5' }, // Mon, Wed, Fri 2 PM
  { area: 'Industrial', cron: '0 8 * * 1' },   // Monday 8 AM
];
```

Integration with CRM systems: add HTTP Request nodes to automatically create leads in your CRM when high-value projects are found:

```js
// Example CRM integration
const createCRMLead = async (project) => {
  await fetch('https://your-crm.com/api/leads', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_TOKEN',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      name: project.title,
      location: project.location,
      value: project.estimatedValue,
      source: 'Construction Alert System',
    }),
  });
};
```

## Troubleshooting
- **No emails received**: Check IMAP credentials and email filters.
- **Empty results**: Verify API endpoints and add fallback data sources.
- **Failed email delivery**: Confirm SMTP settings and recipient addresses.
- **API rate limits**: Implement delays between requests and error handling.
by Ninmegne Paul
## 🔧 How it works
1. **Scheduled Trigger** – The workflow is triggered automatically every day at 12:00 PM using a Cron node.
2. **RSS Feed Collection** – It fetches the latest content from multiple RSS feeds related to Technology, Manga, and Movies.
3. **Content Processing & Formatting** – The collected data is filtered and organized based on your interests. A dynamic HTML email template is generated to present the content in a clean and readable layout (a template sketch appears below).
4. **Email Delivery** – The final newsletter is sent directly to your inbox using the Send Email node.

## ⚙️ Set up steps
1. **Configure RSS Sources** – Update the RSS feed URLs inside the Set nodes to match your preferred sources.
2. **Set Email Recipient** – Replace the email address in the Send Email node with your own.
3. **Adjust Schedule** – Modify the execution time in the Cron Trigger node if you want the newsletter to be sent at a different time.
4. **Activate the Workflow** – Enable the workflow to start receiving your personalized daily newsletter automatically.
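A minimal sketch of the HTML-building step, assuming each feed item carries the `title`, `link`, and `contentSnippet` fields that n8n's RSS node usually emits; adjust the field names to your feeds.

```js
// Builds one HTML section per category (Technology, Manga, Movies, ...).
function buildNewsletterHtml(sections) {
  const blocks = sections.map(({ name, items }) => `
    <h2>${name}</h2>
    <ul>
      ${items.map((i) => `<li><a href="${i.link}">${i.title}</a><br>${i.contentSnippet ?? ''}</li>`).join('')}
    </ul>`);
  return `<html><body><h1>Your Daily Digest</h1>${blocks.join('')}</body></html>`;
}

console.log(buildNewsletterHtml([
  { name: 'Technology', items: [{ title: 'Example post', link: 'https://example.com' }] },
]));
```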
by Carl Fung
## ✨ Intro
This workflow shows how to go beyond a “plain” AI chatbot by:
- 🧠 **Adding a Personality Layer** — Link an extra LLM to inject a custom tone and style. Here, it’s Nova, a sassy, high-fashion assistant. You can swap in any personality without changing the main logic (a persona-prompt sketch appears at the end of this template).
- 🎨 **Custom Styling with CSS** — Easily restyle the chatbot to match your brand or project theme.

Together, these make your bot smart, stylish, and uniquely yours.

## ⚙️ How it Works
- 📥 **Route Input** – The chat trigger sends messages to a Switch. If a Telegram video note exists → runs the audio path. Otherwise → runs the text path.
- 🎤 **Audio Path** – Telegram Get a File → OpenAI Speech-to-Text → pass the transcript to the agent.
- 💬 **Text Path** – Chat text is normalized and sent to the agent.
- 🛠 **Agent Brain** – Uses tools like Gmail 📧, Google Calendar 📅, Google Drive 📂, Airtable 📋, SerpAPI 🌐, Wikipedia 📚, Hacker News 📰, and Calculator ➗.
- 🧾 **Memory** – Keeps the last 20 messages for context-aware replies.
- 💅 **Optional Personality Polish** – An LLM Chain adds a witty or cheeky tone on top of the agent’s response.

## 🛠 Setup Steps
- ⏱ **Time Required**: ~10–15 minutes (+5 minutes for each Google/Airtable connection).
- 🔑 **Connect Credentials**: OpenAI (and/or Anthropic), Telegram Bot, Gmail, Google Calendar, Google Drive, Airtable, SerpAPI.
- 📌 **Configure IDs**: Set the Airtable base/table, set the Calendar email, and adjust the Drive search query defaults if needed.
- 🎙 **Voice Optional**: Disable the Telegram + Transcribe nodes if you only want text chat.
- 🎭 **Choose Tone**: Edit the Chat Trigger’s welcome text/CSS for a custom look, or disable the persona chain for a neutral voice.
- 🚀 **Publish**: Activate the workflow and share the chat URL.

💡 Detailed behavior notes are available as sticky notes inside the workflow.
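The personality layer amounts to passing the agent's plain answer through a second LLM call with a persona system prompt. The prompt text for "Nova" below is an illustration, not the template's exact wording; swapping personas means editing only this prompt, while the agent logic stays untouched.

```js
// Hedged sketch of the "Optional Personality Polish" chain.
const personaPrompt = `You are Nova, a sassy, high-fashion assistant.
Rewrite the following answer in your voice. Keep every fact intact;
change only the tone. Be witty, never rude.`;

async function addPersonality(plainAnswer) {
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4o-mini', // assumed model; any chat model works here
      messages: [
        { role: 'system', content: personaPrompt },
        { role: 'user', content: plainAnswer },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```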
by Davide
This workflow automates the process of generating advertising (ADV) images from multiple reference images with the Seedream v4 AI model and publishing them directly to social media (Instagram and Facebook, via Upload-Post). The process is triggered by a user submitting a web form with a text prompt and up to 6 reference images.

## Key Advantages
- ✅ **Automated Image Creation** – Generates high-quality, consistent visuals from multiple references without manual editing.
- ✅ **Seamless Social Media Publishing** – Automatically posts to Instagram and Facebook with minimal effort.
- ✅ **SEO-Optimized Titles** – Ensures your posts get better reach with AI-generated, keyword-friendly titles.
- ✅ **Scalable Workflow** – Can be triggered manually, on a schedule, or via form submissions.
- ✅ **Time-Saving** – Reduces manual steps from design to publishing, enabling faster content production.
- ✅ **Multi-Platform Support** – Easily extendable to other platforms (TikTok, LinkedIn, etc.) with the Upload-Post API.

## How It Works
1. **Form Trigger**: A user fills out a form with a "Prompt" (text description) and a list of "Reference images" (comma-separated URLs).
2. **Data Processing**: The workflow converts the submitted image URL string into a proper array for the AI API.
3. **AI Image Generation**: The workflow sends the prompt and image URLs to the fal.ai API (specifically, the ByteDance seedream model) to generate a new, consistent image.
4. **Status Polling**: It periodically checks the status of the AI job until the image generation is COMPLETED (a polling sketch appears at the end of this template).
5. **Result Retrieval**: Once complete, it fetches the URL of the generated image and downloads the image file itself.
6. **SEO Title Generation**: The original user prompt is sent to OpenAI's GPT-4o-mini model to generate an optimized, engaging social media title.
7. **Cloud Backup**: The generated image is uploaded to a specified Google Drive folder for storage.
8. **Social Media Posting**: Finally, the workflow posts the downloaded image file to both Instagram and Facebook via the Upload-Post.com API, using the AI-generated title.

## Set Up Steps
To make this workflow functional, you need to configure several third-party services and their corresponding credentials within n8n.

1. **Obtain a fal.ai API Key**
   - Create an account at fal.ai and locate your API key in your account settings.
   - In the "Create Video" and "Get status" nodes, edit the HTTP Header Auth credentials. Set the Header Name to Authorization and the Value to Key YOUR_FAL_AI_API_KEY.
2. **Configure the Upload-Post.com API**
   - Create an account at Upload-Post.com and get your API key.
   - Create a profile within the Upload-Post app (e.g., test1); this profile manages your social account connections.
   - In both the "Post to Instagram" and "Post to Facebook" nodes, edit the HTTP Header Auth credentials. Set the Header Name to Authorization and the Value to Apikey YOUR_UPLOAD_POST_API_KEY.
   - Crucially, in the same nodes, find the user parameter in the body and replace the placeholder YOUR_USERNAME with the profile name you created (e.g., test1).
3. **Configure OpenAI/OpenRouter (optional, for title generation)**
   - The "Generate title" node uses an OpenAI-compatible API; the provided example uses OpenRouter.
   - Ensure you have valid credentials (e.g., for OpenRouter or directly for OpenAI) configured in n8n and selected in this node.
4. **Configure Google Drive (optional, for backup)**
   - The "Upload Image" node requires Google OAuth credentials. Set up a Google Cloud project, enable the Drive API, and create OAuth 2.0 credentials in the n8n settings.
   - Authenticate and select the desired destination folder in the node's parameters.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
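The submit-and-poll pattern implemented by the "Create Video" and "Get status" nodes looks roughly like this. The endpoint path is a placeholder and the `status_url`/`response_url` fields reflect my understanding of fal.ai's queue responses; verify both against the fal.ai documentation for the seedream model before relying on them.

```js
// Hedged sketch of fal.ai's queue-style API: submit, poll, fetch result.
const FAL_HEADERS = {
  Authorization: `Key ${process.env.FAL_API_KEY}`,
  'Content-Type': 'application/json',
};

async function generateImage(prompt, imageUrls) {
  // Submit the job; a queue API returns immediately with polling URLs
  const submit = await fetch('https://queue.fal.run/<seedream-model-path>', { // placeholder path
    method: 'POST',
    headers: FAL_HEADERS,
    body: JSON.stringify({ prompt, image_urls: imageUrls }),
  }).then((r) => r.json());

  // Poll until the job reports COMPLETED
  for (;;) {
    const status = await fetch(submit.status_url, { headers: FAL_HEADERS })
      .then((r) => r.json());
    if (status.status === 'COMPLETED') break;
    await new Promise((resolve) => setTimeout(resolve, 5000)); // wait 5 s between checks
  }

  // Fetch the final result, which contains the generated image URL
  return fetch(submit.response_url, { headers: FAL_HEADERS }).then((r) => r.json());
}
```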
by Ruthwik
# 🚀 AI-Powered WhatsApp Customer Support for Shopify Brands
This n8n template builds a WhatsApp support copilot that answers **order status** and **product availability** questions from Shopify using LLM "agents," then replies to the customer in WhatsApp or routes to human support.

## Use cases
- "Where is my order?" → live status + tracking link
- "What are your best-selling T-shirts?" → in-stock sizes & variants
- Greetings / small talk → welcome message
- Anything unclear → handoff to the support channel

## Good to know
- WhatsApp Business conversations are billed by Meta/Twilio/Exotel; plan accordingly.
- The Shopify Admin API has rate limits (leaky bucket); stagger requests.
- LLM usage incurs token costs; cap max tokens and enable caching where possible.
- Avoid sending PII to the model; only pass minimal order/product fields.

## How it works
1. **WhatsApp Trigger** – Receives an incoming message (e.g., "Where is my order?").
2. **Get Customer from Shopify → Customer Details → Normalize Input** – Looks up the customer by phone and normalizes the query (lower-casing, emoji & punctuation normalization).
3. **Switch (intent router)** – Classifies the message into welcome, orderStatusQuery, productQuery, or supportQuery.
4. **Welcome path** – Welcome message → polite greeting → (noop placeholder).
5. **Order status path (Orders Agent)**
   - The Orders Agent (LLM + Memory) interprets the user request and extracts the needed fields.
   - Get Customer Orders (HTTP to Shopify) fetches the user's latest order(s) (a hedged request sketch appears at the end of this template).
   - A Structured Output Parser cleans the agent's output into a strict schema.
   - Send Order Status (WhatsApp message) returns the status, ETA, and tracking link.
6. **Products path (Products Agent)**
   - The Products Agent (LLM + Memory) turns the ask into a product query.
   - Get Products from Shopify (HTTP) pulls best sellers / inventory & sizes.
   - A Structured Output Parser formats name, price, sizes, and stock.
   - Send Products message (WhatsApp) sends a tidy, human-readable reply.
7. **Support path** – Send a message to support posts the transcript/context to your agent/helpdesk channel and informs the user that a human will respond.

## How to use
1. Replace the manual/WhatsApp trigger with your live WhatsApp number/webhook.
2. Set env vars/credentials: Shopify domain + Admin API token, WhatsApp provider keys, LLM key (OpenAI/OpenRouter), and (optionally) your support channel webhook.
3. Edit the message templates for tone, add your brand name, and localize if needed.
4. Test with samples: "Where is my order?", "Show best sellers", "Hi".

## Requirements
- WhatsApp Business API (Meta/Twilio/Exotel)
- Shopify store + Admin API access
- LLM provider (OpenAI/OpenRouter, etc.)
- Slack webhook for human handoff

## Prerequisites
- Active WhatsApp Business Account connected via an API provider (Meta, Twilio, or Exotel).
- **Shopify Admin API credentials** (API key, secret, store domain).
- **Slack OAuth app** or webhook for human support escalation.
- API key for your LLM provider (OpenAI, OpenRouter, etc.).

## Customising this workflow
- Add intents: returns/exchanges, COD confirmation, address changes.
- Enrich product replies with images, price ranges, and "Buy" deep links.
- Add multilingual support by detecting the locale and templating responses.
- Log all interactions to a DB/Sheet for analytics and quality review.
- Guardrails: confidence thresholds → fallback to support; redact PII; retry on API errors.
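The "Get Customer Orders" call can be sketched against the Shopify Admin REST API as below. The API version in the path is an assumption; pin it to the version your store uses. Note how only minimal fields are passed downstream, per the PII guidance above.

```js
// Hedged sketch of fetching a customer's latest orders from Shopify.
const SHOP = process.env.SHOPIFY_STORE_DOMAIN; // e.g. "my-brand.myshopify.com"
const TOKEN = process.env.SHOPIFY_ADMIN_TOKEN;

async function getLatestOrders(customerId) {
  const params = new URLSearchParams({
    customer_id: String(customerId),
    status: 'any',
    limit: '3', // only the most recent orders
  });
  const res = await fetch(`https://${SHOP}/admin/api/2024-01/orders.json?${params}`, {
    headers: { 'X-Shopify-Access-Token': TOKEN },
  });
  const { orders } = await res.json();
  // Pass only minimal order fields to the LLM, as the template advises
  return orders.map((o) => ({
    name: o.name, // e.g. "#1042"
    fulfillment_status: o.fulfillment_status,
    tracking_url: o.fulfillments?.[0]?.tracking_url ?? null,
  }));
}
```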