by Aadarsh Jain
# Document Analyzer and Q&A Workflow

AI-powered document and web page analysis using n8n and a GPT model. Ask questions about any local file or web URL and get intelligent, formatted answers.

## Who's it for

Perfect for researchers, developers, content analysts, students, and anyone who needs quick insights from documents or web pages without uploading files to external services.

## What it does

- **Analyzes local files**: PDF, Markdown, Text, JSON, YAML, Word docs
- **Fetches web content**: documentation sites, blogs, articles
- **Answers questions**: using a GPT model with structured, well-formatted responses

Input format: `path_or_url | your_question`

Examples:
- `/Users/docs/readme.md | What are the installation steps?`
- `https://n8n.io | What is n8n?`

## Setup

1. Import the workflow into n8n
2. Add your OpenAI API key to credentials
3. Link the credential to the "OpenAI Document Analyzer" node
4. Activate the workflow
5. Start chatting!

## Customize

- **Change AI model**: edit the "OpenAI Document Analyzer" node (switch to gpt-4o-mini for cost savings)
- **Adjust content length**: modify `maxLength` in the "Process Document Content" node (default: 15,000 chars)
- **Add file types**: update the `supportedTypes` array in the "Parse Document & Question" node
- **Increase timeout**: change the timeout value in the "Fetch Web Content" node (default: 30 s)
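For illustration, the input-parsing step described above could look like the following Code-node sketch. The function name, error messages, and exact type list are assumptions, not the template's actual code:

```javascript
// Hypothetical sketch of the "Parse Document & Question" logic:
// split the "path_or_url | your_question" input and decide whether
// the source is a URL or a supported local file.
const supportedTypes = ['pdf', 'md', 'txt', 'json', 'yaml', 'yml', 'docx'];

function parseInput(raw) {
  const sep = raw.indexOf('|');
  if (sep === -1) {
    throw new Error('Expected format: path_or_url | your_question');
  }
  const source = raw.slice(0, sep).trim();
  const question = raw.slice(sep + 1).trim();
  const isUrl = /^https?:\/\//i.test(source);
  const ext = source.split('.').pop().toLowerCase();
  if (!isUrl && !supportedTypes.includes(ext)) {
    throw new Error(`Unsupported file type: .${ext}`);
  }
  return { source, question, isUrl };
}
```

Extending the `supportedTypes` array here is what the "Add file types" customization above refers to.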
by Parag Javale
# AI Blog Creator with Gemini, Replicate Image, Supabase Publishing & Slack

A fully automated content generation and publishing workflow designed for modern marketing and SaaS teams. It fetches the latest industry trends, generates SEO-optimized blog posts using AI, creates a relevant featured image, publishes the post to your CMS (e.g., Supabase or a custom API), and notifies your team via Slack, all on a daily schedule. The workflow connects NewsAPI, Google Gemini, Replicate, Supabase, and Slack into one intelligent content pipeline that runs hands-free once set up.

## ✨ Features

- 📰 **Fetch Trending Topics** — pulls the latest news or updates from your selected industry (via NewsAPI).
- 🤖 **AI Topic Generation** — Gemini suggests trending blog topics relevant to AI, SaaS, and automation.
- 📝 **AI Blog Authoring** — Gemini then writes a full 1200–1500 word SEO-optimized article in Markdown.
- 🧹 **Smart JSON Cleaner** — a resilient code node parses Gemini's output and ensures clean, structured data.
- 🖼️ **Auto-Generated Image** — Replicate's Ideogram model creates a blog cover image based on the content prompt.
- 🌐 **Automatic Publishing** — posts are published automatically to your Supabase or custom backend.
- 💬 **Slack Notification** — notifies your team with blog details and the live URL.
- ⏰ **Fully Scheduled** — runs automatically every day at your preferred time (default 10 AM IST).
## ⚙️ Workflow Structure

| Step | Node | Purpose |
| ---- | ---- | ------- |
| 1 | Schedule Trigger | Runs daily at 10 AM |
| 2 | Fetch Industry Trends (NewsAPI) | Retrieves trending articles |
| 3 | Message a model (Gemini) | Generates trending topic ideas |
| 4 | Message a model1 (Gemini) | Writes full SEO blog content |
| 5 | Code in JavaScript | Cleans, validates, and normalizes Gemini output |
| 6 | HTTP Request (Replicate) | Generates an image using Ideogram |
| 7 | HTTP Request1 | Retrieves the generated image URL |
| 8 | Wait + If | Polls until image generation succeeds |
| 9 | Edit Fields | Assembles blog fields into the final JSON |
| 10 | Publish to Supabase | Posts to your CMS |
| 11 | Slack Notification | Sends a message to your Slack channel |

## 🔧 Setup Instructions

1. Import the workflow in n8n and enable it.
2. Create the following credentials:
   - **NewsAPI** (Query Auth) — from https://newsapi.org
   - **Google Gemini (PaLM API)** — use your Gemini API key
   - **Replicate** (Bearer Auth) — API key from https://replicate.com/account
   - **Supabase** (Header Auth) — endpoint to your `/functions/v1/blog-api` (set your key in the header)
   - **Slack API** — create a Slack app token with the `chat:write` permission
3. Edit the NewsAPI URL query parameter to match your industry (e.g., `q=AI automation SaaS`).
4. Update the Supabase publish URL to your project endpoint if needed.
5. Adjust the Slack channel name under "Slack Notification".
6. (Optional) Change the Schedule Trigger time to suit your timezone.

## 💡 Notes & Tips

- The Code in JavaScript node is robust against malformed or extra text in Gemini output — it sanitizes Markdown and reconstructs clean JSON safely.
- You can replace Supabase with any CMS or webhook endpoint by editing the "Publish to Supabase" node.
- The Replicate model used is `ideogram-ai/ideogram-v3-turbo` — you can swap it with Stable Diffusion or another model for different aesthetics.
- Use the `slug` field in your blog URLs for SEO-friendly links.
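The "Code in JavaScript" cleaner step (step 5 above) can be sketched as follows. This is a minimal illustration of the idea, not the template's exact node code, which may handle more edge cases:

```javascript
// Minimal sketch of a resilient Gemini-output cleaner: strip
// triple-backtick Markdown fences, then parse the outermost JSON
// object even if the model added surrounding prose.
function cleanGeminiJson(raw) {
  // Remove triple-backtick fences such as a ```json wrapper
  const text = raw.replace(/`{3}(?:json)?/gi, '').trim();
  // Fall back to the outermost { ... } block if prose surrounds it
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  if (start === -1 || end <= start) {
    throw new Error('No JSON object found in model output');
  }
  return JSON.parse(text.slice(start, end + 1));
}
```

Parsing this way keeps the pipeline running even when the model wraps its JSON in explanation text.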
- Test with one manual execution before activating scheduled runs.
- If the Slack notification fails, verify the token scopes and channel permissions.

🧩 Tags: #AI #Automation #ContentMarketing #BlogGenerator #n8n #Supabase #Gemini #Replicate #Slack #WorkflowAutomation
by Jaruphat J.
⚠️ **Note:** This template requires a community node and works only on self-hosted n8n installations. It uses the Typhoon OCR Python package, `pdfseparate` from poppler-utils, and custom command execution. Make sure to install all required dependencies locally.

## Who is this for?

This template is designed for developers, back-office teams, and automation builders (especially in Thailand or Thai-speaking environments) who need to process multi-file, multi-page Thai PDFs and automatically export structured results to Google Sheets. It is ideal for:

- Government and enterprise document processing
- Thai-language invoices, memos, and official letters
- AI-powered automation pipelines that require Thai OCR

## What problem does this solve?

Typhoon OCR is one of the most accurate OCR tools for Thai text, but integrating it into an end-to-end workflow usually requires manual scripting and handling of multi-page PDFs. This template solves that by:

- Splitting PDFs into individual pages
- Running Typhoon OCR on each page
- Aggregating text back into a single file
- Using AI to extract structured fields
- Automatically saving structured data into Google Sheets

## What this workflow does

- **Trigger:** manual execution or any n8n trigger node
- **Load Files:** read PDFs from a local `doc/multipage` folder
- **Split PDF Pages:** use `pdfinfo` and `pdfseparate` to break PDFs into pages
- **Typhoon OCR:** run OCR on each page via Execute Command
- **Aggregate:** combine per-page OCR text
- **LLM Extraction:** use AI (e.g., GPT-4, OpenRouter) to extract fields into JSON
- **Parse JSON:** convert structured JSON into a tabular format
- **Google Sheets:** append one row per file into a Google Sheet
- **Cleanup:** delete temp split pages and move processed PDFs into a Completed folder

## Setup

1. Install requirements:
   - Python 3.10+
   - typhoon-ocr: `pip install typhoon-ocr`
   - poppler-utils: provides `pdfinfo` and `pdfseparate`
   - qpdf: backup page counting
2. Create folders:
   - `/doc/multipage` for incoming files
   - `/doc/tmp` for split pages
   - `/doc/multipage/Completed` for processed files
3. Create a Google Sheet with column headers like:
   `book_id | date | subject | to | attach | detail | signed_by | signed_by2 | contact_phone | contact_email | contact_fax | download_url`
4. Export your `TYPHOON_OCR_API_KEY` and `OPENAI_API_KEY` (or use credentials in n8n)

## How to customize this workflow

- Replace the LLM provider in the "Structure Text to JSON with LLM" node (supports OpenRouter, OpenAI, etc.)
- Adjust the JSON schema and parsing logic to match your documents
- Update the Google Sheets mapping to fit your desired fields
- Add trigger nodes (Dropbox, Google Drive, Webhook) to automate file ingestion

## About Typhoon OCR

Typhoon is a multilingual LLM and NLP toolkit optimized for Thai. It includes typhoon-ocr, a Python OCR package designed for Thai-centric documents. It is open source, highly accurate, and works well in automation pipelines. Perfect for government paperwork, PDF reports, and multi-language documents in Southeast Asia.

## Deployment option

You can also deploy this workflow easily using the Docker image provided in my GitHub repository: https://github.com/Jaruphat/n8n-ffmpeg-typhoon-ollama. This Docker setup already includes n8n, ffmpeg, Typhoon OCR, and Ollama combined, so you can run the whole environment without installing each dependency manually.
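The page-splitting step can be illustrated with a small helper that reads the page count from `pdfinfo` output and builds one `pdfseparate` command per page. This is a hedged sketch of the idea behind the Execute Command step, not the template's actual code; the file-naming scheme is an assumption:

```javascript
// Derive per-page pdfseparate commands (poppler-utils) from the
// text that `pdfinfo somefile.pdf` prints to stdout.
function buildSplitCommands(pdfPath, pdfinfoOutput, tmpDir) {
  const match = pdfinfoOutput.match(/^Pages:\s+(\d+)/m);
  if (!match) throw new Error('Could not read page count from pdfinfo output');
  const pages = parseInt(match[1], 10);
  const base = pdfPath.split('/').pop().replace(/\.pdf$/i, '');
  const commands = [];
  for (let p = 1; p <= pages; p++) {
    // -f and -l select the first and last page of the extracted range
    commands.push(`pdfseparate -f ${p} -l ${p} "${pdfPath}" "${tmpDir}/${base}-page${p}.pdf"`);
  }
  return commands;
}
```

Each resulting command can then be run by an Execute Command node, with the OCR step consuming the per-page files from `/doc/tmp`.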
by Nasser
## Who's it for?

- Content creators
- E-commerce stores
- Marketing teams

## Description

Generate unique UGC images for your products. Simply upload a product image into a Google Drive folder, and the workflow will instantly generate 50 unique, high-quality AI UGC images using Nano Banana via Fal.ai. All results are automatically saved back into the same folder, ready to use across social media, e-commerce stores, and marketing campaigns.

## How it works

📺 YouTube Video Tutorial:

1. **Trigger:** upload a new product image (with a white background) to a folder in your Google Drive
2. Generate 50 different image prompts for your product
3. Loop over each generated prompt
4. Generate UGC content with Fal.ai (Nano Banana)
5. Upload the UGC content to the initial Google Drive folder

Cost: $0.039 / image

## How to set up

### 1. Accounts & APIs

- In the Edit Field "Setup" node, replace every `[YOUR_API_TOKEN]` placeholder with your API token:
  - Fal.ai (gemini-25-flash-image/edit): https://fal.ai/models/fal-ai/gemini-25-flash-image/edit/api
- In Credentials on your n8n dashboard, connect the following accounts using a Client ID / Secret:
  - Google Drive: https://docs.n8n.io/integrations/builtin/credentials/google/

### 2. Requirements

- The base image of your product should preferably have a white background
- Your Google Drive folder and every file it contains should be publicly available

### 3. Customizations

- Change the total amount of UGC generated: in Generate Prompts → Message → "Your task is to generate 50"
- Modify the instructions used to generate the UGC prompts: in Generate Prompts → Message
- Change the number of base images: in Generate Image → Body Parameters → JSON → `image_urls`
- Change the amount of UGC generated per prompt: in Generate Image → Body Parameters → JSON → `num_images`
- Change the folder where the generated UGC is stored: in Upload File → Parent Folder
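The body the "Generate Image" HTTP node sends to Fal.ai can be sketched like this. The field names `image_urls` and `num_images` come from the customization notes above, but the full payload shape is an assumption; verify it against the Fal.ai API docs for `gemini-25-flash-image/edit`:

```javascript
// Hedged sketch of the Fal.ai request body. `prompt` is one of the
// 50 generated UGC prompts; `baseImageUrls` are the public Google
// Drive URLs of the product image.
function buildFalRequest(prompt, baseImageUrls, numImages = 1) {
  return {
    prompt,
    image_urls: baseImageUrls, // base product image(s), white background preferred
    num_images: numImages,     // UGC images generated per prompt
  };
}
```

Raising `num_images` multiplies the output (and the per-image cost) for each of the 50 prompts.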
by n8n Automation Expert | Template Creator | 2+ Years Experience
## Description

### 🎯 Overview

An advanced automated trading bot that implements ICT (Inner Circle Trader) methodology and Smart Money Concepts for cryptocurrency trading. This workflow combines AI-powered market analysis with automated trade execution through the Coinbase Advanced Trading API.

### ⚡ Key Features

#### 📊 ICT Trading Strategy Implementation

- **Kill Zone Detection**: automatically identifies optimal trading sessions (Asian, London, New York kill zones)
- **Smart Money Concepts**: analyzes market structure breaks, liquidity grabs, fair value gaps, and order blocks
- **Session Validation**: real-time GMT time tracking with session strength calculations
- **Structure Analysis**: detects BOS (Break of Structure) and CHOCH (Change of Character) patterns

#### 🤖 AI-Powered Analysis

- **GPT-4 Integration**: advanced market analysis using OpenAI's latest model
- **Confidence Scoring**: the AI generates a confidence score (0–100) for each trading signal
- **Risk Assessment**: automated risk level evaluation (LOW/MEDIUM/HIGH)
- **ICT-Specific Prompts**: custom prompts designed for Inner Circle Trader methodology

#### 🔄 Automated Trading Flow

1. **Signal Reception**: receives trading signals via Telegram webhook
2. **Data Extraction**: parses symbol, action, price, and technical indicators
3. **Session Validation**: verifies the current kill zone and trading session strength
4. **Market Data**: fetches real-time data from the Coinbase Advanced Trading API
5. **AI Analysis**: processes signals through GPT-4 with ICT-specific analysis
6. **Quality Filter**: multi-condition filtering based on confidence, session, and structure
7. **Trade Execution**: automated order placement through the Coinbase API
8. **Documentation**: records all trades and rejections in Notion databases

#### 📱 Multi-Platform Integration

- **Telegram Bot**: receives signals and sends formatted notifications
- **Coinbase Advanced**: real-time market data and trade execution
- **Notion Database**: comprehensive trade logging and analysis tracking
- **Webhook Support**: external system integration capabilities

### 🛠️ Setup Requirements

API credentials needed:

- **Coinbase Advanced Trading API** (API key, secret, passphrase)
- **OpenAI API key** (GPT-4 access)
- **Telegram bot token** and chat ID
- **Notion integration** (database IDs for trade records)

Environment variables:

- `TELEGRAM_CHAT_ID=your_chat_id`
- `NOTION_TRADING_DB_ID=your_trading_database_id`
- `NOTION_REJECTED_DB_ID=your_rejected_signals_database_id`
- `WEBHOOK_URL=your_external_webhook_url`

### 📈 Trading Logic

Kill zone priority system:

- **London & New York sessions**: HIGH priority (0.9 strength)
- **Asian & London Close**: MEDIUM priority (0.6 strength)
- **Off hours**: LOW priority (0.1 strength)

Signal validation criteria:

- Signal quality must not be "LOW"
- Confidence score ≥ 60%
- An active kill zone session is required
- ICT structure alignment is confirmed

### 🎛️ Workflow Components

1. **Extract ICT Signal Data**: parses incoming Telegram messages for trading signals
2. **ICT Session Validator**: determines the current kill zone and session strength
3. **Get Coinbase Market Data**: fetches real-time cryptocurrency data
4. **ICT AI Analysis**: GPT-4 powered analysis with ICT methodology
5. **Parse ICT AI Analysis**: processes the AI response with fallback mechanisms
6. **ICT Quality & Session Filter**: multi-condition signal validation
7. **Execute ICT Trade**: automated trade execution via the Coinbase API
8. **Create ICT Trading Record**: logs successful trades to Notion
9. **Generate ICT Notification**: creates formatted Telegram alerts
10. **Log ICT Rejected Signal**: records filtered signals for analysis

### 🚀 Use Cases

- Automated ICT-based cryptocurrency trading
- Smart Money Concepts implementation
- Kill zone session trading
- AI-enhanced market structure analysis
- Professional trading documentation and tracking

### ⚠️ Risk Management

- Built-in session validation prevents off-hours trading
- AI confidence scoring filters low-quality signals
- Comprehensive logging for performance analysis
- Automated stop-loss and take-profit calculations

This workflow is perfect for traders familiar with ICT methodology who want to automate their Smart Money Concepts trading strategy with AI-enhanced decision making.
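The kill-zone priority system can be sketched as a small session validator. The strength values (0.9 / 0.6 / 0.1) match the priority system described above; the GMT hour boundaries are illustrative assumptions, since the template does not state them:

```javascript
// Map a GMT hour to an ICT kill-zone session and its strength.
// Hour ranges are assumed for illustration; strengths follow the
// template's HIGH / MEDIUM / LOW priority system.
function getKillZone(gmtHour) {
  if ((gmtHour >= 7 && gmtHour < 10) || (gmtHour >= 12 && gmtHour < 15)) {
    return { session: gmtHour < 12 ? 'London' : 'New York', strength: 0.9 };
  }
  if ((gmtHour >= 0 && gmtHour < 4) || (gmtHour >= 15 && gmtHour < 17)) {
    return { session: gmtHour < 4 ? 'Asian' : 'London Close', strength: 0.6 };
  }
  return { session: 'Off Hours', strength: 0.1 };
}
```

A downstream filter node can then reject any signal whose session strength falls below the required threshold.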
by Avkash Kakdiya
## How it works

This workflow starts whenever a new domain is added to a Google Sheet. It cleans the domain, fetches traffic insights from SimilarWeb, extracts the most relevant metrics, and updates the sheet with the enriched data. Optionally, it can also send this information to Airtable for further tracking or analysis.

## Step-by-step

1. **Trigger on New Domain**
   - The workflow starts when a new row is added to the Google Sheet.
   - It captures the raw URL/domain entered by the user.
2. **Clean Domain URL**
   - Strips unnecessary parts like `http://`, `https://`, `www.`, and trailing slashes.
   - Stores a clean domain format (e.g., `example.com`) along with the row number.
3. **Fetch Website Analysis**
   - Uses the SimilarWeb API to pull traffic and engagement insights for the domain.
   - Data includes global rank, country rank, category rank, total visits, bounce rate, and more.
4. **Extract Key Metrics**
   - Processes raw SimilarWeb data into a simplified structure. Extracted insights include:
   - Ranks: global, country, and category
   - Traffic overview: total visits, bounce rate, pages per visit, average visit duration
   - Top traffic sources: direct, search, social
   - Top countries (top 3): with traffic share percentages
   - Device split: mobile vs. desktop
5. **Update Google Sheet**
   - Writes the cleaned and enriched domain data back into the same (or another) Google Sheet.
   - Ensures each row is updated with the new traffic insights.
6. **Export to Airtable (Optional)**
   - Creates a new record in Airtable with the enriched traffic metrics.
   - Useful if you want to manage or visualize company/domain data outside of Google Sheets.

## Why use this?

- Automatically enriches domain lists with live traffic data from SimilarWeb.
- Cleans messy URLs into a standard format.
- Saves hours of manual research on company traffic insights.
- Provides structured, comparable metrics for better decision-making.
- Flexible: update sheets, export to Airtable, or both.
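The "Clean Domain URL" step described above can be sketched as a one-line normalizer. This is an illustrative Code-node snippet, not the template's exact implementation:

```javascript
// Normalize a raw URL/domain entry to a bare domain: strip the
// protocol, "www." prefix, any path, and trailing slashes.
function cleanDomain(raw) {
  return raw
    .trim()
    .replace(/^https?:\/\//i, '')
    .replace(/^www\./i, '')
    .split('/')[0]
    .toLowerCase();
}
```

Normalizing first means `https://www.Example.com/` and `example.com` enrich the same SimilarWeb record instead of producing duplicates.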
by Paolo Ronco
# Amazon Luna Prime Games Catalog Tracker (Auto-Sync to Google Sheets)

Automatically fetch, organize, and maintain an updated catalog of Amazon Luna "Included with Prime" games. This workflow regularly queries Amazon's official Luna endpoint, extracts complete metadata, and syncs everything into Google Sheets without duplicates.

Ideal for:

- tracking monthly Prime Luna rotations
- keeping a personal archive of games
- monitoring new games appearing on Amazon Games / Prime Gaming, so you can instantly play titles you're interested in
- building dashboards or gaming databases
- powering notification systems (Discord, Telegram, email, etc.)

## Overview

Amazon Luna's "Included with Prime" lineup changes frequently, with new games added and old ones removed. Instead of checking manually, this n8n template fully automates the process:

- Fetches the latest list from Amazon's backend
- Extracts detailed metadata from the response
- Syncs the data into Google Sheets
- Avoids duplicates by updating existing rows
- Supports all major Amazon regions

Once configured, it runs automatically, keeping your game catalog correct, clean, and always up to date.

## 🛠️ How the workflow works

1. **Scheduled Trigger**: starts the workflow on a set schedule (default: every 5 days at 3:00 PM). You can change both the frequency and the time freely.
2. **HTTP Request to Amazon Luna**: calls Amazon Luna's regional endpoint and retrieves the full "Included with Prime" catalog.
3. **JavaScript Code Node (Data Extraction)**: parses the JSON response and extracts structured fields: title, genres, release year, ASIN, image URLs, and additional metadata. The result is a clean, ready-to-use dataset.
4. **Google Sheets (Insert or Update Rows)**: each game is written into the selected Google Sheet. Existing games are updated and new games are appended; the title acts as the unique identifier to prevent duplicates.
## ⚙️ Configuration Parameters

| Parameter | Description | Recommended values |
| --- | --- | --- |
| x-amz-locale | Language + region | it_IT 🇮🇹 · en_US 🇺🇸 · de_DE 🇩🇪 · fr_FR 🇫🇷 · es_ES 🇪🇸 · en_GB 🇬🇧 · ja_JP 🇯🇵 · en_CA 🇨🇦 |
| x-amz-marketplace-id | Marketplace backend ID | APJ6JRA9NG5V4 🇮🇹 · ATVPDKIKX0DER 🇺🇸 · A1PA6795UKMFR9 🇩🇪 · A13V1IB3VIYZZH 🇫🇷 · A1RKKUPIHCS9HS 🇪🇸 · A1F83G8C2ARO7P 🇬🇧 · A1VC38T7YXB528 🇯🇵 · A2EUQ1WTGCTBG2 🇨🇦 |
| Accept-Language | Response language | Example: it-IT,it;q=0.9,en;q=0.8 |
| User-Agent | Browser-like request | Default or an updated UA |
| Trigger interval | Refresh frequency | Every 5 days at 3:00 PM (modifiable) |
| Google Sheet | Storage output | Select your file + sheet |

You can adapt these headers to fetch data from any supported country.

## 💡 Tips & Customization

- 🌍 **Regional catalogs**: duplicate the HTTP Request + Code + Sheet block to track multiple countries (US, DE, JP, UK…).
- 🧹 **No duplicates**: the workflow updates rows intelligently, ensuring a clean catalog even after many runs.
- 🗂️ **Move data anywhere**: send the output to Airtable, databases (MySQL, Postgres, MongoDB…), Notion, CSV, REST APIs, or BI dashboards.
- 🔔 **Add notifications (Discord, Telegram, email, etc.)**: you can pair this template with a notification workflow. When used with Discord, the notification message can include the game title, description or metadata, and the game's image, automatically downloaded and attached. This makes notifications visually informative and perfect for tracking new Prime titles.

## 🔒 Important Notes

All retrieved data belongs to Amazon. The workflow is intended for personal, testing, or educational use only. Do not republish or redistribute collected data without permission.
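Putting the parameter table together, the HTTP Request node's headers for a given region might look like the following sketch. Only the header names and example values from the table above are used; the endpoint URL itself is configured inside the template and is not reproduced here:

```javascript
// Build region-specific headers for the Amazon Luna request from the
// configuration table values (example: US marketplace).
function buildLunaHeaders(locale, marketplaceId, acceptLanguage) {
  return {
    'x-amz-locale': locale,
    'x-amz-marketplace-id': marketplaceId,
    'Accept-Language': acceptLanguage,
    'User-Agent': 'Mozilla/5.0', // any browser-like UA works
  };
}

const usHeaders = buildLunaHeaders('en_US', 'ATVPDKIKX0DER', 'en-US,en;q=0.9');
```

Duplicating this with a different locale/marketplace pair is how you track multiple regional catalogs in parallel.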
by Joseph
## Overview

This n8n workflow creates an intelligent AI agent that automates browser interactions through Airtop's browser automation platform. The agent can control real browser sessions, navigate websites, interact with web elements, and maintain detailed session records, all while providing live viewing capabilities for real-time monitoring.

YouTube tutorial: https://www.youtube.com/watch?v=XoZqFY7QFps

## What This Workflow Does

The AI agent acts as your virtual assistant in the browser, capable of:

- **Session Management**: creates, monitors, and terminates browser sessions with proper tracking
- **Web Navigation**: visits websites, clicks elements, fills forms, and performs complex interactions
- **Multi-Window Support**: manages multiple browser windows within sessions
- **Live Monitoring**: provides real-time viewing URLs so you can watch the automation
- **Data Tracking**: maintains comprehensive records of all browser activities
- **Profile Integration**: uses Airtop profiles for authenticated sessions
- **Email Notifications**: sends live URLs and status updates via Gmail

## Demo Use Case: Automated Reddit Posting

The tutorial demonstrates the agent's capabilities by:

1. Logging into Reddit using pre-configured Airtop profile credentials
2. Navigating to a specific subreddit based on user input
3. Creating and publishing a new post with title and content
4. Tracking the entire process with detailed session records
5. Providing live viewing access throughout the automation

## Core Workflow Components

### 1. Chat Interface Trigger

- **Node type**: Chat Trigger
- **Purpose**: accepts user commands for browser automation tasks
- **Input**: natural language instructions (e.g., "Create a Reddit post in r/automation")

### 2. AI Agent Processing

- **Node type**: OpenAI GPT-4
- **Purpose**: interprets user requests and determines appropriate browser actions
- **System message**: contains the comprehensive agent instructions
- **Capabilities**: understands complex web interaction requests, plans multi-step browser workflows, manages session states intelligently, and handles error scenarios gracefully

### 3. Google Sheets Data Management

Multiple Google Sheets nodes manage different aspects of session tracking:

- **Browser Sessions sheet**: fields `session_id`, `description`, `status`, `created_date`. Tracks active browser sessions (create, read, update session records).
- **Window Sessions sheet**: fields `session_id`, `window_id`, `description`, `airtop_live_view_url`, `status`. Tracks individual browser windows within sessions (create, read, update window records).
- **Airtop Profiles sheet**: fields `platform_name`, `platform_url`, `profile_name`. Stores available authenticated profiles (read for session creation).

### 4. Airtop Browser Automation Nodes

Multiple specialized nodes for browser control:

- Session management: `create_session` (creates new browser sessions with optional profile authentication), `terminate_session` (closes browser sessions and updates records), `read_airtop_profiles` (retrieves available authentication profiles)
- Window management: `create_window` (opens new browser windows with specified URLs), `query_page` (analyzes page content and identifies interactive elements)
- Web interaction: `click_element` (clicks specific page elements based on AI descriptions), `type_text` (inputs text into form fields and input elements)

### 5. Gmail Integration

- **Node type**: Gmail Send
- **Purpose**: sends live viewing URLs and status updates
- **Recipients**: user email for real-time monitoring
- **Content**: complete Airtop live view URLs for browser session observation

### 6. Error Handling & Validation

- **Input validation**: ensures required parameters are present
- **Session state checks**: verifies browser session status before operations
- **Error recovery**: handles failed operations gracefully
- **Data consistency**: maintains accurate session records even during failures

## Technical Requirements

### API Credentials Needed

1. **Airtop.ai API key**: sign up at airtop.ai and generate an API key from the dashboard; required for all browser automation functions
2. **OpenAI API key**: an OpenAI account with GPT-4 access; required for the AI agent's intelligence and decision-making
3. **Google Sheets access**: a Google account with Google Sheets API access; copy the provided template and get your sheet URL; required for session and profile data management
4. **Gmail OAuth**: a Google account with Gmail API access; required for sending live viewing URLs and notifications

### Tracking Sheet Structure

Create three sheets in your tracking spreadsheet:

1. **Browser Details (Sessions)**: `session_id` (text), `description` (text), `status` (Open/Closed), `created_date` (date)
2. **Window Details (Windows)**: `session_id` (text), `window_id` (text), `description` (text), `airtop_live_view_url` (URL), `status` (Open/Closed)
3. **Airtop Profiles**: `platform_name` (text), `platform_url` (URL), `profile_name` (text)

## Workflow Logic Flow

### User Request Processing

1. **User input**: natural language command via the chat interface
2. **AI analysis**: OpenAI processes the request and determines the required actions
3. **Session check**: the agent reads the current browser session status
4. **Action planning**: the AI creates a step-by-step execution plan

### Browser Session Lifecycle

1. **Session creation**: check for existing open sessions, ask the user about profile usage if needed, create a new Airtop session, and record the session details in the tracking sheet
2. **Window management**: create a browser window with the target URL, capture the live viewing URL, record the window details, and send the live URL via Gmail
3. **Web interactions**: query page content for element identification; execute clicks, form fills, and navigation; monitor page state changes; handle dynamic content loading
4. **Session cleanup**: terminate the browser session when complete, update all related records to "Closed" status, and send a completion notification

### Data Flow Architecture

User Input → AI Processing → Session Management → Browser Actions → Data Recording → User Notifications

## Key Features & Benefits

### Intelligent Automation

- **Natural language control**: users can describe tasks in plain English
- **Context awareness**: the AI understands complex multi-step workflows
- **Adaptive responses**: handles unexpected page changes and errors
- **Profile integration**: seamlessly uses stored authentication credentials

### Real-Time Monitoring

- **Live view URLs**: watch browser automation as it happens
- **Status updates**: real-time notifications of task progress
- **Session tracking**: complete audit trail of all browser activities
- **Multi-window support**: handle complex workflows across multiple tabs

### Enterprise-Ready Features

- **Error recovery**: robust handling of network issues and page failures
- **Session persistence**: maintains state across workflow interruptions
- **Data integrity**: consistent record-keeping even during failures
- **Scalable architecture**: can handle multiple
concurrent automation tasks.

## Use Cases Beyond Reddit

This workflow architecture supports automation for any website:

### Social Media Management

- **Multi-platform posting**: Facebook, Twitter, LinkedIn, Instagram
- **Community engagement**: responding to comments and messages
- **Content scheduling**: publishing posts at optimal times
- **Analytics gathering**: collecting engagement metrics

### Business Process Automation

- **CRM data entry**: updating customer records across platforms
- **Support ticket management**: creating, updating, and routing tickets
- **E-commerce operations**: product listings, inventory updates
- **Report generation**: gathering data from multiple web sources

### Personal Productivity

- **Travel booking**: comparing prices, making reservations
- **Bill management**: paying utilities, checking statements
- **Job applications**: submitting applications, tracking status
- **Research tasks**: gathering information from multiple sources

## Advanced Configuration Options

### Custom Profiles

- Create Airtop profiles for different websites
- Store authentication credentials securely
- Switch between different user accounts
- Handle multi-factor authentication flows

### Workflow Customization

- Modify AI system prompts for specific use cases
- Add custom validation rules
- Implement retry logic for failed operations
- Create domain-specific interaction patterns

### Integration Extensions

- Connect to additional data sources
- Add webhook notifications
- Implement approval workflows
- Create audit logs and reporting

## Getting Started

1. 📊 Copy the Google Sheets template (just click and make a copy!)
2. Set up credentials for Airtop, OpenAI, and Gmail
3. Import the workflow into your n8n instance
4. Configure node credentials with your API keys and Google Sheets URL
5. Test with simple commands like "Visit google.com"
6. Expand to complex workflows as you become comfortable

## Best Practices

### Session Management

- Always check for existing sessions before creating new ones
- Properly terminate sessions to avoid resource waste
- Use descriptive names for sessions and windows
- Regularly clean up old session records

### Error Handling

- Implement timeout handling for slow-loading pages
- Add retry logic for network failures
- Validate element existence before interactions
- Log detailed error information for debugging

### Security Considerations

- Store sensitive credentials in Airtop profiles, not the workflow
- Use webhook authentication for production deployments
- Implement rate limiting to avoid being blocked by websites
- Regularly audit browser session activities

This workflow transforms n8n into a powerful browser automation platform, enabling you to automate virtually any web-based task while maintaining full visibility and control over the automation process.
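The "session state checks" best practice can be sketched as a small guard run before any browser action. This is a hypothetical helper, not part of the template; the record shape mirrors the Browser Sessions tracking fields (`session_id`, `status`):

```javascript
// Before acting on a session, confirm a matching record exists in the
// tracking sheet and that its status is still "Open".
function findOpenSession(records, sessionId) {
  const rec = records.find((r) => r.session_id === sessionId);
  if (!rec) throw new Error(`Unknown session: ${sessionId}`);
  if (rec.status !== 'Open') {
    throw new Error(`Session ${sessionId} is ${rec.status}`);
  }
  return rec;
}
```

Failing fast here keeps the tracking sheet and the real Airtop sessions from drifting out of sync.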
by Julian Kaiser
# Automatically Classify Support Tickets in Zoho Desk with AI

Transform your customer support workflow with intelligent ticket classification. This automation leverages AI to automatically categorize incoming support tickets in Zoho Desk, reducing manual work and ensuring faster ticket routing to the right teams.

## How It Works

1. Fetches all tickets from Zoho Desk with pagination support
2. Filters unclassified tickets (where the classification field is null)
3. Retrieves complete ticket threads for full conversation context
4. Uses OpenRouter AI (GPT-4, Claude, or other models) to classify tickets into predefined categories
5. Updates tickets in Zoho Desk with accurate classifications automatically

## Use Cases

- **Customer support teams**: automatically route tickets to specialized departments (billing, technical, sales)
- **Help desks**: prioritize urgent issues and categorize feature requests

## Prerequisites

- Active Zoho Desk account with API access
- OpenRouter API account (supports multiple AI models)
- Basic understanding of OAuth2 authentication
- Predefined ticket categories in your Zoho Desk setup

## Setup Steps

Time: ~15 minutes

1. **Configure Zoho Desk OAuth2**: follow our step-by-step GitHub guide for OAuth2 credential setup
2. **Set up the OpenRouter API**: create an account and generate API keys at openrouter.ai
3. **Customize classifications**: define your ticket categories (e.g., Technical, Billing, Feature Request, Bug Report)
4. **Adapt the workflow**: modify it for any field — status, priority, tags, assignment, or custom fields
5. **Review API documentation**: check the Zoho Desk Search API docs for advanced filtering options
6. **Test thoroughly**: run manual triggers before automation

Note: this workflow demonstrates proper Zoho Desk API integration, including OAuth2 authentication and pagination handling, two common integration challenges.
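The pagination pattern mentioned above can be sketched generically. Here `fetchPage` stands in for the Zoho Desk HTTP request; the `from`/`limit` parameter names mirror Zoho Desk's list endpoints, but confirm them against the API docs before relying on this:

```javascript
// Fetch every ticket by paging until a short (or empty) page signals
// the end of the result set. `fetchPage` is injected so the loop is
// independent of the HTTP layer.
async function fetchAllTickets(fetchPage, limit = 100) {
  const all = [];
  let from = 0;
  while (true) {
    const page = await fetchPage({ from, limit });
    all.push(...page);
    if (page.length < limit) break; // last page reached
    from += limit;
  }
  return all;
}
```

The same loop shape works for any offset-paginated API; only the parameter names change.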
by JKingma
🛍️ Automated Product Description Generation for Adobe Commerce (Magento 2) Description This n8n template demonstrates how to automatically generate product descriptions for items in Adobe Commerce (Magento 2) that are missing one. The workflow retrieves product data, converts raw attribute values (like numeric IDs) into human-readable labels, and passes the enriched product data to an LLM (Azure OpenAI by default). The LLM generates a compelling description, which is then saved back to Magento using the API. This ensures all products have professional descriptions without manual writing effort. Use cases include: Auto-generating missing descriptions for catalog completeness. Creating consistent descriptions across large product datasets. Reducing manual workload for content managers. Tailoring descriptions for SEO and customer readability. Good to know All attribute options are resolved to human-readable labels before being sent to the LLM. The flow uses Azure OpenAI, but you can replace it with OpenAI, Anthropic, Gemini, or other LLM providers. The LLM prompt can be customised to adjust tone, length, SEO-focus, or specific brand style. Works out-of-the-box with Adobe Commerce (Magento 2) APIs, but can be adapted for other ecommerce systems. How it works Get Product from Magento Retrieves a product that has no description. Collects all product attributes. Generate Description with LLM Resolves attribute option IDs into human-readable values (e.g. color_id = 23 → "Red"). Passes the readable product attributes to an Azure OpenAI model. The LLM creates a clear, engaging product description. The prompt can be customised (e.g. SEO-optimized, short catalog text, or marketing style). Save Description in Magento Updates the product via the Magento API with the generated description. Ensures product data is enriched and visible in the webshop immediately. How to use Configure your Magento 2 API credentials in n8n. 
Replace the Azure OpenAI node with another provider if needed. Adjust the prompt to match your brand’s tone of voice. Run the workflow to automatically process products missing descriptions. Requirements ✅ n8n instance (self-hosted or cloud) ✅ Adobe Commerce (Magento 2) instance with API access ✅ Azure OpenAI (or other LLM provider) credentials (Optional) Prompt customisations for SEO or brand voice Customising this workflow This workflow can be adapted for: Other attributes: Include or exclude attributes (e.g. only color & size for apparel). Different LLMs: Swap Azure OpenAI for OpenAI, Anthropic, Gemini, or any supported n8n AI node. Prompt tuning: Adjust instructions to generate shorter, longer, or SEO-rich descriptions. Selective updates: Target only specific categories (e.g. electronics, fashion). Multi-language support: Generate product descriptions in multiple languages for international shops.
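The attribute-resolution step (e.g. color_id = 23 → "Red") can be sketched as a small Code-node function. This is a minimal sketch that assumes the option metadata has already been fetched (for instance via Magento's GET /V1/products/attributes/{code} endpoint) and reshaped into lookup maps; the map shape shown here is an assumption for illustration, not the template's exact data model.

```javascript
// Resolve raw Magento attribute values (often numeric option IDs) into
// human-readable labels before sending the product to the LLM.
// `optionMaps` is assumed to look like: { color: { "23": "Red" }, ... }
function resolveAttributes(customAttributes, optionMaps) {
  const resolved = {};
  for (const { attribute_code, value } of customAttributes) {
    const map = optionMaps[attribute_code];
    // Swap the option ID for its label; pass plain values through unchanged.
    resolved[attribute_code] = map?.[value] ?? value;
  }
  return resolved;
}
```

Feeding the LLM `{ color: "Red" }` instead of `{ color: "23" }` is what makes the generated descriptions read naturally.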
by Intuz
This n8n template from Intuz provides a complete solution to automate on-demand lead generation. It acts as a powerful scraping agent that takes a simple chat query, scours both Google Search and Google Maps for relevant businesses, scrapes their websites for contact details, and compiles an enriched lead list directly in Google Sheets. Who's this workflow for? Sales Development Representatives (SDRs) Local Marketing Agencies Business Development Teams Freelancers & Consultants Market Researchers How it works 1. Start with a Chat Query: The user initiates the workflow by typing a search query (e.g., "dentists in New York") into a chat interface. 2. Multi-Source Search: The workflow queries both the Google Custom Search API (for web results across multiple pages) and scrapes Google Maps (for local businesses) to gather a broad list of potential leads. 3. Deep Dive Website Scraping: For each unique business website found, the workflow visits the URL to scrape the raw HTML content of the page. 4. Intelligent Contact Extraction: Using custom code, it then parses the scraped website content to find and extract valuable contact information like email addresses, phone numbers, and social media links. 5. Deduplicate and Log to Sheets: Before saving, the workflow checks your Google Sheet to ensure the lead doesn't already exist. All unique, newly enriched leads are then appended as clean rows to your sheet, along with the original search query for tracking. Key Requirements to Use This Template 1. n8n Instance & Required Nodes: An active n8n account (Cloud or self-hosted). This workflow uses the official n8n LangChain integration (@n8n/n8n-nodes-langchain) for the chat trigger. If you are using a self-hosted version of n8n, please ensure this package is installed. 2. Google Custom Search API: A Google Cloud Project with the "Custom Search API" enabled. You will need an API Key for this service. 
You must also create a Programmable Search Engine and get its Search engine ID (cx). This tells Google what to search (e.g., the whole web). 3. Google Sheets Account: A Google account and a pre-made Google Sheet with columns for Business Name, Primary Email, Contact Number, URL, Description, Socials, and Search Query. Setup Instructions 1. Configure the Chat Trigger: In the "When chat message received" node, you can find the Direct URL or Embed code to use the chat interface. 2. Set Up Google Custom Search API (Crucial Step): Go to the "Custom Google Search API" (HTTP Request) node. Under "Query Parameters", you must replace the placeholder values for key (with your API Key) and cx (with your Search Engine ID). 3. Configure Google Sheets: In all Google Sheets nodes (Append row in sheet, Get row(s) in sheet, etc.), connect your Google Sheets credentials. Select your target spreadsheet (Document ID) and the specific sheet (Sheet Name) where you want to store the leads. 4. Activate the Workflow: Save the workflow and toggle the "Active" switch to ON. Open the chat URL and enter a search query to start generating leads. Connect with us Website: https://www.intuz.com/services Email: getstarted@intuz.com LinkedIn: https://www.linkedin.com/company/intuz Get Started: https://n8n.partnerlinks.io/intuz For Custom Workflow Automation Click here: Get Started
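The "Intelligent Contact Extraction" step described above can be sketched as a Code-node function that runs regex patterns over the scraped HTML. This is a minimal sketch, not the template's exact code; the patterns are intentionally simple and real pages (obfuscated emails, international phone formats) need more care.

```javascript
// Pull emails, phone numbers, and social-media links out of raw HTML.
// Set() deduplicates repeated matches on the same page.
function extractContacts(html) {
  const emails = [...new Set(html.match(/[\w.+-]+@[\w-]+\.[\w.]+/g) ?? [])];
  const phones = [...new Set(html.match(/\+?\d[\d\s().-]{8,}\d/g) ?? [])];
  const socials = [...new Set(
    html.match(
      /https?:\/\/(?:www\.)?(?:facebook|instagram|linkedin|twitter|x)\.com\/[^\s"'<>]+/g
    ) ?? []
  )];
  return { emails, phones, socials };
}
```

Each extracted record is then checked against the Google Sheet before appending, which is what keeps the lead list deduplicated across runs.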
by Automate With Marc
Viral Marketing Reel & Autopost with Sora2 + Blotato Create funny, ultra-realistic marketing reels on autopilot using n8n, Sora2, Blotato, and OpenAI. This beginner-friendly template generates a comedic video prompt, creates a 12-second Sora2 video, writes a caption, and auto-posts to Instagram/TikTok — all on a schedule. 🎥 Watch the full step-by-step tutorial: https://www.youtube.com/watch?v=lKZknEzhivo What this template does This workflow automates an entire short-form content production pipeline: Scheduled Trigger: Runs automatically at your chosen time (e.g., every evening at 7PM). AI “Video Prompt Agent”: Creates a cinematic, funny, 12-second Sora2 text-to-video prompt designed to promote a product (default: Sally’s Coffee). Insert Row (Data Table): Logs each generated video prompt for tracking, reuse, or inspiration. Sora2 (via Wavespeed): Sends POST request to generate a video. Waits 30 seconds. Polls the prediction endpoint until the video is completed. Blotato Integration: Uploads the finished video to your connected social account(s). Automatically publishes or schedules the post. Caption Generator: Uses an AI agent to create an Instagram/TikTok-ready caption with relevant hashtags. This turns n8n into a hands-free comedic marketing engine that writes, creates, and posts content for you. Why it’s useful Create daily or weekly marketing reels without filming, editing, or writing scripts. Experiment with new comedic formats, hooks, and product placements in seconds. Perfect for small businesses, agencies, creators, and social media managers. Demonstrates how to combine AI agents + Sora2 + polling + external posting services inside one workflow. Requirements Before running this template, configure: OpenAI API Key (for the prompt agent & caption model) Wavespeed / Sora2 API credentials Blotato account connected to Instagram/TikTok (for posting) n8n Data Table (optional, or replace with your own) ⚠️ All credentials must be added manually after import. 
No real credentials are included in the template. How it works Schedule Trigger Runs at a fixed time or interval. Video Prompt Agent (LangChain Agent) Generates a cinematic, realistic comedic video idea. Built with a detailed system prompt. Ensures brand integration (e.g., Sally’s Coffee) happens naturally. Insert Row (Data Table) Logs each generated prompt so future videos can be referenced or reused. Sora2 POST Request Sends the generated prompt to Sora2 via Wavespeed’s /text-to-video endpoint. Wait 30s + GET Sora2 Result Polls the result until data.status === "completed". Continues looping if still “processing”. Upload Media (Blotato) Uploads the finished video file. Caption Generator Creates a funny, platform-ready Instagram/TikTok caption with hashtags. Create Post (Blotato) Publishes (or schedules) the video + caption. Setup Instructions (Step-by-Step) Import template into n8n. Open Video Prompt Agent → review or customize the brand name, style, humor tone. Add your OpenAI API credentials: For prompt generation For caption generation Add your Wavespeed/Sora2 credentials to the POST and GET nodes. Connect your Blotato credential for uploading and posting. (Optional) Replace the Data Table ID with your own table. Adjust the Schedule Trigger time to your desired posting schedule. Run once manually to confirm: Prompt is generated Video is created Caption is written Video uploads successfully Enable workflow → your daily/weekly comedic autoposter is live. Customization Ideas Change the brand from Sally’s Coffee to any business, product, or influencer brand. Modify the prompt agent to enforce specific camera styles, settings, or comedic tones. Swap posting destinations: Blotato supports multiple networks—configure IG/TikTok/Facebook/YouTube Shorts. Add approval steps: Insert a Slack/Telegram “Approve before posting” step. Add analytics logging: Store video URLs, caption, and AI cost estimate. 
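The wait-and-poll pattern described above (Wait 30s + GET until data.status === "completed") can be sketched as follows. This is a minimal sketch: `fetchStatus` stands in for the real GET request to the Wavespeed/Sora2 prediction endpoint, and the retry budget is an illustrative addition, not part of the template.

```javascript
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

// Poll a prediction until it completes, mirroring the Wait + GET loop
// in the workflow. Throws if the generation fails or the budget runs out.
async function pollUntilComplete(fetchStatus, { intervalMs = 30000, maxAttempts = 20 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const { data } = await fetchStatus();
    if (data.status === "completed") return data; // finished video payload
    if (data.status === "failed") throw new Error("Sora2 generation failed");
    await sleep(intervalMs); // mirrors the 30-second Wait node
  }
  throw new Error("Timed out waiting for Sora2 video");
}
```

Raising `maxAttempts` (or `intervalMs`) is the code-level equivalent of the "increase the wait time or add another polling loop" fix in the Troubleshooting section.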
Troubleshooting Sora video stuck in processing: Increase the wait time or add another polling loop. Upload fails: Ensure media URL exists and Blotato account has posting permissions. Caption empty: Reconnect OpenAI credential or check model availability. Posting fails: Confirm your Blotato API key is valid and linked to a connected account. Category: Marketing, AI Video, Social Media Automation Difficulty: Beginner–Intermediate Core Nodes: LangChain Agent, HTTP Request, Wait, Data Table, Blotato, OpenAI Includes: System prompts, polling logic, caption generator, posting workflow