by Nikan Noorafkan
# 📊 Google Ads + OpenAI + Sheets — Monthly AI Performance Analysis

Automate monthly ad performance insights with AI-powered recommendations.

## 🧩 Overview

This workflow automatically analyzes Google Ads performance every month, using the Google Ads API and OpenAI (GPT-4o) to uncover which ad themes, categories, and messages perform best. It then generates a structured AI report, saves it to Google Sheets, and sends a Slack summary to your marketing team.

💡 Perfect for digital marketers, agencies, and growth analysts who want automated campaign insights without manually crunching numbers.

## ⚙️ Features

✅ Automatically runs on the 1st of each month
✅ Fetches the last 30 days of ad performance via the Google Ads API (GAQL)
✅ Uses GPT-4o for natural-language insights & improvement ideas
✅ Groups ads by category and theme (e.g., “Free Shipping,” “Premium”)
✅ Generates a clean, formatted Markdown report
✅ Archives reports in Google Sheets for trend tracking
✅ Notifies your Slack channel with AI-driven recommendations

## 🧠 Architecture

| Component | Purpose |
| --- | --- |
| n8n | Workflow engine |
| Google Ads API | Source of ad performance data |
| OpenAI (GPT-4o) | Analyzes CTR patterns and writes recommendations |
| Google Sheets | Report archiving and history tracking |
| Slack | Team notifications |

## 🧭 Workflow Logic (Summary)

Monthly Trigger (1st of Month)
⬇️ 1️⃣ **Get Performance Data (Google Ads API):** Fetches 30-day CTR, clicks, and impressions for all responsive search ads.
⬇️ 2️⃣ **Prepare Performance Data:** Groups data by ad group and theme keywords, builds an AI prompt (sketched below).
⬇️ 3️⃣ **AI Agent (LangChain) + GPT-4o:** Analyzes patterns and generates actionable insights.
⬇️ 4️⃣ **Generate Report (Code):** Formats a Markdown report with AI recommendations and KPIs.
⬇️ 5️⃣ **Save to Google Sheets:** Archives results for long-term analytics.
⬇️ 6️⃣ **Send Report to Slack:** Delivers the summary directly to your marketing channel.

## 🔑 Environment Variables

| Variable | Example | Description |
| --- | --- | --- |
| GOOGLE_ADS_CUSTOMER_ID | 123-456-7890 | Google Ads customer account ID |
| GOOGLE_ADS_API_VERSION | v17 | Current Ads API version |
| GOOGLE_SHEET_ID | 1xA1B2c3D4EFgH... | Target spreadsheet ID |
| OPENAI_API_KEY | sk-xxxxx | OpenAI API key for GPT-4o |
| SLACK_WEBHOOK_URL | https://hooks.slack.com/... | Slack incoming webhook |
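To make step 2 concrete, here is a minimal sketch of what the “Prepare Performance Data” Code node could do in n8n, assuming each incoming item carries the GAQL fields queried later in this document; the theme list and field paths are illustrative assumptions, not the template’s exact code:

```javascript
// n8n Code node — hypothetical sketch of "Prepare Performance Data".
// Assumes each input item holds one GAQL result row (field paths assumed).
const THEMES = ['Free Shipping', 'Premium']; // illustrative theme keywords

const groups = {};
for (const { json } of $input.all()) {
  const headlines = (json.adGroupAd?.ad?.responsiveSearchAd?.headlines || [])
    .map((h) => h.text)
    .join(' ');
  const theme = THEMES.find((t) => headlines.includes(t)) || 'Other';
  const key = `${json.adGroup?.name} | ${theme}`;
  groups[key] ??= { clicks: 0, impressions: 0, ads: 0 };
  groups[key].clicks += Number(json.metrics?.clicks || 0);
  groups[key].impressions += Number(json.metrics?.impressions || 0);
  groups[key].ads += 1;
}

// Summarize each ad group/theme bucket as one line of the AI prompt.
const lines = Object.entries(groups).map(([key, g]) => {
  const ctr = g.impressions ? ((g.clicks / g.impressions) * 100).toFixed(2) : '0.00';
  return `${key}: ${ctr}% CTR (${g.ads} ads, ${g.impressions} impressions)`;
});

return [{ json: { prompt: `Analyze this 30-day ad performance:\n${lines.join('\n')}` } }];
```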
## 🔐 Credential Setup

| Service | Type | Required Scopes |
| --- | --- | --- |
| Google Ads | OAuth2 (googleAdsOAuth2Api) | https://www.googleapis.com/auth/adwords |
| OpenAI | API key (openAiApi) | Full access |
| Google Sheets | OAuth2 | https://www.googleapis.com/auth/spreadsheets |
| Slack | Webhook | chat:write |

## 🧱 Node-by-Node Breakdown

| Node | Purpose | Key Configuration |
| --- | --- | --- |
| Monthly Trigger | Starts workflow on the 1st of every month | Cron: 0 0 1 * * |
| Get Performance Data | Queries Ads data | Endpoint: https://googleads.googleapis.com/v17/customers/{id}/googleAds:search; Query: GAQL (CTR, clicks, impressions, last 30 days) |
| Prepare Performance Data | Aggregates and builds the AI prompt | Groups by ad group and theme, computes CTRs |
| AI Agent – Analyze Performance | Passes formatted data to GPT-4o | System message: “You are a Google Ads performance analyst…” |
| OpenAI Chat Model (GPT-4o) | Analytical reasoning engine | Model: gpt-4o, temperature 0.2 |
| Generate Report | Parses AI output, formats the Markdown report | Adds recommendations + next steps |
| Save Report to Sheets | Archives the report | Sheet name: Performance Reports |
| Send Report (Slack) | Sends the summary | Uses the report_markdown variable |

## 🧠 AI Report Example

**30-Day Performance Analysis Report**

**Executive Summary**
- Analyzed: 940 ads
- Period: Last 30 days

**Top Performing Categories**
- Running Shoes: 9.4% CTR (120 ads)
- Fitness Apparel: 8.2% CTR (90 ads)

**Top Performing Themes**
- “Free Shipping” messaging: 9.8% CTR (58 ads)
- “Premium” messaging: 8.5% CTR (44 ads)

**AI-Powered Recommendations**
- [HIGH] Emphasize “Free Shipping” across more ad groups. Expected impact: +5% CTR
- [MEDIUM] Test “Premium Quality” vs. “New Arrivals.” Expected impact: +3% CTR

**Next Steps**
1. Implement new ad variations
2. A/B test messaging
3. Re-analyze next month

## 🧩 Testing Procedure

1️⃣ Temporarily disable the cron trigger.
2️⃣ Run the workflow manually.
3️⃣ Confirm:
- The Google Ads node returns JSON with results.
- The AI Agent output is valid JSON.
- The report is written to Sheets.
- The Slack message is received.
4️⃣ Re-enable the monthly trigger once verified.

## 🧾 Output in Google Sheets

| Date | Ads Analyzed | Top Category | Top Theme | Key Recommendations | Generated At |
| --- | --- | --- | --- | --- | --- |
| 2025-10-01 | 940 | Running Shoes | Free Shipping | “Add Free Shipping copy to 10 ads” | 2025-10-01T00:05Z |

## 🪜 Maintenance

| Frequency | Task |
| --- | --- |
| Monthly | Review AI accuracy and update the themes list |
| Quarterly | Refresh Google Ads API credentials |
| As needed | Update GAQL fields for new metrics |

## ⚙️ API Verification

Endpoint: `POST https://googleads.googleapis.com/v17/customers/{customer_id}/googleAds:search`
Scopes: `https://www.googleapis.com/auth/adwords`

GAQL query:

```sql
SELECT
  ad_group_ad.ad.id,
  ad_group_ad.ad.responsive_search_ad.headlines,
  ad_group.name,
  metrics.impressions,
  metrics.clicks,
  metrics.ctr
FROM ad_group_ad
WHERE segments.date DURING LAST_30_DAYS
  AND metrics.impressions > 100
ORDER BY metrics.clicks DESC
LIMIT 1000
```
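When the HTTP Request node calls this endpoint, the GAQL statement travels in a JSON request body (the REST method also expects your developer token and login customer ID as headers):

```json
{
  "query": "SELECT ad_group_ad.ad.id, ad_group_ad.ad.responsive_search_ad.headlines, ad_group.name, metrics.impressions, metrics.clicks, metrics.ctr FROM ad_group_ad WHERE segments.date DURING LAST_30_DAYS AND metrics.impressions > 100 ORDER BY metrics.clicks DESC LIMIT 1000"
}
```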
✅ Fully valid query — verified for GAQL syntax, fields, and resource joins.
✅ OAuth2 flow handled by n8n’s googleAdsOAuth2Api.
✅ Optional: add "timeout": 60000 for large accounts.

## 📈 Metrics of Success

| KPI | Target |
| --- | --- |
| Report accuracy | ≥ 95% |
| Monthly automation success | ≥ 99% |
| CTR improvement tracking | +3–5% over time |

## 🔗 References

- Google Ads API Docs
- LangChain in n8n
- OpenAI API Reference
- Google Sheets API
- Slack Incoming Webhooks

## 🎯 Conclusion

You now have a fully automated Google Ads performance analysis workflow powered by:

- **Google Ads API** for granular metrics
- **OpenAI GPT-4o** for intelligent recommendations
- **Google Sheets** for archiving
- **Slack** for team-wide updates

💡 Result: a recurring, data-driven optimization loop that improves ad performance every month — with zero manual effort.
by Rahul Joshi
## 📊 Description

Automate your client proposal creation with this intelligent workflow that transforms Google Sheets entries into professional Google Docs proposals using OpenAI GPT-4o. Designed for agencies and sales teams, it delivers personalized, branded, and structured proposals in minutes — no manual editing required. 🚀📄🤖

## What This Template Does

- Triggers when a new row is added in a connected Google Sheet. 📋
- Filters only the latest row to ensure one proposal per new entry. 🔍
- Uses GPT-4o to generate structured proposal content (Executive Summary, Scope, Costing, Timeline, Conclusion). 💡
- Parses output into a validated JSON format for accurate field mapping. ⚙️
- Populates a Google Docs template with AI-generated content using placeholders. 📝
- Downloads the completed proposal as a PDF file. 💾
- Archives the finalized document into a designated Google Drive folder. 📂
- Resets the template for the next proposal cycle automatically. 🔄

## Key Benefits

✅ Eliminates repetitive manual proposal writing.
✅ Ensures brand consistency with structured templates.
✅ Generates high-quality proposals using AI in real time.
✅ Automates document formatting, saving hours per client.
✅ Scales easily for agencies handling multiple clients daily.

## Features

- Google Sheets trigger for new entries.
- GPT-4o-based content generation with customizable prompts.
- JSON output validation and structured parsing.
- Google Docs population using placeholder replacement.
- Drive storage automation for version tracking.
- End-to-end automation from data to proposal delivery.

## Requirements

- Google Sheets document with columns: clientName, jobDescription.
- Google Docs template with placeholders (e.g., {{executive_summary}}, {{scope_of_work}}).
- OpenAI API key (GPT-4o).
- Google Drive credentials for output management.

## Target Audience

- Marketing and web agencies automating client proposal generation.
- Sales teams preparing project estimates and deliverables.
- Freelancers and consultants managing multiple client requests.
- Businesses streamlining documentation workflows.

## Step-by-Step Setup Instructions

1. Connect Google Sheets and replace the Sheet ID placeholder.
2. Set up your Google Docs proposal template and replace the Document ID.
3. Add your OpenAI API key for GPT-4o content generation.
4. Specify your Google Drive folder for saving proposals.
5. Test the workflow with a sample entry to confirm formatting (a sample output shape is sketched below).
6. Activate the workflow for continuous proposal generation. ✅
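For reference, the structured JSON that maps onto the template placeholders could look like the following. This is a hypothetical shape inferred from the section names and placeholder examples above; your prompt and output parser define the actual keys:

```json
{
  "executive_summary": "Acme Corp seeks a redesigned e-commerce platform to improve conversion...",
  "scope_of_work": "Discovery workshop, UX redesign, storefront build, QA, and launch support.",
  "costing": "Fixed fee of $24,000, payable across three milestones.",
  "timeline": "8 weeks from kickoff: 2 weeks discovery, 4 weeks build, 2 weeks QA.",
  "conclusion": "We look forward to partnering with Acme Corp on this project."
}
```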
by Daniel Shashko
## How it Works

This workflow transforms natural language queries into research reports through a five-stage AI pipeline. When triggered via webhook (typically from Google Sheets using the companion google-apps-script.js GitHub gist), it first checks a Redis cache for instant results. For new queries, GPT-4o breaks complex questions into focused sub-queries, optimizes them for search, then uses Bright Data's MCP Tool to find the top 5 credible sources (official sites, news, financial reports). URLs are scraped in parallel, bypassing bot detection. GPT-4o extracts structured data from each source: answers, facts, entities, sentiment, quotes, and dates. GPT-4o-mini validates source credibility and filters unreliable content. Valid results aggregate into a final summary with confidence scores, key insights, and extended analysis. Results are cached for 1 hour and output via webhook, Slack, email, and DataTable—all in 30-90 seconds with 60 requests/minute rate limiting.

## Who is this for?

- Research teams needing automated multi-source intelligence
- Content creators and journalists requiring fact-checked information
- Due diligence professionals conducting competitive intelligence
- Google Sheets power users wanting AI research in spreadsheets
- Teams managing large research volumes needing caching and rate limiting

## Setup Steps

Setup time: 30-45 minutes

Requirements:
- Bright Data account (Web Scraping API + MCP token)
- OpenAI API key (GPT-4o and GPT-4o-mini access)
- Redis instance
- Slack workspace (optional)
- SMTP email provider (optional)
- Google account (optional, for Sheets integration)

Core Setup:
1. Get a Bright Data Web Scraping API token and MCP token
2. Get an OpenAI API key
3. Set up a Redis instance
4. Configure the critical nodes:
   - Webhook Entry: add a Header Auth token
   - Bright Data MCP Tool: add the MCP endpoint with token
   - Parallel Web Scraping: add Bright Data API credentials
   - Redis nodes: add connection credentials
   - All GPT nodes: add the OpenAI API key (5 nodes)
   - Slack/Email: add credentials if using

Google Sheets Integration:
1. Create a Google Sheet
2. Open Extensions → Apps Script
3. Paste the companion google-apps-script.js code
4. Update the webhook URL and auth token
5. Save and authorize

Test:

```json
{"prompt": "What is the population of Tokyo?", "source": "Test", "language": "English"}
```

## Customization Guidance

- **Source Count:** Change from 5 to 3-10 URLs per query
- **Cache Duration:** Adjust from 1 hour to 24 hours for stable info
- **Rate Limits:** Modify 60/minute based on usage needs
- **Character Limits:** Adjust the 400-char main answer to 200-1000
- **AI Models:** Swap GPT-4o for Claude, or use GPT-4o-mini for all stages
- **Geographic Targeting:** Add more regions beyond us/il
- **Output Channels:** Add Notion, Airtable, Discord, Teams
- **Temperature:** Lower (0.1-0.2) for facts, higher (0.4-0.6) for analysis

Once configured, this workflow handles all web research, from fact-checking to complex analysis—delivering validated intelligence in seconds with automatic caching.

Built by Daniel Shashko
Connect on LinkedIn
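A note on the caching layer: the cache-first lookup can be keyed on a hash of the normalized prompt. A minimal sketch for an n8n Code node (the template's actual key scheme isn't specified here, so the key format is an assumption):

```javascript
// Hypothetical cache-key derivation for the Redis lookup (n8n Code node).
const crypto = require('crypto');

const prompt = $json.prompt || '';
// Normalize so trivially different phrasings of the same query share a key.
const normalized = prompt.trim().toLowerCase();
const cacheKey = 'research:' + crypto.createHash('sha256').update(normalized).digest('hex');

// Downstream: a Redis GET on cacheKey short-circuits the pipeline on a hit;
// on a miss, the final summary is stored with SET + a 3600-second TTL (the 1-hour cache).
return [{ json: { ...$json, cacheKey } }];
```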
by Oneclick AI Squad
This is an advanced n8n workflow for transforming product concepts into 3D showcase videos with AI packaging design and auto-rotation rendering.

## 🎯 Core Capabilities

- **AI Product Concept Generation:** Uses Claude Sonnet 4 to analyze product prompts and generate comprehensive 3D specifications
- **Automated Packaging Design:** DALL-E 3 generates professional packaging visuals
- **Texture Map Generation:** Creates PBR-ready texture maps for realistic materials
- **3D Scene Script Generation:** Produces complete Blender Python scripts with:
  - Product geometry based on shape
  - Professional 3-point lighting (key, fill, rim)
  - 360° rotation animation (8 seconds)
  - Camera setup and render settings
- **Preview Rendering:** Generates photorealistic 3D preview images
- **Video Processing:** Handles encoding and upload to video hosting services
- **Database Storage:** Saves all showcase data for tracking
- **Status Monitoring:** Checks render progress with automatic retry logic

## 📋 Required Setup

API credentials needed:

- Anthropic API (for Claude AI)
- OpenAI API (for DALL-E image generation)
- Replicate API (optional, for additional rendering)
- Video hosting service (Cloudflare Stream or similar)
- PostgreSQL database

## 🔧 How to Use

1. **Import the JSON:** Copy the artifact content and import it directly into n8n
2. **Configure Credentials:** Add your API keys in the n8n credentials manager
3. **Activate the Workflow:** Enable the webhook trigger
4. **Send a Request** to the webhook endpoint:

```json
POST /product-showcase
{
  "productPrompt": "A premium organic energy drink in a sleek aluminum can with nature-inspired graphics"
}
```

## 📤 Output Includes

- Product specifications (dimensions, materials, colors)
- Packaging design image URL
- Texture map URLs
- Downloadable Blender script
- 3D preview render
- Video showcase URL
- Rendering metadata
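A completed run might return a payload like the following. This is a hypothetical example assembled from the output list above; the field names and URLs are illustrative, not the workflow's exact schema:

```json
{
  "product": {
    "dimensions": "330ml can, 66mm x 115mm",
    "materials": ["aluminum", "matte varnish"],
    "colors": ["forest green", "cream"]
  },
  "packagingImageUrl": "https://example.com/renders/packaging.png",
  "textureMapUrls": ["https://example.com/textures/albedo.png"],
  "blenderScriptUrl": "https://example.com/scripts/showcase.py",
  "previewRenderUrl": "https://example.com/renders/preview.png",
  "videoUrl": "https://example.com/stream/showcase.mp4",
  "metadata": { "renderStatus": "completed", "durationSeconds": 8 }
}
```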
by Trung Tran
# Multi-Agent Architecture Free Bootstrap Template for Beginners

Free template to learn and reuse a multi-agent architecture in n8n. The company metaphor: a CEO (orchestrator) delegates to Marketing, Operations, and Finance to produce a short sales-season plan, export it to PDF, and share it.

## Who’s it for

- Builders who want a clear, minimal pattern for multi-agent orchestration in n8n.
- Teams demoing/teaching agent collaboration with one coordinator + three specialists.
- Anyone needing a repeatable template to generate plans from multiple “departments”.

## How it works / What it does

1. **Trigger (Manual):** Click Execute workflow to start.
2. **Edit Fields:** Provide brief inputs (company, products, dates, constraints, channels, goals).
3. **CEO Agent (Orchestrator):** Reads the brief, calls the 3 tool agents once, merges results, resolves conflicts.
4. **Marketing Agent:** Proposes top campaigns + channels + content calendar.
5. **Operations Agent:** Outlines inventory/staffing readiness, fulfillment steps, risks.
6. **Finance Agent:** Suggests pricing/discounts, budget split, targets.
7. **Compose Document:** CEO produces Markdown; a node converts it to a Google Doc → PDF.
8. **Share:** Upload the PDF to Slack (or Drive) for review.

## Outputs

- **Markdown plan** with sections (Summary, Timeline, Marketing, Ops, Pricing, Risks, Next Actions).
- **Compact JSON** for automation (campaigns, budget, dates, actions); a sample shape is sketched below.
- **PDF** file for stakeholders.

## How to set up

1. Add credentials:
   - OpenAI (or your LLM provider) for all agents.
   - Google (Drive/Docs) to create the document and export the PDF.
   - Slack (optional) to upload/share the PDF.
2. Map nodes (suggested):
   - When clicking ‘Execute workflow’ → Edit Fields (form with: company, products, audience, start_date, end_date, channels, constraints, metrics).
   - CEO Agent (AI Tool Node) → calls Marketing Agent, Operations Agent, Finance Agent (AI Tool Nodes).
   - Configure metadata (doc title from company + window).
   - Create document file (Google Docs API) with the CEO’s Markdown.
   - Convert to PDF (export).
   - Upload a file (Slack) to share.
3. Prompts (drop-in):
   - CEO (system): orchestrate 3 tools; request concise JSON+Markdown; merge & resolve; output sections + JSON.
   - Marketing / Operations / Finance (system): each returns a small JSON per its scope (campaigns/calendar; staffing/steps/risks; discounts/budget/targets).
4. Test: run once; verify the PDF and Slack message.

## Requirements

- n8n (current version with the AI Tool Node).
- LLM credentials (e.g., OpenAI).
- Google credentials for Docs/Drive (to create & export).
- Optional Slack bot token for file uploads.

## How to customize the workflow

- **Swap roles:** Replace departments (e.g., Product, Legal, Support) or add more tool agents.
- **Change outputs:** Export to DOCX/HTML/Notion; add a cover page; attach brand styles.
- **Approval step:** Insert a Slack “Send & Wait” before PDF generation for review/edits.
- **Data grounding:** Add RAG (Sheets/DB/Docs) so agents cite inventory, pricing, or past campaign KPIs.
- **Automation JSON:** Extend the schema to match your CRM/PM tool and push next_actions into Jira/Asana.
- **Scheduling:** Replace the manual trigger with a cron (weekly/monthly planning).
- **Localization:** Add a Translation agent or set the language via an input field.
- **Guardrails:** Add length limits, cost caps (max tokens), and validation on agent JSON.
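As referenced in the Outputs list, the compact automation JSON might look like this hypothetical example, built from the fields named above (campaigns, budget, dates, actions); extend the schema to match your CRM/PM tool:

```json
{
  "campaigns": [
    { "name": "Holiday Early-Bird", "channel": "email", "start": "2025-11-15", "end": "2025-11-30" }
  ],
  "budget": { "marketing": 6000, "operations": 3000, "contingency": 1000 },
  "dates": { "season_start": "2025-11-15", "season_end": "2025-12-31" },
  "next_actions": [
    { "owner": "Marketing", "task": "Draft email sequence", "due": "2025-11-10" }
  ]
}
```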
by Cheng Siong Chin
## How It Works

A scheduled trigger initiates automated retrieval of TOTO/4D data, including both current and historical records. The datasets are merged and validated to ensure structural consistency before branching into parallel analytical pipelines. One track performs pattern mining and anomaly detection, while the other generates statistical and time-series forecasts. Results are then routed to an AI agent that integrates multi-model insights, evaluates prediction confidence, and synthesizes the final output. The system formats the results and delivers them through the selected export channel.

## Setup Instructions

1. **Scheduler Config:** Adjust the trigger frequency (daily or weekly).
2. **Data Sources:** Configure API endpoints or database connectors for TOTO/4D retrieval.
3. **Data Mapping:** Align and map column structures for both the TOTO and 4D datasets in the merge nodes (a sketch follows below).
4. **AI Integration:** Insert the OpenAI API key and connect the required model nodes.
5. **Export Paths:** Select and configure output channels (email, Google Sheets, webhook, or API).

## Prerequisites

- TOTO/4D historical data source with API access
- OpenAI API key (GPT-4 recommended)
- n8n environment with HTTP/database connectivity
- Basic time-series analysis knowledge

## Use Cases

- **Traders:** Pattern recognition for draw prediction with confidence scoring
- **Analysts:** Multivariate forecasting across cycles with validation

## Customization

- **Data:** Swap TOTO/4D with stock prices, crypto, sensors, or any time series
- **Models:** Replace OpenAI with Claude, local LLMs, or Hugging Face models

## Benefits

- **Automation:** Runs 24/7 without manual intervention
- **Intelligence:** The ensemble approach prevents overfitting and single-model bias
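The merge-and-validate stage could be implemented in an n8n Code node roughly as follows. This is a minimal sketch assuming each record exposes a drawDate and a numbers array, and that the upstream fetch nodes are named as shown; all of these names are assumptions, not the template's actual configuration:

```javascript
// Hypothetical merge + validation of current and historical draw records.
const current = $('Fetch Current Draws').all().map((i) => i.json);
const history = $('Fetch Historical Draws').all().map((i) => i.json);

const seen = new Set();
const merged = [];
for (const row of [...history, ...current]) {
  // Structural validation: require a draw date and a numbers array.
  if (!row.drawDate || !Array.isArray(row.numbers)) continue;
  const key = `${row.drawDate}:${row.numbers.join(',')}`;
  if (seen.has(key)) continue; // drop duplicates across the two sources
  seen.add(key);
  merged.push(row);
}

// Sort chronologically so downstream time-series nodes see an ordered series.
merged.sort((a, b) => new Date(a.drawDate) - new Date(b.drawDate));
return merged.map((json) => ({ json }));
```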
by Juan Carlos Cavero Gracia
This workflow transforms any video you drop into a Google Drive folder into a ready-to-publish YouTube upload. It analyzes the video with AI to craft 3 high-CTR title ideas, 3 long SEO-friendly descriptions (with timestamps), and 10–15 optimized tags. It then generates 4 thumbnail options using your face and lets you pick your favorite before auto-publishing to YouTube via Upload-Post.

## Who Is This For?

- **YouTube Creators & Editors:** Ship videos with winning titles, thumbnails, and SEO in minutes.
- **Agencies & Media Teams:** Standardize output and speed across channels and clients.
- **Founders & Solo Makers:** Maintain consistent publishing with minimal manual work.

## What Problem Does It Solve?

Producing SEO metadata and high-performing thumbnails is slow and inconsistent. This flow:

- **Generates High-CTR Options:** 3 distinct angles for title/description/tags.
- **Creates Thumbnails with Your Face:** 4 options ready for review in one pass.
- **Auto-Publishes Safely:** Human selection gates reduce risk before going live.

## How It Works

1. **Google Drive Trigger:** Watches a folder for new video files.
2. **AI Video Analysis (Gemini):** Produces an in-depth Spanish description and timestamps.
3. **Concept Generation:** Returns 3 JSON concepts (title, thumbnail prompt, description, tags); a sample shape is sketched below.
4. **User Review #1:** Pick your favorite concept in a simple form.
5. **Thumbnail Generation (fal.ai):** Creates 4 thumbnails using your face (provided image URL).
6. **User Review #2:** Choose the best thumbnail.
7. **Upload to YouTube (Upload-Post):** Publishes the video with your chosen title, description, tags, and thumbnail.

## Setup

1. **Credentials** (all offer free trials, no credit card required):
   - Google Gemini (chat/vision for analysis)
   - fal.ai API (thumbnail generation)
   - Upload-Post (connect your YouTube channel and generate API keys)
   - Google Drive OAuth (folder watch + file download)
2. **Provide Your Face Image URL(s):** Used by fal.ai to integrate your face into thumbnails.
3. **Select the Google Drive Folder:** Where you’ll drop videos to process.
4. **Pick & Publish:** Use the built-in forms to choose a concept and thumbnail.

## Requirements

- **Accounts:** Google (Drive + Gemini), fal.ai, Upload-Post, n8n.
- **API Keys:** Gemini, fal.ai; Upload-Post credentials; Google Drive OAuth.
- **Assets:** At least one clear face image for thumbnails.

## Features

- **Three SEO Angles:** Distinct title/description sets to test different intents.
- **Rich Descriptions with Timestamps:** Ready for YouTube SEO and viewer navigation.
- **Face-Integrated Thumbnails:** 4 options aligned with the selected title.
- **Human-in-the-Loop Controls:** Approve concepts and thumbnails before publishing.
- **Auto-Publish via Upload-Post:** One click to push live to YouTube.
- **Start Free:** All API calls can run on free trials, no credit card required.

## Video Demo

https://www.youtube.com/watch?v=EOOgFveae-U
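Each of the three concepts from step 3 is a JSON object. Here is a hypothetical shape inferred from the fields listed there; the workflow's exact keys may differ:

```json
{
  "title": "I Automated My Entire YouTube Channel with n8n",
  "thumbnail_prompt": "Close-up of the creator's face with a surprised expression, glowing workflow diagram in the background",
  "description": "In this video we build a full publishing pipeline...\n00:00 Intro\n02:15 Setup\n08:40 Results",
  "tags": ["n8n", "automation", "youtube seo"]
}
```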
by Jameson Kanakulya
# Automated Content Page Generator with AI, Tavily Research, and Supabase Storage

> ⚠️ Self-Hosted Disclaimer: This template requires a self-hosted n8n installation and external service credentials (OpenAI, Tavily, Google Drive, NextCloud, Supabase). It cannot run on n8n Cloud due to dependency requirements.

## Overview

Transform simple topic inputs into professional, multi-platform content automatically. This workflow combines AI-powered content generation with intelligent research and seamless storage integration to create website content, blog articles, and landing pages optimized for different audiences.

## Key Features

- **Automated Research**: Uses Tavily's advanced search to gather relevant, up-to-date information
- **Multi-Platform Content**: Generates optimized content for websites, blogs, and landing pages
- **Image Management**: Downloads from Google Drive and uploads to NextCloud with public URL generation
- **Database Integration**: Stores all content in Supabase for easy retrieval
- **Error Handling**: Built-in error management workflow for reliability
- **Content Optimization**: AI-driven content strategy with trend analysis and SEO optimization

## Required Services & APIs

Core services:

- **n8n**: Self-hosted instance (required)
- **OpenAI**: GPT-4 API access for content generation
- **Tavily**: Research API for content discovery
- **Google Drive**: Image storage and retrieval
- **Google Sheets**: Content input and workflow triggering
- **NextCloud**: Image hosting and public URL generation
- **Supabase**: Database storage for generated content

## Setup Instructions

### Prerequisites

Before setting up this workflow, ensure you have:

- A self-hosted n8n installation
- API credentials for all required services
- The database table created in Supabase

### Step 1: Service Account Configuration

**OpenAI Setup**
1. Create an OpenAI account at platform.openai.com
2. Generate an API key from the API Keys section
3. In n8n, create new OpenAI credentials using your API key
4. Test the connection to ensure GPT-4 access

**Tavily Research Setup**
1. Sign up at tavily.com
2. Get your API key from the dashboard
3. Add Tavily credentials in n8n
4. Configure search depth to "advanced" for best results

**Google Services Setup**
1. Create a Google Cloud project
2. Enable the Google Drive API and Google Sheets API
3. Create OAuth2 credentials
4. Configure Google Drive and Google Sheets credentials in n8n
5. Share your input spreadsheet with the service account

**NextCloud Setup**
1. Install NextCloud or use a hosted solution
2. Create an application password for API access
3. Configure NextCloud credentials in n8n
4. Create an /images/ folder for content storage

**Supabase Setup**
1. Create a Supabase project at supabase.com
2. Create a table with the following structure:

```sql
CREATE TABLE works (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  image_url TEXT,
  category TEXT,
  created_at TIMESTAMP DEFAULT NOW()
);
```

3. Get the project URL and service key from settings
4. Configure Supabase credentials in n8n

### Step 2: Google Sheets Input Setup

Create a Google Sheets document with the following columns:

- **TITLE**: Topic or title for content generation
- **IMAGE_URL**: Google Drive sharing URL for the associated image

Example format:

| TITLE | IMAGE_URL |
| --- | --- |
| AI Chatbot Implementation | https://drive.google.com/file/d/your-file-id/view |
| Digital Marketing Trends 2024 | https://drive.google.com/file/d/another-file-id/view |

### Step 3: Workflow Import and Configuration

1. Import the workflow JSON into your n8n instance
2. Configure all credential connections:
   - Link OpenAI credentials to the "OpenAI_GPT4_Model" node
   - Link Tavily credentials to the "Tavily_Research_Agent" node
   - Link Google credentials to the "Google_Sheets_Trigger" and "Google_Drive_Image_Downloader" nodes
   - Link NextCloud credentials to the "NextCloud_Image_Uploader" and "NextCloud_Public_URL_Generator" nodes
   - Link Supabase credentials to the "Supabase_Content_Storage" node
3. Update the Google Sheets Trigger node:
   - Set your spreadsheet ID in the documentId field
   - Configure the polling frequency (default: every minute)
4. Test each node connection individually before activating

### Step 4: Error Handler Setup (Optional)

The workflow references an error handler workflow (GWQ4UI1i3Z0jp3GF). Either:

- Create a simple error notification workflow with this ID,
- Remove the error handling references if not needed, or
- Update the workflow ID to match your error handler

### Step 5: Workflow Activation

1. Save all node configurations
2. Test the workflow with a sample row in your Google Sheet
3. Verify content generation and storage in Supabase
4. Activate the workflow for continuous monitoring

## How It Works

### Workflow Process

1. **Trigger**: Google Sheets monitors for new rows with content topics
2. **Research**: Tavily searches for 3 relevant articles about the topic
3. **Content Generation**: The AI agent creates multi-platform content (website, blog, landing page)
4. **Content Cleaning**: Text processing removes formatting artifacts
5. **Image Processing**: Downloads the image from Google Drive, uploads it to NextCloud
6. **URL Generation**: Creates public sharing links for images
7. **Storage**: Saves the final content package to the Supabase database

### Content Output Structure

Each execution generates:

- **Optimized Title**: SEO-friendly, platform-appropriate headline
- **Multi-Platform Content**:
  - Website content (professional, authority-building)
  - Blog content (educational, SEO-optimized)
  - Landing page content (conversion-focused)
- **Category Classification**: Automated content categorization
- **Image Assets**: Processed and publicly accessible images

## Customization Options

### Content Strategy Modification

- Edit the AI agent's system message to change the content style
- Adjust character limits for different platform requirements
- Modify category classifications for your industry

### Research Parameters

- Change Tavily search depth (basic, advanced)
- Adjust the number of research sources (1-10)
- Modify the search topic focus

### Storage Configuration

- Update the Supabase table structure for additional fields
- Change the NextCloud folder organization
- Modify image naming conventions

## Troubleshooting

### Common Issues

**Workflow not triggering:**
- Check Google Sheets permissions
- Verify polling frequency settings
- Ensure the spreadsheet format matches the requirements

**Content generation errors:**
- Verify the OpenAI API key and credits
- Check GPT-4 model access
- Review the system message formatting

**Image processing failures:**
- Confirm Google Drive sharing permissions
- Check NextCloud storage space and permissions
- Verify that file formats are supported

**Database storage issues:**
- Validate the Supabase table structure
- Check API key permissions
- Review the field mapping in the storage node

### Performance Optimization

- Adjust polling frequency based on your content volume
- Monitor API usage to stay within limits
- Consider batch processing for high-volume scenarios

## Support and Updates

This template is designed for self-hosted n8n environments and requires technical setup. For issues:

- Check the n8n community forums
- Review service-specific documentation
- Test individual nodes in isolation
- Monitor execution logs for detailed error information
by aditya vadaganadam
This n8n template turns chat questions into structured financial reports using Gemini and posts them to a Discord channel via webhook. Ask about tickers, sectors, or theses (e.g., “NVDA long‑term outlook?” or “Gold ETF short‑term drivers?”) and receive a concise, shareable report.

## Good to know

- Not financial advice: use for insights only; verify independently.
- Model availability can vary by region. If you see “model not found,” it may be geo‑restricted.
- Costs depend on model and tokens. Check current Gemini pricing for updates.
- Discord messages are limited to ~2000 characters per post; long reports may need splitting (a sketch follows below).
- Rate limits: Discord webhooks are rate‑limited; add short waits for bursts.

## How it works

1. **Chat Trigger** collects the user’s question (public chat supported when the workflow is activated).
2. **Conversation Memory** keeps a short window of recent messages to maintain context.
3. **Connect Gemini** provides the LLM (e.g., gemini‑2.5‑flash‑lite) and parameters (temperature, tokens).
4. **Agent (agent1)** applies a financial-analysis system message to produce structured insights.
5. **Structured Output Parser** enforces a simple JSON schema: idea (one‑line thesis) + analysis (Markdown sections).
6. **Code** formats a Discord‑ready Markdown report (title, question, executive summary, sections, disclaimer).
7. **Edit Fields** maps the formatted report to a clean content field.
8. **Discord Webhook** posts the final report to your channel.

## How to use

- Start with the built‑in Chat Trigger: click Open chat, ask a question, and verify the Discord post.
- Replace or augment with a Cron or Webhook trigger for scheduled or programmatic runs.
- For richer context, add HTTP Request nodes (prices, news, filings) and pass summaries to the agent.

## Requirements

- n8n instance with internet access
- Google AI (Gemini) API key
- Discord server with a webhook URL

## Customising this workflow

- **System Message:** Adjust tone, depth, risk profile, and required sections (Summary, Drivers, Risks, Metrics, Next Steps, Takeaway).
- **Model settings:** Switch models or tune temperature/tokens in Connect Gemini.
- **Schema:** Extend the parser and formatter with fields like drivers[], risks[], or metrics{}.
- **Formatting:** Edit the Code node to change headings, emojis, disclaimers, or add timestamps.
- **Operations:** Add retries, message splitting for long outputs, and rate‑limit handling for Discord.
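Because Discord caps webhook posts at roughly 2000 characters, the Code node can split long reports at paragraph boundaries before posting. A minimal sketch, assuming the formatted report sits in a content field:

```javascript
// Split a long Markdown report into Discord-sized messages (n8n Code node).
const LIMIT = 1900; // headroom under Discord's ~2000-character cap
const report = $json.content || ''; // assumed field holding the formatted report

// Break into paragraphs, hard-splitting any single paragraph over the limit.
const pieces = report.split('\n\n').flatMap((p) => {
  const out = [];
  for (let i = 0; i < p.length; i += LIMIT) out.push(p.slice(i, i + LIMIT));
  return out.length ? out : [p];
});

// Greedily pack paragraphs into chunks that stay under the limit.
const chunks = [];
let current = '';
for (const piece of pieces) {
  if (current && current.length + piece.length + 2 > LIMIT) {
    chunks.push(current);
    current = piece;
  } else {
    current = current ? current + '\n\n' + piece : piece;
  }
}
if (current) chunks.push(current);

// One n8n item per message; post them in order, with a short Wait between
// posts to respect webhook rate limits.
return chunks.map((content) => ({ json: { content } }));
```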
by DIGITAL BIZ TECH
# AI-Powered Website Chatbot with Google Drive Knowledge Base

## Overview

This workflow combines website chatbot intelligence with automated document ingestion and vectorization — enabling live Q&A from both chat input and processed Google Drive files. It uses Mistral AI for OCR + embeddings, and Qdrant for vector search.

## Chatbot Flow

- **Trigger:** When chat message received, or a webhook, depending on how the chatbot is deployed
- **Model:** OpenAI gpt-4.1-mini
- **Memory:** Simple Memory (Buffer Window)
- **Vector Search Tool:** Qdrant Vector Store
- **Embeddings:** Mistral Cloud
- **Agent:** website chat agent
  - Responds based on chatdbtai Supabase content
  - Enforces brand tone and surfaces informative documents
  - Integrates with both the embedded chat UI and the webhook

## Document → Knowledge Base Pipeline

Triggered manually to keep the vector store up-to-date.

Steps:

1. **Google Drive (brand folder)** → Fetch files from the folder Website kb (ID: 1o3DK9Ceka5Lqb8irvFSfEeB8SVGG_OL7)
2. **Loop Over Items** → For each file:
   - Set metadata
   - Download the file
   - Upload to Mistral for OCR
   - Get a signed URL
   - Run OCR extraction (mistral-ocr-latest)
   - If OCR succeeds → pass to the chunking pipeline; else → skip and continue
3. **Chunking Logic (Code node)**
   - Splits the document into 1,000-character JSON chunks (a sketch follows below)
   - Adds metadata (source, char positions, file ID)
4. **Default Data Loader + Text Splitter** → Prepares chunks for embedding
5. **Embeddings (Mistral Cloud)** → Generates embeddings for the text chunks
6. **Qdrant Vector Store (Insert mode)** → Saves embeddings into the docragtestkb collection
7. **Wait** → Optional delay between batches

## Integrations Used

| Service | Purpose | Credential |
| --- | --- | --- |
| Google Drive | File source | Google Drive account 6 rn dbt |
| Mistral Cloud | OCR + embeddings | Mistral Cloud account 2 dbt rn |
| Qdrant | Vector storage | QdrantApi account |
| OpenAI | Chat model | OpenAi account 8 dbt digi |

## Agent System Prompt Summary

> “You are the official AI assistant for this website. Use chatdbtai only as your knowledge source. Respond conversationally, list offerings clearly, link blogs, and say ‘I couldn’t find that on this site’ if no match.”

## Key Features

✅ Automated OCR + chunking → vectorization
✅ Persistent memory for chat sessions
✅ Multi-channel (webhook + embedded chat)
✅ Fully brand-guided, structured responses
✅ Live data retrieval from the Qdrant vector store

## Summary

> A unified workflow that turns brand files + web content into a knowledge base that powers an intelligent chatbot — capable of responding to visitors in real time, powered by Mistral, OpenAI, and Qdrant.

## Need Help or More Workflows?

Want to customize this workflow for your business or integrate it with your existing tools? Our team at Digital Biz Tech can tailor it precisely to your use case, from automation logic to AI-powered enhancements.

💡 We can help you set it up for free — from connecting credentials to deploying it live.

- Contact: shilpa.raju@digitalbiz.tech
- Website: https://www.digitalbiz.tech
- LinkedIn: https://www.linkedin.com/company/digital-biz-tech/

You can also DM us on LinkedIn for any help.
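For those customizing the pipeline, the chunking Code node from step 3 can be sketched as below, assuming the OCR text, file ID, and file name arrive in the fields shown; the template's actual field names are not specified here:

```javascript
// Hypothetical 1,000-character chunker with source metadata (n8n Code node).
const CHUNK_SIZE = 1000;
const text = $json.ocrText || ''; // assumed field carrying the OCR output
const fileId = $json.fileId;      // assumed file identifier from earlier nodes
const source = $json.fileName;    // assumed original file name

const chunks = [];
for (let start = 0; start < text.length; start += CHUNK_SIZE) {
  const end = Math.min(start + CHUNK_SIZE, text.length);
  chunks.push({
    json: {
      text: text.slice(start, end),
      metadata: { source, fileId, charStart: start, charEnd: end },
    },
  });
}
return chunks;
```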
by Trung Tran
# Create an AI-Powered Chatbot for Candidate Evaluation on Slack

> This workflow connects a Slack chatbot with AI agents and Google Sheets to automate candidate resume evaluation. It extracts resume details, identifies the applied job from the message, fetches the correct job description, and provides a summarized evaluation via Slack and a tracking sheet. Perfect for HR teams using Slack.

## Who’s it for

This workflow is designed for:

- **HR Teams**, **Recruiters**, and **Hiring Managers**
- Working in software or tech companies using Slack, Google Sheets, and n8n
- Who want to automate candidate evaluation based on uploaded profiles and applied job positions

## How it works / What it does

This workflow is triggered when a Slack user mentions the HR bot and attaches a candidate profile PDF. It performs the following steps:

1. **Trigger from Slack Mention**
   A user mentions the bot in Slack with a message like: `@HRBot Please evaluate this candidate for the AI Engineer role.` (with a PDF attached; a simplified event payload is sketched below)
2. **Input Validation**
   If no file is attached, the bot replies: "Please upload the candidate profile file before sending the message."
3. **Extract Candidate Profile**
   - Downloads the attached PDF from Slack
   - Uses Extract from File to parse the resume into text
4. **Profile Analysis (AI Agent)**
   - Sends the resume text and message to the Profile Analyzer Agent
   - Identifies the candidate’s name, email, and summary, plus the applied position (from the message)
   - Looks up the job description PDF URL in Google Sheets
5. **Job Description Retrieval**
   Downloads and parses the matching JD PDF
6. **HR Evaluation (AI Agent)**
   - Sends both the candidate profile and job description to the HR Expert Agent
   - Receives a summarized fit evaluation and insights
7. **Output and Logging**
   - Sends the evaluation result back to Slack in the original thread
   - Updates a Google Sheet with evaluation data for tracking

## How to set up

1. **Slack Setup**
   - Create a Slack bot and install it into your workspace
   - Enable the app_mention event and generate a bot token
   - Connect Slack to n8n using Slack Bot credentials
2. **Google Sheets Setup**
   - Create a sheet mapping Position Title → Job Description URL
   - Create another sheet for logging evaluation results
3. **n8n Setup**
   - Add a Webhook Trigger for Slack mentions
   - Connect Slack, Google Sheets, and GPT-4 credentials
   - Set up the agents (Profile Analyzer Agent, HR Expert Agent) with appropriate prompts
4. **Deploy & Test**
   - Mention your bot in Slack with a message and file
   - Confirm the reply and the entry in the evaluation tracking sheet

## Requirements

- n8n (self-hosted or cloud)
- Slack App with Bot Token
- OpenAI or Azure OpenAI account (for GPT-4)
- Google Sheets (2 sheets: job mapping + evaluation log)
- Candidate profiles in PDF format
- Defined job titles and descriptions

## How to customize the workflow

You can easily adapt this workflow to your team’s needs:

| Customization Area | How to Customize |
| --- | --- |
| Job Mapping Source | Replace the Google Sheet with Airtable or a Notion DB |
| JD Format | Use Markdown or inline descriptions instead of PDF |
| Evaluation Output Format | Change from Slack message to Email or Notion update |
| HR Agent Prompt | Customize to match your company tone or include scoring rubrics |
| Language Support | Add support for bilingual input/output (e.g., Vietnamese & English) |
| Workflow Trigger | Trigger from a slash command or form instead of @mention |
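For reference, the inbound Slack app_mention event that step 1 consumes looks roughly like this simplified example; note that depending on your event subscriptions, the files array may arrive on the accompanying message event rather than the mention itself, so verify against Slack's Events API documentation:

```json
{
  "type": "app_mention",
  "user": "U061F7AUR",
  "text": "<@U0HRBOT> Please evaluate this candidate for the AI Engineer role.",
  "channel": "C0HIRING",
  "ts": "1727712000.000100",
  "files": [
    {
      "id": "F12345",
      "name": "candidate_cv.pdf",
      "mimetype": "application/pdf",
      "url_private_download": "https://files.slack.com/..."
    }
  ]
}
```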
by Guillaume Duvernay
This template introduces a revolutionary approach to automated web research. Instead of a rigid workflow that can only find one type of information, this system uses a "thinker" and "doer" AI architecture. It dynamically interprets your plain-English research request, designs a custom spreadsheet (CSV) with the perfect columns for your goal, and then deploys a web-scraping AI to fill it out. It's like having an expert research assistant who not only finds the data you need but also builds the perfect container for it on the fly.

Whether you're looking for sales leads, competitor data, or market trends, this workflow adapts to your request and delivers a perfectly structured, ready-to-use dataset every time.

## Who is this for?

- **Sales & marketing teams:** Generate targeted lead lists, compile competitor analysis, or gather market intelligence with a simple text prompt.
- **Researchers & analysts:** Quickly gather and structure data from the web for any topic without needing to write custom scrapers.
- **Entrepreneurs & business owners:** Perform rapid market research to validate ideas, find suppliers, or identify opportunities.
- **Anyone who needs structured data:** Transform unstructured, natural language requests into clean, organized spreadsheets.

## What problem does this solve?

- **Eliminates rigid, single-purpose workflows:** This workflow isn't hardcoded to find just one thing. It dynamically adapts its entire research plan and data structure based on your request.
- **Automates the entire research process:** It handles everything from understanding the goal and planning the research to executing the web search and structuring the final data.
- **Bridges the gap between questions and data:** It translates your high-level goal (e.g., "I need sales leads") into a concrete, structured spreadsheet with all the necessary columns (Company Name, Website, Key Contacts, etc.).
- **Optimizes for cost and efficiency:** It intelligently uses a combination of deep-dive and standard web searches from Linkup.so to gather high-quality initial results and then enrich them cost-effectively.

## How it works (The "Thinker & Doer" Method)

The process is cleverly split into two main phases:

1. **The "Thinker" (AI Planner):**
   - You submit a research request via the built-in form (e.g., "Find 50 US-based fashion companies for a sales outreach campaign").
   - The first AI node acts as the "thinker." It analyzes your request and determines the optimal structure for your final spreadsheet.
   - It dynamically generates a plan, which includes a discoveryQuery to find the initial list, an enrichmentQuery to get details for each item, and the JSON schemas that define the exact columns for your CSV (a sample plan is sketched below).
2. **The "Doer" (AI Researcher):** The rest of the workflow is the "doer," which executes the plan.
   - **Discovery:** It uses a powerful "deep search" with Linkup.so to execute the discoveryQuery and find the initial list of items (e.g., the 50 fashion companies).
   - **Enrichment:** It then loops through each item in the list. For each one, it performs a fast and cost-effective "standard search" with Linkup to execute the enrichmentQuery, filling in all the detailed columns defined by the "thinker."
   - **Final Output:** The workflow consolidates all the enriched data and converts it into a final CSV file, ready for download or further processing.

## Setup

1. **Connect your AI provider:** In the OpenAI Chat Model node, add your AI provider's credentials.
2. **Connect your Linkup account:** In the two Linkup (HTTP Request) nodes, add your Linkup API key (free account at linkup.so). We recommend creating a "Generic Credential" of type "Bearer Token" for this. Linkup offers €5 of free credits monthly, which is enough for 1k standard searches or 100 deep queries.
3. **Activate the workflow:** Toggle the workflow to "Active." You can now use the form to submit your first research request!

## Taking it further

- **Add a custom dashboard:** Replace the form trigger and final CSV output with a more polished user experience. For example, build a simple web app where users can submit requests and download their completed research files.
- **Make it company-aware:** Modify the "thinker" AI's prompt to include context about your company. This will allow it to generate research plans that are automatically tailored to finding leads or data relevant to your specific products and services.
- **Add an AI summary layer:** After the CSV is generated, add a final AI node to read the entire file and produce a high-level summary, such as "Here are the top 5 leads to contact first and why," turning the raw data into an instant, actionable report.
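A "thinker" plan for the fashion-companies example might look like the following; this is a hypothetical shape built from the fields named above (discoveryQuery, enrichmentQuery, and the CSV column schema), not the template's exact schema:

```json
{
  "discoveryQuery": "List 50 US-based fashion companies suitable for B2B sales outreach",
  "enrichmentQuery": "For {companyName}, find the official website, headquarters city, employee count, and a key sales contact",
  "csvSchema": {
    "columns": ["Company Name", "Website", "HQ City", "Employee Count", "Key Contact"]
  }
}
```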