by Cheng Siong Chin
## How It Works
A scheduled trigger initiates automated retrieval of TOTO/4D data, including both current and historical records. The datasets are merged and validated for structural consistency before branching into parallel analytical pipelines: one track performs pattern mining and anomaly detection, while the other generates statistical and time-series forecasts. Results are then routed to an AI agent that integrates the multi-model insights, evaluates prediction confidence, and synthesizes the final output. The system formats the results and delivers them through the selected export channel.

## Setup Instructions
1. **Scheduler Config**: Adjust the trigger frequency (daily or weekly).
2. **Data Sources**: Configure API endpoints or database connectors for TOTO/4D retrieval.
3. **Data Mapping**: Align and map column structures for both TOTO and 4D datasets in the merge nodes.
4. **AI Integration**: Insert the OpenAI API key and connect the required model nodes.
5. **Export Paths**: Select and configure output channels (email, Google Sheets, webhook, or API).

## Prerequisites
- TOTO/4D historical data source with API access
- OpenAI API key (GPT-4 recommended)
- n8n environment with HTTP/database connectivity
- Basic time-series analysis knowledge

## Use Cases
- **Traders**: Pattern recognition for draw prediction with confidence scoring
- **Analysts**: Multivariate forecasting across cycles with validation

## Customization
- **Data**: Swap TOTO/4D for stock prices, crypto, sensors, or any time series
- **Models**: Replace OpenAI with Claude, local LLMs, or Hugging Face models

## Benefits
- **Automation**: Runs 24/7 without manual intervention
- **Intelligence**: The ensemble approach reduces overfitting and single-model bias
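To make the statistical track concrete, here is a minimal n8n Code-node sketch that computes per-number draw frequencies from the merged historical records and flags outliers with a simple z-score check. The input item shape (a `numbers` array per draw) is an assumption; adapt the field names to your data source.

```javascript
// n8n Code node — minimal sketch of the statistical track.
// Assumes each incoming item is one historical draw, e.g.
// { draw_date: "2024-01-01", numbers: [3, 11, 24, 30, 41, 47] } (assumed shape).
const counts = {};
let totalDraws = 0;

for (const item of $input.all()) {
  totalDraws++;
  for (const n of item.json.numbers) {
    counts[n] = (counts[n] || 0) + 1;
  }
}

// Flag numbers whose observed frequency deviates strongly from the mean
// (a descriptive anomaly check over past draws, not a prediction of future ones).
const freqs = Object.values(counts);
const mean = freqs.reduce((a, b) => a + b, 0) / freqs.length;
const std = Math.sqrt(freqs.reduce((a, f) => a + (f - mean) ** 2, 0) / freqs.length);

return Object.entries(counts).map(([number, count]) => ({
  json: {
    number: Number(number),
    count,
    frequency: count / totalDraws,
    zScore: std ? (count - mean) / std : 0,
    anomalous: std ? Math.abs(count - mean) / std > 2 : false,
  },
}));
```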
by Jameson Kanakulya
# Automated Content Page Generator with AI, Tavily Research, and Supabase Storage

> ⚠️ **Self-Hosted Disclaimer**: This template requires a self-hosted n8n installation and external service credentials (OpenAI, Tavily, Google Drive, NextCloud, Supabase). It cannot run on n8n Cloud due to dependency requirements.

## Overview
Transform simple topic inputs into professional, multi-platform content automatically. This workflow combines AI-powered content generation with intelligent research and seamless storage integration to create website content, blog articles, and landing pages optimized for different audiences.

## Key Features
- **Automated Research**: Uses Tavily's advanced search to gather relevant, up-to-date information
- **Multi-Platform Content**: Generates optimized content for websites, blogs, and landing pages
- **Image Management**: Downloads from Google Drive and uploads to NextCloud with public URL generation
- **Database Integration**: Stores all content in Supabase for easy retrieval
- **Error Handling**: Built-in error management workflow for reliability
- **Content Optimization**: AI-driven content strategy with trend analysis and SEO optimization

## Required Services & APIs
- **n8n**: Self-hosted instance (required)
- **OpenAI**: GPT-4 API access for content generation
- **Tavily**: Research API for content discovery
- **Google Drive**: Image storage and retrieval
- **Google Sheets**: Content input and workflow triggering
- **NextCloud**: Image hosting and public URL generation
- **Supabase**: Database storage for generated content

## Setup Instructions

### Prerequisites
Before setting up this workflow, ensure you have:
- A self-hosted n8n installation
- API credentials for all required services
- The database table created in Supabase

### Step 1: Service Account Configuration

**OpenAI Setup**
1. Create an OpenAI account at platform.openai.com
2. Generate an API key from the API Keys section
3. In n8n, create new OpenAI credentials using your API key
4. Test the connection to ensure GPT-4 access

**Tavily Research Setup**
1. Sign up at tavily.com
2. Get your API key from the dashboard
3. Add Tavily credentials in n8n
4. Configure search depth to "advanced" for best results

**Google Services Setup**
1. Create a Google Cloud Project
2. Enable the Google Drive API and Google Sheets API
3. Create OAuth2 credentials
4. Configure Google Drive and Google Sheets credentials in n8n
5. Share your input spreadsheet with the service account

**NextCloud Setup**
1. Install NextCloud or use a hosted solution
2. Create an application password for API access
3. Configure NextCloud credentials in n8n
4. Create an /images/ folder for content storage

**Supabase Setup**
1. Create a Supabase project at supabase.com
2. Create a table with the following structure:

```sql
CREATE TABLE works (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  image_url TEXT,
  category TEXT,
  created_at TIMESTAMP DEFAULT NOW()
);
```

3. Get the project URL and service key from settings
4. Configure Supabase credentials in n8n

### Step 2: Google Sheets Input Setup
Create a Google Sheets document with the following columns:
- **TITLE**: Topic or title for content generation
- **IMAGE_URL**: Google Drive sharing URL for the associated image

Example format:

| TITLE | IMAGE_URL |
| --- | --- |
| AI Chatbot Implementation | https://drive.google.com/file/d/your-file-id/view |
| Digital Marketing Trends 2024 | https://drive.google.com/file/d/another-file-id/view |

### Step 3: Workflow Import and Configuration
1. Import the workflow JSON into your n8n instance
2. Configure all credential connections:
   - Link OpenAI credentials to the "OpenAI_GPT4_Model" node
   - Link Tavily credentials to the "Tavily_Research_Agent" node
   - Link Google credentials to the "Google_Sheets_Trigger" and "Google_Drive_Image_Downloader" nodes
   - Link NextCloud credentials to the "NextCloud_Image_Uploader" and "NextCloud_Public_URL_Generator" nodes
   - Link Supabase credentials to the "Supabase_Content_Storage" node
3. Update the Google Sheets Trigger node:
   - Set your spreadsheet ID in the documentId field
   - Configure the polling frequency (default: every minute)
4. Test each node connection individually before activating

### Step 4: Error Handler Setup (Optional)
The workflow references an error handler workflow (GWQ4UI1i3Z0jp3GF). Either:
- Create a simple error notification workflow with this ID
- Remove the error handling references if not needed
- Update the workflow ID to match your own error handler

### Step 5: Workflow Activation
1. Save all node configurations
2. Test the workflow with a sample row in your Google Sheet
3. Verify content generation and storage in Supabase
4. Activate the workflow for continuous monitoring

## How It Works

### Workflow Process
1. **Trigger**: Google Sheets monitors for new rows with content topics
2. **Research**: Tavily searches for 3 relevant articles about the topic
3. **Content Generation**: An AI agent creates multi-platform content (website, blog, landing page)
4. **Content Cleaning**: Text processing removes formatting artifacts (see the sketch at the end of this template)
5. **Image Processing**: Downloads the image from Google Drive and uploads it to NextCloud
6. **URL Generation**: Creates public sharing links for images
7. **Storage**: Saves the final content package to the Supabase database

### Content Output Structure
Each execution generates:
- **Optimized Title**: SEO-friendly, platform-appropriate headline
- **Multi-Platform Content**:
  - Website content (professional, authority-building)
  - Blog content (educational, SEO-optimized)
  - Landing page content (conversion-focused)
- **Category Classification**: Automated content categorization
- **Image Assets**: Processed and publicly accessible images

## Customization Options

### Content Strategy Modification
- Edit the AI agent's system message to change the content style
- Adjust character limits for different platform requirements
- Modify category classifications for your industry

### Research Parameters
- Change Tavily search depth (basic, advanced)
- Adjust the number of research sources (1-10)
- Modify the search topic focus

### Storage Configuration
- Update the Supabase table structure for additional fields
- Change the NextCloud folder organization
- Modify image naming conventions

## Troubleshooting

### Common Issues
**Workflow not triggering:**
- Check Google Sheets permissions
- Verify polling frequency settings
- Ensure the spreadsheet format matches requirements

**Content generation errors:**
- Verify the OpenAI API key and credits
- Check GPT-4 model access
- Review system message formatting

**Image processing failures:**
- Confirm Google Drive sharing permissions
- Check NextCloud storage space and permissions
- Verify that file formats are supported

**Database storage issues:**
- Validate the Supabase table structure
- Check API key permissions
- Review field mapping in the storage node

### Performance Optimization
- Adjust polling frequency based on your content volume
- Monitor API usage to stay within limits
- Consider batch processing for high-volume scenarios

## Support and Updates
This template is designed for self-hosted n8n environments and requires technical setup. For issues:
- Check the n8n community forums
- Review service-specific documentation
- Test individual nodes in isolation
- Monitor execution logs for detailed error information
by Jitesh Dugar
Transform college admissions from an overwhelming manual process into an intelligent, efficient, and equitable system that analyzes essays, scores applicants holistically, and identifies top candidates, saving 40+ hours per week while improving decision quality.

## 🎯 What This Workflow Does
Automates comprehensive application review with AI-powered analysis:
- 📝 **Application Intake** - Captures complete college applications via Jotform
- 📚 **AI Essay Analysis** - Deep analysis of personal statements and supplemental essays for:
  - Writing quality, authenticity, and voice
  - AI-generated content detection
  - Specificity and research quality
  - Red flags (plagiarism, inconsistencies, generic writing)
- 🎯 **Holistic Review AI** - Evaluates applicants across five dimensions:
  - Academic strength (GPA, test scores, rigor)
  - Extracurricular profile (leadership, depth, impact)
  - Personal qualities (character, resilience, maturity)
  - Institutional fit (values alignment, contribution potential)
  - Diversity contribution (unique perspectives, experiences)
- 🚦 **Smart Routing** - Automatically categorizes and routes applications:
  - Strong Admit (85-100): Slack alert → Director email → Interview invitation → Fast-track
  - Committee Review (65-84): Detailed analysis → Committee discussion → Human decision
  - Standard Review (<65): Acknowledgment → Human verification → Standard timeline
- 📊 **Comprehensive Analytics** - All applications logged with scores, recommendations, and outcomes

## ✨ Key Features

**AI Essay Analysis Engine**
- **Writing Quality Assessment**: Grammar, vocabulary, structure, narrative coherence
- **Authenticity Detection**: Distinguishes genuine voice from AI-generated content (GPT detectors)
- **Content Depth Evaluation**: Self-awareness, insight, maturity, storytelling ability
- **Specificity Scoring**: Generic vs. tailored "Why Us" essays with research depth
- **Red Flag Identification**: Plagiarism indicators, privilege blindness, inconsistencies, template writing
- **Thematic Analysis**: Core values, motivations, growth narratives, unique perspectives

**Holistic Review Scoring (0-100 Scale)**
- **Academic Strength (35%)**: GPA in context, test scores, course rigor, intellectual curiosity
- **Extracurricular Profile (25%)**: Quality over quantity, leadership impact, commitment depth
- **Personal Qualities (20%)**: Character, resilience, empathy, authenticity, self-awareness
- **Institutional Fit (15%)**: Values alignment, demonstrated interest, contribution potential
- **Diversity Contribution (5%)**: Unique perspectives, life experiences, background diversity

(A minimal sketch of how these weights combine into the composite score appears after the setup guide below.)

**Intelligent Candidate Classification**
- **Admit**: Top 15% - clear admit, exceptional across multiple dimensions
- **Strong Maybe**: Top 15-30% - competitive, needs committee discussion
- **Maybe**: Top 30-50% - solid but not standout, waitlist consideration
- **Deny**: Below threshold - does not meet competitive standards (always human-verified)

**Automated Workflows**
- **Priority Candidates**: Immediate Slack alerts, director briefs, interview invitations
- **Committee Cases**: Detailed analysis packets, discussion points, voting workflows
- **Standard Processing**: Professional acknowledgments, timeline communications
- **Interview Scheduling**: Automated invitations with candidate-specific questions

## 💼 Perfect For
- **Selective Colleges & Universities**: 15-30% acceptance rates, holistic review processes
- **Liberal Arts Colleges**: Emphasis on essays, personal qualities, institutional fit
- **Large Public Universities**: Processing thousands of applications efficiently
- **Graduate Programs**: MBA, law, medical school admissions
- **Scholarship Committees**: Evaluating merit and need-based awards
- **Honors Programs**: Identifying top candidates for selective programs
- **Private High Schools**: Admissions teams with holistic processes

## 🎓 Admissions Impact

**Efficiency & Productivity**
- **40-50 hours saved per week** on initial application review
- **70% faster** essay evaluation with AI pre-analysis
- **3x more applications** processed per reader
- **Zero data entry** - all information auto-extracted
- **Consistent evaluation** across thousands of applications
- **Same-day turnaround** for top candidate identification

**Decision Quality Improvements**
- **Objective scoring** reduces unconscious bias
- **Consistent criteria** applied to all applicants
- **Essay authenticity checks** catch AI-written applications
- **Holistic view** considers every dimension of the application
- **Data-driven insights** inform committee discussions
- **Fast-track top talent** before competitors

**Equity & Fairness**
- **Standardized evaluation** ensures fair treatment
- **First-generation flagging** provides context
- **Socioeconomic consideration** in holistic scoring
- **Diverse perspectives valued** in the diversity score
- **Bias detection** in essay analysis
- **Audit trail** for compliance and review

**Candidate Experience**
- **Instant acknowledgment** of application receipt
- **Professional communication** at every stage
- **Clear timelines** and expectations
- **Interview invitations** for competitive candidates
- **Respectful process** for all applicants regardless of outcome

## 🔧 What You'll Need

**Required Integrations**
- **Jotform** - Application intake forms (create your form for free on Jotform using this link)
- **OpenAI API** - GPT-4o for analysis (~$0.15-0.25 per application)
- **Gmail/Outlook** - Applicant and staff communication (free)
- **Google Sheets** - Application database and analytics (free)

**Optional Integrations**
- **Slack** - Real-time alerts for strong candidates ($0-8/user/month)
- **Google Calendar** - Interview scheduling automation (free)
- **Airtable** - Advanced application tracking (alternative to Sheets)
- **Applicant Portal Integration** - Status updates via API
- **CRM Systems** - Slate, TargetX, Salesforce for higher ed

## 🚀 Setup Guide (3-4 Hours)

**Step 1: Create Application Form (60 min)**
Build a comprehensive Jotform with these sections:

*Basic Information*
- Full name, email, phone
- High school, graduation year
- Intended major

*Academic Credentials*
- GPA (weighted/unweighted, scale)
- SAT score (optional)
- ACT score (optional)
- Class rank (if available)
- Academic honors

*Essays (Most Important!)*
- Personal statement (650 words max)
- "Why Our College" essay (250-300 words)
- Supplemental prompts (program-specific)

*Activities & Achievements*
- Extracurricular activities (list with hours/week, years)
- Leadership positions (with descriptions)
- Honors and awards
- Community service hours
- Work experience

*Additional Information*
- First-generation college student (yes/no)
- Financial aid needed (yes/no)
- Optional: demographic information
- Optional: additional context

**Step 2: Import n8n Workflow (15 min)**
- Copy the JSON from the artifact
- n8n: Workflows → Import → Paste
- Includes all nodes + 7 detailed sticky notes

**Step 3: Configure OpenAI API (20 min)**
- Get an API key: https://platform.openai.com/api-keys
- Add it to both AI nodes (Essay Analysis + Holistic Review)
- Model: gpt-4o (best for nuanced analysis)
- Temperature: 0.3 (consistency with creativity)
- Test with a sample application
- Cost: $0.15-0.25 per application (essay analysis + holistic review)

**Step 4: Customize Institutional Context (45 min)**
Edit the AI prompts to reflect YOUR college.

In the Holistic Review prompt, update:
- College name and type
- Acceptance rate
- Average admitted student profile (GPA, test scores)
- Institutional values and culture
- Academic programs and strengths
- What makes your college unique
- Desired student qualities

In the Essay Analysis prompt, add:
- Specific programs to look for mentions of
- Faculty names applicants should reference
- Campus culture keywords
- Red flags specific to your institution

**Step 5: Setup Email Communications (30 min)**
- Connect Gmail/Outlook OAuth
- Update all recipient addresses:
  - admissions-director@college.edu
  - admissions-committee@college.edu
  - Email addresses for strong candidate alerts
- Customize email templates:
  - Add college name, logo, branding
  - Update contact information
  - Adjust tone to match institutional voice
  - Include decision release dates
  - Add applicant portal links

**Step 6: Configure Slack Alerts (15 min, Optional)**
- Create channel: #admissions-strong-candidates
- Add a webhook URL or bot token
- Test with a mock strong candidate
- Customize the alert format and recipients

**Step 7: Create Admissions Database (30 min)**
Google Sheet with columns:
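As referenced in the Holistic Review Scoring section, here is a minimal sketch of how the five weighted dimensions could combine into the 0-100 composite that drives the routing thresholds. The per-dimension field names are assumptions about the AI review node's output; the weights and thresholds come from this template.

```javascript
// n8n Code node — minimal sketch of the composite score and routing.
// Assumes the Holistic Review node returned 0–100 per-dimension scores under
// these (assumed) field names.
const s = $json; // e.g. { academic: 88, extracurricular: 75, personal: 80, fit: 70, diversity: 65 }

const composite =
  0.35 * s.academic +
  0.25 * s.extracurricular +
  0.20 * s.personal +
  0.15 * s.fit +
  0.05 * s.diversity;

// Route into the three tracks described above.
let route;
if (composite >= 85) route = 'strong_admit';        // Slack alert + fast-track
else if (composite >= 65) route = 'committee_review';
else route = 'standard_review';                     // always human-verified

return [{ json: { ...s, composite: Math.round(composite * 10) / 10, route } }];
```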
by Nik B.
Automatically fetches daily sales, shifts, and receipts from Loyverse, calculates gross profit, net operating profit, and other key metrics, saves them to a Google Sheet, and sends out a daily report via email.

## Who's it for
This template is for any business owner, manager, or analyst using Loyverse POS who needs more advanced financial reporting. If you're a restaurant, bar, or retail owner who wants to automatically track daily net profit, compare sales to historical averages, and build a custom financial dashboard in Google Sheets, this workflow is for you.

## How it works / What it does
This workflow runs automatically on a daily schedule. It fetches all sales data and receipts from your Loyverse account for the previous business day, defined by your custom shift times (even past midnight). A powerful Code node then processes all the data to calculate the metrics that Loyverse either doesn't provide at all, or only spreads across several separate reports instead of one consolidated place.

Already set up are metrics like:
- Total Revenue, Gross Profit, and Net Operating Profit
- Cash handling differences (over/under)
- Average spend per receipt (ATV)
- 30-day rolling Net Operating Profit (NOP)
- Performance vs. your historical weekday average

Finally, it appends the single, calculated row of daily metrics to a Google Sheet and sends an easily customizable summary report to your email.

## How to set up
This workflow includes detailed Sticky Notes to guide you through the setup process. Because every business has a unique POS configuration (different POS devices, categories, and payment types), you'll need to set up a few things manually before executing the workflow. The entire setup should only take about 15 minutes.

**Preparations & credential setup**
1. Subscribe to the "Integrations" add-on in Loyverse ($9/month) to gain API access.
2. Create an access token in Loyverse.
3. Create credentials: in your n8n instance, create credentials for Loyverse (use "Generic" > "Bearer Auth"), Google Sheets (OAuth2), and your email (SMTP or other).
4. Make a copy of the pre-configured Google Spreadsheet (link in the second sticky note inside the workflow).
5. Fill MASTER CONFIG: open the MASTER CONFIG node and follow the comments inside to add your Google Sheet ID, sheet names, business hours, timezone, and Loyverse IDs (for POS devices, payment types, and categories).

**Configure the Google Sheet nodes**
1. Configure **Read Historical Data**: open this node and follow the instructions in the nearby Sticky Note to paste the expressions for your Document ID and Sheet Name.
2. Configure **Save Product List**: open this node and paste in the expressions for Document ID and Sheet Name. The column mapper will load; map your sheet columns (e.g., item_name) to the data on the left (e.g., {{ $json.item_name }}).
3. Configure **Save Latest Sales Data**: open this node and paste in the expressions for Document ID and Sheet Name. Save and run the workflow; after that, the column mapper will load. This is the most important step: map your sheet's column names (e.g., "Total Revenue") to the calculated metrics from the Calculate All Metrics node (e.g., {{ $json.totalGrossRevenue }}).
4. Activate the workflow. 🫡

## Requirements
- Loyverse Integrations subscription
- Loyverse access token
- Credentials for Loyverse (Bearer Auth)
- Credentials for Google Sheets (OAuth2)
- Credentials for an Email/SMTP sender

## How to customize the workflow
This template is designed to be highly flexible.
- **Central configuration**: Almost all customization (POS devices, categories, payment types, sheet names) is done in the MASTER CONFIG node. You don't need to dig through other nodes.
- **Add/remove metrics**: The Calculate All Metrics node has additional metrics already set up; just add the relevant columns to the SalesData sheet, or add your own calculations to the node (see the sketch below). Any new metric you add (e.g., metrics.myNewMetric = 123) will be available to map in the Save Latest Sales Data node.
- **Email body**: You can easily edit the Send email node to change the text or add new metrics from the Calculate All Metrics node.
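For orientation, here is a minimal sketch of the calculation pattern the Calculate All Metrics node follows. The receipt field names (`total_money`, `total_cost`) are assumptions about the Loyverse receipt payload; the real node reads device IDs, categories, and shift times from MASTER CONFIG.

```javascript
// n8n Code node — minimal sketch of the "Calculate All Metrics" pattern.
const receipts = $input.all().map(i => i.json);

const metrics = {};
metrics.totalGrossRevenue = receipts.reduce((sum, r) => sum + (r.total_money || 0), 0);
const totalCost = receipts.reduce((sum, r) => sum + (r.total_cost || 0), 0);
metrics.grossProfit = metrics.totalGrossRevenue - totalCost;

// Average spend per receipt (ATV)
metrics.atv = receipts.length ? metrics.totalGrossRevenue / receipts.length : 0;

// Any new metric added here becomes mappable in "Save Latest Sales Data"
metrics.myNewMetric = 123;

return [{ json: metrics }];
```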
by aditya vadaganadam
This n8n template turns chat questions into structured financial reports using Gemini and posts them to a Discord channel via webhook. Ask about tickers, sectors, or theses (e.g., "NVDA long-term outlook?" or "Gold ETF short-term drivers?") and receive a concise, shareable report.

## Good to know
- Not financial advice: use for insights only; verify independently.
- Model availability can vary by region. If you see "model not found," it may be geo-restricted.
- Costs depend on model and tokens. Check current Gemini pricing for updates.
- Discord messages are limited to ~2000 characters per post; long reports may need splitting (see the sketch at the end of this template).
- Rate limits: Discord webhooks are rate-limited; add short waits for bursts.

## How it works
1. **Chat Trigger** collects the user's question (public chat supported when the workflow is activated).
2. **Conversation Memory** keeps a short window of recent messages to maintain context.
3. **Connect Gemini** provides the LLM (e.g., gemini-2.5-flash-lite) and parameters (temperature, tokens).
4. **Agent (agent1)** applies a financial-analysis System Message to produce structured insights.
5. **Structured Output Parser** enforces a simple JSON schema: idea (one-line thesis) + analysis (Markdown sections).
6. **Code** formats a Discord-ready Markdown report (title, question, executive summary, sections, disclaimer).
7. **Edit Fields** maps the formatted report to a clean content field.
8. **Discord Webhook** posts the final report to your channel.

## How to use
- Start with the built-in Chat Trigger: click Open chat, ask a question, and verify the Discord post.
- Replace or augment with a Cron or Webhook trigger for scheduled or programmatic runs.
- For richer context, add HTTP Request nodes (prices, news, filings) and pass summaries to the agent.

## Requirements
- n8n instance with internet access
- Google AI (Gemini) API key
- Discord server with a webhook URL

## Customising this workflow
- **System Message**: Adjust tone, depth, risk profile, and required sections (Summary, Drivers, Risks, Metrics, Next Steps, Takeaway).
- **Model settings**: Switch models or tune temperature/tokens in Connect Gemini.
- **Schema**: Extend the parser and formatter with fields like drivers[], risks[], or metrics{}.
- **Formatting**: Edit the Code node to change headings, emojis, disclaimers, or add timestamps.
- **Operations**: Add retries, message splitting for long outputs, and rate-limit handling for Discord.
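Since Discord caps each webhook post at roughly 2000 characters, here is a minimal Code-node sketch of the message-splitting operation mentioned above. The `content` field name matches the Edit Fields mapping described in this template; verify it in your own workflow.

```javascript
// n8n Code node — split a long report into Discord-sized chunks.
// Splits on line boundaries so Markdown sections stay intact
// (assumes no single line exceeds the limit).
const LIMIT = 2000;
const lines = ($json.content || '').split('\n');

const chunks = [];
let current = '';
for (const line of lines) {
  if (current && (current + line).length + 1 > LIMIT) {
    chunks.push(current.trimEnd());
    current = '';
  }
  current += line + '\n';
}
if (current.trim()) chunks.push(current.trimEnd());

// One item per chunk; post each via the Discord webhook node and add a short
// Wait between posts to respect webhook rate limits.
return chunks.map(content => ({ json: { content } }));
```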
by Trung Tran
# Create AI-Powered Chatbot for Candidate Evaluation on Slack

> This workflow connects a Slack chatbot with AI agents and Google Sheets to automate candidate resume evaluation. It extracts resume details, identifies the applied job from the message, fetches the correct job description, and provides a summarized evaluation via Slack and a tracking sheet. Perfect for HR teams using Slack.

## Who's it for
This workflow is designed for:
- **HR Teams, Recruiters, and Hiring Managers**
- Working in software or tech companies using Slack, Google Sheets, and n8n
- Who want to automate candidate evaluation based on uploaded profiles and applied job positions

## How it works / What it does
This workflow is triggered when a Slack user mentions the HR bot and attaches a candidate profile PDF. The workflow performs the following steps:

1. **Trigger from Slack Mention** - A user mentions the bot in Slack with a message like: @HRBot Please evaluate this candidate for the AI Engineer role. (with PDF attached)
2. **Input Validation** - If no file is attached, the bot replies: "Please upload the candidate profile file before sending the message." (A minimal sketch of this check appears at the end of this section.)
3. **Extract Candidate Profile** - Downloads the attached PDF from Slack and uses Extract from File to parse the resume into text.
4. **Profile Analysis (AI Agent)** - Sends the resume text and message to the Profile Analyzer Agent, which identifies the candidate's name, email, and summary, and the applied position (from the message), then looks up the Job Description PDF URL using Google Sheets.
5. **Job Description Retrieval** - Downloads and parses the matching JD PDF.
6. **HR Evaluation (AI Agent)** - Sends both the candidate profile and job description to the HR Expert Agent and receives a summarized fit evaluation and insights.
7. **Output and Logging** - Sends the evaluation result back to Slack in the original thread and updates a Google Sheet with evaluation data for tracking.

## How to set up
**Slack Setup**
1. Create a Slack bot and install it into your workspace
2. Enable the app_mention event and generate a bot token
3. Connect Slack to n8n using Slack Bot credentials

**Google Sheets Setup**
1. Create a sheet mapping Position Title → Job Description URL
2. Create another sheet for logging evaluation results

**n8n Setup**
1. Add a Webhook Trigger for Slack mentions
2. Connect Slack, Google Sheets, and GPT-4 credentials
3. Set up the agents (Profile Analyzer Agent, HR Expert Agent) with appropriate prompts

**Deploy & Test**
1. Mention your bot in Slack with a message and file
2. Confirm the reply and the entry in the evaluation tracking sheet

## Requirements
- n8n (self-hosted or cloud)
- Slack App with Bot Token
- OpenAI or Azure OpenAI account (for GPT-4)
- Google Sheets (2 sheets: job mapping + evaluation log)
- Candidate profiles in PDF format
- Defined job titles and descriptions

## How to customize the workflow
You can easily adapt this workflow to your team's needs:

| Customization Area | How to Customize |
|--------------------------|----------------------------------------------------------------------------------|
| Job Mapping Source | Replace Google Sheet with Airtable or Notion DB |
| JD Format | Use Markdown or inline descriptions instead of PDF |
| Evaluation Output Format | Change from Slack message to Email or Notion update |
| HR Agent Prompt | Customize to match your company tone or include scoring rubrics |
| Language Support | Add support for bilingual input/output (e.g., Vietnamese & English) |
| Workflow Trigger | Trigger from slash command or form instead of @mention |
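Here is a minimal sketch of the input-validation step referenced above, written as an n8n Code node over the incoming Slack event. The field paths follow Slack's Events API payload shape; adjust them if your trigger wraps the event differently.

```javascript
// n8n Code node — check the Slack app_mention event for an attached PDF
// before continuing (a sketch; in practice an IF node can branch on `valid`).
const event = $json.event || {};
const files = event.files || [];
const hasPdf = files.some(f => (f.mimetype || '') === 'application/pdf');

if (!hasPdf) {
  return [{
    json: {
      valid: false,
      reply: 'Please upload the candidate profile file before sending the message.',
      channel: event.channel,
      thread_ts: event.ts, // reply in the original thread
    },
  }];
}

return [{ json: { ...$json, valid: true } }];
```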
by Guillaume Duvernay
This template introduces a revolutionary approach to automated web research. Instead of a rigid workflow that can only find one type of information, this system uses a "thinker" and "doer" AI architecture. It dynamically interprets your plain-English research request, designs a custom spreadsheet (CSV) with the perfect columns for your goal, and then deploys a web-scraping AI to fill it out. It's like having an expert research assistant who not only finds the data you need but also builds the perfect container for it on the fly. Whether you're looking for sales leads, competitor data, or market trends, this workflow adapts to your request and delivers a perfectly structured, ready-to-use dataset every time.

## Who is this for?
- **Sales & marketing teams**: Generate targeted lead lists, compile competitor analysis, or gather market intelligence with a simple text prompt.
- **Researchers & analysts**: Quickly gather and structure data from the web for any topic without needing to write custom scrapers.
- **Entrepreneurs & business owners**: Perform rapid market research to validate ideas, find suppliers, or identify opportunities.
- **Anyone who needs structured data**: Transform unstructured, natural language requests into clean, organized spreadsheets.

## What problem does this solve?
- **Eliminates rigid, single-purpose workflows**: This workflow isn't hardcoded to find just one thing. It dynamically adapts its entire research plan and data structure based on your request.
- **Automates the entire research process**: It handles everything from understanding the goal and planning the research to executing the web search and structuring the final data.
- **Bridges the gap between questions and data**: It translates your high-level goal (e.g., "I need sales leads") into a concrete, structured spreadsheet with all the necessary columns (Company Name, Website, Key Contacts, etc.).
- **Optimizes for cost and efficiency**: It intelligently uses a combination of deep-dive and standard web searches from **Linkup.so** to gather high-quality initial results and then enrich them cost-effectively.

## How it works (The "Thinker & Doer" Method)
The process is split into two main phases:

**The "Thinker" (AI Planner)**
1. You submit a research request via the built-in form (e.g., "Find 50 US-based fashion companies for a sales outreach campaign").
2. The first AI node acts as the "thinker." It analyzes your request and determines the optimal structure for your final spreadsheet.
3. It dynamically generates a plan, which includes a discoveryQuery to find the initial list, an enrichmentQuery to get details for each item, and the JSON schemas that define the exact columns for your CSV. (An illustrative plan object appears at the end of this template.)

**The "Doer" (AI Researcher)**
The rest of the workflow is the "doer," which executes the plan:
1. **Discovery**: It uses a powerful "deep search" with Linkup.so to execute the discoveryQuery and find the initial list of items (e.g., the 50 fashion companies).
2. **Enrichment**: It then loops through each item in the list. For each one, it performs a fast and cost-effective "standard search" with Linkup to execute the enrichmentQuery, filling in all the detailed columns defined by the "thinker."
3. **Final Output**: The workflow consolidates all the enriched data and converts it into a final CSV file, ready for download or further processing.

## Setup
1. **Connect your AI provider**: In the OpenAI Chat Model node, add your AI provider's credentials.
2. **Connect your Linkup account**: In the two Linkup (HTTP Request) nodes, add your Linkup API key (free account at linkup.so). We recommend creating a "Generic Credential" of type "Bearer Token" for this. Linkup offers €5 of free credits monthly, which is enough for 1k standard searches or 100 deep queries.
3. **Activate the workflow**: Toggle the workflow to "Active." You can now use the form to submit your first research request!

## Taking it further
- **Add a custom dashboard**: Replace the form trigger and final CSV output with a more polished user experience. For example, build a simple web app where users can submit requests and download their completed research files.
- **Make it company-aware**: Modify the "thinker" AI's prompt to include context about your company. This will allow it to generate research plans that are automatically tailored to finding leads or data relevant to your specific products and services.
- **Add an AI summary layer**: After the CSV is generated, add a final AI node to read the entire file and produce a high-level summary, such as "Here are the top 5 leads to contact first and why," turning the raw data into an instant, actionable report.
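To make the "thinker" phase concrete, here is an illustrative example of the kind of plan object it might emit for the fashion-leads request above. The field names `discoveryQuery` and `enrichmentQuery` come from this template; the exact schema keys are assumptions for illustration.

```json
{
  "discoveryQuery": "List 50 US-based fashion companies suitable for B2B sales outreach",
  "enrichmentQuery": "For {companyName}, find the official website, headquarters location, and key sales contacts",
  "discoverySchema": {
    "type": "object",
    "properties": {
      "companies": { "type": "array", "items": { "type": "string" } }
    }
  },
  "enrichmentSchema": {
    "type": "object",
    "properties": {
      "companyName": { "type": "string" },
      "website": { "type": "string" },
      "headquarters": { "type": "string" },
      "keyContacts": { "type": "string" }
    }
  }
}
```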
by Fayzul Noor
This workflow is built for digital marketers, sales professionals, influencer agencies, and entrepreneurs who want to automate Instagram lead generation. If you're tired of manually searching for profiles, copying email addresses, and updating spreadsheets, this automation will save you hours every week. It turns your process into a smart system that finds, extracts, and stores leads while you focus on growing your business.

## How it works / What it does
This n8n automation transforms how you collect Instagram leads using AI and API integrations. Here's a simple breakdown of how it works:

1. **Set your targeting parameters** using the Edit Fields node. You can specify your platform (Instagram), field of interest such as "beauty & hair," and target country such as "USA."
2. **Generate intelligent search queries** with an AI Agent powered by GPT-4o-mini. It automatically creates optimized Google search queries to find relevant Instagram profiles in your chosen niche and location.
3. **Extract results from Google** using Apify's Google Search Scraper, which collects hundreds of Instagram profile URLs that match your search criteria.
4. **Fetch detailed Instagram profile data** using Apify's Instagram Scraper. This includes usernames, follower counts, and profile bios where contact information usually appears.
5. **Use AI to extract emails** from the profile biographies with the Information Extractor node powered by GPT-3.5-turbo. It identifies emails even when they are hidden or creatively formatted.
6. **Store verified leads** in a PostgreSQL database. The workflow automatically adds new leads or updates existing ones with fields like username, follower count, email, and niche.

Once everything is set up, the system runs on autopilot and keeps building your database of quality leads around the clock.

## How to set up
Follow these steps to get your Instagram lead generation machine running:
1. Import the JSON file into your n8n instance.
2. Add your API credentials:
   - Apify token for the Google and Instagram scrapers
   - OpenAI API key for the AI-powered nodes
   - PostgreSQL credentials for storing leads
3. Open the Edit Fields node and set your platform, field of interest, and target country.
4. Run the workflow manually using the Manual Trigger node to test it.
5. Once confirmed, replace the manual trigger with a schedule or webhook to run it automatically.
6. Check your PostgreSQL database to ensure the leads are being saved correctly.

## Requirements
Before running the workflow, make sure you have the following:
- An n8n account or instance (self-hosted or n8n Cloud)
- An Apify account for accessing the Google and Instagram scrapers
- OpenAI API access for generating smart search queries and extracting emails
- A PostgreSQL database to store your leads
- Basic understanding of how n8n workflows and nodes operate

## How to customize the workflow
This workflow is flexible and can be tailored to your business goals:
- **Change your niche or location** by updating the Edit Fields node. You can switch from "beauty influencers in the USA" to "fitness coaches in Canada" in seconds.
- **Add more data fields** to collect additional information such as engagement rates, bio keywords, or profile categories. Just modify the PostgreSQL node and database schema.
- **Connect to your CRM or email system** to automatically send introduction emails or add new leads to your marketing pipeline.
- **Use different triggers** such as a scheduled cron trigger for daily runs or a webhook trigger to start the workflow through an API call.
- **Filter higher-quality leads** by adding logic to capture only profiles with a minimum number of followers or verified emails (see the sketch below).
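Here is a minimal sketch of such a quality filter as an n8n Code node: a plain-regex fallback to the AI Information Extractor described above, combined with a follower threshold. The field names (`biography`, `followersCount`, `username`) follow common Apify Instagram Scraper output, but verify them against your actual dataset.

```javascript
// n8n Code node — keep only accounts above a follower threshold that expose
// an email in the bio (simple regex; the AI extractor handles trickier cases).
const MIN_FOLLOWERS = 5000;
const EMAIL_RE = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/;

return $input.all()
  .filter(item => (item.json.followersCount || 0) >= MIN_FOLLOWERS)
  .map(item => {
    const match = (item.json.biography || '').match(EMAIL_RE);
    return {
      json: {
        username: item.json.username,
        followers: item.json.followersCount,
        email: match ? match[0] : null,
      },
    };
  })
  .filter(item => item.json.email); // keep only leads with a visible email
```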
by Oneclick AI Squad
This is an advanced n8n workflow for transforming product concepts into 3D showcase videos with AI packaging design and auto-rotation rendering.

## Workflow Features

🎯 **Core Capabilities**
- **AI Product Concept Generation** - Uses Claude Sonnet 4 to analyze product prompts and generate comprehensive 3D specifications
- **Automated Packaging Design** - DALL-E 3 generates professional packaging visuals
- **Texture Map Generation** - Creates PBR-ready texture maps for realistic materials
- **3D Scene Script Generation** - Produces complete Blender Python scripts with:
  - Product geometry based on shape
  - Professional 3-point lighting (key, fill, rim)
  - 360° rotation animation (8 seconds)
  - Camera setup and render settings
- **Preview Rendering** - Generates photorealistic 3D preview images
- **Video Processing** - Handles encoding and upload to video hosting services
- **Database Storage** - Saves all showcase data for tracking
- **Status Monitoring** - Checks render progress with automatic retry logic

## 📋 Required Setup
API credentials needed:
- Anthropic API (for Claude AI)
- OpenAI API (for DALL-E image generation)
- Replicate API (optional, for additional rendering)
- Video hosting service (Cloudflare Stream or similar)
- PostgreSQL database

## 🔧 How to Use
1. **Import the JSON** - Copy the artifact content and import it directly into n8n
2. **Configure Credentials** - Add your API keys in the n8n credentials manager
3. **Activate Workflow** - Enable the webhook trigger
4. **Send a Request** to the webhook endpoint:

```
POST /product-showcase
```

```json
{
  "productPrompt": "A premium organic energy drink in a sleek aluminum can with nature-inspired graphics"
}
```

## 📤 Output Includes
- Product specifications (dimensions, materials, colors)
- Packaging design image URL
- Texture map URLs
- Downloadable Blender script
- 3D preview render
- Video showcase URL
- Rendering metadata
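If you want to trigger the workflow from code rather than a REST client, here is a minimal sketch using fetch. The host URL is a placeholder; the path and payload follow the webhook contract shown above.

```javascript
// Minimal sketch (run inside an async context, e.g. Node 18+ ESM).
// Replace the host with your n8n instance URL.
const res = await fetch('https://your-n8n-instance.example.com/webhook/product-showcase', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    productPrompt:
      'A premium organic energy drink in a sleek aluminum can with nature-inspired graphics',
  }),
});

const showcase = await res.json();
// Expected fields per the output list above: packaging image URL, texture
// maps, Blender script, preview render, and the video showcase URL.
console.log(showcase);
```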
by Dzaky Jaya
This n8n workflow demonstrates how to configure an AI Agent for financial research, especially for IDX data, through the Sectors App API.

Use cases:
- Research the stock market in Indonesia
- Analyze the performance of companies belonging to certain subsectors, or a single company
- Compare financial metrics between BBCA and BBRI
- Provide technical analysis for a certain ticker's stock movement
- And many more, all from a conversational agent chat UI

## Main components
- **Input-n8nChatNative**: handles and processes input from the native n8n chat UI
- **Input-TelegramBot**: handles and processes input from a Telegram Bot
- **Input-WebUI(Webhook)**: handles and processes input from a hosted Web UI through a webhook
- **Main Agent**: processes raw user queries and delegates tasks to specialized agents if needed
- **Spec Agent - Sectors App**: makes requests to the Sectors App API to get real-time financial data listed on IDX from the available endpoints
- **Spec Agent - Web Search**: performs web searches via Google Grounding (Gemini API) and Tavily Search
- **Vector Document Processing**: processes documents uploaded by users into embeddings and a vector store

## How it works
1. User queries may be received from multiple platforms (three are used here: Telegram, a hosted Web UI, and the native n8n chat UI).
2. If the user also uploads a document, the workflow processes it and stores it in the vector store.
3. The request is sent to the Main Agent to process the query.
4. The Main Agent decides which tasks to delegate to a Specialized Agent if needed.
5. The result is then sent back to the user on the originating platform (a minimal routing sketch appears below).

## How to use
You need these APIs:
- Gemini API: get it free from https://aistudio.google.com/
- Tavily API: get it free from https://www.tavily.com/
- Sectors App API: get it from https://sectors.app/api/

You can optionally change the model or add a fallback model to handle token limits, since the workflow may consume a large number of tokens per request.
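Here is a minimal sketch of the reply-routing idea from step 5. It assumes an earlier node tagged each request with a `platform` field plus the IDs needed to answer; the actual template may carry this context differently.

```javascript
// n8n Code node — route the agent's reply back to the originating platform.
// `platform`, `telegramChatId`, and `webhookReplyUrl` are assumed field names.
const { platform, reply, telegramChatId, webhookReplyUrl } = $json;

switch (platform) {
  case 'telegram':
    // Route to the Telegram node (send `reply` to the chat it came from)
    return [{ json: { route: 'telegram', chatId: telegramChatId, text: reply } }];
  case 'webui':
    // Route to a Respond to Webhook node for the hosted Web UI
    return [{ json: { route: 'webhook', replyUrl: webhookReplyUrl, body: { reply } } }];
  default:
    // Native n8n chat UI: return the reply as the item output
    return [{ json: { route: 'chat', output: reply } }];
}
```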
by Automate With Marc
# 🎨 Instagram Carousel & Caption Generator on Autopilot (GPT-5 + Nano Banana + Blotato + Google Sheets)

Watch the full step-by-step tutorial on YouTube: https://youtu.be/id22R7iBTjo

**Disclaimer (self-hosted requirement):** This template assumes you have valid API credentials for OpenAI, Wavespeed/Nano Banana, Blotato, and Google. If using n8n Self-Hosted, ensure HTTPS access and that credentials are set in your instance.

## How It Works
1. **Chat Trigger** - Receives a topic/idea (e.g., "5 best podcast tips").
2. **Image Prompt Generator (GPT-5)** - Creates 5 prompts using the "Hook → Problem → Insight → Solution → CTA" framework.
3. **Structured Output Parser** - Formats the output into a JSON array (an illustrative example follows this template).
4. **Generate Images (Nano Banana)** - Converts prompts into high-quality visuals.
5. **Wait for Render** - Ensures image generation completes.
6. **Fetch Rendered Image URLs** - Retrieves the image links.
7. **Upload to Blotato** - Hosts and prepares images for posting.
8. **Collect Media URLs** - Gathers all uploaded image URLs.
9. **Log to Google Sheets** - Stores image URLs + timestamps for tracking.
10. **Caption Generator (GPT-5)** - Writes an SEO-friendly caption.
11. **Merge Caption + Images** - Combines the data.
12. **Post Carousel (Blotato)** - Publishes directly to Instagram.

## Step-by-Step Setup Instructions

**1) Prerequisites**
- n8n (Cloud or Self-Hosted)
- OpenAI API Key (GPT-5)
- Wavespeed API Key (Nano Banana)
- Blotato API credentials (connected to Instagram)
- Google Sheets OAuth credentials

**2) Add Credentials in n8n**
- OpenAI: Settings → Credentials → Add "OpenAI API"
- Wavespeed: HTTP Header Auth (e.g., Authorization: Bearer <API_KEY>)
- Blotato: Add "Blotato API"
- Google Sheets: Add "Google Sheets OAuth2 API"

**3) Configure & Test**
- Run with an idea like "Top 5 design hacks".
- Check the generated images, caption, and logged sheet entry.
- Confirm posting works via Blotato.

**4) Optional**
- Add a Schedule Trigger for weekly automation.
- Insert a Slack approval loop before posting.

## Customization Guide
- ✏️ **Change design style**: Modify adjectives in the Image Prompt Generator.
- 📑 **Adjust number of slides**: Change the Split node loop count.
- 💬 **Tone of captions**: Edit the Caption Generator's system prompt.
- ⏱️ **Adjust render wait time**: If image generation takes longer, increase the Wait node duration from 30 seconds to 60 seconds or more.
- 🗂️ **Log extra data**: Add columns in Google Sheets for campaign or topic.
- 🔁 **Swap posting tool**: Replace Blotato with your own scheduler or email node.

## Requirements
- OpenAI API key (GPT-5 or compatible)
- Wavespeed API key (Nano Banana)
- Blotato API credentials
- Google Sheets OAuth credentials
- n8n account (Cloud or Self-Hosted)
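For reference, the Structured Output Parser in step 3 expects a JSON array of five prompts following the Hook → Problem → Insight → Solution → CTA framework. The exact keys depend on your parser schema; this shape is illustrative only.

```json
[
  { "slide": 1, "role": "hook",     "prompt": "Bold headline card: '5 Podcast Tips Nobody Tells You', high-contrast typography" },
  { "slide": 2, "role": "problem",  "prompt": "Illustration of a frustrated podcaster surrounded by tangled cables and flat listener stats" },
  { "slide": 3, "role": "insight",  "prompt": "Clean infographic card highlighting one counterintuitive stat about episode length" },
  { "slide": 4, "role": "solution", "prompt": "Step-by-step visual checklist card with three concise fixes" },
  { "slide": 5, "role": "cta",      "prompt": "Closing card: 'Follow for weekly podcast growth tips', brand colors" }
]
```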
by Nikan Noorafkan
# 📊 Google Ads + OpenAI + Sheets — Monthly AI Performance Analysis

*Automate monthly ad performance insights with AI-powered recommendations*

## 🧩 Overview
This workflow automatically analyzes Google Ads performance every month, using the Google Ads API and OpenAI (GPT-4o) to uncover which ad themes, categories, and messages perform best. It then generates a structured AI report, saves it to Google Sheets, and sends a Slack summary to your marketing team.

💡 Perfect for digital marketers, agencies, and growth analysts who want automated campaign insights without manually crunching numbers.

## ⚙️ Features
- ✅ Automatically runs on the 1st of each month
- ✅ Fetches the last 30 days of ad performance via the Google Ads API (GAQL)
- ✅ Uses GPT-4o for natural-language insights & improvement ideas
- ✅ Groups ads by category and theme (e.g., "Free Shipping," "Premium")
- ✅ Generates a clean, formatted markdown report
- ✅ Archives reports in Google Sheets for trend tracking
- ✅ Notifies your Slack channel with AI-driven recommendations

## 🧠 Architecture
| Component | Purpose |
| --- | --- |
| n8n | Workflow engine |
| Google Ads API | Source of ad performance data |
| OpenAI (GPT-4o) | Analyzes CTR patterns and writes recommendations |
| Google Sheets | Report archiving and history tracking |
| Slack | Team notifications |

## 🧭 Workflow Logic (Summary)
Monthly Trigger (1st of Month), then:
1. **Get Performance Data (Google Ads API)** - Fetches 30-day CTR, clicks, and impressions for all responsive search ads.
2. **Prepare Performance Data** - Groups data by ad group and theme keywords, builds the AI prompt.
3. **AI Agent (LangChain) + GPT-4o** - Analyzes patterns and generates actionable insights.
4. **Generate Report (Code)** - Formats a Markdown report with AI recommendations and KPIs.
5. **Save to Google Sheets** - Archives results for long-term analytics.
6. **Send Report to Slack** - Delivers the summary directly to your marketing channel.

## 🔑 Environment Variables
| Variable | Example | Description |
| --- | --- | --- |
| GOOGLE_ADS_CUSTOMER_ID | 123-456-7890 | Google Ads customer account ID |
| GOOGLE_ADS_API_VERSION | v17 | Current Ads API version |
| GOOGLE_SHEET_ID | 1xA1B2c3D4EFgH... | Target spreadsheet ID |
| OPENAI_API_KEY | sk-xxxxx | OpenAI API key for GPT-4o |
| SLACK_WEBHOOK_URL | https://hooks.slack.com/... | Slack incoming webhook |

## 🔐 Credential Setup
| Service | Type | Required Scopes |
| --- | --- | --- |
| Google Ads | OAuth2 (googleAdsOAuth2Api) | https://www.googleapis.com/auth/adwords |
| OpenAI | API key (openAiApi) | Full access |
| Google Sheets | OAuth2 | https://www.googleapis.com/auth/spreadsheets |
| Slack | Webhook | chat:write |

## 🧱 Node-by-Node Breakdown
| Node | Purpose | Key Configuration |
| --- | --- | --- |
| Monthly Trigger | Starts workflow on the 1st of every month | Cron: 0 0 1 * * |
| Get Performance Data | Queries Ads data | Endpoint: https://googleads.googleapis.com/v17/customers/{id}/googleAds:search; Query: GAQL (CTR, clicks, impressions, last 30 days) |
| Prepare Performance Data | Aggregates data and builds the AI prompt | Groups by ad group and theme, computes CTRs |
| AI Agent – Analyze Performance | Passes formatted data to GPT-4o | System message: "You are a Google Ads performance analyst…" |
| OpenAI Chat Model (GPT-4o) | Analytical reasoning engine | Model: gpt-4o, Temperature 0.2 |
| Generate Report | Parses AI output, formats the Markdown report | Adds recommendations + next steps |
| Save Report to Sheets | Archives report | Sheet name: Performance Reports |
| Send Report (Slack) | Sends summary | Uses the report_markdown variable |

## 🧠 AI Report Example
30-Day Performance Analysis Report

**Executive Summary**
- Analyzed: 940 ads
- Period: Last 30 days

**Top Performing Categories**
- Running Shoes: 9.4% CTR (120 ads)
- Fitness Apparel: 8.2% CTR (90 ads)

**Top Performing Themes**
- "Free Shipping" messaging: 9.8% CTR (58 ads)
- "Premium" messaging: 8.5% CTR (44 ads)

**AI-Powered Recommendations**
- [HIGH] Emphasize "Free Shipping" across more ad groups. Expected Impact: +5% CTR
- [MEDIUM] Test "Premium Quality" vs. "New Arrivals." Expected Impact: +3% CTR

**Next Steps**
- Implement new ad variations
- A/B test messaging
- Re-analyze next month

## 🧩 Testing Procedure
1. Temporarily disable the cron trigger.
2. Run the workflow manually.
3. Confirm:
   - The Google Ads node returns JSON with results.
   - The AI Agent output is valid JSON.
   - The report is written to Sheets.
   - The Slack message is received.
4. Re-enable the monthly trigger once verified.

## 🧾 Output in Google Sheets
| Date | Ads Analyzed | Top Category | Top Theme | Key Recommendations | Generated At |
| --- | --- | --- | --- | --- | --- |
| 2025-10-01 | 940 | Running Shoes | Free Shipping | "Add Free Shipping copy to 10 ads" | 2025-10-01T00:05Z |

## 🪜 Maintenance
| Frequency | Task |
| --- | --- |
| Monthly | Review AI accuracy and update the themes list |
| Quarterly | Refresh Google Ads API credentials |
| As needed | Update GAQL fields for new metrics |

## ⚙️ API Verification
Endpoint: POST https://googleads.googleapis.com/v17/customers/{customer_id}/googleAds:search
Scopes: https://www.googleapis.com/auth/adwords

GAQL Query:

```sql
SELECT
  ad_group_ad.ad.id,
  ad_group_ad.ad.responsive_search_ad.headlines,
  ad_group.name,
  metrics.impressions,
  metrics.clicks,
  metrics.ctr
FROM ad_group_ad
WHERE segments.date DURING LAST_30_DAYS
  AND metrics.impressions > 100
ORDER BY metrics.clicks DESC
LIMIT 1000
```

✅ Valid query: verified for GAQL syntax, fields, and resource joins.
✅ OAuth2 flow handled by n8n's googleAdsOAuth2Api.
✅ Optional: add "timeout": 60000 for large accounts.
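For orientation, here is a minimal sketch of the "Prepare Performance Data" step: grouping responsive search ads by theme keyword and computing per-theme CTR. The theme list is illustrative; the response field shape follows the googleAds:search REST results (camelCase JSON).

```javascript
// n8n Code node — group ads by theme keyword and compute CTR per theme.
const THEMES = ['Free Shipping', 'Premium', 'New Arrivals', 'Sale']; // illustrative
const rows = $json.results || [];

const byTheme = {};
for (const row of rows) {
  const headlines = (row.adGroupAd?.ad?.responsiveSearchAd?.headlines || [])
    .map(h => h.text)
    .join(' ');
  for (const theme of THEMES) {
    if (!headlines.toLowerCase().includes(theme.toLowerCase())) continue;
    const t = (byTheme[theme] ??= { ads: 0, clicks: 0, impressions: 0 });
    t.ads++;
    t.clicks += Number(row.metrics?.clicks || 0);
    t.impressions += Number(row.metrics?.impressions || 0);
  }
}

const summary = Object.entries(byTheme).map(([theme, t]) => ({
  theme,
  ads: t.ads,
  ctr: t.impressions ? +(100 * t.clicks / t.impressions).toFixed(2) : 0,
}));

// Build the prompt string handed to the AI Agent node.
return [{ json: { summary, prompt: `Analyze these ad themes by CTR:\n${JSON.stringify(summary, null, 2)}` } }];
```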
## 📈 Metrics of Success
| KPI | Target |
| --- | --- |
| Report accuracy | ≥ 95% |
| Monthly automation success | ≥ 99% |
| CTR improvement tracking | +3–5% over time |

## 🔗 References
- Google Ads API Docs
- LangChain in n8n
- OpenAI API Reference
- Google Sheets API
- Slack Incoming Webhooks

## 🎯 Conclusion
You now have a fully automated Google Ads performance analysis workflow powered by:
- **Google Ads API** for granular metrics
- **OpenAI GPT-4o** for intelligent recommendations
- **Google Sheets** for archiving
- **Slack** for team-wide updates

💡 **Result**: A recurring, data-driven optimization loop that improves ad performance every month, with zero manual effort.