by Baptiste Fort
📘 Workflow Documentation – Stock Market Daily Digest

👋 Introduction

Wake up to a clean, analyst-style stock digest in your inbox—top gainers/losers, a readable performance table, 3–5 insights, and upcoming events—no spreadsheets, no manual scraping, no copy-paste. This article explains, step by step, how to build a robust, daily, end-to-end automation that collects market data (Bright Data), waits until scraping is done, aggregates results, asks an AI model (OpenAI) to draft a styled HTML email, logs everything to Airtable, and finally sends the report via Gmail. You'll find a friendly but technical tour of every single node, so you can rebuild or adapt the same pipeline with confidence.

🎯 Who is this workflow for?

- **Investors & traders** who want a quick, readable daily summary.
- **Finance/Product teams** building data-driven alerts/digests.
- **Consultants & agencies** sending recurring client updates.
- **Automation builders** prototyping finance ops quickly.

🧰 Tools you'll need

- **Bright Data** — dataset triggers & snapshots for reliable web data.
- **OpenAI (GPT)** — to generate a professional HTML digest.
- **Airtable** — store daily rows for history, filters, dashboards.
- **Gmail** — deliver the final HTML email to stakeholders.
- **n8n** — the automation engine that orchestrates every step.

Example Airtable table: Daily Stocks

| Ticker | Company               | Price  | Change % | Sentiment   | Date             |
|--------|-----------------------|--------|----------|-------------|------------------|
| AAPL   | Apple Inc.            | 225.80 | +1.4%    | 🟢 Positive | 2025-09-18 09:00 |
| MSFT   | Microsoft Corporation | 415.20 | -0.7%    | 🔴 Negative | 2025-09-18 09:00 |
| NVDA   | NVIDIA Corporation    | 124.55 | +2.1%    | 🟢 Positive | 2025-09-18 09:00 |
| TSLA   | Tesla Inc.            | 260.00 | -3.0%    | 🔴 Negative | 2025-09-18 09:00 |
| META   | Meta Platforms Inc.   | 310.45 | +0.5%    | 🟡 Neutral  | 2025-09-18 09:00 |

> Keep API keys in n8n Credentials (never hard-code secrets).
🗺️ Architecture at a glance

1. Schedule fires daily
2. Seed list of tickers
3. Split into one item per stock
4. Prepare keyword for scraping
5. Launch Bright Data job
6. Poll progress with a wait-loop
7. Fetch snapshot data
8. Aggregate for the AI
9. Generate HTML summary (GPT)
10. Save rows to Airtable
11. Send email via Gmail

⚙️ Step-by-step — every node explained

⏰ Daily Run Trigger (Schedule Trigger)

Purpose: start the automation at a precise time each day so nobody needs to push a button.

Parameters:
- **Trigger Type**: Time Interval or Cron
- **Every X**: 1 Day (or your preferred cadence)
- **Timezone**: UTC (or your own)
- **Start Time**: optional (e.g., 09:00)

📝 Set Stock List (Set Node – sample data)

Purpose: define the universe of stocks to monitor. This acts as the seed data for scraping.

Parameters:
- **Values to Set**: Fixed JSON (array of objects)
- **Keep Only Set**: true
- **Fields per item**: ticker, name, market_cap (you may add sector, isin, etc.)

🔀 Split Stocks (Split Out)

Purpose: turn the array into individual items so each ticker is processed independently (scraping, polling, results).

Parameters:
- **Operation**: Split Out Items
- **Field to Split**: the array defined in the previous Set node

🏷 Prepare Stock Keyword (Set Node)

Purpose: create a keyword field (typically equal to ticker) for Bright Data discovery.

Parameters:
- **Values to Set**: Add Field
- **Field Name**: keyword
- **Value**: an expression referencing the current item's ticker (e.g., `{{ $json.ticker }}`)

🕸 Bright Data Scraper (HTTP Request)

Purpose: trigger the Bright Data dataset to start collecting information for the keyword. Returns a snapshot_id to poll later.

Parameters:
- **Method**: POST
- **Endpoint**: https://api.brightdata.com/datasets/v1/trigger
- **Authentication**: Authorization: Bearer <token> (header)
- **Body Fields**:
  - dataset_id: your Bright Data dataset ID
  - discover_by: usually keyword
  - keyword: the value prepared above

> Add a retry/backoff policy on 429/5xx in the node options.
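The trigger request described above can be sketched in plain Python. This is a minimal sketch assuming the body fields listed (dataset_id, discover_by, keyword); check Bright Data's API reference for the exact parameter placement, and keep the real token in n8n Credentials rather than in code.

```python
def build_trigger_request(token: str, dataset_id: str, keyword: str) -> dict:
    """Assemble the pieces an HTTP client needs to launch a scrape job.

    Field placement mirrors the node description above; your dataset may
    expect some of these as query parameters instead of body fields.
    """
    return {
        "method": "POST",
        "url": "https://api.brightdata.com/datasets/v1/trigger",
        "headers": {
            "Authorization": f"Bearer {token}",  # store the token as a credential
            "Content-Type": "application/json",
        },
        "body": {
            "dataset_id": dataset_id,
            "discover_by": "keyword",
            "keyword": keyword,
        },
    }
```

In n8n itself this is simply the HTTP Request node's configuration; the sketch is only meant to make the request shape explicit.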
🔄 Check Scraper Progress (HTTP Request)

Purpose: poll Bright Data to see whether the snapshot is running or ready.

Parameters:
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v1/snapshots/{snapshot_id}
- **Authentication**: Authorization: Bearer <token>
- **Expected Output**: a status field (running, ready)

⏳ Wait for Data (Wait Node)

Purpose: pause between progress checks to avoid rate limits and give Bright Data time to finish.

Parameters:
- **Mode**: wait a fixed amount of time
- **Time**: e.g., 30 seconds (tune to your dataset size)

🔀 Scraper Status Switch (Switch Node)

Purpose: route logic based on the polled status.

Parameters:
- **Value to Check**: status
- **Rules**:
  - Equals running → go to Wait for Data (then re-check)
  - Equals ready → proceed to Fetch Scraper Results

> Loop pattern: Check → Wait → Check, until ready.

📥 Fetch Scraper Results (HTTP Request)

Purpose: download the completed snapshot data once Bright Data marks it ready.

Parameters:
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v1/snapshots/{snapshot_id}/data
- **Authentication**: Authorization: Bearer <token>
- **Query**: format=json
- **Output**: an array of rows per ticker (price, change %, and any fields your dataset yields)

> Normalize fields with a Set/Code node if needed.

📊 Aggregate Stock Data (Aggregate Node)

Purpose: combine all individual items into one consolidated object so the AI can analyze the entire market snapshot.

Parameters:
- **Mode**: Aggregate (merge to a single item)
- **Fields to Include**: ticker, name, price, change, sentiment (plus any extra fields captured)
- **Output**: one JSON item containing an array/map of the day's stocks

🤖 Generate Daily Summary (AI Node – OpenAI)

Purpose: ask the model to convert raw data into a styled HTML email: headline, top movers, table, insights, and (optional) upcoming events.
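The Check → Wait → Check loop above can be sketched as a small polling function. Here `fetch_status` stands in for the GET snapshot call, and the 30-second delay and `status` values mirror the node parameters described above; the `max_attempts` cap is an assumption added so the sketch cannot loop forever.

```python
import time

def poll_until_ready(fetch_status, delay_seconds=30, max_attempts=20):
    """Re-check the snapshot status until Bright Data reports 'ready'.

    fetch_status() stands in for the HTTP Request node and must return a
    dict with a 'status' key ('running' or 'ready').
    """
    for _ in range(max_attempts):
        snapshot = fetch_status()
        if snapshot["status"] == "ready":
            return snapshot
        time.sleep(delay_seconds)  # the Wait node between checks
    raise TimeoutError("snapshot never became ready")
```

In n8n the same pattern is built from the Switch node routing "running" back through the Wait node; the cap on attempts is worth replicating there with a counter so a stuck snapshot eventually fails loudly.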
Parameters:
- **Model**: gpt-4.1
- **Input**: the aggregated JSON from the previous node
- **Prompt guidelines**:
  - Output HTML only, with inline styles (email-safe)
  - Include a table (Ticker, Company, % Change with ↑/↓ & color, Market Cap, Sentiment icon)
  - Highlight the top 2 gainers & 2 losers, with short reasoning if present
  - Provide 3–5 insights (sector rotation, volatility, outliers)
  - Add upcoming events when available (earnings, launches, macro)
  - Footer: "Generated automatically by your AI-powered stock monitor"
- **Output field**: confirm the exact property that contains the HTML (e.g., output, message, text)

🗂 Save to Airtable (Airtable – Create Record)

Purpose: log each item (or the roll-up) to Airtable for history, filtering, and dashboards.

Parameters:
- **Operation**: Create Record
- **Base ID**: from your Airtable URL
- **Table**: e.g., Daily Stocks
- **Field Mapping**:
  - Ticker ← `{{ $json.ticker }}`
  - Company ← `{{ $json.name }}`
  - Price ← `{{ $json.price }}`
  - Change % ← `{{ $json.change }}`
  - Sentiment ← `{{ $json.sentiment }}`
  - Date ← `{{ $now.toISO() }}`

> Use a Single-Select for Sentiment (🟢 / 🟡 / 🔴) to build clean Airtable views.

📧 Send Report via Gmail (Gmail Node)

Purpose: deliver the AI-generated HTML digest to your recipients.

Parameters:
- **Operation**: Send Email
- **Send To**: one or more recipients (e.g., investor@domain.com)
- **Subject**: Daily Stock Market Digest – {{ $now.format("yyyy-MM-dd") }}
- **Message (HTML)**: reference the AI node's HTML property (e.g., `{{ $('Generate Daily Summary').first().json.output }}`)
- **Options**: set **Append Attribution** to false (keep the email clean)

> Test in Gmail, Outlook, and mobile to validate inline CSS.

🧪 Error handling & reliability tips

- **Backoff on Bright Data** — if scraping many tickers, increase the **Wait** time or batch requests.
- **Guard against empty results** — if a snapshot returns 0 rows, branch to a fallback email ("No data today").
- **AI guardrails** — enforce "HTML-only" output and skip missing sections gracefully.
- **Airtable normalization** — strip %, cast numbers to float before insert.
- **Observability** — add a final Slack/Email **On Fail** node with the run ID and error message.

🧩 Customization ideas

- **Sector deep-dives**: add sector fields and a second AI paragraph on sector rotation.
- **CSV attachment**: generate & attach a CSV for power users.
- **Multiple lists**: run parallel branches for Tech, Healthcare, or regions.
- **Other asset classes**: crypto, ETFs, indices, FX.
- **Audience targeting**: different "To" lists and slightly different prompts per audience.

✅ Why this workflow is powerful

- **Hands-off** — the report simply shows up every day.
- **Analyst-grade** — clean HTML, top movers, tidy table, actionable insights.
- **Auditable** — rows archived in Airtable for history and dashboards.
- **Composable** — swap scrapers, LLMs, storage, or email service.
- **Scalable** — start with 10 tickers, grow to many lists using the same loop.

For advanced no-code & AI projects, see 0vni – Agence automatisation.
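The Airtable normalization tip above (strip %, cast to float) is the kind of thing a small Code node handles well. A minimal sketch, with illustrative field names that you should match to your own dataset:

```python
def normalize_row(row: dict) -> dict:
    """Prepare one scraped row for Airtable: numeric price, float change %."""
    change = str(row.get("change", "0")).replace("%", "").replace("+", "").strip()
    return {
        "ticker": row["ticker"],
        "price": float(row["price"]),
        "change_pct": float(change),             # "-3.0%" becomes -3.0
        "sentiment": row.get("sentiment", "🟡 Neutral"),  # assumed default
    }
```

Casting before insert keeps Airtable's number fields sortable and lets views filter on thresholds like `change_pct < -2`.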
by Mihai Farcas
This n8n template demonstrates how to deploy an AI workflow in production while simultaneously running a robust, data-driven evaluation framework to ensure quality and optimize costs.

Use Cases

- **Model Comparison**: quickly A/B test different LLM models (e.g., Gemini 3 Pro vs. Flash Lite) for speed and cost efficiency against your specific task.
- **Prompt Regression**: ensure that tweaks to your system prompt do not introduce new errors or lower the accuracy of your lead categorization.
- **Production Safety**: guarantee that test runs never trigger real-world actions like sending emails to a client or sales team.

Requirements

- A configured Gmail Trigger (or equivalent email trigger).
- A Google Gemini account for the LLM models.
- An n8n Data Table containing your "Golden Dataset" of test cases and ground truths.

How it Works

The workflow contains two distinct, parallel execution paths.

Production Path:
1. The Gmail Trigger monitors for new emails.
2. The email text is routed through the Sentiment Analysis node, which categorizes the lead as Positive, Neutral, or Negative.
3. The Check if Evaluating nodes verify the current execution mode. If it is not an evaluation run (the Fail branch), the lead is routed to the corresponding Send Email node for action.

Evaluation Path:
1. The When fetching a dataset row trigger pulls test cases (input text and expected sentiment/ground truth) from an n8n Data Table.
2. Each test case loops through the same Sentiment Analysis node.
3. The Check if Evaluating nodes route this path to the Success branch, skipping the real email actions.
4. The Save Output node writes the model's prediction to the Data Table.
5. The Set Metrics node uses the Categorization metric to compare the prediction against the ground truth, returning a score (0 or 1) to measure accuracy.
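The Categorization metric in the last step reduces to an exact-match comparison between the predicted and expected labels. A minimal sketch of that scoring, with an aggregate accuracy helper added for illustration (the row field names are assumptions, not the Data Table's actual column names):

```python
def categorization_score(prediction: str, ground_truth: str) -> int:
    """Return 1 when the predicted label matches the expected one, else 0."""
    return int(prediction.strip().lower() == ground_truth.strip().lower())

def accuracy(rows) -> float:
    """Aggregate per-row 0/1 scores from the Golden Dataset into accuracy."""
    scores = [categorization_score(r["prediction"], r["expected"]) for r in rows]
    return sum(scores) / len(scores)
```

Running the same Golden Dataset through two model configurations and comparing the resulting accuracy (and latency/cost) is exactly the A/B comparison the template is built for.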
Key Technical Details Model Switching: Multiple Google Gemini Chat Model nodes are connected via the Model input on the Sentiment Analysis node, allowing you to easily swap and compare models without changing the core logic. Edge Case Handling: The System Prompt Template in the Sentiment Analysis node is customized to handle tricky inputs, such as negative feedback about a competitor that should be classified as a Positive lead. Metrics: The workflow uses the built-in Categorization metric, which is ideal for classification tasks like sentiment analysis, to provide objective evidence of performance.
by Billy Christi
Who is this for?

This workflow is ideal for:

- **Business analysts and data professionals** who need to quickly analyze spreadsheet data through natural conversation
- **Small to medium businesses** seeking AI-powered insights from their Google Sheets without complex dashboard setups
- **Sales teams and marketing professionals** who want instant access to customer, product, and order analytics

What problem is this workflow solving?

Traditional data analysis requires technical skills and time-consuming manual work. This AI data analyst chatbot solves that by:

- **Eliminating the need for complex formulas or pivot tables** — just ask questions in plain text
- **Providing real-time insights** from live Google Sheets data whenever you need them
- **Making data analysis accessible** to non-technical team members across the organization
- **Maintaining conversation context** so you can ask follow-up questions and dive deeper into insights
- **Combining multiple data sources** for comprehensive business intelligence

What this workflow does

This workflow creates an intelligent chatbot that can analyze data from Google Sheets in real time, providing AI-powered business intelligence and data insights through a conversational interface.
Step by step:

1. Chat Trigger receives incoming chat messages, with session ID tracking for conversation context
2. Parallel Data Retrieval fetches live data from multiple Google Sheets simultaneously
3. Data Aggregation combines data from each sheet into structured objects for analysis
4. AI Analysis processes user queries using OpenAI's language model with the combined data context
5. Intelligent Response delivers analytical insights, summaries, or answers back to the chat interface

How to set up

1. Connect your Google Sheets account to all Google Sheets nodes for data access
2. View & copy the example Google Sheet template here: 👉 Smart AI Data Analyst Chatbot – Google Sheet Template
3. Update the Google Sheets document ID in all Google Sheets nodes to point to your specific spreadsheet
4. Configure sheet names to match your Google Sheets structure
5. Add your OpenAI API key to the OpenAI Chat Model node for AI-powered analysis
6. Customize the AI Agent system message to reflect your specific data schema and analysis requirements
7. Configure the chat trigger webhook for your specific chat interface implementation
8. Test the workflow by sending sample queries about your data through the chat interface
9. Monitor responses to ensure the AI is correctly interpreting and analyzing your Google Sheets data

How to customize this workflow to your needs

- **Replace with your own Google Sheets**: update the Google Sheets nodes to connect to your specific spreadsheets based on your use case.
- **Replace with different data sources**: swap Google Sheets nodes for other data connectors like Airtable, databases (PostgreSQL, MySQL), or APIs to analyze data from your preferred platforms.
- **Modify AI instructions**: customize the Data Analyst AI Agent system message to focus on specific business metrics or analysis types.
- **Change AI model**: switch to a different LLM such as Gemini, Claude, or others based on your complexity and cost requirements.

Need help customizing?
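The Data Aggregation step above merges the parallel sheet reads into one object the AI Agent can reason over. A minimal sketch of that shape, assuming each Google Sheets node returns a list of row dicts (the sheet names are illustrative):

```python
def build_context(sheets: dict) -> dict:
    """Merge rows from several sheets into one structured context object.

    `sheets` maps a sheet name (e.g. "customers", "orders") to its list of
    row dicts, mirroring what the parallel Google Sheets nodes return.
    """
    return {
        name: {"row_count": len(rows), "rows": rows}
        for name, rows in sheets.items()
    }
```

Passing row counts alongside the raw rows gives the model a cheap sanity check ("there are 312 orders") before it dives into the data itself.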
Contact me for consulting and support: 📧 billychartanto@gmail.com
by WeblineIndia
Landing Page Lead Intake via Webhook to Zoho CRM, Jira Task & Slack Alerts

This n8n workflow captures lead data from a landing-page webhook, validates required fields, and then processes the lead by creating a Zoho CRM Lead, generating a Jira Task, and notifying a Slack channel. If required fields are missing, the workflow skips CRM + Jira creation and instead notifies Slack with the available lead details.

⚡ Quick Start: 5-Step Fast Implementation

1. Import this workflow JSON into n8n.
2. Configure credentials: Zoho CRM OAuth2, Jira Cloud, Slack OAuth2.
3. Copy the webhook URL and connect it to your landing page form.
4. Ensure your form sends: first_name, last_name, company_name, email, phone, title, description, referrer.
5. Activate the workflow, send a test POST, and verify the Zoho, Jira & Slack outputs.

What It Does

This workflow works as an automated lead pipeline. When the landing page sends JSON to the webhook, the workflow checks whether last_name and company_name are present. If both fields exist, it creates a Zoho CRM lead, then generates a Jira task using the same data. A detailed Slack message is then posted with all lead information and the newly created Jira task ID. If either required field is missing, the workflow does not create CRM or Jira entries. Instead, it sends a Slack notification with the available details so teams can intervene manually without entering incorrect data into the CRM.

Who's It For

- Marketing teams capturing leads from landing pages.
- Sales teams using CRM and Jira for task tracking.
- Internal teams who want Slack alerts for new leads.
- Agencies and startups handling inbound lead flow.
- Anyone requiring automated lead routing without manual work.
Prerequisites

- n8n instance
- Zoho CRM OAuth2 credential
- Jira Software Cloud credential
- Slack OAuth2 credential
- A landing page that sends POST JSON payloads
- Required payload fields: first_name, last_name, company_name, email, phone, title, description, referrer

How to Use & Setup

Step 1: Import Workflow. Go to n8n → Workflows → Import workflow JSON.

Step 2: Configure Credentials. Add your credentials in:
- Zoho CRM (Create a lead)
- Jira Software Cloud (Create an issue)
- Slack (Send a message & Send a message1)

Step 3: Connect Webhook. Copy the Webhook URL from the Webhook node and configure your landing page to send POST JSON to it.

Step 4: Field Validation. The If node checks that:
- last_name exists
- company_name exists
If both exist → CRM + Jira + Slack; if either is missing → Slack-only alert.

Step 5: Test Workflow. Send sample JSON using your landing page or Postman, then check Zoho CRM, Jira task creation, and Slack messages.

Step 6: Activate Workflow. Enable the workflow after verification.

How To Customize Nodes

- Webhook Node: add/remove expected fields; modify the payload structure.
- If Node: add more validations; switch to OR logic.
- Zoho CRM Lead Node: add additional fields; modify the CRM field mapping.
- Jira Task Node: change project, issue type, priority, assignee; modify the description template.
- Slack Nodes: change the channel; rewrite notification messages; add Slack formatting.

Add-ons (Optional Enhancements)

- Email notification to the lead
- Google Sheets entry logging
- Duplicate lead detection
- Lead scoring system
- CRM sync extensions (Contact, Account, etc.)

Use Case Examples

- Marketing campaign lead automation.
- Instant Slack alerts for new inbound leads.
- Customer inquiry → Jira task workflow.
- Data quality enforcement (avoid CRM pollution).
- Trigger for a larger lead qualification workflow.

(And many more possible use cases.)
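The If-node validation described in Step 4 reduces to a simple presence check on two fields. A minimal sketch of that routing decision (branch names are illustrative labels, not n8n node names):

```python
REQUIRED_FIELDS = ("last_name", "company_name")

def route_lead(payload: dict) -> str:
    """Mirror the If node: run the full pipeline only when both required
    fields are present and non-empty; otherwise fall back to a Slack alert."""
    missing = [f for f in REQUIRED_FIELDS if not payload.get(f)]
    if missing:
        return "slack_only"      # notify with whatever details are available
    return "crm_jira_slack"      # Zoho lead -> Jira task -> Slack message
```

Switching the If node to OR logic, as suggested under customization, would correspond to routing to the full pipeline when *either* field is present rather than both.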
Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Webhook not triggered | Wrong webhook URL or wrong HTTP method | Check the URL and ensure POST is used |
| Zoho lead not created | Invalid credentials or missing required mapping | Reconnect Zoho credentials and verify fields |
| Jira task not created | Wrong project/issue/assignee config | Verify the project, issue type & permissions |
| Slack message not sent | Token expired or incorrect channel ID | Re-authenticate Slack and confirm the channel |
| Workflow stops at If node | last_name or company_name missing | Update the landing page form to include these fields |
| Slack message missing values | Wrong field names in payload | Ensure JSON fields match the expected structure |

Need Help?

For assistance with setup, customization, or building enhanced automation workflows, our n8n team at WeblineIndia can help you build & optimize your automations. We support:

- Workflow customization
- Add-on development
- Integration with other CRMs or apps
- Advanced automation logic
by Kdan
📌 Overview

This powerful workflow automates your sales quotation process by connecting Pipedrive with DottedSign. When a deal is moved to a specific stage in Pipedrive, the template automatically generates a professional PDF quotation, uploads it back to the deal, and sends it out for e-signature via DottedSign, saving your sales team valuable time and eliminating manual work.

What it does

When a Pipedrive deal moves to a designated stage (e.g., "Quotation Stage"), this workflow triggers and performs the following actions:

1. Gathers Data: it collects all relevant information, including deal details, client contacts, organization info, and associated products from Pipedrive.
2. Generates PDF Quote: it populates a customizable HTML template with the collected data and uses a PDF generation service (Gotenberg) to create a polished PDF document.
3. Uploads to Pipedrive: the generated PDF quote is automatically uploaded to the "Files" section of the corresponding Pipedrive deal for record-keeping.
4. Sends for E-Signature: it creates a new signing task in DottedSign, sending the quotation to the client for their electronic signature.

Requirements

- A Pipedrive account with admin permissions.
- A DottedSign developer account to obtain API credentials.
- A self-hosted instance of Gotenberg for HTML-to-PDF conversion.

How to set up

1. Pipedrive Trigger Stage: in the If node, change the stage ID 7 to the ID of the pipeline stage you want to use as the trigger.
2. PDF Conversion Service: in the Gotenberg to PDF (HTTP Request) node, replace the placeholder URL with the endpoint of your running Gotenberg instance.
3. DottedSign Credentials: in the Get DottedSign Access Token node, enter your client_id and client_secret in the request body.
4. DottedSign Signature Field: in the Create DottedSign Task node, you must adjust the page and coord values under field_settings to match the desired signature location on your PDF template.
How to customize the workflow

- **Quotation Template**: edit the Generate Quotation HTML node to modify the quote's appearance, text, company logo, and terms. The {{ ... }} expressions are placeholders that are filled with Pipedrive data.
- **Trigger**: replace the Pipedrive Trigger with another trigger, such as a webhook or a form submission, to adapt the workflow to different needs.
- **Notifications**: add a Slack or email node at the end of the workflow to notify the sales team once the quotation has been sent.
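The {{ ... }} placeholder substitution that the Generate Quotation HTML node performs can be sketched as a small templating function. The field names used in the test (deal_title) are illustrative; use whichever keys your node actually exposes from Pipedrive:

```python
import re

def fill_template(html: str, data: dict) -> str:
    """Replace {{ key }} placeholders in the quotation HTML with deal data.

    Unknown keys render as empty strings rather than raising, so a missing
    Pipedrive field degrades gracefully instead of breaking the PDF step.
    """
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(data.get(m.group(1), "")),
        html,
    )
```

The resulting HTML string is what gets posted to Gotenberg for conversion to PDF.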
by CentralStationCRM
Overview

This template benefits anyone who wants to:

- automate web research on a prospect company
- compile that research into an easily readable note
- save the note into CentralStationCRM

Tools in this workflow

- CentralStationCRM, the easy and intuitive CRM software for small teams. See our API documentation if you want to customize the workflow.
- ChatGPT, the well-known AI chatbot
- Tavily, a web search service for large language models

Disclaimer

Tavily Web Search is (as of yet) a community node. You have to activate the use of community nodes inside your n8n account to use this workflow.

Workflow Description

The workflow consists of:

- a webhook trigger
- an AI agent node
- an HTTP request node

The Webhook Trigger

The webhook is set up in CentralStationCRM to trigger when a new company is created inside the CRM. The Webhook Trigger node in n8n then fetches the company data from the CRM.

The AI Agent Node

The node uses ChatGPT as its chat model and two Tavily Web Search operations ('search for information' and 'extract URLs') as tools. Additionally, it uses a simple prompt as a tool, telling the model to re-iterate on the research data if applicable. The AI Agent node takes the company name and prompts ChatGPT to "do a deep research" on this company on the web: "The research shall help sales people get a good overview about the company and allow to identify potential opportunities." The AI Agent then formats the results into markdown and passes them to the next node.

The CentralStationCRM Protocol Node

This is an HTTP request to the CentralStationCRM API. It creates a 'protocol' (the API's name for notes in the CRM) with the markdown data it received from the previous node. This protocol is saved in CentralStationCRM, where it can easily be accessed as a note when clicking on the new company entry.

Customization ideas

Even though this workflow is pretty simple, it poses interesting possibilities for customization.
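The final HTTP request can be pictured as a small payload builder. This is a hypothetical sketch only: the actual endpoint, wrapper key, and field names come from the CentralStationCRM API documentation, and the names used below (protocol, name, content) are assumptions for illustration.

```python
def build_protocol_payload(company_id: int, markdown_note: str) -> dict:
    """Wrap the AI agent's markdown research as a CRM 'protocol' (note).

    Field names are illustrative; consult the CentralStationCRM API docs
    for the real request shape before wiring this into the HTTP node.
    """
    return {
        "company_id": company_id,
        "protocol": {
            "name": "AI company research",   # note title shown in the CRM
            "content": markdown_note,        # markdown from the AI agent node
        },
    }
```

In the workflow itself, `company_id` comes from the webhook trigger's company data and `markdown_note` from the AI agent's output.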
For example, you can alter the webhook trigger (in CentralStationCRM and n8n) to fire when a person is created. You then have to alter the AI prompt as well, and make sure the third node adds the research note to the person, not a company, via the CentralStationCRM API. You could also swap the AI model used here for another one, compare the resulting research data, and get a deeper understanding of AI chat models. Then of course there is the prompt itself: you can double down on the information you are most interested in and refine your prompt to make the model focus on those areas of research. Start experimenting a bit!

Preconditions

For this workflow to work, you need:

- a CentralStationCRM account with API access
- an n8n account with API access
- an OpenAI account with API access

Have fun with our workflow!
by Daniel Rosehill
This workflow provides a mechanism for turning AI-transcribed voice notes from Voicenotes AI into prompts for an AI agent. On the "collection" end of the workflow, we gather the output (with the recorded prompt) and do two things:

1. It is saved into NocoDB as a new row on a database table recording AI outputs and prompts.
2. The prompt gets sent to an AI agent and the output gets returned to the user's email.

Who Is It For?

If you like using voice AI tools to write detailed prompts for AI, then this workflow helps to remove the points of friction in getting from A to B!

How Does It Work?

Simply tag your voice note in Voicenotes with your preferred tag (I'm using 'prompt'). Then, provide the n8n webhook as the URL for a webhook that will trigger whenever a new note is created with this tag (and this tag only). Now, whenever you wish to use a voice note as a prompt, just add the tag. This will trigger the webhook which, in turn, will trigger this workflow, sending the prompt to an AI agent of your choosing (configured within the workflow) and then saving the output into a database and returning it by email.

Note: the AI agent system prompt is written to define a structured output that produces Gmail-safe HTML. This is then injected into a template. You can use a Google Group to gather together the output runs or just receive them at your main address (if you don't use Gmail, just swap out for any other email node or your preferred delivery channel).

How To Set It Up

1. You'll need a Voicenotes account in order to use the service!
2. Once you have one, create the tag and the webhook: in n8n, create your webhook node and then provide that URL to Voicenotes.
3. Create a note, then assign it a new tag: "Prompts" (or as you prefer). The webhook is matched to the tag.

Requirements

- Voicenotes AI account

Customisation

The delivery mechanism can be customized to your preferences.
If you're not a Google user, substitute the template and sending mechanism for your preferred delivery provider. You could, for example, collect the outputs in a Slack channel or a Telegram bot. You may also omit the NocoDB collector, or substitute it with another wiki or knowledge-management platform such as Notion or Nuclino.
by Viktor Klepikovskyi
Configurable Multi-Page Web Scraper

Introduction

This n8n workflow provides a robust and highly reusable solution for scraping data from paginated websites. Instead of building a complex series of nodes for every new site, you only need to update a simple JSON configuration in the initial Input Node, making your scraping tasks faster and more standardized.

Purpose

The core purpose of this template is to automate the extraction of structured data (e.g., product details, quotes, articles) from websites with multiple pages. It is designed to be fully recursive: it follows the "next page" link until no link is found, aggregates the results from all pages, and cleanly structures the final output into a single list of items.

Setup and Configuration

1. Locate the Input Node: the entire configuration for the scraper is held within the first node of the workflow.
2. Update the JSON: replace the existing JSON content with your target website's details:
   - startUrl: the URL of the first page to begin scraping.
   - nextPageSelector: the CSS selector for the "Next" or "Continue" link element that leads to the next page. This is crucial for the pagination loop.
   - fields: an array of objects defining the data to extract on each page. For each field, specify the name (the output key), the selector (the CSS selector pointing to the data), and the value (the HTML attribute to pull, usually text or href).
3. Run the Workflow: after updating the configuration, execute the workflow. It will automatically loop through all pages and deliver a final, structured list of the scraped data.

For a detailed breakdown of the internal logic, including how the loop is constructed using the Set, If, and HTTP Request nodes, please refer to the original blog post: Flexible Web Scraping with n8n: A Configurable, Multi-Page Template
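The configuration and the recursive loop can be sketched together in plain Python. The config dict mirrors the three keys described in the setup steps (the selectors are illustrative); `fetch_page` stands in for the HTTP Request node plus extraction, returning the page's items and whatever URL the nextPageSelector resolved to (None on the last page):

```python
# Example of the Input Node configuration, with illustrative selectors.
config = {
    "startUrl": "https://example.com/page/1",
    "nextPageSelector": "a.next",
    "fields": [{"name": "title", "selector": "h2 a", "value": "text"}],
}

def scrape_all_pages(fetch_page, start_url: str) -> list:
    """Sketch of the Set -> HTTP Request -> If pagination loop.

    fetch_page(url) must return {"items": [...], "next_url": str or None}.
    The loop follows next_url until no next-page link is found, mirroring
    the workflow's If-node exit condition.
    """
    results, url = [], start_url
    while url:
        page = fetch_page(url)
        results.extend(page["items"])   # aggregate items across pages
        url = page.get("next_url")      # follow the "next" link, if any
    return results
```

The If node in the workflow plays the role of the `while url` condition here: when the nextPageSelector matches nothing, the loop exits and the aggregated list is emitted.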
by Hassan
Overview

Transform competitor Instagram content into optimized scripts for your own channel with this fully automated, AI-powered content intelligence system. This workflow monitors Instagram profiles in your niche (AI/automation/tech), downloads their videos, transcribes the content, analyzes it for valuable tools and technologies, enriches it with web research, and rewrites it into polished, engagement-optimized scripts ready for your content team. It's like having a 24/7 content research team that never sleeps, turning competitor content into fresh opportunities for your channel.

Key Benefits

- 🎯 Automated Competitive Intelligence - Monitor unlimited Instagram profiles and automatically capture their latest content the moment they post, ensuring you never miss trending topics in your niche.
- 🤖 AI-Powered Content Analysis - GPT-4O intelligently filters videos to identify only those discussing relevant tools, technologies, or AI solutions, saving hours of manual review time.
- ✍️ Professional Script Rewriting - Automatically transforms competitor scripts into unique, high-quality content optimized for your brand voice, with engagement-focused CTAs that drive comments and DMs.
- 🔍 Deep Research Integration - Enriches every script with fresh facts from Perplexity AI's web search, adding unique insights and credibility that set your content apart from simple reposts.
- 📊 Comprehensive Data Tracking - Stores all video metadata (views, likes, comments, duration) alongside original and rewritten scripts for performance analysis and content strategy optimization.
- ⚡ Scalable Batch Processing - Process multiple Instagram profiles in a single execution with built-in error handling, ensuring the workflow continues even if individual videos fail to process.
- 💰 Revenue-Generating Lead Magnet - A built-in CTA system ("comment [keyword] for the link") creates engagement and captures leads directly into your DMs for monetization opportunities.
- 🔄 100% Repurposable Output - Every processed video becomes a ready-to-use script with step-by-step guides, tool information, and engagement hooks, perfect for reels, shorts, or TikToks.

How It Works

Phase 1: Content Discovery & Extraction

The workflow begins by reading your curated list of Instagram profiles from a Google Sheet. These should be competitors or influencers in your niche (AI, automation, tech tools, etc.). For each profile, the system uses the Scrape Creators API to fetch their most recent post, specifically targeting video content. It extracts multiple video quality URLs and prepares them for download.

Phase 2: Video Processing & Transcription

Once a video is identified, the workflow downloads it directly from Instagram's servers using the highest-quality version available. The video file is then passed to OpenAI's Whisper transcription model, which converts the audio into an accurate text transcript. This happens automatically, even for videos with background music, multiple speakers, or accents.

Phase 3: Intelligent Content Filtering

The raw transcript is analyzed by GPT-4O using a sophisticated prompt that determines whether the content is relevant to your niche. The AI identifies:

- Whether the video discusses tools, technologies, or AI solutions (verdict: true/false)
- Specific tool names mentioned
- Step-by-step instructions for using the tools
- A search query for additional research
- Suggestions for making the content more engaging for an AI/automation audience

If the content isn't relevant, the workflow skips it and moves to the next profile, saving API costs and processing time.

Phase 4: Deep Research & Fact-Finding

For relevant content, the system automatically queries Perplexity AI using the generated search prompt. Perplexity searches the web in real time to find three interesting, peculiar, or unique facts about the tool or technology. This adds depth and credibility to your final script that the original content likely didn't have.
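The relevance-filtering step in Phase 3 hinges on parsing the model's structured reply and failing safe. A minimal sketch, assuming the model is prompted to return JSON with the fields listed above (the exact key names depend on your own prompt):

```python
import json

def parse_verdict(model_output: str):
    """Parse the filter model's JSON reply; return None to skip the video.

    Skipping on malformed output or a false verdict mirrors the workflow's
    behavior of moving to the next profile to save API costs.
    """
    try:
        data = json.loads(model_output)
    except json.JSONDecodeError:
        return None                       # malformed reply -> skip this video
    if data.get("verdict") is not True:
        return None                       # not relevant -> next profile
    return {
        "tools": data.get("tools", []),
        "search_query": data.get("search_query", ""),
    }
```

Only when this returns a dict does the workflow spend further credits on the Perplexity research and rewriting steps.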
**Phase 5: Professional Script Rewriting**
The final AI step combines everything: the original transcript, the step-by-step guide, the Perplexity research, and the improvement suggestions. GPT-4o rewrites the entire script in your brand voice (casual, spartan, straightforward) at approximately 100 words. The new script:

- Opens with a strong hook
- Presents the tool/technology clearly
- Includes the researched facts naturally
- Provides value-driven instructions
- Ends with a specific CTA (e.g., "Comment 'AI' and I'll send the link")

**Phase 6: Data Storage & Loop Execution**
All data is written back to your Google Sheet, including video metadata (ID, timestamp, caption, engagement metrics), the original transcript, and the rewritten script. The workflow then loops back to process the next Instagram profile in your list, continuing until all profiles have been processed.

Required Setup

**Google Sheets database** - Create a Google Sheet with two tabs:
- "profiles" tab - one column, "Instagram Handles" (without the @ symbol, e.g., "aiautomationhub")
- "phantom output" tab - columns: id, timeStamp, caption, commentcount, videoUrl, likesCount, videoViewsCount, Username, Duration, original Script, rewritten Script, style, Updated

**API credentials required**
- **Scrape Creators API** - for Instagram data extraction (handles, posts, videos)
- **OpenAI API key** - for Whisper transcription and GPT-4o script analysis/rewriting
- **Perplexity API key** - for real-time web research and fact-finding
- **Google Sheets OAuth** - for reading profiles and writing processed data

**Software requirements**
- n8n instance (cloud or self-hosted)
- Internet connection for API calls
- Sufficient OpenAI credits (approximately $0.05–0.15 per video processed)

Business Use Cases

- **Content creation agencies** - offer content-repurposing services to clients, turning competitor research into ready-to-post scripts at scale.
- **Social media managers** - monitor competitor content and generate fresh ideas for your own channels without manual research.
- **Course creators** - identify trending tools in your niche and create educational content around them before competitors do.
- **Affiliate marketers** - discover new tools to promote, complete with ready-made scripts and CTAs for lead generation.
- **SaaS companies** - track how competitors explain similar products and optimize your own messaging based on what works.
- **Newsletter operators** - find trending topics and tools to feature, with scripts easily adaptable to written content.

Revenue Potential

- **Direct sales**: sell this workflow template for $97–$297 depending on setup complexity and included support.
- **Subscription service**: offer managed content intelligence as a service at $197–$497/month, processing unlimited profiles for clients.
- **Agency upsell**: use the CTA system as a lead-generation tool to build an email/DM list, then sell content creation services at $500–$2,000 per client/month.
- **Course integration**: include as a bonus tool in a content creation course priced at $497–$997, increasing perceived value.
- **White-label licensing**: license to agencies for $997–$2,997 with white-label rights for their client base.
- **Time-savings ROI**: if a content team spends 2 hours per video on research and scripting at $50/hour, this workflow saves $100 per video. Processing 20 videos weekly = $104,000 in annual savings.
Difficulty Level & Build Time

**Difficulty: Intermediate–Advanced**
- Requires understanding of API authentication
- Needs basic JSON knowledge for data mapping
- Involves prompt engineering for optimal AI outputs

**Build time**: 3–4 hours for experienced n8n users, 6–8 hours for beginners
- Setup and API credential configuration: 1 hour
- Node connection and data flow: 1–2 hours
- Prompt optimization and testing: 1–2 hours
- Google Sheets schema creation: 30 minutes
- End-to-end testing with real profiles: 1–2 hours

**Maintenance**: Low - occasional prompt tweaks as AI models evolve

Detailed Setup Steps

1. **Create the Google Sheets database**
   - Create a new Google Sheet named "Instagram Content Intelligence"
   - Add a "profiles" tab with the column "Instagram Handles"
   - Add a "phantom output" tab with all required columns (see schema above)
   - Populate the profiles tab with 5–10 Instagram handles in your niche
2. **Obtain API credentials**
   - Sign up for Scrape Creators (https://scrapecreators.com) and get an API key
   - Create an OpenAI account and generate an API key with GPT-4o and Whisper access
   - Sign up for the Perplexity AI API (https://perplexity.ai) and get an API key
   - Connect Google Sheets via OAuth in n8n
3. **Import the workflow to n8n**
   - Copy the JSON workflow provided
   - In n8n, click "Import from File" or paste the JSON
   - All nodes will appear but show credential errors
4. **Configure credentials**
   - Click each node with a red error icon
   - Add the respective API credentials (Scrape Creators, OpenAI, Perplexity, Google Sheets)
   - Test each connection to ensure validity
5. **Map Google Sheets**
   - In the "Get row(s) in sheet1" node, select your Google Sheet and the "profiles" tab
   - In the "Update Entries2" node, select your Google Sheet and the "phantom output" tab
   - Verify that column mappings match your sheet structure
6. **Customize AI prompts**
   - Review the "Filter & Generate Suggestions" prompt and adjust it for your specific niche
   - Review the "Write New Script" prompt; modify tone, length, and CTA format to match your brand
   - Test with sample transcripts to optimize output quality
7. **Test with a single profile**
   - Add one Instagram handle to your profiles sheet
   - Click "Execute workflow" manually
   - Monitor each node's output to verify data flow
   - Check that the final script appears in the Google Sheet
8. **Scale to multiple profiles**
   - Add 10–20 Instagram profiles to your sheet
   - Run the full workflow and monitor for errors
   - Review output quality across different content types
   - Adjust the batch size if rate limits are hit
9. **Set up scheduling (optional)**
   - Replace the Manual Trigger with a Schedule Trigger
   - Set it to run daily at optimal times (e.g., 6 AM, when fresh posts exist)
   - Enable error notifications to catch failures
10. **Implement DM automation (advanced)**
    - Connect the Instagram API or tools like ManyChat
    - Monitor comments for keywords from your CTA
    - Auto-send tool links via DM to engaged users

Advanced Customization Options

- **Multi-language support**: add a language-detection node and conditional branches for different script formats per language.
- **Engagement scoring**: implement a scoring algorithm based on likes, comments, and views to prioritize which videos to repurpose first.
- **Content categorization**: add a classification layer to tag scripts by category (productivity tools, AI models, automation platforms) for better organization.
- **Thumbnail analysis**: integrate vision AI to analyze which thumbnail styles perform best and suggest designs for your repurposed content.
- **Sentiment analysis**: add sentiment detection to understand emotional tone and adjust rewritten scripts to match or improve engagement potential.
- **A/B script variants**: generate 2–3 script variations per video with different hooks/CTAs for split testing.
- **Competitor trend dashboard**: build a connected dashboard showing trending tools, engagement patterns, and content gaps in your niche.
- **Auto-publishing integration**: connect to the Instagram API or scheduling tools to automatically post rewritten content, with approval workflows.
- **Voice cloning integration**: add the ElevenLabs API to generate audio using your voice profile, making videos fully production-ready.
- **Multi-platform expansion**: extend to TikTok, YouTube Shorts, and LinkedIn by adjusting script length and platform-specific CTAs.
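The engagement-scoring customization can be sketched as a small ranking function. The field names match the "phantom output" sheet columns (likesCount, commentcount, videoViewsCount); the weights and formula are illustrative assumptions to tune against your own niche data:

```javascript
// Hypothetical engagement score used to rank scraped videos for repurposing.
function engagementScore(video) {
  const views = video.videoViewsCount ?? 0;
  const likes = video.likesCount ?? 0;
  const comments = video.commentcount ?? 0;
  if (views === 0) return 0;
  // Comments signal intent more strongly than likes, so weight them higher.
  const rate = (likes + 3 * comments) / views;
  // Log-scale raw reach so mega-accounts don't drown out smaller ones.
  return rate * Math.log10(views + 1);
}

// Sort a batch of scraped videos, highest priority first.
const videos = [
  { id: "a", videoViewsCount: 120000, likesCount: 9000, commentcount: 800 },
  { id: "b", videoViewsCount: 4000, likesCount: 900, commentcount: 300 },
];
videos.sort((x, y) => engagementScore(y) - engagementScore(x));
// The smaller account with a stronger engagement rate ranks first here.
console.log(videos.map((v) => v.id));
```

Dropping this into an n8n Code node before the download step lets you process only the top-N videos per run, which also helps stay under API rate limits.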
by Davide
This workflow automates the creation of AI-generated viral selfie images with celebrities using Nano Banana Pro Edit via RunPod, generates engaging social media captions, and publishes the content to Instagram via Postiz. It starts with a form submission where the user provides an image URL, a custom prompt, and an aspect ratio.

(START/RESULT example images appear here in the original listing.)

Key Advantages

1. ✅ **Full automation, zero manual effort** - From image generation to caption writing and publishing, the entire process is automated. This drastically reduces production time and eliminates repetitive manual tasks.
2. ✅ **Scalable content creation** - The workflow can handle unlimited submissions, making it ideal for creators, agencies, growth teams, and SaaS products offering AI-generated content.
3. ✅ **Consistent viral quality** - By using a dedicated AI content agent with strict guidelines, every post is optimized for engagement, consistent in tone and quality, and designed to maximize comments, shares, and saves.
4. ✅ **No technical skills required for end users** - The form-based entry point allows anyone to generate high-quality, celebrity-style content without understanding AI, APIs, or automation.
5. ✅ **Multi-tool integration in one pipeline** - The workflow seamlessly connects AI image generation (RunPod), AI content intelligence (Google Gemini), asset storage (Google Drive), and social media distribution (Postiz).
6. ✅ **Brand-safe and platform-native output** - Captions are written to feel human and authentic, avoiding obvious AI language, overuse of emojis, and mentions of AI generation. This increases trust and platform compatibility.
7. ✅ **Perfect for growth and monetization** - Ideal for viral growth experiments, personal brand scaling, automated influencer-style content, and AI-powered SaaS or lead magnets.

How it works

After the form submission, the workflow:

1. Sends the image and prompt to RunPod's Nano Banana Pro Edit API for AI image generation.
2. Periodically checks the generation status until it is completed.
3. Once the image is ready, it is downloaded and analyzed by Google Gemini to generate a viral-ready Instagram caption and hashtags.
4. The final image is uploaded to Google Drive and to Postiz for social media publishing.
5. The caption and image are combined and scheduled for posting on Instagram through the Postiz integration.

The process includes conditional logic, waiting intervals, and error handling to ensure reliable execution from input to publication.

Set up steps

To use this workflow in n8n:

1. **Configure credentials**
   - Add RunPod API credentials under httpBearerAuth, named "Runpods".
   - Set up Google Gemini (PaLM) API credentials for caption generation.
   - Add Postiz API credentials for social media posting.
   - Configure Google Drive OAuth2 credentials for image backup.
2. **Prepare nodes**
   - Ensure the Form Trigger node is set up with the required fields: IMAGE_URL, PROMPT, and FORMAT.
   - Update the RunPod API endpoints in the "Generate selfie" and "Get status clip" nodes if needed.
   - Verify the Google Drive folder ID in the "Upload file" node.
   - Replace XXX in the "Upload to Social" node with a valid Postiz integration ID.
3. **Test the flow**
   - Use the pinned test data in the "On form submission" node to simulate a form entry.
   - Activate the workflow and submit the form to trigger the process.
   - Monitor execution in n8n's workflow view to ensure all nodes run successfully.

👉 Subscribe to my new YouTube channel, where I share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
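The "check status until completed" step in this workflow can be sketched as a generic poll-with-backoff loop. The `COMPLETED`/`FAILED`/`IN_PROGRESS` status strings assume RunPod's serverless job states; verify them against the actual payloads returned by your "Get status clip" endpoint:

```javascript
// Generic polling loop (hypothetical RunPod status shape).
// `fetchStatus` stands in for the "Get status clip" HTTP Request node.
async function waitForJob(fetchStatus, { maxAttempts = 20, delayMs = 5000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const { status, output } = await fetchStatus();
    if (status === "COMPLETED") return output;        // image URL(s) ready
    if (status === "FAILED") throw new Error("generation failed");
    // Still queued or in progress: wait before the next check.
    await new Promise((r) => setTimeout(r, delayMs));
  }
  throw new Error("timed out waiting for generation");
}

// Example with a stub that completes on the third check.
let calls = 0;
const stub = async () =>
  ++calls < 3 ? { status: "IN_PROGRESS" } : { status: "COMPLETED", output: "image.png" };
waitForJob(stub, { delayMs: 10 }).then((out) => console.log(out)); // prints "image.png"
```

In n8n the same pattern is usually built from a Wait node plus an If node looping back to the status request; the cap on attempts is what keeps a stuck job from blocking the workflow forever.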
by Fabian Perez
This workflow automates your entire lead follow-up process across email, SMS, and WhatsApp. It starts on a schedule and pulls your latest leads from FollowUpBoss (FUB), checking when the workflow last ran. Each new contact is automatically validated: phone numbers and emails are cleaned, filtered, and checked for duplicates before any message is sent.

Once validated, the system intelligently decides how to reach each lead:

- 💬 Email + SMS if all data looks good
- 📧 Email only if the phone number is invalid
- 📱 SMS/WhatsApp only if the email is missing

Each message is personalized using data from the lead record, and everything is tracked back in your database for future reporting. This template helps agents, marketing teams, and CRM users run consistent follow-ups without missing a single contact. Whether you manage 10 or 10,000 leads, this flow scales effortlessly.

**Tools used**: FollowUpBoss, Gmail, Twilio/WhatsApp, n8n

> Tip: Replace your API keys and Gmail credentials before running.
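The channel-selection rules in this workflow can be sketched as a pure function. The field names and the regex checks are illustrative assumptions (not FollowUpBoss's exact schema); real phone validation would use a dedicated library:

```javascript
// Decide which channels to use for a validated lead (illustrative field names).
function chooseChannels(lead) {
  const hasEmail =
    typeof lead.email === "string" && /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(lead.email);
  // Loose E.164-ish check after stripping spaces, parentheses, and dashes.
  const hasPhone =
    typeof lead.phone === "string" &&
    /^\+?[0-9]{8,15}$/.test(lead.phone.replace(/[\s()-]/g, ""));

  if (hasEmail && hasPhone) return ["email", "sms"];
  if (hasEmail) return ["email"];
  if (hasPhone) return ["sms", "whatsapp"];
  return []; // nothing valid: log the lead and skip messaging
}

console.log(chooseChannels({ email: "a@b.com", phone: "+1 (555) 010-2030" }));
```

In n8n this logic maps naturally onto a Switch node (or a Code node feeding one), with each returned channel routing to the matching Gmail or Twilio branch.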
by Rahul Joshi
Description

This workflow automates employee retention analytics by combining candidate performance data with trait-level retention statistics. It scores candidates, validates data, and generates a polished Retention Digest HTML email using GPT (Azure OpenAI). Hiring managers receive structured insights weekly, highlighting top/weak traits, candidate scores, and actionable JD refinement tips.

What This Template Does (Step-by-Step)

1. ⚡ **Manual Trigger** - starts workflow execution on demand.
2. 📑 **Candidate Data Fetch (Google Sheets - Hires Tracking)** - pulls candidate-level details such as name, role, traits, start date, and retention status.
3. 📑 **Trait Summary Fetch (Google Sheets - Retention Summary)** - fetches aggregated trait-level retention statistics, including hires, stayed, left, retention %, and weight adjustments.
4. 🔀 **Merge Candidate + Trait Data** - combines both datasets into a unified stream for scoring.
5. 🧮 **Candidate Scoring & Data Normalization (Code node)** - cleans and standardizes data, builds a trait → weight map, calculates each candidate's Candidate_Score, and outputs normalized JSON.
6. ✅ **Data Validation (If node)** - ensures both candidate and trait datasets are present. TRUE → continues to AI digest generation; FALSE → routes to error logging.
7. ⚠️ **Error Handling (Google Sheets - Error Log)** - logs any failed or incomplete runs into a dedicated error sheet for auditing.
8. 🧠 **AI Processing Backend (Azure OpenAI)** - prepares candidate + trait data for GPT processing.
9. 🤖 **Retention Digest Generator (LLM Chain)** - uses GPT (gpt-4o-mini) to create a structured HTML Retention Digest, including:
   - TL;DR summary
   - Top traits (positive retention)
   - Weak traits (negative retention)
   - Candidate highlights (scores & retention status)
   - 3 actionable JD refinement tips
10. 📧 **Email Delivery (Gmail)** - sends the digest directly to hiring managers as a styled HTML email with the subject: Retention Analysis Digest – Weekly Update

Prerequisites

- Google Sheets (Hires Tracking + Retention Summary + Error Log)
- Gmail API credentials
- Azure OpenAI access (gpt-4o-mini model)
- n8n instance (self-hosted or cloud)

Key Benefits

✅ Automates retention analytics & reporting
✅ Provides AI-powered insights in structured HTML
✅ Improves hiring strategy with trait-based scoring
✅ Reduces manual effort in weekly retention reviews
✅ Ensures reliability with error handling & validation

Perfect For

- HR & recruitment teams monitoring post-hire retention
- Organizations optimizing job descriptions & hiring strategy
- Talent analytics teams needing automated, AI-driven insights
- Stakeholders requiring clear weekly digest emails
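The scoring step in the Code node (trait → weight map, then a Candidate_Score per candidate) can be sketched as below. The template's exact formula isn't shown, so averaging trait weights is an illustrative assumption, as are the `trait`, `weight`, and comma-separated `traits` field shapes:

```javascript
// Build a trait → weight map from the Retention Summary rows, then score
// each candidate as the mean weight of their traits (illustrative formula).
function scoreCandidates(candidates, traitRows) {
  const weightByTrait = new Map(
    traitRows.map((row) => [row.trait.trim().toLowerCase(), Number(row.weight) || 0])
  );
  return candidates.map((c) => {
    const traits = (c.traits || "")
      .split(",")
      .map((t) => t.trim().toLowerCase())
      .filter(Boolean);
    const total = traits.reduce((sum, t) => sum + (weightByTrait.get(t) ?? 0), 0);
    return {
      ...c,
      // Rounded to two decimals; candidates with no traits score 0.
      Candidate_Score: traits.length ? +(total / traits.length).toFixed(2) : 0,
    };
  });
}

const scored = scoreCandidates(
  [{ name: "Ada", traits: "Curiosity, Ownership" }],
  [{ trait: "Curiosity", weight: 1.2 }, { trait: "Ownership", weight: 0.8 }]
);
console.log(scored[0].Candidate_Score); // 1
```

Normalizing trait names (trim + lowercase) on both sides is the part worth keeping regardless of formula — it is what makes the merge of the two sheets robust to inconsistent spreadsheet entry.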