by Mohammad
🔐 Human-in-the-Loop Approval Flow (n8n + Postgres + Telegram) 👥 Who’s it for Teams that need a manager approval step before a ticket or request can change status. Great for internal ops, IT requests, or any workflow where “a human must sign off.” ⚡ What it does 📨 Manager receives approval/reject link 🔑 Link is signed with HMAC + expiry (secure & tamper-proof) 🗄️ Postgres updates the ticket status 📝 Audit trail records every decision 📲 Telegram notifies both manager and requester ⏰ Expired or invalid links trigger alerts and logs 🛠 Requirements n8n instance (self-hosted) Postgres database (with tickets, ticket_audit, workflow_errors) Telegram bot token One environment variable set: SECRET_KEY ⚙️ How to set up Set SECRET_KEY in .env Create Postgres tables (SQL provided) Add Telegram + Postgres credentials in n8n Import the workflow JSON Test by opening an approval/reject link in your browser 🎨 How to customize Change who the “manager” is (currently hardcoded in the Code node). Swap Telegram for Slack or email notifications. Extend the audit schema to include more metadata (IP, username).
by Haruki Kuwai
🧭 Description This workflow automates Gmail message handling through AI-powered classification and response. Using the LangChain Text Classifier, incoming emails are analyzed and sorted into four categories — High Priority, Advertisement, Inquiry, and Finance/Billing — each triggering a dedicated action flow. High Priority: AI generates a professional draft reply and saves it to Gmail. Advertisement: AI summarizes content and logs it to Google Sheets. Inquiry: AI composes a customer-friendly response automatically. Finance/Billing: AI creates a brief summary and forwards it to the accounting email. This system reduces manual sorting, ensures consistent communication quality, and speeds up email management with full automation. 💡 Use Cases Automatically categorize incoming Gmail messages by topic or intent. Generate AI-written reply drafts for urgent business messages. Summarize marketing or promotional emails into Google Sheets for tracking. Provide automated responses to customer inquiries. Forward billing or invoice messages directly to accounting teams.
by Rahul Joshi
Description: Reignite cold leads automatically with this intelligent n8n automation template that integrates Zoho CRM, Azure OpenAI (GPT-4o-mini), and Email. This workflow identifies leads that haven’t been contacted in the last 30 days, generates personalized AI-written emails based on lead data, sends them directly, and updates the CRM—all without manual follow-up. Perfect for sales teams, marketing managers, and business development professionals who want to recover lost opportunities, boost engagement rates, and maintain an active sales pipeline with minimal effort. ✅ What This Template Does (Step-by-Step) ⏰ Daily Trigger Automatically runs on a set schedule (daily or weekly) to check for inactive leads. 📅 Calculate 30 Days Ago Computes the exact date threshold (today − 30 days) to filter stale leads from Zoho CRM. 🔍 Fetch Cold Leads from Zoho Searches Zoho CRM for leads whose Last Activity occurred before the calculated date—returning only those needing re-engagement. 🤖 AI Email Generation Uses Azure OpenAI (GPT-4o-mini) to analyze each lead’s data and craft a personalized re-engagement email that reflects their previous interactions, interests, or stage in the funnel. 📧 Send Personalized Email Delivers the custom AI-generated email directly to each lead. Subject: “Let’s Reconnect!” Body: Tailored, human-like message written by AI. ✅ Update CRM Record After each email is sent, the workflow updates the lead in Zoho CRM—marking them as contacted, refreshing the Last Activity timestamp, and maintaining accurate engagement history. 
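The "Calculate 30 Days Ago" step boils down to a few lines of Code-node JavaScript. A minimal sketch; the Zoho field name `Last_Activity_Time` and the criteria syntax are assumptions to adapt to your CRM's actual fields.

```javascript
// Compute the stale-lead cutoff date (today minus N days) as YYYY-MM-DD.
function staleThreshold(days = 30, now = new Date()) {
  const cutoff = new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
  return cutoff.toISOString().slice(0, 10);
}

// Build a Zoho-style search criteria string from the threshold.
// The field name is an assumption; check your Zoho module's API names.
function zohoCriteria(days = 30) {
  return `(Last_Activity_Time:less_than:${staleThreshold(days)})`;
}
```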
🧠 Key Features ✔️ Smart lead filtering based on inactivity window ✔️ AI-crafted, context-aware personalized emails ✔️ Seamless Zoho CRM integration for tracking and updates ✔️ Fully automated daily execution ✔️ Customizable for different CRMs or intervals 💼 Use Cases 💡 Re-engage leads who’ve gone silent for 30+ days 📈 Improve conversion and response rates automatically 🤝 Maintain continuous pipeline nurturing 🔁 Save hours of manual email follow-up 📦 Required Integrations • Zoho CRM API – for fetching and updating lead data • Azure OpenAI API (GPT-4o-mini) – for email personalization • SMTP / Email API – for sending re-engagement emails 🎯 Why Use This Template? ✅ Automates your entire cold-lead revival process ✅ Saves manual outreach time for sales reps ✅ Increases lead conversion through personalized AI communication ✅ Keeps CRM data fresh and accurate
by ObisDev
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. LinkedIn Automation Outreach Workflow Documentation Inline Notes for Each Node 1. On form submission Trigger Node - Manual Start 📝 Note: "Manual trigger to start the LinkedIn scraping and outreach process. This node initiates the workflow when you want to begin lead processing." 2. Scrape profiles from a linkedin search HTTP Request/Browserflow Node 📝 Note: "Scrapes LinkedIn profiles based on search criteria (e.g., automation specialists in Lagos). Returns JSON array with profile data including names, URLs, taglines, locations, and summaries. Uses scrapeProfilesFromSearch.linkedinSearch() function." 3. Split Out1 Split Out Node 📝 Note: "Converts the JSON array of profiles into individual items for processing. Each profile becomes a separate execution path. Field to split: 'data'. This enables personalized message generation for each contact." 4. Limit Limit Node 📝 Note: "Controls batch size for processing (currently set to 3 items). Prevents overwhelming the AI agent and helps with rate limiting. Adjust max items based on your subscription limits and testing needs." 5. AI Agent LangChain AI Agent Node 📝 Note: "Generates personalized LinkedIn and email outreach messages using profile data. Uses Groq Chat Model (llama3-8b-8192) for cost-effective text generation. Input: Individual profile data. Output: Structured JSON with personalized messages. System prompt focuses on networking approach rather than sales." 6. Code1 JavaScript Code Node 📝 Note: "Processes AI-generated messages and formats data for LinkedIn automation. Extracts connection message, profile URL, and adds automation parameters. Includes error handling for malformed AI responses and random delay generation. Prepares data structure compatible with Browserflow LinkedIn automation." 7. Send a linkedin message1 Browserflow/HTTP Node 📝 Note: "Automates LinkedIn connection requests with personalized messages. 
Uses formatted data from Code1 node including target URL and message content. Includes built-in delays and retry logic to avoid LinkedIn rate limiting. ⚠️ Currently shows error - check Browserflow configuration and credentials." Workflow Architecture Overview Flow Type: Sequential Processing with Batch Control Purpose: Automated LinkedIn networking outreach for automation professionals Target Audience: Lagos-based automation specialists and similar professionals Detailed Workflow Description 🎯 LinkedIn Automation Outreach Workflow for Networking This sophisticated n8n workflow automates the entire process of discovering, analyzing, and reaching out to potential networking contacts in the automation industry. Designed specifically for automation agency owners and professionals looking to build meaningful connections within their local tech community. 🔄 Workflow Process: Stage 1: Data Collection The workflow begins with a manual trigger that initiates a comprehensive LinkedIn profile scraping operation. Using advanced web scraping techniques, it searches for automation specialists, particularly focusing on the Lagos tech ecosystem. The scraping function targets professionals with expertise in tools like n8n, Make.com, AI automation, and workflow optimization. Stage 2: Data Processing & Segmentation Once the profile data is collected, the Split Out node transforms the bulk JSON response into individual processing items. This crucial step enables personalized treatment of each contact. The Limit node provides batch control, allowing you to test with smaller groups (currently 3 profiles) before scaling to larger datasets. Stage 3: AI-Powered Personalization The AI Agent represents the workflow's intelligence core, utilizing Groq's LLaMA model for cost-effective, high-quality text generation. 
Each profile receives a customized analysis that identifies: Specific technical expertise and tools Geographic and industry connections Potential collaboration opportunities Shared professional interests The AI generates both LinkedIn connection messages and email alternatives, ensuring multiple touchpoint options. Messages focus on genuine networking value rather than sales pitches, emphasizing knowledge sharing, collaboration opportunities, and community building. Stage 4: Message Optimization & Formatting The JavaScript Code node serves as the workflow's data orchestrator, transforming AI-generated content into automation-ready formats. It handles: Response validation and error recovery LinkedIn-specific message formatting Automation parameter injection (delays, retry logic) Fallback email preparation Metadata tracking for campaign analysis Stage 5: Automated Outreach Execution The final Browserflow integration automates the actual LinkedIn connection process. It navigates to each target profile, sends personalized connection requests, and implements intelligent delays to maintain LinkedIn compliance. Built-in error handling ensures workflow resilience even when individual requests fail. 
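The Stage 4 Code-node logic might look like the following. A hedged sketch: the AI output field name (`connectionMessage`) and the delay range are illustrative assumptions, not the template's exact code.

```javascript
// Validate the AI agent's JSON output, extract the connection message,
// and attach a random delay used for LinkedIn rate limiting.
function prepareOutreachItem(aiText, profileUrl) {
  let parsed;
  try {
    parsed = JSON.parse(aiText);
  } catch (e) {
    // Fallback for malformed AI responses: keep raw text, flag for review
    parsed = { connectionMessage: String(aiText).slice(0, 280), needsReview: true };
  }
  return {
    profileUrl,
    message: parsed.connectionMessage || '',
    needsReview: Boolean(parsed.needsReview),
    delayMs: 30000 + Math.floor(Math.random() * 60000), // 30–90 s between requests
  };
}
```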
🎖️ Key Features: **Intelligent Batch Processing**: Controlled processing prevents rate limiting **Dual-Channel Approach**: LinkedIn + email backup ensures message delivery **Geographic Targeting**: Lagos-focused networking for local community building **AI-Driven Personalization**: Each message uniquely crafted based on profile analysis **Error Resilience**: Comprehensive error handling maintains workflow stability **Compliance-First Design**: Built-in delays and limits respect platform policies 🎯 Use Cases: Building local automation professional networks Identifying potential collaboration partners Market research on automation service providers Community building for tech meetups and events Knowledge sharing network development ⚡ Technical Specifications: **Model**: Groq LLaMA3-8B for cost-effective AI generation **Processing Capacity**: 3-item batches (scalable) **Message Types**: LinkedIn connections + email alternatives **Automation Platform**: Browserflow for LinkedIn interaction **Error Handling**: Multi-layer validation and recovery **Personalization Depth**: 3-5 specific talking points per contact This workflow represents a sophisticated approach to professional networking automation, balancing efficiency with authentic relationship building. It's particularly valuable for automation professionals who understand the importance of genuine connections over mass outreach tactics.
by n8n Automation Expert | Template Creator | 2+ Years Experience
🌦️ Intelligent Aquaculture Automation for Indonesia Transform your fish farming operation with this cutting-edge n8n workflow that combines Indonesia's official BMKG weather data with IoT-powered feeding automation. This system intelligently reduces feed by 20% when rain probability exceeds 60%, preventing overfeeding during adverse weather conditions that could compromise water quality and fish health. 🚀 Key Features 🌦️ Real-time BMKG Integration: Fetches official Indonesian weather forecasts every 12 hours using BMKG's public API with precise ADM4 regional targeting 🤖 Smart Decision Engine: Advanced JavaScript algorithms analyze 6-hour and 12-hour rain probabilities to make optimal feeding decisions automatically 📱 ESP8266 IoT Control: Seamlessly sends HTTP webhook commands to your ESP8266/ESP32-based fish feeder hardware with JSON payloads 💬 Rich Telegram Notifications: Comprehensive reports including weather analysis, feeding decisions, hardware status, and next feeding schedule ⏰ Precision Scheduling: Automated execution at 05:30 and 16:30 WIB (Indonesian Western Time) with cron-based triggers 📊 Activity Logging: Complete audit trail with timestamps, weather data, and feeding decisions for operational monitoring 🛠️ Technical Architecture Core Node Components: **Schedule Trigger:** Automated twice-daily execution **HTTP Request:** BMKG API integration with timeout handling **Code (JavaScript):** Weather parsing and feeding ratio calculations **IF Condition:** Intelligent branching based on configurable rain thresholds **Telegram:** Formatted notifications with markdown support **Set Variables:** Secure credential management with placeholder tokens 📋 Prerequisites ✅ n8n Instance: Self-hosted or cloud deployment ✅ Telegram Bot: Create via @BotFather for notifications ✅ ESP8266/ESP32: Hardware with servo motor for automated feeding ✅ Arduino Skills: Basic programming knowledge for hardware setup ✅ Indonesian Location: Uses BMKG API with ADM4 regional codes ⚙️ Configuration 
Requirements 📍 Location Settings: Update latitude, longitude, and BMKG ADM4 code in the Config node 🤖 Telegram Bot: Configure bot token and chat ID in credentials 🔗 ESP8266 Webhook: Set your device's IP address for hardware communication 📊 Feeding Parameters: Customize rain threshold (default: 60%) and feed reduction (default: -20%) 🎯 Perfect For 🏭 Commercial Aquaculture: Large-scale fish farming operations requiring weather-aware feeding 🏠 Hobbyist Enthusiasts: Home aquarium and pond automation projects 🌱 Smart Agriculture: Integration with comprehensive farm management ecosystems 🔧 IoT Learning: Educational platform for weather-based automation development 🌍 Environmental Research: Combining meteorological data with livestock care protocols 📊 Rich Output Examples The workflow generates detailed Telegram reports featuring: **Current Weather Analysis:** 6-hour and 12-hour rain probability breakdowns **Feeding Decision Logic:** Clear rationale for feed adjustments with percentages **Hardware Confirmation:** ESP8266 response status and command execution verification **Schedule Preview:** Next automated feeding time with countdown **Historical Logs:** Comprehensive activity tracking for pattern analysis 🔧 Hardware Integration Guide Designed for ESP8266-based feeders accepting HTTP POST commands. 
The workflow transmits structured JSON containing: { "command": "FEED_REDUCE_20", "feed_ratio": -20, "rain_prob": 75, "timestamp": "2024-09-18T10:30:00Z", "location": "Main Pond" } 🌍 Regional Adaptation Indonesia-Optimized: Built specifically for BMKG's official weather API with ADM4 regional precision Global Compatibility: Easily adaptable for international weather services by modifying HTTP requests and parsing logic Scalable Architecture: Supports multiple pond locations with separate ADM4 configurations 🔒 Security & Credentials All API keys use {{PLACEHOLDER}} format for secure credential management No hardcoded sensitive information in workflow nodes Telegram bot tokens managed through n8n's credential system ESP8266 webhooks support local network security 📈 Performance Benefits **20% Feed Optimization:** Automatic reduction during high rain probability periods **Water Quality Protection:** Prevents overfeeding that degrades aquatic environment **Cost Efficiency:** Reduces feed waste while maintaining fish health **24/7 Monitoring:** Continuous weather analysis without manual intervention **Scalable Operations:** Supports multiple feeding locations from single workflow
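The decision engine described above can be sketched as a small function producing the JSON payload shown. The `FEED_NORMAL` command and the choice of using the higher of the two rain probabilities are assumptions; adapt them to your feeder firmware.

```javascript
// Decide the feed ratio from rain probabilities; reduce feed by 20%
// when the probability exceeds the configurable threshold (default 60%).
function feedingDecision(rainProb6h, rainProb12h, threshold = 60) {
  const rainProb = Math.max(rainProb6h, rainProb12h); // assumption: worst case wins
  const reduce = rainProb > threshold;
  return {
    command: reduce ? 'FEED_REDUCE_20' : 'FEED_NORMAL', // FEED_NORMAL is hypothetical
    feed_ratio: reduce ? -20 : 0,
    rain_prob: rainProb,
    timestamp: new Date().toISOString(),
    location: 'Main Pond',
  };
}
```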
by Ayesha Gull
Transform YouTube videos to LinkedIn posts via Apify, GPT-4.1 Mini, and Google Sheets This workflow transforms new YouTube videos from any channel into high-quality, ready-to-publish LinkedIn posts and saves them directly to a Google Sheet, creating a powerful, automated content repurposing engine. Workflow Preview Who’s it for This template is perfect for content creators, social media managers, and marketing teams who want to save time and consistently repurpose video content for their professional audience on LinkedIn. What it does This automation monitors a specific YouTube channel for new video uploads. When a new video is detected, it automatically extracts the full transcript. The transcript is then sent to an advanced AI model (GPT-4o or similar) which acts as a "thought leader" to generate two distinct, insightful LinkedIn posts based on the video's core themes. Finally, these generated posts are neatly organized and saved as new rows in a Google Sheet for easy review and scheduling. How to set up Set the Trigger: In the "RSS Feed Trigger" node, enter the RSS Feed URL for the YouTube channel you want to monitor. You can find this by searching the channel's page source for its channel_id and using the format: https://www.youtube.com/feeds/videos.xml?channel_id=YOUR_ID_HERE. Connect Credentials: Add your API credentials for Apify and OpenAI in their respective nodes. Configure Google Sheets: First prepare your spreadsheet by opening Google Sheets and creating a new sheet. In the first row, create the following headers exactly as they appear below: Video Link Theme Hook Body CTA Hashtags Now in the n8n Google Sheets node, authenticate your account. Then select the spreadsheet you just prepared and the specific sheet name from the dropdown lists. Activate: Save the workflow and toggle it to "Active". How to Find a YouTube Channel ID To monitor a specific channel, you need its unique ID to create the RSS feed URL required in the first step. 
Here’s how to find it: Navigate to the main page of the YouTube channel you want to follow (e.g., https://www.youtube.com/@CalebWritesCode). Right-click anywhere on the page and select "View Page Source" from the menu. A new browser tab will open with the website's code. Press Ctrl+F (on Windows) or Cmd+F (on Mac) to open the search bar. In the search bar, type <yt:channelId> and press Enter. The search will highlight a line of code containing the ID. Copy the long string of letters and numbers that appears inside the <yt:channelId> tag. Paste this ID into the RSS Feed URL format: https://www.youtube.com/feeds/videos.xml?channel_id=YOUR_ID_HERE. Requirements An n8n instance. An Apify account and API key. An OpenAI account and API key. A Google account with a prepared Google Sheet. How to customize the workflow **Monitor Multiple Channels:** The easiest way to watch more than one channel is to add multiple RSS Feed Trigger nodes to the canvas. Configure each trigger with a different YouTube channel's RSS URL, and then connect the output of all of them to the input of the first Apify node. This setup will start the workflow whenever any of the monitored channels publishes a new video. **Change the number of posts:** You can easily edit the prompt in the "OpenAI Chat Model" node to request a different number of posts (e.g., change "exactly 2" to "exactly 3"). **Adjust the tone:** The persona and rules in the OpenAI prompt can be modified to change the style of the generated posts to better match your brand voice. **Add scheduling:** Connect a scheduling node (e.g., Buffer, SocialBee) after the "Google Sheets" node to create a fully automated content pipeline that publishes the posts for you. Disclaimer: This template uses a community node. It is recommended for users on self-hosted n8n instances.
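The manual steps above can also be scripted: a small helper that pulls the channel ID out of saved page source and builds the feed URL. The regex patterns are assumptions based on the markup described above; verify against the actual page source.

```javascript
// Extract a YouTube channel ID from page-source HTML and build the RSS URL.
function channelIdFromSource(html) {
  const m =
    html.match(/"channelId":"(UC[\w-]{22})"/) ||          // JSON blob form (assumed)
    html.match(/<yt:channelId>(UC[\w-]+)<\/yt:channelId>/); // feed tag form
  return m ? m[1] : null;
}

function rssUrl(channelId) {
  return `https://www.youtube.com/feeds/videos.xml?channel_id=${channelId}`;
}
```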
by YungCEO
Pre‑Built AI Customer Service System for Businesses | n8n, Gemini & Notion 💥 What It Does Revolutionize your client interactions with this Done‑For‑You AI Customer Service & Lead Routing System. This advanced n8n workflow, powered by Google Gemini and integrated with Notion, is pre-configured and ready to deploy, instantly transforming how you handle inquiries. Stop losing valuable time to manual support and inefficient lead qualification; this system intelligently routes messages, retrieves information from your Notion database, and provides personalized assistance from day one. It's the ultimate shortcut to professional, scalable customer engagement and lead conversion, delivered as a fully set up automation. ⚙️ Key Features ⚡ **Instant AI Lead Routing:** Automatically classifies incoming messages (customer service, questions, booking) and directs them to the right AI agent for a seamless user experience. 🧠 **Multi-Agent AI System:** Includes specialized AI agents for comprehensive customer support, product/service inquiries, and automated consultation booking. 💡 **Notion-Powered Knowledge Base:** Leverages your existing Notion databases to pull accurate, contextual information for personalized responses and solutions. 🤝 **Personalized Customer Support:** The Customer Service Agent accesses Notion CRM to provide tailored support based on customer history and previous interactions. 📈 **Automated Consultation Booking:** The Booking Agent streamlines scheduling by guiding users to your intake forms, qualifying leads effortlessly. 😩 Pain Points Solved Sick of wasting countless hours on manual customer service inquiries and support? Tired of slow response times costing you valuable leads and frustrating clients? Struggling to build a complex AI chatbot system from scratch with no prior experience? Overwhelmed by disorganized customer data and scattered product information? Missing out on potential sales opportunities due to inefficient lead qualification processes? 
📦 What’s Included Fully configured n8n AI Chatbot workflow for instant deployment Pre-integrated Google Gemini language models and AI agents Ready-to-connect Notion CRM and knowledge base tools Comprehensive, step-by-step deployment and launch guide Ongoing access to future updates and enhancements 🚀 Call to Action Launch your AI customer powerhouse today. No setup, no stress, just instant results. 🏷️ Optimized Tags done for you ai, n8n workflow, ai chatbot, customer service automation, lead qualification, notion integration, google gemini, pre built system, ai agent, business automation, digital product, ready to use, instant deploy
by Davide
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. This workflow automates the entire process of generating short AI videos using Google Veo3 Fast, enhancing them with SEO-optimized titles, and uploading them directly to TikTok via Postiz, all triggered from a central Google Sheet. This setup ensures a seamless pipeline from video creation to TikTok upload, with minimal manual intervention. Benefits **Full automation** from prompt input to social media publishing. **Cheaper video generation** using Veo3 Fast vs traditional AI video tools. **Centralized management** through Google Sheets – no coding required for end users. **SEO-enhanced titles** with GPT-4o to boost engagement. **Scheduled or manual triggering**, perfect for batch operations. **No manual uploads** – integration with Postiz means content is published hands-free. How It Works This workflow automates the process of generating AI videos using Google Veo3 Fast, saving them to Google Drive, and uploading them to TikTok via Postiz. Here’s how it functions: Trigger: The workflow can be started manually or scheduled (e.g., every 5 minutes) to check for new video requests in a Google Sheet. Video Generation: The workflow retrieves a video prompt and duration from the Google Sheet. It sends the prompt to Google Veo3 Fast via the Fal.ai API to generate the video. The system periodically checks the video generation status until it’s completed. Post-Processing: Once the video is ready, it is downloaded and uploaded to Google Drive. A YouTube-optimized title is generated using GPT-4o Mini based on the video prompt. TikTok Upload: The video is uploaded to Postiz, a social media scheduling tool. Postiz then publishes the video to the connected TikTok account with the generated title. Tracking: The Google Sheet is updated with the video URL for record-keeping. 
Set Up Steps To configure this workflow, follow these steps: Prepare the Google Sheet: Create a Google Sheet with columns: PROMPT: Description of the video. DURATION: Length of the video. VIDEO: (Leave empty, auto-filled by the workflow). Obtain API Keys: Sign up at Fal.ai to get an API key for Google Veo3 Fast. Replace YOURAPIKEY in the "Create Video" node’s HTTP header (Authorization: Key YOURAPIKEY). Configure Postiz for TikTok: Create a Postiz account (free trial available). Connect your TikTok account in Postiz and note the Channel ID. Replace XXX in the "TikTok" node with your TikTok Channel ID. Set the Postiz API key in the "Upload Video to Postiz" node. Set Up Google Drive & Sheets Access: Ensure the workflow has OAuth access to: Google Sheets (to read/write video data). Google Drive (to store generated videos). Schedule or Run Manually: The workflow can be triggered manually or scheduled (e.g., every 5 minutes) to process new video requests. Note: This workflow requires self-hosted n8n due to community node dependencies. Need help customizing? Contact me for consulting and support or add me on Linkedin.
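The "periodically checks the video generation status" step is a classic polling loop. A generic sketch; the status strings (`COMPLETED`, `FAILED`) are assumptions, so check the actual Fal.ai queue responses for your account before relying on them.

```javascript
// Poll an async job until it completes, fails, or times out.
// `checkStatus` is any function returning the current status string.
async function pollUntilDone(checkStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let i = 0; i < maxAttempts; i++) {
    const status = await checkStatus();
    if (status === 'COMPLETED') return true;
    if (status === 'FAILED') throw new Error('Video generation failed');
    await new Promise((r) => setTimeout(r, intervalMs)); // wait before re-checking
  }
  throw new Error('Timed out waiting for video');
}
```

In the workflow, `checkStatus` would wrap an HTTP Request node call to the Fal.ai status endpoint for the submitted job.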
by Yang
Who is this for? This workflow is perfect for content strategists, SEO specialists, marketing agencies, and virtual assistants who need to quickly audit and collect blog content from client websites into a structured Google Sheet without doing manual crawling and copy-pasting. What problem is this workflow solving? Manually visiting a website, finding blog posts, and copying content into a spreadsheet is time-consuming and prone to errors. This workflow automates the process: it crawls a website, filters only blog-related pages, scrapes the article content, and stores everything neatly in Google Sheets for easy analysis and content strategy planning. What this workflow does The workflow starts when a client submits their website URL through a form. A Google Sheet is automatically created and headers are added for organizing the audit. Dumpling AI then crawls the website to discover all available pages, while the automation filters out only blog-related URLs. Each blog page is scraped for content, and the structured results (URL, crawled page, and website content) are appended row by row into the Google Sheet. Nodes Overview Form Trigger – Form Submission (Client URL) Captures the client’s website URL to start the workflow. Google Sheets – Create Blog Audit Sheet Creates a new Google Sheet with a title based on the submitted URL. Set – Set Sheet Headers Defines the headers: Url, Crawled_pages, website_content. Code – Format Header Row Formats the headers properly before sending them to the sheet. HTTP Request – Insert Headers into Sheet Updates the Google Sheet with the prepared header row. HTTP Request – Dumpling AI: Crawl Website Crawls the submitted URL to discover internal pages. Code – Extract Blog URLs Filters the crawl results and keeps only URLs that match common blog patterns (e.g., /blog/, /articles/, /posts/). HTTP Request – Dumpling AI: Scrape Blog Pages Scrapes the text content from each filtered blog page. 
Set – Prepare Row Data Maps the URL, blog page link, and scraped content into structured fields. Google Sheets – Save Blog Data to Google Sheets Appends the structured data into the audit sheet row by row. 📝 Notes Set up Dumpling AI and generate your API key from: Dumpling AI Google Sheets must be connected with write permissions enabled. You can change the crawl depth or limit (currently set to 10 pages) in the Dumpling AI: Crawl Website node. The Extract Blog URLs node uses regex patterns to detect blog content. You can customize these patterns to match your website’s URL structure.
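The Extract Blog URLs node's filtering can be sketched as a simple pattern match. The pattern list mirrors the examples above (`/blog/`, `/articles/`, `/posts/`) and is meant to be customized for your site's URL structure.

```javascript
// Keep only URLs whose path matches a common blog pattern.
const BLOG_PATTERNS = [/\/blog\//i, /\/articles?\//i, /\/posts?\//i, /\/news\//i];

function filterBlogUrls(urls) {
  return urls.filter((u) => BLOG_PATTERNS.some((re) => re.test(u)));
}
```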
by Davide
🎥🤖 This workflow automates the creation and publishing of UGC (User-Generated Content) videos using Google Gemini and Google Veo 3, then uploads them directly to Instagram with Postiz. Advantages ✅ Full Automation – Eliminates manual video editing, caption writing, and uploading. ✅ High-Quality UGC Videos – Leverages Google Veo 3 for professional ad-like video generation. ✅ AI-Powered Creativity – Uses Google Gemini for both creative direction and social media copywriting. ✅ Time-Saving – From image to published Instagram post in a single automated flow. ✅ Consistency – Ensures branding and messaging remain aligned across campaigns. ✅ Scalability – Can easily generate multiple UGC ads for different products. ✅ Centralized Management – Stores videos in Google Drive and distributes them via Postiz. How It Works Image Analysis & Creative Briefing: The workflow starts with a predefined product image. This image is analyzed by Google Gemini, which acts as a "Creative Director" to generate a detailed, cinematic prompt describing an 8-second commercial scene based on the image's content. Parallel AI Task Execution: The creative director's prompt is then sent to two different AI agents simultaneously: Video Generation: One agent uses the prompt with Google Veo 3 to generate the actual video file, visualizing the described scene. Copywriting: The other agent, acting as a "Social Media Manager," uses the same prompt to generate compelling caption copy tailored for an Instagram audience. Asset Distribution & Publishing: The generated video is uploaded to two destinations: Google Drive for storage and the Postiz API for social media management. The AI-generated caption is prepared. Finally, all data (video information from Postiz and the caption) is merged and sent to the Postiz node, which schedules and publishes the video as a post to the connected Instagram account. 
Set Up Steps To use this workflow, you need to configure the following credentials and node settings in n8n: Image Source: In the "Set image" node, replace the default image_url value with the URL of your own product image. Google Gemini Credentials: The workflow uses three Gemini nodes. Ensure your Google Gemini API credentials (named "Google Gemini(PaLM) (Eure)" in this example) are correctly set up and have access to the specified models (gemini-2.5-pro and veo-3.0-generate-preview). Google Drive Credentials: Configure the "Upload video" node with valid Google Drive OAuth credentials. Update the folderId parameter if you wish to save the generated videos to a different folder in your Drive. Postiz Credentials: The "Upload Video to Postiz" and "Instagram" nodes require valid credentials for the Postiz API. You must have an active Postiz account and have connected your Instagram business account to it within the Postiz platform. Postiz Integration ID: In the "Instagram" node, the integrationId field is specific to a connected social account within a Postiz account. You must replace this value with your own Instagram integration ID from Postiz. (Optional) Video Parameters: You can adjust the video generation aspect ratio (e.g., 9:16 for Stories/Reels) in the options of the "Generate UGC Video" node. Need help customizing? Contact me for consulting and support or add me on Linkedin.
by EmailListVerify
How to scrape emails from websites This workflow will: Try to find emails by scraping the website via HTTP request If no result is found, it will use the EmailListVerify email finder API to guess an email address Scraping emails via HTTP request is a cost-effective way to find email addresses, so it can save you a few bucks to use it before calling any email finder API. Who it's for This workflow will help you transform a list of websites into a list of leads with email addresses. This is a handy workflow for any lead generation specialist. Note that this workflow will usually return only generic emails like "contact@". Those generic emails are useful when you target small businesses, since the owner usually monitors those inboxes. However, I don't advise using this workflow to target enterprise customers. Requirements In order to use this workflow, you will need: To copy this Google Sheet template Get an API key for EmailListVerify You then need to edit the setup of the 3 stages highlighted with a yellow sticky note, and you will be good to go.
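The "scrape the website via HTTP request" step usually reduces to a regex pass over the fetched HTML. A minimal sketch; real pages may obfuscate addresses, so treat this as a starting point rather than the template's exact logic.

```javascript
// Pull email addresses out of raw HTML, deduplicated and lowercased.
function extractEmails(html) {
  const re = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;
  const found = html.match(re) || [];
  return [...new Set(found.map((e) => e.toLowerCase()))];
}
```

If this returns an empty array for a site, the workflow falls back to the EmailListVerify email finder API.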
by Eumentis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. What It Does This workflow automatically discovers recently seed-funded startups by monitoring RSS feeds for funding announcements. It uses Bright Data to scrape full article content, then extracts structured company information using OpenAI (GPT). The data is exported to an Excel sheet on OneDrive, providing sales teams with a real-time list of qualified leads without any manual effort. How It Works Trigger & Article Discovery: Monitors curated RSS feeds for articles mentioning seed funding and triggers the workflow on new article detection. Content Scraping & Preparation: Scrapes full article content and converts it into clean markdown format for AI processing. Data Extraction with AI: Uses OpenAI to extract structured details like company name, website, LinkedIn profile, founders, and funding amount. Structured Data Output & Storage: Appends extracted data to an Excel sheet on OneDrive via Microsoft Graph API. Prerequisites **RSS Feed URL**: A valid RSS feed source that provides seed funding articles for startups. **Bright Data Credentials**: Active Bright Data account with access credentials (API token) to enable article scraping. **OpenAI API Key**: An OpenAI account with an API key and access to the GPT-4.1-mini model for data extraction. **Microsoft OAuth2 API Credentials**: OAuth2 credentials (Client ID, Secret, Tenant ID) with access scopes to use Microsoft Graph API for Excel integration. **Excel Sheet in SharePoint**: A pre-created Excel file hosted on OneDrive or SharePoint with the following column headers: createdAt, companyName, companyWebsite, companyLinkedIn, fundingAmount, founderName, founderLinkedIn, articleLink **Excel File & Sheet Identifiers**: The Drive ID, File ID, and Sheet ID of your Excel sheet stored on OneDrive or SharePoint, required by the Microsoft Graph API for appending rows using the HTTP node in n8n. Need help with the setup? 
Feel free to contact us How to Set It Up Follow these steps to configure and run the workflow: Import the Workflow Copy the provided n8n workflow template. In your n8n instance, open the Editor UI and paste the workflow. Configure the RSS Feed Node Open the RSS trigger node. Replace the default URL with your RSS feed URL. Ensure the polling interval matches your desired frequency (e.g., every 15 minutes or 1 hour). Set Up Bright Data Node Add your Bright Data credentials. Follow the documentation to complete the setup. Configure OpenAI Integration Add your OpenAI API key as a credential in n8n. Ensure the model is set to gpt-4.1-mini. Follow the documentation to complete the setup. Configure Excel File Integration Open the HTTP node responsible for sending data to the Excel sheet via Microsoft Graph API. Replace the placeholder values in the API endpoint URL with your actual Drive ID, File ID, and Sheet ID from the Excel file stored on OneDrive or SharePoint. https://graph.microsoft.com/v1.0/drives/{{drive-id}}/items/{{file-id}}/workbook/tables/{{sheet-id}}/rows This URL is used to append data to the specified Excel sheet range. Next, set up Microsoft OAuth2 credentials in n8n: Go to n8n > Credentials > Microsoft OAuth2 API. Provide the required values: Client ID Client Secret Tenant ID Scope Follow the documentation to complete the setup. Once the credential is saved, connect it to the HTTP node making the Graph API call. Activate the Workflow Set the workflow status to Active in n8n so it runs automatically when a new article appears in the RSS feed. Need Help? Contact us for support and custom workflow development.
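The Graph API call can be assembled like this before handing it to the HTTP node. A sketch under the assumption that the table columns follow the header order listed in the prerequisites; the IDs are placeholders, not real values.

```javascript
// Build the POST request that appends one row to the Excel table
// via the Microsoft Graph workbook tables/rows endpoint.
function buildAppendRowRequest(driveId, fileId, tableId, lead) {
  return {
    method: 'POST',
    url: `https://graph.microsoft.com/v1.0/drives/${driveId}/items/${fileId}/workbook/tables/${tableId}/rows`,
    body: {
      // Column order must match the sheet headers exactly
      values: [[
        lead.createdAt, lead.companyName, lead.companyWebsite, lead.companyLinkedIn,
        lead.fundingAmount, lead.founderName, lead.founderLinkedIn, lead.articleLink,
      ]],
    },
  };
}
```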