by Fayzul Noor
This workflow is built for digital marketers, sales professionals, influencer agencies, and entrepreneurs who want to automate Instagram lead generation. If you’re tired of manually searching for profiles, copying email addresses, and updating spreadsheets, this automation will save you hours every week. It turns your process into a smart system that finds, extracts, and stores leads while you focus on growing your business.

How it works / What it does

This n8n automation completely transforms how you collect Instagram leads using AI and API integrations. Here’s a simple breakdown of how it works:

1. Set your targeting parameters using the Edit Fields node. You can specify your platform (Instagram), field of interest such as “beauty & hair,” and target country like “USA.”
2. Generate intelligent search queries with an AI Agent powered by GPT-4o-mini. It automatically creates optimized Google search queries to find relevant Instagram profiles in your chosen niche and location.
3. Extract results from Google using Apify’s Google Search Scraper, which collects hundreds of Instagram profile URLs that match your search criteria.
4. Fetch detailed Instagram profile data using Apify’s Instagram Scraper. This includes usernames, follower counts, and profile bios, where contact information usually appears.
5. Use AI to extract emails from the profile biographies with the Information Extractor node powered by GPT-3.5-turbo. It identifies emails even when they are hidden or creatively formatted.
6. Store verified leads in a PostgreSQL database. The workflow automatically adds new leads or updates existing ones with fields like username, follower count, email, and niche.

Once everything is set up, the system runs on autopilot and keeps building your database of quality leads around the clock.

How to set up

Follow these steps to get your Instagram Lead Generation Machine running:

1. Import the JSON file into your n8n instance.
2. Add your API credentials:
   - Apify token for the Google and Instagram scrapers
   - OpenAI API key for the AI-powered nodes
   - PostgreSQL credentials for storing leads
3. Open the Edit Fields node and set your platform, field of interest, and target country.
4. Run the workflow manually using the Manual Trigger node to test it.
5. Once confirmed, replace the manual trigger with a schedule or webhook to run it automatically.
6. Check your PostgreSQL database to ensure the leads are being saved correctly.

Requirements

Before running the workflow, make sure you have the following:

- An n8n account or instance (self-hosted or n8n Cloud)
- An Apify account for accessing the Google and Instagram scrapers
- OpenAI API access for generating smart search queries and extracting emails
- A PostgreSQL database to store your leads
- Basic understanding of how n8n workflows and nodes operate

How to customize the workflow

This workflow is flexible and can be customized to fit your business goals. Here’s how you can tailor it:

- Change your niche or location by updating the Edit Fields node. You can switch from “beauty influencers in the USA” to “fitness coaches in Canada” in seconds.
- Add more data fields to collect additional information such as engagement rates, bio keywords, or profile categories. Just modify the PostgreSQL node and database schema.
- Connect to your CRM or email system to automatically send introduction emails or add new leads to your marketing pipeline.
- Use different triggers such as a scheduled cron trigger for daily runs or a webhook trigger to start the workflow through an API call.
- Filter for higher-quality leads by adding logic that keeps only profiles with a minimum follower count or a verified email (see the sketch below).
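As a rough illustration of that last customization, here is a minimal sketch of an n8n Code node filter, assuming the scraped items expose followersCount and email fields (the real field names depend on the Apify Instagram Scraper's output):

```javascript
// n8n Code node ("Run Once for All Items"): keep only profiles that clear a
// minimum follower threshold and carry a plausible email.
// NOTE: followersCount and email are assumed field names; check the actual
// Apify Instagram Scraper output and adjust.
const MIN_FOLLOWERS = 5000;

return $input.all().filter((item) => {
  const { followersCount, email } = item.json;
  return Number(followersCount) >= MIN_FOLLOWERS
    && typeof email === 'string'
    && email.includes('@');
});
```

Place this node between the Information Extractor and the PostgreSQL node so low-value profiles never reach your database.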
by Facundo Cabrera
Automated Meeting Minutes from Video Recordings

This workflow automatically transforms video recordings of meetings into structured, professional meeting minutes in Notion. It uses local AI models (Whisper for transcription and Ollama for summarization) to ensure privacy and cost efficiency, while uploading the original video to Google Drive for safekeeping. Ideal for creative teams, production reviews, or any scenario where visual context is as important as the spoken word.

🔄 How It Works

1. Wait & Detect: The workflow monitors a local folder. When a new .mkv video file is added, it waits until the file has finished copying.
2. Prepare Audio: The video is converted into a .wav audio file optimized for transcription (under 25 MB with high clarity).
3. Transcribe Locally: The local Whisper model generates a timestamped text transcript.
4. Generate Smart Minutes: The transcript is sent to a local Ollama LLM, which produces structured, summarized meeting notes.
5. Store & Share: The original video is uploaded to Google Drive, a new page is created in Notion with the notes and a link to the video, and a completion notification is sent via Discord.

⏱️ Setup Steps

**Estimated Time**: 10–15 minutes (for technically experienced users).

**Prerequisites**:
- Install Python, FFmpeg, and the required packages (openai-whisper, ffmpeg-python).
- Run Ollama locally with a compatible model (e.g., gpt-oss:20b, llama3, mistral).
- Configure n8n credentials for Google Drive, Notion, and Discord.

**Workflow Configuration**:
- Update the file paths for the helper scripts (wait-for-file.ps1, create_wav.py, transcribe_return.py) in the respective "Execute Command" nodes.
- Change the input folder path (G:\OBS\videos) in the "File" node to your own recording directory.
- Replace the Google Drive folder ID and the Notion database/page ID in their respective nodes.

> 💡 Note: Detailed instructions for each step, including error handling and variable setup, are documented in the Sticky Notes within the workflow itself.

📁 Helper Scripts Documentation

wait-for-file.ps1
A PowerShell script that checks if a file is still being written to (i.e., locked by another process). It returns 0 if the file is free and 1 if it is still locked.

Usage:
.\wait-for-file.ps1 -FilePath "C:\path\to\your\file.mkv"

create_wav.py
A Python script that converts a video file into a .wav audio file. It automatically calculates the necessary audio bitrate to keep the output file under 25 MB, a common requirement for many transcription services.

Usage:
python create_wav.py "C:\path\to\your\file.mkv"

transcribe_return.py
A Python script that uses a local Whisper model to transcribe an audio file. It can auto-detect the language or use a language code specified in the filename (e.g., meeting.en.mkv for English, meeting.es.mkv for Spanish). The transcript is printed directly to stdout with timestamps, which is then captured by the n8n workflow (see the sketch below).

Usage:
# Auto-detect language
python transcribe_return.py "C:\path\to\your\file.mkv"
# Force language via filename
python transcribe_return.py "C:\path\to\your\file.es.mkv"
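If you want to post-process the transcript inside n8n before handing it to Ollama, a small Code node can split the captured stdout into timestamped segments. A minimal sketch, assuming transcribe_return.py prints lines like `[00:00.000 --> 00:05.240] some text` (verify against your script's actual output format):

```javascript
// n8n Code node: split Execute Command stdout into timestamped segments.
// The "[mm:ss.mmm --> mm:ss.mmm] text" line format is an assumption; adjust
// the regex to match what transcribe_return.py actually prints.
const stdout = $input.first().json.stdout ?? '';
const segmentRe = /^\[(\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}\.\d{3})\]\s*(.+)$/;

return stdout
  .split('\n')
  .map((line) => line.match(segmentRe))
  .filter(Boolean)
  .map(([, start, end, text]) => ({ json: { start, end, text: text.trim() } }));
```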
by Rahul Joshi
📊 Description

Automate your client proposal creation with this intelligent workflow that transforms Google Sheets entries into professional Google Docs proposals using OpenAI GPT-4o. Designed for agencies and sales teams, it delivers personalized, branded, and structured proposals in minutes, with no manual editing required. 🚀📄🤖

What This Template Does

- Triggers when a new row is added in a connected Google Sheet. 📋
- Filters only the latest row to ensure one proposal per new entry (see the sketch below). 🔍
- Uses GPT-4o to generate structured proposal content (Executive Summary, Scope, Costing, Timeline, Conclusion). 💡
- Parses output into validated JSON format for accurate field mapping. ⚙️
- Populates a Google Docs template with AI-generated content using placeholders. 📝
- Downloads the completed proposal as a PDF file. 💾
- Archives the finalized document into a designated Google Drive folder. 📂
- Resets the template for the next proposal cycle automatically. 🔄

Key Benefits

✅ Eliminates repetitive manual proposal writing.
✅ Ensures brand consistency with structured templates.
✅ Generates high-quality proposals using AI in real time.
✅ Automates document formatting, saving hours per client.
✅ Scales easily for agencies handling multiple clients daily.

Features

- Google Sheets trigger for new entries.
- GPT-4o-based content generation with customizable prompts.
- JSON output validation and structured parsing.
- Google Docs population using placeholder replacement.
- Drive storage automation for version tracking.
- End-to-end automation from data to proposal delivery.

Requirements

- Google Sheets document with columns: clientName, jobDescription.
- Google Docs template with placeholders (e.g., {{executive_summary}}, {{scope_of_work}}).
- OpenAI API key (GPT-4o).
- Google Drive credentials for output management.

Target Audience

- Marketing and web agencies automating client proposal generation.
- Sales teams preparing project estimates and deliverables.
- Freelancers and consultants managing multiple client requests.
- Businesses streamlining documentation workflows.

Step-by-Step Setup Instructions

1. Connect Google Sheets and replace the Sheet ID placeholder.
2. Set up your Google Docs proposal template and replace the Document ID.
3. Add your OpenAI API key for GPT-4o content generation.
4. Specify your Google Drive folder for saving proposals.
5. Test the workflow with a sample entry to confirm formatting.
6. Activate the workflow for continuous proposal generation. ✅
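The "latest row only" filter mentioned above can be a one-line Code node. A minimal sketch (your Google Sheets trigger may already emit exactly one item per new row, in which case this step is a pass-through):

```javascript
// n8n Code node: keep only the most recent row from the trigger batch so
// exactly one proposal is generated per run.
const rows = $input.all();
return [rows[rows.length - 1]];
```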
by Dzaky Jaya
This n8n workflow demonstrates how to configure an AI Agent for financial research, especially for IDX (Indonesia Stock Exchange) data accessed through the Sectors App API.

Use cases:
- Research the stock market in Indonesia.
- Analyze the performance of companies belonging to certain subsectors.
- Compare financial metrics between companies, e.g., BBCA and BBRI.
- Provide technical analysis for a certain ticker's stock movement.
- And many more, all from a conversational agent chat UI.

Main components

- **Input-n8nChatNative**: handles and processes input from the native n8n chat UI.
- **Input-TelegramBot**: handles and processes input from a Telegram bot.
- **Input-WebUI(Webhook)**: handles and processes input from a hosted Web UI through a webhook.
- **Main Agent**: processes raw user queries and delegates tasks to specialized agents when needed.
- **Spec Agent - Sectors App**: makes requests to the Sectors App API to get real-time financial data for companies listed on the IDX from the available endpoints.
- **Spec Agent - Web Search**: performs web searches via Google Grounding (Gemini API) and Tavily Search.
- **Vector Document Processing**: processes documents uploaded by users into embeddings and a vector store.

How it works

1. User queries may be received from multiple platforms (three are used here: Telegram, a hosted Web UI, and the native n8n chat UI).
2. If the user also uploads a document, it is processed and stored in the vector store.
3. The request is sent to the Main Agent, which processes the query.
4. The Main Agent decides which tasks to delegate to a specialized agent, if needed.
5. The result is then sent back to the user on the originating platform.

How to use

You need these API keys:
- Gemini API: get it free from https://aistudio.google.com/
- Tavily API: get it free from https://www.tavily.com/
- Sectors App API: get it from https://sectors.app/api/

You can optionally change the model or add a fallback model to handle token limits, since the workflow may use quite a lot of tokens.
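To give a feel for what the Sectors App specialist agent does under the hood, here is a sketch of a raw request from an n8n Code node. The endpoint path and header name below are assumptions for illustration only; verify the real endpoint names and auth scheme at https://sectors.app/api/ before use:

```javascript
// Hypothetical request to the Sectors App API; confirm the actual endpoint,
// parameters, and auth header in the official documentation.
const response = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://api.sectors.app/v1/company/report/BBCA/', // assumed path
  headers: { Authorization: 'YOUR_SECTORS_API_KEY' },      // assumed header name
  json: true,
});
return [{ json: response }];
```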
by explorium
Explorium Agent for Slack

An AI-powered Slack bot for business intelligence queries using the Explorium API through MCP.

Prerequisites

- Slack workspace with admin access
- Anthropic API key (you can replace it with another LLM chat model)
- Explorium API key

1. Create Slack App

Create the app:
- Go to api.slack.com/apps
- Click Create New App → From scratch
- Give it a name (e.g., "Explorium Agent") and select your workspace

Bot Permissions (OAuth & Permissions): add these Bot Token Scopes:
- app_mentions:read
- channels:history
- channels:read
- chat:write
- emoji:read
- groups:history
- groups:read
- im:history
- im:read
- mpim:history
- mpim:read
- reactions:read
- users:read

Enable Events:
- Event Subscriptions → Enable
- Add the Request URL (from the n8n Slack Trigger node)
- Subscribe to bot events: app_mention, message.channels, message.groups, message.im, message.mpim, reaction_added

Install App:
- Install App → Install to Workspace
- Copy the Bot User OAuth Token (xoxb-...)

2. Configure n8n

Import & Setup:
- Import this JSON template.
- Slack Trigger node: add a Slack credential with the Bot Token, copy the webhook URL, and paste it in the Slack Event Subscriptions Request URL.
- Anthropic Chat Model node: add an Anthropic API credential. Model: claude-haiku-4-5-20251001 (you can replace it with another chat model).
- MCP Client node: Endpoint: https://mcp.explorium.ai/mcp; Header Auth: add your Explorium API key.

Usage Examples

@ExploriumAgent find tech companies in SF with 50-200 employees
@ExploriumAgent show Microsoft's technology stack
@ExploriumAgent get CMO contacts at healthcare companies
by Growth AI
N8N UGC Video Generator - Setup Instructions

Transform Product Images into Professional UGC Videos with AI

This powerful n8n workflow automatically converts product images into professional User-Generated Content (UGC) videos using cutting-edge AI technologies including Gemini 2.5 Flash, Claude 4 Sonnet, and VEO3 Fast.

Who's it for

- **Content creators** looking to scale video production
- **E-commerce businesses** needing authentic product videos
- **Marketing agencies** creating UGC campaigns for clients
- **Social media managers** requiring quick video content

How it works

The workflow operates in 4 distinct phases:

- Phase 0: Setup - configure all required API credentials and services
- Phase 1: Image Enhancement - AI analyzes and optimizes your product image
- Phase 2: Script Generation - creates authentic dialogue scripts based on your input
- Phase 3: Video Production - generates and merges professional video segments

Requirements

Essential Services & APIs
- **Telegram Bot Token** (create via @BotFather)
- **OpenRouter API** with Gemini 2.5 Flash access
- **Anthropic API** for Claude 4 Sonnet
- **KIE.AI Account** with VEO3 Fast access
- **N8N Instance** (cloud or self-hosted)

Technical Prerequisites
- Basic understanding of n8n workflows
- API key management experience
- Telegram bot creation knowledge

How to set up

Step 1: Service Configuration

1. Create Telegram Bot
   - Message @BotFather on Telegram
   - Use the /newbot command and follow the instructions
   - Save the bot token for later use
2. OpenRouter Setup
   - Sign up at openrouter.ai
   - Purchase credits for Gemini 2.5 Flash access
   - Generate and save an API key
3. Anthropic Configuration
   - Create an account at console.anthropic.com
   - Add credits to your account
   - Generate a Claude API key
4. KIE.AI Access
   - Register at kie.ai
   - Subscribe to a VEO3 Fast plan
   - Obtain a bearer token

Step 2: N8N Credential Setup

Configure these credentials in your n8n instance:

- Telegram API: credential name telegramApi, with your Telegram bot token
- OpenRouter API: credential name openRouterApi, with your OpenRouter key
- Anthropic API: credential name anthropicApi, with your Anthropic key
- HTTP Bearer Auth: credential name httpBearerAuth, with your KIE.AI bearer token

Step 3: Workflow Configuration

1. Import the Workflow: copy the provided JSON workflow and import it into your n8n instance.
2. Update the Telegram Token: locate the "Edit Fields" node and replace "Your Telegram Token" with your actual bot token.
3. Configure Webhook URLs: ensure all Telegram nodes have proper webhook configurations and test webhook connectivity.

Step 4: Testing & Validation

1. Test individual nodes: verify each API connection, check credential configurations, and confirm node responses.
2. End-to-end testing: send a test image to your Telegram bot, follow the complete workflow process, and verify the final video output.

How to customize the workflow

Modify Image Enhancement Prompts
- Edit the HTTP Request node for Gemini
- Adjust the prompt text to match your style preferences
- Test different aspect ratios (current: 1:1 square format)

Customize Script Generation
- Modify the Basic LLM Chain node prompt
- Adjust video segment duration (current: 7-8 seconds each)
- Change dialogue style and tone requirements

Video Generation Settings
- Update VEO3 API parameters in the HTTP Request1 node
- Modify the aspect ratio (current: 16:9)
- Adjust model settings and seeds for consistency

Output Customization
- Change the final video format in the MediaFX node
- Modify Telegram message templates
- Add additional processing steps before delivery

Workflow Operation

Phase 1: Image Reception and Enhancement
1. User sends a product image via Telegram
2. System prompts for enhancement instructions
3. Gemini AI analyzes and optimizes the image
4. The enhanced square-format image is returned

Phase 2: Analysis and Script Creation
1. System requests a dialogue concept from the user
2. AI analyzes image details and environment
3. Claude generates a realistic 2-segment script
4. Scripts respect the physical constraints of the original image

Phase 3: Video Generation
1. Two separate videos are generated using VEO3
2. System monitors generation status (see the sketch below)
3. Videos are merged into a single flowing sequence
4. The final video is delivered via Telegram

Troubleshooting

Common Issues
- **API Rate Limits**: implement delays between requests
- **Webhook Failures**: verify URL configurations and SSL certificates
- **Video Generation Timeouts**: increase the wait node duration
- **Credential Errors**: double-check all API keys and permissions

Error Handling
The workflow includes automatic error detection:
- Failed video generation triggers an error message
- Status checking prevents infinite loops
- Alternative outputs cover different scenarios

Advanced Features

Batch Processing
- Modify the trigger to handle multiple images
- Add queue management for high-volume usage
- Implement user session tracking

Custom Branding
- Add watermarks or logos to generated videos
- Customize color schemes and styling
- Include brand-specific dialogue templates

Analytics Integration
- Track usage metrics and success rates
- Monitor API costs and optimization opportunities
- Implement user behavior analytics

Cost Optimization

API Usage Management
- Monitor token consumption across services
- Implement caching for repeated requests
- Use lower-cost models for testing phases

Efficiency Improvements
- Optimize image sizes before processing
- Implement smart retry mechanisms
- Use batch processing where possible

This workflow transforms static product images into engaging, professional UGC videos automatically, saving hours of manual video creation while maintaining high-quality output perfect for social media platforms.
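For reference, the status check inside the wait-check-branch loop could look like the sketch below. The KIE.AI endpoint path and response fields shown here are assumptions; verify them against the VEO3 documentation for your account:

```javascript
// n8n Code node sketch: poll the video generation task and emit a flag the
// downstream IF node can branch on. Endpoint and field names are assumptions.
const taskId = $input.first().json.taskId;

const res = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://api.kie.ai/api/v1/veo/record-info?taskId=${taskId}`, // assumed
  headers: { Authorization: 'Bearer YOUR_KIE_AI_TOKEN' },
  json: true,
});

return [{
  json: {
    taskId,
    done: res.status === 'success',   // assumed response field
    failed: res.status === 'failed',  // assumed response field
    videoUrl: res.videoUrl ?? null,   // assumed response field
  },
}];
```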
by Intuz
This n8n template from Intuz provides a complete solution to automate the extraction of critical information from PDF documents like faxes, or any PDFs. It uses the power of Google Gemini's multimodal capabilities to read the document, identify key fields, and organize the data into a structured format, saving it directly to a Google Sheet.

Who's this workflow for?

- Healthcare Administrators
- Medical Billing Teams
- Legal Assistants
- Data Entry Professionals
- Office Managers

How it works

1. Upload via Web Form: The process starts when a user uploads a fax (as a PDF file) through a simple, secure web form generated by n8n.
2. AI Document Analysis: The PDF is sent directly to Google Gemini's advanced multimodal model, which reads the entire document, including text, tables, and form fields. It extracts all relevant information based on a detailed prompt.
3. AI Data Structuring: The raw extracted text is then passed to a second AI step. This step cleans the information and strictly structures it into a predictable JSON format (e.g., Patient ID, Name, DOB, etc.).
4. Save to Google Sheets: The final, structured data is automatically appended as a new, clean row in your designated Google Sheet, creating an organized and usable dataset from the unstructured fax.

Key Requirements to Use This Template

1. n8n Instance & Required Nodes:
   - An active n8n account (Cloud or self-hosted).
   - This workflow uses the official n8n LangChain integration (@n8n/n8n-nodes-langchain). If you are using a self-hosted version of n8n, please ensure this package is installed.
2. Google Accounts:
   - Google Drive account: for temporarily storing the uploaded file.
   - Google Gemini AI account: a Google Cloud account with the Vertex AI API (for Gemini models) enabled and an associated API key.
   - Google Sheets account: a pre-made Google Sheet with columns that match the data you want to extract.

Customer Setup Guide

Here is a detailed, step-by-step guide to help you configure and run this workflow.

1. Before You Begin: Prerequisites

Please ensure you have the following ready:
- The FAX-Content-Extraction.json file we provided.
- Active accounts for n8n, Google Drive, Google Cloud (for Gemini AI), and Google Sheets.
- A Google Sheet created with header columns that match the data you want to extract (e.g., Patient ID, Patient Name, Date of Birth, etc.).

2. Step-by-Step Configuration

Step 1: Import the Workflow
- Open your n8n canvas.
- Click "Import from File" and select the FAX-Content-Extraction.json file.
- The workflow will appear on your canvas.

Step 2: Set Up the Form Trigger
- The workflow starts with the "On form submission" node. Click on this node.
- In the settings panel, you will see a "Form URL". Copy this URL. This is the link to the web form where you will upload your fax files.

Step 3: Configure the Google Drive Node
- Click on the "Upload file" (Google Drive) node.
- Credentials: select your Google Drive account from the "Credentials" dropdown or click "Create New" to connect your account.
- Folder ID: in the "Folder ID" field, choose the specific Google Drive folder where you want the uploaded faxes to be saved.

Step 4: Configure the Google Gemini AI Nodes (Very Important)

This workflow uses AI in two places, and both need to be connected.

First AI call (PDF reading):
- Click on the "Call Gemini 2.0 Flash with PDF Capabilities" (HTTP Request) node.
- Under "Authentication", make sure "Predefined Credential Type" is selected.
- For "Credential Type", choose "Google Palm API".
In the "Credentials" dropdown, select your Google Gemini API key or click "Create New" to add it. Second AI Call (Data Structuring): Click on the "Google Gemini Chat Model" node (it's connected below the "Basic LLM Chain" node). In the "Credentials" dropdown, select the same Google Gemini API key you used before. Step 5: (Optional) Customize What Data is Extracted You have full control over what information the AI looks for. To change the extraction rules: Click on the "Define Prompt" node. You can edit the text in the "Value" field to tell the AI what to look for (e.g., "Extract only the patient's name and medication list"). To change the final output columns: Click on the "Basic LLM Chain" node. In the "Text" field, you can edit the JSON schema to add, remove, or rename the fields you want in your final output. The keys here MUST match the column headers in your Google Sheet. Step 6: Configure the Final Google Sheets Node Click on the "Append row in sheet" node. Credentials: Select your Google Sheets account from the "Credentials" dropdown. Document ID: Select your target spreadsheet from the "Document" dropdown list. Sheet Name: Select the specific sheet within that document. Columns: Ensure that the fields listed here match the columns in your sheet and the schema from the "Basic LLM Chain" node. 4. Running the Workflow Save and Activate: Click "Save" and then toggle the workflow to "Active". Open the Form: Open the Form URL you copied in Step 2 in a new browser tab. Upload a File: Upload a sample fax PDF and submit the form. Check Your Sheet: After a minute, a new row with the extracted data should appear in your Google Sheet. Connect with us Website: https://www.intuz.com/services Email: getstarted@intuz.com LinkedIn: https://www.linkedin.com/company/intuz Get Started: https://n8n.partnerlinks.io/intuz For Custom Worflow Automation Click here- Get Started
by Jay Emp0
Automatically turns trending Reddit posts into punchy, first-person tweets powered by Google Gemini AI, the Reddit and Twitter APIs, and Google Sheets logging.

🧩 Overview

This workflow repurposes Reddit content into original tweets every few hours. It’s perfect for creators, marketers, or founders who want to automate content inspiration while keeping tweets sounding human, edgy, and fresh.

Core automation loop:
1. Fetch trending Reddit posts from selected subreddits.
2. Use Gemini AI to write a short, first-person tweet.
3. Check your Google Sheet to avoid reusing the same Reddit post.
4. Publish to Twitter automatically.
5. Log the tweet + Reddit reference in Google Sheets.

🧠 Workflow Diagram

🪄 How It Works

1️⃣ Every 2 hours, the workflow triggers automatically.
2️⃣ It picks a subreddit (like r/automation, r/n8n, r/SaaS).
3️⃣ Gemini AI analyzes a rising Reddit post and writes a fresh, short tweet.
4️⃣ The system checks your Google Sheet to ensure it hasn’t used that Reddit post before (see the sketch at the end of this section).
5️⃣ Once validated, the tweet is published via the Twitter API and logged.

🧠 Example Tweet Output

📊 Logged Data (Google Sheets)

Each tweet is automatically logged for version control and duplication checks.

| Date | Subreddit | Post ID | Tweet Text |
|------|-----------|---------|------------|
| 08/10/2025 | n8n_ai_agents | 1o16ome | Just saw a wild n8n workflow on Reddit... |

⚙️ Key Components

| Node | Function |
|------|----------|
| Schedule Trigger | Runs every 2 hours to generate a new tweet. |
| Code (Randomly Decide Subreddit) | Picks one subreddit randomly from your preset list. |
| Gemini Chat Model | Generates tweet text in a first-person tone using custom prompt rules. |
| Reddit Tool | Fetches top or rising posts from the chosen subreddit. |
| Google Sheets (read database) | Keeps a record of already-used Reddit posts. |
| Structured Output Parser | Ensures consistent tweet formatting (tweet text, subreddit, post ID). |
| Twitter Node | Publishes the AI-generated tweet. |
| Append Row in Sheet | Logs the tweet with date, subreddit, and post ID. |

🧩 Setup Tutorial

1️⃣ Prerequisites

| Tool | Purpose |
|------|---------|
| n8n Cloud or Self-Host | Workflow execution |
| Google Gemini API Key | For tweet generation |
| Reddit OAuth2 API | To fetch posts |
| Twitter (X) API OAuth2 | To publish tweets |
| Google Sheets API | For logging and duplication tracking |

2️⃣ Import the Workflow

1. Download Reddit Twitter Automation.json.
2. In n8n, click Import Workflow → From File.
3. Connect your credentials:
   - Gemini → Gemini
   - Reddit → Reddit account
   - Twitter → X
   - Google Sheets → Gsheet

3️⃣ Configure Google Sheet

Your sheet must include these columns:

| Column | Description |
|--------|-------------|
| PAST TWEETS | The tweet text |
| Date | Auto-generated date |
| subreddit | Reddit source |
| post_id | Reddit post reference |

4️⃣ Customize Subreddits

In the Code node, update this array to choose which subreddits to monitor:

```javascript
const subreddits = [ "n8n", "microsaas", "SaaS", "automation", "n8n_ai_agents" ];
```
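The duplication check against the sheet boils down to a set lookup. A minimal sketch of the idea in a Code node, assuming the Google Sheets read node is named "read database" (as in the components table) and the sheet has the post_id column shown above:

```javascript
// n8n Code node sketch: drop Reddit posts whose IDs are already logged.
// "read database" and "post_id" follow the names used in this template;
// post.json.id assumes the Reddit node exposes the post ID as "id".
const usedIds = new Set(
  $('read database').all().map((row) => row.json.post_id),
);

return $input.all().filter((post) => !usedIds.has(post.json.id));
```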
by Budi SJ
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Multi Platform Content Generator from YouTube using AI & RSS

This workflow automates content generation by monitoring YouTube channels, extracting transcripts via AI, and creating platform-optimized content for LinkedIn, X/Twitter, Threads, and Instagram. Ideal for creators, marketers, and social media managers aiming to scale content production with minimal effort.

✨ Key Features

- 🔔 **Automated YouTube Monitoring** via RSS feed
- 🧠 **AI-Powered Transcript Extraction** using the Supadata API
- ✍️ **Multi-Platform Content Generation** with OpenRouter AI
- 🎯 **Platform Optimization** based on tone and character limits
- 📬 **Telegram Notification** for easy preview
- 📊 **Centralized Data Management via Google Sheets**

> 🗂️ All video data, summaries, and generated content are tracked and stored in a single, centralized Google Sheets template.
> This ensures full visibility, easy access, and smooth collaboration across your team.

⚙️ Workflow Components

1. 🧭 Channel Monitoring
- **Schedule Trigger**: initiates the workflow periodically
- **Google Sheets (Read)**: pulls YouTube channel URLs
- **HTTP Request + HTML Parser**: extracts channel IDs from URLs
- **RSS Reader**: fetches the latest video metadata

2. 🧾 Content Processing
- **Supadata API**: extracts the transcript from the YouTube video
- **OpenRouter AI**: summarizes the transcript and generates content per platform
- **Conditional Check**: prevents duplicate content by checking existing records

3. 📤 Multi-Platform Output (see the sketch at the end of this listing)
- **LinkedIn**: story-driven format (≤ 1300 characters)
- **X/Twitter**: short, punchy copy (≤ 280 characters)
- **Threads**: friendly, conversational
- **Instagram**: short captions for visual posts

4. 🗃️ Data Management
- **Google Sheets (Write)**: stores video metadata and generated posts
- **Telegram Bot**: sends a content preview
- **ID Tracking**: avoids reprocessing using the video ID

🔐 Required Credentials

- **Google Sheets OAuth2**
- **Supadata API**
- **OpenRouter API**
- **Telegram Bot Token & Chat ID**

🎁 Benefits

- ⌛ **Save Time**: automates transcript and content generation
- 🔊 **Consistent Tone**: adjust AI prompts for brand voice
- 📡 **Multi-Platform Ready**: one video → multiple formats
- 📂 **Centralized Logs via Google Sheets**: easily track, audit, and collaborate
- 🚀 **Scalable**: handle many channels with ease
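As a rough sketch of the platform-optimization step, a Code node could enforce the character limits listed above before publishing. The platform and content field names here are assumptions; map them to your own item structure:

```javascript
// n8n Code node sketch: trim generated copy to per-platform limits.
// The platform/content field names are assumptions for illustration.
const LIMITS = { linkedin: 1300, twitter: 280 };

return $input.all().map((item) => {
  const { platform, content } = item.json;
  const limit = LIMITS[platform];
  const trimmed = limit && content.length > limit
    ? content.slice(0, limit - 1).trimEnd() + '…'
    : content;
  return { json: { ...item.json, content: trimmed } };
});
```

In practice you may prefer to let the AI prompt enforce the limit and keep this node only as a safety net.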
by Simeon Penev
Who’s it for

Content/SEO teams who want a fast, consistent, research-driven brief for copywriters from a single keyword, without manually reviewing and analyzing the SERP (Google results).

How it works / What it does

1. A Form Trigger collects the keyword/topic and redirects to the Google Drive folder after the final node.
2. FireCrawl Search & Scrape pulls the top 5 pages for the chosen keyword.
3. An AI Agent (with Think + OpenAI Chat Model) analyzes the sources and generates an original Markdown brief.
4. Markdown to JSON converts the Markdown into Google Docs batchUpdate requests (H1/H2/H3, lists, links, spacing). These are then used in Update a document to fill the empty doc (see the sketch below).
5. Create a document + Update a document write a Google Doc titled "SEO Brief for [keyword]" and update it in your target Drive folder.

How to set up

1. Add credentials: Firecrawl (Authorization header), OpenAI (Chat), Google Docs OAuth2.
2. Replace the placeholders: {{APIKEY}}, {{googledrivefolderid}}, {{googledrivefolderurl}}.
3. Publish and open the Form URL to test.

Requirements

Firecrawl API key • OpenAI API key • Google account with access to the target Drive folder.

Resources

- Google OAuth2 credentials setup: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
- Firecrawl API key: https://take.ms/lGcUp
- OpenAI API key: https://docs.n8n.io/integrations/builtin/credentials/openai/
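To illustrate step 4, a Google Docs batchUpdate payload for a single H1 heading could look like the sketch below. Real requests must track character offsets as text is inserted, which is what the Markdown-to-JSON step handles; the offsets here are hand-computed for this one insert only:

```javascript
// Simplified example of Google Docs batchUpdate requests: insert a heading,
// then apply the HEADING_1 named style to it.
const requests = [
  {
    insertText: {
      location: { index: 1 },
      text: 'SEO Brief for "best crm software"\n',
    },
  },
  {
    updateParagraphStyle: {
      range: { startIndex: 1, endIndex: 34 }, // spans the inserted heading text
      paragraphStyle: { namedStyleType: 'HEADING_1' },
      fields: 'namedStyleType',
    },
  },
];
```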
by Onur
Automated B2B Lead Generation: Google Places, Scrape.do & AI Enrichment

This workflow is a powerful, fully automated B2B lead generation engine. It starts by finding businesses on Google Maps based on your criteria (e.g., "dentists" in "Istanbul"), assigns a quality score to each, and then uses Scrape.do to reliably access their websites. Finally, it leverages an AI agent to extract valuable contact information like emails and social media profiles. The final, enriched data is then neatly organized and saved directly into a Google Sheet.

This template is built for reliability, using Scrape.do to handle the complexities of web scraping, ensuring you can consistently gather data without getting blocked.

🚀 What does this workflow do?

- **Automatically finds businesses** using the Google Places API based on a category and location you define.
- **Calculates a leadScore** for each business based on its rating, website presence, and operational status to prioritize high-quality leads (see the sketch after the "How it Works" section).
- **Filters out low-quality leads** to ensure you only focus on the most promising prospects.
- **Reliably scrapes the website** of each high-quality lead using Scrape.do to bypass common blocking issues and retrieve the raw HTML.
- **Uses an AI Agent (OpenAI)** to intelligently parse the website's HTML and extract hard-to-find contact details (emails, social media links, phone numbers).
- **Saves all enriched lead data** to a Google Sheet, creating a clean, actionable list for your sales or marketing team.
- **Runs on a schedule**, continuously finding new leads without any manual effort.

🎯 Who is this for?

- **Sales & Business Development Teams:** automate prospecting and build targeted lead lists.
- **Marketing Agencies:** generate leads for clients in specific industries and locations.
- **Freelancers & Consultants:** quickly find potential clients for your services.
- **Startups & Small Businesses:** build a customer database without spending hours on manual research.

✨ Benefits

- **Full Automation:** set it up once and let it run on a schedule to continuously fill your pipeline.
- **AI-Powered Enrichment:** go beyond basic business info and get actual emails and social profiles that aren't available on Google Maps.
- **Reliable Website Access:** leverages **Scrape.do** to handle proxies and prevent IP blocks, ensuring consistent data gathering from target websites.
- **High-Quality Leads:** the built-in scoring and filtering system ensures you don't waste time on irrelevant or incomplete listings.
- **Centralized Database:** all your leads are automatically organized in a single, easy-to-access Google Sheet.

⚙️ How it Works

1. Schedule Trigger: the workflow starts automatically at your chosen interval (e.g., daily).
2. Set Parameters: you define the business type (searchCategory) and location (locationName) in a central Set node.
3. Find Businesses: it calls the Google Places API to get a list of businesses matching your criteria.
4. Score & Filter: a custom Function node scores each lead, and an IF node then separates high-quality leads from low-quality ones.
5. Loop & Enrich: the workflow processes each high-quality lead one by one. It uses a scraping service (Scrape.do) to reliably fetch the lead's website content, and an AI Agent (OpenAI) analyzes the website's footer to find contact and social media links.
6. Save Data: the final, enriched lead information is appended as a new row in your Google Sheet.
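A sketch of what the scoring Function node might contain is below. The weights and thresholds are illustrative assumptions; the field names (rating, websiteUri, businessStatus) follow the Google Places API, but verify them against the API version you call:

```javascript
// n8n Function/Code node sketch of a leadScore calculation.
// Weights are illustrative; tune them to your definition of a quality lead.
return $input.all().map((item) => {
  const biz = item.json;
  let leadScore = 0;
  if ((biz.rating ?? 0) >= 4.0) leadScore += 40;            // strong reviews
  if (biz.websiteUri) leadScore += 40;                      // has a site to enrich
  if (biz.businessStatus === 'OPERATIONAL') leadScore += 20; // still in business
  return { json: { ...biz, leadScore } };
});
```

A downstream IF node can then route items with, say, leadScore >= 60 into the enrichment loop and discard the rest.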
📋 n8n Nodes Used

- Schedule Trigger
- Set
- HTTP Request (for Google Places & Scrape.do)
- Function
- If
- Split in Batches (Loop Over Items)
- HTML
- Langchain Agent (with OpenAI Chat Model & Structured Output Parser)
- Google Sheets

🔑 Prerequisites

- An active n8n instance.
- A **Google Cloud Project** with the **Places API** enabled.
- A **Google Places API Key**, stored in n8n's Header Auth credentials.
- A **Scrape.do account and API token**. This is essential for reliably scraping websites without your n8n server's IP getting blocked.
- An **OpenAI account & API key** for the AI-powered data extraction.
- A **Google account** with access to Google Sheets.
- **Google Sheets API credentials (OAuth2)** configured in n8n.
- A **Google Sheet** prepared with columns to store the lead data (e.g., BusinessName, Address, Phone, Website, Email, Facebook, etc.).

🛠️ Setup

1. Import the workflow into your n8n instance.
2. Configure credentials. Create and/or select your credentials for:
   - Google Places API: in the "2. Find Businesses (Google Places)" node, select your Header Auth credential containing your API key.
   - Scrape.do: in the "6a. Scrape Website HTML" node, configure credentials for your Scrape.do account.
   - OpenAI: in the "OpenAI Chat Model" node, select your OpenAI credentials.
   - Google Sheets: in the "7. Save to Google Sheets" node, select your Google Sheets OAuth2 credentials.
3. Define your search: in the "1. Set Search Parameters" node, update the searchCategory and locationName values to match your target market.
4. Link your Google Sheet: in the "7. Save to Google Sheets" node, select your Spreadsheet and Sheet Name from the dropdown lists, and map the incoming data to the correct columns in your sheet.
5. Set your schedule: adjust the Schedule Trigger to run as often as you like (e.g., once a day).
6. Activate the workflow! Your automated lead generation will begin on the next scheduled run.
by Michael Gullo
Workflow Purpose

The workflow is designed to scan submitted URLs using urlscan.io and VirusTotal, combine the results into a single structured summary, and send the report via Telegram. I built this workflow for people who primarily work from their phones and receive a constant stream of emails throughout the day. If a user gets an email asking them to sign a document, review a report, or take any action where the link looks suspicious, they can simply open the Telegram bot and quickly check whether the URL is safe before clicking it.

Key Components

1. Input / Trigger
- Accepts URLs that need to be checked.
- Initiates requests to VirusTotal and urlscan.io.

2. VirusTotal Scan
- Always returns results if the URL is reachable.
- Provides reputation, malicious/clean flags, and scan metadata.

3. urlscan.io Scan
- Returns details on how the URL behaves when loaded (domains, requests, resources, etc.).
- Sometimes fails due to blocks or restrictions.

4. Error Handling with Code Node
- Checks whether urlscan.io responded successfully (see the sketch below).
- Ensures the workflow always produces a summary, even if urlscan.io fails.

5. Summary Generation
- If both scans succeed → summarizes the combined findings from VirusTotal + urlscan.io.
- If urlscan.io fails → states clearly in the summary: "urlscan.io scan was blocked/failed. Relying on VirusTotal results."
- Ensures the user still gets a complete security report.

6. Telegram Output
- The final formatted summary is delivered to a Telegram chat via the bot.
- The chat ID issue was fixed after the Code node restructuring.

Outcome

The workflow now guarantees a consistent, user-friendly summary regardless of urlscan.io failures. It leverages VirusTotal as the fallback source of truth. The Telegram bot provides real-time alerts with clear indications of scan success/failure.

Prerequisites

Telegram
1. In Telegram, start a chat with @BotFather.
2. Send /newbot, then pick a name and a unique username.
3. Copy the HTTP API token BotFather returns (store it securely).
4. Start a DM with your bot and send any message.
5. Call getUpdates and read the chat.id.

urlscan.io
1. Create/log into your urlscan.io account.
2. Go to Settings & API → New API key and generate a key.
3. (Recommended) In Settings & API, set Default Scan Visibility to Unlisted to avoid exposing PII in public scans.
4. Save the key securely (env var or n8n Credentials).

Rate limits note: urlscan.io enforces per-minute/hour/day quotas; exceeding them returns HTTP 429. You can view your personal quotas on their dashboard/quotas endpoint.

VirusTotal
1. Sign up / sign in to VirusTotal Community.
2. Open My API key (Profile menu) and copy your Public API key.
3. Store it securely (env var or n8n Credentials).
4. For a more reliable connection with VirusTotal and improved scanning results, enable the Header section in the node settings. Add a header parameter with a clear name (e.g., x-apikey), and then paste your API key into the Value field.

Rate limits (Public API): 4 requests/minute, 500/day; not for commercial workflows. Consider Premium if you'll exceed this.

How to Customize the Workflow

This workflow is designed to be highly customizable, allowing users to adapt it to their specific needs and use cases. For example, additional malicious-website scanners can be integrated through HTTP Request nodes. To make this work, simply update the Merge node so that all information flows correctly through the workflow.
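Below is a minimal sketch of the fallback check described in Key Components, step 4. The field names are assumptions; adapt them to the actual shape of your merged scan results:

```javascript
// n8n Code node sketch: fall back to VirusTotal when urlscan.io fails.
// data.urlscan / data.virustotal are assumed field names from the Merge node.
const data = $input.first().json;
const urlscanOk = Boolean(data.urlscan) && !data.urlscan.error;

return [{
  json: {
    virustotal: data.virustotal,
    urlscan: urlscanOk ? data.urlscan : null,
    note: urlscanOk
      ? 'Both scans completed.'
      : 'urlscan.io scan was blocked/failed. Relying on VirusTotal results.',
  },
}];
```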
In addition, users can connect Gmail or Outlook nodes to automatically test URLs, binary attachments, and other types of information received via email, helping them evaluate data before opening it. Users can also customize how they receive reports. For instance, results can be sent through Telegram (as in the default setup), Slack, or Microsoft Teams, or saved to Google Drive or a Google Sheet for recordkeeping and audit purposes. For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.