by Onur
# Lead Sourcing by Job Posts for Outreach with Scrape.do API, OpenAI & Google Sheets

## Overview

This n8n workflow automates the complete lead generation process by scraping job postings from Indeed, enriching company data via Apollo.io, identifying decision-makers, and generating personalized LinkedIn outreach messages using OpenAI. It integrates Scrape.do for reliable web scraping, Apollo.io for B2B data enrichment, OpenAI for AI-powered personalization, and Google Sheets for centralized data storage.

Perfect for: sales teams, recruiters, business development professionals, and marketing agencies looking to automate their outbound prospecting pipeline.

## Workflow Components

### 1. Schedule Trigger

| Property | Value |
|----------|-------|
| Type | Schedule Trigger |
| Purpose | Automatically initiates the workflow on a recurring schedule |
| Frequency | Weekly (every Monday) |
| Time | 00:00 UTC |

Function: Ensures consistent, hands-off lead generation by running the pipeline automatically without manual intervention.

### 2. Scrape.do Indeed API

| Property | Value |
|----------|-------|
| Type | HTTP Request (GET) |
| Purpose | Scrapes job listings from Indeed via the Scrape.do proxy API |
| Endpoint | https://api.scrape.do |
| Output Format | Markdown |

Request parameters:

| Parameter | Value | Description |
|-----------|-------|-------------|
| token | API Token | Scrape.do authentication |
| url | Indeed Search URL | Target job search page |
| super | true | Uses residential proxies |
| geoCode | us | US-based content |
| render | true | JavaScript rendering enabled |
| device | mobile | Mobile viewport for cleaner HTML |
| output | markdown | Lightweight text output |

Function: Fetches Indeed job listings with anti-bot bypass, returning clean markdown for easy parsing.
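The request above amounts to a single GET with query parameters. A minimal sketch of the URL the node builds, assuming placeholder values for the token and target search URL:

```javascript
// Sketch of the GET request the "Scrape.do Indeed API" node sends.
// Parameter names mirror the table above; token and search URL are placeholders.
function buildScrapeDoUrl(token, indeedSearchUrl) {
  const params = new URLSearchParams({
    token,                // Scrape.do authentication
    url: indeedSearchUrl, // target job search page
    super: "true",        // residential proxies
    geoCode: "us",        // US-based content
    render: "true",       // JavaScript rendering enabled
    device: "mobile",     // mobile viewport for cleaner HTML
    output: "markdown",   // lightweight text output
  });
  return `https://api.scrape.do?${params}`;
}

// Example: "data engineer" jobs, remote, posted in the last 7 days
const searchUrl = "https://www.indeed.com/jobs?q=data+engineer&l=Remote&fromage=7";
const requestUrl = buildScrapeDoUrl("YOUR_TOKEN", searchUrl);
```

Note that `URLSearchParams` percent-encodes the nested Indeed URL, which is what the Scrape.do `url` parameter expects.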
### 3. Parse Indeed Jobs

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Extracts structured job data from markdown |
| Mode | Run once for all items |

Extracted fields:

| Field | Description | Example |
|-------|-------------|---------|
| jobTitle | Position title | "Senior Data Engineer" |
| jobUrl | Indeed job link | "https://indeed.com/viewjob?jk=abc123" |
| jobId | Indeed job identifier | "abc123" |
| companyName | Hiring company | "Acme Corporation" |
| location | City, State | "San Francisco, CA" |
| salary | Pay range | "$120,000 - $150,000" |
| jobType | Employment type | "Full-time" |
| source | Data source | "Indeed" |
| dateFound | Scrape date | "2025-01-15" |

Function: Parses the markdown using regex patterns, filters invalid entries, and deduplicates by company name.

### 4. Add New Company (Google Sheets)

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores parsed job postings for tracking |
| Operation | Append rows |
| Target Sheet | "Add New Company" |

Function: Creates a historical record of all discovered job postings and companies for pipeline tracking.
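A minimal sketch of the parsing and deduplication logic described above. The regex and the markdown shape are assumptions (Scrape.do's markdown output varies), so treat the patterns as a starting point rather than the node's exact code:

```javascript
// Hypothetical markdown shape: "[Senior Data Engineer](https://indeed.com/viewjob?jk=abc123)"
// followed by company and location lines. Adjust the patterns to the real output.
function parseIndeedMarkdown(markdown, today = new Date()) {
  const jobs = [];
  const linkRe = /\[([^\]]+)\]\((https?:\/\/[^)]*viewjob\?jk=([a-z0-9]+)[^)]*)\)/gi;
  let m;
  while ((m = linkRe.exec(markdown)) !== null) {
    jobs.push({
      jobTitle: m[1].trim(),
      jobUrl: m[2],
      jobId: m[3],
      source: "Indeed",
      dateFound: today.toISOString().slice(0, 10),
    });
  }
  return jobs;
}

// Drop entries with no company name and keep one job per company
function dedupeByCompany(jobs) {
  const seen = new Set();
  return jobs.filter((job) => {
    const key = (job.companyName || "").trim().toLowerCase();
    if (!key || seen.has(key)) return false; // invalid or duplicate
    seen.add(key);
    return true;
  });
}
```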
### 5. Apollo Organization Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Enriches company data via the Apollo.io API |
| Endpoint | https://api.apollo.io/v1/organizations/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "q_organization_name": "Company Name",
  "page": 1,
  "per_page": 1
}
```

Response fields:

| Field | Description |
|-------|-------------|
| id | Apollo organization ID |
| name | Official company name |
| website_url | Company website |
| linkedin_url | LinkedIn company page |
| industry | Business sector |
| estimated_num_employees | Company size |
| founded_year | Year established |
| city, state, country | Location details |
| short_description | Company overview |

Function: Retrieves comprehensive company intelligence including LinkedIn profiles, industry classification, and employee count.

### 6. Extract Apollo Org Data

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Parses the Apollo response and merges it with the original data |
| Mode | Run once for each item |

Function: Extracts relevant fields from the Apollo API response and combines them with the job posting data for downstream processing.
### 7. Apollo People Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Finds decision-makers at target companies |
| Endpoint | https://api.apollo.io/v1/mixed_people/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "organization_ids": ["apollo_org_id"],
  "person_titles": [
    "CTO", "Chief Technology Officer", "VP Engineering", "Head of Engineering",
    "Engineering Manager", "Technical Director", "CEO", "Founder"
  ],
  "page": 1,
  "per_page": 3
}
```

Response fields:

| Field | Description |
|-------|-------------|
| first_name | Contact first name |
| last_name | Contact last name |
| title | Job title |
| email | Email address |
| linkedin_url | LinkedIn profile URL |
| phone_number | Direct phone |

Function: Identifies key stakeholders and decision-makers based on configurable title filters.

### 8. Format Leads

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Structures lead data for outreach |
| Mode | Run once for all items |

Function: Combines person data with company context, creating comprehensive lead profiles ready for personalization.

### 9. Generate Personalized Message (OpenAI)

| Property | Value |
|----------|-------|
| Type | OpenAI Node |
| Purpose | Creates custom LinkedIn connection messages |
| Model | gpt-4o-mini |
| Max Tokens | 150 |
| Temperature | 0.7 |

System prompt:

> You are a professional outreach specialist. Write personalized LinkedIn connection request messages. Keep messages under 300 characters. Be friendly, professional, and mention a specific reason for connecting based on their role and company.

User prompt variables:

| Variable | Source |
|----------|--------|
| Name | $json.fullName |
| Title | $json.title |
| Company | $json.companyName |
| Industry | $json.industry |
| Job Context | $json.jobTitle |

Function: Generates unique, contextual outreach messages that reference specific hiring activity and company details.
### 10. Merge Lead + Message

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Combines lead data with the generated message |
| Mode | Run once for each item |

Function: Merges the OpenAI response with the lead profile, creating the final enriched record.

### 11. Save Leads to Sheet

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores final lead data with personalized messages |
| Operation | Append rows |
| Target Sheet | "Leads" |

Data mapping:

| Column | Data |
|--------|------|
| First Name | Lead's first name |
| Last Name | Lead's last name |
| Title | Job title |
| Company | Company name |
| LinkedIn URL | Profile link |
| Country | Location |
| Industry | Business sector |
| Date Added | Timestamp |
| Source | "Indeed + Apollo" |
| Personalized Message | AI-generated outreach text |

Function: Creates an actionable lead database ready for outreach campaigns.

## Workflow Flow

```
Schedule Trigger
  |
  v
Scrape.do Indeed API          --> Fetches job listings with JS rendering
  |
  v
Parse Indeed Jobs             --> Extracts company names, job details
  |
  v
Add New Company               --> Saves to Google Sheets (Companies)
  |
  v
Apollo Org Search             --> Enriches company data
  |
  v
Extract Apollo Org Data       --> Parses API response
  |
  v
Apollo People Search          --> Finds decision-makers
  |
  v
Format Leads                  --> Structures lead profiles
  |
  v
Generate Personalized Message --> AI creates custom outreach
  |
  v
Merge Lead + Message          --> Combines all data
  |
  v
Save Leads to Sheet           --> Final storage (Leads)
```

## Configuration Requirements

### API Keys & Credentials

| Credential | Purpose | Where to Get |
|------------|---------|--------------|
| Scrape.do API Token | Web scraping with anti-bot bypass | scrape.do/dashboard |
| Apollo.io API Key | B2B data enrichment | apollo.io/settings/integrations |
| OpenAI API Key | AI message generation | platform.openai.com |
| Google Sheets OAuth2 | Data storage | n8n Credentials Setup |

### n8n Credential Setup
| Credential Type | Configuration |
|-----------------|---------------|
| HTTP Header Auth (Apollo) | Header: x-api-key, Value: Your Apollo API key |
| OpenAI API | API Key: Your OpenAI API key |
| Google Sheets OAuth2 | Complete OAuth flow with Google |

## Key Features

### Intelligent Job Scraping

- **Anti-Bot Bypass:** Residential proxy rotation via Scrape.do
- **JavaScript Rendering:** Full headless browser for dynamic content
- **Mobile Optimization:** Cleaner HTML with mobile viewport
- **Markdown Output:** Lightweight, easy-to-parse format

### B2B Data Enrichment

- **Company Intelligence:** Industry, size, location, LinkedIn
- **Decision-Maker Discovery:** Title-based filtering
- **Contact Information:** Email, phone, LinkedIn profiles
- **Real-Time Data:** Fresh information from Apollo.io

### AI-Powered Personalization

- **Contextual Messages:** References specific hiring activity
- **Character Limit:** Optimized for LinkedIn (300 chars)
- **Variable Temperature:** Balanced creativity and consistency
- **Role-Specific:** Tailored to the recipient's title and company

### Automated Data Management

- **Dual Sheet Storage:** Companies + Leads separation
- **Timestamp Tracking:** Historical records
- **Deduplication:** Prevents duplicate entries
- **Ready for Export:** CSV-compatible format

## Use Cases

### Sales Prospecting

- Identify companies actively hiring in your target market
- Find decision-makers at companies investing in growth
- Generate personalized cold outreach at scale
- Track the pipeline from discovery to contact

### Recruiting & Talent Acquisition

- Monitor competitor hiring patterns
- Identify companies building specific teams
- Connect with hiring managers directly
- Build talent pipeline relationships

### Market Intelligence

- Track industry hiring trends
- Monitor competitor expansion signals
- Identify emerging market opportunities
- Benchmark salary ranges by role

### Partnership Development

- Find companies investing in complementary areas
- Identify potential integration partners
- Connect with technical leadership
- Build a strategic relationship pipeline
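Components 8 (Format Leads) and 10 (Merge Lead + Message) are plain Code nodes. A sketch of their combined logic as pure functions; the field names follow the Data Mapping table, and the shape of the OpenAI node's output is an assumption to verify against a real execution:

```javascript
// Build a lead profile from an Apollo person plus the enriched company context (component 8)
function formatLead(person, company) {
  return {
    firstName: person.first_name,
    lastName: person.last_name,
    fullName: `${person.first_name} ${person.last_name}`,
    title: person.title,
    linkedinUrl: person.linkedin_url,
    companyName: company.name,
    industry: company.industry,
    country: company.country,
    jobTitle: company.jobTitle, // the posting that surfaced this company
  };
}

// Attach the generated message and tracking fields (component 10).
// `messageText` stands in for the OpenAI node's output -- its exact shape
// depends on the node version, so extract it upstream.
function mergeLeadAndMessage(lead, messageText, now = new Date()) {
  return {
    ...lead,
    personalizedMessage: messageText.trim(),
    dateAdded: now.toISOString().slice(0, 10),
    source: "Indeed + Apollo",
  };
}
```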
## Technical Notes

| Specification | Value |
|---------------|-------|
| Processing Time | 2-5 minutes per run (depending on job count) |
| Jobs per Run | ~25 unique companies |
| API Calls per Run | 1 Scrape.do + 25 Apollo Org + 25 Apollo People + ~75 OpenAI |
| Data Accuracy | 90%+ for company matching |
| Success Rate | 99%+ with proper error handling |

### Rate Limits to Consider

| Service | Free Tier Limit | Recommendation |
|---------|-----------------|----------------|
| Scrape.do | 1,000 credits/month | ~40 runs/month |
| Apollo.io | 100 requests/day | Add Wait nodes if needed |
| OpenAI | Based on usage | Monitor costs (~$0.01-0.05/run) |
| Google Sheets | 300 requests/minute | No issues expected |

## Setup Instructions

### Step 1: Import Workflow

1. Copy the JSON workflow configuration
2. In n8n: Workflows → Import from JSON
3. Paste the configuration and save

### Step 2: Configure Scrape.do

1. Sign up at scrape.do
2. Navigate to Dashboard → API Token
3. Copy your token (it is embedded in the URL query parameter, already configured)

To customize the search, change the url parameter in the "Scrape.do Indeed API" node:

- q=data+engineer (search term)
- l=Remote (location)
- fromage=7 (last 7 days)

### Step 3: Configure Apollo.io

1. Sign up at apollo.io
2. Go to Settings → Integrations → API Keys
3. Create a new API key
4. In n8n: Credentials → Add Credential → Header Auth
   - Name: x-api-key
   - Value: Your Apollo API key
5. Select this credential in both Apollo HTTP nodes

### Step 4: Configure OpenAI

1. Go to platform.openai.com
2. Create a new API key
3. In n8n: Credentials → Add Credential → OpenAI
4. Paste the API key
5. Select the credential in the "Generate Personalized Message" node

### Step 5: Configure Google Sheets

1. Create a new Google Spreadsheet with two sheets:
   - Sheet 1: "Add New Company", with columns companyName | jobTitle | jobUrl | location | salary | source | postedDate
   - Sheet 2: "Leads", with columns First Name | Last Name | Title | Company | LinkedIn URL | Country | Industry | Date Added | Source | Personalized Message
2. Copy the Sheet ID from the URL
3. In n8n: Credentials → Add Credential → Google Sheets OAuth2
4. Update both Google Sheets nodes with your Sheet ID

### Step 6: Test and Activate

1. Manual test: click the "Execute Workflow" button
2. Verify each node: check the outputs step by step
3. Review data: confirm the data appears in Google Sheets
4. Activate: toggle the workflow to "Active"

## Error Handling

### Common Issues

| Issue | Cause | Solution |
|-------|-------|----------|
| "Invalid character" | Empty/malformed company name | Check Parse Indeed Jobs output |
| "Node does not have credentials" | Credential not linked | Open the node → select the credential |
| Empty parse results | Indeed HTML structure changed | Check the Scrape.do raw output |
| Apollo rate limit (429) | Too many requests | Add a 5-10s Wait node between calls |
| OpenAI timeout | Too many tokens | Reduce the batch size or max_tokens |
| "Your request is invalid" | Malformed JSON body | Verify the expression syntax in HTTP nodes |

### Troubleshooting Steps

1. Verify credentials: test each credential individually
2. Check node outputs: use "Execute Node" for debugging
3. Monitor API usage: check the Apollo and OpenAI dashboards
4. Review logs: check the n8n execution history for details
5. Test with a sample: use a known company name to verify Apollo

### Recommended Error Handling Additions

For production use, consider adding:

- An IF node after Apollo Org Search to handle empty results
- An Error Workflow trigger for notifications
- Wait nodes between API calls for rate limiting
- Retry logic for transient failures

## Performance Specifications

| Metric | Value |
|--------|-------|
| Execution Time | 2-5 minutes per scheduled run |
| Jobs Discovered | ~25 per Indeed page |
| Leads Generated | 1-3 per company (based on title matches) |
| Message Quality | Professional, contextual, <300 chars |
| Data Freshness | Real-time from Indeed + Apollo |
| Storage Format | Google Sheets (unlimited rows) |

## API Reference

### Scrape.do API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| https://api.scrape.do | GET | Direct URL scraping |

Documentation:
scrape.do/documentation

### Apollo.io API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/organizations/search | POST | Company lookup |
| /v1/mixed_people/search | POST | People search |

Documentation: apolloio.github.io/apollo-api-docs

### OpenAI API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/chat/completions | POST | Message generation |

Documentation: platform.openai.com
by Li CHEN
# AWS News Analysis and LinkedIn Automation Pipeline

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

## Who's it for

This template is perfect for:

- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

## How it works

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

### Flow 1: News Collection and Analysis

1. Scheduled RSS Monitoring: automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. AI-Powered Analysis: uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - Professional summary
   - Key themes and keywords
   - Importance rating (Low/Medium/High)
   - Business impact assessment
3. Structured Data Storage: saves the analyzed news to a Feishu Bitable with approval status tracking

### Flow 2: LinkedIn Content Generation

1. Manual Approval Trigger: a Feishu automation sends approved news items to the webhook
2. AI Content Creation: AWS Bedrock generates professional LinkedIn posts with:
   - Attention-grabbing headlines
   - Technical insights from a Solutions Architect perspective
   - Business impact analysis
   - Call-to-action engagement
3. Automated Publishing: posts directly to LinkedIn with relevant hashtags

## How to set up

### Prerequisites

- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

## Detailed Configuration Steps
### 1. AWS Bedrock Setup

Step 1: Enable the Claude 3 Sonnet model

1. Log into your AWS Console
2. Navigate to AWS Bedrock
3. Go to Model access in the left sidebar
4. Find Anthropic Claude 3 Sonnet and click Request model access
5. Fill out the access request form (usually approved within minutes)
6. Once approved, verify the model appears in your Model access list

Step 2: Create an IAM user and credentials

1. Go to the IAM Console
2. Click Users → Create user
3. Name: n8n-bedrock-user
4. Attach policy: AmazonBedrockFullAccess (or create a custom policy with minimal permissions)
5. Go to the Security credentials tab → Create access key
6. Choose Application running outside AWS
7. Download the credentials CSV file

Step 3: Configure in n8n

1. In n8n, go to Credentials → Add credential
2. Select the AWS credential type
3. Enter your Access Key ID and Secret Access Key
4. Set Region to your preferred AWS region (e.g., us-east-1)
5. Test the connection

Useful links: AWS Bedrock Documentation, Claude 3 Sonnet Model Access, AWS Bedrock Pricing

### 2. Feishu Bitable Configuration

Step 1: Create a Feishu account and app

1. Sign up at Feishu International
2. Create a new Bitable (multi-dimensional table)
3. Go to Developer Console → Create App
4. Enable Bitable permissions in your app
5. Generate the App Token and App Secret

Step 2: Create the Bitable structure

Create a new Bitable with these columns:

- title (Text)
- pubDate (Date)
- summary (Long Text)
- keywords (Multi-select)
- rating (Single Select: Low, Medium, High)
- link (URL)
- approval_status (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:

- App Token: found in the app settings
- Table ID: found in the Bitable URL (tbl...)
Step 3: Set up the automation

1. In your Bitable, go to Automation → Create automation
2. Trigger: When field value changes → select the approval_status field
3. Condition: approval_status equals "Approved"
4. Action: Send HTTP request
   - Method: POST
   - URL: your n8n webhook URL (from Flow 2)
   - Headers: Content-Type: application/json
   - Body: {{record}}

Step 4: Configure Feishu credentials in n8n

1. Install the Feishu Lite community node (self-hosted only)
2. Add a Feishu credential with your App Token and App Secret
3. Test the connection

Useful links: Feishu Developer Documentation, Bitable API Reference, Feishu Automation Guide

### 3. LinkedIn Company Account Setup

Step 1: Create a LinkedIn app

1. Go to the LinkedIn Developer Portal
2. Click Create App
3. Fill in the app details:
   - App name: AWS News Automation
   - LinkedIn Page: select your company page
   - App logo: upload your logo
   - Legal agreement: accept the terms

Step 2: Configure OAuth2 settings

1. In your app, go to the Auth tab
2. Add the redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback
3. Request these scopes:
   - w_member_social (post on behalf of members)
   - r_liteprofile (read basic profile)
   - r_emailaddress (read email address)

Step 3: Get company page access

1. Go to your LinkedIn Company Page
2. Navigate to Admin tools → Manage admins
3. Ensure you have the Content admin or Super admin role
4. Note your Company Page ID (found in the page URL)

Step 4: Configure LinkedIn credentials in n8n

1. Add a LinkedIn OAuth2 credential
2. Enter your Client ID and Client Secret
3. Complete the OAuth2 flow by clicking Connect my account
4. Select your company page for posting

Useful links: LinkedIn Developer Portal, LinkedIn API Documentation, LinkedIn OAuth2 Guide
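When the automation fires, the Flow 2 webhook receives the `{{record}}` body. The exact payload shape depends on your Bitable setup, so the field paths in this sketch are assumptions to verify against a real execution before relying on them:

```javascript
// Normalize an assumed Feishu {{record}} payload into the fields Flow 2 needs.
// Inspect a real webhook execution in n8n and adjust the paths to match.
function extractApprovedRecord(body) {
  const fields = body.fields ?? body; // values may be nested under "fields"
  return {
    title: fields.title ?? "",
    link: fields.link ?? "",
    summary: fields.summary ?? "",
    rating: fields.rating ?? "Low",
    approved: (fields.approval_status ?? "") === "Approved",
  };
}
```

Gating the LinkedIn branch on `approved` adds a second safety check even though the Feishu condition should only send approved rows.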
### 4. Workflow Activation

Final setup steps:

1. Import the workflow JSON into n8n
2. Configure all credential connections:
   - AWS Bedrock credentials
   - Feishu credentials
   - LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

## Requirements

### Service Requirements

- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and the approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn company account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes**: Feishu Lite node (self-hosted only)

### Technical Requirements

- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

### Cost Considerations

- **AWS Bedrock**: ~$0.01-0.05 per news analysis
- **Feishu**: free tier available, paid plans for advanced features
- **LinkedIn**: free API access with rate limits
- **n8n**: self-hosted (free) or cloud subscription

## How to customize the workflow

### Content Customization

- **Modify AI prompts** in the AI Agent nodes to change the tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

### Integration Options

- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM** systems to track content performance
- **Add content calendar** integration for better planning

### Advanced Features

- **Multi-language support** by modifying the AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

### Technical Modifications

- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls

## Important Notes

### Service Limitations

- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting; adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations**: each AI analysis costs approximately $0.01-0.05 USD per news item

### Troubleshooting Common Issues

AWS Bedrock issues:

- **Model not found**: ensure Claude 3 Sonnet access is approved in your region
- **Access denied**: verify the IAM permissions include Bedrock service access
- **Rate limiting**: implement retry logic or reduce the analysis frequency

Feishu integration issues:

- **Authentication failed**: check that the App Token and App Secret are correct
- **Table not found**: verify the Table ID matches your Bitable URL
- **Automation not triggering**: ensure the webhook URL is accessible and returns a 200 status

LinkedIn posting issues:

- **OAuth2 errors**: re-authenticate the LinkedIn credentials
- **Posting failed**: verify company page admin permissions
- **Rate limits**: LinkedIn has daily posting limits for company pages

### Security Best Practices

- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
by Khairul Muhtadin
Stop wasting hours watching long videos. This n8n workflow acts as your personal "TL;DW" (Too Long; Didn't Watch) assistant. It automatically pulls YouTube transcripts using Decodo, analyzes them with Google Gemini, and sends a detailed summary straight to your Telegram.

## Why You Need This

- **Save Time:** turn a 2-hour video into a 5-minute read (95% faster).
- **Don't Miss a Thing:** captures key points, chapters, tools mentioned, and quotes that you might miss while skimming.
- **Instant Results:** get a structured summary in Telegram within 30-60 seconds.
- **Multi-Language:** works with any video language that has YouTube captions.

## Who Is This For?

- **Creators & Marketers:** spy on competitor strategies and extract tools without watching endless footage.
- **Students:** turn lecture recordings into instant study notes.
- **Busy Pros:** digest conference talks and webinars on the go.

## How It Works

1. Send Link: you message a YouTube link to your Telegram bot.
2. Scrape: the bot uses the Decodo API to grab the video transcript and metadata (views, chapters, etc.).
3. Analyze: Google Gemini reads the text and writes a structured summary (overview, takeaways, tools).
4. Deliver: you receive the formatted summary in chat.

## Setup Guide

### What You Need

- **n8n instance** (to run the workflow)
- **Telegram Bot Token** (free via @BotFather)
- **Decodo Scraper API Key** (for YouTube data - Get it here)
- **Google Gemini API Key** (for the AI - Get it here)

### Quick Installation

1. Import: load the JSON file into your n8n instance.
2. Credentials: add your API keys for Telegram, Decodo, and Google Gemini in the n8n credentials section.
3. Configure: in the "Alert Admin" node, set the chatId to your Telegram User ID (find it via @userinfobot).
4. (Optional) Change the languageCode in the Config node if you want non-English transcripts.
5. Test: send a YouTube link to your bot. You should see a "Processing..." message followed by your summary!
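The link check in step 1 can be sketched as a small validator that accepts standard youtube.com and youtu.be links; the pattern below is illustrative, not the workflow's exact logic:

```javascript
// Return the 11-character video ID for standard YouTube links, or null otherwise
function extractYouTubeId(text) {
  const match = text.match(/(?:youtube\.com\/watch\?v=|youtu\.be\/)([A-Za-z0-9_-]{11})/);
  return match ? match[1] : null;
}
```

Anything that yields `null` would trigger the "Not a YouTube URL" reply instead of a Decodo call.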
## Troubleshooting & Tips

- **"Not a YouTube URL":** make sure you are sending a standard youtube.com or youtu.be link.
- **No Transcript:** the video must have captions (auto-generated or manual) for this to work.
- **Customization:** you can edit the AI prompt in the "Generate TLDR" node to change how the summary looks (e.g., "Make it funny" or "Focus on technical details").

Created by: Khaisa Studio
Category: AI-Powered Automation
Tags: YouTube, AI, Telegram, Summarization, Decodo, Gemini

Need custom workflows? Contact us

Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
by DataForSEO
This weekly workflow automatically discovers new high-volume, ranked keywords for your domain on Google without manual SERP monitoring.

On each run, the workflow fetches the latest ranking and search volume data using the DataForSEO Labs API and stores a fresh snapshot in Google Sheets. It then compares this data with the previous run to identify any new keywords your domain started ranking for, focusing on queries with a search volume above 1,000. All newly ranked keywords that match this rule are added to a dedicated Google Sheet, along with their ranking position and search volume, creating a growing historical log you can use to analyze gains over time. Once new terms are identified, the workflow creates tasks in Asana to help your team act on them quickly, and sends you a Slack summary highlighting the latest changes.

## Who's it for

SEO professionals, marketers, and content teams who want an automated way to discover newly ranked, high-volume Google keywords and turn organic ranking gains into actionable content or optimization tasks.

## What it does

This workflow automatically detects when your domain starts ranking for new high-volume keywords on Google, records them in Google Sheets, creates related tasks in Asana, and sends a weekly summary via Slack.

## How it works

1. Runs on a predefined schedule (default: once a week).
2. Reads your keywords and target domains from Google Sheets.
3. Extracts the latest Google results and keyword metrics via the DataForSEO API.
4. Compares the current data with the previous snapshot.
5. Logs newly ranked keywords to a dedicated Google Sheet.
6. Creates follow-up tasks in Asana for the content team.
7. Sends a Slack summary with the key changes.

## Requirements

- DataForSEO account and API credentials
- Google Sheets spreadsheet with your keywords, following the required column structure (as in the example)
- Google Sheets spreadsheet with your target domains, following the required column structure (as in the example)
- Asana account
- Slack account

## Customization

You can easily tailor this workflow to your needs by adjusting the run schedule, changing the minimum search volume threshold, exporting results to other tools (like Looker Studio or BigQuery), and customizing the content of the Asana task or Slack message to match your team's workflow.
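The snapshot comparison step can be sketched as follows; the row field names (`keyword`, `searchVolume`) are assumptions based on the described sheet structure:

```javascript
// Keywords in the current snapshot that were absent from the previous one,
// filtered to queries above the minimum search volume (default 1,000)
function findNewHighVolumeKeywords(previousRows, currentRows, minVolume = 1000) {
  const known = new Set(previousRows.map((row) => row.keyword.toLowerCase()));
  return currentRows.filter(
    (row) => !known.has(row.keyword.toLowerCase()) && row.searchVolume > minVolume
  );
}
```

Changing `minVolume` is the customization point mentioned above for adjusting the search volume threshold.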
by Fahmi Fahreza
# Weekly SEO Watchlist Audit to Google Sheets (Gemini + Decodo)

Sign up for Decodo HERE for a discount.

Automatically fetches page content, generates a compact SEO audit (score, issues, fixes), and writes both a per-URL summary and a normalized "All Issues" table to Google Sheets. Great for weekly monitoring and prioritization.

## Who's it for?

Content/SEO teams that want lightweight, scheduled audits of key pages with actionable next steps and spreadsheet reporting.

## How it works

1. A weekly trigger loads the Google Sheet of URLs.
2. Split in Batches processes each URL.
3. Decodo fetches the page content (markdown + status).
4. Gemini produces a strict JSON audit via the AI Chain + Output Parser.
5. Code nodes flatten the data for two tabs.
6. Google Sheets nodes append the Summary and All Issues rows.
7. Split in Batches continues to the next URL.

## How to set up

1. Add credentials for Google Sheets, Decodo, and Gemini.
2. Set sheet_id and the sheet GIDs in the Set node.
3. Ensure the input sheet has a URL column.
4. Configure your Google Sheets tabs with headers matching each field being appended (e.g., URL, Decodo Score, Priority, etc.).
5. Adjust the schedule as needed.
6. Activate the workflow.
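The Code nodes in step 5 flatten Gemini's JSON audit into rows for the two tabs. The audit shape below (a score plus an issues array with priority and fix fields) is an assumption based on the description, so align it with your actual output parser schema:

```javascript
// One summary row per audited URL (assumed audit shape: { score, issues: [...] })
function toSummaryRow(url, audit) {
  return { URL: url, "Decodo Score": audit.score, "Issue Count": audit.issues.length };
}

// One row per issue for the normalized "All Issues" tab
function toIssueRows(url, audit) {
  return audit.issues.map((issue) => ({
    URL: url,
    Priority: issue.priority,
    Issue: issue.description,
    "Suggested Fix": issue.fix,
  }));
}
```

The row keys should match the column headers you configured in step 4 so the append nodes map fields correctly.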
by Club de Inteligencia Artificial Politécnico CIAP
# Telegram Appointment Scheduling Bot with n8n

## Description

Tired of managing appointments manually? This template transforms your Telegram account into a smart virtual assistant that handles the entire scheduling process for you, 24/7.

This workflow allows you to deploy a fully functional Telegram bot that not only schedules appointments but also checks real-time availability in your Google Calendar, logs a history in Google Sheets, and allows your clients to cancel or view their upcoming appointments. It's the perfect solution for professionals, small businesses, or anyone looking to automate their booking system professionally and effortlessly.

## Key Features

- **Complete Appointment Management:** allows users to schedule, cancel, and list their future appointments.
- **Conflict Prevention:** integrates with Google Calendar to check availability before confirming a booking, eliminating the risk of double-booking.
- **Automatic Logging:** every confirmed appointment is saved to a row in Google Sheets, creating a perfect database for tracking and analysis.
- **Smart Interaction:** the bot handles unrecognized commands and guides the user, ensuring a smooth experience.
- **Easy to Adapt:** connect your own accounts, customize messages, and tailor it to your business needs in minutes.

## Setup

Follow these steps to deploy your own instance of this bot:

### 1. Prerequisites

- An n8n instance (Cloud or self-hosted)
- A Telegram account
- A Google account

### 2. Telegram Bot

1. Talk to @BotFather on Telegram.
2. Create a new bot using /newbot.
3. Give it a name and a username.
4. Copy and save the API token it provides.

### 3. Google Cloud & APIs

1. Go to the Google Cloud Console.
2. Create a new project.
3. Enable the Google Calendar API and Google Sheets API.
4. Create OAuth 2.0 Client ID credentials. Make sure to add your n8n instance's OAuth redirect URL.
5. Save the Client ID and Client Secret.

### 4. Google Sheets

1. Create a new spreadsheet in Google Sheets.
2. Define the column headers in the first row.
For example: id, Client Name, Date and Time, ISO Date.

### 5. n8n

1. Import the workflow JSON file into your n8n instance.
2. Set up the credentials:
   - Telegram: create a new credential and paste your bot's token.
   - Google Calendar & Google Sheets (OAuth2): create a new credential and paste the Client ID and Client Secret from the Google Cloud Console.
3. Review the Google Calendar and Google Sheets nodes to select your correct calendar and spreadsheet.
4. Activate the workflow!

## Usage

Once the bot is running, you can interact with it using the following commands in Telegram:

- **To start the bot:** /start
- **To schedule a new appointment:** agendar YYYY-MM-DD HH:MM Your Full Name
- **To cancel an existing appointment:** cancelar YYYY-MM-DD HH:MM Your Full Name
- **To view your future appointments:** mis citas Your Full Name

## Authors

Jaren Pazmiño, President of the Polytechnic Artificial Intelligence Club (CIAP)
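The text commands listed in the Usage section can be routed with a parser along these lines; the return shape is illustrative, not the exact node logic:

```javascript
// Map an incoming Telegram message to one of the bot's commands
function parseCommand(text) {
  const schedule = text.match(/^agendar (\d{4}-\d{2}-\d{2}) (\d{2}:\d{2}) (.+)$/);
  if (schedule) return { action: "schedule", date: schedule[1], time: schedule[2], name: schedule[3] };
  const cancel = text.match(/^cancelar (\d{4}-\d{2}-\d{2}) (\d{2}:\d{2}) (.+)$/);
  if (cancel) return { action: "cancel", date: cancel[1], time: cancel[2], name: cancel[3] };
  const list = text.match(/^mis citas (.+)$/);
  if (list) return { action: "list", name: list[1] };
  return { action: "unknown" }; // triggers the bot's guidance message
}
```

The `unknown` branch is where the "handles unrecognized commands and guides the user" behavior hooks in.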
by Santhej Kallada
Who is this for? Creators, designers, and developers exploring AI-powered image generation. Automation enthusiasts who want to integrate image creation into n8n workflows. Telegram bot builders looking to add visual AI capabilities. Marketers or freelancers automating creative content workflows.

What problem is this workflow solving? Creating AI images usually requires multiple tools and manual setup. This workflow removes the complexity by: Connecting Nano Banana (the AI image model) directly to n8n. Allowing image generation via a Telegram chatbot. Providing a no-code setup that is fully automated and scalable.

What this workflow does
This workflow demonstrates how to generate AI images using Nano Banana and n8n, with an integrated Telegram chatbot interface. The process includes: Connecting Gemini Nano Banana to n8n. Automating image generation requests triggered from Telegram. Returning AI-generated images back to the user. Allowing customization of prompts and styles dynamically. By the end, you'll have a fully functional automation to generate and send AI-created images through Telegram, no coding required.

Setup
Create accounts: Sign up on n8n.io and ensure you have Telegram Bot API access. Connect your Nano Banana or Gemini API endpoint.
Set up your Telegram Bot: Use BotFather to create a new bot and get the token. Add the "Telegram Trigger" node in n8n.
Configure the Nano Banana connection: Add an HTTP Request node for the Nano Banana API. Insert your API key and prompt parameters.
Handle responses: Parse the AI-generated image output. Send the image file back to the Telegram user.
Test and deploy: Run a sample image prompt. Verify that Telegram returns the correct generated image.

How to customize this workflow to your needs
Modify prompts or styles to fit different artistic use cases. Add conditional logic for image size, aspect ratio, or filters. Integrate with Google Drive or Notion for image storage.
Schedule automatic image generation for campaigns or content creation. Expand with OpenAI or Stability AI for hybrid workflows.

Notes
The Nano Banana API may have rate limits depending on usage. Ensure your Telegram bot has permission to send files and images. You can host this workflow on n8n Cloud or self-hosted setups.

Want a video tutorial on how to set up this automation? https://youtu.be/0s6ZdU1fjc4
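Since the HTTP Request node needs an API key and prompt parameters, its configuration can be sketched as a plain request object. Everything below is a placeholder (the endpoint URL, field names, and header scheme are assumptions); check the actual Nano Banana / Gemini API documentation for the real contract:

```javascript
// Hypothetical request builder for the image-generation HTTP Request node.
// The URL and body fields are placeholders, not the real API.
function buildImageRequest(apiKey, prompt, { size = '1024x1024' } = {}) {
  return {
    method: 'POST',
    url: 'https://example.com/v1/images/generate', // placeholder endpoint
    headers: {
      Authorization: `Bearer ${apiKey}`, // auth scheme assumed
      'Content-Type': 'application/json',
    },
    body: { prompt, size }, // prompt parameters described in the setup above
  };
}
```

In n8n you would map these values into the HTTP Request node's URL, header, and JSON body fields rather than calling the function directly.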
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Website sitemap generator and visual tree creator Who's it for Web developers, SEO specialists, UX designers, and digital marketers who need to analyze website structure, create visual sitemaps, or audit site architecture for optimization purposes. What it does This workflow automatically generates a comprehensive sitemap from any website URL and creates an organized hierarchical structure in Google Sheets. It follows the website's sitemap to discover all pages, then organizes them by navigation levels (Level 1, Level 2, etc.) with proper parent-child relationships. The output can be further processed to create visual tree diagrams and mind maps. How it works The workflow follows a five-step automation process: URL Input: Accepts website URL via chat interface Site Crawling: Uses Firecrawl to discover all pages following the website's sitemap only Success Validation: Checks if crawling was successful (some sites block external crawlers) Hierarchical Organization: Processes URLs into a structured tree with proper level relationships Google Sheets Export: Creates a formatted spreadsheet with the complete site architecture The system respects robots.txt and follows only sitemap-declared pages to ensure ethical crawling. 
Requirements Firecrawl API key (for website crawling and sitemap discovery) Google Sheets access Google Drive access (for template duplication) How to set up Step 1: Prepare your template (recommended) It's recommended to create your own copy of the base template: Access the base Google Sheets template Make a copy for your personal use Update the workflow's "Copy template" node with your template's file ID (replace the default ID: 12lV4HwgudgzPPGXKNesIEExbFg09Tuu9gyC_jSS1HjI) This ensures you have control over the template formatting and can customize it as needed Step 2: Configure API credentials Set up the following credentials in n8n: Firecrawl API: For crawling websites and discovering sitemaps Google Sheets OAuth2: For creating and updating spreadsheets Google Drive OAuth2: For duplicating the template file Step 3: Configure Firecrawl settings (optional) The workflow uses optimized Firecrawl settings: ignoreSitemap: false - Respects the website's sitemap sitemapOnly: true - Only crawls URLs listed in sitemap files These settings ensure ethical crawling and faster processing Step 4: Access the workflow The workflow uses a chat trigger interface - no manual configuration needed Simply provide the website URL you want to analyze when prompted How to use the workflow Basic usage Start the chat: Access the workflow via the chat interface Provide URL: Enter the website URL you want to analyze (e.g., "https://example.com") Wait for processing: The system will crawl, organize, and export the data Receive your results: Get an automatic direct clickable link to your generated Google Sheets - no need to search for the file Error handling Invalid URLs: If the provided URL is invalid or the website blocks crawling, you'll receive an immediate error message Graceful failure: The workflow stops without creating unnecessary files when errors occur Common causes: Incorrect URL format, robots.txt restrictions, or site security settings File organization Automatic naming: 
Generated files follow the pattern "[Website URL] - n8n - Arborescence" Google Drive storage: Files are automatically organized in your Google Drive Instant access: Direct link provided immediately upon completion Advanced processing for visual diagrams Step 1: Copy sitemap data Once your Google Sheets is ready: Copy all the hierarchical data from the generated spreadsheet Prepare it for AI processing Step 2: Generate ASCII tree structure Use any AI model with this prompt: Create a hierarchical tree structure from the following website sitemap data. Return ONLY the tree structure using ASCII tree formatting with ├── and └── characters. Do not include any explanations, comments, or additional text - just the pure tree structure. The tree should start with the root domain and show all pages organized by their hierarchical levels. Use proper indentation to show parent-child relationships. Here is the sitemap data: [PASTE THE SITEMAP DATA HERE] Requirements: Use ASCII tree characters (├── └── │) Show clear hierarchical relationships Include all pages from the sitemap Return ONLY the tree structure, no other text Start with the root domain as the top level Step 3: Create visual mind map Visit the Whimsical Diagrams GPT Request a mind map creation using your ASCII tree structure Get a professional visual representation of your website architecture Results interpretation Google Sheets output structure The generated spreadsheet contains: Niv 0 to Niv 5: Hierarchical levels (0 = homepage, 1-5 = navigation depth) URL column: Complete URLs for reference Hyperlinked structure: Clickable links organized by hierarchy Multi-domain support: Handles subdomains and different domain structures Data organization features Automatic sorting: Pages organized by navigation depth and alphabetical order Parent-child relationships: Clear hierarchical structure maintained Domain separation: Main domains and subdomains processed separately Clean formatting: URLs decoded and formatted for
readability

Workflow limitations
Sitemap dependency: Only discovers pages listed in the website's sitemap. Crawling restrictions: Some websites may block external crawlers. Level depth: Limited to 5 hierarchical levels for clarity. Rate limits: Respects Firecrawl API limitations. Template dependency: Requires access to the base template for duplication.

Use cases
SEO audits: Analyze site structure for optimization opportunities. UX research: Understand navigation patterns and user paths. Content strategy: Identify content gaps and organizational issues. Site migrations: Document existing structure before redesigns. Competitive analysis: Study competitor site architectures. Client presentations: Create visual site maps for stakeholder reviews.
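As an alternative to the AI prompt in the "Advanced processing" section, the ASCII tree can also be built deterministically from the URL list. A small sketch (not part of the workflow) using Node's built-in WHATWG URL parser:

```javascript
// Build a ├──/└── tree from a flat list of URLs, grouped by hostname
// and split on path segments (mirrors the sheet's Niv 0..Niv 5 levels).
function buildTree(urls) {
  const root = {};
  for (const url of urls) {
    const { hostname, pathname } = new URL(url);
    let node = (root[hostname] = root[hostname] || {});
    for (const part of pathname.split('/').filter(Boolean)) {
      node = node[part] = node[part] || {};
    }
  }
  const lines = [];
  const walk = (node, prefix) => {
    const keys = Object.keys(node);
    keys.forEach((key, i) => {
      const last = i === keys.length - 1;
      lines.push(prefix + (last ? '└── ' : '├── ') + key);
      walk(node[key], prefix + (last ? '    ' : '│   '));
    });
  };
  walk(root, '');
  return lines.join('\n');
}
```

The AI-prompt route stays useful when you want the model to merge or label branches; this sketch is for the purely mechanical case.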
by Sona Labs
Automatically enrich company records with comprehensive firmographic data by pulling domains from Google Sheets, setting up custom HubSpot fields, enriching through the Sona API, and syncing complete profiles to HubSpot CRM with custom property mapping. Import company domains from a Google Sheet, configure custom HubSpot fields for Sona data, automatically enrich domains with detailed firmographic intelligence, and create fully populated company records in HubSpot, so you can build rich prospect databases without manual research.

How it works
Step 1: Get Company List - Reads company domains from your Google Sheet. Aggregates all domains into a single array. Prepares data for batch processing.
Step 2: Setup HubSpot Fields - Creates custom Sona fields in HubSpot CRM. Defines all enrichment data fields needed. Ensures proper field mapping for incoming data.
Step 3: Prepare for Processing - Converts aggregated domains into individual items. Sets data for batch loop processing. Readies each company for enrichment.
Step 4: Enrich & Sync to HubSpot - Loops through each company domain. Calls the Sona API for enrichment data. Creates the company in HubSpot with standard fields. Formats and updates custom Sona properties. Combines firmographics + tech data in one profile. Includes a 2-second wait between operations for rate limiting.

What you'll get
The workflow enriches each company record with:
**Firmographic Data**: Company size, employee count, revenue estimates, headquarters location, and founding year
**Contact Information**: Phone numbers, social media profiles, and timezone details
**Business Intelligence**: Company descriptions and industry positioning
**Custom HubSpot Properties**: All Sona data mapped to dedicated custom fields
**Organized CRM Records**: All data automatically synced to HubSpot for immediate use
**Domain Tracking**: Companies linked to their websites for future reference

Why use this
**Eliminate manual research**: Save 10-15 minutes per company by automating firmographic lookups
**Build rich
databases**: Transform basic domain lists into comprehensive company profiles
**Custom field management**: Automatically creates and populates HubSpot custom properties
**Improve targeting**: Segment and prioritize accounts based on size, location, and other firmographics
**Keep data current**: Run scheduled enrichments to maintain up-to-date company information
**Scale your prospecting**: Process hundreds of companies in minutes instead of days
**Better lead qualification**: Make informed decisions with complete company intelligence
**Streamlined workflow**: One-click enrichment from spreadsheet to CRM with custom field setup

Setup instructions
Before you start, you'll need:
Google Sheets with a column named website_Domain containing company domains (e.g., example.com)
HubSpot Account & App Token - Get an app token by creating a legacy app: Go to HubSpot Settings → Integrations → Legacy Apps. Click Create Legacy App. Select Private (for one account). In the scopes section, enable the following permissions: crm.schemas.companies.write, crm.objects.companies.write, crm.schemas.companies.read, crm.objects.companies.read. Click Create, then copy the access token from the Auth tab.
Sona API Key (for company enrichment) - Sign up at https://app.sonalabs.com. A free tier is available for testing.

Configuration steps:
Prepare your data: Create a Google Sheet with a "website_Domain" column and add 2-3 test companies (e.g., example.com, anthropic.com)
Connect Google Sheets: In the "Get Company List from Sheet" node, authenticate with Google and select your spreadsheet and sheet name
Configure HubSpot field creation: In the "Create Custom HubSpot Fields" node (Step 2), authenticate with your HubSpot access token and review the custom Sona fields that will be created
Add Sona credentials: In the "Sona Enrich" node, authenticate with your Sona API key
Connect HubSpot for company creation: In the "Create HubSpot Company" and "Update Company with AI Data" nodes, authenticate using your HubSpot access token
Test
with sample data: Run the workflow with 2-3 test companies and verify that custom fields are created in HubSpot, company records appear correctly in HubSpot, and all firmographic data is populated in custom properties
Add error handling: Configure notifications for failed enrichments or API errors (optional but recommended)
Scale and automate: Process your full company list, then optionally add a Schedule Trigger for automatic daily or weekly enrichment to keep your CRM data fresh
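The shape of Step 4's loop (enrich, create, wait 2 seconds) can be sketched as follows. `enrich` and `createCompany` stand in for the Sona and HubSpot nodes, and the wait duration is parameterized here only so the sketch is testable; none of this is the workflow's literal code:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Process domains one at a time, pausing between items for rate limiting
// (the workflow uses a 2-second wait, i.e. waitMs = 2000).
async function enrichAll(domains, enrich, createCompany, waitMs = 2000) {
  const results = [];
  for (const domain of domains) {
    const profile = await enrich(domain);       // Sona API call in the workflow
    results.push(await createCompany(profile)); // HubSpot create/update
    await sleep(waitMs);                        // rate-limit buffer between items
  }
  return results;
}
```

The sequential loop (rather than firing all requests at once) is what makes the 2-second buffer meaningful for both APIs' rate limits.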
by Đỗ Thành Nguyên
Automated Facebook Page Story Video Publisher (Google Drive → Facebook → Google Sheet) > Recommended: Self-hosted via tino.vn/vps-n8n?affid=388 - use code VPSN8N for up to 39% off. This workflow is an automated solution for publishing video content from Google Drive to your Facebook Page Stories, while using Google Sheets as a posting queue manager.

What This Workflow Does (Workflow Function)
This automation orchestrates a complete multi-step process for uploading and publishing videos to Facebook Stories:
Queue Management: Every 2 hours and 30 minutes, the workflow checks a Google Sheet (Get Row Sheet node) to find the first video whose Stories column is empty, meaning it hasn't been posted yet.
Conditional Execution: An If node confirms that the video's File ID exists before proceeding.
Video Retrieval: Using the File ID, the workflow downloads the video from Google Drive (Google Drive node) and calculates its binary size (Set to the total size in bytes node).
Facebook 3-Step Upload: It performs the Facebook Graph API's three-step upload process through HTTP Request nodes:
Step 1 - Initialize Session: Starts an upload session and retrieves the upload_url and video_id.
Step 2 - Upload File: Uploads the binary video data to the provided upload_url.
Step 3 - Publish Video: Finalizes and publishes the uploaded video as a Facebook Story.
Status Update: Once completed, the workflow updates the same row in Google Sheets (Update upload status in sheet node) using the row_number to mark the video as processed.

Prerequisites (What You Need Before Running)
1. n8n Instance > Recommended: Self-hosted via tino.vn/vps-n8n?affid=388 - use code VPSN8N for up to 39% off.
2. Google Services
**Google Drive Credentials:** OAuth2 credentials for Google Drive to let n8n download video files.
**Google Sheets Credentials:** OAuth2 credentials for Google Sheets to read the posting queue and update statuses.
**Google Sheet:** A spreadsheet (ID: 1RnE5O06l7W6TLCLKkwEH5Oyl-EZ3OE-Uc3OWFbDohYI) containing: File ID - the video's unique ID in Google Drive. Stories - posting status column (leave empty for pending videos). row_number - used for updating the correct row after posting.
3. Facebook Setup
**Page ID:** Your Facebook Page ID (currently hardcoded as 115432036514099 in the info node).
**Access Token:** A **Page Access Token** with permissions such as pages_manage_posts and pages_read_engagement. This token is hardcoded in the info node and again in Step 3. Post video.

Usage Guide and Implementation Notes
How to Use
Queue Videos: Add video entries to your Google Sheet. Each entry must include a valid Google Drive File ID. Leave the Stories column empty for videos that haven't been posted.
Activate: Save and activate the workflow. The Schedule Trigger will automatically handle new uploads every 2 hours and 30 minutes.

Implementation Notes
**Token Security:** Hardcoding your **Access Token** inside the info node is **not recommended**. Tokens expire and expose your Page to risk if leaked. Action: Replace the static token with a secure Credential setup that supports token rotation.
**Loop Efficiency:** The **"false"** output of the If node currently loops back to the Get Row Sheet node. This creates unnecessary cycles if no videos are found. Action: Disconnect that branch so the workflow stops gracefully when no unposted videos remain.
**Status Updates:** To prevent re-posting the same video, the final Update upload status in sheet node must update the **Stories** column (e.g., write "POSTED"). Action: Add this mapping explicitly to your Google Sheets node.
**Automated File ID Sync:** This workflow assumes that the Google Sheet already contains valid File IDs. You can build a secondary workflow (using Schedule Trigger1 → Search files and folders → Append or update row in sheet) to automatically populate new video File IDs from your Google Drive.
Result: Once active, this workflow automatically pulls pending videos from your Google Sheet, uploads them to Facebook Stories, and marks them as posted, all without manual intervention.
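The three HTTP Request nodes can be sketched as plain request descriptors. The endpoint shapes below follow the three-step flow described above, but the Graph API version, path, and header names are assumptions; verify them against the current Facebook Graph API documentation before relying on them:

```javascript
// Hypothetical descriptors for the three upload calls (start -> upload -> finish).
function buildStoryUploadSteps(pageId, accessToken, videoId, uploadUrl) {
  return {
    // Step 1 - initialize an upload session; the response carries
    // upload_url and video_id used by the next two steps.
    start: {
      method: 'POST',
      url: `https://graph.facebook.com/v19.0/${pageId}/video_stories`, // version assumed
      body: { upload_phase: 'start', access_token: accessToken },
    },
    // Step 2 - send the raw video bytes to the upload_url from step 1.
    upload: {
      method: 'POST',
      url: uploadUrl,
      headers: { Authorization: `OAuth ${accessToken}` }, // header scheme assumed
    },
    // Step 3 - finish the session to publish the video as a Story.
    finish: {
      method: 'POST',
      url: `https://graph.facebook.com/v19.0/${pageId}/video_stories`,
      body: { upload_phase: 'finish', video_id: videoId, access_token: accessToken },
    },
  };
}
```

In n8n, each descriptor maps onto one HTTP Request node, with the Step 1 response wired into Steps 2 and 3 via expressions.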
by Joe V
AI Video Polling Engine - Long-Running Job Handler for Veo, Sora & Seedance
The async backbone that makes AI video generation production-ready.

See It In Action
Full Demo: youtu.be/OI_oJ_2F1O0

Must Read First
This is a companion workflow for the main AI Shorts Generator:
Main Workflow: AI Shorts Reactor
This workflow handles the "waiting game" so your main bot stays fast and responsive. Think of it as the backstage crew that handles the heavy lifting while your main workflow performs on stage.

The Problem This Solves
Without This Workflow:
User sends message → Bot calls AI API → Bot waits 2-5 minutes... (BLOCKED) → Timeout errors → Execution limits exceeded → Users think bot is broken → Can't handle multiple requests
With This Workflow:
User sends message → Bot calls AI API → Bot responds instantly: "Video generating..." → This webhook polls in background → Main bot handles other users → Video ready → Auto-sends to user
Result: Your bot feels instant, scales infinitely, and never times out.

What This Workflow Does
This is a dedicated polling webhook that acts as the async job handler for AI video generation. It's the invisible worker that:
1. Receives the Job
POST /webhook/poll-video { "sessionId": "user_123", "taskId": "veo_abc456", "model": "veo3", "attempt": 1 }
2. Responds Instantly
200 OK - "Polling started" (Main workflow never waits!)
3. Polls in Background
Wait 60s → Check status → Repeat. Waits 1 minute between checks (API-friendly). Polls up to 15 times (~15 minutes max). Supports the Veo, Sora, and Seedance APIs.
4. Detects Completion
Handles multiple API response formats:
// Veo format { status: "completed", videoUrl: "https://..." }
// Market format (Sora/Seedance) { job: { status: "success", result: { url: "..." } } }
// Legacy format { data: { video_url: "..."
} }
(No matter how the API responds, this workflow figures it out)
5. Delivers the Video
Once ready, it downloads the video from the AI provider, uploads it to your S3 storage, restores the user session from Redis, sends a Telegram preview with buttons, enables video extension (Veo only), and logs metadata for analytics.

Technical Architecture
The Flow:
Main Workflow: [Trigger AI Job] "Task ID: abc123" → [Return Instantly] "Generating..." → [Handle New User]
Polling Webhook: [Wait 60s] → [Check Status] "Processing..." → [Wait 60s] → [Check Status] "Completed!" → [Download Video] → [Upload to S3] → [Send to User] → "Your video is ready!"

Key Features
**Non-Blocking Architecture:** The main workflow never waits. Handle unlimited concurrent jobs. Each user gets instant responses.
**Intelligent Polling:** Respects API rate limits (60s intervals). Auto-retries on transient failures. Graceful timeout handling (15 attempts max).
**Multi-Provider Support:** Handles different API formats: **Veo** - record-info endpoint; **Sora** - Market job status; **Seedance** - Market job status.
**Robust Error Handling:** Missing video URL → retry with fallback parsers. API timeout → continue polling. Invalid response → parse alternative formats. Max attempts reached → notify user gracefully.
**Session Management:** Stores state in Redis. Restores full context when video is ready. Supports video extension workflows. Maintains user preferences.
**Production Features:** Detailed logging at each step. Metadata tracking (generation time, model used, etc.)
S3 storage integration. Telegram notifications. Analytics-ready data structure.

Integration Points
Works Seamlessly With:
| Use Case | How It Helps | |----------|--------------| | Telegram Bots | Keeps bot responsive during 2-5 min video generation | | YouTube Automation | Polls video, then triggers auto-publish | | Multi-Video Pipelines | Handles 10+ videos simultaneously | | Content Agencies | Production-grade reliability for clients | | A/B Testing | Generate multiple variations without blocking |
Required Components: Main workflow that triggers video generation. Redis for session storage. S3-compatible storage for videos. KIE.ai API credentials. Telegram Bot (for notifications).

How to Use
Step 1: Set Up Main Workflow - Import and configure the AI Shorts Reactor.
Step 2: Import This Webhook - Add this workflow to your n8n instance.
Step 3: Configure Credentials - KIE.ai API key, Redis connection, S3 storage credentials, Telegram bot token.
Step 4: Link Workflows - In your main workflow, call this webhook: // After triggering AI video generation const response = await httpRequest({ method: 'POST', url: 'YOUR_WEBHOOK_URL/poll-video', body: { sessionId: sessionId, taskId: taskId, model: 'veo3', attempt: 1 } });
Step 5: Activate & Test - Activate this polling webhook, trigger a video generation from the main workflow, and watch it poll in the background and deliver results.

Real-World Example
Scenario: User generates 3 videos simultaneously.
Without This Workflow:
User A: "Generate video" → Bot: Processing... (BLOCKED 5 min)
User B: "Generate video" → Bot: Timeout (main workflow still processing User A)
User C: "Generate video" → Bot: Never receives request
With This Workflow:
User A: "Generate video" → Bot: "Generating! Check back in 3 min" → Polling webhook handles in background
User B: "Generate video" → Bot: "Generating! Check back in 3 min" → Second polling instance starts
User C: "Generate video" → Bot: "Generating!
Check back in 3 min" → Third polling instance starts
3 minutes later:
User A: "Your video is ready!" [Preview] [Publish]
User B: "Your video is ready!" [Preview] [Publish]
User C: "Your video is ready!" [Preview] [Publish]
All three users served simultaneously with zero blocking!

Customization Options
Adjust Polling Frequency: // Default: 60 seconds // For faster testing (uses credits faster): const waitTime = 30; // seconds // For more API-friendly (slower updates): const waitTime = 90; // seconds
Change Timeout Limits: // Default: 15 attempts (15 minutes) const maxAttempts = 20; // Increase for longer videos
Add More Providers - Extend to support other AI video APIs: switch(model) { case 'veo3': // Existing Veo logic case 'runway': // Add Runway ML polling case 'pika': // Add Pika Labs polling }
Custom Notifications - Replace Telegram with: Discord webhooks, Slack messages, email notifications, SMS via Twilio, push notifications.

Monitoring & Analytics
What Gets Logged: { "sessionId": "user_123", "taskId": "veo_abc456", "model": "veo3", "status": "completed", "attempts": 7, "totalTime": "6m 32s", "videoUrl": "s3://bucket/videos/abc456.mp4", "metadata": { "duration": 5.2, "resolution": "1080x1920", "fileSize": "4.7MB" } }
Track Key Metrics: Average generation time per model. Polling attempts before completion. Failure rate by provider. Cost per video (API usage). Concurrent job capacity.

Troubleshooting
"Video never completes" → Check KIE.ai API status; verify the task ID is valid; increase maxAttempts if needed; check that the API response format hasn't changed.
"Polling stops after 1 attempt" → Ensure the webhook URL is correct; check n8n execution limits; verify the Redis connection is stable.
"Video downloads but doesn't send" → Check Telegram bot credentials; verify the S3 upload succeeded; ensure the Redis session exists; check the Telegram chat ID is valid.
"Multiple videos get mixed up" → Confirm sessionId is unique per user; check for Redis key collisions; verify
taskId is properly passed.

Architecture Benefits
Why Separate This Logic? | Aspect | Monolithic Workflow | Separated Webhook | |--------|--------------------|--------------------| | Response Time | 2-5 minutes | <1 second | | Concurrency | 1 job at a time | Unlimited | | Execution Costs | High (long-running) | Low (short bursts) | | Debugging | Hard (mixed concerns) | Easy (isolated logic) | | Scalability | Poor | Excellent | | Maintenance | Complex | Simple |

Requirements
Services Needed: n8n instance (cloud or self-hosted); KIE.ai API (Veo, Sora, Seedance access); Redis (session storage); S3-compatible storage (videos); Telegram bot (optional, for notifications).
Skills Required: Basic n8n knowledge; understanding of webhooks; Redis basics (key-value storage); S3 upload concepts.
Setup Time: ~15 minutes. Technical Level: Intermediate.

Tags
webhook polling async-jobs long-running-tasks ai-video veo sora seedance production-ready redis s3 telegram youtube-automation content-pipeline scalability microservices n8n-webhook job-queue background-worker

Best Practices
Do's: Keep the polling interval at 60s minimum (respect API limits). Always handle timeout scenarios. Log generation metadata for analytics. Use unique session IDs per user. Clean up Redis after job completion.
Don'ts: Don't poll faster than 30s (risk API bans). Don't store videos in Redis (use S3). Don't skip error handling. Don't use this for real-time updates (<10s). Don't forget to activate the webhook.

Success Stories
After Implementing This Webhook: | Metric | Before | After | |--------|--------|-------| | Bot response time | 2-5 min | <1 sec | | Concurrent videos | 1 | 50+ | | Timeout errors | 30% | 0% | | User satisfaction | 6/10 | 9.5/10 | | Execution costs | $50/mo | $12/mo |

Related Workflows
Main: AI Shorts Reactor - The full video generation bot. YouTube Auto-Publisher - Publish completed videos. Video Style Presets - Custom prompt templates. Analytics
Dashboard - Track all generations.

License
MIT License - Free to use, modify, and distribute!

Make your AI video workflows production-ready. Let the webhook handle the waiting.

Created by Joe Venner | Built with ❤️ and n8n | Part of the AI Shorts Reactor ecosystem
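The two mechanisms at the heart of this webhook, the multi-format completion check and the wait/check/repeat loop, can be sketched together. `checkStatus` and `waitMs` are injected here only so the sketch is testable; in n8n these correspond to a Wait node plus HTTP Request and Code nodes:

```javascript
// Completion check across the three response shapes documented above.
function extractVideoUrl(response) {
  if (response.status === 'completed' && response.videoUrl) {
    return response.videoUrl;               // Veo format
  }
  if (response.job && response.job.status === 'success') {
    return response.job.result.url;         // Market format (Sora/Seedance)
  }
  if (response.data && response.data.video_url) {
    return response.data.video_url;         // legacy format
  }
  return null;                              // still processing -> keep polling
}

const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Wait -> check -> repeat, up to maxAttempts (defaults mirror the
// 60s interval / 15 attempts described above).
async function pollUntilDone(checkStatus, { maxAttempts = 15, waitMs = 60000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    await wait(waitMs);
    const videoUrl = extractVideoUrl(await checkStatus(attempt));
    if (videoUrl) return { videoUrl, attempts: attempt };
  }
  return { videoUrl: null, attempts: maxAttempts }; // timed out -> notify the user gracefully
}
```

Because each invocation carries its own sessionId and taskId, many of these loops can run concurrently, which is what gives the main bot its non-blocking behavior.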
by WeblineIndia
Zoho CRM Sales Cycle Performance Analyzer & Improver This workflow automatically analyzes your Zoho CRM deal cycles with AI-powered intelligence, compares them against historical performance data from Google Sheets, and delivers actionable insights to Slack. It identifies bottlenecks, predicts outcomes, analyzes sentiment, generates smart recommendations, creates data visualizations, and builds a historical dataset for future intelligence, all without manual reporting.

Quick Implementation Steps
Connect Accounts: Set up credentials for Zoho CRM, Google Sheets, Slack, and OpenAI in n8n.
Prepare Sheet: Create a Google Sheet with headers: Deal_Name, Stage, Created_Time, Closed_Time (or Modified_Time).
Configure Nodes: Zoho Trigger: Ensure it pulls your deals. Google Sheets: Link your "Historical Data" sheet to both the "Fetch" and "Log" nodes. OpenAI Nodes: Configure your OpenAI API key for AI analysis. Slack: Select your #sales-insights channel.
Activate: Turn on the workflow to start receiving AI-enhanced real-time insights on deal closure.

What It Does
This n8n workflow serves as an AI-powered automated data analyst for your sales team. Whenever a deal is fetched from Zoho CRM, the workflow first filters for relevance (e.g., recently closed or modified deals). It then cross-references this specific deal against your historical sales data stored in Google Sheets to calculate key performance metrics like "Days to Close" and "Stage Dwell Time."
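The cycle-time math just described can be sketched in a few lines. This is an assumed implementation of what the "Analyze Cycle" Code node computes, not the template's actual code, using the sheet's Created_Time/Closed_Time fields and the 1.25x "slow" threshold mentioned later in the customization section:

```javascript
// Days to close, comparison to the historical average, and a speed flag.
function analyzeCycle(deal, historicalAvgDays) {
  const MS_PER_DAY = 86400000;
  const totalDays = Math.round(
    (new Date(deal.Closed_Time) - new Date(deal.Created_Time)) / MS_PER_DAY
  );
  const ratio = totalDays / historicalAvgDays;
  return {
    totalDays,
    pctVsAverage: Math.round((ratio - 1) * 100), // +25 means 25% slower than average
    flag: ratio > 1.25 ? 'slow' : ratio < 0.75 ? 'fast' : 'typical', // 0.75 cutoff assumed
  };
}
```

A deal created on January 1 and closed on February 15 against a 30-day average would come back as 45 days, 50% slower, flagged "slow", which is the kind of summary the Slack message carries.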
AI-Enhanced Features:
**Sentiment Analysis**: Analyzes deal descriptions and communications for emotional tone and risk indicators
**Predictive Analytics**: Uses historical patterns to predict win probability and expected close dates
**Smart Recommendations**: Generates AI-powered, data-driven process improvement suggestions
**Data Visualization**: Creates charts and trend analysis for performance metrics
**Performance Scoring**: Calculates comprehensive performance scores and risk levels

Beyond simple calculations, the workflow applies AI intelligence to generate human-readable insights. It determines whether a deal was faster or slower than average, identifies which stage caused delays, analyzes sentiment for risk assessment, predicts outcomes, and suggests specific process improvements based on the data. Finally, it closes the loop by broadcasting these AI-enhanced, focused insights to a Slack channel for immediate team visibility and logging the new deal's performance back into Google Sheets. This ensures your historical dataset grows richer and more accurate with every closed deal, continuously improving the quality of future AI predictions.

Who's It For
**Sales Managers**: To monitor team performance and identify coaching opportunities without digging into CRM reports.
**RevOps Professionals**: To automate the collection of cycle-time data and spot process bottlenecks.
**Small Business Owners**: To get enterprise-grade sales analytics without hiring a data analyst.
**Sales Teams**: To get immediate feedback on their wins and losses, fostering a culture of continuous improvement.

Prerequisites
**n8n Instance**: A self-hosted or cloud version of n8n.
**Zoho CRM Account**: With permission to read Deals.
**Google Account**: Access to Google Sheets.
**Slack Workspace**: Permission to post messages to channels.
**OpenAI Account**: API access for GPT-4 model integration.
**Google Sheet**: A formatted sheet to store and retrieve historical deal data.

How to Use & Setup
1.
Google Sheet Setup
Create a new Google Sheet. In the first row, add the following headers (the workflow tries to match various case formats, but these are recommended): Deal_Name, Stage, Created_Time, Closed_Time, Stage_History (optional, for advanced dwell-time analysis).
2. Configure Credentials
In your n8n dashboard, ensure you have authenticated: Zoho CRM, Google Sheets, Slack, and **OpenAI** (for AI-powered analysis).
3. Node Configuration
**Zoho CRM - Deal Trigger**: This node is set to "Get All" deals. You might want to adjust this to a Trigger node that listens for "Deal Updated" or "Deal Created" events for real-time automation, or keep it as a scheduled poll.
**Filter Recent Deals (Code Node)**: Currently configured to process deals closed in the last 7 days and limit to 10 items. No changes needed unless you want to process larger batches.
**Fetch Historical Averages (Google Sheets)**: Select your Credential. Resource: Document -> select your prepared Sheet. Operation: Get Many ("GetAll" or "Read"). Return All: True.
**AI Sentiment Analysis (OpenAI)**: Select your OpenAI Credential. Model: GPT-4 (recommended for best results). Automatically analyzes deal sentiment and emotional tone.
**AI Predictive Analytics (OpenAI)**: Uses historical data to predict outcomes and win probabilities. Provides risk assessment and expected close dates.
**AI Smart Recommendations (OpenAI)**: Generates intelligent, context-aware recommendations. Prioritizes suggestions based on impact and feasibility.
**Advanced Data Visualization**: Creates charts for cycle trends, stage distribution, and performance metrics. Generates data for visual analysis and reporting.
**Slack Notification**: Select your Credential. Channel: enter the name of your channel (e.g., #sales-insights). Now includes AI-enhanced insights in the message format.
**Log to Historical Sheet (Google Sheets)**: Select your Credential. Resource: Document -> select the same sheet as above. Operation: Append.
4.
**4. Running the Workflow**

- **Test**: Click "Execute Workflow" manually to test with the "Zoho CRM - Deal Trigger" (conceptually acting as a manual fetch here).
- **Production**: Switch the trigger to a proper **Schedule Trigger** (e.g., run every morning) or a Zoho CRM Trigger (real-time) to automate the process.

**How To Customize Nodes**

**Adjusting the Risk/Insight Logic**

The core intelligence lives in the Analyze Cycle code node. You can modify the JavaScript here to change thresholds:

- **Change the "Slow" Threshold**: Look for `if (totalDays > avgDays * 1.25)`. Change `1.25` to `1.5` to only flag deals that are 50% slower than average.
- **Custom Suggestions**: Add new `if` statements in the `// Process improvement suggestions` section to add your own coaching advice based on specific stages or owners.

**Customizing AI Prompts**

The AI nodes use specific prompts that can be customized:

- **AI Sentiment Analysis**: Modify the prompt in the OpenAI node to focus on specific aspects (e.g., competitor mentions, pricing concerns).
- **AI Predictive Analytics**: Adjust the prediction criteria or add custom factors relevant to your business.
- **AI Smart Recommendations**: Customize the recommendation style or focus on specific business objectives.

**Changing the Output Format**

The Slack Notification node uses a template. You can customize the message layout by editing the Text field. You can use standard Slack markdown (e.g., bold, italics) and add variables from specific fields in your CRM (like "Lead Source" or "Competitor").

**AI Model Configuration**

- **Model Selection**: Change from GPT-4 to GPT-3.5-turbo for faster processing (slightly less accurate).
- **Temperature Adjustment**: Modify the creativity level of AI responses (0.0 = deterministic, 1.0 = highly creative).
- **Token Limits**: Adjust response length for more detailed or concise AI outputs.

**Add-ons**

To extend the functionality of this workflow, consider adding:

- **Weekly Report Email**: Add an "Email" node at the end to send a summary digest to the CEO every Friday.
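The threshold logic in the Analyze Cycle node might be shaped like this (a hedged sketch: only the `totalDays > avgDays * 1.25` comparison and the `// Process improvement suggestions` marker come from the docs above; the "fast" threshold of `0.75`, the function name, and the return shape are illustrative assumptions):

```javascript
// Sketch of the Analyze Cycle threshold logic. Raise SLOW_MULTIPLIER
// to 1.5 to only flag deals that are 50% slower than average.
const SLOW_MULTIPLIER = 1.25;
const FAST_MULTIPLIER = 0.75; // assumed "fast" cutoff, not from the docs

function classifyCycle(totalDays, avgDays) {
  const suggestions = [];
  let speed = "average";

  if (totalDays > avgDays * SLOW_MULTIPLIER) {
    speed = "slow";
    // Process improvement suggestions — add your own `if` statements here
    suggestions.push("Review stage dwell times to find the bottleneck.");
  } else if (totalDays < avgDays * FAST_MULTIPLIER) {
    speed = "fast";
    suggestions.push("Ask the rep to share what worked.");
  }

  // Positive = faster than average, negative = slower
  const pctDiff = Math.floor(((avgDays - totalDays) / avgDays) * 100);
  return { speed, pctDiff, suggestions };
}
```

With a 15-day deal against a 45-day average, this classifies the deal as "fast" and reports it as 66% faster than average, matching the efficiency example later in this document.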
- **Manager Alert**: Add an IF node before Slack to tag the Sales Manager (@user) only if totalDays exceeds 60 days or the AI risk level is "High".
- **CRM Update**: Write the calculated "Days to Close" and AI predictions back into custom fields in Zoho CRM so you can report on them directly inside Zoho.
- **Dashboard Integration**: Send visualization data to tools like Grafana or Power BI for real-time dashboards.
- **Competitor Analysis**: Add an AI node to analyze deal descriptions for competitor mentions and market trends.

**Use Case Examples**

**1. Post-Mortem on Lost Deals**

When a deal is marked "Closed Lost," the workflow calculates how long it sat in each stage. AI sentiment analysis detects negative communication patterns, and the Slack alert highlights the bottleneck, prompting a review of the negotiation strategy.

**2. Celebrating Efficiency**

A deal closes in 15 days when the average is 45. The workflow identifies this anomaly, calculates that it is "66% faster than average," AI predicts high success factors, and posts a celebratory message asking the rep to share what worked.

**3. Reviewing Stalled Deals**

By changing the trigger to look for open deals, you can use this logic to flag active deals that have already exceeded the average winning cycle time, signaling they are "at risk." AI predictive analytics provides a win probability for each stalled deal.

**4. Onboarding Usage**

New sales reps get immediate feedback on their deals compared to the company's historical average, helping them calibrate their pace without constant manager intervention. AI recommendations provide personalized coaching tips.

**5. Product/Service-Specific Analysis**

Duplicate the workflow and filter by "Product Type" in the Code node. Maintain separate Google Sheets for "Enterprise" vs. "SMB" deal cycles to get more accurate baselines for different business lines. AI sentiment analysis can identify product-specific communication patterns.
**6. AI-Enhanced Deal Scoring**

NEW: The workflow now provides AI-powered deal scoring, sentiment-based risk assessment, and predictive win probabilities, enabling sales teams to prioritize high-potential deals and focus resources effectively.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
| :--- | :--- | :--- |
| No insights generated | Google Sheet is empty or headers don't match. | Ensure your Google Sheet has at least one row of valid historical data with matching headers (Created_Time, Closed_Time). |
| "Invalid Date" errors | Date formats in Zoho or Sheets are inconsistent. | Check that your system regional settings match. The Code node expects standard date strings. |
| Slack message is empty | Deal_Name or other required data is missing. | The "Check Valid Data" node filters out incomplete records. Ensure your test deals have a name and timestamps. |
| Workflow times out | Too many deals being processed. | The "Filter Recent Deals" node limits to 10 items. If you remove this limit, n8n may time out on large datasets. Keep the batch size small. |
| Google Sheets error | Authentication or Sheet ID missing. | Re-authenticate your Google account and re-select the Document and Sheet from the list in the node settings. |
| AI nodes not working | OpenAI API key missing or invalid. | Configure your OpenAI credentials in n8n settings and ensure the API key has sufficient credits. |
| AI responses too slow | Using GPT-4 with large datasets. | Switch to GPT-3.5-turbo for faster processing, or reduce the amount of data sent to AI nodes. |
| Sentiment analysis inaccurate | Limited deal description data. | Ensure your Zoho deals have meaningful descriptions and communication logs for better sentiment analysis. |
| Predictions seem wrong | Insufficient historical data. | AI predictions improve with more historical data; aim for at least 50 historical deals for accurate predictions. |

**Need Help?**

Setting up custom analytics or complex logic in Code nodes can be tricky.
If you need help tailoring this workflow to your specific business rules, creating advanced add-ons, or integrating with other CRMs, **contact WeblineIndia**. We specialize in building robust business process automation solutions. Whether you need a simple tweak or a fully custom enterprise automation suite, our experts are ready to assist. Reach out to us today to unlock the full potential of your sales data!