by Cheng Siong Chin
**Introduction**
This workflow automates flight deal discovery and intelligent analysis for travel bloggers and deal hunters. It scrapes live pricing, enriches it with weather data, applies AI evaluation, and auto-publishes to WordPress, eliminating manual research and accelerating content delivery.

**How It Works**
A user submits a route via a form; the workflow scrapes real-time flight prices and weather data, the AI analyzes deal quality considering weather conditions, the results are formatted and published to WordPress, and a Slack notification is sent, fully automated from input to publication.

**Workflow Template**
Form Input → Extract Data → Scrape Flight Prices → Extract Pricing → Fetch Weather → Parse Weather → Prepare AI Input → AI Analysis → Parse Output → Format Results → Publish WordPress → Slack Alert → User Response

**Setup Instructions**
- Form Setup: Configure user input fields for flight routes and preferences
- APIs: Connect the Google Flights scraping endpoint, weather API credentials, and OpenAI/Chat Model API key
- Publishing: Set WordPress credentials, target blog category, and Slack webhook URL
- AI Configuration: Define analysis prompts, output structure, and parser rules

**Workflow Steps**
- Data Collection: The form captures the route, scrapes Google Flights pricing, and fetches destination weather via API
- AI Processing: Enriches flight data with weather context and analyzes deal quality using the OpenAI/Chat Model with structured output parsing (see the sketch at the end of this section)
- Publishing: Formats the analysis results, creates the WordPress post, sends a Slack notification, and delivers the response to the user

**Prerequisites**
n8n instance, Google Flights access, weather API key, OpenAI/compatible AI service, WordPress site with API access, Slack workspace

**Use Cases**
Travel blog automation, flight deal newsletters, price comparison services, seasonal travel planning, destination weather analysis, automated social media content

**Customization**
Modify AI analysis criteria, adjust weather impact weighting, customize WordPress post templates, add email distribution, integrate additional data sources, expand to hotel/rental deals

**Benefits**
Eliminates manual price checking, combines multiple data sources automatically, delivers AI-enhanced insights, accelerates the publishing workflow, scales across unlimited routes, provides weather-aware recommendations
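As referenced in the Workflow Steps above, here is a minimal sketch of what the Prepare AI Input step could look like as an n8n Code node, merging the scraped pricing with the parsed weather before handing a prompt to the AI Analysis node. The node names and field names (price, airline, forecast, tempC) are illustrative assumptions, not the template's actual schema.

```javascript
// Prepare AI Input (n8n Code node) - illustrative sketch only.
// Field and node names are assumptions; adapt them to what the scraping and
// weather nodes in your copy of the workflow actually return.
const flights = $('Extract Pricing').all().map(i => i.json);
const weather = $('Parse Weather').first().json;

if (!flights.length) return [];

// Pick the cheapest fare as the candidate deal
const cheapest = flights.reduce((a, b) => (a.price < b.price ? a : b));

return [{
  json: {
    route: `${cheapest.origin} -> ${cheapest.destination}`,
    price: cheapest.price,
    airline: cheapest.airline,
    weatherSummary: `${weather.forecast}, ${weather.tempC} C at destination`,
    // Single prompt string handed to the AI Analysis node
    prompt:
      `Rate this flight deal from 1-10, considering price and destination weather:\n` +
      `${cheapest.airline} ${cheapest.origin}-${cheapest.destination} at ` +
      `${cheapest.price} ${cheapest.currency || 'USD'}; weather: ${weather.forecast}.`,
  },
}];
```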
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Create and Auto-Post Viral AI Videos with VEO3 and Blotato to 9 Platforms

**Who is this for?**
This template is ideal for content creators, growth marketers, e-commerce entrepreneurs, and video-first brands who want to automate the creation and multi-platform distribution of viral short-form ads using AI. If you're looking to scale video production without editing tools or posting manually, this is for you.

**What problem is this workflow solving?**
Creating high-converting video content is time-consuming. You need to:
- Come up with ideas
- Write compelling scripts
- Generate visuals
- Adapt content for each platform
- Manually publish and track results

This workflow automates that entire process and turns a single idea into a ready-to-publish video campaign across 9 platforms.

**What this workflow does**
- Triggers via Telegram when a new video idea is submitted (see the sketch at the end of this section)
- Fetches parameters (style, tone, duration) from Google Sheets
- Generates the video script using GPT-4 and a master AI prompt
- Creates the video using Google's VEO3 video generation model
- Downloads the final video once rendering is complete
- Rewrites the caption with GPT-4o for platform-optimized posting
- Logs the result in Google Sheets for tracking
- Sends preview links to Telegram for review
- Auto-posts the video to 9 platforms using Blotato (TikTok, YouTube, Instagram, Threads, Facebook, X, LinkedIn, Pinterest, Bluesky)

**Setup**
- Install n8n (self-hosted) with Community Nodes enabled
- Connect your Telegram Bot Token to the trigger node
- Add your OpenAI API key for the GPT-4 and GPT-4o models
- Configure your VEO3 API access (Google AI Studio)
- Set up Blotato with your platform tokens & IDs
- Link your Google Sheets and set the expected column structure (idea, style, caption, etc.)
- Adjust the Telegram trigger format (e.g., idea: ...) to your team's input style

**How to customize this workflow to your needs**
- Edit the master prompt to match your brand voice or industry
- Replace the caption prompt to generate more marketing-style hooks
- Modify the platform list if you only publish to a few specific channels
- Add approval steps (Slack, email, Telegram) before publishing
- Integrate tracking by pushing published URLs to your analytics or CRM

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: Linkedin / Youtube
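A minimal sketch of how the Telegram trigger input mentioned in Setup (the idea: ... format) could be parsed in a Code node before the script-generation step. The regex, field names, and fallback routing are assumptions for illustration, not the template's exact implementation.

```javascript
// Parse the incoming Telegram message (Code node after the Telegram Trigger).
// The "idea: ..." prefix matches the input style mentioned in Setup; the rest
// of this logic is an illustrative assumption.
const text = ($json.message?.text || '').trim();
const match = text.match(/^idea:\s*(.+)$/is);

if (!match) {
  // Not a video idea; an IF node downstream could route this to a help reply.
  return [{ json: { isIdea: false, raw: text } }];
}

return [{
  json: {
    isIdea: true,
    idea: match[1].trim(),
    chatId: $json.message.chat.id, // used later to send preview links back
  },
}];
```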
by Khairul Muhtadin
Decodo Amazon Product Recommender delivers instant, AI-powered shopping recommendations directly through Telegram. Send any product name and receive an Amazon product analysis featuring price comparisons, ratings, sales data, and categorized recommendations (budget, premium, best value) in under 40 seconds, eliminating hours of manual research.

**Why Use This Workflow?**
- Time Savings: Reduce product research from 45+ minutes to under 30 seconds
- Decision Quality: Compare 20+ products automatically with AI-curated recommendations
- Zero Manual Work: Complete automation from message input to formatted recommendations

**Ideal For**
- **E-commerce Entrepreneurs:** Quickly research competitor products, pricing strategies, and market trends for inventory decisions
- **Smart Shoppers & Deal Hunters:** Get instant product comparisons with sales volume data and discount tracking before purchasing
- **Product Managers & Researchers:** Analyze Amazon marketplace positioning, customer sentiment, and pricing ranges for competitive intelligence

**How It Works**
- Trigger: User sends a product name via Telegram (e.g., "iPhone 15 Pro Max case")
- AI Validation: Gemini 2.5 Flash extracts core product keywords and validates input authenticity
- Data Collection: Decodo API scrapes Amazon search results, extracting prices, ratings, reviews, sales volume, and product URLs
- Processing: A JavaScript node cleans the data, removes duplicates, calculates value scores, and categorizes products (top picks, budget, premium, best value, most popular); see the sketch at the end of this section
- Intelligence Layer: AI generates personalized recommendations with Telegram-optimized markdown formatting, shortened product names, and clean Amazon URLs
- Output & Delivery: Formatted recommendations are sent to the user with categorized options and direct purchase links
- Error Handling: Admin notifications via a separate Telegram channel for workflow monitoring

**Setup Guide**

Prerequisites

| Requirement | Type | Purpose |
|-------------|------|---------|
| n8n instance | Essential | Workflow execution platform |
| Decodo Account | Essential | Amazon product data scraping |
| Telegram Bot Token | Essential | Chat interface for user interactions |
| Google Gemini API | Essential | AI-powered product validation and recommendations |
| Telegram Account | Optional | Admin error notifications |

Installation Steps
1. Import the JSON file to your n8n instance
2. Configure credentials:
   - Decodo API: Sign up at decodo.com → Dashboard → Scraping APIs → Web Advanced → Copy BASIC AUTH TOKEN
   - Telegram Bot: Message @BotFather on Telegram → /newbot → Copy HTTP API token (format: 123456789:ABCdefGHI...)
   - Google Gemini: Obtain an API key from Google AI Studio for the Gemini 2.5 Flash model
3. Update environment-specific values:
   - Replace YOUR-CHAT-ID in the "Notify Admin" node with your Telegram chat ID for error notifications
   - Verify Telegram webhook IDs are properly configured
4. Customize settings:
   - Adjust the AI prompt in the "Generate Recommendations" node for different output formats
   - Set character limits (default: 2500) for Telegram message length
5. Test execution:
   - Send a test message to your Telegram bot: "iPhone 15 Pro"
   - Verify processing status messages appear
   - Confirm recommendations arrive with properly formatted links

**Customization Options**

Basic Adjustments:
- **Character Limit:** Modify 2500 in the AI prompt to adjust response length (Telegram max: 4096)

Advanced Enhancements:
- **Multi-language Support:** Add language detection and translation nodes for international users
- **Price Tracking:** Integrate Google Sheets to log historical prices and trigger alerts on drops
- **Image Support:** Enable Telegram photo messages with product images from scraping results

**Troubleshooting**

Common Issues:

| Problem | Cause | Solution |
|---------|-------|----------|
| "No product detected" for valid inputs | AI validation too strict or ambiguous query | Add specific product details (model number, brand) in user input |
| Empty recommendations returned | Decodo API rate limit or Amazon blocking | Wait 60 seconds between requests; verify Decodo account status |
| Telegram message formatting broken | Special characters in product names | Ensure Telegram markdown mode is set to "Markdown" (legacy), not "MarkdownV2" |

**Use Case Examples**

Scenario 1: E-commerce Store Owner
- Challenge: Needs to quickly assess competitor pricing and product positioning for new inventory decisions without spending hours browsing Amazon
- Solution: Sends "wireless earbuds" to the bot, receives a categorized analysis of 20+ products with price ranges ($15-$250), top sellers, and discount opportunities
- Result: Identifies a $35-$50 price gap in the market, sources a comparable product, achieves a 40% profit margin

Scenario 2: Smart Shopping Enthusiast
- Challenge: Wants to buy a laptop backpack but is overwhelmed by 200+ Amazon options with varying prices and unclear value propositions
- Solution: Messages "laptop backpack" to the bot, gets AI recommendations sorted by budget ($30), premium ($50+), best value (highest discount + good ratings), and most popular (by sales volume)
- Result: Purchases the "Best Value" recommendation with a 35% discount, saving $18 and 45 minutes of research time

Created by: Khaisa Studio
Category: AI | Productivity | E-commerce
Tags: amazon, telegram, ai, product-research, shopping, automation, gemini
Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
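For reference, a minimal sketch of the Processing step from How It Works above (deduplication, value scoring, and categorization) as an n8n Code node. The input field names (asin, price, rating, reviewCount, salesVolume) are assumptions about the scraped data rather than Decodo's documented schema, and the scoring formula is only an example.

```javascript
// Processing step (n8n Code node) - illustrative sketch of dedupe, value score,
// and categorization. Field names are assumptions; adapt them to the real data.
const products = $input.all().map(i => i.json);

// 1. Remove duplicates by ASIN / URL
const seen = new Set();
const unique = products.filter(p => {
  const key = p.asin || p.url;
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
});

// 2. Simple value score: rating and review volume up, price down
const scored = unique.map(p => ({
  ...p,
  valueScore: (p.rating || 0) * Math.log10((p.reviewCount || 0) + 1) / ((p.price || 1) / 10),
}));

// 3. Categorize into the buckets described above
const byPrice = [...scored].sort((a, b) => a.price - b.price);
const result = {
  topPicks: [...scored].sort((a, b) => b.valueScore - a.valueScore).slice(0, 5),
  budget: byPrice.slice(0, 3),
  premium: byPrice.slice(-3).reverse(),
  mostPopular: [...scored].sort((a, b) => (b.salesVolume || 0) - (a.salesVolume || 0)).slice(0, 3),
};

return [{ json: result }];
```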
by Meak
LinkedIn Job-Based Cold Email System

Most outreach tools rely on generic lead lists and recycled contact data. This workflow builds a live, personalized lead engine that scrapes new LinkedIn job posts, finds company decision-maker emails, and generates custom cold emails using GPT, all fully automated through n8n.

**Benefits**
- Automated daily scraping of "Marketing Manager" jobs in Belgium
- Real-time leads from companies currently hiring for marketing roles
- Filters out HR and staffing agencies to keep only real businesses
- Enriches each company with verified CEO, Sales, and Marketing emails
- Generates unique, human-like cold emails and subject lines with GPT-4o
- Saves clean data to Google Sheets and drafts personalized Gmail messages

**How It Works**
1. Schedule Trigger runs every morning at 08:00.
2. Apify LinkedIn Scraper collects new "Marketing Manager" jobs in Belgium.
3. Remove Duplicates ensures each company appears only once.
4. Filter Staffing excludes recruiters, HR agencies, and interim firms.
5. Save Useful Infos extracts core company data: name, domain, size, description.
6. Filter Domain & Size keeps valid websites and companies under 100 employees.
7. Anymailfinder API looks up CEO, Sales, and Marketing decision-maker emails.
8. Merge + If Node validates email results and removes invalid entries.
9. Split Out + Deduplicate ensures unique, verified contacts.
10. Extract Lead Name (Code Node) separates first and last names (see the sketch at the end of this section).
11. Google Sheets Node appends all enriched lead data to your master sheet.
12. GPT-4o (LangChain) writes a 100–120 word personalized cold email.
13. GPT-4o (LangChain) creates a short, casual subject line.
14. Gmail Draft Node builds a ready-to-send email using both outputs.
15. Wait Node loops until all leads are processed.

**Who Is This For**
- B2B agencies targeting Belgian SMEs
- Outbound marketers using job postings as purchase intent signals
- Freelancers or founders running lean, automated outreach systems
- Growth teams building scalable cold email engines

**Setup**
- **Apify:** use the curious_coder~linkedin-jobs-scraper actor + API token
- **Anymailfinder:** header auth with decision-maker categories (ceo, sales, marketing)
- **Google Sheets:** connect a sheet named "LinkedIn Job Scraper" and map columns
- **OpenAI (GPT-4o):** insert your API key into both LangChain nodes
- **Gmail:** OAuth2 connection; resource set to draft
- **n8n:** store all credentials securely; set HTTP nodes to continue on error

**ROI & Results**
- Save 1-3 hours per day on manual research and outreach prep
- Contact active hiring companies when they need marketing help most
- Scale to multiple industries or regions by changing search URLs
- Outperform paid lead databases with fresh, verified data

**Strategy Insights**
- Add funding or tech-stack data for better lead scoring
- A/B test GPT subject lines and log open rates in Sheets
- Schedule GPT follow-ups 3 and 7 days later for full automation
- Push all enriched data to your CRM for advanced segmentation
- Use hiring signals to trigger ad audiences or retargeting campaigns

**Check Out My Channel**
For more advanced automation workflows that generate real client results, check out my YouTube channel, where I share the exact systems I use to automate outreach, scale agency pipelines, and close deals faster.
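A minimal sketch of the Extract Lead Name Code node described in step 10 above. The incoming fullName field is an assumption; map it to whichever field your Anymailfinder or merge nodes actually output.

```javascript
// Extract Lead Name (Code node) - illustrative sketch.
// The fullName input field is an assumption; adjust the mapping as needed.
return $input.all().map(item => {
  const fullName = (item.json.fullName || '').trim();
  const parts = fullName.split(/\s+/);

  return {
    json: {
      ...item.json,
      firstName: parts[0] || '',
      // Everything after the first token is treated as the last name
      lastName: parts.slice(1).join(' ') || '',
    },
  };
});
```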
by Dr. Firas
💥 Generate UGC Promo Videos with Blotato and Sora 2 for eCommerce

🧩 Who is this for?
This workflow is perfect for eCommerce brands, content creators, and marketing teams who want to automatically generate short, eye-catching videos from their product images, without editing software or manual work.

🚀 What problem does this workflow solve?
Creating engaging promotional videos manually can be time-consuming and expensive. This automation eliminates that friction by combining Blotato, Sora 2, and AI scripting to turn static product images into dynamic UGC-style videos ready for TikTok, Instagram Reels, and YouTube Shorts.

⚙️ What this workflow does
This workflow:
1. Receives a product image directly from Telegram or another input source.
2. Analyzes the image with OpenAI Vision to understand the product's features and audience.
3. Generates a natural, short UGC-style script using GPT-based AI.
4. Sends the image and script to Sora 2 via the Fal API to generate a vertical promotional video.
5. Monitors the video status every 15 seconds until completion.
6. Downloads or automatically publishes the final video to your social platforms.

🧠 Setup
1. Create a Fal.ai API key and set it in your n8n credentials (Authorization: Key YOUR_FAL_KEY).
2. Connect your Telegram, OpenAI, and HTTP Request nodes as shown in the workflow.
3. Make sure the Build Public Image URL node outputs a valid, public image link.
4. In the HTTP Request node for Sora 2, set:
   - Method: POST
   - URL: https://fal.run/fal-ai/sora-2/image-to-video
   - Headers: Authorization: Key YOUR_FAL_KEY, Content-Type: application/json
   - Body: raw JSON with parameters like prompt, image_url, duration, and aspect_ratio (see the sketch at the end of this section)
5. Run the workflow and monitor the execution logs for your video URL.
6. Blotato → API key for social media publishing

🎨 How to customize this workflow to your needs
- 🧾 Change the video tone: Edit the OpenAI prompt to produce educational, emotional, or luxury-style scripts.
- 🎬 Adjust duration or format: Use Sora 2's supported durations (4, 8, or 12 seconds) and aspect ratios (e.g., 9:16 for social media).
- 📲 Auto-publish your videos: Connect the TikTok, Instagram, or YouTube upload nodes for full automation.
- ✨ Add branding: Include overlays, logos, or end screens via CapCut or an external API integration.

🎥 Watch This Tutorial

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: Linkedin / Youtube / 🚀 Mes Ateliers n8n
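For reference, the raw JSON body from Setup step 4 expressed as a JavaScript object (for example inside a Code node feeding the HTTP Request). The endpoint, headers, and parameter names come from the setup above; the upstream field names (ugcScript, publicImageUrl) and the exact value types expected by fal.ai are assumptions to verify against their current docs.

```javascript
// Build the Sora 2 request body (illustrative sketch).
// Parameter names match the Setup section; value types and upstream field
// names are assumptions - check fal.ai's documentation before relying on them.
const body = {
  prompt: $json.ugcScript,           // UGC script produced by the OpenAI step
  image_url: $json.publicImageUrl,   // output of the Build Public Image URL node
  duration: 8,                       // supported durations per this template: 4, 8, or 12 seconds
  aspect_ratio: '9:16',              // vertical format for TikTok / Reels / Shorts
};

// Matching HTTP Request node settings:
//   Method:  POST
//   URL:     https://fal.run/fal-ai/sora-2/image-to-video
//   Headers: Authorization: Key YOUR_FAL_KEY, Content-Type: application/json
//   Body:    JSON.stringify(body)
return [{ json: { requestBody: body } }];
```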
by Harsh Agrawal
Automated SEO Intelligence Platform with DataForSEO and Claude

Transform any company website into a detailed SEO audit report in minutes! This workflow combines real-time web scraping, comprehensive SEO data analysis, and advanced AI reasoning to deliver client-ready reports automatically. Perfect for digital agencies scaling their audit services, freelance SEO consultants automating research, or SaaS teams analyzing competitor strategies before sales calls.

**The Process**
- Discovery Phase: Input a company name and website URL to kick things off. The system begins with website content extraction.
- Intelligence Gathering: A dedicated scraper sub-workflow extracts all website content and converts it to structured markdown.
- Strategic Analysis: LLMs process the scraped content to understand the business model, target market, and competitive positioning. They generate business research insights and product strategy recommendations tailored to that specific company. Once this analysis completes, the DataForSEO API pulls technical metrics, backlink profiles, keyword rankings, and site health indicators.
- Report Assembly: All findings flow into a master report generator that structures the data into sections covering technical SEO, content strategy, competitive landscape, and actionable next steps. Custom branded cover and closing pages are added (see the sketch at the end of this section).
- Delivery: The HTML report converts to PDF format and emails directly to your recipient, with no manual intervention needed.

**Setup Steps**
1. Add API credentials: OpenRouter (for AI), DataForSEO (for scraping/SEO data), and PDFco (for PDF generation)
2. Configure email sending through your preferred service (Gmail, SendGrid, etc.)
3. Optional: Upload custom first/last page PDFs for white-label branding
4. Test with your own website first to see the magic happen!

**Customize It**
- Adjust analysis depth: Modify the AI prompts to focus on specific SEO aspects (local SEO, e-commerce, B2B SaaS, etc.)
- Change report style: Edit the HTML template in the Sample_Code node for different formatting
- Add integrations: Connect to your CRM to automatically trigger reports when leads enter your pipeline
- Scale it up: Process multiple URLs in batch by feeding a Google Sheet of prospects

**What You'll Need**
- OpenRouter account (Claude Opus 4.1 recommended for best insights)
- DataForSEO subscription (handles both scraping and SEO metrics)
- PDFco account (converts your reports to professional PDFs)
- Email service credentials configured in n8n

**Need Help?**
Connect with me on LinkedIn if you have any doubt.
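A minimal sketch of what the report-assembly step could look like as a Code node that stitches the analysis sections into one HTML document before the PDFco conversion. The section field names (companyName, technicalSeo, and so on) are assumptions for illustration, not the template's actual structure.

```javascript
// Report Assembly (Code node) - illustrative sketch of stitching analysis
// sections into one HTML document before PDF conversion.
// All field names here are assumptions; adapt them to the workflow's data.
const sections = $json;

const html = `<!DOCTYPE html>
<html><head><meta charset="utf-8"><title>SEO Audit - ${sections.companyName}</title></head>
<body>
  <h1>SEO Audit Report: ${sections.companyName}</h1>
  <h2>Technical SEO</h2>${sections.technicalSeo}
  <h2>Content Strategy</h2>${sections.contentStrategy}
  <h2>Competitive Landscape</h2>${sections.competitiveLandscape}
  <h2>Actionable Next Steps</h2>${sections.nextSteps}
</body></html>`;

// The HTML string is passed on to the PDF conversion step (PDFco in this template).
return [{ json: { html } }];
```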
by Andrey
**Overview**
This n8n workflow automates brand monitoring across social media platforms (Reddit, LinkedIn, X, and Instagram) using the AnySite API. It searches posts mentioning your defined keywords, stores results in n8n Data Tables, analyzes engagement and sentiment, and generates a detailed AI-powered social media report automatically sent to your email.

**Key Features**
- **Multi-Platform Monitoring:** Reddit, LinkedIn, X (Twitter), and Instagram
- **Automated Post Collection:** Searches for new posts containing tracked keywords
- **Data Persistence:** Saves all posts and comments in structured Data Tables
- **AI-Powered Reporting:** Uses GPT (OpenAI API) to summarize and analyze trends, engagement, and risks
- **Automated Email Delivery:** Sends comprehensive daily/weekly reports via Gmail
- **Comment Extraction:** Collects and formats post comments for deeper sentiment analysis
- **Scheduling Support:** Can be executed manually or automatically (e.g., every night)

**How It Works**

Triggers
The workflow runs:
- Automatically (via Schedule Trigger), e.g., once daily
- Manually (via Manual Trigger), for testing or on-demand analysis

Data Collection Process
1. Keyword Loading: Reads all keywords from the Data Table "Brand Monitoring Words"
2. Social Media Search: For each keyword, the workflow calls the AnySite API endpoints:
   - api/reddit/search/posts
   - api/linkedin/search/posts
   - api/twitter/search/posts (X)
   - api/instagram/search/posts
3. Deduplication: Before saving, checks if a post already exists in the "Brand Monitoring Posts" table.
4. Data Storage: Inserts new posts into the Data Table with fields like type, title, url, vote_count, comment_count, etc.
5. Comments Enrichment: For Reddit and LinkedIn, retrieves and formats comments into JSON strings, then updates the record (see the sketch at the end of this section).
6. AI Analysis & Report Generation: The AI Agent (OpenAI GPT model) aggregates posts, analyzes sentiment, engagement, and risks, and generates a structured HTML email report.
7. Email Sending: Sends the final report via Gmail using your connected account.

**Setup Instructions**

Requirements
- Self-hosted or cloud n8n instance
- **AnySite API key** – https://AnySite.io
- **OpenAI API key** (GPT-4o or later)
- Connected Gmail account (for report delivery)

Installation Steps
1. Import the workflow: import the provided file Social Media Monitoring.json
2. Configure credentials:
   - AnySite API: Add an access-token header with your API key
   - OpenAI: Add your OpenAI API key in the "OpenAI Chat Model" node
   - Gmail: Connect your Gmail account (OAuth2) in the "Send a message in Gmail" node
3. Create the required Data Tables

1️⃣ Brand Monitoring Words

| Field | Type | Description |
|-------|------|-------------|
| word | string | Keyword or brand name to monitor |

> Each row represents a single keyword to be tracked.
2️⃣ Brand Monitoring Posts

| Field | Type | Description |
|-------|------|-------------|
| type | string | Platform type (e.g., reddit, linkedin, x, instagram) |
| title | string | Post title or headline |
| url | string | Direct link to post |
| created_at | string | Post creation date/time |
| subreddit_id | string | (Reddit only) subreddit ID |
| subreddit_alias | string | (Reddit only) subreddit alias |
| subreddit_url | string | (Reddit only) subreddit URL |
| subreddit_description | string | (Reddit only) subreddit description |
| comment_count | number | Number of comments |
| vote_count | number | Votes, likes, or reactions count |
| subreddit_member_count | number | (Reddit only) member count |
| post_id | string | Unique post identifier |
| text | string | Post body text |
| comments | string | Serialized comments (JSON string) |
| word | string | Matched keyword that triggered capture |

**AI Reporting Logic**
- Collects all posts gathered during the run
- Aggregates by keyword and platform
- Evaluates sentiment, engagement, and risk signals
- Summarizes findings with an executive summary and key metrics
- Sends the Social Media Intelligence Report to your configured email

**Customization Options**
- **Schedule:** Adjust the trigger frequency (daily, hourly, etc.)
- **Keywords:** Add or remove keywords in the Brand Monitoring Words table
- **Report Depth:** Modify system prompts in the "AI Agent" node to customize tone and analysis focus
- **Email Recipient:** Change the target email address in the "Send a message in Gmail" node

**Troubleshooting**

| Issue | Solution |
|-------|-----------|
| No posts found | Check AnySite API key and keyword relevance |
| Duplicate posts | Verify Data Table deduplication setup |
| Report not sent | Confirm Gmail OAuth2 connection |
| AI Agent error | Ensure OpenAI API key and model selection are correct |

**Best Practices**
- Use specific brand or product names in keywords for better precision
- Run the workflow daily to maintain fresh insights
- Periodically review and clean Data Tables
- Adjust AI prompt parameters to refine analytical tone
- Review AI-generated reports to ensure data quality

**Author Notes**
Created for automated cross-platform brand reputation monitoring, enabling real-time insights into how your brand is discussed online.
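A minimal sketch of the Comments Enrichment step (step 5 of the Data Collection Process) as a Code node that serializes fetched comments into the single comments string column of the Brand Monitoring Posts table. The incoming comment field names and the upstream node name are assumptions, not AnySite's documented schema.

```javascript
// Comments Enrichment (Code node) - illustrative sketch only.
// Field names (author, text, created_at) and the 'Get Post' node name are
// assumptions; adapt them to the actual AnySite comment response.
const comments = $input.all().map(i => ({
  author: i.json.author,
  text: i.json.text,
  created_at: i.json.created_at,
}));

return [{
  json: {
    post_id: $('Get Post').first().json.post_id, // hypothetical upstream node
    // Stored as a JSON string so it fits the single "comments" text field
    comments: JSON.stringify(comments),
  },
}];
```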
by Onur
Automated B2B Lead Generation: Google Places, Scrape.do & AI Enrichment

This workflow is a powerful, fully automated B2B lead generation engine. It starts by finding businesses on Google Maps based on your criteria (e.g., "dentists" in "Istanbul"), assigns a quality score to each, and then uses Scrape.do to reliably access their websites. Finally, it leverages an AI agent to extract valuable contact information like emails and social media profiles. The final, enriched data is then neatly organized and saved directly into a Google Sheet.

This template is built for reliability, using Scrape.do to handle the complexities of web scraping, ensuring you can consistently gather data without getting blocked.

🚀 What does this workflow do?
- Automatically finds businesses using the Google Places API based on a category and location you define.
- Calculates a leadScore for each business based on its rating, website presence, and operational status to prioritize high-quality leads.
- **Filters out low-quality leads** to ensure you only focus on the most promising prospects.
- Reliably scrapes the website of each high-quality lead using Scrape.do to bypass common blocking issues and retrieve the raw HTML.
- Uses an AI Agent (OpenAI) to intelligently parse the website's HTML and extract hard-to-find contact details (emails, social media links, phone numbers).
- **Saves all enriched lead data** to a Google Sheet, creating a clean, actionable list for your sales or marketing team.
- **Runs on a schedule**, continuously finding new leads without any manual effort.

🎯 Who is this for?
- **Sales & Business Development Teams:** Automate prospecting and build targeted lead lists.
- **Marketing Agencies:** Generate leads for clients in specific industries and locations.
- **Freelancers & Consultants:** Quickly find potential clients for your services.
- **Startups & Small Businesses:** Build a customer database without spending hours on manual research.

✨ Benefits
- **Full Automation:** Set it up once and let it run on a schedule to continuously fill your pipeline.
- **AI-Powered Enrichment:** Go beyond basic business info. Get actual emails and social profiles that aren't available on Google Maps.
- **Reliable Website Access:** Leverages **Scrape.do** to handle proxies and prevent IP blocks, ensuring consistent data gathering from target websites.
- **High-Quality Leads:** The built-in scoring and filtering system ensures you don't waste time on irrelevant or incomplete listings.
- **Centralized Database:** All your leads are automatically organized in a single, easy-to-access Google Sheet.

⚙️ How it Works
1. Schedule Trigger: The workflow starts automatically at your chosen interval (e.g., daily).
2. Set Parameters: You define the business type (searchCategory) and location (locationName) in a central Set node.
3. Find Businesses: It calls the Google Places API to get a list of businesses matching your criteria.
4. Score & Filter: A custom Function node scores each lead (see the sketch after this list). An IF node then separates high-quality leads from low-quality ones.
5. Loop & Enrich: The workflow processes each high-quality lead one by one. It uses a scraping service (Scrape.do) to reliably fetch the lead's website content. An AI Agent (OpenAI) analyzes the website's footer to find contact and social media links.
6. Save Data: The final, enriched lead information is appended as a new row in your Google Sheet.
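A minimal sketch of the scoring Function node referenced in step 4. The rating, website, and business_status inputs reflect the criteria described above; the weights, the user_ratings_total bonus, and the 60-point threshold are assumptions you would tune to your own market.

```javascript
// Lead scoring (Function/Code node) - illustrative sketch of the scoring step.
// Weights, threshold, and the exact Google Places field names are assumptions.
return $input.all().map(item => {
  const p = item.json;
  let leadScore = 0;

  if (p.rating) leadScore += Math.round(p.rating * 10);        // up to ~50 points for rating
  if (p.user_ratings_total > 20) leadScore += 15;              // established review presence
  if (p.website) leadScore += 25;                              // has a site to enrich later
  if (p.business_status === 'OPERATIONAL') leadScore += 10;    // currently operating

  return { json: { ...p, leadScore, isHighQuality: leadScore >= 60 } };
});
```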
📋 n8n Nodes Used
- Schedule Trigger
- Set
- HTTP Request (for Google Places & Scrape.do)
- Function
- If
- Split in Batches (Loop Over Items)
- HTML
- Langchain Agent (with OpenAI Chat Model & Structured Output Parser)
- Google Sheets

🔑 Prerequisites
- An active n8n instance.
- **Google Cloud Project** with the **Places API** enabled.
- **Google Places API Key**, stored in n8n's Header Auth credentials.
- A **Scrape.do Account and API Token**. This is essential for reliably scraping websites without your n8n server's IP getting blocked.
- **OpenAI Account & API Key** for the AI-powered data extraction.
- **Google Account** with access to Google Sheets.
- **Google Sheets API Credentials (OAuth2)** configured in n8n.
- A **Google Sheet** prepared with columns to store the lead data (e.g., BusinessName, Address, Phone, Website, Email, Facebook, etc.).

🛠️ Setup
1. Import the workflow into your n8n instance.
2. Configure Credentials. Create and/or select your credentials for:
   - Google Places API: In the 2. Find Businesses (Google Places) node, select your Header Auth credential containing your API key.
   - Scrape.do: In the 6a. Scrape Website HTML node, configure credentials for your Scrape.do account.
   - OpenAI: In the OpenAI Chat Model node, select your OpenAI credentials.
   - Google Sheets: In the 7. Save to Google Sheets node, select your Google Sheets OAuth2 credentials.
3. Define Your Search: In the 1. Set Search Parameters node, update the searchCategory and locationName values to match your target market.
4. Link Your Google Sheet: In the 7. Save to Google Sheets node, select your Spreadsheet and Sheet Name from the dropdown lists. Map the incoming data to the correct columns in your sheet.
5. Set Your Schedule: Adjust the Schedule Trigger to run as often as you like (e.g., once a day).
6. Activate the workflow! Your automated lead generation will begin on the next scheduled run.
by Roshan Ramani
**Who's it for**
This template is perfect for content creators, researchers, marketers, and Reddit enthusiasts who want to stay updated on specific topics without manually browsing Reddit. If you need curated, AI-summarized Reddit insights delivered directly to your Telegram, this workflow automates the entire process.

**What it does**
This workflow transforms your Telegram into a powerful Reddit search engine with AI-powered curation. Simply send any keyword to your Telegram bot, and it will:
- Search Reddit across four search passes (top, hot, relevance, plus a second top search) to capture diverse perspectives
- Automatically remove duplicate posts from the multiple search results
- Filter posts based on quality metrics (minimum 50 upvotes, recent content within 15 days, non-empty text)
- Extract key information: title, upvotes, subreddit, publication date, URL, and content
- Generate a clean, Telegram-formatted summary using Google Gemini AI
- Deliver structured results with direct links back to you instantly

The AI summary includes post titles, upvote counts, timestamps, brief insights, and direct Reddit links, all formatted for easy mobile reading.

**How it works**

Step 1: Telegram Trigger
User sends a search keyword via Telegram (e.g., "voice AI agents")

Step 2: Parallel Reddit Searches
Four simultaneous Reddit API calls search with different sorting algorithms:
- Top posts (all-time popularity)
- Hot posts (trending now)
- Relevance (best keyword matches)
- Top posts (duplicate for broader coverage)

Step 3: Merge & Deduplicate
All search results combine into one stream, then a JavaScript code node removes duplicate posts by comparing post IDs (see the sketch at the end of this section)

Step 4: Field Extraction
The Edit Fields node extracts and formats:
- Post title
- Upvote count
- Subreddit name and subscriber count
- Publication date (converted from Unix timestamp)
- Reddit URL
- Post content (selftext)

Step 5: Quality Filtering
The Filter node applies three conditions:
- Minimum 50 upvotes (ensures quality)
- Non-empty content (excludes link-only posts)
- Posted within the last 15 days (ensures freshness)

Step 6: Data Aggregation
All filtered posts aggregate into a single dataset for AI processing

Step 7: AI Summarization
Google Gemini AI analyzes the aggregated posts and generates a concise, Telegram-formatted summary with:
- Emoji indicators for better readability
- Point-wise breakdown of the top 5-7 posts
- Upvote counts and relative timestamps
- Brief 1-2 sentence summaries
- Direct Reddit links

Step 8: Delivery
The formatted summary is sent back to the user's Telegram chat

**Requirements**
Credentials needed:
- **Reddit OAuth2 API** - For searching Reddit posts (Get Reddit API credentials)
- **Google Gemini API** - For AI-powered summarization (Get Gemini API key)
- **Telegram Bot Token** - For receiving queries and sending results (Create Telegram Bot)

n8n Version: Self-hosted or Cloud (latest version recommended)

**Setup Instructions**
1. Create Telegram Bot
   - Message @BotFather on Telegram
   - Send /newbot and follow the prompts
   - Copy the bot token for n8n credentials
   - Start a chat with your new bot
2. Configure Reddit API
   - Go to https://www.reddit.com/prefs/apps
   - Click "Create App" → Select "script"
   - Note your Client ID and Secret
   - Add credentials to n8n's Reddit OAuth2
3. Get Gemini API Key
   - Visit https://ai.google.dev/
   - Create a new API key
   - Add to n8n's Google Gemini credentials
4. Import & Configure Workflow
   - Import this template into n8n
   - Add your three credentials to the respective nodes
   - Remove pinData from the "Telegram Trigger" node (test data)
   - Activate the workflow
5. Test It
   - Send any keyword to your Telegram bot (e.g., "machine learning")
   - Wait 10-20 seconds for results
   - Receive AI-summarized Reddit insights

**How to customize**
- Adjust Quality Filters: Edit the Filter node conditions:
  - Change the minimum upvotes (currently 50)
  - Modify the time range (currently 15 days)
  - Add a subreddit subscriber minimum
- Limit Results: Add a Limit node after Filter to cap results at 10-15 posts for faster processing
- Change Search Strategies: Modify the Reddit nodes' "sort" parameter:
  - new - Latest posts first
  - comments - Most commented
  - controversial - Controversial content
- Customize AI Output: Edit the AI Agent's system message to:
  - Change summary style (more/less detail)
  - Adjust formatting (bullets, numbered lists)
  - Modify language/tone
  - Add emoji preferences
- Add User Feedback: Insert a Telegram Send Message node after the trigger: "🔍 Searching Reddit for '{{ $json.message.text }}'... Please wait."
- Enable Error Handling: Create an Error Workflow:
  - Add an Error Trigger node
  - Send a fallback message: "❌ Search failed. Please try again."
- Sort by Popularity: Add a Sort node after Filter:
  - Field: upvotes
  - Order: Descending
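A minimal sketch of the Merge & Deduplicate logic (Step 3) combined with the quality rules from Step 5, written as an n8n Code node. Reddit's API does expose fields such as id, ups, selftext, and created_utc, but the exact item shape arriving from the four search nodes is an assumption.

```javascript
// Merge & Deduplicate (Code node) - illustrative sketch of removing duplicate
// Reddit posts by ID and applying the same quality rules as the Filter node.
const posts = $input.all().map(i => i.json.data || i.json);

const seen = new Set();
const fifteenDaysAgo = Date.now() / 1000 - 15 * 24 * 60 * 60;

const filtered = posts.filter(p => {
  if (seen.has(p.id)) return false;        // drop duplicates across the 4 searches
  seen.add(p.id);
  return (p.ups || 0) >= 50 &&             // minimum 50 upvotes
         (p.selftext || '').length > 0 &&  // exclude link-only posts
         p.created_utc >= fifteenDaysAgo;  // posted within the last 15 days
});

return filtered.map(p => ({ json: p }));
```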
by Fahmi Fahreza
AI Research Assistant Using Gemini AI and Decodo

Sign up for Decodo HERE for a discount.

This workflow transforms your Telegram bot into a smart academic research assistant powered by Gemini AI and Decodo. It analyzes queries, interprets URLs, scrapes scholarly data, and returns concise summaries of research papers directly in chat.

**Who's it for?**
For researchers, students, and AI enthusiasts who want to search and summarize academic content via Telegram using Google Scholar and arXiv.

**How it works**
1. The Telegram bot captures text, voice, or image messages.
2. Gemini models interpret academic URLs and user intent.
3. Decodo extracts paper details like titles, abstracts, and publication info.
4. The AI agent summarizes results and delivers them as text, or as a file if the summary is too long (see the sketch at the end of this section).

**How to set up**
1. Add your Telegram bot credentials in the Start Telegram Bot node.
2. Connect Google Gemini and Decodo API credentials.
3. Replace the {{INPUT_SEARCH_URL_INSIGHTS}} placeholder in the Research Summary Agent's system message with your search URL insights (or use the pinned example).
4. Test by sending a text, image, or voice message to your bot.
5. Activate the workflow to run in real-time.
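A minimal sketch of the "text or file if too long" delivery decision mentioned in How it works. The 4096-character value is Telegram's per-message limit; the summary field name, node wiring, and file handling are assumptions for illustration.

```javascript
// Delivery decision (Code node) - illustrative sketch only.
// The summary field and downstream routing are assumptions.
const summary = $json.summary || '';

if (summary.length <= 4096) {
  // Short enough for a single Telegram message
  return [{ json: { mode: 'text', text: summary } }];
}

// Too long for one message: hand it off to a "send document" Telegram node instead.
return [{
  json: { mode: 'file', fileName: 'research-summary.txt' },
  binary: {
    data: {
      data: Buffer.from(summary, 'utf-8').toString('base64'),
      mimeType: 'text/plain',
      fileName: 'research-summary.txt',
    },
  },
}];
```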
by Nguyen Thieu Toan
ForumPulse for n8n – Daily Pulse & On-demand Deep Dives

Author: Nguyen Thieu Toan
Category: Community & Knowledge Automation
Tags: Telegram, Reddit, n8n Forum, AI Summarization, Gemini, Groq

**How it works**
ForumPulse is an AI-powered assistant that keeps you connected to the latest discussions around n8n. The workflow integrates Reddit (r/n8n) and the n8n Community Forum, fetches trending and recent posts, and uses Gemini/Groq AI models to generate clean, structured summaries. It works in two complementary modes:
- Daily Pulse (Automated Digest): Runs on a schedule (default: 8:00 AM) to gather highlights and deliver a concise summary directly to your Telegram.
- On-demand Deep Dive (Interactive): Listens to Telegram queries in real time, detects intent (search, deep dive, open link, or chat), and provides summaries, comments, and insights for any chosen post.

When AI intent detection confidence drops below 0.7, the bot automatically asks for clarification before proceeding, ensuring accuracy and transparency.

**Step-by-step**

1. Setup & Prerequisites
- **n8n instance** (cloud or self-hosted).
- **Telegram Bot** (created via BotFather).
- **MongoDB** (optional, for persistent memory).
- **API keys** for Gemini and Groq.
- **Your Telegram user ID** (to receive replies).

⚠️ Replace all test credentials and tokens with your own. Never commit real secrets into exported templates.

2. Daily Pulse Automation
- **Schedule Trigger** runs the workflow every morning at the configured time.
- **Reddit + Forum Search** collects hot/new topics.
- **Merge Results** combines both sources into a unified dataset.
- **AI Summarizer Overview** condenses the results into a short, engaging digest.
- **Telegram Output** delivers the digest, automatically split into safe chunks under 2000 characters.

3. On-demand Interaction
- **Telegram Trigger** listens for incoming messages.
- **Intent Analysis (AI Agent)** classifies the query as **Search | Open Link | Deep Dive | Chitchat**.
- **Confidence Gate**: if confidence < 0.7, sends a clarification prompt to the user.
- **Branch by Intent**:
  - Search: Query Reddit/Forum with filters.
  - Open Link: Fetch details of a specific post.
  - Deep Dive: Retrieve comments and metadata.
  - Chitchat: Respond conversationally.
- **AI Summarizer** structures the output, highlighting trends, issues, and takeaways.
- **Telegram Delivery** formats and sends the reply, respecting HTML tags and message length.

4. Deep Dive Details
- **Post Extraction** fetches titles, authors, timestamps, and stats.
- **Comment Parsing** organizes replies into structured data.
- **Merge Post + Comments** builds a complete context package.
- **Summarizer** produces detailed, actionable insights.

5. Error Handling & Safety
- **Confidence Check** prevents wrong answers by requiring clarification.
- **Error Paths** handle API downtime or unexpected formats gracefully.
- **Auto Chunking** avoids Telegram's message length cap (2000 chars).
- **Safe Defaults** ensure fallback queries when inputs are missing or unclear.

**Customization Options**
- **Sources**: Add or replace platforms by editing HTTP Request nodes.
- **Schedule**: Change the cron time in the Schedule Trigger (e.g., 7:30 AM).
- **Filters**: Adjust the default sort order, time ranges, and result limits.
- **AI Persona**: Reword the systemMessage in the AI Agent nodes to change tone (professional, casual, emoji-rich).
- **Languages**: Auto-detects the user's language, but you can force English or Vietnamese by editing prompt settings.
- **Memory**: Enable MongoDB nodes for persistent user context across sessions.
- **Integrations**: Extend beyond Telegram by sending digests to Slack, Discord, or email.
- **Models**: Swap Gemini/Groq with other supported LLMs for experimentation.

✨ Crafted by Nguyen Thieu Toan with a focus on clarity, reliability, and community-driven insights. This workflow is not just functional - it reflects a design philosophy: automation should feel natural, transparent, and genuinely useful.