by Li CHEN
# AWS News Analysis and LinkedIn Automation Pipeline

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

## Who's it for

This template is perfect for:

- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

## How it works

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

### Flow 1: News Collection and Analysis

1. **Scheduled RSS Monitoring:** automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. **AI-Powered Analysis:** uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - Professional summary
   - Key themes and keywords
   - Importance rating (Low/Medium/High)
   - Business impact assessment
3. **Structured Data Storage:** saves analyzed news to Feishu Bitable with approval status tracking

### Flow 2: LinkedIn Content Generation

1. **Manual Approval Trigger:** a Feishu automation sends approved news items to the webhook
2. **AI Content Creation:** AWS Bedrock generates professional LinkedIn posts with:
   - Attention-grabbing headlines
   - Technical insights from a Solutions Architect perspective
   - Business impact analysis
   - Call-to-action engagement
3. **Automated Publishing:** posts directly to LinkedIn with relevant hashtags

## How to set up

### Prerequisites

- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

### Detailed Configuration Steps

#### 1. AWS Bedrock Setup

**Step 1: Enable Claude 3 Sonnet Model**

1. Log into your AWS Console
2. Navigate to AWS Bedrock
3. Go to Model access in the left sidebar
4. Find Anthropic Claude 3 Sonnet and click Request model access
5. Fill out the access request form (usually approved within minutes)
6. Once approved, verify the model appears in your Model access list

**Step 2: Create IAM User and Credentials**

1. Go to the IAM Console
2. Click Users → Create user
3. Name: n8n-bedrock-user
4. Attach policy: AmazonBedrockFullAccess (or create a custom policy with minimal permissions)
5. Go to the Security credentials tab → Create access key
6. Choose Application running outside AWS
7. Download the credentials CSV file

**Step 3: Configure in n8n**

1. In n8n, go to Credentials → Add credential
2. Select the AWS credential type
3. Enter your Access Key ID and Secret Access Key
4. Set Region to your preferred AWS region (e.g., us-east-1)
5. Test the connection

Useful links: AWS Bedrock Documentation, Claude 3 Sonnet Model Access, AWS Bedrock Pricing

#### 2. Feishu Bitable Configuration

**Step 1: Create Feishu Account and App**

1. Sign up at Feishu International
2. Create a new Bitable (multi-dimensional table)
3. Go to Developer Console → Create App
4. Enable Bitable permissions in your app
5. Generate an App Token and App Secret

**Step 2: Create Bitable Structure**

Create a new Bitable with these columns:

- title (Text)
- pubDate (Date)
- summary (Long Text)
- keywords (Multi-select)
- rating (Single Select: Low, Medium, High)
- link (URL)
- approval_status (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:

- App Token: found in the app settings
- Table ID: found in the Bitable URL (tbl...)
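For reference, here is how one analyzed news item maps onto the Bitable columns above. A minimal n8n Code-node sketch; the `rss` and `analysis` input shapes are illustrative assumptions, not field names fixed by the workflow:

```javascript
// Minimal sketch: map a Bedrock analysis onto the Bitable columns above.
// `item.json.rss` and `item.json.analysis` are assumed shapes, not fixed names.
const records = [];

for (const item of $input.all()) {
  const rss = item.json.rss;        // parsed RSS entry (title, pubDate, link)
  const ai = item.json.analysis;    // parsed Bedrock output (summary, keywords, rating)

  records.push({
    json: {
      title: rss.title,
      pubDate: rss.pubDate,
      summary: ai.summary,              // professional summary
      keywords: ai.keywords,            // key themes, e.g. ["EC2", "Pricing"]
      rating: ai.rating,                // "Low" | "Medium" | "High"
      link: rss.link,
      approval_status: 'Pending',       // reviewed later in Feishu
    },
  });
}

return records;
```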
**Step 3: Set Up Automation**

1. In your Bitable, go to Automation → Create automation
2. Trigger: When field value changes → select the approval_status field
3. Condition: approval_status equals "Approved"
4. Action: Send HTTP request
   - Method: POST
   - URL: your n8n webhook URL (from Flow 2)
   - Headers: Content-Type: application/json
   - Body: {{record}}

**Step 4: Configure Feishu Credentials in n8n**

1. Install the Feishu Lite community node (self-hosted only)
2. Add a Feishu credential with your App Token and App Secret
3. Test the connection

Useful links: Feishu Developer Documentation, Bitable API Reference, Feishu Automation Guide

#### 3. LinkedIn Company Account Setup

**Step 1: Create LinkedIn App**

1. Go to the LinkedIn Developer Portal
2. Click Create App
3. Fill in the app details:
   - App name: AWS News Automation
   - LinkedIn Page: select your company page
   - App logo: upload your logo
   - Legal agreement: accept the terms

**Step 2: Configure OAuth2 Settings**

1. In your app, go to the Auth tab
2. Add the redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback
3. Request these scopes:
   - w_member_social (post on behalf of members)
   - r_liteprofile (read basic profile)
   - r_emailaddress (read email address)

**Step 3: Get Company Page Access**

1. Go to your LinkedIn Company Page
2. Navigate to Admin tools → Manage admins
3. Ensure you have the Content admin or Super admin role
4. Note your Company Page ID (found in the page URL)

**Step 4: Configure LinkedIn Credentials in n8n**

1. Add a LinkedIn OAuth2 credential
2. Enter your Client ID and Client Secret
3. Complete the OAuth2 flow by clicking Connect my account
4. Select your company page for posting

Useful links: LinkedIn Developer Portal, LinkedIn API Documentation, LinkedIn OAuth2 Guide

#### 4. Workflow Activation

Final setup steps:

1. Import the workflow JSON into n8n
2. Configure all credential connections:
   - AWS Bedrock credentials
   - Feishu credentials
   - LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with the manual webhook trigger using sample data
6. Verify that the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

## Requirements

### Service Requirements

- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and the approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn company account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes:** Feishu Lite node (self-hosted only)

### Technical Requirements

- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

### Cost Considerations

- **AWS Bedrock:** ~$0.01-0.05 per news analysis
- **Feishu:** free tier available; paid plans for advanced features
- **LinkedIn:** free API access with rate limits
- **n8n:** self-hosted (free) or cloud subscription

## How to customize the workflow

### Content Customization

- **Modify AI prompts** in the AI Agent nodes to change the tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

### Integration Options

- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM systems** to track content performance
- **Add content calendar** integration for better planning

### Advanced Features

- **Multi-language support** by modifying the AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

### Technical Modifications

- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls (see the sketch at the end of this template)

## Important Notes

### Service Limitations

- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting; adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations:** each AI analysis costs approximately $0.01-0.05 USD per news item

### Troubleshooting Common Issues

**AWS Bedrock issues:**

- **Model not found:** ensure Claude 3 Sonnet access is approved in your region
- **Access denied:** verify the IAM permissions include Bedrock service access
- **Rate limiting:** implement retry logic or reduce the analysis frequency

**Feishu integration issues:**

- **Authentication failed:** check that the App Token and App Secret are correct
- **Table not found:** verify the Table ID matches your Bitable URL
- **Automation not triggering:** ensure the webhook URL is accessible and returns a 200 status

**LinkedIn posting issues:**

- **OAuth2 errors:** re-authenticate the LinkedIn credentials
- **Posting failed:** verify company page admin permissions
- **Rate limits:** LinkedIn has daily posting limits for company pages

### Security Best Practices

- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
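The retry logic suggested under Technical Modifications can live in a small Code node. A minimal sketch, assuming your n8n version exposes `this.helpers.httpRequest` in the Code node; the URL is illustrative only:

```javascript
// Minimal retry-with-backoff sketch for the flaky API calls noted above
// (Bedrock, Feishu, LinkedIn). The URL below is illustrative only.
async function withRetry(fn, attempts = 3, baseDelayMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err;          // out of retries: surface the error
      const delay = baseDelayMs * 2 ** i;         // 1s, 2s, 4s...
      await new Promise(res => setTimeout(res, delay));
    }
  }
}

const data = await withRetry(() =>
  this.helpers.httpRequest({ method: 'GET', url: 'https://example.com/api', json: true })
);
return [{ json: data }];
```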
by Oussama
Production-ready solution for controlling AI agent usage and preventing abuse while managing costs.

## 🎯 Problem Solved

- Unlimited AI interactions → excessive API costs
- Service abuse → uncontrolled resource consumption
- No built-in limits → need for usage quotas

## ✅ Solution Overview

Two-part system:

- **Main Flow:** user interaction tracking + AI responses
- **Reset Flow:** automated counter resets

## 🔄 How It Works

User Message → Track Counter → Check Limit → Allow/Block → AI Response

## 🛠️ Core Components

**Main Workflow**

- 📱 **Telegram Trigger:** receives user messages
- 📊 **Google Sheets Counter:** tracks messages per user
- 🔀 **Switch Logic:** checks limits (default: 3 messages)
- 🤖 **AI Agent:** processes allowed interactions
- 💬 **Smart Responses:** delivers AI answers or limit warnings

**Auto-Reset System**

- ⏰ **Schedule Trigger:** runs at a configurable interval
- 🔄 **Bulk Counter Reset:** resets all users to 0

## ⚙️ Configuration

**Message limits** (modify the Switch node conditions; see the sketch after this section):

- Over 3 messages → block silently
- Exactly 3 messages → send a limit warning
- Under 3 messages → allow the AI response

**Reset schedules:**

- Testing: every 1 minute
- Hourly: 0 * * * *
- Daily: 0 0 * * *
- Weekly: 0 0 * * 0

## 📋 Setup Requirements

**Credentials needed:**

- 🤖 Telegram Bot Token
- 📊 Google Sheets API
- 🧠 AI Model

**Google Sheets structure:**

- Column A: User ID (Telegram chat.id)
- Column B: Message Counter

## 🎯 Perfect For

- 💰 **Cost Control:** prevent runaway API costs
- 🛡️ **Demo/Trial Bots:** limited interactions
- 🏢 **Customer Service:** usage quotas
- 🎓 **Educational Bots:** daily limits
- 🚫 **Anti-Abuse:** fair usage policies

## 🚀 Key Benefits

- ✅ **Cost Management:** control AI API expenses
- ✅ **Fair Access:** equal usage for all users
- ✅ **Production Ready:** robust error handling
- ✅ **Flexible Limits:** easy adjustment
- ✅ **Auto-Reset:** no manual intervention
- ✅ **User-Friendly:** clear limit messages

## 📝 Quick Customization

- **Adjust limits:** change the Switch node values
- **Reset timing:** modify the Schedule Trigger
- **Custom messages:** edit the Telegram response nodes
- **User tiers:** add columns to Google Sheets
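For reference, the Switch-node logic above can be expressed as a few lines of Code-node JavaScript, useful if you prefer one Code node over Switch branches. The `messageCounter` field name is an assumption; match it to your Sheet column:

```javascript
// Minimal sketch of the Switch-node logic as code, using the default limit of 3.
// `messageCounter` is an assumed field name for the value read from Column B.
const count = Number($json.messageCounter ?? 0);
const LIMIT = 3;

let action;
if (count > LIMIT) action = 'block';         // over the limit: block silently
else if (count === LIMIT) action = 'warn';   // at the limit: send the warning
else action = 'allow';                       // under the limit: run the AI agent

return [{ json: { ...$json, action } }];
```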
by Club de Inteligencia Artificial Politécnico CIAP
# Telegram Appointment Scheduling Bot with n8n

## 📃 Description

Tired of managing appointments manually? This template transforms your Telegram account into a smart virtual assistant that handles the entire scheduling process for you, 24/7.

This workflow lets you deploy a fully functional Telegram bot that not only schedules appointments but also checks real-time availability in your Google Calendar, logs a history in Google Sheets, and allows your clients to cancel or view their upcoming appointments. It's the perfect solution for professionals, small businesses, or anyone looking to automate their booking system professionally and effortlessly.

## ✨ Key Features

- **Complete appointment management:** allows users to schedule, cancel, and list their future appointments.
- **Conflict prevention:** integrates with Google Calendar to check availability before confirming a booking, eliminating the risk of double-booking.
- **Automatic logging:** every confirmed appointment is saved to a row in Google Sheets, creating a perfect database for tracking and analysis.
- **Smart interaction:** the bot handles unrecognized commands and guides the user, ensuring a smooth experience.
- **Easy to adapt:** connect your own accounts, customize messages, and tailor it to your business needs in minutes.

## 🚀 Setup

Follow these steps to deploy your own instance of this bot:

**1. Prerequisites**

- An n8n instance (Cloud or self-hosted)
- A Telegram account
- A Google account

**2. Telegram Bot**

1. Talk to @BotFather on Telegram.
2. Create a new bot using /newbot.
3. Give it a name and a username.
4. Copy and save the API token it provides.

**3. Google Cloud & APIs**

1. Go to the Google Cloud Console.
2. Create a new project.
3. Enable the Google Calendar API and Google Sheets API.
4. Create OAuth 2.0 Client ID credentials, making sure to add your n8n instance's OAuth redirect URL.
5. Save the Client ID and Client Secret.

**4. Google Sheets**

1. Create a new spreadsheet in Google Sheets.
2. Define the column headers in the first row, for example: id, Client Name, Date and Time, ISO Date.

**5. n8n**

1. Import the workflow JSON file into your n8n instance.
2. Set up the credentials:
   - Telegram: create a new credential and paste your bot's token.
   - Google Calendar & Google Sheets (OAuth2): create a new credential and paste the Client ID and Client Secret from the Google Cloud Console.
3. Review the Google Calendar and Google Sheets nodes to select your correct calendar and spreadsheet.
4. Activate the workflow!

## 💬 Usage

Once the bot is running, you can interact with it using the following commands in Telegram (see the parsing sketch below):

- **To start the bot:** /start
- **To schedule a new appointment:** agendar YYYY-MM-DD HH:MM Your Full Name
- **To cancel an existing appointment:** cancelar YYYY-MM-DD HH:MM Your Full Name
- **To view your future appointments:** mis citas Your Full Name

## 👥 Authors

Jaren Pazmiño, President of the Polytechnic Artificial Intelligence Club (CIAP)
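As a reference for the command formats above, here is a minimal Code-node sketch of how a parsing step might split a message into command, date, time, and client name. This illustrates the format only; it is not the template's exact node:

```javascript
// Minimal sketch of parsing the bot commands described above.
// Format: "agendar YYYY-MM-DD HH:MM Your Full Name" (same shape for "cancelar").
const text = $json.message.text.trim();

const match = text.match(/^(agendar|cancelar)\s+(\d{4}-\d{2}-\d{2})\s+(\d{2}:\d{2})\s+(.+)$/);

if (match) {
  const [, command, date, time, clientName] = match;
  return [{ json: { command, isoDate: `${date}T${time}:00`, clientName } }];
}

if (text.startsWith('mis citas')) {
  return [{ json: { command: 'list', clientName: text.replace('mis citas', '').trim() } }];
}

// Anything else falls through to the bot's "unrecognized command" guidance.
return [{ json: { command: 'unknown', reply: 'Command not recognized. Try /start for help.' } }];
```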
by Santhej Kallada
## Who is this for?

- Creators, designers, and developers exploring AI-powered image generation
- Automation enthusiasts who want to integrate image creation into n8n workflows
- Telegram bot builders looking to add visual AI capabilities
- Marketers or freelancers automating creative content workflows

## What problem is this workflow solving?

Creating AI images usually requires multiple tools and manual setup. This workflow removes the complexity by:

- Connecting Nano Banana (an AI image model) directly to n8n
- Allowing image generation via a Telegram chatbot
- Providing a no-code setup that is fully automated and scalable

## What this workflow does

This workflow demonstrates how to generate AI images using Nano Banana and n8n, with an integrated Telegram chatbot interface. The process includes:

1. Connecting Gemini Nano Banana to n8n
2. Automating image generation requests triggered from Telegram
3. Returning AI-generated images back to the user
4. Allowing customization of prompts and styles dynamically

By the end, you'll have a fully functional automation to generate and send AI-created images through Telegram, with no coding required.

## Setup

1. **Create accounts:** sign up on n8n.io and ensure you have Telegram Bot API access. Connect your Nano Banana or Gemini API endpoint.
2. **Set up your Telegram bot:** use BotFather to create a new bot and get the token, then add the "Telegram Trigger" node in n8n.
3. **Configure the Nano Banana connection:** add an HTTP Request node for the Nano Banana API and insert your API key and prompt parameters (see the sketch at the end of this template).
4. **Handle responses:** parse the AI-generated image output and send the image file back to the Telegram user.
5. **Test and deploy:** run a sample image prompt and verify that Telegram returns the correct generated image.

## How to customize this workflow to your needs

- Modify prompts or styles to fit different artistic use cases
- Add conditional logic for image size, aspect ratio, or filters
- Integrate with Google Drive or Notion for image storage
- Schedule automatic image generation for campaigns or content creation
- Expand with OpenAI or Stability AI for hybrid workflows

## Notes

- The Nano Banana API may have rate limits depending on usage.
- Ensure your Telegram bot has permission to send files and images.
- You can host this workflow on n8n Cloud or self-hosted setups.

Want a video tutorial on how to set up this automation? https://youtu.be/0s6ZdU1fjc4
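As a rough illustration of setup step 3, the sketch below shows how the request body and response handling might look in Code nodes around the HTTP Request. This is a hedged sketch: the Gemini endpoint, model name, and response shape are assumptions to verify against Google's current API documentation.

```javascript
// Hedged sketch of the two Code-node steps around the Nano Banana HTTP call.
// The endpoint, model name, and response shape below are assumptions; check
// Google's current Gemini API docs before relying on them.

// 1) Build the request body for the HTTP Request node from the Telegram text.
//    POST it to something like:
//    https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash-image:generateContent
//    with your API key in the x-goog-api-key header.
const requestBody = {
  contents: [{ parts: [{ text: $json.message.text }] }],
};

// 2) After the call, decode the base64 image so Telegram can send it as a photo
//    (assumed response shape: candidates[0].content.parts[].inlineData.data).
const part = $json.candidates?.[0]?.content?.parts?.find(p => p.inlineData);
const buffer = Buffer.from(part.inlineData.data, 'base64');
const binary = await this.helpers.prepareBinaryData(buffer, 'image.png');
return [{ json: {}, binary: { photo: binary } }];
```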
by Intuz
This n8n template from Intuz provides a complete and automated solution for identifying high-intent leads from LinkedIn job postings and automatically generating personalized outreach emails.

**Disclaimer:** community nodes are used in this workflow.

## Who's this workflow for?

- B2B sales teams & SDRs
- Recruitment agencies & tech recruiters
- Startup founders
- Growth marketing teams

## How it works

1. **Scrape hiring signals:** the workflow starts by using an Apify scraper to find companies actively hiring for specific roles on LinkedIn (e.g., "ML Engineer").
2. **Filter & qualify companies:** it automatically filters the results based on your criteria (e.g., company size, industry) to create a high-quality list of target accounts (see the sketch at the end of this template).
3. **Find decision-makers:** for each qualified company, it uses Apollo.io to find key decision-makers (VPs, Directors, etc.) and enrich their profiles with verified email addresses using your Apollo API key.
4. **Build a lead list:** all the enriched lead data (contact name, title, email, company info) is systematically added to a Google Sheet.
5. **Generate AI-powered emails:** the workflow then feeds each lead's data to a Google Gemini AI model, which drafts a unique, personalized cold email that references the specific job the company is hiring for.
6. **Complete the outreach list:** finally, the AI-generated subject line and email body are saved back into the Google Sheet, leaving you with a fully prepared, hyper-targeted outreach campaign.

## Setup Instructions

1. **Apify configuration:** connect your Apify account in the Run the LinkedIn Job Scraper node. You'll need an Apify scraper; we have used this scraper. In the Custom Body field, paste the URL of your target LinkedIn Jobs search query.
2. **Data enrichment:** connect the API of data providers like Clay, Hunter, Apollo, etc. using HTTP Header Auth in the Get Targeted Personnel and Email Finder nodes.
3. **Google Gemini AI:** connect your Google Gemini (or PaLM) API account in the Google Gemini Chat Model node.
4. **Google Sheets setup:** connect your Google Sheets account. Create a spreadsheet and update the Document ID and Sheet Name in the three Google Sheets nodes to match your own.
5. **Activate workflow:** click "Execute workflow" to run the entire lead generation and email-writing process on demand.

## Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For custom workflow automation, click here: Get Started
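As a reference for step 2 (Filter & Qualify Companies), here is a minimal Code-node sketch. The field names and thresholds are illustrative assumptions to adapt to your scraper's actual output:

```javascript
// Minimal sketch of the "Filter & Qualify Companies" step. Field names and
// thresholds are illustrative assumptions; adjust them to your scraper output.
const qualified = $input.all().filter(item => {
  const job = item.json;
  const sizeOk = job.companyEmployeeCount >= 50 && job.companyEmployeeCount <= 1000;
  const industryOk = ['Software Development', 'IT Services'].includes(job.companyIndustry);
  return sizeOk && industryOk;
});

return qualified;
```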
by Roshan Ramani
## Who's it for

This template is perfect for content creators, researchers, marketers, and Reddit enthusiasts who want to stay updated on specific topics without manually browsing Reddit. If you need curated, AI-summarized Reddit insights delivered directly to your Telegram, this workflow automates the entire process.

## What it does

This workflow turns your Telegram into a powerful Reddit search engine with AI-powered curation. Simply send any keyword to your Telegram bot, and it will:

- Run four parallel Reddit searches across three sorting methods (top, hot, and relevance; top is queried twice for broader coverage)
- Automatically remove duplicate posts from the combined search results
- Filter posts based on quality metrics (minimum 50 upvotes, recent content within 15 days, non-empty text)
- Extract key information: title, upvotes, subreddit, publication date, URL, and content
- Generate a clean, Telegram-formatted summary using Google Gemini AI
- Deliver structured results with direct links back to you instantly

The AI summary includes post titles, upvote counts, timestamps, brief insights, and direct Reddit links, all formatted for easy mobile reading.

## How it works

**Step 1: Telegram Trigger**
The user sends a search keyword via Telegram (e.g., "voice AI agents").

**Step 2: Parallel Reddit Searches**
Four simultaneous Reddit API calls search with different sorting algorithms:
- Top posts (all-time popularity)
- Hot posts (trending now)
- Relevance (best keyword matches)
- Top posts (a duplicate query for broader coverage)

**Step 3: Merge & Deduplicate**
All search results combine into one stream, then a JavaScript code node removes duplicate posts by comparing post IDs (see the sketch at the end of this template).

**Step 4: Field Extraction**
The Edit Fields node extracts and formats:
- Post title
- Upvote count
- Subreddit name and subscriber count
- Publication date (converted from the Unix timestamp)
- Reddit URL
- Post content (selftext)

**Step 5: Quality Filtering**
The Filter node applies three conditions:
- Minimum 50 upvotes (ensures quality)
- Non-empty content (excludes link-only posts)
- Posted within the last 15 days (ensures freshness)

**Step 6: Data Aggregation**
All filtered posts aggregate into a single dataset for AI processing.

**Step 7: AI Summarization**
Google Gemini AI analyzes the aggregated posts and generates a concise, Telegram-formatted summary with:
- Emoji indicators for better readability
- A point-wise breakdown of the top 5-7 posts
- Upvote counts and relative timestamps
- Brief 1-2 sentence summaries
- Direct Reddit links

**Step 8: Delivery**
The formatted summary is sent back to the user's Telegram chat.

## Requirements

**Credentials needed:**
- **Reddit OAuth2 API** for searching Reddit posts (get Reddit API credentials)
- **Google Gemini API** for AI-powered summarization (get a Gemini API key)
- **Telegram Bot Token** for receiving queries and sending results (create a Telegram bot)

**n8n version:** self-hosted or Cloud (latest version recommended)

## Setup Instructions

**1. Create a Telegram Bot**
- Message @BotFather on Telegram
- Send /newbot and follow the prompts
- Copy the bot token for n8n credentials
- Start a chat with your new bot

**2. Configure the Reddit API**
- Go to https://www.reddit.com/prefs/apps
- Click "Create App" → select "script"
- Note your Client ID and Secret
- Add the credentials to n8n's Reddit OAuth2

**3. Get a Gemini API Key**
- Visit https://ai.google.dev/
- Create a new API key
- Add it to n8n's Google Gemini credentials

**4. Import & Configure the Workflow**
- Import this template into n8n
- Add your three credentials to the respective nodes
- Remove pinData from the "Telegram Trigger" node (test data)
- Activate the workflow
**5. Test It**
- Send any keyword to your Telegram bot (e.g., "machine learning")
- Wait 10-20 seconds for results
- Receive AI-summarized Reddit insights

## How to customize

**Adjust quality filters:** edit the Filter node conditions:
- Change the minimum upvotes (currently 50)
- Modify the time range (currently 15 days)
- Add a subreddit subscriber minimum

**Limit results:** add a Limit node after the Filter to cap results at 10-15 posts for faster processing.

**Change search strategies:** modify the Reddit nodes' "sort" parameter:
- new: latest posts first
- comments: most commented
- controversial: controversial content

**Customize AI output:** edit the AI Agent's system message to:
- Change the summary style (more or less detail)
- Adjust the formatting (bullets, numbered lists)
- Modify the language/tone
- Add emoji preferences

**Add user feedback:** insert a Telegram Send Message node after the trigger: "🔍 Searching Reddit for '{{ $json.message.text }}'... Please wait."

**Enable error handling:** create an Error Workflow:
- Add an Error Trigger node
- Send a fallback message: "❌ Search failed. Please try again."

**Sort by popularity:** add a Sort node after the Filter:
- Field: upvotes
- Order: descending
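As referenced in Step 3 above, the deduplication Code node can be as simple as the following sketch; the exact location of the post ID in each item depends on how the Reddit nodes return their data:

```javascript
// Minimal sketch of the deduplication Code node described in Step 3:
// the four Reddit searches overlap, so drop posts with a repeated ID.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const id = item.json.data?.id ?? item.json.id;   // Reddit post ID (field path may vary)
  if (!seen.has(id)) {
    seen.add(id);
    unique.push(item);
  }
}

return unique;
```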
by octik5
🤖 This n8n workflow automatically parses news articles from a webpage, enhances them with AI, and publishes them to a Telegram channel with a watermarked image. Unlike an RSS-based setup, this workflow directly fetches and processes content from any specified webpage.

## Use Cases

- Automatically post new website articles to your Telegram channel
- Use AI to rewrite or summarize text for better readability
- Add branded watermarks to images and keep your channel visually consistent

## How It Works

1. **Schedule Trigger:** runs the workflow on a custom schedule.
2. **Fetch Web Page:** retrieves the HTML content of your chosen website.
3. **Extract Links:** parses article links from the HTML source (see the sketch at the end of this template).
4. **Check & Update Google Sheet:** skips already-processed links and records new ones.
5. **Fetch & Clean Article:** retrieves, extracts, and formats the article text.
6. **AI Text Customization:** uses an AI agent to enhance the text.
7. **Image Watermarking:** fetches the article image and applies a watermark.
8. **Telegram Publishing:** posts the final image and AI-enhanced text to your channel.

## Setup Steps

- **Google Sheet:** create and share a sheet to track processed links.
- **Web URL:** enter your target webpage in the HTTP Request node.
- **AI Agent:** choose a model and prompt for text customization (e.g., OpenRouter or Gemini).
- **Telegram Bot:** add your bot token and chat ID.
- **Run & Test:** execute once manually, then let it run on schedule.

## Tips

- AI usage may incur costs depending on the model provider.
- Some AI models can be geo-restricted; check availability if you get a "model not found" error.
- Customize the watermark style (font, color, size) to match your branding.
- Use Telegram Markdown for rich message formatting.

✅ **Key advantage:** no RSS required. The workflow directly parses websites, enhances content with AI, and automates publishing to Telegram.
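As a reference for the Extract Links step, here is a minimal Code-node sketch. The regex and the `/news/` path filter are illustrative assumptions to tune for your target site:

```javascript
// Minimal sketch of the "Extract Links" step: pull article URLs out of the
// fetched HTML. The pattern and path filter are assumptions; tune them per site.
const html = $json.data;
const links = new Set();

for (const match of html.matchAll(/<a[^>]+href="([^"]+)"/g)) {
  const url = match[1];
  if (url.startsWith('https://') && url.includes('/news/')) {  // assumed article path
    links.add(url);
  }
}

return [...links].map(url => ({ json: { url } }));
```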
by Fahmi Fahreza
# AI Research Assistant Using Gemini AI and Decodo

Sign up for Decodo HERE for a discount.

This workflow transforms your Telegram bot into a smart academic research assistant powered by Gemini AI and Decodo. It analyzes queries, interprets URLs, scrapes scholarly data, and returns concise summaries of research papers directly in chat.

## Who's it for?

Researchers, students, and AI enthusiasts who want to search and summarize academic content via Telegram using Google Scholar and arXiv.

## How it works

1. The Telegram bot captures text, voice, or image messages.
2. Gemini models interpret academic URLs and user intent.
3. Decodo extracts paper details like titles, abstracts, and publication info.
4. The AI agent summarizes the results and delivers them as text, or as a file if the summary is too long (sketched below).

## How to set up

1. Add your Telegram bot credentials in the Start Telegram Bot node.
2. Connect your Google Gemini and Decodo API credentials.
3. Replace the {{INPUT_SEARCH_URL_INSIGHTS}} placeholder in the Research Summary Agent's system message with your search URL insights (or use the pinned example).
4. Test by sending a text, image, or voice message to your bot.
5. Activate the workflow to run in real time.
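The text-or-file delivery rule mentioned above hinges on Telegram's 4,096-character message limit. A minimal Code-node sketch, assuming the summary arrives in `$json.output`:

```javascript
// Minimal sketch of the "text or file" delivery rule: Telegram messages cap
// at 4096 characters, so longer summaries go out as a document instead.
const summary = $json.output;        // assumed field holding the AI summary
const TELEGRAM_LIMIT = 4096;

if (summary.length <= TELEGRAM_LIMIT) {
  return [{ json: { mode: 'text', text: summary } }];
}

// Too long: attach as a file for a Telegram "send document" node.
const buffer = Buffer.from(summary, 'utf8');
return [{
  json: { mode: 'file' },
  binary: { document: await this.helpers.prepareBinaryData(buffer, 'summary.txt') },
}];
```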
by Nguyen Thieu Toan
# ForumPulse for n8n – Daily Pulse & On-demand Deep Dives

**Author:** Nguyen Thieu Toan
**Category:** Community & Knowledge Automation
**Tags:** Telegram, Reddit, n8n Forum, AI Summarization, Gemini, Groq

## How it works

ForumPulse is an AI-powered assistant that keeps you connected to the latest discussions around n8n. The workflow integrates Reddit (r/n8n) and the n8n Community Forum, fetches trending and recent posts, and uses Gemini/Groq AI models to generate clean, structured summaries. It works in two complementary modes:

- **Daily Pulse (Automated Digest):** runs on a schedule (default: 8:00 AM) to gather highlights and deliver a concise summary directly to your Telegram.
- **On-demand Deep Dive (Interactive):** listens to Telegram queries in real time, detects intent (search, deep dive, open link, or chat), and provides summaries, comments, and insights for any chosen post.

When AI intent-detection confidence drops below 0.7, the bot automatically asks for clarification before proceeding, ensuring accuracy and transparency.

## Step-by-step

### 1. Setup & Prerequisites

- **n8n instance** (cloud or self-hosted)
- **Telegram Bot** (created via BotFather)
- **MongoDB** (optional, for persistent memory)
- **API keys** for Gemini and Groq
- **Your Telegram user ID** (to receive replies)

⚠️ Replace all test credentials and tokens with your own. Never commit real secrets into exported templates.

### 2. Daily Pulse Automation

- **Schedule Trigger** runs the workflow every morning at the configured time.
- **Reddit + Forum Search** collects hot/new topics.
- **Merge Results** combines both sources into a unified dataset.
- **AI Summarizer Overview** condenses the results into a short, engaging digest.
- **Telegram Output** delivers the digest, automatically split into safe chunks under 2,000 characters (see the sketch near the end of this template).

### 3. On-demand Interaction

- **Telegram Trigger** listens for incoming messages.
- **Intent Analysis (AI Agent)** classifies the query as *Search | Open Link | Deep Dive | Chitchat*.
- **Confidence Gate:** if confidence < 0.7, sends a clarification prompt to the user.
- **Branch by Intent:**
  - Search: query Reddit/Forum with filters.
  - Open Link: fetch details of a specific post.
  - Deep Dive: retrieve comments and metadata.
  - Chitchat: respond conversationally.
- **AI Summarizer** structures the output, highlighting trends, issues, and takeaways.
- **Telegram Delivery** formats and sends the reply, respecting HTML tags and message length.

### 4. Deep Dive Details

- **Post Extraction** fetches titles, authors, timestamps, and stats.
- **Comment Parsing** organizes replies into structured data.
- **Merge Post + Comments** builds a complete context package.
- **Summarizer** produces detailed, actionable insights.

### 5. Error Handling & Safety

- **Confidence Check** prevents wrong answers by requiring clarification.
- **Error Paths** handle API downtime or unexpected formats gracefully.
- **Auto Chunking** keeps every reply under the workflow's 2,000-character safety limit, comfortably below Telegram's message cap.
- **Safe Defaults** ensure fallback queries when inputs are missing or unclear.

## Customization Options

- **Sources:** add or replace platforms by editing the HTTP Request nodes.
- **Schedule:** change the cron time in the Schedule Trigger (e.g., 7:30 AM).
- **Filters:** adjust the default sort order, time ranges, and result limits.
- **AI Persona:** reword the systemMessage in the AI Agent nodes to change the tone (professional, casual, emoji-rich).
- **Languages:** the workflow auto-detects the user's language, but you can force English or Vietnamese by editing the prompt settings.
- **Memory:** enable the MongoDB nodes for persistent user context across sessions.
- **Integrations:** extend beyond Telegram by sending digests to Slack, Discord, or email.
- **Models:** swap Gemini/Groq with other supported LLMs for experimentation.
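As referenced in the Daily Pulse steps, the auto-chunking logic can be a small Code node like this sketch, which splits on newlines where possible so HTML tags and lines stay intact:

```javascript
// Minimal sketch of the auto-chunking step: split a digest into pieces under
// the workflow's 2,000-character safety limit, breaking on newlines where
// possible so formatted lines are not cut mid-tag.
function chunkMessage(text, limit = 2000) {
  const chunks = [];
  let rest = text;
  while (rest.length > limit) {
    let cut = rest.lastIndexOf('\n', limit);
    if (cut <= 0) cut = limit;           // no usable newline: hard cut at the limit
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).trimStart();
  }
  if (rest) chunks.push(rest);
  return chunks;
}

// `$json.digest` is an assumed field name for the generated summary.
return chunkMessage($json.digest).map(text => ({ json: { text } }));
```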
✨ Crafted by Nguyen Thieu Toan with a focus on clarity, reliability, and community-driven insights. This workflow is not just functional; it reflects a design philosophy: automation should feel natural, transparent, and genuinely useful.
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Website sitemap generator and visual tree creator

## Who's it for

Web developers, SEO specialists, UX designers, and digital marketers who need to analyze website structure, create visual sitemaps, or audit site architecture for optimization purposes.

## What it does

This workflow automatically generates a comprehensive sitemap from any website URL and creates an organized hierarchical structure in Google Sheets. It follows the website's sitemap to discover all pages, then organizes them by navigation levels (Level 1, Level 2, etc.) with proper parent-child relationships. The output can be further processed to create visual tree diagrams and mind maps.

## How it works

The workflow follows a five-step automation process:

1. **URL Input:** accepts a website URL via the chat interface
2. **Site Crawling:** uses Firecrawl to discover all pages, following the website's sitemap only
3. **Success Validation:** checks whether crawling was successful (some sites block external crawlers)
4. **Hierarchical Organization:** processes the URLs into a structured tree with proper level relationships (see the sketch at the end of this template)
5. **Google Sheets Export:** creates a formatted spreadsheet with the complete site architecture

The system respects robots.txt and follows only sitemap-declared pages to ensure ethical crawling.

## Requirements

- Firecrawl API key (for website crawling and sitemap discovery)
- Google Sheets access
- Google Drive access (for template duplication)

## How to set up

### Step 1: Prepare your template (recommended)

It's recommended to create your own copy of the base template:

1. Access the base Google Sheets template
2. Make a copy for your personal use
3. Update the workflow's "Copy template" node with your template's file ID (replace the default ID: 12lV4HwgudgzPPGXKNesIEExbFg09Tuu9gyC_jSS1HjI)

This ensures you have control over the template formatting and can customize it as needed.

### Step 2: Configure API credentials

Set up the following credentials in n8n:

- **Firecrawl API:** for crawling websites and discovering sitemaps
- **Google Sheets OAuth2:** for creating and updating spreadsheets
- **Google Drive OAuth2:** for duplicating the template file

### Step 3: Configure Firecrawl settings (optional)

The workflow uses optimized Firecrawl settings:

- ignoreSitemap: false (respects the website's sitemap)
- sitemapOnly: true (only crawls URLs listed in sitemap files)

These settings ensure ethical crawling and faster processing.

### Step 4: Access the workflow

The workflow uses a chat trigger interface, so no manual configuration is needed. Simply provide the website URL you want to analyze when prompted.

## How to use the workflow

### Basic usage

1. **Start the chat:** access the workflow via the chat interface
2. **Provide the URL:** enter the website URL you want to analyze (e.g., "https://example.com")
3. **Wait for processing:** the system will crawl, organize, and export the data
4. **Receive your results:** get a direct, clickable link to your generated Google Sheets, with no need to search for the file

### Error handling

- **Invalid URLs:** if the provided URL is invalid or the website blocks crawling, you'll receive an immediate error message
- **Graceful failure:** the workflow stops without creating unnecessary files when errors occur
- **Common causes:** incorrect URL format, robots.txt restrictions, or site security settings

### File organization

- **Automatic naming:** generated files follow the pattern "[Website URL] - n8n - Arborescence"
- **Google Drive storage:** files are automatically organized in your Google Drive
- **Instant access:** a direct link is provided immediately upon completion
## Advanced processing for visual diagrams

### Step 1: Copy sitemap data

Once your Google Sheets file is ready:

1. Copy all the hierarchical data from the generated spreadsheet
2. Prepare it for AI processing

### Step 2: Generate ASCII tree structure

Use any AI model with this prompt:

```
Create a hierarchical tree structure from the following website sitemap data. Return ONLY the tree structure using ASCII tree formatting with ├── and └── characters. Do not include any explanations, comments, or additional text - just the pure tree structure. The tree should start with the root domain and show all pages organized by their hierarchical levels. Use proper indentation to show parent-child relationships.

Here is the sitemap data:
[PASTE THE SITEMAP DATA HERE]

Requirements:
- Use ASCII tree characters (├── └── │)
- Show clear hierarchical relationships
- Include all pages from the sitemap
- Return ONLY the tree structure, no other text
- Start with the root domain as the top level
```

### Step 3: Create visual mind map

1. Visit the Whimsical Diagrams GPT
2. Request a mind map creation using your ASCII tree structure
3. Get a professional visual representation of your website architecture

## Results interpretation

### Google Sheets output structure

The generated spreadsheet contains:

- **Niv 0 to Niv 5:** hierarchical levels (0 = homepage, 1-5 = navigation depth)
- **URL column:** complete URLs for reference
- **Hyperlinked structure:** clickable links organized by hierarchy
- **Multi-domain support:** handles subdomains and different domain structures

### Data organization features

- **Automatic sorting:** pages organized by navigation depth and alphabetical order
- **Parent-child relationships:** clear hierarchical structure maintained
- **Domain separation:** main domains and subdomains processed separately
- **Clean formatting:** URLs decoded and formatted for readability

## Workflow limitations

- **Sitemap dependency:** only discovers pages listed in the website's sitemap
- **Crawling restrictions:** some websites may block external crawlers
- **Level depth:** limited to 5 hierarchical levels for clarity
- **Rate limits:** respects Firecrawl API limitations
- **Template dependency:** requires access to the base template for duplication

## Use cases

- **SEO audits:** analyze site structure for optimization opportunities
- **UX research:** understand navigation patterns and user paths
- **Content strategy:** identify content gaps and organizational issues
- **Site migrations:** document existing structure before redesigns
- **Competitive analysis:** study competitor site architectures
- **Client presentations:** create visual site maps for stakeholder reviews
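As a reference for the Hierarchical Organization step, here is a minimal Code-node sketch of deriving each page's level from its URL path; the input and output field names are illustrative assumptions:

```javascript
// Minimal sketch of the "Hierarchical Organization" step: derive each page's
// level (0 = homepage) from its URL path so rows can be grouped parent→child.
const rows = $input.all().map(item => {
  const url = new URL(item.json.url);                 // assumed input field
  const segments = url.pathname.split('/').filter(Boolean);
  return {
    json: {
      url: url.href,
      level: Math.min(segments.length, 5),            // workflow caps depth at 5
      parent: segments.length > 1
        ? `${url.origin}/${segments.slice(0, -1).join('/')}`
        : url.origin,
    },
  };
});

// Sort by navigation depth, then alphabetically, matching the sheet's ordering.
rows.sort((a, b) => a.json.level - b.json.level || a.json.url.localeCompare(b.json.url));
return rows;
```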
by Mirza Ajmal
## Description

This powerful workflow automates the evaluation of new digital tools, websites, or platforms, with the goal of assessing their potential impact on your business. By leveraging Telegram for user input, Apify for deep content extraction, advanced AI for contextual analysis, and Google Sheets for personalized data integration and record-keeping, this tool delivers clear, actionable verdicts that help you determine whether a tool is worth adopting or exploring further.

## Key Features and Workflow

- **User-friendly input:** submit URLs of tools or websites directly through Telegram for quick and easy evaluation requests.
- **Dynamic content extraction:** the workflow retrieves detailed content from the submitted URLs using the Apify web crawler, capturing rich data for analysis.
- **AI-powered cleaning & analysis:** sophisticated AI models filter out noise, distill meaningful insights, and contextualize findings based on your business profile and goals stored in Google Sheets.
- **Personalized business context:** integration with Google Sheets brings in your company's specialization, current focus, and strategic objectives to tailor the analysis specifically to your needs.
- **Structured analysis output:** receive a thorough, structured report including concise summaries, key considerations, business impact, benefits, risks, actionable insights, and an easy-to-understand final verdict on the tool's relevance.
- **Decision support:** the tool estimates effort, time to value, urgency, and confidence levels, enabling informed prioritization and strategic decision-making.
- **Seamless communication:** results are sent back via Telegram, ensuring you get timely and direct feedback without needing to leave your messaging app.
- **Record keeping & tracking:** all analyses and decisions are logged automatically into Google Sheets, creating a searchable knowledge base for ongoing reference and reporting.

## Setup Instructions for Key Nodes

- **Telegram Trigger node:** configure your Telegram bot API credentials here. Link the bot to your Telegram account to receive messages for URL submissions.
- **URL Extraction node:** no credentials needed. This node extracts URLs from incoming messages for processing (see the sketch at the end of this template).
- **Apify Web Crawler node:** go to Apify's website, sign up for an account if you don't have one, and get your API token from your profile's API tokens section. Then paste this token into the Apify node's API Key field in n8n.
- **AI Cleaning and Analysis nodes:** configure OpenRouter or compatible AI service API keys for content processing. Customize the prompts or models if desired to align the analysis style.
- **Google Sheets nodes:** connect using your Google account and provide access to the specified Google Sheet. Ensure sheets for Company Details and Analysis Results exist with the proper columns for this workflow.
- **Telegram Reply node:** use the Telegram bot API credentials to send analysis summaries and verdicts back to users.

## Access and Edit the Google Sheet

You can access the Google Sheet used by this workflow here: Access the google sheet here

Please make a copy of the sheet to your own Google Drive before connecting it with this workflow. This allows you to customize the sheets, update company information, and manage analysis results securely without affecting the original template.

## Extendibility

Beyond manual URL submissions, you can enhance this workflow by scheduling automated daily checks of new product launches from platforms like Product Hunt.
The system can proactively analyze emerging tools and deliver timely updates via Telegram, email, or other channels, helping you stay ahead of innovation effortlessly.
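As a reference for the URL Extraction node described in the setup instructions, here is a minimal Code-node sketch; the Telegram message shape is standard, while the error-handling choice is an assumption:

```javascript
// Minimal sketch of the URL Extraction node: pull the first URL out of an
// incoming Telegram message so it can be handed to the Apify crawler.
const text = $json.message?.text ?? '';
const match = text.match(/https?:\/\/[^\s]+/);

if (!match) {
  // Assumed fallback: surface a friendly error instead of crawling nothing.
  return [{ json: { error: 'No URL found in message' } }];
}

return [{ json: { url: match[0], chatId: $json.message.chat.id } }];
```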
by Đỗ Thành Nguyên
# Automated Facebook Page Story Video Publisher (Google Drive → Facebook → Google Sheet)

> Recommended: self-hosted via tino.vn/vps-n8n?affid=388 (use code VPSN8N for up to 39% off).

This workflow is an automated solution for publishing video content from Google Drive to your Facebook Page Stories, while using Google Sheets as a posting queue manager.

## What This Workflow Does

This automation orchestrates a complete multi-step process for uploading and publishing videos to Facebook Stories:

1. **Queue Management:** every 2 hours and 30 minutes, the workflow checks a Google Sheet (Get Row Sheet node) to find the first video whose Stories column is empty, meaning it hasn't been posted yet.
2. **Conditional Execution:** an If node confirms that the video's File ID exists before proceeding.
3. **Video Retrieval:** using the File ID, the workflow downloads the video from Google Drive (Google Drive node) and calculates its binary size (Set to the total size in bytes node).
4. **Facebook 3-Step Upload:** it performs the Facebook Graph API's three-step upload process through HTTP Request nodes (sketched in code at the end of this template):
   - **Step 1 – Initialize Session:** starts an upload session and retrieves the upload_url and video_id.
   - **Step 2 – Upload File:** uploads the binary video data to the provided upload_url.
   - **Step 3 – Publish Video:** finalizes and publishes the uploaded video as a Facebook Story.
5. **Status Update:** once completed, the workflow updates the same row in Google Sheets (Update upload status in sheet node) using the row_number to mark the video as processed.

## Prerequisites (What You Need Before Running)

### 1. n8n Instance

A self-hosted or cloud n8n instance (see the recommendation above).

### 2. Google Services

- **Google Drive credentials:** OAuth2 credentials for Google Drive so n8n can download video files.
- **Google Sheets credentials:** OAuth2 credentials for Google Sheets to read the posting queue and update statuses.
- **Google Sheet:** a spreadsheet (ID: 1RnE5O06l7W6TLCLKkwEH5Oyl-EZ3OE-Uc3OWFbDohYI) containing:
  - File ID: the video's unique ID in Google Drive
  - Stories: the posting-status column (leave empty for pending videos)
  - row_number: used for updating the correct row after posting

### 3. Facebook Setup

- **Page ID:** your Facebook Page ID (currently hardcoded as 115432036514099 in the info node).
- **Access Token:** a Page Access Token with permissions such as pages_manage_posts and pages_read_engagement. This token is hardcoded in the info node and again in Step 3. Post video.

## Usage Guide and Implementation Notes

### How to Use

1. **Queue videos:** add video entries to your Google Sheet. Each entry must include a valid Google Drive File ID. Leave the Stories column empty for videos that haven't been posted.
2. **Activate:** save and activate the workflow. The Schedule Trigger will automatically handle new uploads every 2 hours and 30 minutes.

### Implementation Notes

- ⚠️ **Token security:** hardcoding your Access Token inside the info node is **not recommended**. Tokens expire and expose your Page to risk if leaked.
  👉 Action: replace the static token with a secure credential setup that supports token rotation.
- **Loop efficiency:** the "false" output of the If node currently loops back to the Get Row Sheet node, which creates unnecessary cycles when no videos are found.
  👉 Action: disconnect that branch so the workflow stops gracefully when no unposted videos remain.
- **Status updates:** to prevent re-posting the same video, the final Update upload status in sheet node must update the Stories column (e.g., write "POSTED").
  👉 Action: add this mapping explicitly to your Google Sheets node.
- **Automated File ID sync:** this workflow assumes that the Google Sheet already contains valid File IDs.
  👉 You can build a secondary workflow (using Schedule Trigger1 → Search files and folders → Append or update row in sheet) to automatically populate new video File IDs from your Google Drive.

## ✅ Result

Once active, this workflow automatically:

- pulls pending videos from your Google Sheet,
- uploads them to Facebook Stories, and
- marks them as posted, all without manual intervention.
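For reference, the three Graph API calls chain together roughly as in the sketch below. This is a hedged illustration, not the workflow's exact HTTP Request nodes: the endpoint shape follows Meta's Video Stories API, while the API version (v19.0), the binary property name ('data'), and the placeholder IDs are assumptions to verify against Meta's documentation.

```javascript
// Hedged sketch of the three Graph API calls the workflow chains together.
// Verify the API version (v19.0 here) and token permissions against Meta's docs.
const PAGE_ID = 'YOUR_PAGE_ID';
const TOKEN = 'YOUR_PAGE_ACCESS_TOKEN';   // store this in a credential, not in the node

// Step 1 – initialize the upload session
const start = await this.helpers.httpRequest({
  method: 'POST',
  url: `https://graph.facebook.com/v19.0/${PAGE_ID}/video_stories`,
  body: { upload_phase: 'start', access_token: TOKEN },
  json: true,
});

// Step 2 – upload the raw video bytes to the returned upload_url
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');  // file from Google Drive
await this.helpers.httpRequest({
  method: 'POST',
  url: start.upload_url,
  headers: {
    Authorization: `OAuth ${TOKEN}`,
    offset: '0',
    file_size: String(buffer.length),     // the byte size computed earlier in the flow
  },
  body: buffer,
});

// Step 3 – finish the session, which publishes the story
await this.helpers.httpRequest({
  method: 'POST',
  url: `https://graph.facebook.com/v19.0/${PAGE_ID}/video_stories`,
  body: { upload_phase: 'finish', video_id: start.video_id, access_token: TOKEN },
  json: true,
});

return [{ json: { posted: true, video_id: start.video_id } }];
```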