by Li CHEN
# AWS News Analysis and LinkedIn Automation Pipeline

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

## Who's it for

This template is perfect for:

- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

## How it works

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

### Flow 1: News Collection and Analysis

1. **Scheduled RSS Monitoring**: automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. **AI-Powered Analysis**: uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - a professional summary
   - key themes and keywords
   - an importance rating (Low/Medium/High)
   - a business impact assessment
3. **Structured Data Storage**: saves the analyzed news to Feishu Bitable with approval status tracking

### Flow 2: LinkedIn Content Generation

1. **Manual Approval Trigger**: a Feishu automation sends approved news items to the webhook
2. **AI Content Creation**: AWS Bedrock generates professional LinkedIn posts with:
   - attention-grabbing headlines
   - technical insights from a Solutions Architect perspective
   - business impact analysis
   - call-to-action engagement
3. **Automated Publishing**: posts directly to LinkedIn with relevant hashtags

## How to set up

### Prerequisites

- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

### Detailed Configuration Steps
#### 1. AWS Bedrock Setup

**Step 1: Enable the Claude 3 Sonnet Model**

1. Log into your AWS Console
2. Navigate to AWS Bedrock
3. Go to **Model access** in the left sidebar
4. Find **Anthropic Claude 3 Sonnet** and click **Request model access**
5. Fill out the access request form (usually approved within minutes)
6. Once approved, verify that the model appears in your **Model access** list

**Step 2: Create an IAM User and Credentials**

1. Go to the IAM Console
2. Click **Users** → **Create user**
3. Name: `n8n-bedrock-user`
4. Attach policy: `AmazonBedrockFullAccess` (or create a custom policy with minimal permissions)
5. Go to the **Security credentials** tab → **Create access key**
6. Choose **Application running outside AWS**
7. Download the credentials CSV file

**Step 3: Configure in n8n**

1. In n8n, go to **Credentials** → **Add credential**
2. Select the **AWS** credential type
3. Enter your Access Key ID and Secret Access Key
4. Set **Region** to your preferred AWS region (e.g., `us-east-1`)
5. Test the connection

Useful links: AWS Bedrock Documentation · Claude 3 Sonnet Model Access · AWS Bedrock Pricing

#### 2. Feishu Bitable Configuration

**Step 1: Create a Feishu Account and App**

1. Sign up at Feishu International
2. Create a new Bitable (multi-dimensional table)
3. Go to **Developer Console** → **Create App**
4. Enable Bitable permissions in your app
5. Generate the App Token and App Secret

**Step 2: Create the Bitable Structure**

Create a new Bitable with these columns:

- `title` (Text)
- `pubDate` (Date)
- `summary` (Long Text)
- `keywords` (Multi-select)
- `rating` (Single Select: Low, Medium, High)
- `link` (URL)
- `approval_status` (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:

- **App Token**: found in the app settings
- **Table ID**: found in the Bitable URL (`tbl...`)
**Step 3: Set Up the Automation**

1. In your Bitable, go to **Automation** → **Create automation**
2. Trigger: **When field value changes** → select the `approval_status` field
3. Condition: `approval_status` equals "Approved"
4. Action: **Send HTTP request**
   - Method: POST
   - URL: your n8n webhook URL (from Flow 2)
   - Headers: `Content-Type: application/json`
   - Body: `{{record}}`

**Step 4: Configure Feishu Credentials in n8n**

1. Install the Feishu Lite community node (self-hosted only)
2. Add a Feishu credential with your App Token and App Secret
3. Test the connection

Useful links: Feishu Developer Documentation · Bitable API Reference · Feishu Automation Guide

#### 3. LinkedIn Company Account Setup

**Step 1: Create a LinkedIn App**

1. Go to the LinkedIn Developer Portal
2. Click **Create App**
3. Fill in the app details:
   - App name: AWS News Automation
   - LinkedIn Page: select your company page
   - App logo: upload your logo
   - Legal agreement: accept the terms

**Step 2: Configure OAuth2 Settings**

1. In your app, go to the **Auth** tab
2. Add the redirect URL: `https://your-n8n-instance.com/rest/oauth2-credential/callback`
3. Request these scopes:
   - `w_member_social` (post on behalf of members)
   - `r_liteprofile` (read basic profile)
   - `r_emailaddress` (read email address)

**Step 3: Get Company Page Access**

1. Go to your LinkedIn Company Page
2. Navigate to **Admin tools** → **Manage admins**
3. Ensure you have the **Content admin** or **Super admin** role
4. Note your Company Page ID (found in the page URL)

**Step 4: Configure LinkedIn Credentials in n8n**

1. Add a LinkedIn OAuth2 credential
2. Enter your Client ID and Client Secret
3. Complete the OAuth2 flow by clicking **Connect my account**
4. Select your company page for posting

Useful links: LinkedIn Developer Portal · LinkedIn API Documentation · LinkedIn OAuth2 Guide
#### 4. Workflow Activation

Final setup steps:

1. Import the workflow JSON into n8n
2. Configure all credential connections:
   - AWS Bedrock credentials
   - Feishu credentials
   - LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify that the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

## Requirements

### Service Requirements

- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and the approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn company account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes**: Feishu Lite node (self-hosted only)

### Technical Requirements

- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

### Cost Considerations

- **AWS Bedrock**: ~$0.01-0.05 per news analysis
- **Feishu**: free tier available; paid plans for advanced features
- **LinkedIn**: free API access with rate limits
- **n8n**: self-hosted (free) or cloud subscription

## How to customize the workflow

### Content Customization

- **Modify the AI prompts** in the AI Agent nodes to change the tone, focus, or target audience
- **Adjust the hashtags** in the LinkedIn posting node for different industries
- **Change the scheduling frequency** by modifying the Schedule Trigger settings

### Integration Options

- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM systems** to track content performance
- **Add content calendar integration** for better planning

### Advanced Features

- **Multi-language support** by modifying the AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

### Technical Modifications

- **Change the RSS sources** to monitor other AWS blogs or competitor news
- **Adjust the AI models** to use different Bedrock models or external APIs
- **Add data validation nodes** for better error handling
- **Implement retry logic** for failed API calls

## Important Notes

### Service Limitations

- This template uses a community node (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting; adjust the scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations**: each AI analysis costs approximately $0.01-0.05 USD per news item

### Troubleshooting Common Issues

**AWS Bedrock issues:**

- **Model not found**: ensure Claude 3 Sonnet access is approved in your region
- **Access denied**: verify that the IAM permissions include Bedrock service access
- **Rate limiting**: implement retry logic or reduce the analysis frequency

**Feishu integration issues:**

- **Authentication failed**: check that the App Token and App Secret are correct
- **Table not found**: verify that the Table ID matches your Bitable URL
- **Automation not triggering**: ensure the webhook URL is accessible and returns a 200 status

**LinkedIn posting issues:**

- **OAuth2 errors**: re-authenticate the LinkedIn credentials
- **Posting failed**: verify your company page admin permissions
- **Rate limits**: LinkedIn has daily posting limits for company pages

### Security Best Practices

- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
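The "implement retry logic" suggestion above can be sketched as an n8n Code-node helper. This is a minimal sketch with illustrative function and option names, not the template's actual implementation:

```javascript
// A small retry helper with exponential backoff for flaky API calls
// (a sketch; `withRetry` and its option names are illustrative).
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn(); // success: return immediately
    } catch (err) {
      lastError = err;
      if (attempt === retries) break; // out of attempts
      // Back off: 500 ms, 1000 ms, 2000 ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Example: a call that fails twice, then succeeds on the third attempt.
let calls = 0;
const flakyCall = async () => {
  calls += 1;
  if (calls < 3) throw new Error('transient failure');
  return 'ok';
};

withRetry(flakyCall, { baseDelayMs: 10 }).then((result) => console.log(result)); // → "ok"
```

Wrap each outbound HTTP or Bedrock call in a Code node with a helper like this, or use the built-in retry settings on the node itself if they cover your case.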
by Samir Saci
Tags: EU News, RSS, AI Classifier, Data Table, Email Digest, Automation, n8n

## Context

Hi! I'm Samir, a Supply Chain Engineer and Data Scientist based in Paris, and founder of the startup LogiGreen. This workflow helps me closely follow EU sustainability news that impacts my business.

> Use this assistant to automatically curate and summarize EU news tailored to the topics that matter most to you.

By default, the workflow filters sustainability-related news, but you can easily adapt the topic description (e.g. AI, trade, digital, energy) using the Topic Config edit node.

📬 For business inquiries, you can find me on LinkedIn.

## Who is this template for?

This template is designed for:

- **Policy analysts and researchers** who want to track EU updates on a specific topic
- **Consultants and sustainability teams** who need a daily view of relevant announcements
- **Business owners and startup founders**, like myself, who need to adapt their business strategies to recent news

## What does this workflow do?

This workflow acts as an AI-powered EU news filter and digest generator.
1. Fetches the latest press releases from the Council of the EU RSS feed every morning at 09:00
2. Filters out all news already recorded, to avoid duplicates
3. Uses an AI classifier (OpenAI) to decide whether each article is relevant to your topic
4. Stores only the relevant items in an n8n Data Table
5. Generates a formatted HTML newsletter grouping the day's relevant articles
6. Sends the digest by email using the Gmail node
7. Generates an audio summary with ElevenLabs and sends it via Telegram

Here's an example of the generated email:

## 🎥 Tutorial

A complete tutorial (with explanations of every node) is available on YouTube.

## Next Steps

🗒️ Inside the workflow:

- Replace the Data Table reference with your own
- Set up your Gmail, OpenAI, and ElevenLabs credentials
- Update the recipient email address in the Gmail node
- Customize the HTML digest (colors, logo, style) in the Code node
- Adjust the schedule time if necessary

Submitted: 18 November 2025 · Template designed with n8n version 1.116.2
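The duplicate filtering in step 2 can be sketched as an n8n Code-node snippet. Field names such as `link` are assumptions for illustration, not the template's actual Data Table schema:

```javascript
// Keep only RSS items whose link is not already stored in the Data Table
// (a sketch; the `link` field name is an assumption).
function filterNewItems(rssItems, storedRows) {
  const seen = new Set(storedRows.map((row) => row.link));
  return rssItems.filter((item) => !seen.has(item.link));
}

const stored = [{ link: 'https://example.eu/press/1' }];
const incoming = [
  { link: 'https://example.eu/press/1', title: 'Old item' },
  { link: 'https://example.eu/press/2', title: 'New item' },
];
console.log(filterNewItems(incoming, stored)); // only the second item survives
```

A set lookup keeps the check O(1) per item, which matters once the Data Table accumulates months of press releases.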
by Nirav Gajera
# 🍕 AI-Powered Restaurant Order & Notification System

A complete n8n workflow for automated ordering and customer updates. This workflow provides an end-to-end solution for small restaurants: a Telegram Customer Bot for placing orders and an Automated Notification System that updates customers based on a Google Sheet.

## 📖 Description

This system eliminates the need for manual order taking and status updates. It features:

- **AI Order Bot**: customers chat with a Telegram bot to view the menu and place orders. An AI agent (Claude Haiku) parses natural language (e.g., "2 pizza + 1 coke") into structured data.
- **Order Management**: orders are saved automatically to Google Sheets for staff to manage.
- **Real-Time Notifications**: as staff change the status in the sheet (e.g., to Preparing or Ready), n8n instantly notifies the customer via Telegram.

### Key Bot Commands

| Command | Description |
| :--- | :--- |
| /start | Welcome message and instructions |
| /menu | View today's food and drink offerings |
| /help | See all available commands |
| STATUS [Queue #] | Check the live status of an order |
| CANCEL [Queue #] | Cancel an order (only if Pending) |
| /myorders | View your last 5 orders |

## 🛠 Setup Requirements

### 1. Google Sheets Configuration

Your spreadsheet acts as your admin dashboard.
Ensure Row 1 has these exact headers, in order:

| Column | Header Name | Description |
| :---: | :--- | :--- |
| A | Queue Number | Auto-generated by the bot (e.g., #4582) |
| B | Chat ID | Customer's Telegram ID, captured automatically |
| C | Name | Customer's first name |
| D | Order | Items ordered, parsed and cleaned by AI |
| E | Status | Dropdown: Pending, Preparing, Ready, Completed, Cancelled |
| F | Order Time | Timestamp of when the order was placed |
| G | Order Date | Date of the order |
| H | Last Status Sent | Internal: tracks the last notification sent, to prevent duplicates |

> ⚠️ Important: set up a Data Validation dropdown on column E with the values Pending, Preparing, Ready, Completed, Cancelled. This is how staff update order status.

> ⚠️ Important: in the Read All Rows node, go to Options → Output Row Number and ensure it is enabled. The workflow uses `row_number` to write back to the correct cell.

### 2. n8n Credential Configuration

You need three credentials set up in n8n before activating:

| Credential Type | Where Used | Notes |
| :--- | :--- | :--- |
| Anthropic API | Claude Haiku node | Required for AI order parsing |
| Google Sheets OAuth2 API | All Google Sheets nodes | Connect your Google account |
| Telegram Bot API | All Telegram nodes | Use your bot token from @BotFather |

Steps:

1. Go to n8n → Settings → Credentials → Add Credential
2. Add each credential type above
3. After importing the workflow, open each node and select the correct credential
### 3. Workflow Import Steps

1. Copy the workflow JSON
2. In n8n, click + → Import from JSON → paste and confirm
3. Connect all credentials in each node
4. Ensure Output Row Number is enabled in the Read All Rows node options
5. Pre-fill column H (Last Status Sent) with the current Status for all existing rows, to prevent old rows from firing notifications on the first run
6. Turn the workflow Active

## 🏗 How It Works

### Phase 1: The Customer Bot (Workflow 1)

```
Customer texts bot
        ↓
Route Message — detects command type
        ↓
┌─────────────────────────────────────┐
│ /start    → Welcome message         │
│ /help     → Help guide              │
│ /menu     → Today's menu            │
│ /myorders → Last 5 orders           │
│ STATUS    → Live order status       │
│ CANCEL    → Cancel if Pending only  │
│ [order]   → AI parses → saves       │
└─────────────────────────────────────┘
        ↓
Order saved to Google Sheet (Status = Pending)
Customer receives queue number + wait time
```

### Phase 2: The Staff Notification System (Workflow 2)

```
Every 1 minute — Schedule Trigger fires
        ↓
Read ALL rows from Google Sheet
        ↓
For EACH row independently (runOnceForEachItem):
  - Skip if no Queue Number or Chat ID
  - Skip if Status = Pending
  - Skip if Status = Last Status Sent (already notified)
  - ✅ Send notification if Status changed
        ↓
Send Telegram message to that customer only
        ↓
Write new Status into column H (Last Status Sent)
→ Prevents a duplicate notification the next minute
```

### Status Flow & Customer Messages

| Staff sets Status to | Customer receives |
| :--- | :--- |
| Preparing | 👨‍🍳 "Your order is being Prepared! We'll notify you when it's ready." |
| Ready | 🍕 "Your order is READY for collection! Please collect from the counter." |
| Completed | ✅ "Order marked as Completed. Thank you for dining with us!" |
| Cancelled | ❌ "Your order has been Cancelled. We apologise for the inconvenience." |

## 🔒 Order Cancellation Rules

Customers can only cancel orders in Pending status.
All other states are protected:

| Current Status | Customer tries CANCEL | Response |
| :--- | :--- | :--- |
| Pending | CANCEL 1234 | ✅ Cancelled successfully |
| Preparing | CANCEL 1234 | ⚠️ Cannot cancel — being prepared |
| Ready | CANCEL 1234 | ⚠️ Already ready — please collect |
| Completed | CANCEL 1234 | ⚠️ Already completed |
| Cancelled | CANCEL 1234 | ⚠️ Already cancelled |
| Someone else's order | CANCEL 1234 | ❌ You can only cancel your own orders |

## 📦 Required Credentials Summary

| Credential | Provider | Free Tier |
| :--- | :--- | :--- |
| Anthropic API | anthropic.com | Paid — ~$5 minimum deposit |
| Google Sheets OAuth2 API | Google Cloud Console | Free |
| Google Sheets Trigger OAuth2 API | Google Cloud Console | Free |
| Telegram Bot API | @BotFather on Telegram | Free forever |

## 🧪 Test Scenarios

Run these in order to verify the full system:

1. /start → should receive the welcome message
2. /menu → should see the menu with prices
3. Type `2 pizza + 1 coke` → should get a queue number
4. STATUS [queue] → should show ⏳ Pending
5. In the sheet, change Status to Preparing → within 1 minute, the customer gets the 👨‍🍳 message
6. In the sheet, change Status to Ready → the customer gets the 🍕 message
7. STATUS [queue] → should now show 🍕 Ready
8. CANCEL [queue] → should say "already ready, please collect"
9. In the sheet, change Status to Completed → the customer gets the ✅ message
10. /myorders → should show the order history with final statuses

## ⚠️ Known Limitations

- **Notification delay:** up to 1 minute between staff updating the sheet and the customer receiving the message (due to the polling interval)
- **Column H required:** the Last Status Sent column must exist in your sheet.
  Without it, every row will fire a notification on every poll
- **Anthropic API cost:** Claude Haiku is not free — very low cost (~$0.25 per million tokens) but requires a funded account
- **Google Sheets trigger limitation:** the Google Sheets Trigger cannot detect which specific row changed, which is why a Schedule Trigger with row comparison is used instead

## 🗂 File Structure

- `restaurant_WITH_STICKIES.json` — the complete workflow (W1 + W2) with sticky notes
- `restaurant_workflow_docs.md` — this documentation file

Built with n8n • Claude Haiku AI • Google Sheets • Telegram Bot API
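The per-row check in Phase 2 above can be sketched as an n8n Code-node snippet. Field names follow the column headers listed earlier; the actual node logic in the template may differ:

```javascript
// Decide whether a sheet row needs a customer notification
// (a sketch of the Phase 2 skip rules; field names follow the sheet headers).
function shouldNotify(row) {
  if (!row['Queue Number'] || !row['Chat ID']) return false;   // incomplete row
  if (row['Status'] === 'Pending') return false;               // staff has not acted yet
  if (row['Status'] === row['Last Status Sent']) return false; // already notified
  return true; // status changed: send exactly one message
}

const row = { 'Queue Number': '#4582', 'Chat ID': '123', Status: 'Ready', 'Last Status Sent': 'Preparing' };
console.log(shouldNotify(row)); // → true
```

Writing the new Status back into Last Status Sent after a successful send is what makes this check return false on the next poll, preventing duplicate messages.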
by DataForSEO
This weekly workflow automatically discovers new high-volume, ranked keywords for your domain on Google, without manual SERP monitoring.

On each run, the workflow fetches the latest ranking and search volume data using the DataForSEO Labs API and stores a fresh snapshot in Google Sheets. It then compares this data with the previous run to identify any new keywords your domain has started ranking for, focusing on queries with a search volume above 1,000. All newly ranked keywords that match this rule are added to a dedicated Google Sheet, along with their ranking position and search volume, creating a growing historical log you can use to analyze gains over time. Once new terms are identified, the workflow creates tasks in Asana to help your team act on them quickly, and sends you a Slack summary highlighting the latest changes.

## Who's it for

SEO professionals, marketers, and content teams who want an automated way to discover newly ranked, high-volume Google keywords and turn organic ranking gains into actionable content or optimization tasks.

## What it does

This workflow automatically detects when your domain starts ranking for new high-volume keywords on Google, records them in Google Sheets, creates related tasks in Asana, and sends a weekly summary via Slack.

## How it works

1. Runs on a predefined schedule (default: once a week)
2. Reads your keywords and target domains from Google Sheets
3. Extracts the latest Google results and keyword metrics via the DataForSEO API
4. Compares the current data with the previous snapshot
5. Logs newly ranked keywords to a dedicated Google Sheet
6. Creates follow-up tasks in Asana for the content team
7. Sends a Slack summary with the key changes

## Requirements

- DataForSEO account and API credentials
- Google Sheets spreadsheet with your keywords, following the required column structure (as in the example)
- Google Sheets spreadsheet with your target domains, following the required column structure (as in the example)
- Asana account
- Slack account

## Customization

You can easily tailor this workflow to your needs by adjusting the run schedule, changing the minimum search volume threshold, exporting results to other tools (such as Looker Studio or BigQuery), and customizing the content of the Asana task or Slack message to match your team's workflow.
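The snapshot comparison at the heart of this workflow (step 4 above) can be sketched as an n8n Code-node snippet. The field names `keyword` and `search_volume` are assumptions for illustration, not the template's actual sheet columns:

```javascript
// Find keywords present in the current snapshot but not the previous one,
// keeping only queries above the volume threshold (a sketch; field names assumed).
function findNewHighVolumeKeywords(current, previous, minVolume = 1000) {
  const known = new Set(previous.map((row) => row.keyword));
  return current.filter((row) => !known.has(row.keyword) && row.search_volume > minVolume);
}

const previous = [{ keyword: 'cloud backup', search_volume: 5400, position: 12 }];
const current = [
  { keyword: 'cloud backup', search_volume: 5400, position: 9 },       // already known
  { keyword: 'disaster recovery', search_volume: 2900, position: 18 }, // new + above threshold
  { keyword: 'byo backup', search_volume: 300, position: 4 },          // new but below threshold
];
console.log(findNewHighVolumeKeywords(current, previous)); // only "disaster recovery" survives
```

Raising or lowering `minVolume` is the one-line change behind the "minimum search volume threshold" customization mentioned above.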
by Madame AI
# Generate YouTube thumbnails via Telegram using BrowserAct & Nano Banana Pro

This workflow acts as an AI-powered "viral architect" for YouTube creators. Simply send a video topic (e.g., "Kling 2.6") to your Telegram bot, and it will scrape top-performing competitor thumbnails, analyze their visual strategies using AI vision, and generate a new, scientifically optimized thumbnail concept and image for you.

## Target Audience

YouTubers, content creators, and social media managers looking to increase their click-through rate (CTR) with AI-designed thumbnails.

## How it works

1. **Analyze Intent**: the workflow receives a message via Telegram. An AI Agent classifies the input: is it casual chat or a request for a thumbnail?
2. **Research Competitors**: if a topic is detected, BrowserAct scrapes YouTube for the top-ranking videos on that subject.
3. **Visual Forensics**: an AI Vision Agent (using OpenRouter/GPT-4o) analyzes the scraped thumbnails to understand why they work (colors, composition, text).
4. **User Approval**: the bot sends a confirmation button to Telegram to ensure you want to proceed with image generation (saving credits).
5. **Creative Synthesis**: upon approval, a specialized Gemini Agent aggregates the research and crafts a high-fidelity prompt using psychological hooks and pop-culture references.
6. **Generate & Deliver**: Google Gemini renders the final 4K thumbnail image, which is sent back to your Telegram chat.

## How to set up

1. **Configure credentials**: connect your Telegram, BrowserAct, Google Sheets, OpenRouter, and Google Gemini accounts in n8n.
2. **Prepare BrowserAct**: ensure the Thumbnail Maker Bot template (or a YouTube search scraper) is saved in your BrowserAct account.
3. **Set up the Google Sheet**: create a Google Sheet to store analysis data. The workflow will dynamically create tabs, but you need a master sheet ID.
4. **Configure Telegram**: ensure your bot is created via BotFather and the API token is added to the Telegram credentials.
5. **Activate**: turn on the workflow.
## Requirements

- **BrowserAct** account with the **Thumbnail Maker Bot** template
- **Telegram** account (bot token)
- **Google Sheets** account
- **OpenRouter** account (for vision capabilities)
- **Google Gemini** account

### Google Sheets Requirements

- **Spreadsheet name:** Thumbnail Data base
- **Sheet name:** Database
- **Required columns:** Keyword, Current Workflow Sheet ID

## How to customize the workflow

- **Change the art style**: modify the system prompt in the Generate Image Prompt agent to enforce a specific aesthetic (e.g., "3D Render," "Anime," "Minimalist").
- **Adjust the research depth**: change the number of videos scraped by BrowserAct to analyze more or fewer competitors.
- **Add a text overlay**: add an image processing node (such as Cloudinary or Bannerbear) to automatically overlay the generated text onto the image for a final polish.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video

Automate Viral Thumbnails with n8n and AI Agents (Step-by-Step)
by Madame AI
# Monitor real estate filings via Telegram, BrowserAct, and Gemini

This workflow transforms your Telegram bot into an automated real estate monitoring tool. Send a message like "Check filings for the last 5 days," and the bot will scrape the official county clerk site for "Lis Pendens" (pre-foreclosure) filings, format the messy legal data into a readable report, and deliver it directly to your chat.

## Target Audience

Real estate investors, wholesalers, and agents looking for pre-foreclosure leads.

## How it works

1. **Understand the request**: the workflow receives your Telegram message. An AI Agent classifies it: are you chatting, or asking for real estate data?
2. **Calculate dates**: if you don't specify a date range, the workflow automatically calculates a default window (e.g., today minus 5 days) to search for recent filings.
3. **Scrape records**: BrowserAct executes a background task that searches the public records site using the calculated dates and scrapes all matching "Lis Pendens" filings.
4. **Format the data**: a second AI Agent processes the raw legal text. It cleans up party names, extracts file numbers, and formats everything into a clean HTML list with download links.
5. **Deliver the report**: the bot splits the report into multiple messages (if necessary) and sends them to your Telegram chat.

## How to set up

1. **Configure credentials**: connect your Telegram, BrowserAct, Google Gemini, and OpenRouter accounts in n8n.
2. **Prepare BrowserAct**: ensure the Texas Foreclosure Leads template is saved in your BrowserAct account.
3. **Configure Telegram**: create a bot via BotFather and add the API token to your Telegram credentials.
4. **Activate**: turn on the workflow.
5. **Test**: send "Check filings" to your bot to start the search.

## Requirements

- **BrowserAct** account with the **Texas Foreclosure Leads** template
- **Telegram** account (bot token)
- **Google Gemini** and **OpenRouter** accounts

## How to customize the workflow

- **Change the search query**: modify the BrowserAct template to search for different document types (e.g., "Divorce Decrees" or "Probate").
- **Adjust the date range**: update the From_Date node to search a longer period (e.g., -30 days).
- **Filter the results**: add logic to the Generate response agent to show only filings from specific banks or attorneys.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video

⚖️ How to Scrape Lis Pendens Records Automatically (Texas Real Estate)
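The default date window from step 2 of "How it works" can be sketched as a small helper. This is a sketch only; the template's actual From_Date node may compute the window differently:

```javascript
// Compute a default "from" date of today minus N days, formatted YYYY-MM-DD
// (a sketch of the date-window step; the real From_Date node may differ).
function defaultFromDate(daysBack, now = new Date()) {
  const d = new Date(now);
  d.setDate(d.getDate() - daysBack);
  return d.toISOString().slice(0, 10);
}

console.log(defaultFromDate(5, new Date('2024-03-10T12:00:00Z'))); // → "2024-03-05"
```

Changing `daysBack` from 5 to 30 is the "adjust the date range" customization described above.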
by Pawan
## Who is this for?

This workflow is designed for growth agencies, SaaS founders, and sales teams who want to move beyond static lead forms. It is ideal for those who need a "living" system that not only captures leads but also provides immediate value through AI-generated strategies and evolves based on performance data.

## How it works

The template operates in two distinct phases:

1. **The Lead Engine**: a user interacts with a chat bubble. A Gemini-powered agent conversationally qualifies the lead (name, industry, budget). A custom JavaScript node ensures data integrity before an "AI Council" node generates three industry-specific growth tactics. High-value leads are then routed to Slack and logged in Google Sheets.
2. **The Self-Optimization Loop**: a scheduled trigger audits the lead data in Google Sheets daily. It uses Gemini to identify friction points and sends a "System Audit" report to Slack, suggesting prompt improvements to increase conversion rates.

## How to set up

1. **Credentials**: connect your Google Gemini (API key), Google Sheets (OAuth2), and Slack (OAuth2) accounts.
2. **Google Sheets**: create a spreadsheet with the headers Lead, Suggestion, and Status. Copy the spreadsheet ID into the Google Sheets nodes.
3. **Slack**: invite your n8n bot to a specific channel (e.g., /invite @n8n) and select that channel in the Slack nodes.
4. **Memory**: ensure the Window Buffer Memory node is connected to the AI Agent to maintain conversation state.

## Requirements

- Google AI (Gemini) API key
- Google Sheets for data logging
- Slack for real-time notifications
- n8n version 1.0+ (supporting AI Agent nodes)

## How to customize

- **Scoring logic**: adjust the Scoring Logic code node to change what constitutes a "Hot" lead.
- **AI strategist**: modify the prompt in the AI Council: Strategist node to provide different types of value (e.g., free audits instead of growth tactics).
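A Scoring Logic node like the one mentioned above could look something like this. The thresholds, field names, and industry list are purely illustrative assumptions, not the template's actual rules:

```javascript
// Classify a lead as "Hot" or "Warm" from budget and industry fit
// (illustrative thresholds and field names; adjust to your own qualification rules).
function scoreLead(lead, { hotBudget = 5000, targetIndustries = ['SaaS', 'E-commerce'] } = {}) {
  const budgetOk = Number(lead.budget) >= hotBudget;
  const industryOk = targetIndustries.includes(lead.industry);
  return budgetOk && industryOk ? 'Hot' : 'Warm';
}

console.log(scoreLead({ name: 'Acme', industry: 'SaaS', budget: 8000 })); // → "Hot"
console.log(scoreLead({ name: 'Bob', industry: 'Retail', budget: 8000 })); // → "Warm"
```

Keeping the thresholds in an options object makes the "what constitutes a Hot lead" customization a one-line change at the call site.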
by Fahmi Fahreza
# AI Research Assistant Using Gemini AI and Decodo

Sign up for Decodo HERE for a discount.

This workflow transforms your Telegram bot into a smart academic research assistant powered by Gemini AI and Decodo. It analyzes queries, interprets URLs, scrapes scholarly data, and returns concise summaries of research papers directly in chat.

## Who's it for?

Researchers, students, and AI enthusiasts who want to search and summarize academic content via Telegram, using Google Scholar and arXiv.

## How it works

1. The Telegram bot captures text, voice, or image messages.
2. Gemini models interpret academic URLs and user intent.
3. Decodo extracts paper details such as titles, abstracts, and publication info.
4. The AI agent summarizes the results and delivers them as text, or as a file if the summary is too long.

## How to set up

1. Add your Telegram bot credentials in the Start Telegram Bot node.
2. Connect your Google Gemini and Decodo API credentials.
3. Replace the {{INPUT_SEARCH_URL_INSIGHTS}} placeholder in the Research Summary Agent's system message with your search URL insights (or use the pinned example).
4. Test by sending a text, image, or voice message to your bot.
5. Activate the workflow to run in real time.
by Khairul Muhtadin
Stop wasting hours watching long videos. This n8n workflow acts as your personal "TL;DW" (Too Long; Didn't Watch) assistant. It automatically pulls YouTube transcripts using Decodo, analyzes them with Google Gemini, and sends a detailed summary straight to your Telegram.

## Why You Need This

- **Save time:** turn a 2-hour video into a 5-minute read (95% faster).
- **Don't miss a thing:** captures key points, chapters, tools mentioned, and quotes that you might miss while skimming.
- **Instant results:** get a structured summary in Telegram within 30-60 seconds.
- **Multi-language:** works with any video language that has YouTube captions.

## Who Is This For?

- **Creators & marketers:** spy on competitor strategies and extract tools without watching endless footage.
- **Students:** turn lecture recordings into instant study notes.
- **Busy pros:** digest conference talks and webinars on the go.

## How It Works

1. **Send a link**: you message a YouTube link to your Telegram bot.
2. **Scrape**: the bot uses the Decodo API to grab the video transcript and metadata (views, chapters, etc.).
3. **Analyze**: Google Gemini reads the text and writes a structured summary (overview, takeaways, tools).
4. **Deliver**: you receive the formatted summary in chat.

## Setup Guide

### What You Need

- **n8n instance** (to run the workflow)
- **Telegram bot token** (free via @BotFather)
- **Decodo Scraper API key** (for YouTube data - get it here)
- **Google Gemini API key** (for the AI - get it here)

### Quick Installation

1. **Import**: load the JSON file into your n8n instance.
2. **Credentials**: add your API keys for Telegram, Decodo, and Google Gemini in the n8n credentials section.
3. **Configure**: in the "Alert Admin" node, set the chatId to your Telegram user ID (find it via @userinfobot).
4. **(Optional)** Change the languageCode in the Config node if you want non-English transcripts.
5. **Test**: send a YouTube link to your bot. You should see a "Processing..." message followed by your summary!
## Troubleshooting & Tips

- **"Not a YouTube URL":** make sure you are sending a standard youtube.com or youtu.be link.
- **No transcript:** the video must have captions (auto-generated or manual) for this to work.
- **Customization:** you can edit the AI prompt in the "Generate TLDR" node to change how the summary looks (e.g., "Make it funny" or "Focus on technical details").

Created by: Khaisa Studio · Category: AI-Powered Automation · Tags: YouTube, AI, Telegram, Summarization, Decodo, Gemini

Need custom workflows? Contact us.

Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
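The "Not a YouTube URL" check described above can be sketched as a small validation helper covering the two standard link shapes. This is a sketch; the template's actual validation node may differ:

```javascript
// Extract a video ID from common YouTube URL shapes, or return null
// (a sketch of the URL check; the template's validation node may differ).
function extractVideoId(url) {
  try {
    const u = new URL(url);
    if (u.hostname === 'youtu.be') return u.pathname.slice(1) || null;
    if (u.hostname.endsWith('youtube.com')) return u.searchParams.get('v');
    return null;
  } catch {
    return null; // not a URL at all
  }
}

console.log(extractVideoId('https://www.youtube.com/watch?v=dQw4w9WgXcQ')); // → "dQw4w9WgXcQ"
console.log(extractVideoId('https://youtu.be/dQw4w9WgXcQ'));                // → "dQw4w9WgXcQ"
console.log(extractVideoId('not a link'));                                  // → null
```

Returning the ID (rather than just true/false) lets a downstream node pass it straight to the transcript scraper.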
by Aleks Sidorecs
## Who's it for

Supply chain and logistics specialists, freight forwarders, importers/exporters, and risk teams who need early warning when maritime, carrier, geopolitical, or weather events threaten their cargo or trade lanes.

If you've ever found out about a Red Sea diversion, an OFAC sanctions package, a Panama Canal restriction, or a Gulf hurricane after it hit your bookings, this template fixes that. It turns 12 free public feeds into a single Claude-classified Telegram alert stream, scoped to the chokepoints, lanes, and risk dimensions you actually care about.

## How it works

The workflow runs every 6 hours and polls 12 free public RSS feeds across 4 supply chain risk dimensions:

- **Maritime**: gCaptain, Maritime Executive, Safety4Sea, Splash247
- **Carrier**: The Loadstar, FreightWaves, Container News
- **Geopolitical**: Google News searches for sanctions, tariffs, OFAC, export bans
- **Weather**: GDACS (Global Disaster Alerts), NOAA Hurricane Center, USGS Significant Earthquakes

Each new item is deduped against past runs using n8n workflow static data; no database is needed. Then Claude Sonnet 4.5 (the Sonar Analyst) classifies it into structured JSON with the primary risk dimension, a severity of 1-5, the affected region, geo node, trade lanes affected, source reliability, and a recommended action (monitor / alert / escalate). Five few-shot examples cover one case from each dimension plus an "other" fallback.

Items at or above your severity threshold are reformatted by a second Claude pass (the Sonar Reporter) into a scannable HTML Telegram alert with dimension-specific emoji, severity color codes, and a direct link to the source. Items below the threshold end silently.

## How to set up

1. Connect your Anthropic API credential to both Claude Sonnet sub-nodes; the same credential can be reused for both the Analyst and the Reporter.
2. Connect your Telegram Bot credential (token from BotFather) and set the TELEGRAM_CHAT_ID environment variable in n8n to your operator chat ID.
The Send Telegram Alert node already references this env var, so you only need to set the variable once.
Open the Configuration node and adjust:
severity_threshold (1-5) - minimum severity to alert on (default: 3)
watched_dimensions - comma-separated dimensions: maritime, carrier, geopolitical, weather
watched_regions - geographic focus list (defaults cover the major chokepoints + global lanes)
post_all_clear_digest - set to true if you want a digest even when nothing critical hit
Click Execute Workflow once to test. You'll see items flow through the canvas, the Analyst classify them, and the IF gate filter them. If a result is below the threshold you'll see a green checkmark on End (Below Threshold) - that's expected.
Toggle Active in the top right when you're happy. The schedule trigger will fire every 6 hours.
Requirements
Anthropic API key** (Claude Sonnet 4.5 recommended - the prompts are tuned for it)
Telegram Bot token** from BotFather
A Telegram chat ID** for receiving alerts (your personal chat, a group, or a channel)
That's it. No databases, no Docker, no scraping, no paid APIs. The dedupe Code node uses n8n workflow static data, so nothing needs to be installed beyond a working n8n instance.
How to customize the workflow
Add or remove RSS feeds in the lane that matches your dimension. Each lane has its own visual sticky note so you can see at a glance which feeds belong where. To add a feed, copy any existing RSS Feed Read node, change the URL, and connect it to the Merge node.
Change the schedule in the Every 6 Hours node - for high-stakes operations you might want every 2 hours; for lower-touch monitoring, daily.
Edit the Sonar Analyst system prompt (in the AI Agent node) to add new risk categories, region keywords, or industry-specific few-shot examples. The current schema supports adding new crisis_type enum values and new chokepoints without breaking the IF gate.
Swap Telegram for Slack, Discord, or email by replacing the final Send Telegram Alert node. The Sonar Reporter outputs HTML, which renders cleanly in any of these channels.
Tighten the dedupe cap (currently 40 items per run, 800 cached URLs) if your LLM cost is a concern. The cap lives at the top of the Dedupe + Tag Source Code node.
Localize the alerts by changing the alert_language field in the Configuration node - the Reporter prompt respects it (currently supports english, spanish, french, german).
The workflow ships as a single importable JSON, 29 nodes total, with one yellow main sticky note explaining the whole template and four neutral lane sticky notes labeling each risk dimension lane. Zero credentials are bundled - you wire your own.
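The dedupe logic described above can be sketched as a plain function, in the shape an n8n Code node might take. The 40-item and 800-URL caps come from the template; the function name and cache shape here are illustrative assumptions, not the template's actual code (in n8n the cache would live in `$getWorkflowStaticData('global')` rather than being passed in):

```javascript
// Sketch of the Dedupe + Tag Source logic: forward only unseen URLs,
// cap how many new items pass per run, and cap the cached-URL memory.
const MAX_ITEMS_PER_RUN = 40;  // new items forwarded to the Analyst per run
const MAX_CACHED_URLS = 800;   // remembered URLs before old ones are evicted

function dedupe(items, cache) {
  const seen = new Set(cache.urls || []);
  const fresh = items
    .filter((item) => !seen.has(item.link))
    .slice(0, MAX_ITEMS_PER_RUN);

  // Remember the newly forwarded URLs; oldest entries fall off the front.
  const updatedUrls = [...(cache.urls || []), ...fresh.map((i) => i.link)]
    .slice(-MAX_CACHED_URLS);

  return { fresh, cache: { urls: updatedUrls } };
}
```

Lowering either cap reduces how many items reach the Claude Analyst per run, which is the cost lever the template points at.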
by octik5
🤖 This n8n workflow automatically parses news articles from a webpage, enhances them with AI, and publishes them to a Telegram channel with a watermarked image. Unlike an RSS-based setup, this workflow directly fetches and processes content from any specified webpage.
Use Cases
Automatically post new website articles to your Telegram channel.
Use AI to rewrite or summarize text for better readability.
Add branded watermarks to images and keep your channel visually consistent.
How It Works
Schedule Trigger: Runs the workflow on a custom schedule.
Fetch Web Page: Retrieves the HTML content of your chosen website.
Extract Links: Parses article links from the HTML source.
Check & Update Google Sheet: Skips already processed links and records new ones.
Fetch & Clean Article: Retrieves, extracts, and formats the article text.
AI Text Customization: Uses an AI agent to enhance the text.
Image Watermarking: Fetches the article image and applies a watermark.
Telegram Publishing: Posts the final image and AI-enhanced text to your channel.
Setup Steps
Google Sheet:** Create and share a sheet to track processed links.
Web URL:** Enter your target webpage in the HTTP Request node.
AI Agent:** Choose a model and prompt for text customization (e.g., OpenRouter or Gemini).
Telegram Bot:** Add your bot token and chat ID.
Run & Test:** Execute once manually, then let it run on schedule.
Tips
AI usage may incur costs depending on the model provider.
Some AI models can be geo-restricted - check availability if you get a "model not found" error.
Customize the watermark style (font, color, size) to match your branding.
Use Telegram Markdown for rich message formatting.
✅ Key Advantage: No RSS required - the workflow directly parses websites, enhances content with AI, and automates publishing to Telegram.
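The Extract Links step above can be approximated with a small parsing function. This is a regex-based sketch only - real pages are often better handled with n8n's HTML node and CSS selectors - and the `/news/` path filter is a hypothetical example you would adjust to the target site:

```javascript
// Sketch of the Extract Links step: pull href values out of raw HTML
// and keep only same-site article links. The '/news/' filter is an
// illustrative assumption, not part of the original workflow.
function extractArticleLinks(html, baseUrl) {
  const links = new Set();  // Set dedupes repeated links on the page
  const re = /<a\b[^>]*\bhref="([^"#]+)"/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    const url = new URL(match[1], baseUrl).href;  // resolve relative hrefs
    if (url.startsWith(baseUrl) && url.includes('/news/')) links.add(url);
  }
  return [...links];  // ready to check against the Google Sheet of processed links
}
```

The returned list is what the Check & Update Google Sheet step would compare against previously recorded URLs.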
by Oussama
Production-ready solution for controlling AI agent usage and preventing abuse while managing costs.
🎯 Problem Solved
Unlimited AI interactions → excessive API costs
Service abuse → uncontrolled resource consumption
No built-in limits → need for usage quotas
✅ Solution Overview
Two-part system:
Main Flow: user interaction tracking + AI responses
Reset Flow: automated counter resets
🔄 How It Works
User Message → Track Counter → Check Limit → Allow/Block → AI Response
🛠️ Core Components
Main Workflow
📱 Telegram Trigger - receives user messages
📊 Google Sheets Counter - tracks messages per user
🔀 Switch Logic - checks limits (default: 3 messages)
🤖 AI Agent - processes allowed interactions
💬 Smart Responses - delivers AI answers or limit warnings
Auto-Reset System
⏰ Schedule Trigger - runs at a configurable interval
🔄 Bulk Counter Reset - resets all users to 0
⚙️ Configuration
Message Limits - modify the Switch node conditions:
> 3 messages → block silently
= 3 messages → send limit warning
< 3 messages → allow AI response
Reset Schedules
Testing: every 1 minute
Hourly: 0 * * * *
Daily: 0 0 * * *
Weekly: 0 0 * * 0
📋 Setup Requirements
Credentials needed:
🤖 Telegram Bot Token
📊 Google Sheets API
🧠 AI Model
Google Sheets structure:
Column A: User ID (Telegram chat.id)
Column B: Message Counter
🎯 Perfect For
💰 Cost Control - prevent runaway API costs
🛡️ Demo/Trial Bots - limited interactions
🏢 Customer Service - usage quotas
🎓 Educational Bots - daily limits
🚫 Anti-Abuse - fair usage policies
🚀 Key Benefits
✅ Cost Management - control AI API expenses
✅ Fair Access - equal usage for all users
✅ Production Ready - robust error handling
✅ Flexible Limits - easy adjustment
✅ Auto-Reset - no manual intervention
✅ User-Friendly - clear limit messages
📝 Quick Customization
Adjust Limits: change Switch node values
Reset Timing: modify Schedule Trigger
Custom Messages: edit Telegram response nodes
User Tiers: add columns to Google Sheets
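The Switch node conditions above can be expressed as a tiny decision function. This is a sketch under the counter semantics implied by the template (the counter is incremented before the check, so a counter equal to the limit means this message hits the limit); the function name is illustrative, not from the workflow:

```javascript
// Sketch of the Switch logic: given a user's message counter
// (read from the Google Sheet), decide how to handle the new message.
function decideAction(counter, limit = 3) {
  if (counter > limit) return 'block';   // over limit: ignore silently
  if (counter === limit) return 'warn';  // at limit: send the limit warning
  return 'respond';                      // under limit: run the AI agent
}
```

Raising the default limit or adding per-user limits (the "User Tiers" column idea) only changes the `limit` value looked up before this check.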