by Jay Emp0
## Overview
Fetch multiple Google Analytics (GA4) metrics daily, post them to Discord, and update the previous days' entries as GA data finalizes over seven days.

## Benefits
- Automates daily traffic reporting
- Maintains a single message per day, avoiding channel clutter
- Provides near-real-time updates by editing prior messages

## Use Case
Teams tracking website performance via Discord (or any chat tool) without manual copy-paste: marketing managers, community moderators, growth hackers. If your manager asks you for a daily marketing report every morning, you can now automate it.

## Notes
- The Google Analytics node in n8n does not provide real-time data; GA keeps updating previous values for the next 7 days.
- The Discord node in n8n cannot update an existing message by message ID, so we use the Discord API directly for this.
- Most businesses use multiple Google Analytics properties across their digital platforms.

## Core Logic
1. Schedule Trigger fires once a day.
2. Google Analytics node retrieves metrics for the date range (past 7 days).
3. Aggregate node collates all records.
4. Discord node fetches the last 10 messages in the broadcast channel.
5. Code node maps existing Discord messages to the Google Analytics data using the date fields.
6. For each GA record:
   - If no message exists → send a new POST to the Discord channel.
   - If a message exists and the metrics changed → send a PATCH to update the existing Discord message.
7. Batch loops + Wait nodes prevent rate limiting.

## Setup Instructions
1. Import the workflow JSON into n8n.
2. Follow the n8n guide to create a Google Analytics OAuth2 credential with access to all required GA accounts.
3. Follow the n8n guide to create a Discord OAuth2 credential for "Get Messages" operations.
4. Follow the Discord guide to create an HTTP Header Auth credential named "Discord-Bot" with header Key: Authorization, Value: Bot <your-bot-token>.
5. In the two Set nodes at the beginning of the flow, assign discord_channel_id and google_analytics_id.
- Get your Discord channel ID by sending a message in your Discord channel and choosing "Copy Message Link". The link has the form https://discord.com/channels/server_id/channel_id/message_id; you want the channel_id, which is the number in the middle.
- Find your Google Analytics ID by opening the Google Analytics dashboard, viewing the properties list in the top right, and copy-pasting that number into the flow.
- Adjust the Schedule Trigger times to your preferred report hour.
- Activate the workflow.

## Customization
Replace the Discord HTTP Request nodes with Slack, ClickUp, WhatsApp, or Telegram integrations by swapping the POST/PATCH endpoints and authentication.
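The Code-node step that matches Discord messages to GA records can be sketched roughly as below. This is a minimal sketch, not the template's exact code: the `report_date`, `sessions`, and `users` field names and the message-content format are assumptions for illustration.

```javascript
// Index recent Discord messages by the ISO date string embedded in their content.
// Assumes each daily report message contains a date like "2024-05-01".
function indexMessagesByDate(messages) {
  const byDate = {};
  for (const msg of messages) {
    const match = msg.content.match(/\d{4}-\d{2}-\d{2}/);
    if (match) byDate[match[0]] = msg;
  }
  return byDate;
}

// For each GA record, decide whether to POST a new message, PATCH an
// existing one (metrics changed), or skip (nothing changed).
function planActions(gaRecords, messages) {
  const byDate = indexMessagesByDate(messages);
  return gaRecords.map((rec) => {
    const content =
      `Report for ${rec.report_date}: ${rec.sessions} sessions, ${rec.users} users`;
    const existing = byDate[rec.report_date];
    if (!existing) return { action: 'POST', content };
    if (existing.content !== content) {
      return { action: 'PATCH', messageId: existing.id, content };
    }
    return { action: 'SKIP' };
  });
}
```

The POST goes to `/channels/{channel_id}/messages`, while the PATCH targets `/channels/{channel_id}/messages/{messageId}` with the "Discord-Bot" header credential.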
by Airtop
# Extract Facebook Group Posts with Airtop

## Use Case
Extracting content from Facebook Groups allows community managers, marketers, and researchers to gather insights, monitor discussions, and collect engagement metrics efficiently. This automation streamlines the process of retrieving non-sponsored post data from group feeds.

## What This Automation Does
This automation extracts key post details from a Facebook Group feed using the following input parameters:
- **Facebook Group URL**: The URL of the Facebook Group feed you want to scrape.
- **Airtop Profile**: The name of your Airtop Profile authenticated to Facebook.

It returns up to 5 non-sponsored posts with the following attributes for each:
- Post text
- Post URL
- Page/profile URL
- Timestamp
- Number of likes
- Number of shares
- Number of comments
- Page or profile details
- Post thumbnail

## How It Works
1. **Form Trigger**: Collects the Facebook Group URL and Airtop Profile via a form.
2. **Browser Automation**: Initiates a new browser session using Airtop, navigates to the provided Facebook Group feed, and uses an AI prompt to extract post data, including interaction metrics and profile information.
3. **Structured Output**: The results are returned in a defined JSON schema, ready for downstream use.

## Setup Requirements
- Airtop API Key — Free to generate.
- An Airtop Profile logged into Facebook.

## Next Steps
- **Integrate With Analytics Tools**: Feed the output into dashboards or analytics platforms to monitor community engagement.
- **Automate Alerts**: Trigger notifications for posts matching certain criteria (e.g., high engagement, keywords).
- **Combine With Comment Automation**: Extend this to reply to posts or engage with users using other Airtop automations.

Read more about how to extract posts from Facebook groups.
by Anna Bui
Automatically monitor LinkedIn posts from your community members and create AI-powered content digests for efficient social media curation.

This template is perfect for community managers, content creators, and social media teams who need to track LinkedIn activity from their network without spending hours manually checking profiles. It fetches recent posts, extracts key information, and creates digestible summaries using AI.

## Good to know
- **API costs apply** - LinkedIn API calls ($0.01-0.05 per profile check) and OpenAI processing ($0.001-0.01 per post)
- **Rate limiting included** - Built-in random delays prevent API throttling issues
- **Flexible scheduling** - Easy to switch from daily schedule to webhook triggers for real-time processing
- **Requires API setup** - Need RapidAPI access for LinkedIn data and OpenAI for content processing

## How it works
- **Daily profile scanning** - Automatically checks each LinkedIn profile in your Airtable for posts from yesterday
- **Smart data extraction** - Pulls post content, engagement metrics, author information, and timestamps
- **AI-powered summarization** - Creates 30-character previews of posts for quick content scanning
- **Duplicate prevention** - Checks existing records to avoid storing the same post multiple times
- **Structured storage** - Saves all processed data to Airtable with clean formatting and metadata
- **Batch processing** - Handles multiple profiles efficiently with proper error handling and delays

## How to use
1. **Set up Airtable base** - Create tables for LinkedIn profiles and processed posts using the provided structure
2. **Configure API credentials** - Add your RapidAPI LinkedIn access and OpenAI API key to n8n credentials
3. **Import LinkedIn profiles** - Add community members' LinkedIn URLs and URNs to your profiles table
4. **Test the workflow** - Run manually with a few profiles to ensure everything works correctly
5. **Activate schedule** - Enable daily automation or switch to webhook triggers for real-time processing

## Requirements
- **Airtable account** - For storing profile lists and managing processed posts with proper field structure
- **RapidAPI Professional Network Data API** - Access to LinkedIn post data (requires subscription)
- **OpenAI API account** - For intelligent content summarization and preview generation
- **LinkedIn profile URNs** - Properly formatted LinkedIn profile identifiers for API calls

## Customising this workflow
- **Change monitoring frequency** - Switch from daily to hourly checks or use webhook triggers for real-time updates
- **Expand data extraction** - Add company information, hashtag analysis, or engagement trending
- **Integrate notification systems** - Add Slack, email, or Discord alerts for high-engagement posts
- **Connect content tools** - Link to Buffer, Hootsuite, or other social media management platforms for direct publishing
- **Add filtering logic** - Set up conditions to only process posts with minimum engagement thresholds
- **Scale with multiple communities** - Duplicate workflow for different LinkedIn communities or industry segments
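The duplicate-prevention and preview steps can be sketched as below. This is a hedged sketch only: the `post_url` field name and the Airtable record shape (`fields`) are assumptions that should be matched to your own base schema.

```javascript
// Drop posts whose URL already exists in the processed-posts Airtable table.
function dedupePosts(fetchedPosts, existingRecords) {
  const seen = new Set(existingRecords.map((r) => r.fields.post_url));
  return fetchedPosts.filter((p) => !seen.has(p.post_url));
}

// Simple fallback for the ~30-character preview when no AI summary is needed.
function previewOf(text, maxLen = 30) {
  return text.length <= maxLen ? text : text.slice(0, maxLen - 1) + '…';
}
```

Running the dedupe before the OpenAI step also keeps API costs down, since already-stored posts never reach the summarization node.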
by Gleb D
This n8n workflow automates the enrichment of a company list by discovering and extracting each company's official LinkedIn URL, using Bright Data's search capabilities and Google Gemini AI for HTML parsing and result interpretation.

## Who is this template for?
This workflow is ideal for sales, business development, and data research professionals who need to collect official LinkedIn company profiles for multiple organizations, starting from a list of company names in Google Sheets. It's especially useful for teams who want to automate sourcing LinkedIn URLs, enrich their prospect database, or validate company data at scale.

## How it works
1. **Manual Trigger**: The workflow is started manually (useful for controlled batch runs and testing).
2. **Read Company Names**: Company names are loaded from a specified Google Sheets table.
3. **Loop Over Each Company**: Each company is processed one by one. A custom Google Search URL is generated for each name, and a Bright Data Web Unlocker request is sent to fetch Google search results for "site:linkedin.com [company name]".
4. **Parse LinkedIn Profile URL Using AI**: Google Gemini (or your specified LLM) analyzes the fetched search page and extracts the most likely official LinkedIn company profile.
5. **Result Handling**: If a profile is found, it's stored in the results. If not, an empty result is created, but you can add custom logic (notifications, retries, etc.).
6. **Batch Data Enrichment**: All found company URLs are bundled into a single request for further enrichment from a Bright Data dataset.
7. **Export**: The workflow appends the final, structured data for each company to another sheet in your Google Sheets file.

## Setup instructions
1. **Replace API Keys**: Insert your Bright Data API key in these nodes:
   - Bright Data Web Request - Google Search for Company LinkedIn URL
   - HTTP Request - Post API call to Bright Data
   - Snapshot Progress HTTP Request - Getting data from Bright Data
2. **Connect Google Sheets**: Set up your Google Sheets credentials and specify the sheet for reading input and writing output.
3. **Customize Output Structure**: Adjust the Python code node (see sticky note in the template) if you want to include additional or fewer fields in your output.
4. **Adjust for Scale or Error Handling**: You can modify the logic for "not found" results (e.g., to notify a Slack channel or retry failed companies).
5. **Run the Workflow**: Start manually, monitor the run, and check your Google Sheet for results.

## Customization guidance
- **Change Input/Output Sheets**: Update the sheet names or columns if your source/target spreadsheet has a different structure.
- **Use Another AI Model**: Replace the Google Gemini node with another LLM node if preferred.
- **Integrate Alerts**: Add Slack or email nodes to notify your team when a LinkedIn profile is not found or when the process is complete.
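The per-company search URL generated in the loop can be built in a Code node roughly like this (a minimal sketch; the template's exact query string, such as whether it restricts to `/company` pages, may differ):

```javascript
// Build a Google search URL restricted to LinkedIn company pages,
// suitable for fetching through Bright Data's Web Unlocker.
function buildSearchUrl(companyName) {
  const query = `site:linkedin.com/company "${companyName}"`;
  return `https://www.google.com/search?q=${encodeURIComponent(query)}`;
}
```

Quoting the company name keeps multi-word names together, and `encodeURIComponent` makes the URL safe regardless of punctuation in the sheet data.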
by Vadym Nahornyi
> ⚠️ Multi-language WhatsApp Error Notifier

Get instant WhatsApp alerts when any workflow fails — perfect for mobile-first monitoring and fast incident response.

✅ No coding required
✅ Works with any workflow via Error Workflow
✅ Step-by-step setup instructions included in: 🇬🇧 English 🇪🇸 Español 🇩🇪 Deutsch 🇫🇷 Français 🇷🇺 Русский

## 📦 What This Template Does
This template sends real-time WhatsApp notifications when a workflow fails. It uses the WhatsApp Business Cloud API to deliver a preformatted error message directly to your phone. The message includes:
- Workflow name
- Error message
- Last executed node

Example message:

    Error on WorkFlow: {{ $json.workflow.name }}
    Message: {{ $json.execution.error.message }}
    lastNodeExecuted: {{ $json.execution.lastNodeExecuted }}

## ⚙️ Prerequisites
Before using this template, make sure you have:
- A verified Facebook Business account
- Access to the WhatsApp Business Cloud API
- A sender phone number (registered in Meta)
- An access token (used as credentials in n8n)
- A pre-approved message template (or be within the 24h session window)

More info from Meta Docs →

## 🚀 How to Use
1. Open the template and insert your WhatsApp credentials
2. Enter your target phone number (e.g. your own) in international format
3. Customize the message body if needed
4. Save the workflow but do not activate it
5. In any other workflow → open Settings → set this as your Error Workflow

## 🌐 Multi-language Setup Guide Included
This template includes full setup instructions with screenshots and message formatting help in: 🇬🇧 English 🇪🇸 Español 🇩🇪 Deutsch 🇫🇷 Français 🇷🇺 Русский

Choose your language inside the embedded sticky note in the workflow.
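The Cloud API call behind the notification can be sketched as below. This is a hedged sketch, not the template's exact node configuration: the Graph API version, the `phoneNumberId` value, and the free-form `text` message type are assumptions, and outside the 24-hour session window you would need a pre-approved `template` message instead.

```javascript
// Build the HTTP request for a free-form WhatsApp text message
// via the WhatsApp Business Cloud API (Meta Graph API).
function buildWhatsAppRequest(phoneNumberId, token, to, errorInfo) {
  return {
    url: `https://graph.facebook.com/v19.0/${phoneNumberId}/messages`,
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: {
      messaging_product: 'whatsapp',
      to, // recipient in international format, e.g. "491701234567"
      type: 'text',
      text: {
        body:
          `Error on WorkFlow: ${errorInfo.workflowName}\n` +
          `Message: ${errorInfo.message}\n` +
          `lastNodeExecuted: ${errorInfo.lastNode}`,
      },
    },
  };
}
```

In the template itself, the three `errorInfo` fields are filled from the Error Trigger's `$json.workflow.name`, `$json.execution.error.message`, and `$json.execution.lastNodeExecuted`.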
by Alex Kim
# Printify Automation - Update Title and Description Workflow

This n8n workflow automates the process of retrieving products from Printify, generating optimized product titles and descriptions, and updating them back to the platform. It leverages OpenAI for content generation and integrates with Google Sheets for tracking and managing updates.

## Features
- **Integration with Printify**: Fetch shops and products through Printify's API.
- **AI-Powered Optimization**: Generate engaging product titles and descriptions using OpenAI's GPT model.
- **Google Sheets Tracking**: Log and manage updates in Google Sheets.
- **Custom Brand Guidelines**: Ensure consistent tone by incorporating brand-specific instructions.
- **Loop Processing**: Iteratively process each product in batches.

## Workflow Structure
### Nodes Overview
1. **Manual Trigger**: Manually start the workflow for testing purposes.
2. **Printify - Get Shops**: Retrieves the list of shops from Printify.
3. **Printify - Get Products**: Fetches product details for each shop.
4. **Split Out**: Breaks down the product list into individual items for processing.
5. **Loop Over Items**: Iteratively processes products in manageable batches.
6. **Generate Title and Desc**: Uses OpenAI GPT to create optimized product titles and descriptions.
7. **Google Sheets Integration**:
   - **Trigger**: Monitors Google Sheets for changes.
   - **Log Updates**: Records product updates, including old and new titles/descriptions.
8. **Conditional Logic**: If nodes ensure products are ready for updates and stop processing once completed.
9. **Printify - Update Product**: Sends updated titles and descriptions back to Printify.
10. **Brand Guidelines + Custom Instructions**: Sets brand tone and seasonal instructions.

## Setup Instructions
### Prerequisites
- **n8n Instance**: Ensure n8n is installed and configured.
- **Printify API Key**: Obtain an API key from your Printify account and add it to n8n under HTTP Header Auth.
- **OpenAI API Key**: Obtain an API key from OpenAI and add it to n8n under OpenAI API.
- **Google Sheets Integration**: Share your Google Sheets with the Google API service account and configure Google Sheets credentials in n8n.

### Workflow Configuration
1. **Set Brand Guidelines**: Update the Brand Guidelines + Custom Instructions node with your brand name, tone, and seasonal instructions.
2. **Batch Size**: Configure the Loop Over Items node for optimal batch sizes.
3. **Google Sheets Configuration**: Set the correct Google Sheets document and sheet names in the integration nodes.
4. **Run the Workflow**: Start manually or configure the workflow to trigger automatically.

## Key Notes
- **Customization**: Modify API calls to support other platforms like Printful or Vistaprint.
- **Scalability**: Use batch processing for efficient handling of large product catalogs.
- **Error Handling**: Configure retries or logging for any failed nodes.

## Output Examples
- **Input Title**: "Classic White T-Shirt"
- **Generated Title**: "Stylish Classic White Tee for Everyday Wear"
- **Input Description**: "Plain white T-shirt made of cotton."
- **Generated Description**: "Discover comfort and style with our classic white tee, crafted from premium cotton for all-day wear. Perfect for casual outings or layering."

## Next Steps
- **Monitor Updates**: Use Google Sheets to review logs of updated products.
- **Expand Integration**: Add support for more Printify shops or integrate with other platforms.
- **Enhance AI Prompts**: Customize prompts for different product categories or seasonal needs.

Feel free to reach out for additional guidance or troubleshooting!
by lin@davoy.tech
This workflow template, "n8n Error Report to LINE," is designed to streamline error handling by sending real-time notifications to your LINE account whenever an error occurs in any of your n8n workflows. By integrating with the LINE Messaging API, this template ensures you stay informed about workflow failures, allowing you to take immediate action and minimize downtime.

Whether you're a developer managing multiple workflows or a business owner relying on automation, this template provides a simple yet powerful way to monitor and resolve errors efficiently.

## Who Is This Template For?
- **Developers** who manage complex n8n workflows and need real-time error notifications.
- **DevOps Teams** looking to enhance monitoring and incident response for automated systems.
- **Business Owners** who rely on n8n workflows for critical operations and want to ensure reliability.
- **Automation Enthusiasts** seeking tools to simplify error tracking and improve workflow performance.

## What Problem Does This Workflow Solve?
When automating processes with n8n, errors can occur for various reasons, such as misconfigurations, API changes, or unexpected inputs. Without proper error handling, these issues may go unnoticed, leading to delays or disruptions. This workflow solves that problem by:
1. Automatically detecting errors in your n8n workflows.
2. Sending instant notifications to your LINE account with details about the failed workflow, including its name and execution URL.
3. Allowing you to quickly identify and resolve issues, ensuring smooth operation of your automated systems.

## What This Workflow Does
1. **Error Trigger**: The workflow is triggered automatically whenever an error occurs in any n8n workflow configured to use this error-handling flow.
2. **Send Notification via LINE**: Using the LINE Push API, the workflow sends a message to your LINE account with key details about the error, such as the workflow name and execution URL. You can also customize the notification message to include additional information or format it to suit your preferences.

## Setup Guide
### Pre-Requisites
- Access to the LINE Developers Console (https://developers.line.biz/console/) with a registered bot and access to the Push API — see the [API Reference](https://developers.line.biz/en/reference/messaging-api/#send-narrowcast-message).
- Basic knowledge of n8n workflows and JSON formatting.
- An active n8n instance where you can configure error workflows.

### Step-by-Step Setup
1. **Configure the Error Trigger**: Set this workflow as the default error workflow in your n8n instance (see https://docs.n8n.io/flow-logic/error-handling/).
2. **Set Up LINE Push API**: Replace <UID HERE> in the HTTP Request node with your LINE user ID to ensure notifications are sent to the correct account.
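The HTTP Request node's payload follows the LINE Push API shape, which can be sketched like this (a hedged sketch: the message wording is illustrative, and `channelAccessToken`/`userId` stand in for your own credential and the `<UID HERE>` placeholder):

```javascript
// Build the LINE Messaging API push request sent by the HTTP Request node.
function buildLinePush(channelAccessToken, userId, workflowName, executionUrl) {
  return {
    url: 'https://api.line.me/v2/bot/message/push',
    method: 'POST',
    headers: {
      Authorization: `Bearer ${channelAccessToken}`,
      'Content-Type': 'application/json',
    },
    body: {
      to: userId, // your LINE user ID (the "<UID HERE>" placeholder)
      messages: [
        {
          type: 'text',
          text: `Workflow failed: ${workflowName}\nExecution: ${executionUrl}`,
        },
      ],
    },
  };
}
```

The `messages` array accepts up to five message objects per push, so the template could also append a second message with a stack trace or custom JSON details.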
by Jitesh Dugar
## Overview
Advanced AI-powered stock analysis workflow that combines multi-timeframe technical analysis with real-time news sentiment to generate actionable BUY/SELL/HOLD recommendations. Uses sophisticated algorithms to process price data, news sentiment, and market context for informed trading decisions.

## Core Features
### Multi-Timeframe Technical Analysis
- **4-Hour Charts** - Intraday trend analysis and entry timing
- **Daily Charts** - Primary trend identification and key levels
- **Weekly Charts** - Long-term context and major trend direction
- **Moving Average Analysis** - 5, 10, and 20-period trend indicators
- **Support/Resistance Levels** - Dynamic price level identification
- **Volume Analysis** - Trading activity and momentum confirmation

### AI-Powered News Sentiment Analysis
- **Real-Time News Processing** - Latest market-moving headlines
- **Sentiment Scoring** - Numerical sentiment rating (-1 to +1 scale)
- **Impact Assessment** - News relevance to stock performance
- **Multi-Source Analysis** - Comprehensive news coverage evaluation
- **Context-Aware Processing** - Financial market-specific sentiment analysis

### Intelligent Recommendation Engine
- **Professional Trading Logic** - Multi-timeframe alignment analysis
- **Risk/Reward Calculations** - Minimum 1:2 ratio requirements
- **Entry/Exit Price Targets** - Specific actionable price levels
- **Stop-Loss Recommendations** - Risk management guidelines
- **Confidence Scoring** - Recommendation strength assessment

## Technical Capabilities
### Data Sources & APIs
- **TwelveData API** - Professional-grade price and volume data
- **NewsAPI Integration** - Comprehensive news coverage
- **Perplexity AI** - Additional sentiment context and analysis
- **Chart-Img API** - Visual chart generation for analysis
- **Real-Time Processing** - Live market data integration

### AI Models & Analysis
- **GPT-4 Integration** - Advanced natural language processing
- **Custom Sentiment Engine** - Financial market-tuned sentiment analysis
- **Multi-Model Approach** - Cross-validation of recommendations
- **Algorithmic Trading Logic** - Professional-grade decision frameworks

### Visual Analysis Tools
- **Interactive Charts** - TradingView-style chart generation
- **Technical Indicators** - Visual representation of analysis
- **Dark Theme Support** - Professional trading interface
- **Multiple Timeframes** - Comprehensive visual analysis

## Use Cases & Applications
### Individual Traders
- **Day Trading Signals** - Short-term entry/exit recommendations
- **Swing Trading Analysis** - Multi-day position guidance
- **Risk Management** - Stop-loss and position sizing advice
- **Market Timing** - Optimal entry point identification

### Investment Research
- **Due Diligence** - Comprehensive stock analysis
- **Sentiment Monitoring** - News impact assessment
- **Technical Screening** - Multi-criteria stock evaluation
- **Portfolio Optimization** - Individual stock recommendations

### Automated Trading Systems
- **Signal Generation** - Systematic buy/sell/hold alerts
- **Risk Controls** - Automated stop-loss calculations
- **Multi-Asset Analysis** - Scalable across stock universe
- **Backtesting Support** - Historical recommendation validation

### Financial Advisors & Analysts
- **Client Reporting** - Professional analysis documentation
- **Research Automation** - Streamlined analysis workflow
- **Decision Support** - Data-driven recommendation framework
- **Market Commentary** - AI-generated insights and rationale

## Key Benefits
### Professional-Grade Analysis
- **Institutional Quality** - Bank-level analytical frameworks
- **Multi-Dimensional** - Technical + fundamental + sentiment analysis
- **Real-Time Processing** - Live market data integration
- **Objective Decision Making** - Removes emotional bias from analysis

### Time Efficiency
- **Instant Analysis** - Seconds vs hours of manual research
- **Automated Processing** - Continuous market monitoring
- **Scalable Operations** - Analyze multiple stocks simultaneously
- **24/7 Availability** - Round-the-clock market analysis

### Risk Management
- **Built-in Stop Losses** - Automatic risk level calculation
- **Position Sizing** - Risk-appropriate recommendation sizing
- **Multi-Timeframe Validation** - Reduces false signals
- **Conservative Approach** - Defaults to HOLD when uncertain

## Setup Requirements
### API Keys Needed
1. TwelveData API - Free tier available at twelvedata.com
2. NewsAPI Key - Free tier available at newsapi.org
3. OpenAI API - For GPT-4 analysis capabilities
4. Perplexity API - Additional sentiment analysis
5. Chart-Img API - Optional chart visualization (chart-img.com)

### Configuration Steps
1. **API Integration** - Add your API keys to the respective nodes
2. **Symbol Format** - Supports company names or stock symbols
3. **Risk Parameters** - Customize stop-loss and target calculations
4. **Notification Setup** - Configure alert delivery methods
5. **Testing & Validation** - Verify API connections and data flow

## Advanced Features
### Natural Language Processing
- **Company Name Recognition** - Automatic symbol conversion
- **Context Understanding** - Market-aware news interpretation
- **Multi-Language Support** - Global news source analysis
- **Entity Extraction** - Key information identification

### Error Handling & Reliability
- **API Failure Recovery** - Graceful degradation strategies
- **Data Validation** - Input/output quality checks
- **Rate Limit Management** - Automatic throttling controls
- **Backup Data Sources** - Redundant information feeds

### Customization Options
- **Timeframe Selection** - Adjustable analysis periods
- **Risk Tolerance** - Configurable risk/reward ratios
- **Sentiment Weighting** - Balance technical vs fundamental analysis
- **Alert Thresholds** - Custom trigger conditions

## Important Disclaimers
This tool provides educational and informational analysis only. All trading decisions should:
- Consider your personal risk tolerance and financial situation
- Be validated with additional research and professional advice
- Account for market volatility and potential losses
- Follow proper risk management principles

## Performance Optimization
### Speed Enhancements
- **Parallel Processing** - Simultaneous data retrieval
- **Caching Strategies** - Reduced API call frequency
- **Efficient Algorithms** - Optimized calculation methods
- **Memory Management** - Scalable resource usage

### Accuracy Improvements
- **Multi-Source Validation** - Cross-reference data points
- **Historical Backtesting** - Performance validation
- **Continuous Learning** - Algorithm refinement
- **Market Adaptation** - Evolving analysis criteria

Transform your investment research with AI-powered analysis that combines the speed of automation with the depth of professional-grade financial analysis.
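The "minimum 1:2 risk/reward" requirement and the "defaults to HOLD when uncertain" rule can be expressed as a small gate like the one below. This is a hedged sketch of the idea, not the workflow's actual (largely prompt-driven) logic; function and parameter names are illustrative.

```javascript
// Risk/reward gate: reward (entry→target) must be at least
// minRatio times the risk (entry→stop-loss).
function meetsRiskReward(entry, stopLoss, target, minRatio = 2) {
  const risk = Math.abs(entry - stopLoss);
  const reward = Math.abs(target - entry);
  if (risk === 0) return false; // ratio undefined; reject the setup
  return reward / risk >= minRatio;
}

// Conservative default: only pass BUY/SELL through when the setup qualifies.
function finalRecommendation(signal, entry, stopLoss, target) {
  if (signal === 'HOLD') return 'HOLD';
  return meetsRiskReward(entry, stopLoss, target) ? signal : 'HOLD';
}
```

For example, a BUY at 100 with a stop at 95 needs a target of at least 110 to survive the gate; a 108 target (1.6:1) would be downgraded to HOLD.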
by Jimleuk
This n8n template lets you summarize team member activity on Slack for the past week and generates a report.

For remote teams, chat is a crucial communication tool to ensure work gets done, but with so many conversations happening at once and in multiple threads, ideas, information and decisions usually live in the moment and get lost just as quickly - and are altogether forgotten by the weekend! Using this template, this doesn't have to be the case. Have AI crawl through last week's activity, summarize all threads and generate a casual and snappy report to bring the team back into focus for the current week. A project manager's dream!

## How it works
1. A Scheduled Trigger is set to run every Monday at 6am to gather all team channel messages within the last week.
2. Each message thread is grouped by user and data-mined for replies.
3. Combined, an AI analyses the raw messages to pull out interesting observations and highlights.
4. The summarized threads of each user are then combined and passed to another AI agent to generate a higher-level overview of their week. These are referred to as the individual reports.
5. Next, all individual reports are summarized together into a team weekly report. This allows understanding of group and similar activities.
6. Finally, the team weekly report is posted back to the channel. The timing is important, as it should be the first message of the week, ready for the team to glance over coffee.

## How to use
- Ideally works best per project and where most of the comms happen on a single channel. Avoid combining channels; instead, duplicate this workflow for additional channels.
- You may need to filter for specific team members if you want specific team updates.
- Customise the report to suit your organisation, team or the channel. You may prefer to be more formal if clients or external stakeholders are also present.

## Requirements
- Slack for chat platform
- Gemini for LLM (or switch for other models)

## Customising this workflow
- If the Slack channel is busy enough already, consider posting the final report to email.
- Pull in project metrics to include in your report. As extra context, it may be interesting to tie the messages to production performance.
- Use an AI Agent to query a knowledgebase or tickets relevant to the messages. This may be useful for attaching links or references to add context.
- Channel not so busy, or way too busy for 1 week? Play with the Scheduled Trigger and set an interval which works for your team.
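The group-by-user step before summarization can be sketched as below. This is a minimal sketch under the assumption that messages arrive in the shape Slack's `conversations.history` returns (`user`, `text`, `ts`, optional `reply_count`); the template's Code node may shape the data differently.

```javascript
// Group a week's worth of Slack channel messages by author,
// keeping the thread reply count so threads can be mined for replies next.
function groupByUser(messages) {
  const byUser = {};
  for (const msg of messages) {
    if (!byUser[msg.user]) byUser[msg.user] = [];
    byUser[msg.user].push({
      text: msg.text,
      ts: msg.ts,
      replyCount: msg.reply_count || 0,
    });
  }
  return byUser;
}
```

Each user's bucket then becomes the input for one "individual report" AI call, and the reports are concatenated for the team-level summary.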
by Kanaka Kishore Kandregula
# Boost Sales with Automated Magento 2 Product and Coupon Notifications

This n8n workflow automatically posts new Magento products and coupons to Telegram while preventing duplicates.

Key benefits:
- ✅ Increase conversions with time-sensitive alerts (creates urgency)
- ✅ Reduce missed opportunities with 24/7 monitoring
- ✅ Improve customer engagement through rich media posts
- ✅ Save hours per week by automating manual posting

## Why This Works
- Triggers impulse buys with real-time notifications
- Eliminates human error in duplicate posting
- Scales effortlessly as your catalog grows
- Provides analytics through database tracking

Perfect for e-commerce stores wanting to:
- Announce new arrivals instantly
- Promote limited-time offers effectively
- Maintain a consistent social presence
- Track performance through MySQL

This workflow automatically:
- ✅ Detects new products AND coupons in Magento
- ✅ Prevents duplicate postings with MySQL tracking
- ✅ Posts rich formatted alerts to Telegram
- ✅ Runs on a customizable schedule

## ✨ Key Features
For Products:
- Product name, price, and image
- Direct store link
- Media gallery support

For Coupons:
- Coupon code and status
- Usage limits (times used/available)
- Active/inactive status indicator

Core System:
- 🔒 MySQL duplicate prevention
- ⏰ 1-hour schedule (customizable)
- 📱 Telegram notifications with Markdown

## 🛠️ Configuration Guide
### Database Setup

    CREATE TABLE IF NOT EXISTS posted_items (
      item_id INT PRIMARY KEY,
      item_type ENUM('product', 'coupon') NOT NULL,
      item_value VARCHAR(255),
      posted BOOLEAN DEFAULT FALSE,
      created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
      updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
    );

### Required Credentials
- Magento API (HTTP Header Auth)
- MySQL Database
- Telegram Bot

### Sticky Notes
❗ IMPORTANT SETUP NOTES ❗
- For products: Ensure 'url_key' exists in custom_attributes
- For coupons: Magento REST API must expose coupon rules
- MySQL user needs INSERT/SELECT privileges
- Telegram bot must be added to your channel first

🔄 SCHEDULING:
- Default: checks every hour at :00
- Adjust in the Schedule Trigger node

## ⚙️ Technical Details
Workflow Logic:
1. Checks for new products/coupons via the Magento API
2. Verifies against the MySQL database
3. Only posts if the record doesn't exist
4. Updates the database after a successful post

Error Handling:
- Automatic skip if the product/coupon already exists
- Empty result handling
- Connection timeout protection

## 🌟 Why This Template?
- **Complete Solution**: Handles both products AND coupons
- **Battle-Tested**: Prevents all duplicates reliably
- **Ready-to-Use**: Just add your credentials
- **Fully Customizable**: Easy to modify for different needs

Perfect for e-commerce stores using Magento 2 who want automated, duplicate-free social notifications!
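The duplicate-prevention steps (verify, post, then record) can be sketched as the queries an n8n MySQL node would run. This is a hedged sketch: the table and column names follow the `posted_items` schema above, but the template's exact node wiring may differ.

```javascript
// SQL to check whether an item was already posted to Telegram.
function duplicateCheckQuery(itemId, itemType) {
  return {
    sql: 'SELECT posted FROM posted_items WHERE item_id = ? AND item_type = ?',
    params: [itemId, itemType],
  };
}

// After a successful Telegram post, record the item (upsert on item_id).
function markPostedQuery(itemId, itemType, itemValue) {
  return {
    sql:
      'INSERT INTO posted_items (item_id, item_type, item_value, posted) ' +
      'VALUES (?, ?, ?, TRUE) ' +
      'ON DUPLICATE KEY UPDATE posted = TRUE, item_value = VALUES(item_value)',
    params: [itemId, itemType, itemValue],
  };
}

// Post only when no row exists, or when an existing row is not yet posted.
function shouldPost(rows) {
  return rows.length === 0 || !rows[0].posted;
}
```

Using parameterized queries (the `?` placeholders) rather than string concatenation keeps coupon codes and product names from breaking the SQL.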
by Vadym Nahornyi
## How it works
Automatically sends Telegram notifications when any n8n workflow fails. Includes the workflow name, error message, and execution ID in the alert.

## Setup
Complete setup instructions are included in the workflow's sticky note in 5 languages: 🇬🇧 English 🇪🇸 Español 🇩🇪 Deutsch 🇫🇷 Français 🇷🇺 Русский

## Features
- Monitors all workflows 24/7
- Instant Telegram notifications
- Zero configuration needed
- Just add your bot token and chat ID

## Important
⚠️ Keep this workflow active 24/7 to capture all errors.
by Davide
## How It Works
This workflow creates an AI-powered Telegram chatbot with session management, allowing users to:
- **Start new conversations** (/new)
- **Check current sessions** (/current)
- **Resume past sessions** (/resume)
- **Get summaries** (/summary)
- **Ask questions** (/question)

### Key Components
- **Session Management**: Uses Google Sheets to track active/expired sessions (storing SESSION IDs and STATE). /new creates a session; /resume reactivates past ones.
- **AI Processing**: OpenAI GPT-4 generates responses with contextual memory (via the Simple Memory node). Summarization condenses past conversations when requested.
- **Data Logging**: All interactions (prompts/responses) are saved to Google Sheets for audit and retrieval.
- **User Interaction**: Telegram commands trigger specific actions (e.g., /question [query] fetches answers from session history).

## Main Advantages
1. **Multi-session handling**: Each user can create, manage, and switch between multiple sessions independently, perfect for organizing different conversations without confusion.
2. **Persistent memory**: Conversations are stored in Google Sheets, ensuring that chat history and session states are preserved even if the server or n8n instance restarts.
3. **Commands for full control**: With intuitive commands like /new, /current, /resume, /summary, and /question, users can manage sessions easily without needing a web interface.
4. **Smart summarization and Q&A**: Thanks to OpenAI models, the workflow can summarize entire conversations or answer specific questions about past discussions, saving time and improving the chatbot's usability.
5. **Easy setup and scalability**: By using Google Sheets instead of a database, the workflow is easy to clone, modify, and deploy, ideal for quick prototyping or lightweight production use.
6. **Modular and extensible**: Each logical block (new session, get current session, resume, summarize, ask question) is modular, making it easy to extend the workflow with additional features like analytics, personalized greetings, or integrations with CRM systems.

## Setup Steps
### Prerequisites
- **Telegram Bot Token**: Create via BotFather.
- **Google Sheets**: Duplicate this template. Two sheets: Session (active/inactive sessions) and Database (chat logs).
- **OpenAI API Key**: For GPT-4 responses.

### Configuration
1. **Telegram Integration**: Add your bot token to the Telegram Trigger and Telegram Send nodes.
2. **Google Sheets Setup**: Authenticate the Google Sheets nodes with OAuth. Ensure sheet names (Session, Database) and column mappings match the template.
3. **OpenAI & Memory**: Add your API key to the OpenAI Chat Model nodes. Adjust contextWindowLength in the Simple Memory node for conversation history length.
4. **Testing**: Use Telegram commands to test:
   - /new: Starts a session.
   - /question [query]: Tests AI responses.
   - /summary: Checks summarization.
5. **Deployment**: Activate the workflow; the bot will respond to Telegram messages in real time.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
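The command dispatch at the start of the flow can be sketched as a small router (a minimal sketch only; the template itself implements this branching with Switch/If nodes rather than a single Code node, and the block names here are illustrative):

```javascript
// Route an incoming Telegram message text to one of the workflow's
// logical blocks; anything that isn't a known command becomes a normal
// AI chat turn in the active session.
function routeCommand(text) {
  const [command, ...rest] = text.trim().split(/\s+/);
  const arg = rest.join(' ');
  switch (command) {
    case '/new': return { block: 'new_session' };
    case '/current': return { block: 'current_session' };
    case '/resume': return { block: 'resume_session', sessionId: arg || null };
    case '/summary': return { block: 'summarize' };
    case '/question': return { block: 'question', query: arg };
    default: return { block: 'chat', text };
  }
}
```

Keeping the unknown-command fallback as a plain chat turn means users never hit a dead end if they type a message without a slash command.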