by Thibaud
How it works:
- **Schedule Trigger** runs on a daily basis (configured at 7:30 am)
- Connects to **Google Contacts** to fetch personal information from your contacts
- A **field checker** inspects birthday dates and first names to see whether there is any celebration today
- Sends a **Telegram notification** and announces the message on a Google Home speaker via Home Assistant if any celebration matched

Set up steps:
- **Download** the workflow and **import** it into your n8n instance
- **Configure accounts** for Google Contacts, Telegram and Home Assistant
- And you will be good to go
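The field-checker step can be sketched in a few lines. This is a minimal illustration (not the exact n8n node logic), and the contact shape is an assumption; Google Contacts items would need mapping to it first.

```javascript
// Keep only contacts whose birthday month/day matches today's.
// Birthdays are assumed stored as "YYYY-MM-DD" strings.
function birthdaysToday(contacts, today = new Date()) {
  const mm = String(today.getMonth() + 1).padStart(2, "0");
  const dd = String(today.getDate()).padStart(2, "0");
  const key = `${mm}-${dd}`;
  // Compare only the "MM-DD" part, skipping contacts with no birthday on file.
  return contacts.filter((c) => c.birthday && c.birthday.slice(5) === key);
}

const contacts = [
  { firstName: "Ada", birthday: "1990-03-14" },
  { firstName: "Alan", birthday: "1985-06-23" },
];
const matches = birthdaysToday(contacts, new Date(2025, 2, 14)); // Ada only
```

Comparing the "MM-DD" substring avoids timezone pitfalls that come with parsing full dates.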
by Robert Breen
This workflow is designed for creators, marketers, and agencies who want to automate content publishing while keeping quality control through human review. It integrates four powerful tools — Google Sheets, OpenAI, GoToHuman, and Blotato — to deliver a seamless AI-assisted, human-approved, auto-publishing system for LinkedIn.

⚙️ What This Workflow Does

📅 Pulls Today’s Topic from Google Sheets
You store ideas in a spreadsheet with a date column. The workflow runs daily (or manually) and selects the row matching today’s date.

🧠 Generates a Caption with OpenAI
The selected idea is passed to GPT-4 via an AI Agent node. OpenAI returns a short, emoji-rich LinkedIn caption (1–2 sentences). The result is saved back to the sheet.

👤 Sends the Caption for Human Review via GoToHuman
A human reviewer sees the AI-generated caption. They approve or reject it using a GoToHuman review template. Only approved captions move forward.

🚀 Publishes the Approved Caption to LinkedIn via Blotato
The caption is posted to a LinkedIn account via Blotato's API. No additional input is required — it's fully automated after approval.

🔧 Setup Requirements

✅ Google Sheets
Create or copy the provided sample sheet. Connect your Google Sheets account in n8n using OAuth2.

✅ OpenAI
Create an API key at platform.openai.com. Add it to n8n as an OpenAI credential.

✅ GoToHuman
Create an account and a Review Template at gotohuman.com. Add your API credential in n8n and use your reviewTemplateId in the node.

✅ Blotato
Create an account at blotato.com. Get your API key and Account ID. Insert them into the HTTP Request node that publishes the LinkedIn post.

🧪 Testing the Workflow
Use the Manual Trigger node for step-by-step debugging. Review nodes like AI Agent, Ask Human for Approval, and Post to LinkedIn to verify output. Once confirmed, activate the schedule for fully hands-free publishing.
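The date-matching step is simple to express in code. A minimal sketch, assuming the sheet's column headers are "Date" and "Idea" with dates stored as "YYYY-MM-DD" (match these to your actual sheet):

```javascript
// Pick the sheet row whose Date column equals today's local date.
function todaysRow(rows, today = new Date()) {
  const key = [
    today.getFullYear(),
    String(today.getMonth() + 1).padStart(2, "0"),
    String(today.getDate()).padStart(2, "0"),
  ].join("-");
  return rows.find((r) => r.Date === key) || null;
}

const rows = [
  { Date: "2025-01-01", Idea: "Automation trends" },
  { Date: "2025-01-02", Idea: "Human-in-the-loop AI" },
];
```

Returning `null` on no match lets a downstream IF node stop the run cleanly on days without a scheduled topic.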
👋 Built By
Robert Breen
Founder of Ynteractive — Automation, AI, and Data Strategy
🌐 Website: https://ynteractive.com
📧 Email: robert@ynteractive.com
🔗 LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
📺 YouTube: YnteractiveTraining

🏷 Tags
linkedin, openai, gotohuman, social automation, ai content approval, workflow, google sheets, blotato, marketing automation
by Nskha
A robust n8n workflow designed to enhance Telegram bot functionality for user management and broadcasting. It facilitates automatic support ticket creation, efficient user data storage in Redis, and a sophisticated system for message forwarding and broadcasting.

How It Works
1. Telegram Bot Setup: Initiate the workflow with a Telegram bot configured for handling different chat types (private, supergroup, channel).
2. User Data Management: Formats and updates user data, storing it in a Redis database for efficient retrieval and management.
3. Support Ticket Creation: Automatically generates chat tickets for user messages and saves the corresponding topic IDs in Redis.
4. Message Forwarding: Forwards new messages to the appropriate chat thread, or creates a new thread if none exists.
5. Support Forum Management: Handles messages within a support forum, differentiating between various chat types and user statuses.
6. Broadcasting System: Implements a broadcasting mechanism that sends channel posts to all previous bot users, with a system to filter out blocked users.
7. Blocked User Management: Identifies and manages blocked users, preventing them from receiving broadcasted messages.
8. Versatile Channel Handling: Ensures that messages from verified channels are properly managed and broadcasted to relevant users.

Set Up Steps
- **Estimated Time**: Around 30 minutes.
- **Requirements**: A Telegram bot, a Redis database, and Telegram group/channel IDs are necessary.
- **Configuration**: Input the Telegram bot token and relevant group/channel IDs. Configure message handling and user data processing according to your needs.
- **Detailed Instructions**: Sticky notes within the workflow provide extensive setup information and guidance.
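The ticket-thread lookup at the heart of this flow can be sketched as follows. A plain `Map` stands in for Redis here, and the incrementing topic ID stands in for the ID Telegram returns from a createForumTopic call; in the workflow these are a Redis GET/SET pair around the Telegram node.

```javascript
// Map a Telegram user ID to a support-forum topic ID,
// creating a new topic when none exists yet.
const topicStore = new Map(); // userId -> topicId (in-memory Redis stand-in)
let nextTopicId = 100; // stand-in for IDs assigned by Telegram

function getOrCreateTopic(userId) {
  if (topicStore.has(userId)) {
    return { topicId: topicStore.get(userId), created: false };
  }
  const topicId = nextTopicId++; // real flow: call createForumTopic, store result
  topicStore.set(userId, topicId);
  return { topicId, created: true };
}
```

The `created` flag mirrors the workflow's branch between "forward to existing thread" and "create a new thread first".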
Live Demo
- Workflow Bot: Telegram bot link
- Support Group: Telegram group link
- Broadcasting Channel: Telegram channel link

Keywords: n8n workflow, Telegram bot, chat ticket system, Redis database, message broadcasting, user data management, support forum automation
by Romain Jouhannet
This workflow imports Productboard data into Snowflake, automating data extraction, mapping, and updates for features, companies, and notes. It supports scheduled weekly updates, data cleansing, and Slack notifications summarizing the latest insights.

Features
- Fetches data from Productboard (features, companies, notes).
- Maps and processes data for Snowflake tables.
- Automates table creation, truncation, and updates.
- Summarizes new and unprocessed notes.
- Sends weekly Slack notifications with key insights.

Setup
1. Configure Productboard and Snowflake credentials in n8n.
2. Update the Snowflake table schemas to match your setup.
3. Replace the Slack channel ID and dashboard URL in the notification node.
4. Activate the workflow and set the desired schedule.
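The mapping step flattens nested Productboard objects into rows matching the Snowflake table columns. A sketch under assumed names: the field shapes below loosely follow Productboard's REST responses and the column names are illustrative, so check both against your actual API payloads and schema.

```javascript
// Flatten one Productboard feature object into a Snowflake-ready row.
function featureToRow(feature) {
  return {
    FEATURE_ID: feature.id,
    NAME: feature.name,
    STATUS: feature.status ? feature.status.name : null, // nested object -> flat column
    OWNER_EMAIL: feature.owner ? feature.owner.email : null,
    UPDATED_AT: feature.updatedAt || null,
  };
}
```

Guarding each nested access with a null fallback keeps the insert from failing on features that lack an owner or status.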
by Geekaz / Kazen
Who is this for?
This template is designed for social media managers, content creators, data analysts, and anyone who wants to automatically save and analyze their Meta Threads posts in Notion. It’s particularly useful for:
- Building a personal archive of your Threads content.
- Training AI models using your social media data.
- Tracking your online presence and engagement.

What this workflow does
This workflow uses the Meta Threads API to automatically retrieve your posts and import them into a Notion database. It retrieves the post content, date, and time, and stores them in designated properties within your Notion database.

Setup
1. Get Threads Access Token and ID: Obtain a long-lived access token and your Threads ID from the Meta Threads developer platform. This token auto-refreshes, ensuring uninterrupted workflow operation.
2. Configure Credentials and Date Range: In the “Set Credentials” node (using edit fields), enter your token and ID. Set the since and until parameters in the “Set Date Range” node to specify the post import period.
3. Connect to Notion and Create a Database: Connect to your Notion workspace and create a database with these properties (customize with the “Create Properties” node):
   a. Title: Threads post URL (Notion entry title).
   b. Threads ID: Unique post ID (prevents duplicate imports).
   c. Username: Post author (for future multi-account/source management).
   d. Post Date: Original post date.
   e. Source (Multi-Select): “Threads” tag (for future multi-platform import and filtering).
   f. Created: Import date and time.
   g. Import Check (Optional): For use with a separate post-categorization workflow.
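The duplicate-prevention role of the "Threads ID" property boils down to a set-membership filter. A minimal sketch, where `existingIds` would come from querying the Notion database in the real workflow:

```javascript
// Keep only posts whose Threads ID is not already stored in Notion.
function newPostsOnly(posts, existingIds) {
  const seen = new Set(existingIds); // Set gives O(1) lookups
  return posts.filter((p) => !seen.has(p.id));
}
```

Running this before the Notion create step means re-running the workflow over an overlapping date range never produces duplicate entries.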
by Femi Ad
Description
AI-Powered Business Idea Generation & Social Media Content Strategy Workflow

This intelligent content discovery and strategy system features 15 nodes that automatically monitor Reddit communities, analyze business opportunities, and generate targeted social media content for AI automation agencies and entrepreneurs. It leverages AI classification, structured analysis, and automated content creation to transform community discussions into actionable business insights and marketing materials.

Core Components
- Reddit Intelligence: Multi-subreddit monitoring across AI automation, n8n, and entrepreneur communities with keyword-based filtering.
- AI Classification Engine: Intelligent categorization of posts into "Questions" vs "Requests" using LangChain text classification.
- Dual Analysis System: Specialized AI agents for educational content (questions) and sales-focused content (service requests).
- Content Strategy Generator: Automated creation of LinkedIn and Twitter content tailored to different audience engagement strategies.
- Telegram Integration: Real-time delivery of formatted content strategies and business insights.
- Structured Output Processing: JSON-formatted analysis with relevancy scores, feasibility assessments, and actionable content recommendations.

Target Users
• AI Automation Agency Owners seeking consistent lead generation and thought leadership content
• Entrepreneurs wanting to identify market opportunities and position themselves as industry experts
• Content Creators in the automation/AI space needing data-driven content strategies
• Business Development Professionals looking for systematic opportunity identification
• Digital Marketing Agencies serving tech and automation clients

Setup Requirements
To get started, you'll need:
- Reddit API Access: OAuth2 credentials for accessing Reddit's API and monitoring multiple subreddits.
Required APIs:
• OpenRouter (for AI model access; supports GPT-4, Claude, and other models)
• Reddit OAuth2 API (for community monitoring and data extraction)

n8n Prerequisites:
• Version 1.7+ with LangChain nodes enabled
• Webhook configuration for Telegram integration
• Proper credential storage and management setup

Telegram Bot: Create via @BotFather for receiving formatted content strategies and business insights.

Disclaimer: This template uses LangChain nodes and Reddit API integration. Ensure your n8n instance supports these features and verify API rate limits for production use.

Step-by-Step Setup Guide
1. Install n8n: Ensure you're running n8n version 1.7 or higher with LangChain node support enabled.
2. Set Up API Credentials:
   • Create a Reddit OAuth2 application at reddit.com/prefs/apps
   • Set up an OpenRouter account and obtain an API key
   • Store credentials securely in the n8n credential manager
3. Create Telegram Bot:
   • Go to Telegram and search for @BotFather
   • Create a new bot and note the token
   • Configure a webhook pointing to your n8n instance
4. Import the Workflow:
   • Copy the workflow JSON from the template submission
   • Import it into your n8n dashboard
   • Verify all nodes are properly connected
5. Configure Monitoring Settings:
   • Adjust subreddit targets (currently: ArtificialIntelligence, n8n, entrepreneur)
   • Set keyword filters for relevant topics
   • Configure post limits and sorting preferences
6. Customize AI Analysis:
   • Update system prompts to match your business expertise
   • Adjust relevancy and feasibility scoring criteria
   • Modify content generation templates for your brand voice
7. Test the Workflow:
   • Run a manual execution to verify Reddit data collection
   • Check AI classification and analysis outputs
   • Confirm Telegram delivery of formatted content
8. Schedule Automation:
   • Set up the daily trigger (currently configured for 12 PM)
   • Monitor execution logs for any API rate limit issues
   • Adjust frequency based on content volume needs

Usage Instructions
Automated Discovery: The workflow runs daily at 12 PM, scanning three key subreddits for relevant posts about AI automation, business opportunities, and n8n workflows.

Intelligent Classification: Posts are automatically categorized as either "Questions" (educational opportunities) or "Requests" (potential service leads) using AI text classification.

Dual Analysis Approach:
• Questions → educational content strategy with relevancy and detail scoring
• Requests → sales-focused content with relevancy and feasibility scoring

Content Strategy Generation: Each analyzed post generates:
• 3 LinkedIn posts (thought leadership, case studies, educational frameworks)
• 3 Twitter posts (quick insights, engagement questions, thread starters)

Telegram Delivery: Receive formatted content strategies with:
• Post summaries and business context
• Relevancy/feasibility scores
• Ready-to-use social media content
• Strategic recommendations

Content Customization: Adapt generated content for different tones (business, educational, technical) and posting schedules.

Workflow Features
- Multi-Platform Monitoring: Simultaneous tracking of 3 key Reddit communities with customizable keyword filters.
- AI-Powered Classification: Automatic categorization of posts into actionable content types.
- Dual Scoring System:
  • Relevancy scores (0.05–0.95) for business alignment
  • Detail/feasibility scores (0.05–0.95) for content quality assessment
- Content Variety: Generates both educational and sales-focused social media strategies.
- Structured Output: JSON-formatted analysis for easy integration with other systems.
- Real-time Delivery: Instant Telegram notifications with formatted content strategies.
- Scalable Monitoring: Easy addition of new subreddits and keyword filters.
- Error Handling: Comprehensive validation with graceful failure management.
Performance Specifications
• Monitoring Frequency: Daily automated execution with manual trigger capability
• Post Analysis: 5 posts per subreddit (15 total daily)
• Content Generation: 6 social media posts per analyzed opportunity
• Classification Accuracy: AI-powered with structured output validation
• Delivery Method: Real-time Telegram integration
• Scoring Range: 0.05–0.95 scale for relevancy and feasibility assessment

Why This Workflow?
- Systematic Opportunity Identification: Never miss potential business opportunities or content ideas from key communities.
- AI-Enhanced Analysis: Leverage advanced language models for intelligent content categorization and strategy generation.
- Time-Efficient Content Creation: Transform community discussions into ready-to-use social media content.
- Data-Driven Insights: Quantified scoring helps prioritize opportunities and content strategies.
- Automated Lead Intelligence: Identify potential service requests and educational content opportunities automatically.

Need help customizing this workflow for your specific use case? As a fellow entrepreneur passionate about automation and business development, I'd be happy to consult. Connect with me on LinkedIn: https://www.linkedin.com/in/femi-adedayo-h44/ or email me for support. Let's make your AI automation agency even more efficient!
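The 0.05–0.95 scoring range and the Questions/Requests routing described above can be sketched together. The field names (`relevancy`, `secondaryScore`, `category`) are illustrative, not the workflow's exact JSON schema:

```javascript
// Clamp an AI-produced score into the template's 0.05-0.95 band.
function clamp(score) {
  return Math.min(0.95, Math.max(0.05, score));
}

// Route a classified post to the matching content strategy.
function routePost(post) {
  const relevancy = clamp(post.relevancy);
  const secondary = clamp(post.secondaryScore); // detail (Questions) or feasibility (Requests)
  const strategy =
    post.category === "Request" ? "sales-focused" : "educational";
  return { relevancy, secondary, strategy };
}
```

Clamping protects downstream sorting and thresholds from occasional out-of-range model outputs.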
by Influencers Club
How it works:
Pull multi-platform social data for loyalty program customers from their email addresses, then send personalized creator, partner, and ambassador outreach. A step-by-step workflow that enriches loyalty program emails with multi-platform social profiles (Instagram, TikTok, YouTube, Twitter, OnlyFans, Twitch and more), analytics, and metrics using the influencers.club API, then adds the contacts to an email workflow via SendGrid.

Set up:
- Salesforce (can be swapped for any CRM or loyalty program software)
- influencers.club API
- SendGrid (can be swapped for any email marketing service such as Mailchimp, Drip, etc.)
by Juan Carlos Cavero Gracia
This automation template turns any long video into multiple viral-ready short clips and auto-schedules them to TikTok, Instagram Reels, and YouTube Shorts. It works with both vertical and horizontal inputs and respects the original input resolution (no unnecessary upscaling), cropping or letterboxing intelligently when needed. The workflow automatically extracts between 3 and 6 clips (based on video length and the most engaging segments) and schedules one short per consecutive day—e.g., 3 clips → the next 3 days, 6 clips → the next 6 days.

*Note: This workflow uses OpenAI Whisper for word-level transcription, Google’s Gemini for clip selection and metadata, and Upload-Post’s FFmpeg API for GPU-accelerated cutting/cropping and social scheduling. You can use the same Upload-Post API token for both FFmpeg jobs and publishing uploads. Upload-Post also offers a generous free trial with no credit card required.*

Who Is This For?
- **Creators & Editors:** Batch-convert long talks/podcasts into daily Shorts/Reels/TikToks.
- **Agencies & Social Teams:** Turn webinars/interviews into a reliable short-form stream.
- **Brands & Founders:** Maintain a steady posting cadence with minimal hands-on editing.

What Problem Does This Workflow Solve?
Manual clipping is slow and inconsistent. This workflow:
- **Finds Hooks Automatically:** AI picks 3–6 high-retention segments from transcript + timestamps (count scales with video length/quality).
- **Cuts Cleanly:** Absolute-second FFmpeg timing to avoid mid-word cuts.
- **Vertical & Horizontal Friendly:** Handles both orientations and respects source resolution.
- **Schedules for You:** Posts one clip per day on consecutive days.

How It Works
1. Form Upload: Submit your long video.
2. Audio Extraction: An FFmpeg job extracts audio for accurate ASR.
3. Whisper Transcription: Word-level timestamps enable precise clipping.
4. AI Clip Mining (Gemini): Detects 3–6 “viral” moments (15–60s) and generates titles/descriptions.
5. Cut & Crop (FFmpeg): The GPU pipeline produces clean clips; it preserves input resolution/orientation when possible and crops/pads appropriately for target platforms.
6. Status & Download: Polls job status and retrieves the final clips.
7. Auto-Scheduling (Consecutive Days): Schedules one short per day starting tomorrow, for as many days as clips were produced (e.g., 3 clips → 3 days, 6 clips → 6 days) at a configurable time (default 20:00 Europe/Madrid).

Setup
1. OpenAI (Whisper): Add your OpenAI API credentials.
2. Google Gemini: Add the Gemini credentials used by the AI Agent node.
3. Upload-Post (free trial, no credit card required): Generate your API token at https://app.upload-post.com/, connect your social media accounts, and add your API token credentials in n8n (the same token works for FFmpeg jobs and publishing).
4. Scheduling: Adjust the posting time/intervals and timezone (Europe/Madrid by default).
5. Metadata Mapping: Titles/descriptions are auto-generated per platform; tweak as needed.

Requirements
- **Accounts:** n8n, OpenAI, Google (Gemini), Upload-Post, and social platform connections.
- **API Keys:** OpenAI token, Gemini credentials, Upload-Post token.
- **Budget:** Whisper + Gemini inference + FFmpeg compute + optional posting costs.

Features
- **Word-Accurate Cuts:** Absolute-second timecodes with subtle pre/post-roll.
- **Orientation-Aware:** Supports vertical and horizontal inputs; preserves source resolution where possible.
- **Platform-Optimized Output:** 9:16-ready delivery with smart crop/pad behavior.
- **Consecutive-Day Scheduler:** 3–6 clips → 3–6 consecutive posting days, automatically.
- **Retry & Polling:** Built-in waits and status checks for robust processing.
- **Modular:** Swap models, adjust clip count/length, or add/remove platforms quickly.

Turn long-form video into a consistent sequence of Shorts/Reels/TikToks—automatically, day after day, while respecting your source resolution.
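The consecutive-day scheduling rule (clip i posts i days after today at a fixed hour) reduces to a small date computation. A sketch: it uses the local clock of wherever the code runs, so matching the template's Europe/Madrid default assumes n8n runs in that timezone.

```javascript
// One posting date per clip: tomorrow, the day after, and so on,
// each at the same local hour (20:00 by default).
function scheduleDates(clipCount, start = new Date(), hour = 20) {
  const dates = [];
  for (let i = 1; i <= clipCount; i++) {
    // Date's constructor normalizes day overflow, so month/year rollover is free.
    dates.push(
      new Date(start.getFullYear(), start.getMonth(), start.getDate() + i, hour, 0, 0)
    );
  }
  return dates;
}
```

Because the `Date` constructor normalizes out-of-range day values, scheduling across a month boundary needs no special casing.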
by ing.Seif
This n8n workflow allows you to generate AI images using Nano Banana PRO through a Telegram bot interface, with optional automatic publishing to social media platforms. Users interact with the workflow entirely via Telegram commands and forms. The workflow supports multiple image generation modes and can automatically enhance prompts, compress images, generate captions, and publish content to Facebook, Instagram, and X. This template is especially useful for product visuals, lifestyle scenes, and content creation workflows where manual image generation and posting would otherwise be repetitive.

How it works
1. A user sends a command to the Telegram bot (text-to-image, image-to-image, or multi-image fusion).
2. The workflow collects the required inputs (text prompt, uploaded images, aspect ratio, quality).
3. If enabled, an AI Agent enhances the user prompt before image generation.
4. The workflow sends the request to Kie.ai, which runs the Nano Banana PRO image model.
5. The workflow waits for the image generation task to complete and retrieves the result.
6. The generated image is downloaded and sent back to the user via Telegram.
7. Optionally, the image is compressed using TinyPNG.
8. If social sharing is enabled:
   - An AI Agent generates platform-optimized captions.
   - The image and captions are published automatically to the selected platforms (Facebook, Instagram, X) via Blotato.

How to use
1. Create a Telegram bot using @BotFather and add the bot token to the Telegram Trigger credentials.
2. Configure the required API credentials (see Requirements).
3. Activate the workflow in n8n.
4. Send a command to your Telegram bot:
   - /text_to_image
   - /image_to_image
   - /multi_image
5. Follow the Telegram form prompts to generate and optionally publish images.
Requirements
The following services are required for the workflow to function:
- **Telegram Bot** – user interaction
- **Kie.ai API** – Nano Banana PRO image generation
- **Cloudinary** – image hosting for uploaded files
- **OpenAI API** – prompt enhancement and caption generation

Optional services:
- **TinyPNG** – image compression
- **Blotato** – social media publishing

Customising the workflow
- You can disable image compression by removing the TinyPNG nodes.
- Social media auto-publishing can be disabled by removing the Blotato nodes.
- Prompt enhancement behavior can be adjusted in the AI Agent system prompt.
- Additional platforms or posting logic can be added after the caption generation step.
- The workflow can be adapted to other AI image models by replacing the Kie.ai API calls.

Notes
- This is a self-hosted n8n workflow. All API keys and credentials must be provided by the user.
- The workflow is modular and can be partially enabled depending on your use case.
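The "wait for the image generation task to complete" step is a classic poll-with-backoff loop. A sketch of the pattern, not Kie.ai's actual API: `checkStatus` stands in for the real HTTP status call, and the `state` field names are assumptions.

```javascript
// Poll an async job until it succeeds, fails, or runs out of attempts.
async function pollUntilDone(checkStatus, { attempts = 10, delayMs = 10 } = {}) {
  for (let i = 0; i < attempts; i++) {
    const res = await checkStatus();
    if (res.state === "success") return res;
    if (res.state === "failed") throw new Error("generation failed");
    await new Promise((r) => setTimeout(r, delayMs)); // wait before retrying
  }
  throw new Error("timed out waiting for image generation");
}
```

In n8n this maps to a Wait node plus an IF node looping back to the status HTTP Request; the attempt cap prevents a stuck job from looping forever.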
by mustafa kendigüzel
How it works
This automated workflow discovers trending Instagram posts and creates similar AI-generated content. Here's the high-level process:

1. Content Discovery & Analysis
- Scrapes trending posts from specific hashtags
- Analyzes visual elements using AI
- Filters out videos and duplicates

2. AI Content Generation
- Creates unique images based on trending content
- Generates engaging captions with relevant hashtags
- Maintains brand consistency while being original

3. Automated Publishing
- Posts content directly to Instagram
- Monitors publication status
- Sends notifications via Telegram

Set up steps
Setting up this workflow takes approximately 15–20 minutes:

1. API Configuration (7–10 minutes)
- Instagram Business Account setup
- Telegram bot creation
- API key generation (OpenAI, Replicate, RapidAPI)

2. Database Setup (3–5 minutes)
- Create the required database table
- Configure PostgreSQL credentials

3. Workflow Configuration (5–7 minutes)
- Set scheduling preferences
- Configure notification settings
- Test connections and permissions

Detailed technical specifications and configurations are available in sticky notes within the workflow.
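The "filters out videos and duplicates" step can be sketched as one pass over the scraped posts. The `mediaType` field name is an assumption; the seen-ID list corresponds to the IDs tracked in the PostgreSQL table this workflow sets up.

```javascript
// Drop video posts and any post already processed in a previous run.
function filterCandidates(posts, seenIds) {
  const seen = new Set(seenIds);
  return posts.filter((p) => p.mediaType !== "video" && !seen.has(p.id));
}
```

After generation and publishing succeed, the new IDs would be written back to the table so the next scheduled run skips them.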
by Oneclick AI Squad
This workflow monitors brand mentions across multiple platforms (Twitter/X, Reddit, News) and automatically detects reputation crises based on sentiment analysis and trend detection.

How it works
- Multi-platform monitoring: Every 10 minutes, scans Twitter/X, Reddit, and news sites for brand mentions
- Data normalization: Converts all platform data into a unified format
- Smart filtering: Removes duplicates and already-analyzed mentions
- AI sentiment analysis: Analyzes each mention for:
  - Sentiment score (positive/negative/neutral)
  - Amplification factors (engagement, verified accounts, news sources)
  - Crisis-level phrases and keywords
- Trend detection: Compares current sentiment to a 24-hour baseline:
  - Detects sharp sentiment drops
  - Identifies negative mention spikes
  - Calculates impact surge
- Crisis classification: Assigns severity (CRITICAL/HIGH/MEDIUM/LOW)
- Automated response: For crises, triggers immediate alerts:
  - Executive crisis brief with action plan
  - Slack alerts to the crisis team
  - Email to leadership and the PR team
  - JIRA ticket creation
  - Crisis event logging

Setup steps
1. Connect platforms:
   - Twitter/X: Add OAuth credentials to the "Monitor Twitter/X" node
   - Reddit: Add OAuth credentials to the "Monitor Reddit" node
   - News API: Get an API key from newsapi.org and add it to the "Monitor News & Blogs" node
2. Configure brand monitoring:
   - Update the brand name and handles in the search queries
   - Add additional platforms if needed (LinkedIn, Facebook, Instagram)
3. Set up alerting:
   - Slack: Add credentials and update channel names
   - Email: Add SMTP settings and recipient lists
   - JIRA: Add credentials and the project ID
4. Adjust sensitivity:
   - Modify the sentiment keyword dictionaries in "AI Sentiment Analysis Engine"
   - Adjust crisis threshold scores
   - Customize amplification factors
5. Test thoroughly:
   - Run a manual execution with test data
   - Verify alert routing and content
   - Test false positive handling
6. Activate: Enable for continuous 24/7 monitoring

Key Features
- **Multi-platform monitoring** (every 10 minutes): Twitter/X, Reddit, and news sites
- **Data normalization** that converts different platform formats into a unified structure
- **AI sentiment analysis** engine that evaluates:
  - Sentiment keywords (critical, severe, moderate, mild negative/positive)
  - Amplification factors (engagement, verified accounts, follower counts)
  - Impact scoring based on reach and influence
- **Baseline trend detection** that tracks 24-hour sentiment history and detects:
  - Sharp sentiment drops (15+ points = crisis)
  - Negative mention spikes (30%+ increase)
  - Impact surges
- **Automated crisis response workflow**:
  - Aggregates crisis mentions into an executive brief
  - Generates a detailed action plan based on severity
  - Sends Slack alerts to the crisis team
  - Emails leadership with a comprehensive brief
  - Creates a JIRA ticket for tracking
  - Logs all events for analysis
- **Two-path routing**: Crisis-level events trigger the full response workflow, while routine mentions are logged for trend analysis
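The trend-detection thresholds quoted above (15+ point sentiment drop, 30%+ negative-mention spike) can be sketched as a small classifier. How the two signals combine into a severity level is an assumption here, simplified from the workflow's scoring:

```javascript
// Compare current metrics against the 24-hour baseline and assign severity.
// Thresholds mirror the template; the combination rules are illustrative.
function detectCrisis({ baselineSentiment, currentSentiment, baselineNegatives, currentNegatives }) {
  const sentimentDrop = baselineSentiment - currentSentiment; // points
  const negativeSpike =
    baselineNegatives > 0
      ? (currentNegatives - baselineNegatives) / baselineNegatives // fraction
      : 0;
  if (sentimentDrop >= 15 && negativeSpike >= 0.3) return "CRITICAL";
  if (sentimentDrop >= 15 || negativeSpike >= 0.3) return "HIGH";
  if (sentimentDrop >= 5) return "MEDIUM";
  return "LOW";
}
```

A "LOW" result takes the routine-logging path of the two-path routing; anything higher would trigger the alert branch.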
by Jimmy Lee
This workflow gathers papers from arXiv in specific arXiv categories, uses AI to write a summarized newsletter, and sends it to subscribers via Gmail.

Arxiv paper trend newsletter

Setup
Supabase table schema:
- user_email: Text - mandatory
- arxiv_cat: [Text]
- interested_papers: [Text]
- keyword: [Text]

Example row:

    {
      "id": 8,
      "created_at": "2024-09-24T12:31:17.09491+00:00",
      "user_email": "test@test.com",
      "arxiv_cat": [
        "cs.AI",
        "cs.LG,cs.AR"
      ],
      "interested_papers": null,
      "keyword": [
        "AI architecture which includes long context problem"
      ]
    }

Qdrant vector store: default setup

Sub-workflows
- Get the arXiv category by AI for a given keyword
- Get arXiv categories
- Get this week's arXiv papers and score them with AI
- Filter by keyword within the given documents
- Extract paper information
- Write the newsletter with AI
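Given a Supabase row like the example above, the per-subscriber category filter can be sketched as follows. Note that a single `arxiv_cat` entry may itself hold a comma-separated list (e.g. "cs.LG,cs.AR"), so entries are split first; the paper shape is an assumption.

```javascript
// Keep papers whose arXiv category matches one of the subscriber's categories.
function papersForSubscriber(papers, subscriber) {
  const cats = new Set(
    (subscriber.arxiv_cat || []).flatMap((c) => c.split(",")) // "cs.LG,cs.AR" -> two entries
  );
  return papers.filter((p) => p.categories.some((c) => cats.has(c)));
}
```

Keyword-based relevance (the `keyword` column) would then be applied on top of this category filter via the vector store.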