N8N Automated Twitter Reply Bot Workflow
For the latest version, check: dziura.online/automation
The latest documentation can be found here.
You must have the Apify community node installed before pasting the JSON into your workflow.
Overview

This n8n workflow creates an intelligent Twitter/X reply bot that automatically scrapes tweets based on keywords or communities, analyzes them using AI, generates contextually appropriate replies, and posts them while avoiding duplicates. The bot operates on a schedule with intelligent timing and retry mechanisms.
Key Features
- **Automated tweet scraping** from Twitter/X using Apify actors
- **AI-powered reply generation** using an LLM (Large Language Model)
- **Duplicate prevention** via MongoDB storage
- **Smart scheduling** with timezone awareness and natural posting patterns
- **Retry mechanism** with failure tracking
- **Telegram notifications** for status updates
- **Manual trigger** option via a Telegram command
Required Credentials & Setup

1. Telegram Bot
- Create a bot via @BotFather on Telegram
- Get your Telegram chat ID to receive status messages
- **Credential needed**: Telegram account (Bot token)

2. MongoDB Database
- Set up a MongoDB database to store replied tweets and prevent duplicates
- Create a collection (default name: collection_name)
- **Credential needed**: MongoDB account (Connection string)
- **Tutorial**: MongoDB Connection Guide

3. Apify Account
- Sign up at Apify.com
- **Primary actors used**:
  - Search Actor: api-ninja/x-twitter-advanced-search - for keyword-based tweet scraping (ID: 0oVSlMlAX47R2EyoP)
  - Community Actor: api-ninja/x-twitter-community-search-post-scraper - for community-based tweet scraping (ID: upbwCMnBATzmzcaNu)
- **Credential needed**: Apify account (API token)

4. OpenRouter (LLM Provider)
- Sign up at OpenRouter.ai
- Used for AI-powered tweet analysis and reply generation
- **Model used**: x-ai/grok-3 (configurable)
- **Credential needed**: OpenRouter account (API key)

5. Twitter/X API
- Set up a developer account at developer.x.com
- **Note**: The free tier is limited to ~17 posts per day
- **Credential needed**: X account (OAuth2 credentials)
Workflow Components

Trigger Nodes

1. Schedule Trigger
- **Purpose**: Runs automatically every 20 minutes
- **Smart timing**: Only active between 7 AM and 11:59 PM (configurable timezone)
- **Randomization**: Built-in probability control (~28% execution chance) to mimic natural posting patterns

2. Manual Trigger
- **Purpose**: Manual execution for testing

3. Telegram Trigger
- **Purpose**: Manual execution via the /reply command in Telegram
- **Usage**: Send /reply to your bot to trigger the workflow manually
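The Schedule Trigger's gate (active hours plus a ~28% execution chance) can be sketched as a Code-node-style function. This is a minimal sketch: the names `shouldRun` and the constants are illustrative, not taken from the workflow JSON.

```javascript
// Hypothetical sketch of the schedule gate: run only between 7:00 and 23:59
// in the configured timezone, and then only with ~28% probability.
const ACTIVE_START_HOUR = 7;   // 7 AM
const ACTIVE_END_HOUR = 23;    // covers up to 11:59 PM
const RUN_PROBABILITY = 0.28;  // ~28% chance per 20-minute tick

function shouldRun(now = new Date(), tz = "Europe/Kyiv", rand = Math.random) {
  // Current hour (0-23) in the target timezone
  const hour = Number(
    new Intl.DateTimeFormat("en-GB", {
      hour: "numeric",
      hour12: false,
      timeZone: tz,
    }).format(now)
  );
  if (hour < ACTIVE_START_HOUR || hour > ACTIVE_END_HOUR) return false;
  return rand() < RUN_PROBABILITY; // randomize to mimic natural posting
}
```

Injecting `rand` makes the gate testable; in a real Code node a plain `Math.random()` call is enough.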
Data Processing Flow

1. MongoDB Query (Find documents)
- **Purpose**: Retrieves previously replied tweet IDs to avoid duplicates
- **Collection**: collection_name (configure to match your setup)
- **Projection**: Only fetches the tweet_id field for efficiency

2. Data Aggregation (Aggregate1)
- **Purpose**: Consolidates tweet IDs into a single array for filtering

3. Keyword/Community Selection (Keyword/Community List)
- **Purpose**: Defines search terms and communities
- **Configuration**: Edit the JSON to include your keywords and Twitter community IDs
- **Format**:
```json
{
  "keyword_community_list": [
    "SaaS",
    "Entrepreneur",
    "1488663855127535616"
  ],
  "failure": 0
}
```
(The 19-digit entry is a community ID; the others are keywords.)
4. Random Selection (Randomized community, keyword)
- **Purpose**: Randomly selects one item from the list to ensure variety

5. Routing Logic (If4)
- **Purpose**: Determines whether to use Community search or Keyword search
- **Logic**: Uses a regex to detect 19-digit community IDs vs. keywords
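Together, steps 4 and 5 amount to a random pick plus a format check. A minimal sketch, with illustrative helper names (`pickRandom`, `isCommunityId` are not names from the workflow):

```javascript
// Pick one entry at random from keyword_community_list for variety
function pickRandom(items, rand = Math.random) {
  return items[Math.floor(rand() * items.length)];
}

// A 19-digit numeric string is treated as a community ID;
// anything else is treated as a search keyword.
function isCommunityId(value) {
  return /^\d{19}$/.test(value);
}

const picked = pickRandom(["SaaS", "Entrepreneur", "1488663855127535616"]);
const branch = isCommunityId(picked) ? "community-search" : "keyword-search";
```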
Tweet Scraping (Apify Actors)

Community Search Actor
- **Actor**: api-ninja/x-twitter-community-search-post-scraper
- **Purpose**: Scrapes tweets from specific Twitter communities
- **Configuration**:
```json
{
  "communityIds": ["COMMUNITY_ID"],
  "numberOfTweets": 40
}
```
Search Actor
- **Actor**: api-ninja/x-twitter-advanced-search
- **Purpose**: Scrapes tweets based on keywords
- **Configuration**:
```json
{
  "contentLanguage": "en",
  "engagementMinLikes": 10,
  "engagementMinReplies": 5,
  "numberOfTweets": 20,
  "query": "KEYWORD",
  "timeWithinTime": "2d",
  "tweetTypes": ["original"],
  "usersBlueVerifiedOnly": true
}
```
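If you ever need to call the actor outside n8n, Apify exposes a run-and-fetch REST endpoint. The sketch below assumes Node 18+ (built-in `fetch`) and that the endpoint shape matches Apify's current REST API; verify both against Apify's documentation before relying on it.

```javascript
// Build the "run actor synchronously and get dataset items" URL.
// Endpoint shape is an assumption based on Apify's REST API; verify in their docs.
function buildActorRunUrl(actorId, token) {
  return `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${token}`;
}

// Input mirroring the Search Actor configuration above
const searchInput = {
  contentLanguage: "en",
  engagementMinLikes: 10,
  engagementMinReplies: 5,
  numberOfTweets: 20,
  query: "SaaS", // replace with your KEYWORD
  timeWithinTime: "2d",
  tweetTypes: ["original"],
  usersBlueVerifiedOnly: true,
};

// Usage (uncomment to run; APIFY_TOKEN is your API token):
// const res = await fetch(buildActorRunUrl("0oVSlMlAX47R2EyoP", process.env.APIFY_TOKEN), {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(searchInput),
// });
// const tweets = await res.json();
```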
Filtering System (Community filter)

The workflow applies multiple filters to ensure high-quality replies:
- **Text length**: >60 characters (substantial content)
- **Follower count**: >100 followers (audience reach)
- **Engagement**: >10 likes, >3 replies (proven engagement)
- **Language**: English only
- **Views**: >100 views (visibility)
- **Duplicate check**: Not previously replied to
- **Recency**: Within 2 days (configurable in actor settings)
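The filter conditions above can be expressed as a single predicate. The field names here are assumptions; map them to the actual keys in your actor's output.

```javascript
// Hypothetical tweet shape; adjust field names to your actor's output schema.
function passesFilters(tweet, repliedIds) {
  return (
    tweet.text.length > 60 &&            // substantial content
    tweet.followers > 100 &&             // audience reach
    tweet.likes > 10 &&                  // proven engagement
    tweet.replies > 3 &&
    tweet.lang === "en" &&               // English only
    tweet.views > 100 &&                 // visibility
    !repliedIds.includes(tweet.tweet_id) // not already replied to
  );
}
```

Recency is enforced upstream by the actor's time-window setting rather than in this predicate.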
AI-Powered Reply Generation

LLM Chain (Basic LLM Chain)
- **Purpose**: Analyzes filtered tweets and generates contextually appropriate replies
- **Model**: Grok-3 via OpenRouter (configurable)
- **Features**:
  - Engagement potential scoring
  - User authority analysis
  - Timing optimization
  - Multiple reply styles (witty, informative, supportive, etc.)
  - <100 character limit for optimal engagement

Output Parser (Structured Output Parser)
- **Purpose**: Ensures consistent JSON output format
- **Schema**:
```json
{
  "selected_tweet_id": "tweet_id_here",
  "screen_name": "author_screen_name",
  "reply": "generated_reply_here"
}
```
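A small validator (illustrative, not part of the workflow) makes explicit what the parser enforces, including the <100-character reply limit mentioned above:

```javascript
// Validate the LLM's structured output against the expected schema.
function parseReply(raw) {
  const obj = typeof raw === "string" ? JSON.parse(raw) : raw;
  for (const key of ["selected_tweet_id", "screen_name", "reply"]) {
    if (typeof obj[key] !== "string" || obj[key].length === 0) {
      throw new Error(`Missing or invalid field: ${key}`);
    }
  }
  if (obj.reply.length >= 100) {
    throw new Error("Reply exceeds the <100 character limit");
  }
  return obj;
}
```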
Posting & Notification System

Twitter Posting (Create Tweet)
- **Purpose**: Posts the generated reply as a Twitter response
- **Error handling**: Catches API limitations and rate limits

Status Notifications
- **Success**: Notifies via Telegram with the tweet link and reply text
- **Failure**: Notifies about API limitations or errors
- **Format**: HTML-formatted messages with clickable links

Database Storage (Insert documents)
- **Purpose**: Saves successful replies to prevent future duplicates
- **Fields stored**: tweet_id, screen_name, reply, tweet_url, timestamp
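The stored document can be assembled from the parser output. `buildReplyRecord` is an illustrative helper, and the tweet_url pattern is an assumption (X status URLs follow `https://x.com/<user>/status/<id>`; the workflow's exact format may differ):

```javascript
// Assemble the MongoDB document saved after a successful reply.
function buildReplyRecord({ selected_tweet_id, screen_name, reply }) {
  return {
    tweet_id: selected_tweet_id,
    screen_name,
    reply,
    // Assumed link format for the posted-to tweet
    tweet_url: `https://x.com/${screen_name}/status/${selected_tweet_id}`,
    timestamp: new Date().toISOString(),
  };
}
```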
Retry Mechanism

The workflow includes intelligent retry logic:

Failure Counter (If5, Increment Failure Counter1)
- **Logic**: If no suitable tweets are found, increment the failure counter
- **Retry limit**: Maximum 3 retries with different random keywords
- **Wait time**: 3-second delay between retries

Final Failure Notification
- **Trigger**: After 4 failed attempts
- **Action**: Sends a Telegram notification about the unsuccessful search
- **Recovery**: Manual retry available via the /reply command
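The counter logic described above (3 retries, then a failure notice on the 4th attempt) can be sketched as follows; `nextAction` is an illustrative name, and `failure` matches the field in the Keyword/Community List JSON:

```javascript
const MAX_RETRIES = 3; // after the 4th failed attempt, give up and notify

// Decide what to do after a search that found no suitable tweets.
function nextAction(state) {
  if (state.failure < MAX_RETRIES) {
    return {
      action: "retry",                        // re-run with a new random keyword
      waitSeconds: 3,                         // delay between retries
      state: { ...state, failure: state.failure + 1 },
    };
  }
  return { action: "notify_failure", state }; // send the Telegram failure message
}
```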
Configuration Guide

Essential Settings to Modify
- **MongoDB Collection Name**: Update collection_name in the MongoDB nodes
- **Telegram Chat ID**: Replace 11111111111 with your actual chat ID
- **Keywords/Communities**: Edit the list in the Keyword/Community List node
- **Timezone**: Update the timezone in the Code node (currently set to Europe/Kyiv)
- **Actor Selection**: Enable only one actor (Community OR Search) based on your needs

Filter Customization
Adjust the filters in the Community filter node based on your requirements:
- Minimum engagement thresholds
- Text length requirements
- Time windows
- Language preferences

LLM Customization
Modify the AI prompt in Basic LLM Chain to:
- Change reply style and tone
- Adjust engagement criteria
- Modify scoring algorithms
- Set different character limits
Usage Tips
- **Start small**: Begin with a few high-quality keywords/communities
- **Monitor performance**: Use Telegram notifications to track success rates
- **Adjust filters**: Fine-tune based on the quality of generated replies
- **Respect limits**: Twitter's free tier allows ~17 posts/day
- **Test manually**: Use the /reply command for testing before scheduling
Troubleshooting

Common Issues
- **No tweets found**: Adjust filter criteria or check keywords
- **API rate limits**: Reduce posting frequency or upgrade your Twitter API plan
- **MongoDB connection**: Verify the connection string and collection name
- **Apify quota**: Monitor Apify usage limits
- **LLM failures**: Check OpenRouter credits and model availability

Best Practices
- Monitor your bot's replies for quality and appropriateness
- Regularly update keywords to stay relevant
- Keep an eye on engagement metrics
- Adjust timing based on your audience's activity patterns
- Maintain a balanced posting frequency to avoid appearing spammy
Documentation Links
- **Full Documentation**: Google Doc Guide
- **Latest Version**: dziura.online/automation
- **MongoDB Setup Tutorial**: YouTube Guide

This workflow provides a comprehensive solution for automated, intelligent Twitter engagement while maintaining quality and avoiding spam-like behavior.