by Oneclick AI Squad
This guide details the setup and functionality of an automated workflow that monitors the health, uptime, and SLA compliance of travel supplier APIs, specifically the Amadeus Flight API and the Booking.com Hotel API. The workflow runs every 10 minutes, processes health and SLA data, and sends alerts or logs based on the status.

## What It Monitors
- **API Health**: UP/DOWN status with health indicators.
- **Uptime Tracking**: Real-time availability percentage.
- **SLA Compliance**: Automatic breach detection and alerts.
- **Performance Rating**: Classified as EXCELLENT, GOOD, AVERAGE, or POOR.

## Features
- Runs every 10 minutes automatically.
- Monitors the Amadeus Flight API with a 99.5% SLA target.
- Monitors the Booking.com Hotel API with a 99.0% SLA target.
- Smart alerts that notify via WhatsApp only on SLA breaches.
- Logging of results for both breaches and normal status.

## Workflow Steps
1. **Monitor Schedule**: Triggers the workflow every 10 minutes automatically.
2. **Amadeus Flight API**: Tests the Amadeus Flight API (GET: https://api.amadeus.com) in parallel.
3. **Booking Hotel API**: Tests the Booking.com Hotel API (GET: https://distribution-xml.booking.com) in parallel.
4. **Calculate Health & SLA**: Processes health, uptime, and SLA data.
5. **Alert Check**: Routes to the appropriate response based on breach status.
6. **SLA Breach Alert**: Raises an alert via throwError when an SLA breach occurs.
7. **Normal Status Log**: Records results via throwError for healthy status.
8. **Send Message**: Sends a WhatsApp message for breach alerts.

## How to Use
1. Copy the JSON configuration of the workflow.
2. Import it into your n8n workspace.
3. Activate the workflow.
4. Monitor results in the execution logs and WhatsApp notifications.

The workflow will automatically start tracking your travel suppliers and alert you via WhatsApp only when SLA thresholds are breached. Please double-check responses to ensure accuracy.

## Requirements
- n8n account and instance setup.
- API credentials for the Amadeus Flight API (e.g., https://api.amadeus.com).
- API credentials for the Booking.com Hotel API (e.g., https://distribution-xml.booking.com).
- WhatsApp integration for sending alerts.

## Customizing this Workflow
- Adjust the Monitor Schedule interval to change the frequency (e.g., every 5 or 15 minutes).
- Modify SLA targets in the Calculate Health & SLA node to align with your service agreements (e.g., 99.9% for Amadeus).
- Update API endpoints or credentials in the Amadeus Flight API and Booking Hotel API nodes for different suppliers.
- Customize the Send Message node to integrate with other messaging platforms (e.g., Slack, email).
- Enhance the Normal Status Log to include additional metrics or export logs to a database.
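For orientation, the Calculate Health & SLA step can be implemented as an n8n Code node along these lines. This is a minimal sketch, not the template's actual code: field names such as `supplier`, `statusCode`, `successCount`, and `totalChecks` are assumptions about what the upstream HTTP checks provide.

```javascript
// Minimal sketch of a "Calculate Health & SLA" Code node (assumed field names).
const slaTargets = { 'Amadeus Flight API': 99.5, 'Booking.com Hotel API': 99.0 };

return $input.all().map((item) => {
  const { supplier, statusCode, successCount, totalChecks } = item.json;
  const isUp = statusCode >= 200 && statusCode < 300;
  const uptime = (successCount / totalChecks) * 100;
  const slaTarget = slaTargets[supplier] ?? 99.0;

  // Classify performance along the lines the template describes.
  let rating = 'POOR';
  if (uptime >= 99.9) rating = 'EXCELLENT';
  else if (uptime >= slaTarget) rating = 'GOOD';
  else if (uptime >= slaTarget - 1) rating = 'AVERAGE';

  return {
    json: {
      supplier,
      status: isUp ? 'UP' : 'DOWN',
      uptime: Number(uptime.toFixed(2)),
      slaTarget,
      slaBreached: uptime < slaTarget,
      rating,
    },
  };
});
```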
by n8n Team
This n8n workflow, which runs every Monday at 5:00 AM, initiates a comprehensive process to monitor and analyze network security by scrutinizing IP addresses and their associated ports. It begins by fetching a list of watched IP addresses and expected ports through an HTTP request. Each IP address is then processed in a sequential loop.

For every IP, the workflow sends a GET request to Shodan, a renowned search engine for internet-connected devices, to gather detailed information about the IP. It then extracts the data field from Shodan's response, converting it into an array. This array contains information on all ports Shodan has data for regarding the IP.

A filter node compares the ports returned from Shodan with the expected list obtained initially. If a port doesn't match the expected list, it is retained for further processing; otherwise, it's filtered out. For each such unexpected port, the workflow assembles data including the IP, hostnames from Shodan, the unexpected port number, service description, and detailed data from Shodan like HTTP status code, date, time, and headers. This collected data is then formatted into an HTML table, which is subsequently converted into Markdown format.

Finally, the workflow generates an alert in TheHive, a popular security incident response platform. This alert contains details like the title indicating unexpected ports for the specific IP, a description comprising the Markdown table with Shodan data, medium severity, current date and time, tags, Traffic Light Protocol (TLP) set to Amber, a new status, type as 'Unexpected open port', the source as n8n, a unique source reference combining the IP with the current Unix time, and enabling the follow and JSON parameters options. This comprehensive workflow thus aids in the proactive monitoring and management of network security.
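The port comparison at the heart of this workflow amounts to a set difference. Here is a minimal sketch of that filter as an n8n Code node, assuming `expectedPorts` comes from the initial HTTP request and `shodanData` is the extracted `data` array; the field names on the n8n side are illustrative, while the per-entry fields follow Shodan's usual response shape:

```javascript
// Keep only ports Shodan reports that are NOT in the expected list (assumed field names).
const expectedPorts = $json.expectedPorts; // e.g. [22, 443]
const shodanData = $json.shodanData;       // Shodan's per-port "data" array

const unexpected = shodanData.filter((entry) => !expectedPorts.includes(entry.port));

return unexpected.map((entry) => ({
  json: {
    ip: $json.ip,
    hostnames: entry.hostnames,
    port: entry.port,
    service: entry._shodan?.module,
    httpStatus: entry.http?.status,
  },
}));
```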
by vinci-king-01
# Competitor Price Monitoring Dashboard with AI and Real-time Alerts

## 🎯 Target Audience
- E-commerce managers and pricing analysts
- Retail business owners monitoring competitor pricing
- Marketing teams tracking market positioning
- Product managers analyzing the competitive landscape
- Data analysts conducting pricing intelligence
- Business strategists making pricing decisions

## 🚀 Problem Statement
Manual competitor price monitoring is inefficient and often leads to missed opportunities or delayed responses to market changes. This template solves the challenge of automatically tracking competitor prices, detecting significant changes, and providing actionable insights for strategic pricing decisions.

## 🔧 How it Works
This workflow automatically monitors competitor product prices using AI-powered web scraping, analyzes price trends, and sends real-time alerts when significant changes are detected.

### Key Components
1. **Scheduled Trigger** - Runs the workflow at specified intervals to maintain up-to-date price data
2. **AI-Powered Scraping** - Uses ScrapeGraphAI to intelligently extract pricing information from competitor websites
3. **Price Analysis Engine** - Processes historical data to detect trends and anomalies
4. **Alert System** - Sends notifications via Slack and email when price changes exceed thresholds
5. **Dashboard Integration** - Stores all data in Google Sheets for comprehensive analysis and reporting

## 📊 Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the price was recorded | "2024-01-15T10:30:00Z" |
| competitor_name | String | Name of the competitor | "Amazon" |
| product_name | String | Product name and model | "iPhone 15 Pro 128GB" |
| current_price | Number | Current price in USD | 999.00 |
| previous_price | Number | Previous recorded price | 1099.00 |
| price_change | Number | Absolute price difference | -100.00 |
| price_change_percent | Number | Percentage change | -9.09 |
| product_url | URL | Direct link to product page | "https://amazon.com/iphone15" |
| alert_triggered | Boolean | Whether alert was sent | true |
| trend_direction | String | Price trend analysis | "Decreasing" |

## 🛠️ Setup Instructions
Estimated setup time: 15-20 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for alerts (optional)

### Step-by-Step Configuration

#### 1. Install Community Nodes
```bash
# Install required community nodes
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

#### 2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for price monitoring data
- Configure the sheet name (default: "Price Monitoring")

#### 4. Configure Competitor URLs
- Update the websiteUrl parameters in the ScrapeGraphAI nodes
- Add URLs for each competitor you want to monitor
- Customize the user prompt to extract specific pricing data
- Set appropriate price thresholds for alerts

#### 5. Set up Notification Channels
- Configure Slack webhook or API credentials
- Set up email service credentials (SendGrid, SMTP, etc.)
- Define alert thresholds and notification preferences
- Test notification delivery

#### 6. Configure Schedule Trigger
- Set monitoring frequency (hourly, daily, etc.)
- Choose appropriate time zones for your business hours
- Consider competitor website rate limits

#### 7. Test and Validate
- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test alert notifications with sample data

## 🔄 Workflow Customization Options

### Modify Monitoring Targets
- Add or remove competitor websites
- Change product categories or specific products
- Adjust monitoring frequency based on market volatility

### Extend Price Analysis
- Add more sophisticated trend analysis algorithms
- Implement price prediction models
- Include competitor inventory and availability tracking

### Customize Alert System
- Set different thresholds for different product categories
- Create tiered alert systems (info, warning, critical)
- Add SMS notifications for urgent price changes

### Output Customization
- Add data visualization and reporting features
- Implement price history charts and graphs
- Create executive dashboards with key metrics

## 📈 Use Cases
- **Dynamic Pricing**: Adjust your prices based on competitor movements
- **Market Intelligence**: Understand competitor pricing strategies
- **Promotion Planning**: Time your promotions based on competitor actions
- **Inventory Management**: Optimize stock levels based on market conditions
- **Customer Communication**: Proactively inform customers about price changes

## 🚨 Important Notes
- Respect competitor websites' terms of service and robots.txt
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider the legal implications of automated price monitoring

## 🔧 Troubleshooting

**Common Issues:**
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Price parsing errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust monitoring frequency and implement delays
- **Alert delivery failures**: Check notification service credentials

**Support Resources:**
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Slack API documentation for notification setup
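To make the Price Analysis Engine step concrete, here is a sketch of how it could be written as an n8n Code node producing exactly the columns in the table above. The input field names and the 5% alert threshold are assumptions for illustration, not the template's literal code:

```javascript
// Sketch of the price-analysis step (assumed inputs; threshold is illustrative).
const ALERT_THRESHOLD_PERCENT = 5;

return $input.all().map((item) => {
  const { competitor_name, product_name, product_url, current_price, previous_price } = item.json;
  const price_change = current_price - previous_price;
  const price_change_percent = previous_price
    ? (price_change / previous_price) * 100
    : 0;

  return {
    json: {
      timestamp: new Date().toISOString(),
      competitor_name,
      product_name,
      product_url,
      current_price,
      previous_price,
      price_change,
      price_change_percent: Number(price_change_percent.toFixed(2)),
      alert_triggered: Math.abs(price_change_percent) >= ALERT_THRESHOLD_PERCENT,
      trend_direction:
        price_change < 0 ? 'Decreasing' : price_change > 0 ? 'Increasing' : 'Stable',
    },
  };
});
```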
by Airtop
# Extract Facebook Group Posts with Airtop

## Use Case
Extracting content from Facebook Groups allows community managers, marketers, and researchers to gather insights, monitor discussions, and collect engagement metrics efficiently. This automation streamlines the process of retrieving non-sponsored post data from group feeds.

## What This Automation Does
This automation extracts key post details from a Facebook Group feed using the following input parameters:
- **Facebook Group URL**: The URL of the Facebook Group feed you want to scrape.
- **Airtop Profile**: The name of your Airtop Profile authenticated to Facebook.

It returns up to 5 non-sponsored posts with the following attributes for each:
- Post text
- Post URL
- Page/profile URL
- Timestamp
- Number of likes
- Number of shares
- Number of comments
- Page or profile details
- Post thumbnail

## How It Works
1. **Form Trigger**: Collects the Facebook Group URL and Airtop Profile via a form.
2. **Browser Automation**: Initiates a new browser session using Airtop, navigates to the provided Facebook Group feed, and uses an AI prompt to extract post data, including interaction metrics and profile information.
3. **Structured Output**: The results are returned in a defined JSON schema, ready for downstream use.

## Setup Requirements
- Airtop API Key - free to generate.
- An Airtop Profile logged into Facebook.

## Next Steps
- **Integrate With Analytics Tools**: Feed the output into dashboards or analytics platforms to monitor community engagement.
- **Automate Alerts**: Trigger notifications for posts matching certain criteria (e.g., high engagement, keywords).
- **Combine With Comment Automation**: Extend this to reply to posts or engage with users using other Airtop automations.

Read more about how to extract posts from Facebook groups
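The description lists the extracted attributes without reproducing the schema itself; based on that list, each returned post would look roughly like the following. All field names here are illustrative, not the template's literal schema:

```json
{
  "postText": "Example post content...",
  "postUrl": "https://www.facebook.com/groups/123456789/posts/987654321",
  "profileUrl": "https://www.facebook.com/some.page",
  "timestamp": "2024-01-15T10:30:00Z",
  "likes": 42,
  "shares": 7,
  "comments": 13,
  "profileDetails": { "name": "Some Page", "type": "page" },
  "thumbnailUrl": "https://scontent.example/thumb.jpg"
}
```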
by Mark Shcherbakov
## Video Guide
I prepared a detailed guide that shows the whole process of building an AI tool to analyze Instagram Reels using n8n.

Youtube Link

## Who is this for?
This workflow is ideal for social media analysts, digital marketers, and content creators who want to leverage data-driven insights from their Instagram Reels. It's particularly useful for those looking to automate the analysis of video performance to inform strategy and content creation.

## What problem does this workflow solve?
Analyzing video performance on Instagram can be tedious and time-consuming, requiring multiple steps and data extraction. This workflow automates the process of fetching, analyzing, and recording insights from Instagram Reels, making it simpler for users to track engagement metrics without manual intervention.

## What this workflow does
This workflow integrates several services to analyze Instagram Reels, allowing users to:
- Automatically fetch recent Reels from specified creators.
- Analyze the most-watched videos for insights.
- Store and manage data in Airtable for easy access and reporting.

1. **Initial Trigger**: The process begins with a manual trigger that can later be modified for scheduled automation.
2. **Data Retrieval**: It connects to Airtable to fetch a list of creators and their respective Instagram Reels.
3. **Video Analysis**: It handles the fetching, downloading, and uploading of videos for analysis using an external service, simplifying performance tracking through a structured query process.
4. **Record Management**: It saves relevant metrics and insights into Airtable, ensuring that users can access and organize their video analytics effectively.

## Setup
1. **Create accounts**: Set up Airtable, Edify, n8n, and Gemini accounts.
2. **Prepare triggers and modules**: Replace credentials in each node accordingly.
3. **Configure data flow**: Ensure modules are set to fetch and analyze the correct data fields as outlined in the guide.
4. **Test the workflow**: Run the scenario manually to confirm that data is fetched and analyzed correctly.
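As a rough illustration of the Record Management step, an n8n Code node could shape each analyzed Reel into an Airtable record like this. Every field name below is an assumption about the Airtable base, not taken from the guide:

```javascript
// Sketch of shaping one analyzed Reel into an Airtable record (assumed field names).
return $input.all().map((item) => {
  const { creatorHandle, reelUrl, views, likes, comments, aiSummary } = item.json;
  return {
    json: {
      fields: {
        Creator: creatorHandle,
        'Reel URL': reelUrl,
        Views: views,
        Likes: likes,
        Comments: comments,
        // Simple engagement metric derived from the raw counts.
        'Engagement Rate': views ? Number((((likes + comments) / views) * 100).toFixed(2)) : 0,
        'AI Insights': aiSummary,
        'Analyzed At': new Date().toISOString(),
      },
    },
  };
});
```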
by David
## Who might benefit from this workflow?
Do you have to record your working hours yourself? Then this n8n workflow in combination with an iOS Shortcut will definitely help you. Once set up, you can use a shortcut, which can be stored as an app icon on your home screen, to record your start time, your end time, and the duration of your break.

## How it works
Once set up, you can tap the iOS Shortcut on your iPhone. You will see a menu containing three options: "Track Start", "Track Break" and "Track End". After the time is tracked, iOS will display a notification about the successful operation.

## How to set it up
1. Copy the Notion database to your Notion workspace (top right corner).
2. Copy the n8n workflow to your n8n workspace.
3. In the Notion nodes in the n8n workflow, add your Notion credentials and select the copied Notion database.
4. Download the iOS Shortcut from our documentation page.
5. Edit the shortcut and paste the URL of your n8n Webhook trigger node into the first "Text" node of the iOS Shortcut flow.

It is a best practice to use authentication. You can do so by adding "Header" auth to the webhook node and to the shortcut.

You need help implementing this or any other n8n workflow? Feel free to contact me via LinkedIn or my business website.

You want to start using n8n? Use this link to register for n8n (this is an affiliate link).
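The description doesn't show the exact request the shortcut sends, but conceptually each menu option POSTs a small body to the webhook. A hypothetical example, where the field names and the header-auth value are purely illustrative:

```
POST <your-n8n-webhook-url>
Authorization: <your-shared-secret>   (the optional "Header" auth mentioned above)
Content-Type: application/json

{
  "action": "Track Start",
  "timestamp": "2024-01-15T08:30:00+01:00"
}
```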
by Vadym Nahornyi
## How it works
Automatically sends Telegram notifications when any n8n workflow fails. Includes the workflow name, error message, and execution ID in the alert.

## Setup
Complete setup instructions are included in the workflow's sticky note in 5 languages:
- 🇬🇧 English
- 🇪🇸 Español
- 🇩🇪 Deutsch
- 🇫🇷 Français
- 🇷🇺 Русский

## Features
- Monitors all workflows 24/7
- Instant Telegram notifications
- Zero configuration needed
- Just add your bot token and chat ID

## Important
⚠️ Keep this workflow active 24/7 to capture all errors.
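For context, the alert content maps onto what n8n's Error Trigger node emits. A Telegram message template along these lines would produce the alert described; the expression paths reflect the Error Trigger's typical output, so verify them against your n8n version:

```
🚨 Workflow failed: {{ $json.workflow.name }}
Error: {{ $json.execution.error.message }}
Execution ID: {{ $json.execution.id }}
{{ $json.execution.url }}
```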
by Jay Emp0
## Overview
Fetches multiple Google Analytics GA4 metrics daily, posts them to Discord, and updates the previous days' entries as GA data finalizes over seven days.

## Benefits
- Automates daily traffic reporting
- Maintains a single message per day, avoiding channel clutter
- Provides near-real-time updates by editing prior messages

## Use Case
Teams tracking website performance via Discord (or any chat tool) without manual copy-paste: marketing managers, community moderators, growth hackers. If your manager asks you for a daily marketing report every morning, you can now automate it.

## Notes
- The Google Analytics node in n8n does not provide real-time data; GA revises the previous values for up to 7 days.
- The Discord node in n8n has no feature to update an existing message by message ID, so the workflow calls the Discord API directly for that.
- Most businesses use multiple Google Analytics properties across their digital platforms.

## Core Logic
1. A Schedule trigger fires once a day.
2. The Google Analytics node retrieves metrics for the date ranges (past 7 days).
3. An Aggregate node collates all records.
4. The Discord node fetches the last 10 messages in the broadcast channel.
5. A Code node maps the existing Discord messages to the Google Analytics data using the date fields.
6. For each GA record:
   - If no message exists → send a new POST to the Discord channel.
   - If a message exists and the metrics changed → send an update PATCH to the existing Discord message.
7. Batch loops and Wait nodes prevent rate limiting.

## Setup Instructions
1. Import the workflow JSON into n8n.
2. Follow the n8n guide to create a Google Analytics OAuth2 credential with access to all required GA accounts.
3. Follow the n8n guide to create a Discord OAuth2 credential for "Get Messages" operations.
4. Follow the Discord guide to create an HTTP Header Auth credential named "Discord-Bot" with header key `Authorization` and value `Bot <your-bot-token>`.
5. In the two Set nodes at the beginning of the flow, assign `discord_channel_id` and `google_analytics_id`.
   - Get your Discord channel ID by sending a message in your Discord channel and copying the message link. The link has the form https://discord.com/channels/server_id/channel_id/message_id; you want the channel_id, the number in the middle.
   - Find your Google Analytics ID by going to the Google Analytics dashboard, opening the property selector in the top right, and copying that number into the flow.
6. Adjust the Schedule trigger times to your preferred report hour.
7. Activate the workflow.

## Customization
Replace the Discord HTTP Request nodes with Slack, ClickUp, WhatsApp, or Telegram integrations by swapping the POST/PATCH endpoints and authentication.
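The message-edit call that the Discord node cannot make looks roughly like this. The route is Discord's documented message-edit endpoint; the report text in the body is illustrative:

```
PATCH https://discord.com/api/v10/channels/{channel_id}/messages/{message_id}
Authorization: Bot <your-bot-token>
Content-Type: application/json

{ "content": "📊 GA4 report for 2024-01-15: 1,234 sessions, 987 users" }
```

Posting a new day's message uses the same route without the message ID (`POST .../channels/{channel_id}/messages`), which is why the workflow only needs to swap the method and URL between the two branches.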
by Jitesh Dugar
## Overview
Advanced AI-powered stock analysis workflow that combines multi-timeframe technical analysis with real-time news sentiment to generate actionable BUY/SELL/HOLD recommendations. Uses sophisticated algorithms to process price data, news sentiment, and market context for informed trading decisions.

## Core Features

### Multi-Timeframe Technical Analysis
- **4-Hour Charts** - Intraday trend analysis and entry timing
- **Daily Charts** - Primary trend identification and key levels
- **Weekly Charts** - Long-term context and major trend direction
- **Moving Average Analysis** - 5, 10, and 20-period trend indicators
- **Support/Resistance Levels** - Dynamic price level identification
- **Volume Analysis** - Trading activity and momentum confirmation

### AI-Powered News Sentiment Analysis
- **Real-Time News Processing** - Latest market-moving headlines
- **Sentiment Scoring** - Numerical sentiment rating (-1 to +1 scale)
- **Impact Assessment** - News relevance to stock performance
- **Multi-Source Analysis** - Comprehensive news coverage evaluation
- **Context-Aware Processing** - Financial market-specific sentiment analysis

### Intelligent Recommendation Engine
- **Professional Trading Logic** - Multi-timeframe alignment analysis
- **Risk/Reward Calculations** - Minimum 1:2 ratio requirements
- **Entry/Exit Price Targets** - Specific actionable price levels
- **Stop-Loss Recommendations** - Risk management guidelines
- **Confidence Scoring** - Recommendation strength assessment

## Technical Capabilities

### Data Sources & APIs
- **TwelveData API** - Professional-grade price and volume data
- **NewsAPI Integration** - Comprehensive news coverage
- **Perplexity AI** - Additional sentiment context and analysis
- **Chart-Img API** - Visual chart generation for analysis
- **Real-Time Processing** - Live market data integration

### AI Models & Analysis
- **GPT-4 Integration** - Advanced natural language processing
- **Custom Sentiment Engine** - Financial market-tuned sentiment analysis
- **Multi-Model Approach** - Cross-validation of recommendations
- **Algorithmic Trading Logic** - Professional-grade decision frameworks

### Visual Analysis Tools
- **Interactive Charts** - TradingView-style chart generation
- **Technical Indicators** - Visual representation of analysis
- **Dark Theme Support** - Professional trading interface
- **Multiple Timeframes** - Comprehensive visual analysis

## Use Cases & Applications

### Individual Traders
- **Day Trading Signals** - Short-term entry/exit recommendations
- **Swing Trading Analysis** - Multi-day position guidance
- **Risk Management** - Stop-loss and position sizing advice
- **Market Timing** - Optimal entry point identification

### Investment Research
- **Due Diligence** - Comprehensive stock analysis
- **Sentiment Monitoring** - News impact assessment
- **Technical Screening** - Multi-criteria stock evaluation
- **Portfolio Optimization** - Individual stock recommendations

### Automated Trading Systems
- **Signal Generation** - Systematic buy/sell/hold alerts
- **Risk Controls** - Automated stop-loss calculations
- **Multi-Asset Analysis** - Scalable across the stock universe
- **Backtesting Support** - Historical recommendation validation

### Financial Advisors & Analysts
- **Client Reporting** - Professional analysis documentation
- **Research Automation** - Streamlined analysis workflow
- **Decision Support** - Data-driven recommendation framework
- **Market Commentary** - AI-generated insights and rationale

## Key Benefits

### Professional-Grade Analysis
- **Institutional Quality** - Bank-level analytical frameworks
- **Multi-Dimensional** - Technical + fundamental + sentiment analysis
- **Real-Time Processing** - Live market data integration
- **Objective Decision Making** - Removes emotional bias from analysis

### Time Efficiency
- **Instant Analysis** - Seconds vs. hours of manual research
- **Automated Processing** - Continuous market monitoring
- **Scalable Operations** - Analyze multiple stocks simultaneously
- **24/7 Availability** - Round-the-clock market analysis

### Risk Management
- **Built-in Stop Losses** - Automatic risk level calculation
- **Position Sizing** - Risk-appropriate recommendation sizing
- **Multi-Timeframe Validation** - Reduces false signals
- **Conservative Approach** - Defaults to HOLD when uncertain

## Setup Requirements

### API Keys Needed
1. TwelveData API - Free tier available at twelvedata.com
2. NewsAPI Key - Free tier available at newsapi.org
3. OpenAI API - For GPT-4 analysis capabilities
4. Perplexity API - Additional sentiment analysis
5. Chart-Img API - Optional chart visualization (chart-img.com)

### Configuration Steps
1. **API Integration** - Add your API keys to the respective nodes
2. **Symbol Format** - Supports company names or stock symbols
3. **Risk Parameters** - Customize stop-loss and target calculations
4. **Notification Setup** - Configure alert delivery methods
5. **Testing & Validation** - Verify API connections and data flow

## Advanced Features

### Natural Language Processing
- **Company Name Recognition** - Automatic symbol conversion
- **Context Understanding** - Market-aware news interpretation
- **Multi-Language Support** - Global news source analysis
- **Entity Extraction** - Key information identification

### Error Handling & Reliability
- **API Failure Recovery** - Graceful degradation strategies
- **Data Validation** - Input/output quality checks
- **Rate Limit Management** - Automatic throttling controls
- **Backup Data Sources** - Redundant information feeds

### Customization Options
- **Timeframe Selection** - Adjustable analysis periods
- **Risk Tolerance** - Configurable risk/reward ratios
- **Sentiment Weighting** - Balance technical vs. fundamental analysis
- **Alert Thresholds** - Custom trigger conditions

## Important Disclaimers
This tool provides educational and informational analysis only. All trading decisions should:
- Consider your personal risk tolerance and financial situation
- Be validated with additional research and professional advice
- Account for market volatility and potential losses
- Follow proper risk management principles

## Performance Optimization

### Speed Enhancements
- **Parallel Processing** - Simultaneous data retrieval
- **Caching Strategies** - Reduced API call frequency
- **Efficient Algorithms** - Optimized calculation methods
- **Memory Management** - Scalable resource usage

### Accuracy Improvements
- **Multi-Source Validation** - Cross-reference data points
- **Historical Backtesting** - Performance validation
- **Continuous Learning** - Algorithm refinement
- **Market Adaptation** - Evolving analysis criteria

Transform your investment research with AI-powered analysis that combines the speed of automation with the depth of professional-grade financial analysis.
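As one concrete piece of the recommendation logic, the minimum 1:2 risk/reward requirement reduces to a simple gate. The sketch below is illustrative only, since the template's actual node code is not shown in this description:

```javascript
// Sketch of the 1:2 risk/reward gate described above (illustrative, not the template's code).
function passesRiskReward({ entry, stopLoss, target, minRatio = 2 }) {
  const risk = Math.abs(entry - stopLoss);  // amount at risk per share
  const reward = Math.abs(target - entry);  // potential gain per share
  if (risk === 0) return false;             // no meaningful stop → reject the setup
  return reward / risk >= minRatio;
}

// Example: long at 100, stop at 95, target at 112 → reward/risk = 12/5 = 2.4 → passes.
console.log(passesRiskReward({ entry: 100, stopLoss: 95, target: 112 })); // true
```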
by Max
# N8N Automated Twitter Reply Bot Workflow

For the latest version, check: dziura.online/automation. The latest documentation can be found here.

You must have the Apify community node installed before pasting the JSON into your workflow.

## Overview
This n8n workflow creates an intelligent Twitter/X reply bot that automatically scrapes tweets based on keywords or communities, analyzes them using AI, generates contextually appropriate replies, and posts them while avoiding duplicates. The bot operates on a schedule with intelligent timing and retry mechanisms.

## Key Features
- **Automated tweet scraping** from Twitter/X using Apify actors
- **AI-powered reply generation** using an LLM (Large Language Model)
- **Duplicate prevention** via MongoDB storage
- **Smart scheduling** with timezone awareness and natural posting patterns
- **Retry mechanism** with failure tracking
- **Telegram notifications** for status updates
- **Manual trigger** option via Telegram command

## Required Credentials & Setup

### 1. Telegram Bot
- Create a bot via @BotFather on Telegram
- Get your Telegram chat ID to receive status messages
- **Credential needed**: Telegram account (bot token)

### 2. MongoDB Database
- Set up a MongoDB database to store replied tweets and prevent duplicates
- Create a collection (default name: collection_name)
- **Credential needed**: MongoDB account (connection string)
- **Tutorial**: MongoDB Connection Guide

### 3. Apify Account
- Sign up at Apify.com
- **Primary actors used**:
  - Search Actor: api-ninja/x-twitter-advanced-search - for keyword-based tweet scraping (ID: 0oVSlMlAX47R2EyoP)
  - Community Actor: api-ninja/x-twitter-community-search-post-scraper - for community-based tweet scraping (ID: upbwCMnBATzmzcaNu)
- **Credential needed**: Apify account (API token)

### 4. OpenRouter (LLM Provider)
- Sign up at OpenRouter.ai
- Used for AI-powered tweet analysis and reply generation
- **Model used**: x-ai/grok-3 (configurable)
- **Credential needed**: OpenRouter account (API key)

### 5. Twitter/X API
- Set up a developer account at developer.x.com
- **Note**: The free tier is limited to ~17 posts per day
- **Credential needed**: X account (OAuth2 credentials)

## Workflow Components

### Trigger Nodes

#### 1. Schedule Trigger
- **Purpose**: Runs automatically every 20 minutes
- **Smart timing**: Only active between 7 AM and 11:59 PM (configurable timezone)
- **Randomization**: Built-in probability control (~28% execution chance) to mimic natural posting patterns

#### 2. Manual Trigger
- **Purpose**: Manual execution for testing

#### 3. Telegram Trigger
- **Purpose**: Manual execution via the /reply command in Telegram
- **Usage**: Send /reply to your bot to trigger the workflow manually

### Data Processing Flow

#### 1. MongoDB Query (Find documents)
- **Purpose**: Retrieves previously replied tweet IDs to avoid duplicates
- **Collection**: collection_name (configure to match your setup)
- **Projection**: Only fetches the tweet_id field for efficiency

#### 2. Data Aggregation (Aggregate1)
- **Purpose**: Consolidates tweet IDs into a single array for filtering

#### 3. Keyword/Community Selection (Keyword/Community List)
- **Purpose**: Defines search terms and communities
- **Configuration**: Edit the JSON to include your keywords and Twitter community IDs
- **Format** (the third entry is a community ID, a 19-digit number):

```json
{
  "keyword_community_list": [
    "SaaS",
    "Entrepreneur",
    "1488663855127535616"
  ],
  "failure": 0
}
```

#### 4. Random Selection (Randomized community, keyword)
- **Purpose**: Randomly selects one item from the list to ensure variety

#### 5. Routing Logic (If4)
- **Purpose**: Determines whether to use Community search or Keyword search
- **Logic**: Uses a regex to distinguish 19-digit community IDs from keywords

### Tweet Scraping (Apify Actors)

#### Community Search Actor
- **Actor**: api-ninja/x-twitter-community-search-post-scraper
- **Purpose**: Scrapes tweets from specific Twitter communities
- **Configuration**:

```json
{
  "communityIds": ["COMMUNITY_ID"],
  "numberOfTweets": 40
}
```

#### Search Actor
- **Actor**: api-ninja/x-twitter-advanced-search
- **Purpose**: Scrapes tweets based on keywords
- **Configuration**:

```json
{
  "contentLanguage": "en",
  "engagementMinLikes": 10,
  "engagementMinReplies": 5,
  "numberOfTweets": 20,
  "query": "KEYWORD",
  "timeWithinTime": "2d",
  "tweetTypes": ["original"],
  "usersBlueVerifiedOnly": true
}
```

### Filtering System (Community filter)
The workflow applies multiple filters to ensure high-quality replies:
- **Text length**: >60 characters (substantial content)
- **Follower count**: >100 followers (audience reach)
- **Engagement**: >10 likes, >3 replies (proven engagement)
- **Language**: English only
- **Views**: >100 views (visibility)
- **Duplicate check**: Not previously replied to
- **Recency**: Within 2 days (configurable in actor settings)

### AI-Powered Reply Generation

#### LLM Chain (Basic LLM Chain)
- **Purpose**: Analyzes filtered tweets and generates contextually appropriate replies
- **Model**: Grok-3 via OpenRouter (configurable)
- **Features**:
  - Engagement potential scoring
  - User authority analysis
  - Timing optimization
  - Multiple reply styles (witty, informative, supportive, etc.)
  - <100 character limit for optimal engagement

#### Output Parser (Structured Output Parser)
- **Purpose**: Ensures a consistent JSON output format
- **Schema**:

```json
{
  "selected_tweet_id": "tweet_id_here",
  "screen_name": "author_screen_name",
  "reply": "generated_reply_here"
}
```

### Posting & Notification System

#### Twitter Posting (Create Tweet)
- **Purpose**: Posts the generated reply as a Twitter response
- **Error handling**: Catches API limitations and rate limits

#### Status Notifications
- **Success**: Notifies via Telegram with the tweet link and reply text
- **Failure**: Notifies about API limitations or errors
- **Format**: HTML-formatted messages with clickable links

#### Database Storage (Insert documents)
- **Purpose**: Saves successful replies to prevent future duplicates
- **Fields stored**: tweet_id, screen_name, reply, tweet_url, timestamp

### Retry Mechanism
The workflow includes intelligent retry logic:

#### Failure Counter (If5, Increment Failure Counter1)
- **Logic**: If no suitable tweets are found, increment the failure counter
- **Retry limit**: Maximum 3 retries with different random keywords
- **Wait time**: 3-second delay between retries

#### Final Failure Notification
- **Trigger**: After 4 failed attempts
- **Action**: Sends a Telegram notification about the unsuccessful search
- **Recovery**: Manual retry available via the /reply command

## Configuration Guide

### Essential Settings to Modify
1. **MongoDB Collection Name**: Update collection_name in the MongoDB nodes
2. **Telegram Chat ID**: Replace 11111111111 with your actual chat ID
3. **Keywords/Communities**: Edit the list in the Keyword/Community List node
4. **Timezone**: Update the timezone in the Code node (currently set to Europe/Kyiv)
5. **Actor Selection**: Enable only one actor (Community OR Search) based on your needs

### Filter Customization
Adjust the filters in the Community filter node based on your requirements:
- Minimum engagement thresholds
- Text length requirements
- Time windows
- Language preferences

### LLM Customization
Modify the AI prompt in the Basic LLM Chain to:
- Change reply style and tone
- Adjust engagement criteria
- Modify scoring algorithms
- Set different character limits

## Usage Tips
- **Start small**: Begin with a few high-quality keywords/communities
- **Monitor performance**: Use Telegram notifications to track success rates
- **Adjust filters**: Fine-tune based on the quality of generated replies
- **Respect limits**: Twitter's free tier allows ~17 posts/day
- **Test manually**: Use the /reply command for testing before scheduling

## Troubleshooting

### Common Issues
- **No tweets found**: Adjust filter criteria or check keywords
- **API rate limits**: Reduce posting frequency or upgrade your Twitter API plan
- **MongoDB connection**: Verify the connection string and collection name
- **Apify quota**: Monitor Apify usage limits
- **LLM failures**: Check OpenRouter credits and model availability

## Best Practices
- Monitor your bot's replies for quality and appropriateness
- Regularly update keywords to stay relevant
- Keep an eye on engagement metrics
- Adjust timing based on your audience's activity patterns
- Maintain a balanced posting frequency to avoid appearing spammy

## Documentation Links
- **Full Documentation**: Google Doc Guide
- **Latest Version**: dziura.online/automation
- **MongoDB Setup Tutorial**: YouTube Guide

This workflow provides a comprehensive solution for automated, intelligent Twitter engagement while maintaining quality and avoiding spam-like behavior.
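The If4 routing described above boils down to a single regex test. A minimal sketch follows; the field name is illustrative and the template's exact expression may differ:

```javascript
// Route to Community search if the selected item is a 19-digit community ID,
// otherwise treat it as a keyword (sketch; the template's exact regex may differ).
const selected = $json.keyword_community_list_item; // illustrative field name
const isCommunityId = /^\d{19}$/.test(selected);

return [{ json: { selected, route: isCommunityId ? 'community' : 'keyword' } }];
```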
by Alex Kim
# Printify Automation - Update Title and Description Workflow

This n8n workflow automates the process of retrieving products from Printify, generating optimized product titles and descriptions, and updating them back to the platform. It leverages OpenAI for content generation and integrates with Google Sheets for tracking and managing updates.

## Features
- **Integration with Printify**: Fetch shops and products through Printify's API.
- **AI-Powered Optimization**: Generate engaging product titles and descriptions using OpenAI's GPT model.
- **Google Sheets Tracking**: Log and manage updates in Google Sheets.
- **Custom Brand Guidelines**: Ensure consistent tone by incorporating brand-specific instructions.
- **Loop Processing**: Iteratively process each product in batches.

## Workflow Structure

### Nodes Overview
1. **Manual Trigger**: Manually start the workflow for testing purposes.
2. **Printify - Get Shops**: Retrieves the list of shops from Printify.
3. **Printify - Get Products**: Fetches product details for each shop.
4. **Split Out**: Breaks down the product list into individual items for processing.
5. **Loop Over Items**: Iteratively processes products in manageable batches.
6. **Generate Title and Desc**: Uses OpenAI GPT to create optimized product titles and descriptions.
7. **Google Sheets Integration**:
   - Trigger: Monitors Google Sheets for changes.
   - Log Updates: Records product updates, including old and new titles/descriptions.
8. **Conditional Logic**: If nodes ensure products are ready for updates and stop processing once completed.
9. **Printify - Update Product**: Sends updated titles and descriptions back to Printify.
10. **Brand Guidelines + Custom Instructions**: Sets the brand tone and seasonal instructions.

## Setup Instructions

### Prerequisites
1. **n8n Instance**: Ensure n8n is installed and configured.
2. **Printify API Key**: Obtain an API key from your Printify account and add it to n8n under HTTP Header Auth.
3. **OpenAI API Key**: Obtain an API key from OpenAI and add it to n8n under OpenAI API.
4. **Google Sheets Integration**: Share your Google Sheets with the Google API service account and configure Google Sheets credentials in n8n.

### Workflow Configuration
1. **Set Brand Guidelines**: Update the Brand Guidelines + Custom Instructions node with your brand name, tone, and seasonal instructions.
2. **Batch Size**: Configure the Loop Over Items node for optimal batch sizes.
3. **Google Sheets Configuration**: Set the correct Google Sheets document and sheet names in the integration nodes.
4. **Run the Workflow**: Start manually or configure the workflow to trigger automatically.

## Key Notes
- **Customization**: Modify API calls to support other platforms like Printful or Vistaprint.
- **Scalability**: Use batch processing for efficient handling of large product catalogs.
- **Error Handling**: Configure retries or logging for any failed nodes.

## Output Examples

### Optimized Content Example
- **Input Title**: "Classic White T-Shirt"
- **Generated Title**: "Stylish Classic White Tee for Everyday Wear"
- **Input Description**: "Plain white T-shirt made of cotton."
- **Generated Description**: "Discover comfort and style with our classic white tee, crafted from premium cotton for all-day wear. Perfect for casual outings or layering."

## Next Steps
- **Monitor Updates**: Use Google Sheets to review logs of updated products.
- **Expand Integration**: Add support for more Printify shops or integrate with other platforms.
- **Enhance AI Prompts**: Customize prompts for different product categories or seasonal needs.

Feel free to reach out for additional guidance or troubleshooting!
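For orientation, the Printify - Update Product step corresponds to a call along these lines. The path follows Printify's public v1 API, but verify the endpoint and payload against the current API docs before relying on it:

```
PUT https://api.printify.com/v1/shops/{shop_id}/products/{product_id}.json
Authorization: Bearer <your-printify-api-token>
Content-Type: application/json

{
  "title": "Stylish Classic White Tee for Everyday Wear",
  "description": "Discover comfort and style with our classic white tee..."
}
```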
by Dustin
Short and simple: this workflow will sync (add and delete) your Liked Songs to a custom playlist that can be shared.

## Setup
1. Create an app on the Spotify Developer Dashboard.
2. Create Spotify credentials: just click on one of the Spotify nodes in the workflow, click "create new credentials", and follow the guide.
3. Create the Spotify playlist that you want to sync to.
4. Copy the exact name of your playlist, go into the "Edit set Vars" node, and replace the value "CHANGE MEEEE" with your playlist name.
5. Set your Spotify credentials on every Spotify node (these should be marked with yellow and red notes).
6. Do you use Gotify?
   - No: Delete the Gotify nodes (all the way to the right end of the workflow).
   - Yes: Customize the Gotify nodes to your needs.
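Conceptually, the sync is a two-way diff between your Liked Songs and the target playlist. A minimal sketch of that logic, assuming both lists have already been fetched upstream as arrays of tracks with Spotify URIs (variable names are illustrative):

```javascript
// Two-way diff for the sync (sketch; assumes track lists were fetched upstream).
const likedUris = likedSongs.map((t) => t.uri);         // from the "get liked tracks" step
const playlistUris = playlistTracks.map((t) => t.uri);  // from the "get playlist tracks" step

// Liked but not yet in the playlist → add to the playlist.
const toAdd = likedUris.filter((uri) => !playlistUris.includes(uri));

// In the playlist but no longer liked → remove from the playlist.
const toRemove = playlistUris.filter((uri) => !likedUris.includes(uri));
```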