by Roshan Ramani
**GitHub Auto-Assign Bot**

Streamline your open source project with intelligent issue assignment automation.

**What It Does**
Automatically assigns GitHub issues to contributors who comment "assign me" - eliminating manual triage work and creating a fair, first-come-first-served system.

**Key Features**
- **Smart Detection**: Monitors both new issues and comments for assignment requests
- **Conflict Prevention**: Checks existing assignments before making new ones
- **Auto-Labeling**: Adds "assigned" labels for better tracking
- **Self-Service Assignment**: Contributors claim issues with a simple "assign me" command
- **Polite Responses**: Automatically notifies when issues are already assigned

**Perfect For**
- Open source maintainers
- Development teams managing GitHub repos
- Projects with active contributor communities
- Anyone reducing manual issue management

**Setup Requirements**
- GitHub repository with issues enabled
- n8n instance with GitHub OAuth credentials
- 5 minutes of configuration time

**How Contributors Use It**
1. Find an unassigned issue
2. Comment "assign me"
3. Get automatically assigned
4. Start coding immediately - no maintainer approval needed!

**Benefits**
- **Reduces maintainer workload** - no manual assignments
- **Faster contributor onboarding** - instant self-service
- **Prevents conflicts** - built-in assignment checking
- **Scales automatically** - works across unlimited issues
- **Improves contributor experience** - simple, clear process

**Workflow Triggers**
- New GitHub issues containing "assign me"
- New comments with "assign me" on existing issues
- Automatic label management
- Conflict resolution responses

> Transform your GitHub workflow - perfect for growing open source projects and development teams!
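For orientation, here is a minimal sketch of the kind of logic an n8n Code node could implement for the detection and conflict-check steps described above. The field names (`comment`, `issue`, `assignees`) follow the GitHub issue_comment webhook payload, and the exact wiring is an assumption rather than the template's own nodes.

```javascript
// Hypothetical Code-node sketch: decide whether an "assign me" request should be honored.
// Assumes the incoming item carries a GitHub issue_comment webhook payload under item.json.body.
const results = [];

for (const item of $input.all()) {
  const payload = item.json.body ?? item.json;
  const commentText = (payload.comment?.body ?? '').toLowerCase();
  const issue = payload.issue ?? {};

  const wantsAssignment = commentText.includes('assign me');
  const alreadyAssigned = (issue.assignees ?? []).length > 0;

  results.push({
    json: {
      issueNumber: issue.number,
      requester: payload.comment?.user?.login,
      // Assign only when someone asked for it and nobody holds the issue yet.
      shouldAssign: wantsAssignment && !alreadyAssigned,
      // Used by a later node to post a polite "already assigned" reply.
      alreadyAssigned,
    },
  });
}

return results;
```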
by Niranjan G
**Automated GitHub Scanner for Exposed AWS IAM Keys**

**Overview**
This n8n workflow automatically scans GitHub for exposed AWS IAM access keys associated with your AWS account, helping security teams quickly identify and respond to potential security breaches. When compromised keys are found, the workflow generates detailed security reports and sends Slack notifications with actionable remediation steps.

**Key Features**
- **Automated AWS IAM Key Scanning**: Regularly checks for exposed AWS access keys on GitHub
- **Real-time Security Alerts**: Sends immediate Slack notifications when compromised keys are detected
- **Comprehensive Security Reports**: Generates detailed reports with exposure information and risk assessment
- **Actionable Remediation Steps**: Provides clear instructions for securing compromised credentials
- **Continuous Monitoring**: Maintains ongoing surveillance of your AWS environment

**Workflow Steps**
1. List AWS Users: Retrieves all users from your AWS account
2. Split Users for Processing: Processes each user individually
3. Get User Access Keys: Retrieves access keys for each user
4. Filter Active Keys Only: Focuses only on currently active access keys
5. Search GitHub for Exposed Keys: Scans GitHub repositories for exposed access keys
6. Aggregate Search Results: Consolidates and deduplicates search findings
7. Check For Compromised Keys: Determines if any keys have been exposed
8. Generate Security Report: Creates detailed security reports for compromised keys
9. Extract AWS Usernames: Extracts usernames from the AWS response for notification
10. Format Slack Alert: Prepares comprehensive Slack notifications
11. Send Slack Notification: Delivers alerts with actionable information
12. Continue Scanning: Maintains the continuous monitoring cycle

**Setup Requirements**

Prerequisites:
- Active n8n instance
- AWS account with IAM permissions
- GitHub account/token for searching repositories
- Slack workspace for notifications

Required Credentials:
- AWS Credentials: IAM user with permissions to list users and access keys (Access Key ID and Secret Access Key)
- GitHub Credentials: Personal Access Token with search permissions
- Slack Credentials: Webhook URL for your notification channel

**Configuration**
1. AWS Configuration: Configure the "List AWS Users" node with your AWS credentials and ensure proper IAM permissions for listing users and access keys
2. GitHub Configuration: Set up the "Search GitHub for Exposed Keys" node with your GitHub token and adjust search parameters if needed
3. Slack Configuration: Configure the Slack node with your webhook URL and customize the notification format if desired

**Usage**

Running the Workflow:
- Manual Execution: Click "Execute Workflow" to run an immediate scan
- Scheduled Execution: Set up a schedule to run periodic scans (daily or weekly is recommended)

Repository Compatibility:
This workflow is compatible with both public and private GitHub repositories to which you have access. It will scan all repositories you have permission to view based on your GitHub credentials.

Handling Alerts:
When a compromised key is detected:
1. Review the Slack notification for details about the exposure
2. Follow the recommended remediation steps:
   - Deactivate the compromised key immediately
   - Create a new key if needed
   - Investigate the exposure source
   - Update any services using the compromised key

**Disclaimer**
This workflow template is provided for reference purposes only, to demonstrate how to automate AWS IAM key exposure scanning.
Please note:
- The scanning process may produce false positives, as it only matches potential AWS access key patterns
- Always verify any reported exposures manually before taking action
- Disabling or deleting access keys without proper verification could have significant negative impacts on your environment
- Understand which systems and applications rely on identified access keys before deactivating them
- This template should be customized to fit your specific environment and security policies

**IMPORTANT**: Use this workflow with caution and only after thoroughly understanding your AWS environment. The authors of this template are not responsible for any disruptions or damages resulting from its use.

**Security Considerations**
- This workflow requires access to sensitive AWS credentials
- Store all credentials securely within n8n
- Review and rotate access keys regularly

**Customization Options**
- Adjust GitHub search parameters for more targeted scanning
- Customize the Slack notification format and content
- Modify security report generation for your specific needs
- Integrate with additional notification channels (email, MS Teams, etc.)

**Optional: Enabling Interactive Slack Buttons**
The Slack Block Kit notification format supports interactive buttons that can be implemented if you want to perform actions directly from Slack:
- Disable Key: This button can be configured to automatically disable the compromised AWS IAM access key
- View Details: This button can be set up to show additional information about the exposure
- Acknowledge: This button can be used to mark the alert as acknowledged

To make these buttons functional:
1. Set up a Slack Socket Mode App:
   - Create a Slack app in the Slack API Console
   - Enable Socket Mode and Interactive Components
   - Subscribe to the block_actions event to capture button clicks
2. Create an n8n Webhook Endpoint:
   - Add a new webhook node to receive Slack button click events
   - Create separate workflows for each button action
3. Implement AWS Key Disabling:
   - For the "Disable Key" button, create a workflow that uses the n8n HTTP Request node to call the AWS IAM UpdateAccessKey API
   - Example HTTP request that can be implemented in n8n:
     - Method: POST
     - URL: https://iam.amazonaws.com/
     - Query Parameters: Action: UpdateAccessKey, AccessKeyId: AKIAIOSFODNN7EXAMPLE, Status: Inactive, UserName: {{$json.username}}, Version: 2010-05-08
4. Update the Slack Message Format:
   - Modify the Format Slack Alert node to include your webhook URL in the button action values
   - Add callback_id and action_id values to identify which button was clicked

This implementation allows for immediate response to security incidents directly from the Slack interface, reducing response time and improving security posture.
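As a rough illustration of the "Check For Compromised Keys" step, here is a minimal Code-node sketch that flags active keys which turn up in GitHub code search. The item shape (an `AccessKeyId` and `UserName` paired with a `searchResult` containing the GitHub search API's `total_count` and `items`) is an assumption, not the template's exact wiring.

```javascript
// Hypothetical sketch: mark access keys as compromised when GitHub search returned hits.
// Assumes each incoming item pairs an active IAM key with its GitHub code-search response.
const compromised = [];

for (const item of $input.all()) {
  const { AccessKeyId, UserName, searchResult } = item.json;
  const hitCount = searchResult?.total_count ?? 0;

  if (hitCount > 0) {
    compromised.push({
      json: {
        userName: UserName,
        accessKeyId: AccessKeyId,
        exposures: hitCount,
        // Repository URLs help with the manual verification recommended above.
        repositories: (searchResult.items ?? []).map((hit) => hit.repository?.html_url),
        recommendedAction: 'Verify manually, then set the key to Inactive via UpdateAccessKey',
      },
    });
  }
}

return compromised;
```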
by Nukeador
**Who is this for?**
BlueSky users looking to automate the publication of new posts based on new items from an RSS feed.

**What this workflow does**
This will create a BlueSky post for each new RSS feed item, including the feed title, post image, link, and content (up to 200 characters).

**Setup**
1. You'll need to generate a BlueSky app password
2. Configure your feed URL in the first node
3. Configure your credentials in the second node

**How to customize this workflow to your needs**
You can modify the message posted in the `Create post` node by changing the JSON `text` value, in case you want to include only the feed item title instead of the content. If your RSS feed doesn't provide an image, you can define a static one in the `Download image` node.
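For reference, here is a minimal Code-node sketch of how the post text could be assembled before the `Create post` node, assuming the RSS item exposes `title`, `link`, and `content` fields (the usual n8n RSS Read output); the 200-character cap mirrors the limit described above.

```javascript
// Hypothetical sketch: build the BlueSky post text from an RSS item.
const MAX_CONTENT_LENGTH = 200;

return $input.all().map((item) => {
  const { title = '', link = '', content = '' } = item.json;

  // Strip HTML tags that RSS feeds often embed in the content field.
  const plainContent = content.replace(/<[^>]+>/g, '').trim();
  const truncated =
    plainContent.length > MAX_CONTENT_LENGTH
      ? `${plainContent.slice(0, MAX_CONTENT_LENGTH - 3)}...`
      : plainContent;

  return {
    json: {
      postText: `${title}\n\n${truncated}\n\n${link}`,
    },
  };
});
```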
by Shiv Gupta
**TikTok Post Scraper via Keywords | Bright Data + Sheets Integration**

**Workflow Description**
Automatically scrapes TikTok posts based on keyword search using the Bright Data API and stores comprehensive data in Google Sheets for analysis and monitoring.

**How It Works**
This workflow operates through a simple, automated process:
1. **Keyword Input**: User submits search keywords through a web form
2. **Data Scraping**: Bright Data API searches TikTok for posts matching the keywords
3. **Processing Loop**: Monitors scraping progress and waits for completion
4. **Data Storage**: Automatically saves all extracted data to Google Sheets
5. **Result Delivery**: Provides comprehensive post data including metrics, user info, and media URLs

**Setup Information**
Estimated setup time: 10-15 minutes. This includes importing the workflow, configuring credentials, and testing the integration. Most of the process is automated once properly configured.

**Key Features**
- **Keyword-Based Search**: Search TikTok posts using specific keywords
- **Comprehensive Data Extraction**: Captures post metrics, user profiles, and media URLs
- **Google Sheets Integration**: Automatically organizes data in spreadsheets
- **Automated Processing**: Handles scraping progress monitoring
- **Reliable Scraping**: Uses Bright Data's professional infrastructure
- **Real-time Updates**: Live status monitoring and data processing

**Data Extracted**

| Field | Description | Example |
|-------|-------------|---------|
| url | TikTok post URL | https://www.tiktok.com/@user/video/123456 |
| post_id | Unique post identifier | 7234567890123456789 |
| description | Post caption/description | Check out this amazing content! #viral |
| digg_count | Number of likes | 15400 |
| share_count | Number of shares | 892 |
| comment_count | Number of comments | 1250 |
| play_count | Number of views | 125000 |
| profile_username | Creator's username | @creativity_master |
| profile_followers | Creator's follower count | 50000 |
| hashtags | Post hashtags | #viral #trending #fyp |
| create_time | Post creation timestamp | 2025-01-15T10:30:00Z |
| video_url | Direct video URL | https://video.tiktok.com/tos/... |

**Setup Instructions**

Step 1: Prerequisites
- n8n instance (self-hosted or cloud)
- Bright Data account with TikTok scraping dataset access
- Google account with Sheets access
- Basic understanding of n8n workflows

Step 2: Import Workflow
1. Copy the provided JSON workflow code
2. In n8n: go to Workflows → + Add workflow → Import from JSON
3. Paste the JSON code and click Import
4. The workflow will appear in your n8n interface

Step 3: Configure Bright Data
1. In n8n: navigate to Credentials → + Add credential → Bright Data API
2. Enter your Bright Data API credentials
3. Test the connection to ensure it's working
4. Update the workflow nodes with your dataset ID: gd_lu702nij2f790tmv9h
5. Replace BRIGHT_DATA_API_KEY with your actual API key

Step 4: Configure Google Sheets
1. Create a new Google Sheet or use an existing one
2. Copy the Sheet ID from the URL
3. In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
4. Complete the OAuth setup and test the connection
5. Update the Google Sheets node with your Sheet ID
6. Ensure the sheet has a tab named "Tiktok by keyword"

Step 5: Test the Workflow
1. Activate the workflow using the toggle switch
2. Access the form trigger URL to submit a test keyword
3. Monitor the workflow execution in n8n
4. Verify data appears in your Google Sheet
5. Check that all fields are populated correctly

**Configuration Details**

Bright Data API Settings:
- **Dataset ID**: gd_lu702nij2f790tmv9h
- **Discovery Type**: discover_new
- **Search Method**: keyword
- **Results per Input**: 2 posts per keyword
- **Include Errors**: true

Workflow Parameters:
- **Wait Time**: 1 minute between status checks
- **Status Check**: Monitors until scraping is complete
- **Data Format**: JSON response from Bright Data
- **Error Handling**: Automatic retry on incomplete scraping

**Usage Guide**

Running the Workflow:
1. Access the form trigger URL provided by n8n
2. Enter your desired keyword (e.g., "viral dance", "cooking tips")
3. Submit the form to start the scraping process
4. Wait for the workflow to complete (typically 2-5 minutes)
5. Check your Google Sheet for the extracted data

Best Practices:
- Use specific, relevant keywords for better results
- Monitor your Bright Data usage to stay within limits
- Regularly back up your Google Sheets data
- Test with simple keywords before complex searches
- Review extracted data for accuracy and completeness

**Troubleshooting**

Scraping not starting:
- Verify Bright Data API credentials are correct
- Check that the dataset ID matches your account
- Ensure sufficient credits in your Bright Data account

No data in Google Sheets:
- Confirm Google Sheets credentials are authenticated
- Verify the sheet ID is correct
- Check that the "Tiktok by keyword" tab exists

Workflow timeout:
- Increase the wait time if scraping takes longer
- Check the Bright Data dashboard for scraping status
- Verify the keyword produces available results

**Use Cases**
- Content Research: Research trending content and hashtags in your niche to inform your content strategy
- Competitor Analysis: Monitor competitor posts and engagement metrics to understand market trends
- Influencer Discovery: Find influencers and creators in specific topics or industries
- Market Intelligence: Gather data on trending topics, hashtags, and user engagement patterns

**Security Notes**
- Keep your Bright Data API credentials secure
- Use appropriate Google Sheets sharing permissions
- Monitor API usage to prevent unexpected charges
- Regularly rotate API keys for better security
- Comply with TikTok's terms of service and data usage policies

**Ready to Use!**
Your TikTok scraper is now configured and ready to extract valuable data.
Start with simple keywords and gradually expand your research as you become familiar with the workflow.

**Need Help?**
Visit the n8n community forum or check the Bright Data documentation for additional support and advanced configuration options. For any questions or support, please contact us by email or fill out this form.
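To make the sheet mapping concrete, here is a hypothetical Code-node sketch that reshapes one scraped post into the columns listed in the table above. The exact field names in Bright Data's response may differ, so treat this as an assumption to adapt.

```javascript
// Hypothetical sketch: map a Bright Data TikTok post record to the Google Sheets columns.
return $input.all().map((item) => {
  const post = item.json;

  return {
    json: {
      url: post.url,
      post_id: post.post_id,
      description: post.description,
      digg_count: post.digg_count,
      share_count: post.share_count,
      comment_count: post.comment_count,
      play_count: post.play_count,
      profile_username: post.profile_username,
      profile_followers: post.profile_followers,
      // Store hashtags as a single space-separated string so they fit in one cell.
      hashtags: Array.isArray(post.hashtags) ? post.hashtags.join(' ') : post.hashtags,
      create_time: post.create_time,
      video_url: post.video_url,
    },
  };
});
```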
by Roninimous
This workflow integrates iOS Shortcuts with n8n to create a simple, automatic location-based reminder system. When the user arrives at a specified location, an automation in the Shortcuts app sends a webhook trigger to n8n. If the trigger matches predefined date and time conditions, n8n sends a Telegram message reminder to the user. This is perfect for repetitive weekly tasks like taking out the bins, customized with conditions for day and time.

**Key Features**
- Location-Based Trigger: Uses iOS Shortcuts automation to start the workflow upon arrival at a specific location
- Time and Day Validation: Logic in n8n checks the current weekday and time to ensure reminders are sent only when appropriate
- Telegram Integration: Sends reminders directly to your Telegram account using your bot
- Minimal Setup: Uses native iOS features and a simple webhook setup in n8n

**How It Works**
1. iOS Shortcut Trigger: When the user arrives at a designated location, the iOS shortcut sends a GET request to the n8n webhook
2. n8n Webhook Node: Receives the request and triggers the workflow
3. Conditional Check: An IF node checks if the current time is after 4:00 PM and it's a Wednesday (or any other configured condition)
4. Telegram Node: If the condition passes, n8n sends a message like "Don't forget to take the bins out." to your Telegram bot

**Setup Instructions**
1. Create a Telegram Bot:
   - Use @BotFather to create a bot and obtain your bot token
   - Add Telegram API credentials in n8n with your bot token
2. Set up the iOS Shortcut:
   - Open the Shortcuts app on your iPhone
   - Go to the Automation tab → tap + → Create Personal Automation
   - Choose Arrive → select a location
   - Add the action Get Contents of URL (Method: GET, URL: your n8n webhook URL, e.g. https://n8n.yourdomain.com/webhook/your-path)
   - Save the automation (you can also test it by pressing the Play button)
3. Import the Workflow into n8n:
   - Load the provided workflow JSON
   - Set your webhook path and Telegram credentials
   - Adjust the logic in the IF node to your use case; in my case, I check if today is Wednesday and the time is after 4 PM until midnight
4. Expose n8n Publicly:
   - Ensure your n8n instance is publicly accessible via HTTPS so the shortcut can reach it

**Customization Guidance**
- Change the Reminder Message: Modify the text inside the Telegram node to suit different reminders
- Add More Conditions: Extend the logic to support more days, hours, or different trigger messages
- Add Multi-Channel Output: Send reminders via email, SMS, or Slack in addition to Telegram
- Use More Triggers: Expand to other types of shortcut triggers (e.g. NFC tag, leaving a location, time of day)

**Security and Implementation**
- Webhook Protection: Avoid using easily guessable webhook URLs
- Secure Telegram Token: Store your bot token securely in n8n credentials, not in plain workflow text
- Limit Shortcut Scope: Only trigger the shortcut at trusted locations or with secure iCloud sync
- Automation Permissions: Ensure your iPhone allows shortcut automations to run without confirmation

**Benefits**
- Automates repetitive location-based reminders without user interaction
- Provides a lightweight, native solution using iOS and n8n with no extra apps
- Keeps you on track for routine tasks like garbage days, medicine reminders, or arrival-based tasks
- Easily extendable for multiple locations or trigger conditions
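If you prefer a Code node over the IF node, a minimal sketch of the same Wednesday-after-4-PM check might look like the following; the timezone handling is a simplifying assumption and should match your n8n instance settings.

```javascript
// Hypothetical sketch: only pass the webhook item through on Wednesdays after 16:00.
const now = new Date();

const isWednesday = now.getDay() === 3; // 0 = Sunday, 3 = Wednesday
const isAfterFourPm = now.getHours() >= 16; // 16:00 until midnight

if (isWednesday && isAfterFourPm) {
  // Forward the original webhook data so the Telegram node can run.
  return $input.all();
}

// Returning an empty array stops the workflow branch without an error.
return [];
```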
by Ajith joseph
**Create a Telegram Bot with Mistral AI and Conversation Memory**

A sophisticated Telegram bot that provides AI-powered responses with conversation memory. This template demonstrates how to integrate any AI API service with Telegram, making it easy to swap between different AI providers like OpenAI, Anthropic, Google AI, or any other API-based AI model.

**How it works**
The workflow creates an intelligent Telegram bot that:
- Maintains conversation history for each user
- Provides contextual AI responses using any AI API service
- Handles different message types and commands
- Manages chat sessions with clear functionality
- Is easily adaptable to any AI provider (OpenAI, Anthropic, Google AI, etc.)

**Set up steps**

Prerequisites:
- Telegram Bot Token (from @BotFather)
- AI API Key (from any AI service provider)
- n8n instance with webhook capability

Configuration steps:
1. Create a Telegram Bot
   - Message @BotFather on Telegram
   - Create a new bot with the /newbot command
   - Save the bot token for credentials setup
2. Choose Your AI Provider
   - OpenAI: Get an API key from the OpenAI platform
   - Anthropic: Sign up for Claude API access
   - Google AI: Get a Gemini API key
   - NVIDIA: Access LLaMA models
   - Hugging Face: Use the inference API
   - Any other AI API service
3. Set up Credentials in n8n
   - Add Telegram API credentials with your bot token
   - Add Bearer Auth/API Key credentials for your chosen AI service
   - Test both connections
4. Deploy the Workflow
   - Import the workflow JSON
   - Customize the AI API call (see the customization section)
   - Activate the workflow
   - Set the webhook URL in the Telegram bot settings

**Features**

Core functionality:
- **Smart Message Routing**: Automatically categorizes incoming messages (commands, text, non-text)
- **Conversation Memory**: Maintains chat history for each user (last 10 messages)
- **AI-Powered Responses**: Integrates with any AI API service for intelligent replies
- **Command Support**: Built-in /start and /clear commands

Message types handled:
- **Text Messages**: Processed through the AI model with context
- **Commands**: Special handling for bot commands
- **Non-text Messages**: Polite error message for unsupported content

Memory management:
- User-specific chat history storage
- Automatic history trimming (keeps the last 10 messages)
- Global state management across workflow executions

**Bot Commands**
- /start - Welcome message with bot introduction
- /clear - Clears conversation history for a fresh start
- Regular text - Processed by the AI with conversation context

**Technical Details**

Workflow structure:
1. Telegram Trigger - Receives all incoming messages
2. Message Filtering - Routes messages based on type/content
3. History Management - Maintains conversation context
4. AI Processing - Generates intelligent responses
5. Response Delivery - Sends formatted replies back to the user

AI API integration (customizable). Current example (NVIDIA):
- Model: mistralai/mistral-nemotron
- Temperature: 0.6 (balanced creativity)
- Max tokens: 4096
- Response limit: under 200 words

Easy to replace with any AI service.

OpenAI example:

```json
{
  "model": "gpt-4",
  "messages": [...],
  "temperature": 0.7,
  "max_tokens": 1000
}
```

Anthropic Claude example:

```json
{
  "model": "claude-3-sonnet-20240229",
  "messages": [...],
  "max_tokens": 1000
}
```

Google Gemini example:

```json
{
  "contents": [...],
  "generationConfig": {
    "temperature": 0.7,
    "maxOutputTokens": 1000
  }
}
```

Error handling:
- Non-text message detection and appropriate responses
- API failure handling
- Invalid command processing

**Customization Options**

AI provider switching - to use a different AI service, modify the "NVIDIA LLaMA Chat Model" node:
1. Change the URL in the HTTP Request node
2. Update the request body format in the "Prepare API Request" node
3. Update the authentication method if needed
4. Adjust response parsing in the "Save AI Response to History" node

AI behavior:
- Modify the system prompt in the "Prepare API Request" node
- Adjust temperature and response parameters
- Change response length limits
- Customize model-specific parameters

Memory settings:
- Adjust history length (currently 10 messages)
- Modify user identification logic
- Customize the data persistence approach

Bot personality:
- Update welcome message content
- Customize error messages and responses
- Add new command handlers

**Use Cases**
- **Customer Support**: Automated first-line support with context awareness
- **Educational Assistant**: Homework help and learning support
- **Personal AI Companion**: General conversation and assistance
- **Business Assistant**: FAQ handling and information retrieval
- **AI API Testing**: Perfect template for testing different AI services
- **Prototype Development**: Quick AI chatbot prototyping

**Notes**
- Requires an active n8n instance for webhook handling
- AI API usage may have rate limits and costs (varies by provider)
- Bot memory persists across workflow restarts
- Supports multiple concurrent users with separate histories
- The template is provider-agnostic - easily switch between AI services
- Perfect starting point for any AI-powered Telegram bot project

**Popular AI Services You Can Use**

| Provider | Model Examples | API Endpoint Style |
|----------|---------------|-------------------|
| OpenAI | GPT-4, GPT-3.5 | https://api.openai.com/v1/chat/completions |
| Anthropic | Claude 3 Opus, Sonnet | https://api.anthropic.com/v1/messages |
| Google | Gemini Pro, Gemini Flash | https://generativelanguage.googleapis.com/v1beta/models/ |
| NVIDIA | LLaMA, Mistral | https://integrate.api.nvidia.com/v1/chat/completions |
| Hugging Face | Various OSS models | https://api-inference.huggingface.co/models/ |
| Cohere | Command, Generate | https://api.cohere.ai/v1/generate |

Simply replace the HTTP Request node configuration to switch providers!
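As an illustration of the memory handling described above, here is a minimal Code-node sketch that appends a message to a per-user history stored in workflow static data and trims it to the last 10 entries; the Telegram payload field names are assumptions about how the trigger data is wired in this particular template.

```javascript
// Hypothetical sketch: per-user conversation memory with trimming.
const MAX_HISTORY = 10;

// Workflow static data survives between executions while the workflow stays active.
const staticData = $getWorkflowStaticData('global');
staticData.chatHistories = staticData.chatHistories || {};

return $input.all().map((item) => {
  const chatId = String(item.json.message?.chat?.id ?? 'unknown');
  const userText = item.json.message?.text ?? '';

  const history = staticData.chatHistories[chatId] || [];
  history.push({ role: 'user', content: userText });

  // Keep only the most recent messages so the prompt stays small.
  staticData.chatHistories[chatId] = history.slice(-MAX_HISTORY);

  return {
    json: {
      chatId,
      messages: staticData.chatHistories[chatId],
    },
  };
});
```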
by Lucía Maio Brioso
**Who is this for?**
If you're using Notion to manage a database (like saving links, tasks, notes, or anything really), and it's starting to get messy with duplicate entries, this workflow is for you. It's especially useful if you want to keep things tidy without doing any manual cleanup.

**What problem is this workflow solving?**
Notion doesn't have a built-in way to find or remove duplicates, so you either clean them up manually or just let them pile up. This workflow automatically finds entries that share the same property (like a URL or title) and archives the extra copies, keeping just one.

**What this workflow does**
1. Pulls all pages from a Notion database
2. Identifies duplicates based on a property you choose
3. Archives the duplicate pages (which is like soft-deleting them)
4. Keeps one version of each duplicate group

It includes two optional triggers:
- Run it every day
- Or trigger it automatically when a new page is added to the database

**Setup**
1. Connect your Notion account in n8n
2. Select your database in the Notion nodes
3. In the "Format items properly" node, replace "SET YOUR PROPERTY HERE" with a reference to the property you want to use for detecting duplicates. I recommend using the n8n property drag-and-drop feature.
4. Enable whichever trigger you prefer, or both

And that's it. It runs on its own after that.

**How to customize this workflow to your needs**
- Use a different property for detecting duplicates by updating the Set node
- Want to tag duplicates instead of archiving them? Just replace the last Notion node with an update operation
- Adjust the schedule to run it hourly, weekly, or whenever suits your setup
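Here is a minimal Code-node sketch of the duplicate-detection idea, assuming each item carries the page `id` and a `dedupeKey` field produced by the "Format items properly" node; the real template's field names may differ.

```javascript
// Hypothetical sketch: keep the first page per key, flag the rest for archiving.
const seen = new Set();
const toArchive = [];

for (const item of $input.all()) {
  const { id, dedupeKey } = item.json;
  const key = (dedupeKey ?? '').toString().trim().toLowerCase();

  if (!key) continue; // skip pages without a value to compare on

  if (seen.has(key)) {
    // Everything after the first occurrence is a duplicate.
    toArchive.push({ json: { pageId: id, dedupeKey: key } });
  } else {
    seen.add(key);
  }
}

// A downstream Notion node archives (soft-deletes) each of these pages.
return toArchive;
```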
by Richard Uren
**Create Products in Shopify from a Google Sheet**

This workflow creates products in your Shopify store from a Google Sheet. It also enables inventory tracking and sets the quantity of an inventory item at your store's default location. This is a great way to get test data into test or staging stores to try out apps, update templates, or try out new designs.

This automation will only import new products. It will skip existing products if the slug matches an existing product's handle (Shopify's term for a slug).

**Setup Notes**

The Google Sheet has the following columns:
- title - free text
- description - free text
- company - free text
- category - free text
- status - ACTIVE, DRAFT or ARCHIVE
- slug - used in the product URL; text with no spaces, can also use hyphens
- price - sale price of the product
- compare_at_price - compare-at price for the product
- sku - unique code for each product
- stock_on_hand - quantity of this item available for purchase

Use those labels in the first row of your sheet and n8n will create one object per row, with the column names as object fields.

**Update GraphQL nodes with your Shopify store URL**
1. Replace the URL in all GraphQL nodes with the URL for your Shopify store.
2. These GraphQL requests all use the Shopify 2025-04 GraphQL Admin API.
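To show how the sheet rows feed the GraphQL nodes, here is a hypothetical Code-node sketch that shapes one row into variables for a product-creation mutation. The input field names assume Shopify's standard Admin API product input and are not taken from the template itself, so adjust them to match the query actually used in your GraphQL nodes.

```javascript
// Hypothetical sketch: shape one sheet row into variables for a productCreate GraphQL call.
return $input.all().map((item) => {
  const row = item.json;

  return {
    json: {
      variables: {
        input: {
          title: row.title,
          descriptionHtml: row.description,
          vendor: row.company,
          handle: row.slug,   // Shopify's handle = the sheet's slug column
          status: row.status, // ACTIVE, DRAFT or ARCHIVE
        },
      },
      // Kept alongside for later nodes that set price, SKU and stock at the default location.
      pricing: {
        price: row.price,
        compareAtPrice: row.compare_at_price,
        sku: row.sku,
        stockOnHand: row.stock_on_hand,
      },
    },
  };
});
```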
by Juan Carlos Cavero Gracia
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Description**
See the transformation in action! Here's an example of what this workflow can achieve.

This automation template is designed for content creators, social media managers, and anyone looking to breathe new life into old family photos and historical images. It transforms any old black-and-white or sepia photograph into a colorized, animated video using cutting-edge AI technology, then automatically publishes the results across multiple social media platforms including Facebook, Instagram, YouTube, and X (Twitter).

The workflow combines powerful AI services to create engaging content from vintage photographs: first enhancing and colorizing the image using FLUX Kontext, then bringing it to life with realistic animations using Kling Video AI, and finally distributing the results across your social media channels automatically.

Note: The estimated cost per workflow execution is approximately $0.29 USD, covering the AI processing for both image colorization and video animation. The upload-post node only works for self-hosted n8n instances, but you can use the standard HTTP Request node for uploading content on n8n Cloud.

**Who Is This For?**
- **Content Creators & Social Media Managers**: Transform historical content into engaging videos that capture audience attention and drive engagement across platforms
- **Family History Enthusiasts**: Bring old family photos to life by adding color and motion, creating emotional connections with your audience
- **Marketing Professionals**: Leverage nostalgic content for brand storytelling, using vintage aesthetics to create compelling social media campaigns
- **Digital Artists & Photo Restorers**: Streamline the process of enhancing and sharing restored vintage photographs with automated AI enhancement
- **Social Media Influencers**: Create unique, eye-catching content from historical images that stands out in crowded social feeds

**What Problem Does This Workflow Solve?**
Creating engaging social media content from old photos typically requires multiple manual steps: photo restoration, colorization, animation, and then individual posting to each platform. This workflow addresses these challenges by:
- **Automating Photo Enhancement**: Uses advanced AI (FLUX Kontext) to automatically colorize and enhance old photographs, removing artifacts and improving quality
- **Creating Dynamic Content**: Transforms static images into animated videos using Kling Video AI, making historical photos come alive with natural movements
- **Streamlining Multi-Platform Publishing**: Automatically distributes the final animated videos across Facebook, Instagram, YouTube, and X with a single workflow execution
- **Saving Time & Effort**: Eliminates the need for manual photo editing, video creation, and individual social media posting

**How It Works**
1. Photo Upload: Users submit old photographs through a simple web form, with optional custom animation descriptions
2. Image Enhancement: The workflow uploads the photo to imgbb, then sends it to FLUX Kontext AI for colorization and quality enhancement
3. Animation Creation: The colorized image is processed by Kling Video AI to create a 5-second animated video with natural movements
4. Cloud Storage: The final video is automatically saved to Google Drive for backup and easy access
5. Multi-Platform Publishing: The animated video is simultaneously posted to Facebook, Instagram, YouTube, and X using the upload-post service
**Setup**
1. FAL.AI API Key: Sign up at fal.ai and add your API key to the HTTP Request nodes for both the FLUX Kontext and Kling Video AI services
2. ImgBB API Token: Create a free account at api.imgbb.com to get an API token for image hosting, then update the "Upload Image to imgbb" node
3. Google Drive Connection: Connect your Google Drive account to enable automatic video storage and backup
4. Upload-Post Service: Create an account at upload-post.com to get your API credentials for multi-platform social media posting
   - Important: The upload-post node currently only works with self-hosted n8n instances. For n8n Cloud users, replace the upload-post node with standard HTTP Request nodes to publish to each social media platform individually.
5. Form Customization (optional): Modify the form fields in the "Photo Upload Form" node to collect additional information or customize the user experience

**Requirements**
- **Accounts**: n8n, FAL.AI, ImgBB, Google Drive, upload-post.com
- **API Keys & Credentials**: FAL.AI API Key, ImgBB API Token, Google Drive OAuth2, Upload-post.com API Token & User ID
- **File Types**: Supports JPG, PNG image formats for photo uploads
- **Cost**: Approximately $0.29 USD per workflow execution for AI processing

Transform your old photographs into viral social media content with this powerful AI-driven workflow that handles everything from restoration to distribution automatically.
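Since only JPG and PNG uploads are supported, a small Code node placed right after the form trigger could reject anything else before any paid AI calls are made. This is an optional sketch, and the binary property name (`photo`) is an assumption about how the form field is named in the template.

```javascript
// Hypothetical sketch: validate the uploaded file type before calling the AI services.
const ALLOWED_TYPES = ['image/jpeg', 'image/png'];

return $input.all().map((item) => {
  const upload = item.binary?.photo;
  const mimeType = upload?.mimeType ?? '';

  if (!ALLOWED_TYPES.includes(mimeType)) {
    // Surface a clear error instead of spending credits on an unsupported file.
    throw new Error(`Unsupported file type "${mimeType}" - please upload a JPG or PNG image.`);
  }

  return item;
});
```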
by Airtop
**Trump-o-meter: Extract and Evaluate Truth Social Posts**

**Use Case**
Automatically extracting posts from Donald Trump's Truth Social account and estimating their potential impact on the U.S. stock market enables teams to monitor high-profile communications that may influence financial markets. This automation streamlines intelligence gathering for analysts, traders, and policy observers.

**What This Automation Does**
This automation retrieves up to 3 posts from Donald Trump's Truth Social profile and outputs structured information including:
- Author name
- Image URL
- Post text
- Post URL
- Estimated stock market impact:
  - Direction: positive, negative, or neutral
  - Magnitude: None, Small, Medium, Large

**How It Works**
1. Creates a browser session on Truth Social using an Airtop profile
2. Navigates to https://truthsocial.com/@realDonaldTrump
3. Uses a natural language prompt with a defined JSON schema to extract structured data for up to 3 posts
4. Splits the results into individual post items
5. Filters posts that contain actual content and have a non-zero estimated market impact
6. Sends selected posts and impact summaries to a Slack channel
7. Terminates the browser session to clean up

**Setup Requirements**
- Airtop API Key - free to generate
- An Airtop Profile that is connected and logged into Truth Social
- A Slack workspace and an authorized app with write permissions to a target channel

**Next Steps**
- **Integrate with Trading Signals**: Link output to financial alert systems or dashboards for timely insights
- **Expand Monitoring**: Extend to other high-impact accounts (e.g., politicians, CEOs)
- **Enhance Analysis**: Add sentiment scoring or topic classification for deeper context

**Legal Disclaimer**
This tool is intended solely for informational and analytical purposes. The market impact estimations provided are speculative and should not be construed as financial advice. Do not make investment decisions based on this automation. Always consult with a licensed financial advisor before making any trades.

Read more about the Trump-o-meter automation.
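For clarity, here is a hypothetical Code-node version of the filtering step, assuming each extracted post item has a `post_text` field and an `estimated_impact` object with `direction` and `magnitude` values matching the schema described above.

```javascript
// Hypothetical sketch: keep only posts with real content and a non-zero market impact estimate.
return $input.all().filter((item) => {
  const post = item.json;
  const hasContent = (post.post_text ?? '').trim().length > 0;
  const magnitude = post.estimated_impact?.magnitude ?? 'None';

  // "None" means no plausible market effect was estimated, so the post is dropped.
  return hasContent && magnitude !== 'None';
});
```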
by Elie Kattar
**Multi-Channel Customer Support Automation Suite**

Transform your customer support operations with this enterprise-grade automation workflow that unifies, categorizes, and intelligently routes support tickets from multiple channels.

**Overview**
This comprehensive n8n workflow automates your entire customer support pipeline, reducing response times by up to 80% while ensuring no customer inquiry goes unnoticed. It seamlessly integrates email, web forms, and webhooks into a single, intelligent support system that works 24/7.

**Key Benefits**
- **Unified Inbox**: Consolidate support requests from email, web forms, chat, and social media into one streamlined workflow
- **Instant Response**: Automatically acknowledge tickets with intelligent, category-specific responses within seconds
- **Smart Routing**: Use AI-powered categorization to route tickets to the right team instantly
- **Priority Detection**: Automatically identify and escalate urgent issues and VIP customers
- **Team Collaboration**: Real-time Slack notifications with color-coded priority alerts
- **Zero Setup Hassle**: Pre-configured with industry best practices and ready to deploy

**Core Features**

Intelligent Ticket Processing:
- Automatic categorization into billing, technical, account, feature requests, and complaints
- Sentiment analysis to detect frustrated customers
- Priority assignment based on keywords, customer status, and urgency indicators
- Custom tagging for easy tracking and reporting

Multi-Channel Integration:
- IMAP email monitoring for support inboxes
- Webhook endpoints for web forms and chat widgets
- Expandable architecture for social media channels
- Unified message format regardless of source

Automated Response System:
- Category-specific email templates
- Personalized responses with ticket IDs
- Smart logic to skip auto-responses for urgent/negative cases
- Customizable templates for your brand voice

Team Notifications & Escalation:
- Real-time Slack alerts with full ticket context
- Color-coded priorities (red/urgent, orange/high, green/normal)
- One-click actions to view or claim tickets
- Automatic escalation rules for time-sensitive issues

CRM & Analytics Ready:
- Pre-configured for major CRM systems (Zendesk, HubSpot, Salesforce)
- Comprehensive logging for performance metrics
- Error handling with admin notifications
- Built-in success/failure tracking

**Use Cases**
- SaaS Companies: Handle subscription issues, technical bugs, and feature requests with specialized routing to product, engineering, and billing teams
- E-commerce: Manage order inquiries, shipping issues, and returns while maintaining high customer satisfaction scores
- Agencies: Provide white-label support services with customizable branding and client-specific routing rules
- Startups: Scale support operations without hiring additional staff by automating 70% of routine inquiries
**Technical Specifications**
- **Channels Supported**: Email (IMAP), web forms, webhooks; expandable to social media
- **Response Time**: < 2 seconds for auto-responses
- **Categorization Accuracy**: 85%+ with keyword matching, 95%+ with AI enhancement
- **Scalability**: Handles 1,000+ tickets/day on standard n8n infrastructure
- **Integration Ready**: Slack, all major CRMs, SMTP, custom APIs

**ROI & Impact**
Typical results from implementing this workflow:
- **80% reduction** in first response time
- **60% decrease** in ticket handling time
- **40% of tickets** resolved automatically
- **95% customer satisfaction** for auto-responded tickets
- **Save 20+ hours/week** of manual ticket sorting

**What's Included**
- Complete n8n workflow JSON (ready to import)
- 5 pre-configured auto-response templates
- Intelligent categorization rules for common support scenarios
- Priority detection algorithms
- Slack notification formatting
- Error handling and recovery logic
- Setup documentation and customization guide

**Requirements**
- n8n instance (self-hosted or cloud)
- Email account with IMAP/SMTP access
- Slack workspace (for notifications)
- CRM system (optional but recommended)

**Quick Setup**
1. Import the workflow JSON
2. Configure email and Slack credentials
3. Customize auto-response templates
4. Connect your CRM
5. Go live in under 30 minutes

Perfect for businesses handling 50-5,000 support tickets monthly who want to deliver exceptional customer service while reducing operational costs.
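To give a feel for the keyword-based categorization and priority detection described above, here is a simplified Code-node sketch; the keyword lists and the `subject`/`body` field names are illustrative assumptions, not the template's exact rules.

```javascript
// Hypothetical sketch: categorize a ticket and assign a priority from simple keyword matching.
const CATEGORY_KEYWORDS = {
  billing: ['invoice', 'refund', 'charge', 'payment'],
  technical: ['error', 'bug', 'crash', 'not working'],
  account: ['password', 'login', 'access', 'account'],
  feature_request: ['feature', 'request', 'would be great'],
  complaint: ['unacceptable', 'disappointed', 'terrible'],
};
const URGENT_KEYWORDS = ['urgent', 'asap', 'immediately', 'down'];

return $input.all().map((item) => {
  const text = `${item.json.subject ?? ''} ${item.json.body ?? ''}`.toLowerCase();

  // Pick the first category whose keywords appear; fall back to "general".
  const category =
    Object.keys(CATEGORY_KEYWORDS).find((cat) =>
      CATEGORY_KEYWORDS[cat].some((kw) => text.includes(kw)),
    ) ?? 'general';

  const priority = URGENT_KEYWORDS.some((kw) => text.includes(kw)) ? 'urgent' : 'normal';

  return { json: { ...item.json, category, priority } };
});
```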
by Extruct AI
**Who's it for:**
Investors, analysts, and startup enthusiasts who need a complete overview of startups, including industry, product, funding, and leadership information.

**How it works / What it does:**
Enter a startup's name into the form, and the workflow will automatically collect and organize details such as the company's industry, product, investors, and key decision-makers. All this information is neatly updated in your Google Sheet, making it easy to track and compare startups.

**How to set up:**
1. Sign up for Extruct at www.extruct.ai/.
2. Open the Extruct table template, copy the table ID from the URL, and save it.
3. Copy the Google Sheets template to your own Drive.
4. Paste the table ID into the variables node in your n8n flow.
5. Set up Bearer authentication in each HTTP Request node using your Extruct API token.
6. In the Google Sheets node, paste your template link and connect your Google account.
7. Run the flow once to reveal the mapping fields, then match each field to the correct column.
8. Activate the flow and add startups via the form.

**Requirements:**
- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

**How to customize the workflow:**
Add new columns in both the Extruct table and your Google Sheet, then map them in the Google Sheets node to track additional startup data.
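As a final illustration, here is a hypothetical Code-node sketch that flattens one company record into the sheet columns mentioned above. The response field names (`industry`, `product`, `investors`, `decision_makers`) are assumptions about the Extruct table output and should be adjusted to match your actual column mapping.

```javascript
// Hypothetical sketch: flatten a company record into one Google Sheets row.
return $input.all().map((item) => {
  const company = item.json;

  return {
    json: {
      name: company.name,
      industry: company.industry,
      product: company.product,
      // Join list-valued fields so each fits into a single spreadsheet cell.
      investors: (company.investors ?? []).join(', '),
      key_decision_makers: (company.decision_makers ?? [])
        .map((person) => `${person.name} (${person.role})`)
        .join('; '),
    },
  };
});
```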