by ScoutNow
Monitor and record your personal or competitors' social media growth with this scalable n8n automation template. Using official APIs from X (formerly Twitter) and YouTube, the workflow fetches daily follower and subscriber counts, stores them in a structured n8n Data Table, and now sends automated weekly summary emails via Gmail. Built with extensibility in mind, this workflow is ready for future updates to support additional platforms like Instagram, TikTok, LinkedIn, and more.

## Features

- **Daily Social Media Tracking** - Automatically collects X follower counts and YouTube subscriber numbers based on usernames, not channel IDs.
- **Data Table Logging** - Cleanly stores daily metrics in a dedicated n8n Data Table with timestamps.
- **Weekly Email Reports (New)** - Sends a concise weekly summary of growth trends using the Gmail node.
- **Easy Customization** - Swap out usernames in a couple of fields; no deep edits required.
- **Extensible Design** - The structure is ready to support more platforms (e.g., TikTok, Instagram, LinkedIn).
- **API-Based Accuracy** - Uses official APIs from X and YouTube for real-time, reliable data.

## Setup Instructions

### 1. Get API Credentials

- **X API (Bearer Token)** - developer.x.com
- **YouTube API Key** - Google Cloud Console
- **Gmail Credentials** - Enable the **Gmail API** in your Google Cloud project and configure OAuth2 credentials for use in n8n.

### 2. Configure in n8n

1. Import the template.
2. In the HTTP Request nodes:
   - Add Bearer Auth (for the X API)
   - Add Query Auth with your YouTube API Key (`?key=<your_api_key>`)
3. In the Gmail node:
   - Connect your Gmail account via OAuth2 credentials
   - Customize the recipient email(s) and message format
4. Edit these fields to track your accounts:
   - `xUsername` - your X / Twitter handle
   - `ytChannelUsername` - your YouTube channel's username

### 3. Create the Data Table

Inside the n8n dashboard, create a table with the following fields:

| Field Name        | Type     |
| ----------------- | -------- |
| date              | DateTime |
| xFollowersCount   | Number   |
| ytSubscriberCount | Number   |

## How It Works

1. **Trigger:** A daily Cron node starts the workflow.
2. **Fetch X Followers:** Grabs the follower count using the X API.
3. **Fetch YouTube Subscribers:** Retrieves the subscriber count using the YouTube API.
4. **Store Data:** Logs all values into your Data Table with a timestamp.
5. **Weekly Email Summary:** Once a week, the Gmail node compiles recent data and emails a growth report.
6. **Future Expansion:** The structure is ready to include more platforms.

## Use Cases

- Track your brand's or a competitor's social growth daily
- Receive weekly email reports on follower/subscriber changes
- Build custom dashboards or growth charts
- Compare performance across platforms
- Generate automated growth reports

## Requirements

- X API Bearer Token
- YouTube API Key
- Gmail API Credentials
- Access to n8n (cloud or self-hosted)

## Delivery Options

You can extend this template to:

- Post daily growth summaries to Slack or Telegram
- Auto-update a Google Sheet or Notion database
- Send more detailed weekly reports
- Add new platforms (Instagram, TikTok, LinkedIn, etc.)

## Details

| Node         | Function                      |
| ------------ | ----------------------------- |
| HTTP Request | Pull data from APIs           |
| Cron         | Trigger workflow daily        |
| Data Table   | Store historical growth data  |
| Gmail        | Send weekly email reports     |
| Set          | Define usernames and settings |
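The two API calls return nested JSON, so a Code node typically reshapes them into a Data Table row. The sketch below is a hedged example: the field names match the table above, while the response shapes assume the X API v2 users endpoint (with `public_metrics`) and YouTube Data API v3 `channels.list` (with `part=statistics`); adjust the paths if your requests differ.

```javascript
// Hedged sketch of an n8n Code node: map the two API responses onto the
// Data Table fields. Response shapes are assumptions based on the X API v2
// (public_metrics) and YouTube Data API v3 (statistics).
function buildDailyRow(xResponse, ytResponse, now = new Date()) {
  return {
    date: now.toISOString(),
    // X v2: GET /2/users/by/username/:name?user.fields=public_metrics
    xFollowersCount: xResponse.data.public_metrics.followers_count,
    // YouTube v3: GET /youtube/v3/channels?part=statistics&...
    // subscriberCount comes back as a string, so coerce it to a number
    ytSubscriberCount: Number(ytResponse.items[0].statistics.subscriberCount),
  };
}

// Example with mocked responses:
const row = buildDailyRow(
  { data: { public_metrics: { followers_count: 1250 } } },
  { items: [{ statistics: { subscriberCount: "980" } }] }
);
console.log(row);
```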
by Dmytro
## Description

This automation template enables you to publish content from a Google Drive folder directly to multiple social platforms: TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X (Twitter), and Threads. By connecting PostPulse with n8n, you can transform a manual posting routine into a seamless automated workflow, ensuring consistent cross-platform publishing without repetitive tasks.

> ⚠️ **Disclaimer:** This workflow uses the community node **@postpulse/n8n-nodes-postpulse**. Make sure community nodes are enabled in your n8n instance before importing and using this template.
>
> To install it: Go to Settings → Community Nodes → Install and enter `@postpulse/n8n-nodes-postpulse`.
>
> For more details, see the n8n Integration Guide: PostPulse Developers - n8n Integration.

## Who Is This For?

- **Marketers** who want to manage multiple accounts at once.
- **Creators** who store media on Google Drive and want to distribute it quickly.
- **Teams** that need a centralized content plan and a transparent publishing system.

## What Problem Does This Workflow Solve?

Instead of manually uploading photos or videos to TikTok, Instagram, YouTube, or other social networks, you get:

- **Centralized uploads:** Add your media once to Google Drive, and the system takes care of publishing it everywhere.
- **Multi-platform posting:** Publish simultaneously to TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, and Threads.
- **Streamlined scheduling:** Schedule future posts with PostPulse directly through n8n.
- **Error reduction:** Avoid mistakes caused by copy-pasting across platforms.

## How It Works

1. **File Upload:** Place your media file (image or video) into a designated Google Drive folder.
2. **File Processing:** n8n automatically downloads the file and prepares it for upload.
3. **Account Retrieval:** PostPulse retrieves your connected accounts (TikTok, Instagram, YouTube, etc.).
4. **Media Upload:** The file is uploaded to PostPulse via n8n.
5. **Automation:** PostPulse automatically distributes it to TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, and Threads.
6. **Publishing:** PostPulse schedules or directly publishes the post to the selected platforms.

## Setup

1. **Connect PostPulse to n8n**
   - Request your OAuth client key and secret from support@post-pulse.com.
   - Add your PostPulse account in n8n Credentials.
2. **Connect Google Drive**
   - Create a Google Cloud project.
   - Enable the Google Drive API.
   - Configure OAuth credentials and connect your Google Drive account to n8n.
3. **Configure the Google Drive Trigger**
   - Point it to the folder where you will upload your media.
4. **Upload Media Node**
   - Add the PostPulse "Upload Media" node to process files from Google Drive.
5. **Schedule Posts**
   - Add the PostPulse "Schedule Post" node.
   - Map content, media path, and connected account IDs.
6. **(Optional) Metadata from Google Sheets**
   - Use Google Sheets as a source of captions, hashtags, or scheduling details.

## Requirements

- Connected accounts at PostPulse (TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, Threads).
- OAuth client key and secret requested from support@post-pulse.com.
- Google Cloud project with the Google Drive API enabled and valid OAuth credentials.

✨ With this workflow, PostPulse and n8n become your all-in-one automation hub for social publishing.
by Dariusz Koryto
Automated FTP File Migration with Smart Cleanup and Email Notifications

## Overview

This n8n workflow automates the secure transfer of files between FTP servers on a scheduled basis, providing enterprise-grade reliability with comprehensive error handling and dual notification systems (email + webhook). Perfect for data migrations, automated backups, and multi-server file synchronization.

## What it does

This workflow automatically discovers, filters, transfers, and safely removes files between FTP servers while maintaining complete audit trails and sending detailed notifications about every operation.

### Key Features

- **Scheduled Execution**: Configurable timing (daily, hourly, weekly, or custom cron expressions)
- **Smart File Filtering**: Regex-based filtering by file type, size, date, or name patterns
- **Safe Transfer Protocol**: Downloads → Uploads → Validates → Cleans up source
- **Dual Notifications**: Email alerts + webhook integration for both success and errors
- **Comprehensive Logging**: Detailed audit trail of all operations with timestamps
- **Error Recovery**: Automatic retry logic with exponential backoff for network issues
- **Production Ready**: Built-in safety measures and extensive documentation

## Use Cases

### Enterprise & IT Operations

- **Data Center Migration**: Moving files between different hosting environments
- **Backup Automation**: Scheduled transfers to secondary storage locations
- **Multi-Site Synchronization**: Keeping files in sync across geographic locations
- **Legacy System Integration**: Bridging old and new systems through automated transfers

### Business Operations

- **Document Management**: Automated transfer of contracts, reports, and business documents
- **Media Asset Distribution**: Moving images, videos, and marketing materials between systems
- **Data Pipeline**: Part of larger ETL processes for business intelligence
- **Compliance Archiving**: Moving files to compliance-approved storage systems

### Development & DevOps

- **Build Artifact Distribution**: Deploying compiled applications across environments
- **Configuration Management**: Synchronizing config files between servers
- **Log File Aggregation**: Collecting logs from multiple servers for analysis
- **Automated Deployment**: Moving release packages to production servers

## How it works

### Workflow Steps

1. **Schedule Trigger** - Initiates the workflow at specified intervals
2. **File Discovery** - Lists files from the source FTP server with optional recursion
3. **Smart Filtering** - Applies customizable filters (type, size, date, name patterns)
4. **Secure Download** - Retrieves files to temporary n8n storage with retry logic
5. **Safe Upload** - Transfers files to the destination with directory auto-creation
6. **Transfer Validation** - Verifies successful upload before proceeding
7. **Source Cleanup** - Removes original files only after confirmed success
8. **Comprehensive Logging** - Records all operations with detailed metadata
9. **Dual Notifications** - Sends email + webhook notifications for success/failure

### Error Handling Flow

- **Network Issues** → Automatic retry with exponential backoff (3 attempts)
- **Authentication Problems** → Immediate email alert with troubleshooting steps
- **Permission Errors** → Detailed logging with recommended actions
- **Disk Space Issues** → Safe failure with source file preservation
- **File Corruption** → Integrity validation with rollback capability

## Setup Requirements

### Credentials Needed

1. **Source FTP Server**
   - Host, port, username, password
   - Read permissions required
   - SFTP recommended for security
2. **Destination FTP Server**
   - Host, port, username, password
   - Write permissions required
   - Directory creation permissions
3. **SMTP Email Server**
   - SMTP host and port (e.g., smtp.gmail.com:587)
   - Authentication credentials
   - For success and error notifications
4. **Monitoring API (Optional)**
   - Webhook URL for system integration
   - Authentication tokens if required

### Configuration Steps

1. **Import Workflow** - Load the JSON template into your n8n instance
2. **Configure Credentials** - Set up all required FTP and SMTP connections
3. **Customize Schedule** - Adjust the cron expression for your timing needs
4. **Set File Filters** - Configure regex patterns for your file types
5. **Configure Paths** - Set source and destination directory structures
6. **Test Thoroughly** - Run with test files before production deployment
7. **Enable Monitoring** - Activate email notifications and logging

## Customization Options

### Scheduling Examples

```
0 2 * * *      # Daily at 2 AM
0 */6 * * *    # Every 6 hours
0 8 * * 1-5    # Weekdays at 8 AM
0 0 1 * *      # Monthly on the 1st
*/15 * * * *   # Every 15 minutes
```

### File Filter Patterns

```
Documents:   \\.(pdf|doc|docx|xls|xlsx)$
Images:      \\.(jpg|jpeg|png|gif|svg)$
Data Files:  \\.(csv|json|xml|sql)$
Archives:    \\.(zip|rar|7z|tar|gz)$

Size-based (add as condition):   {{ $json.size > 1048576 }}   # Files > 1MB
Date-based (recent files only):  {{ $json.date > $now.minus({days: 7}) }}
```

### Directory Organization

```
// Date-based structure
/files/{{ $now.format('YYYY/MM/DD') }}/

// Type-based structure
/files/{{ $json.name.split('.').pop() }}/

// User-based structure
/users/{{ $json.owner || 'system' }}/

// Hybrid approach
/{{ $now.format('YYYY-MM') }}/{{ $json.type }}/
```

## Template Features

### Safety & Security

- **Transfer Validation**: Confirms successful upload before source deletion
- **Error Preservation**: Source files remain intact on any failure
- **Audit Trail**: Complete logging of all operations with timestamps
- **Credential Security**: Secure storage using n8n's credential system
- **SFTP Support**: Encrypted transfers when available
- **Retry Logic**: Automatic recovery from transient network issues

### Notification System

Success notifications:

- Confirmation email with transfer details
- File metadata (name, size, transfer time)
- Next scheduled execution information
- Webhook payload for monitoring systems

Error notifications:

- Immediate email alerts with error details
- Troubleshooting steps and recommendations
- Failed file information for manual intervention
- Webhook integration for incident management

### Monitoring & Analytics

- **Execution Logs**: Detailed history of all workflow runs
- **Performance Metrics**: Transfer speeds and success rates
- **Error Tracking**: Categorized failure analysis
- **Audit Reports**: Compliance-ready activity logs

## Production Considerations

### Performance Optimization

- **File Size Limits**: Configure timeouts based on expected file sizes
- **Batch Processing**: Handle multiple files efficiently
- **Network Optimization**: Schedule transfers during off-peak hours
- **Resource Monitoring**: Track n8n server CPU, memory, and disk usage

### Maintenance

- **Regular Testing**: Validate credentials and connectivity
- **Log Review**: Monitor for patterns in errors or performance
- **Credential Rotation**: Update passwords and keys regularly
- **Documentation Updates**: Keep configuration notes current

## Testing Protocol

### Pre-Production Testing

1. **Phase 1**: Test with 1-2 small files (< 1MB)
2. **Phase 2**: Test error scenarios (invalid credentials, network issues)
3. **Phase 3**: Test with representative file sizes and volumes
4. **Phase 4**: Validate email notifications and logging
5. **Phase 5**: Full production deployment with monitoring

### ⚠️ Important Testing Notes

- **Disable source deletion** during initial testing
- Use test directories to avoid impacting production data
- **Monitor execution logs** carefully during testing
- **Validate email delivery** to ensure notifications work
- **Test rollback procedures** before production use

## Support & Documentation

This template includes:

- **8 comprehensive sticky notes** with visual documentation
- **Detailed node comments** explaining every configuration option
- **Error handling guide** with common troubleshooting steps
- **Security best practices** for production deployment
- **Performance tuning** recommendations for different scenarios

## Technical Specifications

- **n8n Version**: 1.0.0+
- **Node Count**: 17 functional nodes + 8 documentation sticky notes
- **Execution Time**: 2-10 minutes (depending on file sizes and network speed)
- **Memory Usage**: 50-200 MB (scales with file sizes)
- **Supported Protocols**: FTP, SFTP (recommended)
- **File Size Limit**: Up to 150 MB per file (configurable)
- **Concurrent Files**: Processes files sequentially for stability

## Who is this for?

### Primary Users

- **System Administrators** managing file transfers between servers
- **DevOps Engineers** automating deployment and backup processes
- **IT Operations Teams** handling data migration projects
- **Business Process Owners** requiring automated file management

### Industries & Use Cases

- **Healthcare**: Patient data archiving and compliance reporting
- **Financial Services**: Secure document transfer and regulatory reporting
- **Manufacturing**: CAD file distribution and inventory data sync
- **E-commerce**: Product image and catalog management
- **Media**: Asset distribution and content delivery automation
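The smart-filtering and retry behaviour described for this template can be sketched in a Code node. This is a hedged illustration, not the template's exact nodes: `file` objects are assumed to carry `{ name, size, date }` as produced by an FTP list operation, and the backoff schedule (3 attempts at 1s/2s/4s) mirrors the "exponential backoff" described above.

```javascript
// Hedged sketch: regex/size/age filter plus exponential-backoff retry.
function shouldTransfer(file, opts) {
  const { pattern, minSize = 0, maxAgeDays = Infinity } = opts;
  const re = new RegExp(pattern, 'i');
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  return re.test(file.name) && file.size >= minSize && file.date.getTime() >= cutoff;
}

// Retry a transfer task with exponential backoff: waits 1s, 2s, 4s between attempts.
async function withRetry(task, attempts = 3, baseDelayMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await task();
    } catch (err) {
      if (i === attempts - 1) throw err; // exhausted: bubble up for alerting
      await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}

const files = [
  { name: 'report.pdf', size: 2_000_000, date: new Date() },
  { name: 'notes.txt', size: 500, date: new Date() },
];
const selected = files.filter(f =>
  shouldTransfer(f, { pattern: '\\.(pdf|doc|docx|xls|xlsx)$', minSize: 1048576 })
);
console.log(selected.map(f => f.name)); // only the document matching the pattern survives
```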
by Anthony
## Description

This workflow automates video distribution to 9 social platforms simultaneously using Blotato's API. It includes both a scheduled publisher (checks Google Sheets for videos marked "Ready") and a subworkflow (can be called from other workflows). Perfect for creators and marketers who want to eliminate manual posting across Instagram, YouTube, TikTok, Facebook, LinkedIn, Threads, Twitter, Bluesky, and Pinterest.

## How It Works

### Scheduled Publisher Workflow

1. **Schedule Trigger** → Runs daily at 10 PM (configurable).
2. **Fetch Video** → Pulls the video URL and description from Google Sheets where "ReadyToPost" = "Ready".
3. **Upload to Blotato** → Sends the video to Blotato's media service.
4. **Broadcast to 9 Platforms** → Publishes simultaneously to all connected social accounts.
5. **Update Sheet** → Changes "ReadyToPost" to "Finished" so it won't repost.

### Subworkflow: Video Publisher (Reusable)

1. **Receive Input** → Gets the URL, title, and description from the parent workflow.
2. **Fetch Credentials** → Pulls the Blotato API key from an n8n Data Table.
3. **Upload & Distribute** → Uploads to Blotato, then posts to all platforms.
4. **Completion Signal** → Returns to the parent workflow when done.

> 💡 Tip: The subworkflow can be called from ANY workflow - great for posting videos generated by AI workflows, webhook triggers, or manual forms.

### Test Workflow (Optional)

1. **Form Submission** → Upload a video file with a title and description.
2. **Upload to Dropbox** → Generates a shareable URL via the "[SUB] Dropbox Upload Link" subworkflow.
3. **Trigger Publisher** → Calls the subworkflow above to distribute the video.
## Setup Instructions

Estimated setup time: 20-25 minutes

### Step 1: Blotato Account Setup

1. Create an account at the Blotato Dashboard.
2. Connect all your social media accounts (the most time-consuming step).
3. Go to Settings and copy your account IDs for each platform.
4. Go to API Settings and copy your API key.

### Step 2: Configure Workflow

1. **Update Social IDs:**
   - Open the "Assign Social Media IDs" node
   - Replace the placeholder IDs with your actual Blotato account IDs:

```json
{ "instagram_id": "YOUR_ID", "youtube_id": "YOUR_ID", "tiktok_id": "YOUR_ID", ... }
```

2. **Create Data Table:**
   - Create an n8n Data Table named "Credentials"
   - Add columns: service and token
   - Add a row: service = blotato, token = YOUR_API_KEY
3. **Set Up Google Sheet:**
   - Create a sheet with columns: URL VIDEO, ReadyToPost, Description, Titre (Title)
   - Add video data
   - Set ReadyToPost to "Ready" for videos you want to post
4. **Connect Your Sheet:**
   - Update the "Get my video" node with your Google Sheet ID

> Pro Tip: If you don't need the scheduled version, just use the subworkflow and call it from other workflows.

## Use Cases

- **AI Video Workflows:** Automatically post videos generated by Veo, Sora, or other AI models to all platforms.
- **Content Schedulers:** Queue videos in Google Sheets and let the scheduler post them automatically.
- **Batch Publishing:** Generate 10 videos, mark them all "Ready", and let the workflow distribute them.
- **Marketing Campaigns:** Coordinate multi-platform launches with a single click.
- **Agencies:** Manage multiple client accounts by swapping Blotato credentials in the Data Table.

## Customization Options

- **Remove Unused Platforms:** Disconnect any social media nodes you don't use (speeds up execution).
- **Change Schedule:** Modify the Schedule Trigger to run multiple times per day or on specific days.
- **Different File Hosts:** Replace Dropbox with Google Drive, S3, or Cloudinary in the test workflow.
- **Platform-Specific Captions:** Add IF nodes before each platform to customize descriptions or add hashtags.
- **Add Approval Step:** Insert a WhatsApp or Telegram notification before posting for manual review.
- **Watermarks:** Add a Code node to overlay branding before uploading to Blotato.

## Important Notes

⚠️ **Two workflows in one file:**

- Lines 1-600: Scheduled publisher (checks Google Sheets)
- Lines 600+: Subworkflow (called by other workflows)

⚠️ **Data Table vs. hardcoding:**

- Scheduled workflow: Hardcoded API keys in HTTP nodes
- Subworkflow: Uses the Data Table for API keys (recommended approach)

⚠️ **Why use the subworkflow?**

- It can be called from ANY workflow
- API keys are easier to manage (one place to update)
- It is more flexible for complex automation systems
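The "platform-specific captions" customization can be sketched as a small Code node that trims the base description to each platform's limit and appends hashtags. This is a hedged illustration: the character limits and platform keys below are assumptions for the example, not values from the template or Blotato's API.

```javascript
// Hedged sketch: per-platform caption builder. Limits are illustrative
// assumptions (e.g. 280 for Twitter/X); verify against each platform.
const LIMITS = { twitter: 280, bluesky: 300, threads: 500, default: 2200 };

function captionFor(platform, baseText, hashtags = []) {
  const suffix = hashtags.length ? ' ' + hashtags.join(' ') : '';
  const limit = LIMITS[platform] ?? LIMITS.default;
  const room = limit - suffix.length;
  // Truncate the body (with an ellipsis) so body + hashtags fit the limit.
  const body = baseText.length > room
    ? baseText.slice(0, Math.max(0, room - 1)) + '…'
    : baseText;
  return body + suffix;
}

const caption = captionFor(
  'twitter',
  'My new video is out! '.repeat(20),
  ['#n8n', '#automation']
);
console.log(caption.length <= 280); // the caption stays within the assumed limit
```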
by David Olusola
## Auto-Send PDF Invoice When Stripe Payment is Received

This workflow automatically generates a PDF invoice every time a successful payment is received in Stripe, then emails the invoice to the customer via Gmail. Perfect for freelancers, SaaS businesses, and service providers who want to automate billing without manual effort.

## How It Works

1. **Stripe Payment Webhook**
   - Listens for successful payment events (payment_intent.succeeded).
   - Triggers the workflow whenever a new payment is made.
2. **Normalize Payment Data**
   - A Code node extracts and formats details like:
     - Payment ID
     - Amount & currency
     - Customer name & email
     - Payment date
     - Description
   - Generates a unique invoice number.
3. **Generate Invoice HTML**
   - A Code node builds a professional invoice template in HTML.
   - Data is dynamically inserted (amount, customer info, invoice number).
   - Output is prepared for PDF generation.
4. **Send Invoice Email**
   - The Gmail node sends an email to the customer.
   - The invoice is attached as a PDF file.
   - It includes a confirmation message with payment details.

## Setup Steps

### 1. Stripe Webhook

In your Stripe Dashboard:

1. Navigate to Developers → Webhooks.
2. Add a new endpoint with your Webhook URL from the n8n Webhook node.
3. Select the event: payment_intent.succeeded

### 2. Gmail Setup

- In n8n, connect your Gmail OAuth2 credentials.
- Emails will be sent directly from your Gmail account.

### 3. Customize Invoice

- Open the Generate Invoice HTML node.
- Replace "Your Company Name" with your actual business name.
- Adjust invoice branding, colors, and layout as needed.

## Example Email Sent

Subject: Invoice INV-123456789 - Payment Confirmation

Body:

> Dear John Doe,
>
> Thank you for your payment! Please find your invoice attached.
>
> Payment Details:
> Amount: USD 99.00
> Payment ID: pi_3JXXXXXXXX
> Date: 2025-08-29
>
> Best regards,
> Your Company Name

(Attached: invoice_INV-123456789.pdf)

⚡ With this workflow, every Stripe payment automatically creates and delivers a polished PDF invoice with no manual work required.
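The "Normalize Payment Data" step can be sketched as follows. The event shape follows Stripe's `payment_intent.succeeded` webhook payload (amounts in the smallest currency unit, `created` as epoch seconds); the `INV-<timestamp>` invoice-number format is an illustrative assumption, since the template's exact scheme isn't shown.

```javascript
// Hedged sketch of the Code node that normalizes a Stripe webhook event.
function normalizePayment(event) {
  const pi = event.data.object; // the PaymentIntent from the webhook payload
  return {
    paymentId: pi.id,
    amount: (pi.amount / 100).toFixed(2), // Stripe amounts are in cents
    currency: pi.currency.toUpperCase(),
    customerEmail: pi.receipt_email || null,
    date: new Date(pi.created * 1000).toISOString().slice(0, 10),
    description: pi.description || 'Payment',
    invoiceNumber: `INV-${pi.created}`, // assumed format: timestamp-based
  };
}

const normalized = normalizePayment({
  data: {
    object: {
      id: 'pi_3JXXXXXXXX',
      amount: 9900,
      currency: 'usd',
      receipt_email: 'john@example.com',
      created: 1756425600, // 2025-08-29 UTC
      description: 'Consulting services',
    },
  },
});
console.log(normalized.amount, normalized.currency); // 99.00 USD
```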
by vinci-king-01
## How it works

This workflow automatically monitors government regulatory changes and provides comprehensive compliance tracking and executive alerts.

### Key Steps

1. **Scheduled Monitoring** - Runs daily at 9 AM to check for new regulatory changes from government sources.
2. **AI-Powered Scraping** - Uses ScrapeGraphAI to extract regulatory information from the Federal Register and government websites.
3. **Impact Assessment** - Analyzes each regulation for business impact, risk factors, and compliance requirements.
4. **Compliance Tracking** - Creates detailed tracking records with assigned teams, deadlines, and action items.
5. **Executive Alerts** - Sends prioritized alerts to relevant teams based on impact level and urgency.

## Set up steps

Setup time: 10-15 minutes

1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for web scraping capabilities.
2. **Set up email connections** - Configure an email service to send executive alerts to compliance and legal teams.
3. **Customize monitoring targets** - Update the government website URLs to monitor specific agencies or regulatory bodies.
4. **Adjust alert recipients** - Configure email distribution lists for different impact levels (Critical, High, Medium, Low).
5. **Set up compliance tracking** - Integrate with your project management system for task assignment and progress tracking.

## Key Features

- **Automated Impact Assessment** - Uses AI to evaluate regulatory impact on your business sectors
- **Priority-Based Alerts** - Sends urgent notifications for critical regulations requiring immediate attention
- **Compliance Task Generation** - Automatically creates compliance checklists and action items
- **Team Assignment** - Routes regulations to appropriate teams based on impact level
- **Deadline Tracking** - Monitors comment deadlines, effective dates, and review timelines
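The template uses AI for the impact-assessment step; as a rough idea of the Critical/High/Medium/Low routing, a rule-based heuristic like the one below could serve as a sanity check or fallback. Everything here, including the keywords and tier logic, is an illustrative assumption, not the template's actual AI prompt.

```javascript
// Hedged sketch: keyword-based impact tiering for a scraped regulation.
// The urgency regex and tier rules are assumptions for illustration only.
function assessImpact(regulation, businessSectors) {
  const text = `${regulation.title} ${regulation.summary}`.toLowerCase();
  const hits = businessSectors.filter(s => text.includes(s.toLowerCase()));
  const urgent = /(effective immediately|final rule|enforcement)/.test(text);
  let level = 'Low';
  if (hits.length && urgent) level = 'Critical';
  else if (hits.length) level = 'High';
  else if (urgent) level = 'Medium';
  return { level, matchedSectors: hits };
}

const result = assessImpact(
  {
    title: 'Final rule on data privacy for healthcare providers',
    summary: 'Enforcement begins Q3.',
  },
  ['healthcare', 'finance']
);
console.log(result.level); // sector match + urgency keywords
```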
by Matheus Pedrosa
## Workflow Overview

Keeping API documentation updated is a challenge, especially when your endpoints are powerful n8n webhooks. This project solves that problem by turning your n8n instance into a self-documenting API platform.

This workflow acts as a central engine that scans your entire n8n instance for designated webhooks and automatically generates a single, beautiful, interactive HTML documentation page. By simply adding a standard Set node with specific metadata to any of your webhook workflows, you can make it instantly appear in your live documentation portal, complete with code examples and response schemas.

The final output is a single, callable URL that serves a professional, dark-themed, easy-to-navigate documentation page for all your automated webhook endpoints.

## Key Features

- **Automatic Discovery:** Scans all active workflows on your instance to find endpoints designated for documentation.
- **Simple Configuration via a Set Node:** No custom nodes needed! Just add a Set node named API_DOCS to any workflow you want to document and fill in a simple JSON structure.
- **Rich HTML Output:** Dynamically generates a single, responsive, dark-mode HTML page that looks professional right out of the box.
- **Interactive UI:** Uses Bootstrap accordions, allowing users to expand and collapse each endpoint to keep the view clean and organized.
- **Developer-Friendly:** Automatically generates a ready-to-use cURL command for each endpoint, making testing and integration incredibly fast.
- **Zero Dependencies:** The entire solution runs within n8n. No need to set up or maintain external documentation tools like Swagger UI or Redoc.

## Setup Instructions

This solution has two parts: configuring the workflows you want to document, and setting up this generator workflow.

### Part 1: In Each Workflow You Want to Document

1. Next to your Webhook trigger node, add a Set node.
2. Change its name to API_DOCS.
3. Create a single variable named jsonOutput (or docsData) and set its type to JSON.
4. Paste the following JSON structure into the value field and customize it with your endpoint's details:

```json
{
  "expose": true,
  "webhookPath": "PASTE_YOUR_WEBHOOK_PATH_HERE",
  "method": "POST",
  "summary": "Your Endpoint Summary",
  "description": "A clear description of what this webhook does.",
  "tags": ["Sales", "Automation"],
  "requestBody": { "exampleKey": "exampleValue" },
  "successCode": 200,
  "successResponse": { "status": "success", "message": "Webhook processed correctly." },
  "errorCode": 400,
  "errorResponse": { "status": "error", "message": "Invalid input." }
}
```

### Part 2: In This Generator Workflow

1. **n8n API Node:** Configure the GetWorkflows node with your n8n API credentials. It needs permission to read workflows.
2. **Configs Node:** Customize the main settings for your documentation page, like the title (name_doc), version, and a short description.
3. **Webhook Trigger:** The Webhook node at the start (default path is /api-doc) provides the final URL for your documentation page. Copy this URL and open it in your browser.

### Required Credentials

- **n8n API Credentials:** To allow this workflow to read your other workflows.
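The discovery step (scanning workflows for Set nodes named API_DOCS) can be sketched like this. The exact place the JSON lives inside the node (`parameters.jsonOutput` below) is an assumption; inspect your own Set node's exported JSON to confirm the path before relying on it.

```javascript
// Hedged sketch: collect docs metadata from workflow JSON as returned by
// the n8n API. Only active workflows with an exposed API_DOCS node count.
function extractDocs(workflows) {
  const endpoints = [];
  for (const wf of workflows) {
    if (!wf.active) continue; // only document active workflows
    for (const node of wf.nodes || []) {
      if (node.name !== 'API_DOCS') continue;
      const raw = node.parameters && node.parameters.jsonOutput; // assumed path
      try {
        const doc = typeof raw === 'string' ? JSON.parse(raw) : raw;
        if (doc && doc.expose) endpoints.push({ workflow: wf.name, ...doc });
      } catch {
        // malformed JSON in the Set node: skip it rather than break the page
      }
    }
  }
  return endpoints;
}

const docs = extractDocs([
  {
    name: 'Lead intake',
    active: true,
    nodes: [
      {
        name: 'API_DOCS',
        parameters: {
          jsonOutput: '{"expose": true, "method": "POST", "summary": "Create lead"}',
        },
      },
    ],
  },
  { name: 'Internal job', active: false, nodes: [] },
]);
console.log(docs.length, docs[0].summary);
```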
by Nexio_2000
This n8n template demonstrates how to export all icon metadata from an Iconfinder account into an organized format with previews, names, iconset names, and tags. It generates HTML and CSV outputs.

## Good to know

- Iconfinder does not provide a built-in feature for contributors to export all icon data at once, which motivated the creation of this workflow.
- The workflow exports all iconsets for the selected user account and can handle large collections.
- Preview image URLs are extracted in a consistent size (e.g., 128x128) for easy viewing.
- Basic icon metadata, including tags and iconset names, is included for reference or further automation.

## How it works

1. The workflow fetches all iconsets from your Iconfinder account.
2. It loops through all your iconsets, handling pagination automatically if an iconset contains more than 100 icons.
3. Each icon is processed to retrieve its metadata, including name, tags, preview image URLs, and the name of the iconset it belongs to.
4. An HTML file with a preview table and a CSV file with all icon details are generated.

## How to use

1. **Retrieve your User ID** - A dedicated node in the workflow fetches your Iconfinder user ID. This ensures the workflow knows which contributor account to access.
2. **Set up API access** - The workflow includes a setup node where you provide your Iconfinder API key. This node passes the authorization token to all subsequent HTTP request nodes, so you don't need to enter it manually multiple times.
3. **Trigger the workflow** - You can start it manually or attach it to a different trigger, such as a webhook or schedule.
4. **Export outputs** - The workflow generates an HTML file with preview images and a CSV file containing all metadata. Both files are ready for download or further processing.

## Requirements

- Iconfinder account with an API key.

## Customising this workflow

- Adjust the preview size or choose which metadata to include in the HTML and CSV outputs.
- Combine with other workflows to automate asset cataloging.
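The pagination described above (iconsets with more than 100 icons) follows a standard offset loop: keep requesting pages until a short page signals the end. The sketch below is hedged; `fetchPage` stands in for the HTTP Request node hitting Iconfinder's icons endpoint, and the page size of 100 mirrors the limit mentioned above.

```javascript
// Hedged sketch: generic offset pagination, with the API call abstracted
// behind fetchPage(offset, count) so the loop itself can be demonstrated.
async function fetchAllIcons(fetchPage, pageSize = 100) {
  const icons = [];
  let offset = 0;
  for (;;) {
    const batch = await fetchPage(offset, pageSize);
    icons.push(...batch);
    if (batch.length < pageSize) break; // last (partial or empty) page
    offset += pageSize;
  }
  return icons;
}

// Mocked demo: an iconset of 230 icons served in pages of 100.
const data = Array.from({ length: 230 }, (_, i) => ({ icon_id: i }));
const mockPage = async (offset, count) => data.slice(offset, offset + count);

fetchAllIcons(mockPage).then(all => {
  console.log(all.length); // all 230 icons collected across three pages
});
```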
by Trung Tran
## Automated Slack Channel Audit Workflow with Chatbot and GPT-4.1

> Automatically scans all public Slack channels weekly to detect those with no activity in the past 30 days, then generates and sends a detailed inactivity report to admins for review and action. Helps keep your Slack workspace clean, relevant, and clutter-free.

## Who's it for

This workflow is built for:

- **Slack Workspace Admins**
- **IT or Ops Managers**
- **HR/Compliance Teams**

…who want to maintain a clean and active Slack workspace by regularly reviewing inactive channels.

## How it works / What it does

This automated n8n workflow:

1. Runs weekly via a scheduled trigger.
2. Fetches all public Slack channels in the workspace.
3. Checks the message history of each channel for activity.
4. Filters channels that have had no discussion in the past 30 days.
5. Generates a Slack-friendly report with key metadata (name, member count, purpose, etc.).
6. Sends the report to a Slack channel for admin review and possible action (e.g., archive, engage, repurpose).

## How to set up

1. **Configure your Slack App**
   - Go to https://api.slack.com/apps → Create App
   - Add the following OAuth scopes to your Bot Token:
     - channels:read - to get the list of public channels
     - channels:history - to fetch message history
     - users:read - to personalize the report (optional)
     - chat:write - to post the report to a Slack channel
   - Install the app in your workspace
   - Copy the Bot User OAuth Token
   - Add it to your n8n Slack credentials under "Slack API"
2. **Customize the schedule** in the "Weekly Schedule Trigger" node to control report frequency.
3. **Connect your Slack workspace** to the workflow using your credentials.

## Requirements

- n8n (self-hosted or cloud)
- Slack App with:
  - channels:read
  - channels:history
  - chat:write
- Active channels and member data
- A designated Slack channel to receive the report

## How to customize the workflow

| Component | Customization Options |
|-----------|------------------------|
| Schedule Trigger | Change to daily, monthly, etc. |
| Inactivity Threshold | Modify "Filter channel with last discussion 30 days ago" to 60/90 days |
| Report Formatting | Tweak the "Consume slack report" node to change formatting or the summary |
| Output Channel | Change the target channel in "Send Channel Inactivity Report" |
| Auto-archiving | Add logic to archive channels with 0 members or no activity (using the Slack API) |

## Slack Permissions Reference

| Action | Slack Scope Required |
|--------|-----------------------|
| Get all public channels | channels:read |
| Get message history of a channel | channels:history |
| Post a message to Slack | chat:write |
| Get user info (optional) | users:read |
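The inactivity filter at the heart of this workflow can be sketched as a small Code-node helper. Slack's `conversations.history` returns message timestamps (`ts`) as string epoch seconds (e.g. `"1712345678.000200"`), which is what the function below assumes; the 30-day threshold matches the description above.

```javascript
// Hedged sketch: decide whether a channel is inactive given the ts of its
// most recent message. A channel with no messages at all counts as inactive.
function isInactive(latestTs, days = 30, nowMs = Date.now()) {
  if (!latestTs) return true;
  const lastMs = parseFloat(latestTs) * 1000; // Slack ts is epoch seconds
  return nowMs - lastMs > days * 24 * 60 * 60 * 1000;
}

const now = Date.UTC(2024, 5, 1); // fixed "now" for a reproducible example
const fresh = String(Date.UTC(2024, 4, 25) / 1000); // 7 days earlier
const stale = String(Date.UTC(2024, 2, 1) / 1000);  // ~3 months earlier
console.log(isInactive(fresh, 30, now), isInactive(stale, 30, now)); // false true
```

Changing the threshold to 60 or 90 days, as the customization table suggests, is just a different `days` argument.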
by Jimleuk
Tired of being let down by the Google Drive Trigger? Rather not exhaust system resources by polling every minute? Then this workflow is for you!

Google Drive is a great storage option for automation due to its relative simplicity, low costs, and readily available integrations. Using Google Drive as a trigger is the next logical step, but many n8n users quickly realise the built-in Google Drive trigger just isn't that reliable. Disaster!

Typically, the workaround is to poll the Google Drive search API at short intervals, but the trade-off is wasted server resources during inactivity. The ideal solution is, of course, push notifications, but they seem quite complicated to implement... or are they?

This template demonstrates that setting up Google push notifications for Google Drive file changes actually isn't that hard! Using this approach, Google sends a POST request every time something in a drive changes, which solves both the reliability of events and the efficiency of resources.

## How it works

- We begin by registering a notification channel (webhook) with the Google Drive API. The 2 key pieces of information are (a) the webhook URL which notifications will be pushed to and (b) the driveId, because we want to scope to a single location. Good to know: you can register as many channels as you like using HTTP calls, but you have to manage them yourself; there's no Google dashboard for notification channels!
- The registration data along with the startPageToken are saved in workflowStaticData - a convenient persistence mechanism we can use to hold small bits of data between executions.
- Now, whenever files or folders are created or updated in our target Google Drive, Google will send push notifications to the webhook trigger in this template.
- Once triggered, we still need to call Google Drive's Changes.list to get the actual change events which were detected. We can do this with the HTTP Request node.
The Changes API will also return the nextPageToken - a marker establishing where to pick up the next batch of changes. It's important that we use this token the next time we call the Changes API, so we update the workflowStaticData with this new value. Unfortunately, the changes.list API isn't able to filter change events by folder or action, so be sure to add your own filtering steps to get the files you want. Finally, with the valid change events, optionally fetch the file metadata, which gives you more attributes to play with. For example, you may want to know if the change event was triggered by n8n itself, in which case you'll want to check the "modifiedByMe" value. How to use Start with Step 1, fill in the "Set Variables" node and click on the Manual Execute Trigger. This will create a single Google Drive notification channel for a specific drive. Activate the workflow to start receiving events from Google Drive. To test, perform an action, e.g. create a file, on the target drive. Watch the webhook calls come pouring in! Once you have the desired events, finish off this template to do something with the changed files. Requirements Google Drive Credentials. Note this workflow also works on Shared Drives. Optimising This Workflow With bulk actions, you'll notice that Google gradually starts to send increasingly large amounts of push notifications, sometimes numbering in the hundreds! For cloud plan users, this could easily exhaust execution limits if lots of changes are made in the same drive daily. One approach is to implement a throttling mechanism externally to batch events before sending them to n8n. This throttling mechanism is outside the scope of this template but quite easy to achieve with something like Supabase Edge Functions.
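The token bookkeeping and filtering described above can be sketched in a Code node. This is a minimal illustration against an assumed changes.list response shape; in a real n8n Code node you would read and write the token via $getWorkflowStaticData('global') rather than the plain object used here, and the target folder ID is a hypothetical placeholder:

```javascript
// Stand-in for $getWorkflowStaticData('global') in an n8n Code node.
const staticData = { pageToken: "12345" };

// Sample changes.list response (shape assumed for illustration).
const response = {
  newStartPageToken: "12399",
  changes: [
    { removed: false, fileId: "a1", file: { name: "report.pdf", trashed: false, parents: ["folderA"] } },
    { removed: true,  fileId: "b2" }, // deletions carry no file resource
  ],
};

// Persist the marker for the next execution: changes.list returns either
// nextPageToken (more pages to fetch) or newStartPageToken (caught up).
staticData.pageToken = response.newStartPageToken ?? response.nextPageToken;

// changes.list can't filter by folder or action, so do it ourselves.
const targetFolderId = "folderA"; // hypothetical folder to watch
const relevant = response.changes.filter(
  (c) => !c.removed && c.file && !c.file.trashed && c.file.parents?.includes(targetFolderId)
);

console.log(staticData.pageToken, relevant.map((c) => c.file.name));
```

Note that filtering by parent folder requires the `file` resource to include `parents`, so request that field (or fetch the file metadata separately) when listing changes.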
by Gaetano Castaldo
Web-to-Odoo Lead Funnel (UTM-ready) Create crm.lead records in Odoo from any webform via a secure webhook. The workflow validates required fields, resolves UTMs by name (source, medium, campaign) and writes standard lead fields in Odoo. Clean, portable, and production-ready. Key features ✓ Secure Webhook with Header Auth (x-webhook-token) ✓ Required-fields validation (firstname, lastname, email) ✓ UTM lookup by name (utm.source, utm.medium, utm.campaign) ✓ Clean consolidation before create (name, contact_name, email_from, phone, description, type, UTM IDs) ✓ Clear HTTP responses: 200 success / 400 bad request Prerequisites Odoo with Leads enabled (CRM → Settings → Leads) Odoo API Key for your user (use it as the password) n8n Odoo credentials: URL, DB name, Login, API Key Public URL for the webhook (ngrok/Cloudflare/reverse proxy). Ensure WEBHOOK_URL / N8N_HOST / N8N_PROTOCOL / N8N_PORT are consistent Header Auth secret (e.g., x-webhook-token: <your-secret>) How it works Ingest → The Webhook receives a POST at /webhook(-test)/lead-webform with Header Auth. Validate → An IF node checks required fields; if missing, respond with 400 Bad Request. UTM lookup → Three Odoo getAll queries fetch IDs by name: utm.source → source_id, utm.medium → medium_id, utm.campaign → campaign_id. If a record is not found, the corresponding ID remains null. Consolidate → Merge + Code nodes produce a single clean object: { name, contact_name, email_from, phone, description, type: "lead", campaign_id, source_id, medium_id } Create in Odoo → Odoo node (crm.lead → create) writes the lead with standard fields + UTM Many2one IDs. Respond → Success node returns 200 with { status: "ok", lead_id }.
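The consolidation step can be sketched as a small Code-node function. The webform field names follow the payload documented below; the shape of the merged UTM lookup results (`utm.source.id` etc.) is an assumption for illustration:

```javascript
// Minimal sketch of the Merge + Code consolidation, assuming the webhook
// body and the three UTM lookups arrive merged on the incoming item.
function buildLead(body, utm) {
  return {
    name: `${body.firstname} ${body.lastname}`, // lead title in Odoo
    contact_name: `${body.firstname} ${body.lastname}`,
    email_from: body.email,
    phone: body.phone ?? null,
    description: body.notes ?? "",
    type: "lead",
    // Many2one IDs stay null when no matching utm.* record was found.
    campaign_id: utm.campaign?.id ?? null,
    source_id: utm.source?.id ?? null,
    medium_id: utm.medium?.id ?? null,
  };
}

const lead = buildLead(
  { firstname: "John", lastname: "Doe", email: "john.doe@example.com", notes: "Wants a demo" },
  { source: { id: 3 }, medium: { id: 7 }, campaign: null }
);
console.log(lead);
```

Everything downstream (the Odoo create node) can then map this object field-for-field onto crm.lead.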
Payload (JSON) Required: firstname, lastname, email Optional: phone, notes, source, medium, campaign

```json
{
  "firstname": "John",
  "lastname": "Doe",
  "email": "john.doe@example.com",
  "phone": "+393331234567",
  "notes": "Wants a demo",
  "source": "Ads",
  "medium": "Website",
  "campaign": "Spring 2025"
}
```

Quick test

```shell
curl -X POST "https://<host>/webhook-test/lead-webform" \
  -H "Content-Type: application/json" \
  -H "x-webhook-token: <secret>" \
  -d '{"firstname":"John","lastname":"Doe","email":"john@ex.com","phone":"+39333...","notes":"Demo","source":"Ads","medium":"Website","campaign":"Spring 2025"}'
```

Notes Recent Odoo versions do not use the mobile field on leads/partners: use phone instead. Keep secrets and credentials out of the template; the user will set their own after import. If you want to auto-create missing UTM records, add an IF after each getAll and a create on utm.*.
by Easy8.ai
Use this workflow to enrich lead records from Easy Redmine with Lusha data and sync updated fields back to the CRM. About Workflow This workflow connects Easy Redmine CRM with Lusha via API to enrich lead records. It fetches lead data from Easy Redmine, queries Lusha for matching contact information, transforms the results (e.g., phone, employee count, LinkedIn), and updates the original records in Easy Redmine CRM automatically. Use Case Built for CRM and sales teams using Easy Redmine, this automation replaces manual Lusha lookups and lead updates. It ensures that enriched data is consistently added to your CRM, boosting lead quality and reducing data maintenance time. How it works Schedule Trigger => Runs daily at a set hour (e.g., 10:00) Get Leads from Easy Redmine => Pulls leads via a saved query (e.g., today's leads) using the easy_leads resource Split Out => Breaks the batch of leads into single records for enrichment Get Data from Lusha => Sends HTTP requests to Lusha's API using lead fields (email, name, company) Filter Leads Found in Lusha => Skips leads with missing or failed Lusha enrichment Contact Data Transformation for CRM => Maps and transforms contact data: extracts phones, email, LinkedIn, job info; converts employee ranges like [1000, 5000] into 5000; flattens the nested Lusha response into a CRM-friendly format Update Leads in Easy Redmine CRM => Sends a PUT request to update each lead How to use Import the workflow into your n8n instance Set credentials for: Easy Redmine API Lusha API (HTTP Header Auth) Update the Get Leads from Easy Redmine node with your query ID Ensure Lusha API fields (email, name, company) match available lead fields Adjust the transformation logic in the Code node if needed Update the custom field ID in the final PUT request if your CRM uses different IDs Test with sample data before activating the automation Example Use Cases Sales Intelligence: Enrich new leads with verified phone and social info CRM Hygiene: Keep records consistent and complete without manual edits Lead Scoring: Add employee count and LinkedIn for better segmentation Requirements Easy Redmine API access Lusha API access API credentials for both platforms Customization Options Add more fields from Lusha (job title, company website, etc.) Include error notifications if updates fail Add filters to target only leads missing phone or LinkedIn info Use a Merge node if fetching multiple lead segments Workflow Improvement Suggestions Rename technical node labels for better readability Secure credentials via environment variables or a vault Handle rate limits or retries when processing large lead batches This workflow automates lead enrichment, ensuring CRM records are updated with accurate, verified data, without the manual copy-paste.
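The transformation step described above can be sketched as a Code-node function. The exact Lusha response shape used here (phone objects, an employee range as a [min, max] pair) is an assumption for illustration; adjust the paths to match what your Lusha plan actually returns:

```javascript
// Minimal sketch of the "Contact Data Transformation for CRM" step:
// flatten the nested Lusha response into CRM-friendly fields.
function transformContact(lusha) {
  const range = lusha.company?.employees; // e.g. [1000, 5000]
  return {
    phone: lusha.phoneNumbers?.[0]?.number ?? null,
    email: lusha.emailAddresses?.[0] ?? null,
    linkedin: lusha.linkedinUrl ?? null,
    // Keep the upper bound of the range, as the workflow description does.
    employeeCount: Array.isArray(range) ? range[range.length - 1] : range ?? null,
  };
}

const flat = transformContact({
  phoneNumbers: [{ number: "+420123456789" }],
  emailAddresses: ["jane@example.com"],
  linkedinUrl: "https://linkedin.com/in/jane",
  company: { employees: [1000, 5000] },
});
console.log(flat);
```

The flattened object then maps directly onto the fields sent in the final PUT request back to Easy Redmine.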