by vinci-king-01
## How it works

This workflow automatically extracts data from invoice documents (PDFs and images) and processes them through a comprehensive validation and approval system.

### Key Steps

1. **Multi-Input Triggers** - Accepts invoices via email attachments or direct file uploads through a webhook.
2. **AI-Powered Extraction** - Uses ScrapeGraphAI to extract structured data from invoice documents.
3. **Data Cleaning & Validation** - Processes and validates extracted data against business rules (see the sketch after this section).
4. **Approval Workflow** - Routes invoices requiring approval through a multi-stage approval process.
5. **System Integration** - Automatically sends validated invoices to your accounting system.

## Set up steps

Setup time: 10-15 minutes

1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for invoice data extraction.
2. **Set up Telegram connection** - Connect your Telegram account for approval notifications.
3. **Configure email trigger** - Set up an IMAP connection for processing emailed invoices.
4. **Customize validation rules** - Adjust business rules, amount thresholds, and vendor lists.
5. **Set up accounting system integration** - Configure the HTTP Request node with your accounting system's API endpoint.
6. **Test the workflow** - Upload a sample invoice to verify the extraction and approval process.

## Features

- **Multi-format support**: PDF, PNG, JPG, JPEG, TIFF, BMP
- **Intelligent validation**: Business rules, duplicate detection, amount thresholds
- **Approval automation**: Multi-stage approval workflow with role-based routing
- **Data quality scoring**: Confidence levels and completeness analysis
- **Audit trail**: Complete processing history and metadata tracking
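A minimal sketch of what the validation step could look like in an n8n Code node. The field names (`invoice_number`, `total_amount`, etc.) and the approval threshold are assumptions for illustration; the template's actual business rules live in its own nodes.

```javascript
// Hypothetical validation pass for an n8n Code node (run once for all items).
// Field names and the threshold are assumptions - adapt them to your schema.
const APPROVAL_THRESHOLD = 10000; // amounts above this route to approval
const REQUIRED_FIELDS = ['invoice_number', 'vendor_name', 'total_amount', 'invoice_date'];

return $input.all().map(item => {
  const invoice = item.json;
  const missing = REQUIRED_FIELDS.filter(field => !invoice[field]);

  // Attach a validation summary that downstream IF/Switch nodes can branch on.
  item.json.validation = {
    isValid: missing.length === 0,
    missingFields: missing,
    needsApproval: Number(invoice.total_amount) > APPROVAL_THRESHOLD,
  };
  return item;
});
```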
by Jorge Martínez
# Automate tweet engagement on X (formerly Twitter)

## Description

Automate professional engagement on X (formerly Twitter) by searching for, filtering, liking, and replying to tweets that match your key topics. This workflow lets you engage consistently and efficiently with relevant conversations, using your defined professional role and the power of GPT for filtering and replies. Save time and maintain high-quality interactions while staying focused on your business or personal brand interests.

## How it Works

1. **Rotating Topic Selection** - The workflow selects one search term from your list on each run, using a rotating index based on the date (see the sketch after this section).
2. **Search Tweets & Extract Essentials** - Searches X (formerly Twitter) for tweets matching the chosen topic, then extracts only the tweet id and text for further processing.
3. **GPT-Based Filtering with Role Context** - Filters tweets based on your role and strict criteria, removing non-English tweets, memes, spam, Grok-generated content, political posts, internships, and more.
4. **Engagement Loop** - For every filtered tweet, the workflow likes the post, generates a professional, concise reply with GPT (matching language and context), and posts the reply. Wait nodes ensure compliance with Twitter's API rate limits (adjustable for paid API tiers).

## Requirements

- X (Twitter) API credentials (for searching, liking, and replying to tweets)
- OpenAI API key (for GPT-based steps)

## Setup Steps

1. Obtain your X (Twitter) API credentials.
2. Obtain your OpenAI API key.
3. Configure the schedule in the trigger node to your desired frequency (e.g., every 3 days or daily).
4. Set your list of topics and professional role in the variables node.

## How to Customize the Workflow (Optional)

- **Adjust prompts** in the GPT nodes to fine-tune filtering and reply style.
- **Upgrade your Twitter API plan** to increase request limits and search for more tweets per run.
- **Change tweet processing logic:** For high-volume engagement (e.g., analyzing 100+ tweets per run), consider switching to a per-tweet loop for advanced filtering and response handling.

This workflow enables scalable, professional, and targeted engagement on X (formerly Twitter), fully customizable to your audience and objectives.
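For the date-based rotation, a minimal Code-node sketch is below. The topic list is a placeholder; in the actual workflow the terms come from your variables node.

```javascript
// Sketch of date-based topic rotation; the topics array is illustrative.
const topics = ['n8n automation', 'workflow tooling', 'API integrations'];

// Days since the Unix epoch gives an index that advances once per day,
// so successive runs rotate through the list deterministically.
const daysSinceEpoch = Math.floor(Date.now() / 86400000);
const searchTerm = topics[daysSinceEpoch % topics.length];

return [{ json: { searchTerm } }];
```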
by David Levesque
# Dropbox Folder Monitoring Workflow

As we don't have (yet?) a Dropbox node for "watching new files" or "watching a folder", I created this central workflow to do it.

## How it works

- Triggered by a Dropbox webhook
- I respond immediately to Dropbox to avoid the webhook being disabled
- Then I add/duplicate one branch per monitored folder, according to my needs

In my case, I need to monitor several folders, like "vocal notes to process", "transcriptions to LinkedIn posts", or "quotes to add". This workflow shows 2 types of folder monitoring:

- **Way #1**: Each file in the monitored folder calls a sub-workflow
- **Way #2**: We get all files from the monitored folder and compare them to a database. If a file is not listed in the DB, I assume it is new.

### Way #1 - We get all files from the monitored folder

- I set a variable `folder_to_watch` to indicate which folder to monitor. This step exists just to stay consistent and allow setting the folder path only once in this branch.
- I list the folder files
- We keep only files (exclude folders)
- Then I call the specialized sub-workflow

### Way #2 - We want only new files from the monitored folder

- I set a variable `folder_to_watch` to indicate which folder to monitor
- I list the folder files and keep only files
- Meanwhile, I query my DB to get the known files for this folder (I send the query `(folder_to_watch,eq,{{ $json.folder_to_watch }})` to NocoDB)
- Now I can exclude old files and keep only new ones by merging (I compare on the Dropbox file id, as the file could be renamed by the user; see the sketch after this section)
- I add each new file to the DB to be sure to recognize it next time - I save the JSON Dropbox data:

```json
{
  "id": "{{ $json.id }}",
  "name": "{{ $json.name }}",
  "lastModifiedClient": "{{ $json.lastModifiedClient }}",
  "lastModifiedServer": "{{ $json.lastModifiedServer }}",
  "rev": "{{ $json.rev }}",
  "contentSize": {{ $json.contentSize }},
  "type": "{{ $json.type }}",
  "contentHash": "{{ $json.contentHash }}",
  "pathLower": "{{ $json.pathLower }}",
  "pathDisplay": "{{ $json.pathDisplay }}",
  "isDownloadable": {{ $json.isDownloadable }}
}
```

- And now I can call my sub-workflow :)

### My DB columns

- `folder_to_watch`
- `data` (json/text)
- `timestamp`
- `file_id` (Dropbox file ID, to ease future searches)

## My vision

- I have only one workflow in my n8n that monitors Dropbox folders/files
- This workflow calls the required sub-workflow specialized for the task at hand
- I will have as many branches as I have folders to monitor (if I have 5 different folders to watch, I will get 5 branches and 5 sub-workflows)
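A minimal sketch of the "keep only new files" comparison as a Code node, assuming the Dropbox listing and the NocoDB rows are read from upstream nodes; the node names in `$('...')` are placeholders for your own.

```javascript
// Sketch: keep only Dropbox files whose id is not yet in the DB.
// Node names are placeholders - point them at your own nodes.
const dropboxFiles = $('List Folder Files').all().map(i => i.json);
const knownRows = $('Get Known Files').all().map(i => i.json);

// Compare on the immutable Dropbox file id rather than the name,
// since the user may have renamed the file since it was first seen.
const knownIds = new Set(knownRows.map(row => row.file_id));

return dropboxFiles
  .filter(file => !knownIds.has(file.id))
  .map(file => ({ json: file }));
```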
by John Pranay Kumar Reddy
## Summary

Efficiently monitor Kubernetes environments by sending only unique error logs from Grafana Loki to Slack. Reduces alert fatigue while keeping your team informed about critical log events.

## Who's it for

- DevOps or SRE engineers running EKS/GKE/AKS
- Anyone using Grafana Loki and Promtail for centralized logging
- Teams that want Slack alerts but hate alert spam

## What it does

This n8n workflow queries your Loki logs every 5 minutes, filters only the critical ones (error, timeout, exception, etc.), removes duplicate alerts within the batch, and sends clean alerts to a Slack channel with full metadata (pod, namespace, node, container, log, timestamp).

## How it works

1. **Schedule Trigger** - Every 5 minutes (customizable)
2. **Loki HTTP Query** - Pulls logs from the last 10 minutes; keyword match: error, failed, oom, etc.
3. **Log Parsing** - Extracts log fields (pod, container, etc.) and skips empty/malformed results
4. **Deduplication** - Removes repeated error messages within the query window
5. **Slack Notification** - Sends a nicely formatted message to Slack

## Requirements

| Tool | Notes |
| --- | --- |
| Loki | Exposed internally or externally |
| Slack App | With `chat:write` OAuth scope |
| n8n | Cloud or self-hosted |

## How to Set It Up

1. Import the JSON file into n8n
2. Update:
   - Loki API URL (e.g., `http://loki-gateway.monitoring.svc.cluster.local`)
   - Slack Bearer Token (via credentials)
   - Target Slack channel (e.g., #k8s-alerts)
   - (Optional) Keywords in the query regex
3. Activate the workflow
4. Ensure the n8n pod/container has access to your Kubernetes cluster/pods/namespaces

## How to Customize

- Want more or fewer keywords? Adjust the regex in the "Query Loki for Error Logs" node (see the sketch after this section).
- Need stronger deduplication? Enhance the "Remove Duplicate Alerts" node.
- Want 5-log summaries every 5 min? Fork this and add a Batch + Slack group sender.

*Grafana Loki logs to Slack - example output*
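A sketch of the query and deduplication steps. The LogQL selector, keyword regex, and field names are assumptions; match them to the "Query Loki for Error Logs" and "Remove Duplicate Alerts" nodes in your copy.

```javascript
// Build a Loki query_range URL for the last 10 minutes (times in nanoseconds).
// The selector and regex are illustrative - mirror your own node's settings.
const base = 'http://loki-gateway.monitoring.svc.cluster.local/loki/api/v1/query_range';
const logql = '{namespace=~".+"} |~ "(?i)(error|failed|timeout|exception|oom)"';
const end = BigInt(Date.now()) * 1000000n;       // ms -> ns
const start = end - 600000000000n;               // minus 10 minutes in ns
const url = `${base}?query=${encodeURIComponent(logql)}&start=${start}&end=${end}`;

// Deduplicate parsed log items by message text within the batch.
const seen = new Set();
return $input.all().filter(item => {
  const message = item.json.log;
  if (seen.has(message)) return false;
  seen.add(message);
  return true;
});
```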
by David Ashby
## Demio Tool MCP Server

Complete MCP server exposing all Demio Tool operations to AI agents. Zero configuration needed - all 4 operations are pre-built.

### Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

### How it Works

- **MCP Trigger**: Serves as your server endpoint for AI agent requests
- **Tool Nodes**: Pre-configured for every Demio Tool operation
- **AI Expressions**: Automatically populate parameters via `$fromAI()` placeholders (see the sketch after this section)
- **Native Integration**: Uses the official n8n Demio Tool node with full error handling

### Available Operations (4 total)

Every possible Demio Tool operation is included:

**Event (3 operations)**
- Get an event
- Get many events
- Register an event

**Report (1 operation)**
- Get a report

### AI Integration

**Parameter Handling**: AI agents automatically provide values for:
- Resource IDs and identifiers
- Search queries and filters
- Content and data payloads
- Configuration options

**Response Format**: Native Demio Tool API responses with full data structure

**Error Handling**: Built-in n8n error management and retry logic

### Usage Examples

Connect this MCP server to any AI agent or workflow:
- **Claude Desktop**: Add the MCP server URL to its configuration
- **Custom AI Apps**: Use the MCP URL as a tool endpoint
- **Other n8n Workflows**: Call MCP tools from any workflow
- **API Integration**: Direct HTTP calls to MCP endpoints

### Benefits

- **Complete Coverage**: Every Demio Tool operation available
- **Zero Setup**: No parameter mapping or configuration needed
- **AI-Ready**: Built-in `$fromAI()` expressions for all parameters
- **Production Ready**: Native n8n error handling and logging
- **Extensible**: Easily modify or add custom logic

> Free for community use! Ready to deploy in under 2 minutes.
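As a hedged illustration of how `$fromAI()` placeholders appear in a tool node's parameter fields; the parameter names and descriptions below are assumptions, not the template's exact values.

```javascript
// Illustrative $fromAI() expressions as they might appear in the parameter
// fields of the pre-built tool nodes (names/descriptions are assumptions):

// "Get an event" - event ID supplied by the AI agent at call time:
// {{ $fromAI('eventId', 'ID of the Demio event to fetch', 'string') }}

// "Get many events" - optional filter the agent can set:
// {{ $fromAI('type', 'Event type filter, e.g. upcoming or past', 'string') }}
```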
by Michael Muenzer
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Generates relevant keywords and questions from a customer profile. Keyword data is enriched via Ahrefs, and everything is stored in a Google Sheet. This is great for market and customer research: it surfaces search intent for a well-defined audience and delivers relevant, actionable data in a fraction of the time that manual research takes.

## How it works

1. We define a customer profile in the 'Data' node
2. We use an OpenAI LLM to fetch relevant search intent as keywords and questions
3. We use an SEO MCP server to fetch keyword data from Ahrefs' free tooling
4. The fetched data is stored in the Google Sheet

## Set up steps

1. Copy the Google Sheet template and add it in all Google Sheet nodes
2. Make sure that n8n has read & write permissions for your Google Sheet
3. Add your list of domains in the first column of the Google Sheet
4. Add MCP credentials for seo-mcp
5. Add OpenAI API credentials
by Lucía Maio Brioso
## Who is this for?

This workflow is for anyone with two YouTube channels who wants to copy playlists from one to the other - no technical skills required. Whether you're a content creator, hobbyist, educator, or just someone managing multiple channels, this workflow helps you save time and avoid the manual work of recreating playlists video by video.

## What problem is this workflow solving?

YouTube doesn't provide an option to transfer or duplicate playlists between accounts or channels. That means if you want the same playlists in two places, you're stuck:

- Creating new playlists manually
- Searching for each video again
- Copy-pasting links one by one

This workflow automates the entire process for you - accurately, quickly, and with no manual work.

## What this workflow does

- Retrieves all playlists from a source YouTube channel (excluding private ones)
- For each playlist:
  - Gets all its videos
  - Filters out private or unavailable videos
  - Creates a new playlist in the target channel with the same title
  - Adds the videos to the new playlist
- Continues smoothly even if some videos fail to copy (e.g., if they're restricted or deleted)

## Setup

1. Create two YouTube OAuth2 credentials in n8n:
   - One for your source channel
   - One for your target channel
2. Assign the credentials to the correct nodes as indicated in the sticky notes:
   - Source nodes → source credentials
   - Target nodes → target credentials
3. Click "Test workflow" to run it.

> Note: If you have many playlists or videos, you may hit YouTube's API quota. You can request a quota increase in your Google Cloud Console if needed.

## How to customize this workflow to your needs

- **Copy only specific playlists**: Use a Filter node after the playlist fetch to include only certain titles or IDs (see the sketch after this section).
- **Change the title of the copied playlists**: Modify the title in the Create playlist node (e.g., add "(Copy)" or a prefix).
- **Automate it regularly**: Replace the Manual Trigger with a Cron node if you want to run this periodically.
- **Test safely**: If you're unsure, use a secondary channel as your test target before applying changes to your main account.
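If you only want certain playlists copied, a minimal Code-node sketch of that filter might look like this; the titles and the `snippet.title` path are assumptions based on the YouTube Data API's usual response shape.

```javascript
// Sketch: pass through only the playlists whose titles are allow-listed.
// Titles are placeholders; snippet.title follows the YouTube Data API shape.
const wanted = ['Tutorials', 'Live Sessions'];

return $input.all().filter(item => wanted.includes(item.json.snippet?.title));
```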
by Airtop
# Monitoring Job Changes on LinkedIn

## Use Case

This automation tracks job changes among your LinkedIn connections and extracts relevant details. It's ideal for triggering timely outreach, updating CRM records, or feeding lead-scoring workflows based on new roles.

## What This Automation Does

It scrapes your LinkedIn "Job Changes" feed and returns:

- Name of the person
- Their new position
- LinkedIn profile URL
- Functional category (e.g., marketing, sales, HR, executive)

Each run processes 5 job changes at a time.

## How It Works

1. **Manual Trigger**: Starts the workflow when the user clicks "Test workflow."
2. **Airtop Enrichment**: Navigates to the LinkedIn job changes page and extracts:
   - name
   - new_position
   - linkedin_profile_url
   - position_function (classification such as marketing, sales, HR, etc.)
3. **Formatting**: Output is structured into clean JSON for use in further workflows.

## Setup Requirements

- Airtop Profile connected to LinkedIn
- Airtop API key configured in n8n
- A LinkedIn account with a populated "Job Changes" feed

## Next Steps

- **Automate Alerts**: Add Slack, email, or CRM integrations to notify your team.
- **Enrich and Score Leads**: Chain this with your ICP scoring workflow to evaluate new roles.
- **Customize Scope**: Expand extraction to more than 5 job changes, or add filters based on job titles or functions.

Read more about Monitoring Job Changes on LinkedIn.
by MattF
This workflow generates a weekly performance summary from Google Search Console, focused on brand-level SEO metrics and week-over-week trends. It provides a structured view of how each brand segment is performing, with clean formatting for quick insights.

## Key Features

- Sends a weekly email with a table showing clicks, impressions, CTR, and position, along with % change vs. the previous week.
- Highlights brand and non-brand clicks separately.
- Color-coded % changes make it easy to spot wins (green) and losses (red) at a glance.

It's designed to give SEO teams a consistent overview of performance by brand, helping to track directional shifts and support deeper analysis when needed.

## How it works

- Runs weekly (e.g., every Monday) to compare "Last Week" vs. "2 Weeks Ago" from GSC data.
- Includes both brand and non-brand click breakdowns.
- Calculates raw values and week-over-week % change for clicks, impressions, CTR, and position (see the sketch after this section).
- Outputs a clean, formatted table with labeled rows and color-coded changes.
- Sends the table as part of a scheduled email (can also be adapted for Slack or other channels).

## Setup steps

- Requires connected Google Search Console data (per brand segment).
- Email delivery is included by default (customizable to other platforms).
- Update the brand segmentation logic to match your tracking needs (e.g., domain, label, or custom filters).
- Typical setup time: ~5-10 minutes with structured input data.
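A minimal sketch of the week-over-week calculation and color coding; the property names are assumptions standing in for however your GSC data is shaped.

```javascript
// Sketch: percent change vs. the prior week, with a color-coded HTML cell.
// Property names (clicksLastWeek, clicksPrevWeek) are illustrative.
function wowChange(lastWeek, twoWeeksAgo) {
  if (!twoWeeksAgo) return null; // avoid dividing by zero
  return ((lastWeek - twoWeeksAgo) / twoWeeksAgo) * 100;
}

const change = wowChange($json.clicksLastWeek, $json.clicksPrevWeek);
const color = change !== null && change >= 0 ? 'green' : 'red';
const cell = change === null
  ? '<td>n/a</td>'
  : `<td style="color:${color}">${change.toFixed(1)}%</td>`;

return [{ json: { change, cell } }];
```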
by Gaurav
Automate your entire guest communication journey from booking to post-stay with personalized welcome emails, review requests, and daily operational reports. Perfect for hotels, B&Bs, and short-term rental properties looking to enhance the guest experience while reducing manual work and improving operational efficiency.

## How it works

- **Pre-arrival welcome emails** - Automatically sends personalized welcome emails 1-2 days before guest check-in with reservation details, hotel amenities, and contact information
- **Post-stay review requests** - Sends automated review request emails 24 hours after checkout with Google Reviews links and return-guest discount codes
- **Daily staff reports** - Generates comprehensive arrival/departure reports every morning at 6 AM for front desk, housekeeping, and management teams
- **Smart tracking** - Prevents duplicate emails by automatically updating the tracking status in your Google Sheets database (see the sketch after this section)
- **Professional templates** - Uses responsive HTML email templates that work across all devices and email clients

## Set up steps

1. **Connect Google Sheets** - Link your hotel reservation spreadsheet (must include columns for guest details, check-in/out dates, and email tracking)
2. **Configure Gmail account** - Set up Gmail credentials for sending automated emails
3. **Customize hotel information** - Update the hotel name, contact details, and branding in the "Edit Fields" nodes
4. **Set staff email addresses** - Configure recipient addresses for daily operational reports
5. **Adjust timing** - Modify the schedule triggers if you want different timing for emails and reports (currently set to every 6 hours for guest emails and 6 AM daily for staff reports)

Time investment: ~30 minutes for initial setup, then fully automated operation.
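A minimal sketch of the duplicate-prevention check; the column names (`welcome_email_sent`, `check_in_date`) are assumptions to be matched against your actual Google Sheet.

```javascript
// Sketch: pick reservations that still need a welcome email and are
// checking in within the next 2 days. Column names are illustrative.
const now = new Date();
const MS_PER_DAY = 86400000;

return $input.all().filter(item => {
  const guest = item.json;
  if (guest.welcome_email_sent === 'yes') return false; // already sent
  const daysUntilCheckIn = (new Date(guest.check_in_date) - now) / MS_PER_DAY;
  return daysUntilCheckIn >= 0 && daysUntilCheckIn <= 2;
});
```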
by Airtop
# Monitor Competitor Facebook Ads with Airtop

## Use Case

Monitor a competitor's active Facebook ads and get a weekly HTML intelligence brief by email, saving time on manual research and helping you spot messaging, offers, and creative trends quickly.

## What This Automation Does

- Runs weekly on a set schedule.
- Uses Airtop to visit the competitor's Facebook Ad Library page and extract up to 30 active ads.
- Summarizes each ad with key points: message, topic, CTA, duration active, language, target audience.
- Sends the compiled HTML report via Gmail.

## How It Works

1. **Schedule Trigger** - Fires once a week at the configured time.
2. **Airtop Extraction** - Loads the Ad Library URL and runs a prompt to extract and format the ads into HTML.
3. **Email Delivery** - Sends the HTML report to your specified recipient using Gmail.

## Setup Requirements

- **Airtop API Key** - Generate here.
- **Airtop Credential in n8n** - Add your API key under "Airtop" in n8n.
- **Gmail OAuth2 Credential** - Connect the Gmail account that will send reports.
- **Competitor's Ad Library URL** - Replace the default `view_all_page_id` in the workflow with your target.

## Next Steps

- Duplicate the Airtop step for multiple competitors.
- Enrich reports by visiting ad landing pages for deeper analysis.
- Send outputs to Slack or archive them in a shared workspace.

Read about ways to monitor your competitors' ads here.
by Mario
## Purpose

This workflow lets you listen to your recent favorites offline in very high quality without sacrificing all of your storage.

## How it works

This workflow automatically creates a playlist in Spotify named "Downloads", which is periodically updated so it always contains only a defined number of your latest liked songs. You can then enable automatic downloads on the Downloads playlist alone and free up space on the device.

## Setup

The workflow is ready to go. Just select your Spotify credentials and activate the workflow. In Spotify, enable automatic downloads on the automatically created Downloads playlist after the first workflow run.

## Current limitations

This setup currently supports a maximum of 50 songs in the Downloads playlist. This is due to the payload limits defined by Spotify, encountered in the "Get liked songs" node. Implementing batching would solve the issue (see the sketch after this section).
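A conceptual sketch of that batching fix: page through Spotify's saved-tracks endpoint with `limit`/`offset` until the desired count is reached. Shown with plain `fetch` for clarity; in n8n you would loop an HTTP Request node with the same query parameters.

```javascript
// Sketch: fetch more than 50 liked songs by paging the Spotify API.
// Endpoint and parameters follow the documented /v1/me/tracks API.
async function getLikedSongs(accessToken, wanted = 150) {
  const songs = [];
  for (let offset = 0; songs.length < wanted; offset += 50) {
    const res = await fetch(
      `https://api.spotify.com/v1/me/tracks?limit=50&offset=${offset}`,
      { headers: { Authorization: `Bearer ${accessToken}` } }
    );
    const page = await res.json();
    songs.push(...page.items);
    if (!page.next) break; // no further pages
  }
  return songs.slice(0, wanted);
}
```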