by Jonathan Bennetts
> This has been updated to support the Query feature added to the Zendesk node in 0.144.0

This workflow posts all New and Open tickets that have no agent assigned to a Slack channel on a schedule. The Function node is used in this example to merge multiple inputs into one output message, which is then used as the Slack message. The output in Slack will look similar to the message below, where "TICKET_ID" is a link to the ticket.

> Unassigned Tickets
> TICKET_ID [STATUS] - TICKET_SUBJECT

Usage
Update the Cron schedule; the default is 16:30 daily.
Update the credentials in the Zendesk nodes.
Update the credentials and channel in the Slack node.
Grab a coffee and enjoy!

Zendesk Query
In the Zendesk node we use the query assignee:none status<pending, which returns all New and Open tickets with no assignee, allowing us to remove the extra filtering nodes.
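Below is a minimal sketch of what the merge step in the Function node might look like, assuming the incoming items carry the Zendesk fields id, status, and subject, and using a placeholder Zendesk subdomain:

```javascript
// Merge all incoming Zendesk tickets into a single Slack message item.
// Field names (id, status, subject) and the subdomain are assumptions - adjust as needed.
const lines = items.map(item => {
  const ticket = item.json;
  return `<https://YOUR_SUBDOMAIN.zendesk.com/agent/tickets/${ticket.id}|${ticket.id}> [${ticket.status}] - ${ticket.subject}`;
});

return [{ json: { text: `*Unassigned Tickets*\n${lines.join('\n')}` } }];
```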
by Itamar
ICP Scoring Agent (n8n + Explorium + LLM)

This workflow automates Ideal Customer Profile (ICP) scoring for any company using a combination of Explorium data and an LLM-driven evaluation framework.

How It Works
Input: Company name is submitted via form.
Data Enrichment: Explorium's MCP Server is used to fetch firmographic, hiring, and tech data about the company.
Scoring Logic: An AI agent (LLM) applies a 3-pillar framework to assess and score the company.
Output: A structured JSON or Google Doc summary is generated using the AgentGeeks formatter.

Scoring System (100 points total)
| Pillar                     | Max Points |
|----------------------------|------------|
| Strategic Fit              | 40         |
| AI / Tech Readiness        | 40         |
| Engagement & Reachability  | 20         |

Scoring Criteria
**Strategic Fit**: Industry, size, use case, buyer roles
**Tech Readiness**: AI maturity, hiring trends, stack visibility
**Reachability**: Geography, contactability, data quality

Verdict Scale
90–100: Ideal ICP
70–89: Good Fit
40–69: Medium Fit
< 40: Poor Fit

Workflow Components
**Trigger**: Form submission via webhook
**MCP Client**: Pulls enriched company data via Explorium's MCP API
**AI Agent**: Uses Anthropic Claude (or other LLM) to calculate scores
**Output**: Results are posted to a structured endpoint (e.g. Google Doc or JSON API)

Dependencies
n8n (self-hosted or cloud)
Explorium MCP credentials and access
LLM API (e.g., Anthropic Claude, OpenAI, etc.)
Optional: AgentGeeks formatter or similar doc generator

Use Case
This ICP scoring system is designed for GTM and sales teams to:
Automate lead prioritization
Qualify accounts before outbounding
Sync ICP data into CRMs, routing systems, or reporting layers

Example Output in Google Doc
{
  "company": "Acme Inc.",
  "score": 87,
  "verdict": "Good Fit",
  "pillars": {
    "strategic_fit": 35,
    "tech_readiness": 37,
    "reachability": 15
  },
  "summary": "Acme Inc. is a mid-sized SaaS company with strong AI hiring activity and a buyer profile aligned to enterprise IT. Moderate reachability via firmographic signals."
}
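For clarity, the verdict thresholds above can be expressed in a few lines of code. This is only an illustration of the scale; in the workflow itself the pillar scores and verdict come from the LLM agent:

```javascript
// Illustration only: map the three pillar scores to a total and a verdict band.
function verdictFor(pillars) {
  const score = pillars.strategic_fit + pillars.tech_readiness + pillars.reachability;
  if (score >= 90) return { score, verdict: 'Ideal ICP' };
  if (score >= 70) return { score, verdict: 'Good Fit' };
  if (score >= 40) return { score, verdict: 'Medium Fit' };
  return { score, verdict: 'Poor Fit' };
}

// Using the pillar scores from the sample output above:
console.log(verdictFor({ strategic_fit: 35, tech_readiness: 37, reachability: 15 }));
// -> { score: 87, verdict: 'Good Fit' }
```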
by Joachim Brindeau
Are you looking to install external libraries for your self-hosted N8N instance? This automated workflow makes adding npm packages to your N8N environment quick and effortless. Beware: this workflow only works on self-hosted instances.

What This Workflow Does
This solution automatically installs npm packages like axios, cheerio, or node-fetch in your self-hosted N8N Docker container, making them immediately available in Code nodes.

Key features
Automated Installation: No manual npm commands needed
Daily Updates: Scheduled trigger keeps packages current
Smart Installation: Only installs missing packages
Multiple Triggers: Manual, scheduled, and on startup of the N8N instance, so you can upgrade your N8N version without worrying about external libraries

How to install and update external libraries automatically

Step 1: Setting Up Your Environment Variables
Before using external libraries in N8N Code nodes, configure these environment variables in your Docker Compose file.
Option A to allow specific external npm packages in N8N Code nodes: NODE_FUNCTION_ALLOW_EXTERNAL=axios,cheerio,node-fetch
Option B to allow all external npm packages in Code nodes: NODE_FUNCTION_ALLOW_EXTERNAL=*

Step 2: Import the external packages workflow
Import the workflow into your N8N instance by copy-pasting all nodes.

Step 3: Input the list of external libraries you need
Edit the libraries_set node and change the comma-separated list (e.g., axios,cheerio,node-fetch). If you chose Option A above, update your NODE_FUNCTION_ALLOW_EXTERNAL variable with the same packages.

Step 4: Start the workflow!
Run the workflow manually or let it trigger automatically.

Why use this to install NPM packages in N8N?
Managing external packages manually in N8N can be time-consuming. This workflow automates the entire process, making sure your libraries are always installed and up to date.
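The "smart install" step can be sketched as an n8n Code node like the one below. This is only an illustration, not the template's exact node: it assumes the container allows require('child_process') via NODE_FUNCTION_ALLOW_BUILTIN=child_process, and that the libraries_set node outputs the comma-separated list in a libraries field.

```javascript
// Sketch of a "smart install" step: install only the npm packages that are missing.
// Assumes NODE_FUNCTION_ALLOW_BUILTIN=child_process is set and that the libraries_set
// node provides a comma-separated string in the `libraries` field.
const { execSync } = require('child_process');

const wanted = $input.first().json.libraries.split(',').map(s => s.trim()).filter(Boolean);
const newlyInstalled = [];

for (const pkg of wanted) {
  try {
    require.resolve(pkg); // already available, nothing to do
  } catch (error) {
    // The install location depends on your image; /usr/local/lib/node_modules/n8n is a guess.
    execSync(`npm install ${pkg}`, { cwd: '/usr/local/lib/node_modules/n8n', stdio: 'inherit' });
    newlyInstalled.push(pkg);
  }
}

return [{ json: { requested: wanted, newlyInstalled } }];
```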
by Danger
Automated Execution Pruning
This workflow is designed to help you manage and optimize your n8n instance by automatically pruning old workflow executions, ensuring a cleaner environment and improved performance. You can customize the retention period to suit your needs.

Key Features:
Configurable Retention Period: The workflow is preconfigured to delete workflow executions older than 10 days. You can easily adjust this duration by modifying the condition in the If node.
Daily Automation: Using the Schedule Trigger, the workflow runs daily at the specified time (default: 4:44 AM), retrieving all workflow executions and identifying those that are older than the defined retention period.
On-Demand Testing: The Manual Trigger allows you to test the workflow at any time, ensuring everything is working as expected.
Decision Making: The If node evaluates each execution based on its start date and determines whether it should be deleted or retained.
Execution Pruning:
Delete Action: Executions meeting the criteria are removed via the Delete Execution node.
No-Operation: Executions that don't meet the criteria remain untouched.

Workflow Nodes:
Manual Trigger: Enables on-demand testing of the workflow.
Schedule Trigger: Runs the workflow daily at the configured time.
n8n List Execution: Fetches all executions in your n8n instance.
If Node: Compares the execution's start date with the configured retention period.
Delete Execution: Deletes executions older than the specified retention period.
No Operation: Serves as a placeholder for executions that don't meet the pruning criteria.

How to Customize:
**Retention Period**: Update the If node's condition to modify the retention period. For instance, change 10 * 24 * 60 * 60 * 1000 to the desired number of days in milliseconds (see the sketch below).
**Schedule**: Adjust the timing of the Schedule Trigger to match your preferred automation schedule.

This workflow ensures your instance remains efficient by keeping only the relevant execution logs. Use it to maintain a streamlined and clutter-free environment effortlessly.
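For reference, the retention check can also be expressed as a Code node. This is only an equivalent sketch of the If node's logic, and the startedAt field name should be verified against the output of the n8n List Execution node in your instance:

```javascript
// Equivalent of the If node's retention check, written as a Code node for clarity.
// 10 days in milliseconds; change the first factor to adjust the retention period.
const retentionMs = 10 * 24 * 60 * 60 * 1000;
const cutoff = Date.now() - retentionMs;

// Keep only executions that are old enough to prune.
// `startedAt` is assumed from the execution list output - verify it on your instance.
return $input.all().filter(item => new Date(item.json.startedAt).getTime() < cutoff);
```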
by Habeeb Mohammed
Who's it for
This workflow is perfect for individuals who want to maintain detailed financial records without the overhead of complex budgeting apps. If you prefer natural language over data entry forms and want an AI assistant to handle the bookkeeping, this template is for you.

It's especially useful for:
People who want to track cash and online transactions separately
Anyone who lends money to friends/family and needs debt tracking
Users comfortable with Slack as their primary interface
Those who prefer conversational interactions over manual spreadsheet updates

What it does
This AI-powered finance tracker transforms your Slack workspace into a personal finance command center. Simply mention your bot with transactions in plain English (e.g., "₹500 cash food, borrowed ₹1000 from John"), and the AI agent will:
Parse transactions using natural language understanding via Google Gemini
Calculate balance changes for cash and online accounts
Show a preview of changes before saving anything
Update Google Sheets only after you approve
Track debts (who owes you, who you owe, repayments)
Send daily reminders at 11 PM with current balances and active debts

The workflow maintains conversational context using PostgreSQL memory, so you can say things like "yesterday's transactions" or "that payment to Sarah" and it understands the context.

How it works
Scheduled Daily Check-in (11 PM)
Fetches current balances from Google Sheets
Retrieves all active debts
Formats and sends a Slack message with balance summary
Prompts you to share the day's transactions

AI Agent Transaction Processing
When you mention the bot in Slack:

Phase 1: Parse & Analyze
Extracts amount, payment type (cash/online), category (food, travel, etc.)
Identifies transaction type (expense, income, borrowed, lent, repaid)
Stores conversation context in PostgreSQL memory

Phase 2: Calculate & Preview
Reads current balances from Google Sheets
Calculates new balances based on transactions
Shows formatted preview with projected changes
Waits for your approval ("yes"/"no")

Phase 3: Update Database (only after approval)
Logs transactions with unique IDs and timestamps
Updates debt records with person names and status
Recalculates and stores new balances
Handles debt lifecycle (Active → Settled)

Phase 4: Confirmation
Sends success message with updated balances
Shows active debts summary
Includes logging timestamp

Requirements
Essential Services:
n8n instance (self-hosted or cloud)
Slack workspace with admin access
Google account
Google Gemini API key
PostgreSQL database
Recommended: Claude AI model (mentioned in workflow notes as a better alternative to Gemini)

How to set up
1. Google Sheets Setup
Create a new Google Sheet with three tabs named exactly:

Balances Tab:
| Date | Cash_Balance | Online_Balance | Total_Balance |
|------|--------------|----------------|---------------|

Transactions Tab:
| Transaction_ID | Date | Time | Amount | Payment_Type | Category | Transaction_Type | Person_Name | Description | Added_At |
|----------------|------|------|--------|--------------|----------|------------------|-------------|-------------|----------|

Debts Tab:
| Person_Name | Amount | Type | Date_created | Status | Notes |
|-------------|--------|------|--------------|--------|-------|

Add header rows and one initial balance row in the Balances tab with today's date and starting amounts.

2. Slack App Setup
Go to api.slack.com/apps and create a new app
Under OAuth & Permissions, add these Bot Token Scopes:
app_mentions:read
chat:write
channels:read
Install the app to your workspace
Copy the Bot User OAuth Token
Create a dedicated channel (e.g., #personal-finance-tracker)
Invite your bot to the channel

3. Google Gemini API
Visit ai.google.dev
Create an API key
Save it for n8n credentials setup

4. PostgreSQL Database
Set up a PostgreSQL database (you can use the Supabase free tier):
Create a new project
Note down the connection details (host, port, database name, user, password)
The workflow will auto-create the required table

5. n8n Workflow Configuration
Import the workflow and configure:

A. Credentials
Google Sheets OAuth2: Connect your Google account
Slack API: Add your Bot User OAuth Token
Google Gemini API: Add your API key
PostgreSQL: Add database connection details

B. Update Node Parameters
All Google Sheets nodes: Select your finance spreadsheet
Slack nodes: Select your finance channel
Schedule Trigger: Adjust the time if you prefer a different check-in hour (default: 11 PM)
Postgres Chat Memory: Change sessionKey to something unique (e.g., finance_tracker_your_name)
Keep tableName as n8n_chat_history_finance or rename it consistently

C. Slack Trigger Setup
Activate the "Bot Mention trigger" node
Copy the webhook URL from n8n
In the Slack App settings, go to Event Subscriptions
Enable events and paste the webhook URL
Subscribe to the bot event: app_mention
Save changes

6. Test the Workflow
Activate both workflow branches (scheduled and agent)
In your Slack channel, mention the bot: @YourBot ₹100 cash snacks
The bot should respond with a preview
Reply "yes" to approve
Verify that the Google Sheets are updated

How to customize
Change Transaction Categories
Edit the AI Agent's system message to add/remove categories. Current categories: travel, food, entertainment, utilities, shopping, health, education, other

Modify Daily Check-in Time
Change the Schedule Trigger's triggerAtHour value (0-23 in 24-hour format).

Add Currency Support
Replace ₹ with your currency symbol in:
the Format Daily Message code node
the AI Agent system prompt examples

Switch AI Models
The workflow uses Google Gemini, but the notes recommend Claude. To switch:
Replace the "Google Gemini Chat Model" node
Add Claude credentials
Connect it to the AI Agent node

Customize Debt Types
Modify the AI Agent's system prompt to change the debt handling logic:
Currently: I_Owe and They_Owe_Me
You can add more types or change the naming

Add More Payment Methods
Current: cash, online
To add more (e.g., credit card):
Update the AI Agent prompt
Modify the Balances sheet structure
Update the balance calculation logic

Change Approval Keywords
Edit the AI Agent's Phase 2 approval logic to recognize different approval phrases.

Add Spending Analytics
Extend the daily check-in to calculate:
Weekly/monthly spending summaries
Category-wise breakdowns
Use additional Code nodes to process the transaction history (see the sketch after the Pro Tips below).

Important Notes
Never trigger with normal messages - Only use app mentions (@botname) to avoid infinite loops where the bot replies to its own messages.
Context Awareness - The bot remembers conversation history, so you can reference "yesterday", "last week", or previous transactions naturally.
Data Privacy - All your financial data stays in your Google Sheets and PostgreSQL database. The AI only processes transaction text temporarily.
Backup Regularly - Export your Google Sheets periodically as a backup.
Pro Tips:
Start with small test transactions to ensure everything works
Use consistent person names for debt tracking
The bot understands various formats: "₹500 cash food" = "paid 500 rupees in cash for food"
You can batch transactions in one message: "₹100 travel, ₹200 food, ₹50 snacks"
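As a starting point for the "Add Spending Analytics" customization above, here is a minimal Code-node sketch of a category-wise breakdown. It assumes the Transactions rows are passed in as items with Amount, Category, and Transaction_Type fields matching the sheet headers:

```javascript
// Sketch of a category-wise spending breakdown for the daily check-in.
// Assumes each incoming item is one row from the Transactions tab.
const totals = {};

for (const item of $input.all()) {
  const row = item.json;
  if (row.Transaction_Type !== 'expense') continue; // only count spending
  const category = row.Category || 'other';
  totals[category] = (totals[category] || 0) + Number(row.Amount);
}

const summary = Object.entries(totals)
  .sort((a, b) => b[1] - a[1])
  .map(([category, amount]) => `${category}: ₹${amount}`)
  .join('\n');

return [{ json: { summary } }];
```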
by Ricardo Espinozaas
Use Case
When you have a call coming up with a new potential customer, one of the keys to getting the most out of it is finding out as much as you can about them beforehand. Normally this involves a lot of manual research before every call. This workflow automates that tedious work for you.

What this workflow does
The workflow runs every time a new call is booked via your Calendly. It then filters out personal emails before enriching the email. If the email is attached to a company, it enriches the company and upserts it in your Hubspot CRM.

Setup
Add Clearbit, Hubspot, and Calendly credentials.
Click on Test workflow.
Book a meeting on Calendly so the event starts the workflow.

Be aware that you can adapt this workflow to work with your enrichment tool, CRM, and booking tool of choice.
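The template handles the personal-email check with a filter step; purely as an illustration, a Code-node version of that step might look like the sketch below. The email field name and the domain list are assumptions, not part of the template:

```javascript
// Illustration of the "filter out personal emails" step as a Code node.
// The domain list is a small, hypothetical sample; extend it as needed.
const personalDomains = ['gmail.com', 'yahoo.com', 'outlook.com', 'hotmail.com', 'icloud.com'];

return $input.all().filter(item => {
  const email = (item.json.email || '').toLowerCase(); // field name assumed from the trigger output
  const domain = email.split('@')[1] || '';
  return !personalDomains.includes(domain); // keep only business addresses
});
```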
by Airtop
Monitoring Job Changes on LinkedIn

Use Case
This automation tracks job changes among your LinkedIn connections and extracts relevant details. It's ideal for triggering timely outreach, updating CRM records, or feeding lead scoring workflows based on new roles.

What This Automation Does
It scrapes your LinkedIn "Job Changes" feed and returns:
Name of the person
Their new position
LinkedIn profile URL
Functional category (e.g., marketing, sales, HR, executive)
Each run processes 5 job changes at a time.

How It Works
Manual Trigger: Starts the workflow when the user clicks "Test workflow."
Airtop Enrichment: Navigates to the LinkedIn job changes page and extracts:
name
new_position
linkedin_profile_url
position_function (classification such as marketing, sales, HR, etc.)
Formatting: Output is structured into clean JSON for use in further workflows.

Setup Requirements
Airtop Profile connected to LinkedIn
Airtop API key configured in n8n
A LinkedIn account with a populated "Job Changes" feed

Next Steps
**Automate Alerts**: Add Slack, email, or CRM integrations to notify your team.
**Enrich and Score Leads**: Chain this with your ICP scoring workflow to evaluate new roles.
**Customize Scope**: Expand extraction to more than 5 job changes or add filters based on job titles or functions.

Read more about Monitoring Job Changes on LinkedIn.
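Purely as an illustration of the formatting step, the sketch below reshapes the Airtop extraction into one n8n item per job change. The jobChanges field name is hypothetical, so inspect the node's actual output and adjust the path:

```javascript
// Sketch of the formatting step: one item per extracted job change.
// `jobChanges` is a placeholder for wherever your Airtop node puts the extracted array.
const extraction = $input.first().json;
const changes = extraction.jobChanges ?? [];

return changes.slice(0, 5).map(change => ({
  json: {
    name: change.name,
    new_position: change.new_position,
    linkedin_profile_url: change.linkedin_profile_url,
    position_function: change.position_function, // e.g., marketing, sales, HR, executive
  },
}));
```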
by Ricardo Espinozaas
Use Case
Whenever someone shows interest in your offerings by subscribing to a list in ConvertKit, they could be a potential new customer. Typically you then need to gather more detailed information about them (data enrichment) and update their profile in your CRM system to better manage and nurture your relationship with them. This workflow does all of this for you!

What this workflow does
The workflow runs every time a user subscribes to a ConvertKit list. It then filters out personal emails before enriching the email. If the email is attached to a company, it enriches the company and upserts it in your Hubspot CRM.

Setup
Add Clearbit, Hubspot, and ConvertKit credentials.
Click on Test workflow.
Subscribe to a list on ConvertKit to trigger the workflow.

Be aware that you can adapt this workflow to work with your enrichment tool, CRM, and email automation tool of choice.
by Risper
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How It Works
This n8n workflow automatically discovers high-quality business leads from Reddit by analysing posts across targeted subreddits.
Loads your business profile from a connected Google Sheet.
Uses AI to identify relevant subreddits where your potential customers engage.
Generates intent-based Reddit search queries based on your services, keywords, and client pain points.
Searches Reddit in real time using the generated queries.
Classifies posts based on whether they show lead potential.
Analyses high-potential posts for service fit, urgency, and estimated value.
Filters and scores leads to prioritize high-conversion opportunities.
Saves the most promising leads to a dedicated Google Sheet.
Sends Slack alerts to notify your sales team for immediate follow-up.

Requirements
Before using this workflow, ensure the following services are connected and configured:
Google Sheets (OAuth2): Reads your business profile and writes qualified leads
Reddit (OAuth2): Performs Reddit post searches based on the generated queries
Google Gemini API: Analyses posts, generates queries, and extracts insights
Slack API: Notifies your team with qualified lead summaries

Google Sheets Setup
You will need two Google Sheets:

Business Profile Sheet (Input)
This sheet contains a single row describing your service business. The workflow reads this to generate relevant subreddit selections and search queries.
Required fields (as headers in row 1):
profession
industry
primary_services
service_keywords
target_client_profile
pain_points
intent_signals
urgency_indicators
price_range

Reddit Leads Sheet (Output)
This sheet stores high-quality Reddit posts identified as potential leads. The workflow appends or updates rows based on post_id to avoid duplication.
Expected columns:
post_id
post_url
post_title
post_post
post_subreddit
post_date
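Purely as an illustration of the filtering and scoring step (the actual analysis is done by the Gemini agent), a downstream Code node could keep only high-scoring posts and shape them for the output sheet. The lead_score field and the threshold are hypothetical:

```javascript
// Hypothetical filter step: keep only posts the agent scored above a threshold
// and map them onto the Reddit Leads Sheet columns.
const MIN_SCORE = 70;

return $input.all()
  .filter(item => Number(item.json.lead_score) >= MIN_SCORE)
  .map(item => ({
    json: {
      post_id: item.json.post_id,
      post_url: item.json.post_url,
      post_title: item.json.post_title,
      post_subreddit: item.json.post_subreddit,
      lead_score: item.json.lead_score,
    },
  }));
```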
by Stefan
Track n8n Node Definitions from GitHub and Export to Google Sheets

Overview
This workflow automatically retrieves and processes metadata from the official n8n GitHub repository, filters all available .node.json files, parses their structure, and appends structured information to a Google Sheet. Perfect for developers, community managers, and technical writers who need to maintain up-to-date information about n8n's evolving node ecosystem.

Setup Instructions

Prerequisites
Before setting up this workflow, ensure you have:
A GitHub account with API access
A Google account with Google Sheets access
An active n8n instance (cloud or self-hosted)

Step 1: GitHub API Configuration
Navigate to GitHub Settings → Developer Settings → Personal Access Tokens
Generate a new token with public_repo permissions
Copy the generated token and store it securely
In n8n, create a new "GitHub API" credential
Paste your token in the credential configuration and save

Step 2: Google Sheets Setup
Create a new Google Sheets document
Set up the following column headers in the first row:
node (Column A) - Node identifier/name
nodeVersion (Column B) - Version of the node
codexVersion (Column C) - Codex version number
categories (Column D) - Node categories
credentialDocumentation (Column E) - Credential documentation URL
primaryDocumentation (Column F) - Primary documentation URL
Note down the Google Sheets document ID from the URL
Configure Google Sheets OAuth2 credentials in n8n

Step 3: Workflow Configuration
Import the workflow into your n8n instance
Update the following placeholder values:
Replace YOUR_GOOGLE_SHEETS_DOCUMENT_ID with your actual document ID
Replace YOUR_WEBHOOK_ID if using webhook functionality
Configure the GitHub API credentials in the HTTP Request nodes
Set up Google Sheets credentials in the Google Sheets nodes
Share your Google Sheets document with the email address associated with your Google OAuth2 credentials
Grant "Editor" permissions to allow the workflow to write data

Google Sheets Template Details
The workflow creates a structured dataset with these columns:
**node**: Node identifier (e.g., n8n-nodes-base.slack)
**nodeVersion**: Version of the node (e.g., 1.0.0)
**codexVersion**: Codex version number (e.g., 1.0.0)
**categories**: Node categories (e.g., Communication, Productivity)
**credentialDocumentation**: URL to credential documentation
**primaryDocumentation**: URL to primary node documentation

Customization Options

Modifying Data Extraction
You can customize the "Format Data" node to extract additional fields:
Add new assignments in the Set node
Modify the column mapping in the Google Sheets node
Update your spreadsheet headers accordingly

Changing Update Frequency
To run this workflow on a schedule:
Replace the Manual Trigger with a Cron node
Set your desired schedule (e.g., daily, weekly)
Configure appropriate timing to avoid API rate limits

Adding Filters
Customize the "Filter Node Files" code node to:
Filter specific node types
Include/exclude certain categories
Process only recently updated nodes
A sketch of this filter step is shown after the use-case list below.

Features
Fetches all node definitions from the n8n-io/n8n repository
Filters for .node.json files only
Downloads and parses metadata automatically
Extracts key fields like node names, versions, categories, and documentation URLs
Appends structured data to Google Sheets with batch processing
Includes error handling and retry mechanisms
Clears existing data before appending new information for fresh results

Use Cases
This workflow is ideal for:
Tracking changes in official n8n node definitions over time
Auditing node categories and documentation links for completeness
Building custom dashboards from node metadata
Community management and documentation maintenance
Integration planning and compatibility analysis
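A minimal sketch of the "Filter Node Files" step, assuming the preceding HTTP Request node returns the repository tree in the GitHub git/trees API shape (a tree array of { path, type, url } entries); the actual workflow may fetch the file list differently:

```javascript
// Keep only .node.json entries from a GitHub "git/trees" style response.
const tree = $input.first().json.tree ?? [];

return tree
  .filter(entry => entry.type === 'blob' && entry.path.endsWith('.node.json'))
  .map(entry => ({ json: { path: entry.path, url: entry.url } }));
```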
by Andrew
Who is this for?
This workflow is ideal for n8n self-hosted users, DevOps engineers, and automation developers who want to automatically back up their n8n workflows to GitHub on a regular basis.

What problem is this workflow solving?
Manually backing up n8n workflows can be time-consuming and prone to human error. This workflow automates the backup process, ensuring that all workflows are safely stored in a version-controlled GitHub repository every 24 hours.

What this workflow does
This automation runs daily to back up all workflows from your n8n instance to a specified GitHub repository. Each workflow is saved as a .json file using its unique ID, organized into a folder path defined by repo_path. The workflow is designed to manage memory usage efficiently by recursively calling itself. Once the backup is complete, it optionally sends a Slack notification to confirm success.

Setup
Configure the Config node in the subworkflow to set:
GitHub Repo Owner
GitHub Repo Name
Main folder path (repo_path)
Connect your GitHub and (optionally) Slack credentials.
Set the workflow to run on a daily cron schedule.
Test the workflow manually to confirm the GitHub integration works.

Sign up for a free consultation and find out how n8n can help you.
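As a rough illustration of how each backup file path could be derived from the Config node's repo_path and the workflow's ID (the exact node names and fields in the template may differ):

```javascript
// Sketch: derive the GitHub file path for each workflow returned by the n8n node.
// Assumes a Config node that outputs repo_path (e.g., "workflows/") and that each
// incoming item is one workflow with id and name fields.
const repoPath = $('Config').first().json.repo_path;

return $input.all().map(item => {
  const wf = item.json;
  return {
    json: {
      filePath: `${repoPath}${wf.id}.json`,
      commitMessage: `Backup workflow "${wf.name}" (${wf.id})`,
      content: JSON.stringify(wf, null, 2),
    },
  };
});
```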
by Candice Capelle
Who is this template for?
This template is for everyone who has to take notes during a call:
Talent Acquisition Managers / Talent Acquisition Specialists / Recruiters
HR professionals
Sales teams, customer success teams
Product teams / User Experience Designers / anyone conducting user research interviews

Use case
This workflow allows specific events created in Google Calendar (or any other meeting scheduling tool like Calendly) to trigger the duplication and renaming of a specific template document.

Example: for each new screening call scheduled in your calendar, you want to create a draft of your screening interview template for the role, titled "{Name of the candidate} | {Date of the interview}", and stored in a specific folder in your Google Drive.

This workflow could then be extended to copy the link to the file into a Notion database shared with the team (see the "To go further" section).

This workflow ensures that if you're jumping from call to call, you're already set up to take notes, and every document is tidied up and sorted in a structured way!

How it works
The workflow starts when a new event is created in Google Calendar
The Filter node then selects a specific type of event, depending on a chosen pattern (title includes a specific term, organizer is X, attendees include Y, etc.)
The workflow then searches for a specific folder in your Google Drive, where the file you want to duplicate is located
The workflow then searches for the specific file you want to duplicate
The last step duplicates and renames the file with variables from your Google Calendar event

Set up
Set up credentials for Google Calendar, Google Drive, and Google File. You'll need a Google Workspace account.
Set up the Filter node with the pattern you want to look for to retrieve specific events in your calendar
Set up the Google Drive folder you want to search in
Set up the Google File you want to duplicate
Set up variables at the last step to rename your duplicated file however you want, or add a description (see the rename sketch at the end of this section)

To go further
Here are a few ideas to enhance this workflow depending on your specific needs:
Instead of a filter, separate your flow depending on your use case (e.g., you want to fetch different templates depending on the type of call)
Switch Google Calendar for another trigger (Calendly, Hubspot, etc.)
10 minutes before the event, send the duplicated Google File to the meeting organizer through Slack
The day after the event, if the event hasn't been cancelled, add the link to the Google File to your ATS, Hubspot, etc.
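As a rough sketch of the rename step mentioned in the setup, the logic could look like this. The trigger node name, the event title pattern, and the use of start.dateTime (rather than start.date for all-day events) are all assumptions to adapt to your own calendar:

```javascript
// Sketch: build the "{Name of the candidate} | {Date of the interview}" file name
// from standard Google Calendar event fields (summary, start.dateTime).
const event = $('Google Calendar Trigger').first().json; // node name is an assumption
const candidateName = event.summary.split(' - ')[0];      // assumes "Candidate Name - Screening call" titles
const interviewDate = event.start.dateTime.split('T')[0]; // YYYY-MM-DD

return [{ json: { newFileName: `${candidateName} | ${interviewDate}` } }];
```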