by L Hùng
This workflow acts as an error handler, sending real-time notifications to Telegram when another workflow fails. It provides detailed error information, including workflow name, timestamp, execution URL, last executed node, and error message.

**Pre-Conditions**
- A Telegram bot created via BotFather.
- The bot token and the Telegram group/channel chatId.
- An active n8n instance with the Telegram and Error Trigger nodes installed.

**Setup**
- **Workflow Configuration:** Import the workflow into n8n. Update the Telegram chatId in the Config node. Add your Telegram bot token in the Telegram node credentials.
- **Error Workflow Setup:** Set this workflow as the Error Workflow in your other workflows.
- **Testing:** Trigger an error in another workflow to verify that the Telegram notification arrives.

**Who the Workflow is For**
- **Developers:** Monitoring workflow failures in real time.
- **Teams:** Managing multiple n8n workflows and needing instant error alerts.
- **n8n Users:** Looking for a simple way to handle workflow errors via Telegram.

**Primary Use**
- Automates error notifications for failed workflows.
- Sends detailed error reports to Telegram for quick troubleshooting.
- Easily customizable to fit specific monitoring needs.
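For reference, the underlying call the Telegram node makes is the Bot API's `sendMessage` method. A minimal sketch in Python — the bot token, chat ID, and error fields below are placeholders, not values from this template:

```python
import requests

BOT_TOKEN = "123456:ABC-your-bot-token"   # placeholder token from BotFather
CHAT_ID = "-1001234567890"                # placeholder group/channel chatId

def notify_error(workflow_name, execution_url, last_node, message):
    """Send a formatted error alert to Telegram via the Bot API."""
    text = (
        f"Workflow failed: {workflow_name}\n"
        f"Last node: {last_node}\n"
        f"Error: {message}\n"
        f"Execution: {execution_url}"
    )
    resp = requests.post(
        f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
        json={"chat_id": CHAT_ID, "text": text},
        timeout=10,
    )
    resp.raise_for_status()

notify_error("My Workflow", "https://n8n.example.com/execution/42",
             "HTTP Request", "401 Unauthorized")
```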
by Keith Rumjahn
**Who's this for?**
- If you own a website and need to analyze your Google Analytics data
- If you need to create an SEO report on which pages are getting the most traffic or how your Google search terms are performing
- If you want to grow your site based on suggestions from data

**Use case**
Instead of hiring an SEO expert, I run this report weekly. It compares this week's data to the week before:
- Views based on countries
- The top performing pages
- Google Search Console performance

Watch the YouTube tutorial here. Get my SEO A.I. agent system here. Read my detailed case study here.

**How it works**
The workflow gathers Google Analytics data for the past 7 days, then gathers the data for the week before for comparison. It does this 3 times to get: views per country, engagement per page, and Google Search Console results for organic search. The Google Analytics nodes already have the correct dimensions and metrics selected. At the end, the data is passed to openrouter.ai for AI analysis. Finally, the result is saved to Baserow.

**How to use this**
- Input your Google Analytics credentials
- Input your property ID
- Input your Openrouter.ai credentials
- Input your Baserow credentials

You will need to create a Baserow database with the columns: Name, Country Views, Page Views, Search Report, Blog (name of your blog).

Created by Rumjahn
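Under the hood, the Google Analytics nodes call the GA4 Data API's `runReport` method. A rough sketch of the country-views query with the week-over-week comparison — the property ID and OAuth token are placeholders, and the exact dimensions/metrics the template selects may differ:

```python
import requests

PROPERTY_ID = "123456789"          # placeholder GA4 property ID
ACCESS_TOKEN = "ya29.placeholder"  # OAuth2 token obtained elsewhere

def run_report(date_range):
    """Fetch screenPageViews per country for one date range."""
    body = {
        "dateRanges": [date_range],
        "dimensions": [{"name": "country"}],
        "metrics": [{"name": "screenPageViews"}],
    }
    resp = requests.post(
        f"https://analyticsdata.googleapis.com/v1beta/properties/{PROPERTY_ID}:runReport",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Two calls, mirroring the "this week vs. last week" comparison
this_week = run_report({"startDate": "7daysAgo", "endDate": "today"})
last_week = run_report({"startDate": "14daysAgo", "endDate": "8daysAgo"})
```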
by Romain Jouhannet
**Linear Project/Issue Status and End Date to Productboard Feature Sync**

Sync project and issue data between Linear and Productboard to keep teams aligned. This workflow updates Productboard features with the status and end date from Linear projects, or the due date from Linear issues. It ensures consistent data and sends a Slack notification whenever changes are made.

**Features**
- Listens for updates in Linear projects/issues.
- Maps Linear statuses to Productboard feature statuses.
- Updates Productboard feature details, including timeframe.
- Sends a Slack notification summarizing the updates.

**Setup**
1. Linear Credentials: Add your Linear API credentials in n8n.
2. Productboard Credentials: Configure the Productboard API credentials in n8n.
3. Linear Projects or Issues: Select the Linear project(s) or issue(s) you want to monitor for updates.
4. Productboard Custom Field: Create a custom field in Productboard named "Linear". This field should store the URL of the Linear project or issue you want to sync. Retrieve the UUID of the custom field in Productboard and set it up in the "Get Productboard Feature ID" node.
5. Slack Notification: Update the Slack node with the desired Slack channel ID.
6. Activate the Workflow: Enable the workflow to automatically sync data when triggered by updates in Linear.
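For context, Linear exposes a GraphQL API that the workflow queries for project status and target dates. A minimal sketch of such a query — the API key and project UUID are placeholders, and the exact fields the template reads may differ:

```python
import requests

LINEAR_API_KEY = "lin_api_placeholder"  # personal API key, placeholder

query = """
query Project($id: String!) {
  project(id: $id) {
    name
    state
    targetDate
  }
}
"""

resp = requests.post(
    "https://api.linear.app/graphql",
    headers={"Authorization": LINEAR_API_KEY, "Content-Type": "application/json"},
    json={"query": query, "variables": {"id": "PROJECT-UUID-PLACEHOLDER"}},
    timeout=30,
)
resp.raise_for_status()
project = resp.json()["data"]["project"]
# state and targetDate are what get mapped onto the Productboard feature
print(project["name"], project["state"], project["targetDate"])
```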
by Blockia Labs
**Time Logging on Clockify Using Slack**

**How it works**
This workflow simplifies time tracking for teams and agencies by integrating Slack with Clockify. It enables users to log, update, or delete time entries directly within Slack, leveraging an AI-powered assistant for seamless and conversational interactions. Key features include:
- **Effortless Time Logging:** Create and manage time entries in Clockify without leaving Slack.
- **AI-Powered Assistant:** Get step-by-step guidance to ensure accurate and efficient time logging.
- **Project and Client Management:** Retrieve project and client information from Clockify effortlessly.
- **Overlap Prevention:** Avoid overlapping entries with built-in time validation.
- **Automated Descriptions:** Generate ethical, grammatically correct descriptions for time logs.

**Set up steps**
1. Prepare your integrations: Ensure you have active accounts for both Slack and Clockify. Generate your Clockify API credentials for integration.
2. Import the workflow: Download and import the workflow template into your n8n instance. Configure the workflow to connect with your Slack and Clockify accounts.
3. Configure the workflow: Add your Clockify API credentials in the workflow settings. Set up the Slack Trigger to listen for app mentions or specific commands.
4. Test the workflow: Use Slack to create a time entry and verify it in Clockify. Test updating and deleting existing entries to ensure smooth functionality. Check for any overlapping time logs or incorrect data entries.

**Why use this workflow?**
- **Efficiency:** Eliminate the need to switch between tools for time tracking.
- **Accuracy:** AI-driven validation ensures error-free entries.
- **Automation:** Simplify repetitive tasks like updating or deleting time logs.
- **Proactive Guidance:** A conversational assistant ensures smooth operations.
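For reference, creating a time entry through Clockify's REST API looks roughly like the sketch below. The workspace ID, project ID, and API key are placeholders; in the workflow, the AI assistant fills in the description and validates overlaps before a call like this is made:

```python
import requests

API_KEY = "clockify-api-key-placeholder"
WORKSPACE_ID = "workspace-id-placeholder"

def create_time_entry(start_iso, end_iso, description, project_id=None):
    """Create a Clockify time entry; timestamps are ISO 8601 UTC (e.g. 2024-05-01T09:00:00Z)."""
    body = {"start": start_iso, "end": end_iso, "description": description}
    if project_id:
        body["projectId"] = project_id
    resp = requests.post(
        f"https://api.clockify.me/api/v1/workspaces/{WORKSPACE_ID}/time-entries",
        headers={"X-Api-Key": API_KEY},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

entry = create_time_entry(
    "2024-05-01T09:00:00Z", "2024-05-01T10:30:00Z",
    "Sprint planning and backlog grooming",
)
```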
by Dvir Sharon
💼 LinkedIn Job Finder Automation using Bright Data API & Google Sheets

A comprehensive n8n automation that searches LinkedIn job postings using Bright Data's API and automatically organizes the results in Google Sheets for efficient job hunting and recruitment workflows.

**📋 Overview**
This workflow provides an automated LinkedIn job search solution that collects job postings based on your search criteria and organizes them in Google Sheets. Perfect for job seekers, recruiters, HR professionals, and talent acquisition teams.

**✨ Key Features**
- 🔍 **Smart Job Search:** Form-based input for city, job title, country, and job type
- 🛍 **LinkedIn Integration:** Uses Bright Data's LinkedIn dataset for accurate job posting data
- 📊 **Automated Organization:** Populates Google Sheets with structured job data
- 📧 **Real-time Processing:** Processes job search requests in real time
- 📈 **Data Storage:** Stores job details including company info, locations, and apply links
- 🔄 **Batch Processing:** Handles multiple job postings efficiently
- ⚡ **Fast & Reliable:** Built-in error handling for scraping
- 🎯 **Customizable Filters:** Advanced job filtering based on criteria

**🎯 What This Workflow Does**
- **Input:** Job search criteria (city, job title, country, and optional job type), search parameters (configurable filters and limits), and output preferences (Google Sheets destination).
- **Processing Steps:** Form submission → data request to the Bright Data API → status monitoring → data extraction → data filtering → sheet update → error handling.

**Output Data Points**

| Field | Description | Example |
|---|---|---|
| Job Title | Position title from posting | Senior Software Engineer |
| Company Name | Employer company name | Tech Solutions Inc. |
| Job Detail | Job summary/description | Remote position requiring 5+ years… |
| Location | Job location | San Francisco, CA |
| Company URL | Company profile link | View Profile |
| Apply Link | Direct application link | Apply Now |

**🚀 Setup Instructions**
Prerequisites: an n8n instance (self-hosted or cloud), a Google account with Sheets access, and a Bright Data account with LinkedIn dataset access.
1. Import the Workflow: Use JSON import in n8n.
2. Configure Bright Data: Add API credentials and the dataset ID.
3. Configure Google Sheets: Create the sheet, set credentials, map columns.
4. Update Workflow Settings: Replace placeholders with your actual data.
5. Test & Activate: Submit a test form and verify the data in Google Sheets.

**📖 Usage Guide**
- **Submitting Job Searches:** Go to your webhook URL and fill in the form with City (e.g., New York), Job Title (e.g., Software Engineer), Country (e.g., US), and Job Type (optional: Full-Time, Remote, etc.).
- **Understanding Results:** Comprehensive job data, company info and profile links, direct application links, location and job descriptions.
- **Customizing Search Parameters:** Edit the Create Snapshot ID node to change the time range (e.g., "Past month"), result limits, and company filters.

**🔧 Customization Options**
- **More Data Points:** Add salary, seniority, applicants, etc.
- **Custom Form Fields:** Add filters for salary, experience, industry.
- **Multiple Sheets:** Route results by job type or location.

**🚨 Troubleshooting**
- **Bright Data connection failed:** Check API credentials and dataset access.
- **No job data extracted:** Verify search parameters and API limits.
- **Google Sheets permission denied:** Re-authenticate and check sharing.
- **Form not working:** Check the webhook URL and field mappings.
- **Filter issues:** Review logic and data types.
- **Execution failed:** Check logs, retry logic, and network status.

**📊 Use Cases & Examples**
- **Job Seeker Dashboard:** Automate job search and track applications.
- **Recruitment Pipeline:** Source candidates and monitor hiring trends.
- **Market Research:** Analyze job trends and salary benchmarks.
- **HR Analytics:** Support workforce planning and competitive insights.

**⚙️ Advanced Configuration**
- **Batch Processing:** Queue multiple searches with delays.
- **Search History:** Track and analyze past searches.
- **Tool Integration:** Connect to CRM, Slack, databases, BI tools.

**📈 Performance & Limits**
- **Processing Time:** 30–60 seconds per search
- **Concurrent Requests:** 2–3 (depends on Bright Data plan)
- **Data Accuracy:** 95%+
- **Success Rate:** 90%+
- **Daily Capacity:** 50–200 searches
- **Memory:** ~50 MB per execution
- **API Calls:** 3–4 Bright Data + 1 Google Sheets per search

**🤝 Support & Community**
- **n8n Community:** community.n8n.io
- **Documentation:** docs.n8n.io
- **Bright Data Support:** Via your Bright Data dashboard
- **GitHub Issues:** Report bugs and request features

**🎯 Ready to Use!**
Your workflow is ready for automated LinkedIn job searching. Customize it to your recruiting or job search needs.

Webhook URL: https://your-n8n-instance.com/webhook/linkedin-job-finder

What Gets Extracted:
- ✅ Job Title
- ✅ Company Information
- ✅ Location Data
- ✅ Job Details
- ✅ Application Links
- ✅ Processing Timestamps

### Use Cases:
- 🔍 Job Search Automation
- 📊 Recruitment Intelligence
- 📝 Market Research
- 🎯 HR Analytics
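As a point of reference, the final sheet-update step corresponds to the Google Sheets API `values.append` call. A hedged sketch of that step only — the spreadsheet ID, sheet name, OAuth token, and sample row are placeholders; the Bright Data request itself is configured inside the template's HTTP nodes:

```python
import requests

SPREADSHEET_ID = "your-spreadsheet-id"   # placeholder
ACCESS_TOKEN = "ya29.placeholder"        # Google OAuth2 token

def append_job_rows(rows):
    """Append job rows (lists of cell values) to the target sheet."""
    resp = requests.post(
        f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}"
        "/values/Sheet1!A1:append?valueInputOption=RAW",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"values": rows},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

append_job_rows([
    ["Senior Software Engineer", "Tech Solutions Inc.",
     "Remote position requiring 5+ years…", "San Francisco, CA",
     "https://linkedin.com/company/example", "https://example.com/apply"],
])
```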
by Kirill Khatkevich
This workflow is a comprehensive solution for digital marketers, performance agencies, and e-commerce brands looking to scale their creative testing process on Meta Ads efficiently. It eliminates the tedious manual work of uploading assets, creating campaigns, and setting up ads one by one.

**Use Case**
Manually launching weekly creative tests is time-consuming and prone to errors. This workflow solves that problem by creating a fully automated pipeline: from a creative asset in a folder to a complete, ready-to-launch (but paused) ad structure in your Meta Ads account. It's perfect for teams that want to:
- Save hours of manual work every week.
- Systematically test a high volume of creatives.
- Maintain a structured and consistent campaign naming convention.
- Keep a detailed log of all created assets for data-driven performance analysis.

**How it Works**
The workflow is structured into four logical blocks:
1. Configuration & Scheduling: The workflow runs on a weekly schedule. A central "Configuration" Set node at the beginning holds all key variables (Ad Account ID, Page ID, Pixel ID), making it incredibly easy to adapt the template for different projects.
2. Creative Ingestion & Processing: It scans a specific Google Drive folder for new image and video files. Using an IF node, it branches the logic based on the file type. Each file is uploaded to the Meta Ads library, and a corresponding Ad Creative is built with a pre-defined destination URL.
3. Campaign & Ad Set Assembly: The workflow creates a single new Campaign with an OUTCOME_SALES objective. It then creates a single Ad Set optimized for OFFSITE_CONVERSIONS (e.g., "Add to Cart"), using the Pixel ID from the configuration. A Merge node intelligently combines the single Ad Set ID with every creative processed in the previous block, preparing the data for the final step.
4. Ad Creation & Data Logging: The workflow iterates through the prepared data, creating a unique Ad for each creative. Upon the successful creation of each ad, a new row is appended to a Google Sheet, logging all relevant IDs (CampaignID, AdSetID, AdID, CreativeID) and metadata for a complete audit trail.

**Setup Instructions**
To use this template, you need to configure a few key nodes.
1. Credentials: Connect your Meta Ads account. Connect your Google account (for both Drive and Sheets).
2. The ⚙️ Configuration Node (Set node): This is the most important step. Open the first Set node and fill in your specific values: adAccountId (your Meta Ad Account ID), pageId (the ID of the Facebook Page you're advertising for), and pixelId (your Meta Pixel ID for conversion tracking).
3. Google Sheets Node (Save Full Report to Sheet): Select your spreadsheet and the specific sheet where you want to save the reports. Make sure your sheet has columns with the following headers: CampaignID, AdSetID, AdID, CreativeID, FileName, MimeType, Timestamp.
4. Check URLs and IDs in HTTP Request Nodes: The template is configured to use the variables from the ⚙️ Configuration node. Double-check that the URLs in the Create Campaign, Create Ad Set, and Create ... Creative nodes correctly reference these variables (e.g., .../act_{{ $('⚙️ Configuration Meta Ads').item.json.adAccountId }}/campaigns). Verify the link in the Create Video Creative and Create Image Creative nodes points to your desired landing page.
5. Activate the Workflow: Set your desired schedule in the Schedule Trigger node. Save and activate the workflow.

**Further Ideas & Customization**
This workflow is a powerful foundation. You can easily extend it to:
- **Create a second workflow** that runs a week later, reads the Google Sheet, and pulls performance data for all the ads created.
- **A/B test ad copy** by adding different text variations from a spreadsheet.
- **Add a Slack or Email notification** at the end to confirm that the weekly campaign launch was successful.
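For orientation, the Create Campaign HTTP Request node corresponds to a Marketing API call roughly like the sketch below. The API version, access token, account ID, and campaign name are placeholders — consult the template's node for the exact parameters it sends:

```python
import requests

AD_ACCOUNT_ID = "1234567890"          # placeholder, without the "act_" prefix
ACCESS_TOKEN = "EAAB...placeholder"   # Meta system-user or user access token

def create_paused_campaign(name):
    """Create a paused sales campaign in the ad account."""
    resp = requests.post(
        f"https://graph.facebook.com/v19.0/act_{AD_ACCOUNT_ID}/campaigns",
        data={
            "name": name,
            "objective": "OUTCOME_SALES",
            "status": "PAUSED",                 # created paused, ready for review
            "special_ad_categories": "[]",
            "access_token": ACCESS_TOKEN,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]

campaign_id = create_paused_campaign("Creative Test | Week 23")
```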
by Jimleuk
This n8n template monitors active support issues in Linear.app, tracking the mood of the ongoing conversation between reporter and assignee using sentiment analysis. When sentiment dips into the negative, a notification is sent via Slack to alert the team.

**How it works**
- A scheduled trigger is used to fetch recently updated issues in Linear using the GraphQL node.
- Each issue's comment thread is passed into a simple Information Extractor node to identify the overall sentiment.
- The resulting sentiment analysis, combined with some issue details, is uploaded to Airtable for review.
- When the template is re-run at a later date, each issue is re-analysed for sentiment.
- Each issue's new sentiment state is saved to Airtable, whilst its previous state is moved to the "previous sentiment" column.
- An Airtable trigger is used to watch for recently updated rows.
- Each matching Airtable row is filtered to check whether it previously had a non-negative state but now has a negative state as its current sentiment.
- The results are sent via notification to a team Slack channel for prioritisation.

Check out the sample Airtable here: https://airtable.com/appViDaeaFw4qv9La/shrq6HgeYzpW6uwXL

**How to use**
- Modify the GraphQL filter to fetch issues for a relevant issue type, team, or person.
- Update the Slack channel to ensure messages are sent to the correct location or persons.
- The Airtable also serves as a snapshot of sentiment across support tickets for a given period. It's possible to use this to assess daily operations.

**Requirements**
- Linear for issue tracking (but feel free to use another system if preferred)
- Airtable for the database
- OpenAI for the LLM and sentiment analysis

**Customising the workflow**
- Add more granular levels of sentiment to reduce the number of alerts.
- Explore different types of sentiment based on issue types and customer types. This may help prioritise alerts and responses.
- Run across teams or categories of issues to get an overview of sentiment across the support organisation.
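A hedged illustration of the alerting rule — notify only when an issue flips from a non-negative to a negative sentiment — followed by the Slack call the notification step corresponds to. The channel ID, bot token, sentiment labels, and issue fields are placeholders:

```python
import requests

SLACK_BOT_TOKEN = "xoxb-placeholder"
SLACK_CHANNEL = "C0123456789"  # placeholder channel ID

def should_alert(previous_sentiment, current_sentiment):
    """Alert only on a transition into negative sentiment."""
    return current_sentiment == "negative" and previous_sentiment != "negative"

def send_alert(issue_id, issue_title, sentiment):
    resp = requests.post(
        "https://slack.com/api/chat.postMessage",
        headers={"Authorization": f"Bearer {SLACK_BOT_TOKEN}"},
        json={
            "channel": SLACK_CHANNEL,
            "text": f":warning: Sentiment on {issue_id} ({issue_title}) turned {sentiment}. Please review.",
        },
        timeout=30,
    )
    resp.raise_for_status()

if should_alert(previous_sentiment="neutral", current_sentiment="negative"):
    send_alert("SUP-142", "Export keeps failing", "negative")
```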
by Joey D’Anna
This template is an error handler that will log n8n workflow errors to a Monday.com board for troubleshooting and tracking.

**Prerequisites**
- A Monday account and a Monday credential
- A board on Monday for error logging, with the following columns and types: Timestamp (text), Error Message (text), Stack Trace (long text)
- The column IDs, determined using Monday's instructions

**Setup**
- Edit the Monday nodes to use your credential.
- Edit the node labeled CREATE ERROR ITEM to point to your error log board and group name.
- Edit the column IDs in the "Column Values" field of the UPDATE node to match the IDs of the fields on your error log board.
- To trigger error logging, select this automation as the error workflow on any automation.
- For more detailed logging, add Stop and Error nodes in your workflow to send specific error messages to your board.
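For context, creating an item on a Monday board goes through Monday's GraphQL API; the sketch below is roughly what the CREATE ERROR ITEM step amounts to. The board ID, group ID, column IDs, and API token are placeholders, and the exact GraphQL argument types can vary with the Monday API version:

```python
import json
import requests

MONDAY_API_TOKEN = "monday-token-placeholder"
BOARD_ID = 123456789          # placeholder error-log board ID
GROUP_ID = "topics"           # placeholder group ID

def log_error_item(workflow_name, timestamp, message, stack_trace):
    """Create an item on the error-log board with error details in its columns."""
    column_values = json.dumps({
        "timestamp": timestamp,        # column IDs are placeholders; use your board's IDs
        "error_message": message,
        "stack_trace": stack_trace,
    })
    query = """
    mutation ($board: ID!, $group: String!, $name: String!, $cols: JSON!) {
      create_item(board_id: $board, group_id: $group, item_name: $name, column_values: $cols) { id }
    }
    """
    resp = requests.post(
        "https://api.monday.com/v2",
        headers={"Authorization": MONDAY_API_TOKEN, "Content-Type": "application/json"},
        json={"query": query, "variables": {
            "board": BOARD_ID, "group": GROUP_ID,
            "name": f"Error in {workflow_name}", "cols": column_values,
        }},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```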
by Wildkick
🚀 Local Multi-LLM Testing & Performance Tracker

This workflow is perfect for developers, researchers, and data scientists benchmarking multiple LLMs with LM Studio. It dynamically fetches active models, tests prompts, and tracks metrics like word count, readability, and response time, logging results into Google Sheets. Easily adjust temperature 🔥 and top P 🎯 for flexible model testing.

Level of Effort: 🟢 Easy – Minimal setup with customizable options.

**Setup Steps:**
1. Install LM Studio and configure models.
2. Update the IP to connect to LM Studio.
3. Create a Google Sheet for result tracking.

**Key Outcomes:**
- Benchmark LLM performance.
- Automate results in Google Sheets for easy comparison.

Version 1.0
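LM Studio's local server exposes an OpenAI-compatible API, which is what the workflow's HTTP nodes talk to. A small sketch of fetching the loaded models and timing one prompt per model — the host/port, prompt, and metric choices are assumptions; adjust them to your LM Studio settings:

```python
import time
import requests

LM_STUDIO_URL = "http://localhost:1234/v1"  # default LM Studio server address (adjust IP/port)

models = requests.get(f"{LM_STUDIO_URL}/models", timeout=10).json()["data"]

results = []
for model in models:
    start = time.monotonic()
    resp = requests.post(
        f"{LM_STUDIO_URL}/chat/completions",
        json={
            "model": model["id"],
            "messages": [{"role": "user", "content": "Explain overfitting in two sentences."}],
            "temperature": 0.7,   # tweak temperature / top_p per benchmark run
            "top_p": 0.9,
        },
        timeout=300,
    )
    resp.raise_for_status()
    answer = resp.json()["choices"][0]["message"]["content"]
    results.append({
        "model": model["id"],
        "response_seconds": round(time.monotonic() - start, 2),
        "word_count": len(answer.split()),
    })

print(results)  # rows like these are what the workflow appends to Google Sheets
```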
by Arnaud MARIE
**Monthly Spotify Track Archiving and Playlist Classification**

This n8n workflow allows you to automatically archive your monthly Spotify liked tracks in a Google Sheet, along with playlist details and descriptions. Based on this data, Claude 3.5 is used to classify each track into multiple playlists and add them in bulk.

**Who is this template for?**
This workflow template is perfect for Spotify users who want to systematically archive their listening history and organize their tracks into custom playlists.

**What problem does this workflow solve?**
It automates the monthly process of tracking, storing, and categorizing Spotify tracks into relevant playlists, helping users maintain well-organized music collections and keep a historical record of their listening habits.

**Workflow Overview**
- **Trigger Options:** Can be initiated manually or on a set schedule.
- **Spotify Playlists Retrieval:** Fetches the current playlists and filters them by owner.
- **Track Details Collection:** Retrieves information such as track ID and popularity from the user's library.
- **Audio Features Fetching:** Uses Spotify's API to get audio features for each track.
- **Data Merging:** Combines track information with their audio features.
- **Duplicate Checking:** Filters out tracks that have already been logged in Google Sheets.
- **Data Logging:** Archives new tracks into a Google Sheet.
- **AI Classification:** Uses an AI model to classify tracks into suitable playlists.
- **Playlist Updates:** Adds classified tracks to the corresponding playlists.

**Setup Instructions**
- **Credentials Setup:** Make sure you have valid Spotify OAuth2 and Google Sheets access credentials.
- **Trigger Configuration:** Choose between manual or scheduled triggers to start the workflow.
- **Google Sheets Preparation:** Set up a Google Sheet with the necessary structure for logging track details.
- **Spotify Playlists Setup:** Have a diverse range of playlists with exhaustive descriptions (see examples below) ready to accommodate different music genres and moods.

**Customization Options**
- **Adjust Playlist Conditions:** Modify the AI model's classification criteria to align with your personal music preferences.
- **Enhance Track Analysis:** Incorporate additional audio features or external data sources for more refined track categorization.
- **Personalize Data Logging:** Customize which track attributes to log in Google Sheets based on your archival preferences.
- **Configure Scheduling:** Set a preferred schedule for periodic track archiving, e.g., monthly or weekly.

**Cost Estimate**
For 300 tracks, the token usage amounts to approximately 60,000 tokens (58,000 for input and 2,000 for completion), costing around 20 cents with Claude 3.5 Sonnet (as of October 2024).

**Playlists' Description Examples**

| Playlist Name | Playlist Description |
|---|---|
| Classique | Indulge in the timeless beauty of classical music with this refined playlist. From baroque to romantic periods, this collection showcases renowned compositions. |
| Poi | Find your flow with this dynamic playlist tailored for poi, staff, and ball juggling. Featuring rhythmic tracks that complement your movements. |
| Pro Sound | Boost your productivity and focus with this carefully selected mix of concentration-enhancing music. Ideal for work or study sessions. |
| ChillySleep | Drift off to dreamland with this soothing playlist of sleep-inducing tracks. Gentle melodies and ambient sounds create a peaceful atmosphere for restful sleep. |
| To Sing | Warm up your vocal cords and sing your heart out with karaoke-friendly tracks. Featuring popular songs, perfect for solo performances or group sing-alongs. |
| 1990s | Relive the diverse musical landscape of the 90s with this eclectic mix. From grunge to pop, hip-hop to electronic, this playlist showcases defining genres. |
| 1980s | Take a nostalgic trip back to the era of big hair and neon with this 80s playlist. Packed with iconic hits and forgotten gems, capturing the energy of the decade. |
| Groove Up | Elevate your mood and energy with this upbeat playlist. Featuring a mix of feel-good tracks across various genres to lift your spirits and get you moving. |
| Reggae & Dub | Relax and unwind with the laid-back vibes of reggae and dub. This playlist combines classic reggae tunes with deep, spacious dub tracks for a chilled-out vibe. |
| Psytrance | Embark on a mind-bending journey with this collection of psychedelic trance tracks. Ideal for late-night dance sessions or intense focus. |
| Cumbia | Sway to the infectious rhythms of Cumbia with this lively playlist. Blending traditional Latin American sounds with modern interpretations for a danceable mix. |
| Funky Groove | Get your body moving with this collection of funk and disco tracks. Featuring irresistible basslines and catchy rhythms, perfect for dance parties. |
| French Chanson | Experience the romance and charm of France with this mix of classic and modern French songs, capturing the essence of French musical culture. |
| Workout Motivation | Push your limits and power through your exercise routine with this high-energy playlist. From warm-up to cool-down, these tracks will keep you motivated. |
| Cinematic Instrumentals | Immerse yourself in a world of atmospheric sounds with this collection of cinematic instrumental tracks, perfect for focus, relaxation, or contemplation. |
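For reference, the track-collection and audio-features steps map onto two Spotify Web API calls. A hedged sketch — the OAuth token is a placeholder, pagination is omitted, and the workflow's Spotify nodes handle these details for you:

```python
import requests

SPOTIFY_TOKEN = "BQ...placeholder"  # OAuth2 access token with user-library-read scope
HEADERS = {"Authorization": f"Bearer {SPOTIFY_TOKEN}"}

# Fetch the most recent liked (saved) tracks
saved = requests.get(
    "https://api.spotify.com/v1/me/tracks?limit=50", headers=HEADERS, timeout=30
).json()["items"]

track_ids = [item["track"]["id"] for item in saved]

# Fetch audio features for those tracks in one batched call
features = requests.get(
    "https://api.spotify.com/v1/audio-features",
    headers=HEADERS,
    params={"ids": ",".join(track_ids)},
    timeout=30,
).json()["audio_features"]

# Merge track info with its audio features, mirroring the workflow's Merge step
merged = [
    {"name": item["track"]["name"], "popularity": item["track"]["popularity"], **feat}
    for item, feat in zip(saved, features)
]
print(merged[0])
```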
by Zacharia Kimotho
**How it works**
This workflow gets Search Console results data and exports it to Google Sheets. This makes it easier to visualize and do other SEO-related tasks and activities without having to log into Search Console.

**Setup and use**
- Set your desired schedule.
- Enter your desired domain.
- Connect to your Google Sheets or make a copy of this sheet.

**Detailed Setup**
- **Inputs and Outputs:**
  - Input: API response from Google Search Console regarding keywords, page data, and date data.
  - Output: Entries written to Google Sheets containing keyword data, clicks, impressions, CTR, and positions.

**Setup Instructions**
- **Prerequisites:**
  - An n8n instance set up and running.
  - An active Google account with access to Google Search Console and Google Sheets.
  - Google OAuth 2.0 credentials for API access.
- **Step-by-Step Setup:**
  1. Open n8n and create a new workflow.
  2. Add the nodes as described in the JSON.
  3. Configure the Google OAuth2 credentials in n8n to enable API access.
  4. Set your domain in the Set your domain node.
  5. Customize the Google Sheets document URLs to your personal sheets.
  6. Adjust the schedule in the Schedule Trigger node as per your requirements.
  7. Save the workflow.
- **Configuration Options:**
  - You can customize the date ranges in the body of the HttpRequest nodes.
  - Adjust any fields in the Edit Fields nodes based on different data requirements.

**Use Case Examples**
- Useful in tracking website performance over time using Search Console metrics.
- Ideal for digital marketers, SEO specialists, and web analytics professionals.
- Offers value in compiling performance reports for stakeholders or team reviews.

**Running and Troubleshooting**
- **Running the Workflow:** Trigger the workflow manually or wait for the schedule to run it automatically.
- **Monitoring Execution:** Check the execution logs in n8n's dashboard to ensure all nodes complete successfully.
- **Common Issues:**
  - Invalid OAuth credentials – ensure credentials are set up correctly.
  - Incorrect Google Sheets URLs – double-check document links and permissions.
  - Scheduling conflicts – make sure the schedule set does not overlap with other workflows.
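For reference, the HttpRequest nodes hit the Search Console Search Analytics query endpoint. A rough sketch of that request — the site URL, OAuth token, date range, and dimension list are placeholders/assumptions; the template's nodes define the exact body:

```python
import requests

SITE_URL = "sc-domain:example.com"   # placeholder; URL-prefix properties like "https://example.com/" also work
ACCESS_TOKEN = "ya29.placeholder"    # Google OAuth2 token

body = {
    "startDate": "2024-05-01",
    "endDate": "2024-05-31",
    "dimensions": ["query", "page", "date"],
    "rowLimit": 1000,
}
resp = requests.post(
    f"https://www.googleapis.com/webmasters/v3/sites/{requests.utils.quote(SITE_URL, safe='')}"
    "/searchAnalytics/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("rows", []):
    # Each row carries keys, clicks, impressions, ctr, and position —
    # the same fields the workflow writes to Google Sheets.
    print(row["keys"], row["clicks"], row["impressions"], row["ctr"], row["position"])
```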
by Carlos Contreras
**Introduction**
This workflow is designed to create and attach notes or comments to any record in your Odoo instance. It acts as a sub-workflow that can be triggered by a main workflow to log messages or comments in a centralized manner. By leveraging the powerful Odoo API, this template ensures that updates to records are handled efficiently, providing an organized way to document important information related to your business processes.

**Setup Instructions**
1. Import the Workflow: Import the provided JSON file into your n8n instance.
2. Odoo Credentials: Ensure you have valid Odoo API credentials (e.g., "Roodsys Odoo Automation Account") configured in n8n.
3. Node Configuration: Verify that the "Odoo" node (consider renaming it to "Odoo Record Manager" for clarity) is set up with your server details and authentication parameters. Check that the workflow trigger ("When Executed by Another Workflow") is configured to receive input parameters from the parent workflow.
4. Execution Trigger: This workflow is designed to be initiated by another workflow. Make sure the main workflow supplies the required inputs.

**Workflow Details**
- Trigger Node: The workflow begins with the "When Executed by Another Workflow" node, which accepts three inputs:
  - rec_id: A numeric identifier for the Odoo record.
  - message: The text of the comment or note.
  - model: The specific Odoo model (e.g., rs.deployment.action.log) where the note should be attached.
- Odoo Node: The second node in the workflow calls the Odoo API to create a new log message. It maps the inputs as follows: message_type is set to "comment", model is assigned the provided model name, res_id is assigned the record ID (rec_id), and body is assigned the message content.
- Additional Information: A sticky note node is included to provide a brief overview of the workflow's purpose directly within the interface.

**Input Parameters**
- Record ID (rec_id): The unique identifier of the record in Odoo where the note will be added.
- Message (message): The content of the comment or note that is to be logged.
- Model (model): The Odoo model name indicating the context in which the note should be created (e.g., rs.deployment.action.log).

**Usage Examples**
- Internal Logging: Use the workflow to attach internal comments or logs to specific records, such as customer profiles, orders, or deployment logs.
- Audit Trails: Create a comprehensive audit trail by documenting changes or important events in Odoo records.
- Integration with Other Workflows: Link this workflow with other automation processes in n8n (like email notifications, data synchronization, or reporting) to create a seamless integration across your systems.

**Pre-conditions**
- The Odoo instance must be accessible and correctly configured.
- API permissions and user roles should be validated to ensure that the workflow has the necessary access rights.
- The workflow expects inputs from an external trigger or parent workflow.

**Customization & Integration**
This template offers several customization options to tailor it to your needs:
- Field Customization: Modify or add new fields to match your logging or commenting requirements.
- Node Renaming: Rename nodes for better clarity and consistency within your workflow ecosystem.
- Integration Possibilities: Easily integrate this workflow with other processes in n8n, such as triggering notifications or synchronizing data across different systems.

This sub-workflow receives data from a main workflow (for example, a record ID, a message, and the Odoo model) and creates a new note (or comment) in the corresponding Odoo record. Essentially, it acts as a centralized point for logging comments or notes in a specific Odoo model, ensuring that the information remains organized and easy to track. Your model must inherit from `_inherit = ['portal.mixin', 'mail.thread.main.attachment']`.
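Outside of n8n, the same note creation can be expressed against Odoo's external XML-RPC API. A hedged sketch of what the Odoo node effectively does — the server URL, database, credentials, record ID, and target model are placeholders:

```python
import xmlrpc.client

URL = "https://odoo.example.com"   # placeholder Odoo server
DB = "mydb"                        # placeholder database name
USER = "automation@example.com"    # placeholder API user
API_KEY = "odoo-api-key"           # placeholder password / API key

common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, API_KEY, {})

models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")

def add_note(model, rec_id, message):
    """Create a comment-type mail.message attached to the given record."""
    return models.execute_kw(DB, uid, API_KEY, "mail.message", "create", [{
        "message_type": "comment",
        "model": model,        # e.g. "rs.deployment.action.log"
        "res_id": rec_id,
        "body": message,
    }])

note_id = add_note("rs.deployment.action.log", 42, "Deployment finished successfully.")
print(note_id)
```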