by Airtop
## About the Automation

Staying on top of competitor pricing changes can be a full-time job. Manual price tracking is time-consuming and prone to errors, especially when dealing with complex pricing structures and multiple subscription tiers. Paid competitor price monitoring tools like Competera, Visualping, and Fluxguard can be expensive. What if you could automate this process and get instant alerts when competitors adjust their pricing?

## How to Easily Monitor Competitor Pricing

With this automation, you'll learn how to set up an automated price monitoring system using Airtop's built-in node in n8n. By the end, your system will automatically track competitor pricing changes and notify you of any modifications.

## What You'll Need

- A free Airtop API Key
- A Google Sheets account with a copy of this sheet
- URLs of competitors' pricing pages

## Understanding the Process

This automation continuously monitors competitor pricing pages and compares them against your baseline data. The workflow:

- Tracks all pricing plans (monthly, yearly, etc.)
- Monitors feature changes across different tiers
- Detects and logs pricing structure modifications
- Alerts you via Slack when changes are detected

(A sketch of what the comparison step can look like appears at the end of this description.)

## Setting Up Your Automation

We've created a ready-to-use blueprint for seamless price monitoring. Here's how to get started:

1. Connect your Google Sheets
2. Set up your Airtop API connection
3. Define the update frequency

## Customization Options

Enhance the basic template with these popular modifications:

- Add other notification channels (Email, Telegram, etc.)
- Include feature comparison tracking
- Set up threshold-based alerts for significant price changes
- Track historical pricing trends

## Real-World Applications

**Case Study 1:** A B2B SaaS company can use this automation to track competitors' pricing changes. When it identifies a market-wide pricing shift, it can adjust its strategy proactively within minutes.

**Case Study 2:** An online e-commerce retailer automates monitoring of 100+ competitor products, maintaining optimal pricing positions and increasing profit margins.

## Best Practices

To ensure accurate tracking:

- Include detailed baseline data for each pricing tier
- Specify both monthly and annual pricing clearly
- List all features included in each plan
- Update your baseline data whenever you verify changes
- Include any promotional pricing or special offers
- Document currency and regional variations if applicable

Example structure in Google Sheets:

- Competitor: Acme Tools
  - Basic Plan: Monthly $29; Annual $290 ($24.17/mo); Features: 5 users, 10GB storage, basic support
  - Pro Plan: Monthly $79; Annual $790 ($65.83/mo); Features: 20 users, 50GB storage, priority support

## What's Next?

After setting up your price monitoring automation, consider:

- Creating automated competitive analysis reports
- Setting up market trend analysis
- Implementing automatic pricing recommendations
- Expanding monitoring to feature changes

Happy monitoring!
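As a rough illustration (not part of the template itself), a Code node along these lines could diff freshly extracted prices against the baseline rows pulled from Google Sheets. All node and field names here are assumptions; map them to your actual sheet columns:

```javascript
// Hypothetical n8n Code node: diff freshly scraped prices against baseline rows.
// Assumes two upstream nodes: baseline rows from Google Sheets and extracted
// prices from Airtop. Node names and field names are illustrative only.
const baseline = $('Get Baseline Rows').all().map(i => i.json);
const current = $('Extract Prices').all().map(i => i.json);

const changes = [];
for (const row of current) {
  const base = baseline.find(
    b => b.competitor === row.competitor && b.plan === row.plan
  );
  if (!base) continue; // a new plan; could also be flagged as a change
  if (Number(base.monthly_price) !== Number(row.monthly_price)) {
    changes.push({
      competitor: row.competitor,
      plan: row.plan,
      old_price: base.monthly_price,
      new_price: row.monthly_price,
    });
  }
}

// Only emit items when something changed, so the Slack alert branch stays quiet otherwise.
return changes.map(c => ({ json: c }));
```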
by Jitesh Dugar
Streamline client onboarding and project setup from hours to minutes with AI-driven automation. This intelligent workflow eliminates manual coordination, builds proposals, creates projects in Asana, welcomes clients on Slack, and logs everything, ensuring 90% faster onboarding and zero dropped steps.

## What This Workflow Does

Transforms your client onboarding from scattered tools and emails into one seamless automation:

- **Capture Client Details:** Jotform intake form collects client, company, and project information.
- **AI-Powered Analysis:** LangChain AI Agent analyzes the project scope, estimates effort, and recommends team composition.
- **Generate Proposal:** Automatically builds a professional HTML proposal summarizing goals, timeline, and estimated hours.
- **Create Asana Project:** Generates a new project with all key details, milestones, and assigned team members.
- **Slack Collaboration:** Creates a dedicated Slack channel, sends welcome messages, and introduces the project team.
- **Welcome Email:** Sends a personalized onboarding email to the client with a project summary and next steps.
- **CRM Sync:** Creates or updates a HubSpot contact with complete project and client information.
- **Audit Logging:** Logs all onboarding activity to Google Sheets for centralized record-keeping.

## Key Features

- **AI Proposal Generation:** Uses LangChain AI to generate smart project summaries and resource plans.
- **End-to-End Automation:** From form submission to project creation, communication, and CRM logging.
- **Smart Slack Setup:** Automatic channel creation and messaging for internal coordination.
- **Personalized Client Emails:** Beautifully formatted, professional onboarding emails.
- **Asana Integration:** Project creation with dynamic task templates and priorities.
- **Google Sheets Logging:** Instant audit trail for every client submission and generated proposal.
- **CRM Integration:** Automatically syncs client data with HubSpot for sales and account tracking.

## Perfect For

- **Agencies & Service Providers:** Automate client onboarding, proposal creation, and task setup.
- **Consultancies:** Quickly turn client requests into structured projects with assigned resources.
- **Freelancers & Creators:** Impress clients with AI-built proposals and instant communication.
- **Growing Teams:** Scale onboarding without extra admin or coordination time.
- **Operations Teams:** Ensure consistency and transparency across all onboarding activities.

## What You'll Need

### Required Integrations

- **Jotform:** Client intake form (project details, budget, company info). Create your form for free on Jotform using this link.
- **AI Agent:** For analyzing project scope and building proposals.
- **Asana:** Project creation and task assignment.
- **Slack:** For automated client channel creation and internal communication.
- **Gmail:** For onboarding and proposal emails.
- **HubSpot:** CRM contact creation and project linkage.
- **Google Sheets:** For logging all onboarding and AI results.

### Optional Enhancements

- **PDF Generation (PDF Munk):** Convert AI-generated proposals into downloadable PDFs.
- **Slack Interactive Approvals:** Add buttons for internal review before client communication.
- **Performance Dashboard:** Connect Google Sheets data to Looker Studio for tracking onboarding times.
- **Multilingual Support:** Add translation nodes for international clients.
- **File Attachments:** Send proposal PDFs or onboarding kits automatically via Gmail.

## Quick Start

1. **Import Template:** Copy and import the JSON file into your n8n workspace.
2. **Set Up Jotform:** Create a form with fields for client name, company, project name, budget, and requirements.
3. **Add Credentials:** Connect Jotform, AI Agent, Asana, Slack, Gmail, HubSpot, and Google Sheets.
4. **Configure Sheet ID:** Replace YOUR_SHEET_ID in the Google Sheets node.
5. **Customize Proposal HTML:** Edit the AI prompt and branding to reflect your company's style.
6. **Test Workflow:** Submit a test form and verify Slack, Asana, Gmail, and Sheets outputs.
7. **Deploy:** Activate the workflow and share the Jotform link with your sales or operations team.

## Customization Options

1. **Proposal Branding:** Customize proposal HTML with logos, brand colors, and formatting.
2. **AI Prompt Tuning:** Refine the LangChain AI prompt to match your tone or project style.
3. **Task Templates:** Adjust task names, assignees, and due dates in the Asana creation node.
4. **Slack Messaging:** Update welcome message formatting and team introduction details.
5. **CRM Fields:** Map additional HubSpot properties for better data tracking.
6. **Sheet Logging:** Add more columns for tracking team recommendations or proposal scores.

## Expected Results

- **90% Faster Onboarding:** Reduce manual setup from hours to minutes.
- **AI Precision:** Intelligent proposals and team allocations that impress clients instantly.
- **Zero Missed Steps:** Every project automatically created, communicated, and logged.
- **Seamless Collaboration:** Slack, Gmail, and Asana in perfect sync.
- **Complete Transparency:** Every onboarding step logged for accountability and improvement.

## Use Cases

- **Marketing & Creative Agencies:** Automate creative project scoping and proposal creation.
- **Software Development Teams:** Rapidly assess client tech requirements and allocate developers.
- **Consulting Firms:** Build data-backed, AI-enhanced proposals for client engagements.
- **Corporate PMOs:** Standardize project setup and approvals across multiple departments.

## Pro Tips

- **Refine the AI Prompt:** Include examples of past projects to improve proposal quality.
- **Add Slack Approvals:** Insert "manager approval" logic before sending proposals.
- **Attach PDFs:** Use PDF Munk for branded, downloadable proposals.
- **Track Conversion:** Link HubSpot deal stage changes based on Asana progress.
- **Monitor Efficiency:** Use Sheet timestamps to calculate average onboarding time.

## Learning Resources

This workflow demonstrates:

- AI integration using Agents
- Multi-app orchestration and data syncing
- Advanced HTML and email template customization
- Real-world Asana and Slack API usage
- CRM syncing and Google Sheets logging
- Modular, scalable n8n workflow design

## Workflow Structure Visualization

Jotform Submission → AI Project Analysis (Agent) → Proposal Generation (HTML) → Asana Project Creation → Slack Channel Setup & Message → Gmail Welcome Email → HubSpot Contact Creation → Google Sheets Log

## Ready to Revolutionize Client Onboarding?

Import this template today and let AI handle the heavy lifting. Your team saves hours, your clients get instant engagement, and your entire process runs flawlessly.
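To make the audit-logging step concrete, here is a minimal sketch of a Code node that could assemble one log row per submission before the Google Sheets append node. Every node and field name below is an assumption; map them to your actual Jotform fields and AI Agent output:

```javascript
// Hypothetical Code node: build one audit-log row per submission.
// Node names ("Jotform Trigger", "AI Project Analysis") and field names
// are illustrative, not the template's actual identifiers.
const form = $('Jotform Trigger').first().json;
const ai = $('AI Project Analysis').first().json;

return [{
  json: {
    timestamp: new Date().toISOString(),
    client_name: form.clientName,
    company: form.company,
    project_name: form.projectName,
    budget: form.budget,
    estimated_hours: ai.estimatedHours,
    recommended_team: (ai.team || []).join(', '),
    proposal_sent: true,
  },
}];
```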
by Wildkick
# Local Multi-LLM Testing & Performance Tracker

This workflow is perfect for developers, researchers, and data scientists benchmarking multiple LLMs with LM Studio. It dynamically fetches active models, tests prompts, and tracks metrics like word count, readability, and response time, logging results into Google Sheets. Easily adjust temperature and top P for flexible model testing.

**Level of Effort:** Easy. Minimal setup with customizable options.

## Setup Steps

1. Install LM Studio and configure models.
2. Update the IP to connect to LM Studio.
3. Create a Google Sheet for result tracking.

## Key Outcomes

- Benchmark LLM performance.
- Automate results in Google Sheets for easy comparison.

Version 1.0
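For illustration, the per-response metrics could be computed in a Code node roughly like this. The input field names (`response`, `startTime`) are assumptions, and the readability score is a crude Flesch reading-ease approximation, not the template's exact formula:

```javascript
// Hypothetical Code node: compute word count, readability, and response time
// for each model output before logging to Google Sheets.
function fleschReadingEase(text) {
  const words = text.trim().split(/\s+/).filter(Boolean);
  const wordCount = words.length || 1;
  const sentences = text.split(/[.!?]+/).filter(s => s.trim()).length || 1;
  // Crude syllable estimate: count vowel groups per word.
  const syllables = words.reduce(
    (n, w) => n + (w.toLowerCase().match(/[aeiouy]+/g) || ['a']).length, 0) || 1;
  return 206.835 - 1.015 * (wordCount / sentences) - 84.6 * (syllables / wordCount);
}

return $input.all().map(item => {
  const text = item.json.response || '';
  return {
    json: {
      ...item.json,
      word_count: text.trim().split(/\s+/).filter(Boolean).length,
      readability: Math.round(fleschReadingEase(text) * 10) / 10,
      response_ms: Date.now() - item.json.startTime, // startTime captured before the LM Studio call
    },
  };
});
```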
by Keith Rumjahn
## Who's this for?

- If you own a website and need to analyze your Google Analytics data
- If you need to create an SEO report on which pages are getting the most traffic or how your Google search terms are performing
- If you want to grow your site based on suggestions from data

## Use case

Instead of hiring an SEO expert, I run this report weekly. It compares the data from this week to the week before:

- Views based on countries
- The top performing pages
- Google Search Console performance

Watch the YouTube tutorial here. Get my SEO AI agent system here. Read my detailed case study here.

## How it works

The workflow gathers Google Analytics data for the past 7 days, then gathers the data for the week before for comparison. It does this 3 times to get: views per country, engagement per page, and Google Search Console results for organic search. The Google Analytics nodes have already been configured with the correct dimensions and metrics. At the end, the workflow passes the data to openrouter.ai for AI analysis. Finally, it saves the results to Baserow.

## How to use this

1. Input your Google Analytics credentials
2. Input your property ID
3. Input your openrouter.ai credentials
4. Input your Baserow credentials

You will need to create a Baserow database with the columns: Name, Country Views, Page Views, Search Report, Blog (name of your blog).

Created by Rumjahn
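As a sketch of the week-over-week comparison step, a Code node could merge the two Google Analytics pulls like this. Node names and the `country`/`screenPageViews` fields are assumptions based on the description above:

```javascript
// Hypothetical Code node: merge the two GA pulls into a week-over-week view.
// Assumes upstream nodes "GA This Week" and "GA Last Week" returning rows
// keyed by country.
const thisWeek = $('GA This Week').all().map(i => i.json);
const lastWeek = $('GA Last Week').all().map(i => i.json);

return thisWeek.map(row => {
  const prev = lastWeek.find(p => p.country === row.country);
  const before = prev ? Number(prev.screenPageViews) : 0;
  const now = Number(row.screenPageViews);
  return {
    json: {
      country: row.country,
      views_this_week: now,
      views_last_week: before,
      // Percentage change, rounded to one decimal; null when no prior data.
      change_pct: before ? Math.round(((now - before) / before) * 1000) / 10 : null,
    },
  };
});
```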
by Dvir Sharon
# LinkedIn Job Finder Automation using Bright Data API & Google Sheets

A comprehensive n8n automation that searches LinkedIn job postings using Bright Data's API and automatically organizes results in Google Sheets for efficient job hunting and recruitment workflows.

## Overview

This workflow provides an automated LinkedIn job search solution that collects job postings based on your search criteria and organizes them in Google Sheets. Perfect for job seekers, recruiters, HR professionals, and talent acquisition teams.

## Key Features

- **Smart Job Search:** Form-based input for city, job title, country, and job type
- **LinkedIn Integration:** Uses Bright Data's LinkedIn dataset for accurate job posting data
- **Automated Organization:** Populates Google Sheets with structured job data
- **Real-time Processing:** Processes job search requests in real-time
- **Data Storage:** Stores job details including company info, locations, and apply links
- **Batch Processing:** Handles multiple job postings efficiently
- **Fast & Reliable:** Built-in error handling for scraping
- **Customizable Filters:** Advanced job filtering based on criteria

## What This Workflow Does

### Input

- **Job Search Criteria:** City, job title, country, and optional job type
- **Search Parameters:** Configurable filters and limits
- **Output Preferences:** Google Sheets destination

### Processing Steps

1. Form Submission
2. Data Request to Bright Data API
3. Status Monitoring
4. Data Extraction
5. Data Filtering
6. Sheet Update
7. Error Handling

### Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title from posting | Senior Software Engineer |
| Company Name | Employer company name | Tech Solutions Inc. |
| Job Detail | Job summary/description | Remote position requiring 5+ years… |
| Location | Job location | San Francisco, CA |
| Company URL | Company profile link | View Profile |
| Apply Link | Direct application link | Apply Now |

## Setup Instructions

### Prerequisites

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with LinkedIn dataset access

### Steps

1. **Import the Workflow:** Use JSON import in n8n
2. **Configure Bright Data:** Add API credentials and dataset ID
3. **Configure Google Sheets:** Create a sheet, set credentials, map columns
4. **Update Workflow Settings:** Replace placeholders with your actual data
5. **Test & Activate:** Submit a test form and verify data in Google Sheets

## Usage Guide

### Submitting Job Searches

Go to your webhook URL and fill in the form with:

- **City:** e.g., New York
- **Job Title:** e.g., Software Engineer
- **Country:** e.g., US
- **Job Type:** Optional (Full-Time, Remote, etc.)

### Understanding Results

- Comprehensive job data
- Company info and profile links
- Direct application links
- Location and job descriptions

### Customizing Search Parameters

Edit the Create Snapshot ID node to change:

- Time range (e.g., "Past month")
- Result limits
- Company filters

## Customization Options

- **More Data Points:** Add salary, seniority, applicants, etc.
- **Custom Form Fields:** Add filters for salary, experience, industry
- **Multiple Sheets:** Route results by job type or location

## Troubleshooting

- **Bright Data connection failed:** Check API credentials and dataset access
- **No job data extracted:** Verify search parameters and API limits
- **Google Sheets permission denied:** Re-authenticate and check sharing
- **Form not working:** Check the webhook URL and field mappings
- **Filter issues:** Review logic and data types
- **Execution failed:** Check logs, retry logic, and network status

## Use Cases & Examples

- **Job Seeker Dashboard:** Automate job search and track applications
- **Recruitment Pipeline:** Source candidates and monitor hiring trends
- **Market Research:** Analyze job trends and salary benchmarks
- **HR Analytics:** Support workforce planning and competitive insights

## Advanced Configuration

- **Batch Processing:** Queue multiple searches with delays
- **Search History:** Track and analyze past searches
- **Tool Integration:** Connect to CRM, Slack, databases, BI tools

## Performance & Limits

- **Processing Time:** 30–60 seconds per search
- **Concurrent Requests:** 2–3 (depends on Bright Data plan)
- **Data Accuracy:** 95%+
- **Success Rate:** 90%+
- **Daily Capacity:** 50–200 searches
- **Memory:** ~50MB per execution
- **API Calls:** 3–4 Bright Data + 1 Google Sheets per search

## Support & Community

- **n8n Community:** community.n8n.io
- **Documentation:** docs.n8n.io
- **Bright Data Support:** Via your Bright Data dashboard
- **GitHub Issues:** Report bugs and request features

## Ready to Use!

Your workflow is ready for automated LinkedIn job searching. Customize it to your recruiting or job search needs.

Webhook URL: https://your-n8n-instance.com/webhook/linkedin-job-finder

### What Gets Extracted

- Job Title
- Company Information
- Location Data
- Job Details
- Application Links
- Processing Timestamps

### Use Cases

- Job Search Automation
- Recruitment Intelligence
- Market Research
- HR Analytics
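For a sense of what the status-monitoring step involves, here is a hedged sketch of polling a Bright Data snapshot from a Code node. The endpoint paths, field names, and environment variable are assumptions; verify them against the dataset API your Bright Data plan exposes:

```javascript
// Hypothetical Code node: poll the Bright Data snapshot until it is ready,
// then return its ID for the download request. Endpoint and status values
// are assumptions; check your Bright Data dashboard for the exact API.
const snapshotId = $('Create Snapshot ID').first().json.snapshot_id;

let status = 'running';
while (status === 'running') {
  const res = await this.helpers.httpRequest({
    method: 'GET',
    url: `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`,
    headers: { Authorization: `Bearer ${$env.BRIGHTDATA_API_KEY}` }, // assumed env var
    json: true,
  });
  status = res.status;
  if (status === 'running') {
    await new Promise(r => setTimeout(r, 10000)); // wait 10s between polls
  }
}

return [{ json: { snapshotId, status } }];
```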
by Kirill Khatkevich
This workflow is a comprehensive solution for digital marketers, performance agencies, and e-commerce brands looking to scale their creative testing process on Meta Ads efficiently. It eliminates the tedious manual work of uploading assets, creating campaigns, and setting up ads one by one.

## Use Case

Manually launching weekly creative tests is time-consuming and prone to errors. This workflow solves that problem by creating a fully automated pipeline: from a creative asset in a folder to a complete, ready-to-launch (but paused) ad structure in your Meta Ads account. It's perfect for teams that want to:

- Save hours of manual work every week.
- Systematically test a high volume of creatives.
- Maintain a structured and consistent campaign naming convention.
- Keep a detailed log of all created assets for data-driven performance analysis.

## How it Works

The workflow is structured into four logical blocks:

1. **Configuration & Scheduling:** The workflow runs on a weekly schedule. A central "Configuration" Set node at the beginning holds all key variables (Ad Account ID, Page ID, Pixel ID), making it incredibly easy to adapt the template for different projects.
2. **Creative Ingestion & Processing:** It scans a specific Google Drive folder for new image and video files. Using an IF node, it branches the logic based on the file type. Each file is uploaded to the Meta Ads library, and a corresponding Ad Creative is built with a pre-defined destination URL.
3. **Campaign & Ad Set Assembly:** The workflow creates a single new Campaign with an OUTCOME_SALES objective. It then creates a single Ad Set optimized for OFFSITE_CONVERSIONS (e.g., "Add to Cart"), using the Pixel ID from the configuration. A Merge node intelligently combines the single Ad Set ID with every creative processed in the previous block, preparing the data for the final step.
4. **Ad Creation & Data Logging:** The workflow iterates through the prepared data, creating a unique Ad for each creative. Upon the successful creation of each ad, a new row is appended to a Google Sheet, logging all relevant IDs (CampaignID, AdSetID, AdID, CreativeID) and metadata for a complete audit trail.

## Setup Instructions

To use this template, you need to configure a few key nodes.

1. **Credentials:**
   - Connect your Meta Ads account.
   - Connect your Google account (for both Drive and Sheets).
2. **The ⚙️ Configuration Node (Set node):** This is the most important step. Open the first Set node and fill in your specific values:
   - adAccountId: Your Meta Ad Account ID.
   - pageId: The ID of the Facebook Page you're advertising for.
   - pixelId: Your Meta Pixel ID for conversion tracking.
3. **Google Sheets Node (Save Full Report to Sheet):** Select your spreadsheet and the specific sheet where you want to save the reports. Make sure your sheet has columns with the following headers: CampaignID, AdSetID, AdID, CreativeID, FileName, MimeType, Timestamp.
4. **Check URLs and IDs in HTTP Request Nodes:** The template is configured to use the variables from the ⚙️ Configuration node. Double-check that the URLs in the Create Campaign, Create Ad Set, and Create ... Creative nodes correctly reference these variables (e.g., .../act_{{ $('⚙️ Configuration Meta Ads').item.json.adAccountId }}/campaigns). Verify the link in the Create Video Creative and Create Image Creative nodes points to your desired landing page.
5. **Activate the Workflow:** Set your desired schedule in the Schedule Trigger node. Save and activate the workflow.

## Further Ideas & Customization

This workflow is a powerful foundation.
You can easily extend it to:

- **Create a second workflow** that runs a week later, reads the Google Sheet, and pulls performance data for all the ads created.
- **A/B test ad copy** by adding different text variations from a spreadsheet.
- **Add a Slack or email notification** at the end to confirm that the weekly campaign launch was successful.
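To illustrate what the Create Campaign HTTP Request node sends, here is a sketch of the request shape against the Meta Marketing API. The Graph API version, campaign name pattern, and exact field set are assumptions; align them with the versions configured in your copy of the template:

```javascript
// Illustrative shape of the Create Campaign request described above.
// The node name (including the gear emoji) matches the template's
// Configuration node; everything else is an example.
const cfg = $('⚙️ Configuration Meta Ads').first().json;

const request = {
  method: 'POST',
  url: `https://graph.facebook.com/v19.0/act_${cfg.adAccountId}/campaigns`,
  body: {
    name: `Creative Test | ${new Date().toISOString().slice(0, 10)}`,
    objective: 'OUTCOME_SALES',
    status: 'PAUSED',          // created paused, as described above
    special_ad_categories: [], // required field, empty for most advertisers
  },
};

return [{ json: request }];
```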
by Jimleuk
This n8n template monitors active support issues in Linear.app, tracking the mood of the ongoing conversation between reporter and assignee using Sentiment Analysis. When sentiment dips into the negative, a notification is sent via Slack to alert the team.

## How it works

- A scheduled trigger is used to fetch recently updated issues in Linear using the GraphQL node.
- Each issue's comment thread is passed into a simple Information Extractor node to identify the overall sentiment.
- The resulting sentiment analysis, combined with some issue details, is uploaded to Airtable for review.
- When the template is re-run at a later date, each issue is re-analysed for sentiment.
- Each issue's new sentiment state is saved to Airtable, whilst its previous state is moved to the "previous sentiment" column.
- An Airtable trigger is used to watch for recently updated rows.
- Each matching Airtable row is filtered to check if it had a previous non-negative state but now has a negative state as its current sentiment (see the sketch below).
- The results are sent via notification to a team Slack channel for priority attention.

Check out the sample Airtable here: https://airtable.com/appViDaeaFw4qv9La/shrq6HgeYzpW6uwXL

## How to use

- Modify the GraphQL filter to fetch issues for a relevant issue type, team, or person.
- Update the Slack channel to ensure messages are sent to the correct location or persons.
- The Airtable also serves as a snapshot of sentiment across support tickets for a given period. It's possible to use this to assess daily operations.

## Requirements

- Linear for issue tracking (but feel free to use another system if preferred)
- Airtable for the database
- OpenAI for the LLM and Sentiment Analysis

## Customising the workflow

- Add more granular levels of sentiment to reduce the number of alerts.
- Explore different types of sentiment based on issue types and customer types. This may help prioritise alerts and responses.
- Run across teams or categories of issues to get an overview of sentiment across the support organisation.
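The alert condition can be expressed in a few lines. This is a hypothetical Code node version of the filter; the column names ("Sentiment", "Previous Sentiment") are assumptions based on the description above:

```javascript
// Hypothetical Code node: keep only rows whose sentiment just turned negative,
// i.e. previously non-negative but negative now.
return $input.all().filter(item => {
  const current = (item.json['Sentiment'] || '').toLowerCase();
  const previous = (item.json['Previous Sentiment'] || '').toLowerCase();
  return current === 'negative' && previous !== 'negative';
});
```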
by L Hùng
This workflow acts as an error handler, sending real-time notifications to Telegram when another workflow fails. It provides detailed error information, including the workflow name, timestamp, execution URL, last executed node, and error message.

## Pre-Conditions

- A Telegram bot created via BotFather.
- The bot token and the Telegram group/channel chatId.
- An active n8n instance with the Telegram and Error Trigger nodes installed.

## Setup

1. **Workflow Configuration:**
   - Import the workflow into n8n.
   - Update the Telegram chatId in the Config node.
   - Add your Telegram bot token in the Telegram node credentials.
2. **Error Workflow Setup:** Set this workflow as the Error Workflow in other workflows.
3. **Testing:** Trigger an error in another workflow to verify Telegram notifications.

## Who the Workflow is For

- **Developers:** Monitoring workflow failures in real-time.
- **Teams:** Managing multiple n8n workflows and needing instant error alerts.
- **n8n Users:** Looking for a simple way to handle workflow errors via Telegram.

## Primary Use

- Automates error notifications for failed workflows.
- Sends detailed error reports to Telegram for quick troubleshooting.
- Easily customizable to fit specific monitoring needs.
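As a sketch, the message sent to Telegram could be assembled from the Error Trigger output like this. The `workflow` and `execution` fields follow n8n's documented Error Trigger payload, but double-check them against your n8n version:

```javascript
// Hypothetical Code node: format the Error Trigger payload into one
// Telegram-ready message string.
const data = $input.first().json;

const text = [
  `Workflow failed: ${data.workflow.name}`,
  `Time: ${new Date().toISOString()}`,
  `Last node: ${data.execution.lastNodeExecuted}`,
  `Error: ${data.execution.error.message}`,
  `Execution: ${data.execution.url}`, // present when executions are saved
].join('\n');

return [{ json: { text } }];
```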
by Arnaud MARIE
# Monthly Spotify Track Archiving and Playlist Classification

This n8n workflow allows you to automatically archive your monthly Spotify liked tracks in a Google Sheet, along with playlist details and descriptions. Based on this data, Claude 3.5 is used to classify each track into multiple playlists and add them in bulk.

## Who is this template for?

This workflow template is perfect for Spotify users who want to systematically archive their listening history and organize their tracks into custom playlists.

## What problem does this workflow solve?

It automates the monthly process of tracking, storing, and categorizing Spotify tracks into relevant playlists, helping users maintain well-organized music collections and keep a historical record of their listening habits.

## Workflow Overview

- **Trigger Options:** Can be initiated manually or on a set schedule.
- **Spotify Playlists Retrieval:** Fetches the current playlists and filters them by owner.
- **Track Details Collection:** Retrieves information such as track ID and popularity from the user's library.
- **Audio Features Fetching:** Uses Spotify's API to get audio features for each track.
- **Data Merging:** Combines track information with their audio features.
- **Duplicate Checking:** Filters out tracks that have already been logged in Google Sheets.
- **Data Logging:** Archives new tracks into a Google Sheet.
- **AI Classification:** Uses an AI model to classify tracks into suitable playlists.
- **Playlist Updates:** Adds classified tracks to the corresponding playlists.

## Setup Instructions

1. **Credentials Setup:** Make sure you have valid Spotify OAuth2 and Google Sheets access credentials.
2. **Trigger Configuration:** Choose between manual or scheduled triggers to start the workflow.
3. **Google Sheets Preparation:** Set up a Google Sheet with the necessary structure for logging track details.
4. **Spotify Playlists Setup:** Have a diverse range of playlists with exhaustive descriptions (see examples below) ready to accommodate different music genres and moods.

## Customization Options

- **Adjust Playlist Conditions:** Modify the AI model's classification criteria to align with your personal music preferences.
- **Enhance Track Analysis:** Incorporate additional audio features or external data sources for more refined track categorization.
- **Personalize Data Logging:** Customize which track attributes to log in Google Sheets based on your archival preferences.
- **Configure Scheduling:** Set a preferred schedule for periodic track archiving, e.g., monthly or weekly.

## Cost Estimate

For 300 tracks, the token usage amounts to approximately 60,000 tokens (58,000 for input and 2,000 for completion), costing around 20 cents with Claude 3.5 Sonnet (as of October 2024).

## Playlists' Description Examples

| Playlist Name | Playlist Description |
|---------------|----------------------|
| Classique | Indulge in the timeless beauty of classical music with this refined playlist. From baroque to romantic periods, this collection showcases renowned compositions. |
| Poi | Find your flow with this dynamic playlist tailored for poi, staff, and ball juggling. Featuring rhythmic tracks that complement your movements. |
| Pro Sound | Boost your productivity and focus with this carefully selected mix of concentration-enhancing music. Ideal for work or study sessions. |
| ChillySleep | Drift off to dreamland with this soothing playlist of sleep-inducing tracks. Gentle melodies and ambient sounds create a peaceful atmosphere for restful sleep. |
| To Sing | Warm up your vocal cords and sing your heart out with karaoke-friendly tracks. Featuring popular songs, perfect for solo performances or group sing-alongs. |
| 1990s | Relive the diverse musical landscape of the 90s with this eclectic mix. From grunge to pop, hip-hop to electronic, this playlist showcases defining genres. |
| 1980s | Take a nostalgic trip back to the era of big hair and neon with this 80s playlist. Packed with iconic hits and forgotten gems, capturing the energy of the decade. |
| Groove Up | Elevate your mood and energy with this upbeat playlist. Featuring a mix of feel-good tracks across various genres to lift your spirits and get you moving. |
| Reggae & Dub | Relax and unwind with the laid-back vibes of reggae and dub. This playlist combines classic reggae tunes with deep, spacious dub tracks for a chilled-out vibe. |
| Psytrance | Embark on a mind-bending journey with this collection of psychedelic trance tracks. Ideal for late-night dance sessions or intense focus. |
| Cumbia | Sway to the infectious rhythms of Cumbia with this lively playlist. Blending traditional Latin American sounds with modern interpretations for a danceable mix. |
| Funky Groove | Get your body moving with this collection of funk and disco tracks. Featuring irresistible basslines and catchy rhythms, perfect for dance parties. |
| French Chanson | Experience the romance and charm of France with this mix of classic and modern French songs, capturing the essence of French musical culture. |
| Workout Motivation | Push your limits and power through your exercise routine with this high-energy playlist. From warm-up to cool-down, these tracks will keep you motivated. |
| Cinematic Instrumentals | Immerse yourself in a world of atmospheric sounds with this collection of cinematic instrumental tracks, perfect for focus, relaxation, or contemplation. |
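The duplicate-checking step can be illustrated with a short Code node. The node name ("Read Archive Sheet") and the `track_id` column are assumptions, not the template's actual identifiers:

```javascript
// Hypothetical Code node for duplicate checking: drop tracks whose Spotify ID
// already appears in the archive sheet.
const archived = new Set(
  $('Read Archive Sheet').all().map(i => i.json.track_id)
);

return $input.all().filter(item => !archived.has(item.json.track_id));
```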
by Zacharia Kimotho
## How it works

This workflow fetches Search Console results data and exports it to Google Sheets. This makes it easier to visualize and perform other SEO-related tasks and activities without having to log into Search Console.

## Setup and use

1. Set your desired schedule.
2. Enter your desired domain.
3. Connect to your Google Sheets or make a copy of this sheet.

## Detailed Setup

**Inputs and Outputs:**

- Input: API response from Google Search Console regarding keywords, page data, and date data.
- Output: Entries written to Google Sheets containing keyword data, clicks, impressions, CTR, and positions.

### Setup Instructions

**Prerequisites:**

- An n8n instance set up and running.
- An active Google Account with access to Google Search Console and Google Sheets.
- Google OAuth 2.0 credentials for API access.

**Step-by-Step Setup:**

1. Open n8n and create a new workflow.
2. Add the nodes as described in the JSON.
3. Configure the Google OAuth2 credentials in n8n to enable API access.
4. Set your domain in the Set your domain node.
5. Customize the Google Sheets document URLs to your personal sheets.
6. Adjust the schedule in the Schedule Trigger node as per your requirements.
7. Save the workflow.

**Configuration Options:**

- You can customize the date ranges in the body of the HTTP Request nodes, as sketched below.
- Adjust any fields in the Edit Fields nodes based on different data requirements.

### Use Case Examples

- Useful for tracking website performance over time using Search Console metrics.
- Ideal for digital marketers, SEO specialists, and web analytics professionals.
- Offers value in compiling performance reports for stakeholders or team reviews.

### Running and Troubleshooting

**Running the Workflow:** Trigger the workflow manually or wait for the schedule to run it automatically.

**Monitoring Execution:** Check the execution logs in n8n's dashboard to ensure all nodes complete successfully.

**Common Issues:**

- Invalid OAuth credentials: ensure credentials are set up correctly.
- Incorrect Google Sheets URLs: double-check document links and permissions.
- Scheduling conflicts: make sure the schedule set does not overlap with other workflows.
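For reference, the HTTP Request node body targets the Search Console Search Analytics query endpoint, along these lines. The dates, dimensions, and row limit below are examples to customize, and the node name is an assumption:

```javascript
// Illustrative shape of the Search Analytics request the HTTP Request node
// sends. The endpoint is the standard Search Console v3 query URL; adjust
// startDate/endDate, dimensions, and rowLimit to your needs.
const siteUrl = encodeURIComponent($('Set your domain').first().json.domain);

const request = {
  method: 'POST',
  url: `https://www.googleapis.com/webmasters/v3/sites/${siteUrl}/searchAnalytics/query`,
  body: {
    startDate: '2024-01-01', // customize the date range here
    endDate: '2024-01-31',
    dimensions: ['query', 'page'],
    rowLimit: 250,
  },
};

return [{ json: request }];
```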
by Carlos Contreras
## Introduction

This workflow is designed to create and attach notes or comments to any record in your Odoo instance. It acts as a sub-workflow that can be triggered by a main workflow to log messages or comments in a centralized manner. By leveraging the powerful Odoo API, this template ensures that updates to records are handled efficiently, providing an organized way to document important information related to your business processes.

## Setup Instructions

1. **Import the Workflow:** Import the provided JSON file into your n8n instance.
2. **Odoo Credentials:** Ensure you have valid Odoo API credentials (e.g., "Roodsys Odoo Automation Account") configured in n8n.
3. **Node Configuration:**
   - Verify that the "Odoo" node (consider renaming it to "Odoo Record Manager" for clarity) is set up with your server details and authentication parameters.
   - Check that the workflow trigger ("When Executed by Another Workflow") is configured to receive input parameters from the parent workflow.
4. **Execution Trigger:** This workflow is designed to be initiated by another workflow. Make sure the main workflow supplies the required inputs.

## Workflow Details

**Trigger Node:** The workflow begins with the "When Executed by Another Workflow" node, which accepts three inputs:

- rec_id: A numeric identifier for the Odoo record.
- message: The text of the comment or note.
- model: The specific Odoo model (e.g., rs.deployment.action.log) where the note should be attached.

**Odoo Node:** The second node in the workflow calls the Odoo API to create a new log message. It maps the inputs as follows:

- message_type is set to "comment".
- model is assigned the provided model name.
- res_id is assigned the record ID (rec_id).
- body is assigned the message content.

**Additional Information:** A sticky note node is included to provide a brief overview of the workflow's purpose directly within the interface.

## Input Parameters

- **Record ID (rec_id):** The unique identifier of the record in Odoo where the note will be added.
- **Message (message):** The content of the comment or note that is to be logged.
- **Model (model):** The Odoo model name indicating the context in which the note should be created (e.g., rs.deployment.action.log).

## Usage Examples

- **Internal Logging:** Use the workflow to attach internal comments or logs to specific records, such as customer profiles, orders, or deployment logs.
- **Audit Trails:** Create a comprehensive audit trail by documenting changes or important events in Odoo records.
- **Integration with Other Workflows:** Link this workflow with other automation processes in n8n (like email notifications, data synchronization, or reporting) to create a seamless integration across your systems.

## Pre-conditions

- The Odoo instance must be accessible and correctly configured.
- API permissions and user roles should be validated to ensure that the workflow has the necessary access rights.
- The workflow expects inputs from an external trigger or parent workflow.

## Customization & Integration

This template offers several customization options to tailor it to your needs:

- **Field Customization:** Modify or add new fields to match your logging or commenting requirements.
- **Node Renaming:** Rename nodes for better clarity and consistency within your workflow ecosystem.
- **Integration Possibilities:** Easily integrate this workflow with other processes in n8n, such as triggering notifications or synchronizing data across different systems.

This sub-workflow receives data from a main workflow (for example, a record ID, a message, and the Odoo model) and creates a new note (or comment) in the corresponding Odoo record.
Essentially, it acts as a centralized point for logging comments or notes in a specific Odoo model, ensuring that the information remains organized and easy to track. Your model must inherit from `_inherit = ['portal.mixin', 'mail.thread.main.attachment']`.
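To show how a parent workflow would feed this sub-workflow, here is a minimal sketch of a Code node placed just before an Execute Workflow node. The values are illustrative; `model` must be a model that inherits the mixins above:

```javascript
// Minimal sketch: prepare the three inputs this sub-workflow expects
// (rec_id, message, model) in the calling workflow.
return [{
  json: {
    rec_id: 42,                          // ID of the target Odoo record (example)
    message: 'Deployment finished without errors.',
    model: 'rs.deployment.action.log',   // example model from the description
  },
}];
```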
by Greg Evseev
This workflow template provides a robust solution for efficiently sending multiple prompts to Anthropic's Claude models in a single batch request and retrieving the results. It leverages the Anthropic Batch API endpoint (/v1/messages/batches) for optimized processing and outputs each result as a separate item.

## Core Functionality & Example Usage Included

This template includes:

1. **The Core Batch Processing Workflow:** Designed to be called by another n8n workflow.
2. **An Example Usage Workflow:** A separate branch demonstrating how to prepare data and trigger the core workflow, including examples using simple strings and n8n's Langchain Chat Memory nodes.

## Who is this for?

This template is designed for:

- **Developers, data scientists, and researchers** who need to process large volumes of text prompts using Claude models via n8n.
- **Content creators** looking to generate multiple pieces of content (e.g., summaries, Q&As, creative text) based on different inputs simultaneously.
- **n8n users** who want to automate interactions with the Anthropic API beyond single requests, improve efficiency, and integrate batch processing into larger automation sequences.
- Anyone needing to perform bulk text generation or analysis tasks with Claude programmatically.

## What problem does this workflow solve?

Sending prompts to language models one by one can be slow and inefficient, especially when dealing with hundreds or thousands of requests. This workflow addresses that by:

- **Batching:** Grouping multiple prompts into a single API call to Anthropic's dedicated batch endpoint (/v1/messages/batches).
- **Efficiency:** Significantly reducing the time required compared to sequential processing.
- **Scalability:** Handling large numbers of prompts (up to API limits) systematically.
- **Automation:** Providing a ready-to-use, callable n8n structure for batch interactions with Claude.
- **Structured Output:** Parsing the results and outputting each individual prompt's result as a separate n8n item.

Use cases include bulk content generation (e.g., product descriptions, summaries), large-scale question answering based on different contexts, sentiment analysis or data extraction across multiple text snippets, and running the same prompt against many different inputs for research or testing.

## What the Core Workflow does

(Triggered by the 'When Executed by Another Workflow' node.)

1. **Receive Input:** The workflow starts when called by another workflow (e.g., using the 'Execute Workflow' node). It expects input data containing:
   - anthropic-version (string, e.g., "2023-06-01")
   - requests (JSON array, where each object represents a single prompt request conforming to the Anthropic Batch API schema).
2. **Submit Batch Job:** Sends the formatted requests data via POST to the Anthropic API /v1/messages/batches endpoint to create a new batch job. Requires Anthropic credentials.
3. **Wait & Poll:** Enters a loop:
   - Checks if the processing_status of the batch job is ended.
   - If not ended, it waits for a set interval (10 seconds by default in the 'Batch Status Poll Interval' node).
   - It then checks the batch job status again via GET to /v1/messages/batches/{batch_id}. Requires Anthropic credentials.
   - This loop continues until the status is ended.
4. **Retrieve Results:** Once the batch job is complete, it fetches the results file by making a GET request to the results_url provided in the batch status response. Requires Anthropic credentials.
5. **Parse Results:** The results are typically returned in JSON Lines (.jsonl) format.
   The 'Parse response' Code node splits the response text by newlines and parses each line into a separate JSON object, storing them in an array field (e.g., parsed).
6. **Split Output:** The 'Split Out Parsed Results' node takes the array of parsed results and outputs each result object as an individual item from the workflow.

## Prerequisites

- An active n8n instance (Cloud or self-hosted).
- An Anthropic API account with access granted to Claude models and the Batch API.
- Your Anthropic API Key.
- Basic understanding of n8n concepts (nodes, workflows, credentials, expressions, the 'Execute Workflow' node).
- Familiarity with JSON data structures for providing input prompts and understanding the output.
- Understanding of the Anthropic Batch API request/response structure.
- (For the Example Usage branch) Familiarity with n8n's Langchain nodes (@n8n/n8n-nodes-langchain) if you plan to adapt that part.

## Setup

1. **Import Template:** Add this template to your n8n instance.
2. **Configure Credentials:**
   - Navigate to the 'Credentials' section in your n8n instance.
   - Click 'Add Credential'.
   - Search for 'Anthropic' and select the Anthropic API credential type.
   - Enter your Anthropic API Key and save the credential (e.g., name it "Anthropic account").
3. **Assign Credentials:** Open the workflow and locate the three HTTP Request nodes in the core workflow:
   - Submit batch
   - Check batch status
   - Get results
   In each of these nodes, select the Anthropic credential you just configured from the 'Credential for Anthropic API' dropdown.
4. **Review Input Format:** Understand the required input structure for the When Executed by Another Workflow trigger node. The primary inputs are anthropic-version (string) and requests (array). Refer to the Sticky Notes in the template and the Anthropic Batch API documentation for the exact schema required within the requests array.
5. **Activate Workflow:** Save and activate the core workflow so it can be called by other workflows.

➡️ **Quick Start & Input/Output Examples:** Look for the Sticky Notes within the workflow canvas! They provide crucial information, including examples of the required input JSON structure and the expected output format.

## How to customize this workflow

- **Input Source:** The core workflow is designed to be called. You will build *another* workflow that prepares the anthropic-version and requests array and then uses the 'Execute Workflow' node to trigger this template. The included example branch shows how to prepare this data.
- **Model Selection & Parameters:** Model (claude-3-opus-20240229, etc.), max_tokens, temperature, and other parameters are defined *within each object* inside the requests array you pass to the workflow trigger. You configure these in the workflow calling this template.
- **Polling Interval:** Modify the 'Wait' node ('Batch Status Poll Interval') duration if you need faster or slower status checks (default is 10 seconds). Be mindful of potential rate limits.
- **Parsing Logic:** If Anthropic changes the result format or you have specific needs, modify the JavaScript code within the 'Parse response' Code node.
- **Error Handling:** Enhance the workflow with more specific error handling for API failures (e.g., using 'Error Trigger' or checking HTTP status codes) or batch processing issues (batch.status === 'failed').
- **Output Processing:** In the workflow that *calls* this template, add nodes after the 'Execute Workflow' node to process the individual result items returned (e.g., save to a database, spreadsheet, send notifications).
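For orientation, the parsing step boils down to a few lines of JavaScript. This is a minimal sketch of what the 'Parse response' node does; the input field name (`data`) is an assumption, so adapt it to the actual response field in your instance:

```javascript
// Minimal sketch of the JSONL parsing step: split the results body on
// newlines and parse each non-empty line into its own JSON object.
const raw = $input.first().json.data || '';

const parsed = raw
  .split('\n')
  .filter(line => line.trim().length > 0)
  .map(line => JSON.parse(line));

return [{ json: { parsed } }];
```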
## Example Usage Branch (Manual Trigger)

This template also contains a separate branch starting with the Run example Manual Trigger node.

- **Purpose:** This branch demonstrates how to construct the necessary anthropic-version and requests array payload.
- **Methods Shown:** It includes steps for:
  - Creating a request object from a simple query string.
  - Creating a request object using data from n8n's Langchain Chat Memory nodes (@n8n/n8n-nodes-langchain).
- **Execution:** It merges these examples, constructs the final payload, and then uses the Execute Workflow node to call the main batch processing logic described above. It finishes by filtering the results for demonstration.
- **Note:** This branch is for demonstration and testing. You would typically build your own data preparation logic in a separate workflow. The use of Langchain nodes is optional for the core batch functionality.

## Notes

- **API Limits:** According to the Anthropic API documentation, batches can contain up to 100,000 requests and be up to 256 MB in total size. Ensure your n8n instance has sufficient resources for large batches.
- **API Costs:** Using the Anthropic API, including the Batch API, incurs costs based on token usage. Monitor your usage via the Anthropic dashboard.
- **Completion Time:** Batch processing time depends on the number and complexity of prompts and current API load. The polling mechanism accounts for this variability.
- **Versioning:** Always include the anthropic-version header in your requests, as shown in the workflow and examples. Refer to the Anthropic API versioning documentation.
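As a closing illustration, a calling workflow might hand the trigger a payload shaped like this. The entries follow the Anthropic Batch API schema (custom_id plus params); the model name, token limits, and prompts are examples only:

```javascript
// Hypothetical Code node in a calling workflow: build the input payload
// for the 'Execute Workflow' node that triggers this template.
return [{
  json: {
    'anthropic-version': '2023-06-01',
    requests: [
      {
        custom_id: 'prompt-001',
        params: {
          model: 'claude-3-opus-20240229',
          max_tokens: 1024,
          messages: [{ role: 'user', content: 'Summarize the plot of Hamlet in two sentences.' }],
        },
      },
      {
        custom_id: 'prompt-002',
        params: {
          model: 'claude-3-opus-20240229',
          max_tokens: 1024,
          messages: [{ role: 'user', content: 'List three uses for batch processing.' }],
        },
      },
    ],
  },
}];
```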