by Robert Breen
This n8n workflow template automatically processes phone interview transcripts using AI to evaluate candidates against specific criteria and saves the results to Google Sheets. Perfect for HR departments, recruitment agencies, or any business conducting phone screenings.

## What This Workflow Does

This automated workflow:
- Receives phone interview transcripts via webhook
- Uses OpenAI GPT models to analyze candidate responses against predefined qualification criteria
- Extracts key information (name, phone, location, qualification status)
- Automatically saves structured results to a Google Sheet for easy review and follow-up

The workflow is specifically designed for driving job interviews but can easily be adapted to any position with custom evaluation criteria.

## Tools & Services Used

- **n8n** - Workflow automation platform
- **OpenAI API** - AI-powered transcript analysis (GPT-4o-mini)
- **Google Sheets** - Data storage and management
- **Webhook** - Receiving transcript data

## Prerequisites

Before implementing this workflow, you'll need:
- **n8n instance** - Self-hosted or cloud version
- **OpenAI API account** - For AI transcript processing
- **Google account** - For Google Sheets integration
- **Phone interview system** - One that can send webhooks (like Vapi.ai)

## Step-by-Step Setup Instructions

### Step 1: Set Up OpenAI API Access

1. Visit OpenAI's API platform
2. Create an account or log in
3. Navigate to the API Keys section
4. Generate a new API key
5. Copy and securely store your API key

### Step 2: Create Your Google Sheet

**Option 1: Use Our Pre-Made Template (Recommended)**
1. Copy our template: Driver Interview Results Template
2. Click "File" → "Make a copy" to create your own version
3. Rename it as desired
4. Copy your new sheet's URL - you'll need this for the workflow

**Option 2: Create From Scratch**
1. Go to Google Sheets
2. Create a new spreadsheet
3. Name it "Driver Interview Results" (or your preferred name)
4. Set up the following column headers in row 1: A1: name, B1: phone, C1: cityState, D1: qualifies, E1: reasoning
5. Copy the Google Sheet URL - you'll need this for the workflow

### Step 3: Import and Configure the N8N Workflow

**Import the Workflow**
1. Copy the workflow JSON from the template
2. In your n8n instance, go to Workflows → Import from JSON
3. Paste the JSON and import

**Configure OpenAI Credentials**
1. Click on either of the "OpenAI Chat Model" nodes
2. Set up credentials using your OpenAI API key
3. Test the connection to ensure it works

**Configure Google Sheets Integration**
1. Click on the "Save to Google Sheets" node
2. Set up Google Sheets OAuth2 credentials
3. Select your spreadsheet from the dropdown
4. Choose the correct sheet (usually "Sheet1")

**Update the Webhook**
1. Click on the "Webhook" node
2. Note the webhook URL that n8n generates
3. This URL will receive your transcript data

### Step 4: Customize Evaluation Criteria

The workflow includes predefined criteria for a Massachusetts driving job. To customize it for your needs:
1. Click on the "Evaluate Candidate" node
2. Modify the system message to include your specific requirements
3. Update the evaluation criteria checklist
4. Adjust the JSON output format if needed

Current evaluation criteria:
- Valid Massachusetts driver's license
- No felony convictions
- Clean driving record (no recent tickets/accidents)
- Willing to complete a background check
- Can pass a drug test (including marijuana)
- Available full-time Monday-Friday
- Lives in Massachusetts
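For reference, the structured result that ends up in the sheet mirrors the five columns above. A hypothetical example, with invented values purely for illustration:

```javascript
// Illustrative only - the field names mirror the sheet columns above,
// but the values are made up for this example.
const exampleEvaluation = {
  name: "Jane Doe",            // candidate's full name extracted from the transcript
  phone: "+1 555 0100",        // callback number captured during the interview
  cityState: "Boston, MA",     // where the candidate says they live
  qualifies: true,             // overall pass/fail against the criteria checklist
  reasoning: "Holds a valid MA license, clean record, available Mon-Fri full-time."
};
```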
### Step 5: Connect to Vapi.ai (Phone Interview System)

This workflow is specifically designed to work with Vapi.ai's phone interview system. Here's how to connect it:

**Setting Up the Vapi Integration**

**Copy Your n8n Webhook URL**
1. In your n8n workflow, click on the "Webhook" node
2. Copy the webhook URL (it should look like: https://your-n8n-instance.com/webhook-test/351ffe7c-69f2-4657-b593-c848d59205c0)

**Configure Your Vapi Assistant**
1. Log into your Vapi.ai dashboard
2. Create or edit your phone interview assistant
3. In the assistant settings, find the "Server" section
4. Set the Server URL to your n8n webhook URL
5. Set the timeout to 20 seconds (as configured in the workflow)

**Configure Server Messages**
In your Vapi assistant settings, enable these server messages:
- end-of-call-report
- transcript[transcriptType="final"]

**Set Up the Interview Script**
- Use the provided interview script in your Vapi assistant (found in the workflow's system message)
- This ensures consistent data collection for the AI evaluation

**Expected Data Format from Vapi**

The workflow expects Vapi to send data in this specific format:

```json
{
  "body": {
    "message": {
      "artifact": {
        "transcript": "AI: Hi. Are you interested in driving for Bank of Transport?\nUser: Yes.\nAI: Great. Before we go further..."
      }
    }
  }
}
```

**Vapi Configuration Checklist**
- ✅ Webhook URL set in Vapi assistant server settings
- ✅ Server messages enabled: end-of-call-report, transcript[transcriptType="final"]
- ✅ Interview script configured in assistant
- ✅ Assistant set to send webhooks on call completion

**Alternative Phone Systems**

If you're not using Vapi.ai, you can adapt this workflow for other phone systems by:
- Modifying the "Edit Fields2" node to extract transcripts from your system's data format (see the sketch at the end of this section)
- Updating the webhook data structure expectations
- Ensuring your phone system sends the complete interview transcript

### Step 6: Test the Workflow

**Test with Sample Data**
1. Use the "Execute Workflow" button with test data
2. Verify that data appears correctly in your Google Sheet
3. Check that the AI evaluation logic works as expected

**End-to-End Testing**
1. Send a test webhook with a real transcript
2. Monitor each step of the workflow
3. Confirm the final result is saved to Google Sheets

## Workflow Node Breakdown

1. **Webhook** - Receives transcript data from your phone system
2. **Edit Fields2** - Extracts the transcript from the incoming data
3. **Evaluate Candidate** - AI analysis using GPT-4o-mini to assess qualification
4. **Convert to JSON** - Ensures proper JSON formatting with a structured output parser
5. **Save to Google Sheets** - Automatically logs results to your spreadsheet

## Customization Options

**Modify Evaluation Criteria**
- Edit the system prompt in the "Evaluate Candidate" node
- Add or remove qualification requirements
- Adjust the scoring logic

**Change Output Format**
- Modify the JSON schema in the "Structured Output Parser" node
- Update the Google Sheets column mapping accordingly

**Add Additional Processing**
- Insert nodes for email notifications
- Add Slack/Discord alerts for qualified candidates
- Integrate with your CRM or ATS system

## Troubleshooting

Common issues:
- **OpenAI API Errors**: Check API key validity and billing status
- **Google Sheets Not Updating**: Verify OAuth permissions and sheet access
- **Webhook Not Receiving Data**: Confirm the URL and POST format from your phone system
- **AI Evaluation Inconsistencies**: Refine the system prompt with more specific criteria

## Usage Tips

- **Monitor Token Usage**: OpenAI charges per token, so monitor your usage
- **Regular Review**: Periodically review AI evaluations for accuracy
- **Backup Data**: Export Google Sheets data regularly for backup
- **Privacy Compliance**: Ensure transcript handling complies with local privacy laws
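If you adapt the workflow to another phone system, the extraction step only needs to pull the transcript string out of the incoming payload. A minimal Code-node sketch, assuming the Vapi-style structure shown above (adjust the path for other systems):

```javascript
// Hypothetical n8n Code-node sketch: pull the transcript string out of the
// incoming webhook item. The path below matches the Vapi payload shown above;
// change it if your phone system posts a different structure.
const body = $input.first().json.body;
const transcript = body?.message?.artifact?.transcript ?? '';

if (!transcript) {
  // Surface missing transcripts early instead of sending empty text to the AI step.
  throw new Error('No transcript found in webhook payload');
}

return [{ json: { transcript } }];
```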
## Need Help with Implementation?

For professional setup, customization, or troubleshooting of this workflow, contact:

**Robert - Ynteractive Solutions**
- **Email**: rbreen@ynteractive.com
- **Website**: www.ynteractive.com
- **LinkedIn**: linkedin.com/in/robert-interactive

Specializing in AI-powered workflow automation, business process optimization, and custom integration solutions.
by PDF Vector
## Overview

Transform your accounts payable department with this enterprise-grade invoice processing solution. This workflow automates the entire invoice lifecycle - from document ingestion through payment processing. It handles invoices from multiple sources (Google Drive, email attachments, API submissions), extracts data using AI, validates against purchase orders, routes for appropriate approvals based on amount thresholds, and integrates seamlessly with your ERP system. The solution includes vendor master data management, duplicate invoice detection, real-time spend analytics, and complete audit trails for compliance.

## What You Can Do

This comprehensive workflow creates an intelligent invoice processing pipeline that monitors multiple input channels (Google Drive, email, webhooks) for new invoices and automatically extracts data from PDFs, images, and scanned documents using AI. It validates vendor information against your master database, matches invoices to purchase orders, and detects discrepancies. The workflow implements multi-level approval routing based on invoice amount and department, prevents duplicate payments through intelligent matching algorithms, and integrates with QuickBooks, SAP, or other ERP systems. Additionally, it generates real-time dashboards showing processing metrics and cash flow insights while sending automated reminders for pending approvals.

## Who It's For

Perfect for medium to large businesses, accounting departments, and financial service providers processing more than 100 invoices monthly across multiple vendors. Ideal for organizations that need to enforce approval hierarchies and spending limits, require integration with existing ERP/accounting systems, want to reduce processing time from days to minutes, need audit trails and compliance reporting, and seek to eliminate manual data entry errors and duplicate payments.

## The Problem It Solves

Manual invoice processing creates significant operational challenges, including data entry errors (3-5% error rate), processing delays (8-10 days per invoice), duplicate payments (0.1-0.5% of invoices), approval bottlenecks causing late fees, lack of visibility into pending invoices and cash commitments, and compliance issues from missing audit trails. This workflow reduces processing time by 80%, eliminates data entry errors, prevents duplicate payments, and provides complete visibility into your payables process.

## Setup Instructions

1. **Google Drive Setup**: Create dedicated folders for invoice intake and configure access permissions
2. **PDF Vector Configuration**: Set up API credentials with appropriate rate limits for your volume
3. **Database Setup**: Deploy the provided schema for vendor master and invoice tracking tables
4. **Email Integration**: Configure IMAP credentials for invoice email monitoring (optional)
5. **ERP Connection**: Set up API access to your accounting system (QuickBooks, SAP, etc.)
6. **Approval Rules**: Define approval thresholds and routing rules in the configuration node (a minimal sketch of such a rule follows at the end of this description)
7. **Notification Setup**: Configure Slack/email for approval notifications and alerts

## Key Features

- **Multi-Channel Invoice Ingestion**: Automatically collect invoices from Google Drive, email attachments, and API uploads
- **Advanced OCR and AI Extraction**: Process any invoice format, including handwritten notes and poor-quality scans
- **Vendor Master Integration**: Validate and enrich vendor data, maintaining a clean vendor database
- **3-Way Matching**: Automatically match invoices to purchase orders and goods receipts
- **Dynamic Approval Routing**: Route based on amount, department, vendor, or custom rules
- **Duplicate Detection**: Prevent duplicate payments using fuzzy matching algorithms
- **Real-Time Analytics**: Track KPIs like processing time, approval delays, and early payment discounts
- **Exception Handling**: Intelligent routing of problematic invoices for manual review
- **Audit Trail**: Complete tracking of all actions, approvals, and system modifications
- **Payment Scheduling**: Optimize payment timing to capture discounts and manage cash flow

## Customization Options

This workflow can be customized to add industry-specific extraction fields, implement GL coding rules based on vendor or amount, create department-specific approval workflows, add currency conversion for international invoices, integrate with additional systems (banks, expense management), configure custom dashboards and reporting, set up vendor portals for invoice status inquiries, and implement machine learning for automatic GL coding suggestions.

**Note**: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
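As a loose illustration of the approval-threshold routing the configuration node can drive, here is a minimal Code-node sketch. The thresholds, the `total` field name, and the approver channels are placeholders rather than values from the template:

```javascript
// Hypothetical routing rule: pick an approval level from the invoice amount.
// Thresholds, field names, and channel names are illustrative placeholders.
const invoice = $input.first().json;

let approval;
if (invoice.total >= 50000) {
  approval = { level: 'executive', notify: '#ap-executive-approvals' };
} else if (invoice.total >= 5000) {
  approval = { level: 'manager', notify: '#ap-manager-approvals' };
} else {
  approval = { level: 'auto', notify: null }; // small invoices pass straight through
}

return [{ json: { ...invoice, approval } }];
```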
by Growth AI
## Who's it for

Marketing teams, business intelligence professionals, competitive analysts, and executives who need consistent industry monitoring with AI-powered analysis and automated team distribution via Discord.

## What it does

This intelligent workflow automatically monitors multiple industry topics, scrapes and analyzes relevant news articles using Claude AI, and delivers professionally formatted intelligence reports to your Discord channel. The system provides weekly automated monitoring cycles with personalized bot communication and comprehensive content analysis.

## How it works

The workflow follows a sophisticated 7-phase automation process:

1. **Scheduled Activation**: Triggers weekly monitoring cycles (default: Mondays at 9 AM)
2. **Query Management**: Retrieves monitoring topics from a centralized Google Sheets configuration
3. **News Discovery**: Executes comprehensive Google News searches using SerpAPI for each configured topic
4. **Content Extraction**: Scrapes full article content from the top 3 sources per topic using Firecrawl
5. **AI Analysis**: Processes scraped content using Claude 4 Sonnet for intelligent synthesis and formatting
6. **Discord Optimization**: Automatically segments content to comply with Discord's 2000-character message limit (a minimal segmentation sketch appears at the end of this description)
7. **Automated Delivery**: Posts formatted intelligence reports to the Discord channel with the branded "Claptrap" bot personality

## Requirements

- Google Sheets account for query management
- SerpAPI account for Google News access
- Firecrawl account for article content extraction
- Anthropic API access for Claude 4 Sonnet
- Discord bot with proper channel permissions
- Scheduled execution capability (cron-based trigger)

## How to set up

### Step 1: Configure Google Sheets query management

- **Create monitoring sheet**: Set up a Google Sheets document with a "Query" sheet
- **Add search topics**: Include industry keywords, competitor names, and relevant search terms
- **Sheet structure**: Simple column format with a "Query" header containing search terms
- **Access permissions**: Ensure n8n has read access to the Google Sheets document

### Step 2: Configure API credentials

Set up the following credentials in n8n:

- **Google Sheets OAuth2**: For accessing the query configuration sheet
- **SerpAPI**: For Google News search functionality with proper rate limits
- **Firecrawl API**: For reliable article content extraction across various websites
- **Anthropic API**: For Claude 4 Sonnet access with sufficient token limits
- **Discord Bot API**: With message posting permissions in the target channel

### Step 3: Customize scheduling settings

- **Cron expression**: Default set to "0 9 * * 1" (Mondays at 9 AM)
- **Frequency options**: Adjust for daily, weekly, or custom monitoring cycles
- **Timezone considerations**: Configure according to the team's working hours
- **Execution timing**: Ensure adequate processing time for multiple topics

### Step 4: Configure Discord integration

Set up Discord delivery settings:

- **Guild ID**: Target Discord server (currently: 919951151888236595)
- **Channel ID**: Specific monitoring channel (currently: 1334455789284364309)
- **Bot permissions**: Message posting, embed suppression capabilities
- **Brand personality**: Customize the "Claptrap" bot messaging style and tone

### Step 5: Customize content analysis

Configure AI analysis parameters:

- **Analysis depth**: Currently processes the top 3 articles per topic
- **Content format**: Structured markdown format with consistent styling
- **Language settings**: Currently configured for French output (easily customizable)
- **Quality controls**: Error handling for inaccessible articles and content

## How to customize the workflow

### Query management expansion

- **Topic categories**: Organize queries by industry, competitor, or strategic focus areas
- **Keyword optimization**: Refine search terms based on result quality and relevance
- **Dynamic queries**: Implement time-based or event-triggered query modifications
- **Multi-language support**: Add international keyword variations for global monitoring

### Advanced content processing

- **Article quantity**: Modify from 3 to more articles per topic based on analysis needs
- **Content filtering**: Add quality scoring and relevance filtering for article selection
- **Source preferences**: Implement preferred publisher lists or source quality weighting
- **Content enrichment**: Add sentiment analysis, trend identification, or competitive positioning

### Discord delivery enhancements

- **Rich formatting**: Implement Discord embeds, reactions, or interactive elements
- **Multi-channel distribution**: Route different topics to specialized Discord channels
- **Alert levels**: Add priority-based messaging for urgent industry developments
- **Archive functionality**: Create searchable message threads or database storage

### Integration expansions

- **Slack compatibility**: Replace or supplement Discord with Slack notifications
- **Email reports**: Add formatted email distribution for executive summaries
- **Database storage**: Implement persistent storage for historical analysis and trending
- **API endpoints**: Create webhook endpoints for third-party system integration

### AI analysis customization

- **Analysis templates**: Create topic-specific analysis frameworks and formatting
- **Competitive focus**: Enhance competitor mention detection and analysis depth
- **Trend identification**: Implement cross-topic trend analysis and strategic insights
- **Summary levels**: Create executive summaries alongside detailed technical analysis

## Advanced monitoring features

### Intelligent content curation

The system provides sophisticated content management:

- **Relevance scoring**: Automatic ranking of articles by topic relevance and publication authority
- **Duplicate detection**: Prevents redundant coverage of the same story across different sources
- **Content quality assessment**: Filters low-quality or promotional content automatically
- **Source diversity**: Ensures coverage from multiple perspectives and publication types

### Error handling and reliability

- **Graceful degradation**: Continues processing even if individual articles fail to scrape
- **Retry mechanisms**: Automatic retry logic for temporary API failures or network issues
- **Content fallbacks**: Uses article snippets when full content extraction fails
- **Notification continuity**: Ensures Discord delivery even with partial content processing

## Results interpretation

### Intelligence report structure

Each monitoring cycle delivers:

- **Topic-specific summaries**: Individual analysis for each configured search query
- **Source attribution**: Complete citation with publication date, source, and URL
- **Structured formatting**: Consistent presentation optimized for quick scanning
- **Professional analysis**: AI-generated insights maintaining factual accuracy and business context

### Performance analytics

Monitor system effectiveness through:

- **Processing metrics**: Track successful article extraction and analysis rates
- **Content quality**: Assess relevance and usefulness of delivered intelligence
- **Team engagement**: Monitor Discord channel activity and report utilization
- **System reliability**: Track execution success rates and error patterns

## Use cases

### Competitive intelligence

- **Market monitoring**: Track competitor announcements, product launches, and strategic moves
- **Industry trends**: Identify emerging technologies, regulatory changes, and market shifts
- **Partnership tracking**: Monitor alliance formations, acquisitions, and strategic partnerships
- **Leadership changes**: Track executive movements and organizational restructuring

### Strategic planning support

- **Market research**: Continuous intelligence gathering for strategic decision-making
- **Risk assessment**: Early warning system for industry disruptions and regulatory changes
- **Opportunity identification**: Spot emerging markets, technologies, and business opportunities
- **Brand monitoring**: Track industry perception and competitive positioning

### Team collaboration enhancement

- **Knowledge sharing**: Centralized distribution of relevant industry intelligence
- **Discussion facilitation**: Provide a common information baseline for strategic discussions
- **Decision support**: Deliver timely intelligence for business planning and strategy sessions
- **Competitive awareness**: Keep teams informed about competitive landscape changes

## Workflow limitations

- **Language dependency**: Currently optimized for French analysis output (easily customizable)
- **Processing capacity**: Limited to 3 articles per query (configurable based on API limits)
- **Platform specificity**: Configured for Discord delivery (adaptable to other platforms)
- **Scheduling constraints**: Fixed weekly schedule (customizable via cron expressions)
- **Content access**: Dependent on article accessibility and website compatibility with Firecrawl
- **API dependencies**: Requires active subscriptions and proper rate limit management for all integrated services
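To illustrate the Discord Optimization phase described above, here is a minimal Code-node sketch that splits a long report into chunks under the 2000-character limit; the `report` field name is an assumption, not the template's exact implementation:

```javascript
// Hypothetical sketch: split a long report into Discord-sized chunks,
// preferring paragraph boundaries. The input field name is assumed.
const report = $input.first().json.report ?? '';
const LIMIT = 2000;

const chunks = [];
let current = '';

for (const paragraph of report.split('\n\n')) {
  if (paragraph.length > LIMIT) {
    // Flush what we have, then hard-split the oversized paragraph.
    if (current) { chunks.push(current); current = ''; }
    for (let i = 0; i < paragraph.length; i += LIMIT) {
      chunks.push(paragraph.slice(i, i + LIMIT));
    }
    continue;
  }
  const candidate = current ? `${current}\n\n${paragraph}` : paragraph;
  if (candidate.length <= LIMIT) {
    current = candidate;
  } else {
    chunks.push(current);
    current = paragraph;
  }
}
if (current) chunks.push(current);

// One n8n item per Discord message.
return chunks.map(text => ({ json: { content: text } }));
```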
by Rahi
## Workflow 1: Domain and Email Health 🩺

This part of the workflow is triggered every 5 hours by the Schedule Trigger1 node. Its purpose is to pull health metrics for both email domains and individual email addresses.

How it works:

- **Schedule Trigger**: The Schedule Trigger1 node initiates the workflow every 5 hours.
- **API Requests**: Two separate HTTP Request nodes, HTTP Request5 and HTTP Request6, make API calls to Smartlead. HTTP Request5 calls the endpoint for domain-wise health metrics, and HTTP Request6 calls the endpoint for email-wise health metrics. Both requests use the same api_key and a date range from 2025-07-04 to the current day (a small sketch of that date range follows below).
- **Data Splitting**: The Split Out5 and Split Out6 nodes take the JSON response from the API calls and split the data into individual items. This is necessary so each row of data can be processed and added to Google Sheets separately.
- **Google Sheets Integration**: Finally, the Append or update row in sheet5 and Append or update row in sheet6 nodes update two different Google Sheets: Append or update row in sheet5 adds or updates rows in the DomainHealth sheet, matching on the domain column, while Append or update row in sheet6 adds or updates rows in the EmailHealth sheet, matching on the from_email column.

## Workflow 2: Global and Campaign-Specific Analytics 📊

This second part of the workflow is triggered every 2 hours by the Schedule Trigger node. Its goal is to get a day-by-day overview of email engagement and campaign-specific performance.

How it works:

- **Schedule Trigger**: The Schedule Trigger node starts this workflow every 2 hours.
- **API Requests**: Two HTTP Request nodes, HTTP Request and HTTP Request1, call different Smartlead API endpoints. HTTP Request retrieves day-wise overall stats for email engagement, and HTTP Request1 retrieves overall stats for each campaign.
- **Data Splitting**: The Split Out and Split Out1 nodes separate the JSON responses into individual data items for processing.
- **Google Sheets Integration**: The Append or update row in sheet and Append or update row in sheet1 nodes then write the data to Google Sheets. Append or update row in sheet updates the Sheet1 sheet with day-wise metrics, using the date as the matching column, while Append or update row in sheet1 updates the CampaignWise sheet with campaign performance metrics, using the campaign id to match rows.
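For reference, the date range used by these requests can be produced with a tiny Code-node helper like the one below; the parameter names are assumptions, so check Smartlead's API documentation for the exact ones:

```javascript
// Hypothetical helper: build the date range sent with the health-metric requests.
// Parameter names (start_date / end_date) are assumptions, not confirmed Smartlead fields.
const startDate = '2025-07-04';                        // fixed start of the reporting window
const endDate = new Date().toISOString().slice(0, 10); // today, formatted as YYYY-MM-DD

return [{ json: { start_date: startDate, end_date: endDate } }];
```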
by Raz Hadas
Stay ahead of the market with this powerful, automated workflow that performs real-time sentiment analysis on stock market news. By leveraging the advanced capabilities of Google Gemini, this solution provides you with actionable insights to make informed investment decisions.

This workflow is designed for investors, traders, and financial analysts who want to automate the process of monitoring news and gauging market sentiment for specific stocks. It seamlessly integrates with Google Sheets for input and output, making it easy to track a portfolio of stocks.

## Key Features & Benefits

- **Automated Daily Analysis**: The workflow is triggered daily, providing you with fresh sentiment analysis just in time for the market open.
- **Dynamic Stock Tracking**: Easily manage your list of tracked stocks from a simple Google Sheet.
- **AI-Powered Insights**: Utilizes Google Gemini's sophisticated language model to analyze news content for its potential impact on stock prices, including a sentiment score and a detailed rationale.
- **Comprehensive News Aggregation**: Fetches the latest news articles from EODHD for each of your specified stock tickers.
- **Error Handling & Validation**: Includes built-in checks for invalid stock tickers and formats the AI output for reliable data logging.
- **Centralized Reporting**: Automatically logs the sentiment score, rationale, and date into a Google Sheet for easy tracking and historical analysis.

## How It Works

This workflow follows a systematic process to deliver automated sentiment analysis:

1. **Scheduled Trigger**: The workflow begins each day at a specified time.
2. **Fetch Stock Tickers**: It reads a list of stock tickers from your designated Google Sheet.
3. **Loop and Fetch News**: For each ticker, it retrieves the latest news articles using the EODHD API.
4. **AI Sentiment Analysis**: The collected news articles are then passed to a Google Gemini-powered AI agent. The agent is prompted to act as a stock sentiment analyzer, evaluating the news and generating:
   - A sentiment score from -1 (strong negative) to 1 (strong positive).
   - A detailed rationale explaining the basis for the score.
5. **Data Formatting & Validation**: The AI's output is parsed and validated to ensure it is in the correct JSON format (a minimal sketch of this check follows at the end of this description).
6. **Log to Google Sheets**: The final sentiment score and rationale are appended to your Google Sheet, alongside the corresponding stock ticker and the current date.

## Nodes Used

- Schedule Trigger
- Google Sheets
- SplitInBatches
- HttpRequest (EODHD)
- If
- Code (JavaScript)
- AI Agent (LangChain)
- Google Gemini Chat Model

This workflow is a valuable tool for anyone looking to harness the power of AI for financial market analysis. Deploy this automated solution to save time, gain a competitive edge, and make more data-driven trading decisions.
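A minimal sketch of the formatting and validation step mentioned above, assuming the agent's reply arrives in an `output` field and uses `sentiment_score`/`rationale` keys (both assumptions for illustration):

```javascript
// Hypothetical validation step: parse the model's reply and make sure it is
// usable before logging to Google Sheets. Field names are illustrative.
const raw = $input.first().json.output ?? '';

let parsed;
try {
  parsed = JSON.parse(raw);
} catch (e) {
  throw new Error('AI reply was not valid JSON: ' + raw.slice(0, 200));
}

const score = Number(parsed.sentiment_score);
if (Number.isNaN(score) || score < -1 || score > 1) {
  throw new Error('Sentiment score is outside the expected -1..1 range');
}

return [{ json: { sentiment_score: score, rationale: String(parsed.rationale ?? '') } }];
```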
by Jonathan | NEX
Are you drowning in a sea of security notifications? Do your analysts spend more time sifting through low-level logs than investigating real threats? This workflow transforms n8n into an autonomous SOC (Security Operations Center) Analyst, tackling alert fatigue head-on.

Leveraging the NixGuard Security RAG connector, this workflow automates the entire alert triage process. It ingests raw security events (from sources like Wazuh, your SIEM, or EDR), uses AI to analyze and assign a priority, and then intelligently routes the alert to the correct Slack channel.

## How It Works

1. **Ingest & Filter**: The workflow runs on a schedule, fetching all recent security alerts. It first performs basic filtering to isolate events that meet a minimum severity threshold (e.g., level 7+); a minimal sketch of this step appears at the end of this description.
2. **AI Analysis & Prioritization**: The aggregated high-severity alerts are then sent to the AI with a specific prompt, asking it to analyze the situation and return a structured JSON object containing a single, overall priority (Critical, High, Info) and a concise summary.
3. **Intelligent Routing**: A Switch node reads the AI-assigned priority and routes the notification to the appropriate destination. Critical alerts go to your #security-incident-response channel, high-priority alerts to #security-investigations, and informational ones to #security-logs.

## Key Features & Benefits

- **Eliminate Alert Fatigue**: Drastically reduce the noise by having AI pre-process and categorize alerts before they hit your team.
- **Automate SOC Tier 1 Triage**: Free up your human analysts from repetitive triage tasks so they can focus on high-value investigation and threat hunting.
- **Faster Incident Response**: Route critical alerts to the right people in real time, cutting down on crucial response time.
- **Consistent Prioritization**: Use AI to ensure a consistent, unbiased approach to alert prioritization, 24/7.
- **Smart Routing Logic**: Go beyond simple keyword matching. The Switch node ensures alerts are delivered to the team best equipped to handle them based on AI-assessed severity.

## Who is this for?

- **SOC Analysts & Security Engineers** looking to automate alert triage and incident response workflows.
- **SecOps and DevOps Teams** who want to build a more efficient, automated security operations pipeline.
- **IT Managers and Directors** aiming to improve their team's efficiency and reduce the risk of missing critical alerts.
- Anyone using Wazuh, a SIEM, or other security tools that generate a high volume of alerts.

Stop manually triaging alerts. Install this workflow to build your own AI-powered security automation platform and let your team focus on what matters most.

Don't have the main workflow yet? Get it HERE!

🔗 Learn more about NixGuard: thenex.world
🔗 Get started with a free security subscription: thenex.world/security/subscribe

Tags / Keywords: AI, Security, SOC, Automation, Triage, Alerting, Cybersecurity, Wazuh, SIEM, Slack, Incident Response, Alert Fatigue, SecOps, Generative AI, LLM, NixGuard, Routing
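A minimal sketch of the ingest-and-filter step, assuming Wazuh-style alerts with a numeric `rule_level` field (an assumption; adjust to your SIEM's schema):

```javascript
// Hypothetical sketch of "Ingest & Filter": keep only alerts at or above the
// severity threshold and aggregate them into one blob for the AI prompt.
// The rule_level field name is assumed, not taken from the template.
const MIN_LEVEL = 7;

const highSeverity = $input.all()
  .map(item => item.json)
  .filter(alert => (alert.rule_level ?? 0) >= MIN_LEVEL);

return [{
  json: {
    alertCount: highSeverity.length,
    // One text block the AI can read and summarize into { priority, summary }.
    alertsForPrompt: JSON.stringify(highSeverity, null, 2),
  },
}];
```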
by Łukasz
## What Is This?

This workflow is a comprehensive solution for automating website audits and optimizations, leveraging advanced technologies to boost SEO effectiveness and overall site performance.

## Who Is It For?

Designed for SEO specialists, digital marketers, webmasters, and content teams, this workflow empowers anyone responsible for website performance to automate and scale their audit processes. Agencies managing multiple client sites, in-house SEO teams aiming to save time on routine checks, and developers seeking to integrate data-driven insights into their deployment pipelines will all find this solution invaluable.

By combining your site's sitemap with Google Search Console and Google Analytics data, then applying AI-powered analysis, the workflow continuously uncovers actionable recommendations to boost search visibility, improve user engagement, and accelerate page performance. Whether you manage a single blog or oversee a sprawling e-commerce platform, this automated pipeline delivers precise, prioritized SEO improvements without manual data wrangling.

## How Does It Work?

This end-to-end site analysis automation consists of five main stages:

1. **URL Discovery**: Processes the sitemap.xml using HTTP Request and XML nodes to extract all site URLs.
2. **Search Console Performance Analysis**: Uses the Google Search Console API to fetch detailed metrics for each page, including search position, clicks, impressions, and CTR.
3. **Analytics Data Collection**: Connects to the Google Analytics API to automatically retrieve traffic metrics such as pageviews, average session duration, bounce rate, and conversions.
4. **AI Data Processing**: Employs OpenAI models to perform in-depth analysis of the collected data. The AI engine merges insights from all sources, identifies patterns, and produces detailed optimization recommendations. The AI analyses the website itself as well. Consider testing different models; I recommend at least trying out o4-mini.
5. **Recommendation Generation**: Creates tailored suggestions for each page, in the form of an HTML table that is sent to your email.

## How To Set It Up?

- **Accounts**: An active n8n account or instance, API keys for Google Search Console and Google Analytics, and an OpenAI access token.
- **Enabled Google APIs**: You will need at least the following scopes:
  - Google Search Console API
  - Google Analytics Admin API
  - Google Analytics Data API
- **Scheduling**: The workflow can run manually for ad hoc audits or be scheduled (daily, weekly) for continuous site monitoring.
- **Testing**: Two nodes are optional: "Sort for testing purposes" and "Limit for testing purposes". Together they randomly select items from the sitemap and limit them to a few, so you don't need to run hundreds of sitemap.xml items at once and can run just a random batch first (a small sketch of this sampling step follows below).
- **Globals**: There is a node called "Globals- CHANGE ME!". You need to set the proper variables in there, which are:
  - sitemap_url - self-explanatory
  - search_console_selector - for example "sc-domain:sailingbyte.com", but it can be a URL as well, depending on how you set up your Search Console
  - analysis_start_date and analysis_end_date - date range for analytics, by default the last 30 days
  - analytics_selector_id - the ID of your Google Analytics setup; it is a large integer that you can find in the Analytics URL preceded by the letter "p", e.g. (your number is where the X's are): https://analytics.google.com/analytics/web/#/pXXXXXXXXX/reports/intelligenthome
  - report_receiver - the email address that will receive the report
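A small sketch of what those two testing nodes accomplish, expressed as a single Code node; the sample size is an arbitrary placeholder:

```javascript
// Hypothetical equivalent of the optional "Sort for testing purposes" and
// "Limit for testing purposes" nodes: shuffle the sitemap URLs and keep a
// small random batch so test runs don't process the whole sitemap.
const SAMPLE_SIZE = 5; // adjust to taste

const shuffled = $input.all()
  .map(item => ({ item, key: Math.random() }))
  .sort((a, b) => a.key - b.key)
  .map(({ item }) => item);

return shuffled.slice(0, SAMPLE_SIZE);
```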
## What's More?

That's actually it. I hope this automation makes improving your website much easier! Visit my profile for other business automations, and if you are looking for dedicated software development, do not hesitate to reach out!
by Julien DEL RIO
This template is inspired by "Save your workflows into a GitHub repository" by hikerspath and "Back Up Your n8n Workflows To Github" by jon-n8n.

## Basic

Retrieves all workflows from an n8n instance and saves them to a GitLab project. If a workflow already exists in the repository, only the changes are saved.

## Flow

What the workflow does:

1. Sets custom parameters
2. Gets the workflows
3. Iterates through each workflow one by one
4. Gets the file from GitLab if it exists
5. Compares the files as objects, not as strings (see the sketch below)
6. Returns a status for the workflow
7. Creates, edits, or ignores the file depending on the status
8. Returns a list of statuses for each workflow

## Configuration

Select a credential in each GitLab node, then edit the data in the "Globals" node:

- repo.owner: slug of the user or team owning the repo
- repo.name: slug of the repository
- repo.branch: branch to commit on
- repo.path: path from the root of the repository; should end with /

## Comments

- An error on a GitLab node will not stop the run, but the current workflow will be listed as an error in the results.
- Some fields are ignored when determining whether there are changes:
  - updatedAt: ignored, since it changes even when only ignored fields are modified
  - globals: runtime information only, so its changes are not tracked
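A rough sketch of that object comparison, assuming the two versions arrive on the item as `gitlabVersion` and `liveVersion` (hypothetical field names):

```javascript
// Hypothetical sketch: strip the ignored fields, then compare the stored and
// live workflows as objects rather than strings.
const IGNORED_FIELDS = ['updatedAt', 'globals'];

function normalize(workflow) {
  const copy = { ...workflow };
  for (const field of IGNORED_FIELDS) delete copy[field];
  return copy;
}

const { gitlabVersion, liveVersion } = $input.first().json;

// Naive deep comparison via JSON; a robust version would also normalize key order.
const same =
  JSON.stringify(normalize(gitlabVersion)) === JSON.stringify(normalize(liveVersion));

return [{ json: { status: same ? 'same' : 'different' } }];
```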
by Harshil Agrawal
This workflow allows you to get the analytics of a website and store them in Airtable. In this workflow, we get the analytics for sessions grouped by country. Based on your use case, you can select different Dimensions and set different Metrics.

You can use the Cron node or the Interval node to trigger the workflow on a particular interval and fetch the analytics data regularly. Based on your use case, you might want to store the data returned by Google Analytics in a database or a Google Sheet instead; replace the Airtable node with the appropriate node.
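In case it helps to picture the selection, "sessions grouped by country" boils down to one dimension and one metric. The snippet below sketches that request in a GA4 Data API-style shape; this is an assumption for illustration, and the Google Analytics node exposes a similar Dimensions/Metrics choice in its own fields:

```javascript
// Illustrative GA4 Data API-style request body for "sessions grouped by country".
// Date range, dimension, and metric names are examples, not values from the template.
const reportRequest = {
  dateRanges: [{ startDate: '30daysAgo', endDate: 'today' }],
  dimensions: [{ name: 'country' }],
  metrics: [{ name: 'sessions' }],
};
```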
by tanaypant
This workflow automatically queries a Postgres database to find outlier readings for which SMS notifications have not been sent. This is Workflow 2 in the blog tutorial Database activity monitoring and alerting.

## Prerequisites

- A Postgres database set up, and its credentials
- A Twilio account and credentials

## Nodes

- **Cron node** triggers the workflow every minute, so the database is queried at regular intervals.
- **Postgres nodes** extract values from, and update values in, the database.
- **Twilio node** sends an alert SMS about the outlier reading to a specified phone number.
- **Set node** sets the notification value to true.
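For context, the select and update performed by the Postgres nodes amount to something like the following. The table and column names are invented placeholders; the tutorial defines the actual schema:

```javascript
// Hypothetical queries behind the two Postgres nodes. The table and column
// names (readings, is_outlier, notification_sent) are placeholders only.
const findUnnotifiedOutliers = `
  SELECT id, sensor_id, value, recorded_at
  FROM readings
  WHERE is_outlier = true
    AND notification_sent = false;
`;

const markAsNotified = `
  UPDATE readings
  SET notification_sent = true
  WHERE id = $1;
`;

return [{ json: { findUnnotifiedOutliers, markAsNotified } }];
```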
by Raz Hadas
## Description

Transform your investment strategy with a fully automated, AI-driven trading bot. This workflow bridges the gap between AI-powered market insights and real-world trading by executing buy and sell orders directly through the Alpaca paper trading API.

Designed to work in tandem with the Automated Stock Sentiment Analysis workflow, this solution takes the top-performing stocks based on daily news sentiment and automatically rebalances your portfolio. It's perfect for algorithmic traders, data-driven investors, and n8n enthusiasts who want to see their AI analysis translate into tangible actions, all while maintaining a comprehensive log of every transaction in Google Sheets.

## Key Features & Benefits

- **Automated Trading Execution**: Automatically places buy and sell orders on the Alpaca paper trading platform without manual intervention.
- **Sentiment-Driven Decisions**: Leverages the output from the sentiment analysis workflow to make informed decisions, selling positions with waning sentiment and buying into those with high positive sentiment.
- **Dynamic Portfolio Rebalancing**: Intelligently calculates which positions to close and how to allocate the resulting funds into new, high-potential stocks.
- **Paper Trading Ready**: Safely test and refine your trading strategies in a risk-free environment using Alpaca's paper trading API.
- **Daily Performance Tracking**: Automatically logs your account equity and daily percentage change to a Google Sheet, giving you a clear view of your portfolio's performance.
- **Detailed Trade Logging**: Every buy and sell order is meticulously recorded in a Google Sheet for easy review and historical analysis.
- **Scheduled and Autonomous**: The entire process runs on a daily schedule, making it a "set and forget" solution for systematic trading.

## How It Works

This workflow executes a sophisticated, automated trading strategy in a few key stages:

1. **Daily Kick-off & Snapshot**: The workflow triggers on a daily schedule, first fetching your current Alpaca account balance and logging it to a Google Sheet to track daily performance.
2. **Strategy Formulation**: It then reads the daily sentiment scores produced by the accompanying "Stock Sentiment Analysis" workflow. A Code node filters these results to identify the top four stocks with the highest positive sentiment.
3. **The Decision Engine**: The core of the workflow is a custom Code node that acts as the trading brain (a simplified sketch follows below). It:
   - Retrieves your currently open positions from Alpaca.
   - Compares your holdings against the day's top four sentiment stocks.
   - Generates a "sell list" of positions you hold that are no longer in the top four.
   - Generates a "buy list" of top-sentiment stocks that you don't yet own.
   - Calculates the total cash value from the "sell list" and determines the exact notional value to invest in each stock on the "buy list."
4. **Trade Execution**: The workflow first iterates through the "sell list" and executes a DELETE request to Alpaca for each, closing the positions. A Wait node pauses the workflow for two minutes to ensure the sell orders are filled and the account balance is updated. It then iterates through the "buy list," executing POST requests to Alpaca to purchase the new assets with the calculated funds.
5. **Record Keeping**: All executed orders (both buys and sells) are merged and logged in a dedicated Google Sheet, giving you a permanent and detailed transaction history.
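A simplified sketch of that decision engine, assuming the upstream data arrives as `positions`, `topFour`, and `sellValue` fields (names invented for illustration, not the template's exact code):

```javascript
// Hypothetical decision-engine sketch: build sell/buy lists and split the
// freed-up cash evenly across the new buys.
const { positions = [], topFour = [], sellValue = 0 } = $input.first().json;

const held = positions.map(p => p.symbol);

const sellList = held.filter(symbol => !topFour.includes(symbol));   // no longer top-ranked
const buyList = topFour.filter(symbol => !held.includes(symbol));    // top-ranked but not owned

// Notional dollar amount to invest in each new position.
const notionalPerBuy = buyList.length ? sellValue / buyList.length : 0;

return [{ json: { sellList, buyList, notionalPerBuy } }];
```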
## Nodes Used

- Schedule Trigger
- HttpRequest (Alpaca API)
- Google Sheets
- Code (JavaScript)
- SplitOut
- Wait
- Merge

This workflow is the perfect next step for anyone looking to take their AI analysis to the next level. Take the emotion out of your trading and let this bot systematically execute your data-driven strategy.
by Łukasz
## What is This?

This automation simulates the Scrum Master role in daily meetings. Essentially, it is an AI Scrum Master that draws on different sources of data: an intelligent support system for Scrum Masters that leverages data from Asana, Slack, and direct developer responses for comprehensive sprint status analysis and identification of areas requiring intervention. As such, it is useful for Scrum Masters (of course) as well as the Scrum Team, the Product Owner, and possibly the Business Owner.

## Who is it For?

This automation is designed for Agile teams to support the Scrum Master role by collecting and analyzing data from various sources to identify potential impediments and support the team in sprint delivery.

## How Does It Work?

The workflow has four main data entry points, launched either on click or on workdays.

First, it collects project section information from Asana. The automation retrieves the project structure, available sections, and their organization, allowing the AI to understand the team's work context.

Second, it gets recently modified tasks in the Asana project. The system tracks changes in tasks, their status, assignments, and updates to detect potential delays or issues.

Third, it obtains communication from the team's Slack channel. The flow collects data about recent conversations, discussion threads, and team communication to identify warning signals or areas requiring attention.

Fourth, it directly collects responses from developers about the current sprint - their progress, impediments, concerns, and support needs.

All collected data is passed to an AI model that analyzes it within the Scrum methodology context and identifies:

- Potential impediments in sprint delivery
- Areas requiring Scrum Master intervention
- Recommendations for team support
- Warning signals regarding Sprint Goal achievement

The output is pushed to a Slack channel, so it can potentially be used by another iteration of the same flow via the Slack channel history.
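To show how the four data branches can feed one AI prompt, here is a minimal Code-node sketch; the input field names are placeholders for whatever the Asana, Slack, and developer-answer branches actually produce:

```javascript
// Hypothetical sketch of the step that merges the four data sources into one
// prompt for the AI model. Field names are placeholders, not the template's own.
const { asanaSections = [], asanaTasks = [], slackMessages = [], devAnswers = [] } =
  $input.first().json;

const prompt = [
  'You are assisting a Scrum Master. Analyze the sprint status below.',
  '## Asana sections', JSON.stringify(asanaSections),
  '## Recently modified tasks', JSON.stringify(asanaTasks),
  '## Slack channel activity', JSON.stringify(slackMessages),
  '## Developer daily answers', JSON.stringify(devAnswers),
  'Identify impediments, areas needing intervention, recommendations, and Sprint Goal risks.',
].join('\n\n');

return [{ json: { prompt } }];
```

Adding another data source, as suggested in the notes below, would just mean appending one more labeled section to this prompt.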
## Requirements

- Asana OAuth credentials
- OpenAI (or an alternative AI provider) for processing the data
- A Slack app with the proper permissions:
  - channels:history
  - chat:write
  - groups:history
  - im:history
  - mpim:history
  - users.profile:write
  - users:write

## Configuration

- Set up the node "Asana Project and Slack Channel". Provide the Asana project ID and the Slack channel ID (optional).
- Set up the node "Get Scrum Master Answers". It holds the daily questions/answers that are sent to the channel.

## Alternative Use

You can remove the whole "Ask Users Daily ScrumMaster Questions" part if you don't want to run it similarly to daily Scrum standups. In that case, the whole flow essentially becomes a static analyzer of project status based on Slack and Asana.

## Extensions and Customizations

There are many possibilities to extend this automation depending on team needs. For example, you can add integration with additional project management tools, implement different notification schemes based on detected issue criticality, or adjust the data collection frequency to match the team's work rhythm.

## Disclaimers and Notes

The whole automation has one important assumption: the project is run on a single Slack channel and a single Asana board. Of course this can be extended, but that is beyond the currently designed scope. Adding new sources for the AI to analyze should be fairly easy - just add another branch of data and push it into the AI prompt.

This automation represents a proof-of-concept and should not replace an actual Scrum Master. The Scrum Master role extends far beyond data collection and analysis - it requires deep understanding of team dynamics, business context, and interpersonal skills. As Scrum.org emphasizes, the Scrum Master doesn't need to be present during the Daily Scrum; their role is to ensure the meeting happens, while the developers are responsible for conducting it. Mindlessly executing daily questions without proper context analysis can lead to situations where the Scrum Master becomes a team manager instead of a self-organization facilitator. A real Scrum Master analyzes much more data than what's collected by automation - they observe team dynamics, understand business context, identify deeper root causes of problems, and support the team in developing self-organization skills. AI can be a valuable support tool, but it cannot replace the human intuition, empathy, and experience essential in this role. The automation should be treated as a tool supporting the team's work, providing additional insights and helping identify areas requiring attention, but always under the supervision and interpretation of an experienced Scrum practitioner.