by Angel Menendez
**Who is this for?**
This workflow is designed for teams using Slack for communication and ServiceNow for incident management. It simplifies incident lookup by letting team members fetch incident details directly within Slack via a Slash Command.

**What problem is this workflow solving?**
Manually switching between Slack and ServiceNow to retrieve incident details is time-consuming and disrupts workflow efficiency. This workflow bridges the two platforms, providing instant access to critical incident information in Slack, saving time and improving response efficiency.

**What this workflow does**
The workflow listens for a Slash Command in Slack that includes an incident ID, extracts the ID from the incoming payload, queries ServiceNow for the corresponding incident details, and sends a formatted response back to Slack. Depending on the query result, it can:
- Display incident details (e.g., ID, description, severity, and priority).
- Notify the user if no matching incident is found.
- Alert the user if there is an issue connecting to ServiceNow.

**Setup**
Slack Setup:
- Create a Slash Command in Slack with the appropriate endpoint URL.
- Configure the command to send a POST request to the webhook endpoint of this workflow.
- For details on how to set up the Slack app using Slash Commands and n8n, check out this video.

ServiceNow Setup:
- Create or use an existing account with the necessary permissions to access incident data.
- Configure the ServiceNow node with your ServiceNow credentials.

n8n Workflow Activation:
- Deploy and activate the workflow in your n8n instance.
- Ensure all nodes are properly configured and connected.

**How to customize this workflow to your needs**
- **Modify Incident Query Parameters:** Adjust the query logic in the Search For Incident in ServiceNow node to include additional filters or data points based on your organization's needs.
- **Slack Response Customization:** Customize the Slack response template to display additional incident details or to match your team's tone and style.
- **Error Handling:** Enhance the error handling nodes to include more detailed logs or send alerts to a dedicated Slack channel.
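As a reference for the "extracts the ID from the incoming payload" step, here is a minimal sketch. Slack sends slash commands as `application/x-www-form-urlencoded` POSTs with the user's arguments in a `text` field; the `INC0000000` pattern assumed below is ServiceNow's default incident numbering and may need adjusting for your instance.

```typescript
// Sketch: extract a ServiceNow incident ID from a Slack slash-command payload.
// The INC\d{7} pattern is an assumption based on ServiceNow's default numbering.

interface SlashCommandPayload {
  command: string;      // e.g. "/incident"
  text: string;         // everything the user typed after the command
  user_name: string;
  response_url: string; // where the formatted reply can be posted
}

function extractIncidentId(payload: SlashCommandPayload): string | null {
  const match = payload.text.match(/INC\d{7}/i);
  return match ? match[0].toUpperCase() : null;
}

// Example with a hypothetical payload:
const sample: SlashCommandPayload = {
  command: "/incident",
  text: "INC0010023 please",
  user_name: "jane.doe",
  response_url: "https://hooks.slack.com/commands/T000/B000/XXXX",
};
console.log(extractIncidentId(sample)); // "INC0010023"
```

If no ID is found, the workflow can short-circuit to the "no matching incident" Slack reply instead of calling ServiceNow.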
by Yaron Been
Automated pipeline that exports technology stack data from BuiltWith to Google Sheets for analysis, reporting, and team collaboration.

🚀 What It Does
- Extracts technology stack data
- Organizes data in Google Sheets
- Updates automatically on schedule
- Supports multiple company tracking
- Enables easy data sharing

🎯 Perfect For
- Sales teams
- Market researchers
- Business analysts
- Competitive intelligence
- Technology consultants

⚙️ Key Benefits
✅ Centralized technology database
✅ Easy data analysis
✅ Team collaboration
✅ Historical tracking
✅ Custom reporting

🔧 What You Need
- BuiltWith API access
- Google account
- n8n instance
- Google Sheets setup

📊 Data Exported
- Company information
- Web technologies
- Hosting details
- Analytics tools
- Marketing technologies
- Contact information

🛠️ Setup & Support
Quick Setup: start exporting in 15 minutes with our step-by-step guide.
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help

Transform raw technology data into actionable business intelligence with automated exports.
by kapio
How it Works:
- **Capture Contact Requests:** This template efficiently handles contact requests coming through a WordPress website using the Contact Form 7 (CF7) plugin with a webhook extension.
- **Contact Management:** It automatically creates or updates contacts in Pipedrive upon receiving a new request.
- **Lead Management:** Each contact request is securely stored in the lead inbox of Pipedrive, ensuring no opportunity is missed.
- **Task Creation:** For each new contact or update, the workflow triggers the creation of a related task, streamlining follow-up actions.
- **Note Attachment:** A comprehensive note containing all details from the contact request is attached to the corresponding lead, ensuring that all information is readily accessible.

Step-by-Step Guide:
Estimated Setup Time: The setup process is straightforward and can be completed quickly. Specific time may vary depending on your familiarity with n8n and the systems involved.
Detailed setup instructions are provided within the workflow via sticky notes. These notes offer in-depth guidance for configuring each component of the template to suit your specific needs.
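To make the CF7-to-Pipedrive mapping concrete, here is a minimal sketch. The field names ("your-name", "your-email", "your-message") are CF7's default form tags, and the exact payload shape depends on which webhook extension you use, so treat this mapping as an assumption to adapt.

```typescript
// Sketch: map a Contact Form 7 webhook payload to the values the Pipedrive
// person/lead/note steps need. Payload keys are CF7 defaults (assumed).

interface Cf7Payload {
  "your-name": string;
  "your-email": string;
  "your-message": string;
}

function toPipedriveFields(body: Cf7Payload) {
  return {
    person: { name: body["your-name"], email: body["your-email"] },
    lead: { title: `Website request from ${body["your-name"]}` },
    note: { content: `Contact form message:\n${body["your-message"]}` },
  };
}

// Example usage:
console.log(
  toPipedriveFields({
    "your-name": "Max Mustermann",
    "your-email": "max@example.com",
    "your-message": "Please send me a quote.",
  })
);
```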
by Marketing Canopy
Automate Sports Betting Data with TheOddsAPI

This workflow enables you to create and update a table using TheOddsAPI for sports betting data. It automatically pulls upcoming Ice Hockey games at the start of the day and updates the table with results at the end of the day. You can modify it to retrieve odds and game data for any sport. This setup is particularly useful for sports betting applications, such as tracking the results of a predictive model. It leverages scheduled triggers to activate HTTP requests, which then create or update fields in Airtable by matching on the game ID.

Prerequisites
Before implementing this workflow, ensure you have the following:
- TheOddsAPI Account & API Key: Sign up at TheOddsAPI and obtain an API key. Ensure you have the correct API permissions to access sports odds and results.
- Airtable Account & API Key: Create an account at Airtable and set up a database. Obtain an API key from the Account Settings page.
- API Access & Rate Limits: Review TheOddsAPI's rate limits and ensure your account tier allows for scheduled API calls. Confirm that Airtable API limits align with your expected data retrieval frequency.

Step-by-Step Guide to Integrating TheOddsAPI
1. Schedule API Requests: Set up a trigger to automatically pull upcoming Ice Hockey games at the start of each day.
2. Fetch Data from TheOddsAPI: Retrieve the latest sports betting data, including game details and odds, using TheOddsAPI.
3. Store Data in Airtable: Insert or update records in Airtable by matching game IDs, ensuring data accuracy.

Sample Airtable Template Column Setup for Ice Hockey (the table can be adjusted depending on the sport and your data needs; see TheOddsAPI documentation for details):
- **Game ID**
- **Sport**
- **League**
- **Game Date (UTC)**
- **Home Team**
- **Away Team**
- **Completed** (Boolean: TRUE/FALSE for game completion status)
- **Scores** (JSON or String for final scores)
- **Last Update** (Timestamp of the latest update)

4. Schedule an End-of-Day Update: Configure another trigger to fetch final game results at the end of the day.
5. Update Records in Airtable: Modify existing Airtable records with final scores and game outcomes for complete tracking.
6. Customize for Other Sports: Adjust API parameters to retrieve data for different sports and betting odds, making the system flexible for multiple use cases.

This structured workflow automates sports betting data collection and updates, ensuring accurate and real-time tracking of odds and game results. By integrating TheOddsAPI with Airtable, you can build scalable applications for predictive sports analytics and betting insights.
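The mapping from an API record to the Airtable columns above can be sketched as follows. The field names mirror TheOddsAPI's v4 scores response as I understand it, and `icehockey_nhl` is the assumed sport key; verify both against the API documentation before relying on them.

```typescript
// Sketch: map one TheOddsAPI scores record onto the Airtable columns listed
// above. Upsert into Airtable by matching on "Game ID".

interface OddsApiScore {
  id: string;
  sport_key: string;                 // e.g. "icehockey_nhl" (assumed)
  sport_title: string;
  commence_time: string;             // ISO timestamp, UTC
  home_team: string;
  away_team: string;
  completed: boolean;
  scores: { name: string; score: string }[] | null;
  last_update: string | null;
}

function toAirtableRecord(game: OddsApiScore) {
  return {
    "Game ID": game.id,
    "Sport": game.sport_title,
    "League": game.sport_key,
    "Game Date (UTC)": game.commence_time,
    "Home Team": game.home_team,
    "Away Team": game.away_team,
    "Completed": game.completed,
    "Scores": game.scores ? JSON.stringify(game.scores) : "",
    "Last Update": game.last_update ?? "",
  };
}
```

The morning run typically leaves "Completed" false and "Scores" empty; the end-of-day run fills them in on the same record.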
by Yaron Been
🔥 AI Lead Scoring Agent: Smart Contact Form Triager

Automatically score every contact form lead as Hot/Warm/Cold and alert your sales team instantly. This intelligent workflow captures contact form submissions, uses GPT-4 to analyze message content and score lead quality, then sends formatted alerts to Slack - ensuring your sales team always focuses on the hottest prospects first.

🚀 What It Does
- Instant Lead Capture: Automatically receives contact form submissions via webhook endpoint
- AI-Powered Scoring: GPT-4 analyzes message content and classifies leads as Hot 🔥, Warm 🌤, or Cold ❄️
- Smart Data Extraction: Cleanly extracts name, email, and message from form submissions
- Real-Time Slack Alerts: Sends formatted notifications to your sales team with lead details and AI scoring

🎯 Key Benefits
✅ Never Miss Hot Prospects: AI identifies urgent leads automatically
✅ Save Sales Time: Focus effort on highest-probability leads first
✅ Instant Team Alerts: Real-time notifications in Slack channels
✅ Smart Prioritization: AI scoring eliminates guesswork in lead quality
✅ Zero Manual Work: Complete automation from form to sales alert
✅ Universal Integration: Works with any contact form or landing page

🏢 Perfect For
Sales & Marketing Teams
- SaaS companies managing inbound leads
- Service businesses qualifying prospects
- E-commerce stores identifying serious buyers
- Agencies prioritizing client inquiries

Business Applications
- **Lead Qualification**: Identify purchase-ready prospects instantly
- **Sales Efficiency**: Focus team effort on highest-value opportunities
- **Response Prioritization**: Handle urgent inquiries first
- **Team Coordination**: Keep entire sales team informed of new leads

⚙️ What's Included
- Complete Workflow: Ready-to-deploy lead scoring automation
- Webhook Endpoint: Receives submissions from any contact form
- AI Classification: GPT-4 powered lead interest analysis
- Slack Integration: Professional team notifications with emojis and formatting
- Data Processing: Clean extraction and formatting of lead information

🔧 Quick Setup Requirements
- **n8n Platform**: Cloud or self-hosted instance
- **OpenAI API**: GPT-4 access for lead scoring
- **Slack Workspace**: Team channel for lead notifications
- **Contact Form**: Any form that can POST to the webhook endpoint

📱 Sample Slack Alert
🔥 New Lead: Sarah Johnson (sarah@techstartup.com)
Message: "We're looking for a project management solution for our 50-person team. Need to implement ASAP as we're scaling fast. Can we schedule a demo this week?"
Triage: 🔥 Hot

❄️ New Lead: John Smith (john@email.com)
Message: "Just browsing your website. Might be interested in learning more someday."
Triage: ❄️ Cold

🎨 Customization Options
- Scoring Criteria: Adjust AI prompts for industry-specific lead qualification
- Team Channels: Route different lead types to specific Slack channels
- Additional Fields: Capture company size, budget, timeline data
- CRM Integration: Connect to Salesforce, HubSpot, or Pipedrive
- Follow-up Automation: Trigger email sequences based on lead temperature
- Analytics Tracking: Monitor lead quality trends and conversion rates

🏷️ Tags & Categories
#lead-scoring #sales-automation #contact-form-processing #ai-qualification #slack-integration #prospect-management #inbound-marketing #sales-productivity #lead-generation #openai-integration #webhook-automation #crm-automation #sales-alerts #lead-triage #ai-agent

💡 Use Case Examples
- SaaS Company: Score demo requests based on company size and urgency mentions
- Consulting Firm: Identify clients ready to start projects vs those still researching
- E-commerce Store: Spot bulk buyers and wholesale inquiries vs casual browsers
- Marketing Agency: Prioritize clients with specific budgets and timelines mentioned

📈 Expected Results
- **70% faster** lead response times through smart prioritization
- **3x higher** conversion rates focusing on Hot leads first
- **50% time savings** on manual lead qualification
- **100% lead coverage** - never miss or ignore a prospect again

🛠️ Setup & Support
- 5-Minute Setup: Simple webhook configuration with any contact form
- Universal Integration: Works with WordPress, Webflow, custom forms, landing pages
- Team Training: Clear Slack notification format anyone can understand
- Scalable: Handles unlimited form submissions automatically

📞 Get Help & Resources
YouTube: https://www.youtube.com/@YaronBeen/videos
💼 Sales Automation Support
LinkedIn: https://www.linkedin.com/in/yaronbeen/
📧 Direct Help Email: Yaron@nofluff.online - Response within 24 hours

Ready to never miss another hot lead? Get this AI Lead Scoring Agent and transform your contact forms into intelligent lead qualification systems. Your sales team will always know which prospects to call first, and you'll never waste time on cold leads again.

Stop treating all leads equally. Start prioritizing the ones ready to buy.
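For reference, the data-shaping steps around the GPT-4 call described above can be sketched like this. The webhook field names (name, email, message) and the alert layout are assumptions; adapt them to your form and channel style.

```typescript
// Sketch: build the scoring prompt from a form submission and format the
// Slack alert from the model's one-word answer.

interface FormSubmission {
  name: string;
  email: string;
  message: string;
}

function buildScoringPrompt(lead: FormSubmission): string {
  return [
    "Classify this inbound lead as Hot, Warm, or Cold.",
    "Hot = clear buying intent or urgency, Warm = interest without urgency,",
    "Cold = browsing or irrelevant. Reply with a single word.",
    `Message from ${lead.name} <${lead.email}>:`,
    lead.message,
  ].join("\n");
}

function formatSlackAlert(lead: FormSubmission, triage: "Hot" | "Warm" | "Cold"): string {
  const emoji = { Hot: "🔥", Warm: "🌤", Cold: "❄️" }[triage];
  return `${emoji} New Lead: ${lead.name} (${lead.email})\nMessage: "${lead.message}"\nTriage: ${emoji} ${triage}`;
}
```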
by Yang
Who is this for?
This workflow is for social media agencies, influencer marketers, and brand managers who need to automatically qualify TikTok creators based on their follower metrics. It's especially useful for teams managing influencer outreach campaigns or building talent databases.

What problem is this workflow solving?
Manually tracking TikTok user stats is time-consuming and inconsistent. This automation instantly pulls TikTok profile data and only saves creators who meet a defined follower threshold. It removes manual vetting, reduces spreadsheet work, and makes influencer qualification scalable.

What this workflow does
This workflow uses Airtable as the trigger, Dumpling AI to scrape TikTok profile information, and a logic condition to check if the profile has more than 100k followers. Qualified profiles are updated with full metrics and stored back in Airtable.

Setup
Airtable Setup
- Create a table with a field named Tik tok username
- Connect your Airtable account to n8n using a Personal Access Token
- Set up a trigger to run when a new TikTok username is added

Dumpling AI
- Sign up at Dumpling AI
- Create a Dumpling AI credential in n8n using your API key
- The HTTP node sends the TikTok handle to Dumpling's /get-tiktok-profile endpoint

Configure Filter
- The IF node checks if followerCount is greater than or equal to 100000

Airtable Update
- If qualified, the record is updated with: ID (TikTok ID), followerCount, followingCount, heartCount, videoCount

How to customize this workflow to your needs
- Change the follower count threshold to fit your campaign (e.g. 10K, 500K, 1M)
- Add fields like engagement rate, niche tags, or scraped bio
- Chain additional steps like sending approved creators to your CRM or triggering outreach messages
- Add another filter to exclude private or inactive accounts
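A minimal sketch of the qualification check and field mapping is below. The property names mirror the fields listed above; the exact shape of Dumpling AI's /get-tiktok-profile response is an assumption, so adjust the property paths to the real payload.

```typescript
// Sketch: qualify a scraped TikTok profile and map it to the Airtable fields.

interface TikTokProfile {
  id: string;
  followerCount: number;
  followingCount: number;
  heartCount: number;
  videoCount: number;
}

const FOLLOWER_THRESHOLD = 100_000; // change to 10K, 500K, 1M, ... as needed

function qualify(profile: TikTokProfile) {
  if (profile.followerCount < FOLLOWER_THRESHOLD) return null; // not qualified
  return {
    ID: profile.id,
    followerCount: profile.followerCount,
    followingCount: profile.followingCount,
    heartCount: profile.heartCount,
    videoCount: profile.videoCount,
  };
}

// Example: a 250k-follower creator passes, a 40k one returns null.
console.log(qualify({ id: "123", followerCount: 250_000, followingCount: 12, heartCount: 9_000_000, videoCount: 310 }));
```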
by Anthony
What this workflow does
LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON list of Chrome extension IDs, extracted from LinkedIn pages, and builds a tidy Google Sheet listing these extensions. For each extension ID, it scrapes Google search results and extracts the first result.

Setup
- Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
- Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
- Create an n8n header auth credential for the Google SERP API

Some context and discussion
https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/
Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
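The per-extension lookup can be pictured as a small helper. The response shape here is a generic placeholder; the actual JSON returned by the RapidAPI SERP endpoint will differ, so map the real field names accordingly.

```typescript
// Sketch: build the search query for one Chrome extension ID and pull the
// first organic result out of a SERP-style response (assumed shape).

interface SerpResult {
  title: string;
  link: string;
}

function buildQuery(extensionId: string): string {
  // Searching for the raw ID usually surfaces the Chrome Web Store listing.
  return `chrome extension ${extensionId}`;
}

function firstResult(results: SerpResult[]): SerpResult | undefined {
  return results[0];
}

// Example usage with a hypothetical result list:
const top = firstResult([
  { title: "Some Extension - Chrome Web Store", link: "https://chromewebstore.google.com/detail/abcdefghijklmnop" },
]);
console.log(buildQuery("abcdefghijklmnop"), top?.link);
```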
by Jaruphat J.
Who is this for?
This workflow is ideal for businesses, accountants, and finance teams who receive bank slip images via LINE and want to automate the extraction of transaction details. It eliminates manual data entry and speeds up financial tracking.

What problem does this workflow solve?
Many businesses receive bank transfer slips via LINE from customers, but manually recording transaction details into spreadsheets is time-consuming and error-prone. This workflow automates the entire process, extracting structured data from the bank slips and storing it in Google Sheets for seamless record-keeping.

What this workflow does:
- Receives bank slip images from a LINE BOT
- Extracts transaction details (sender, receiver, amount, transaction ID) using SpaceOCR
- Automatically logs extracted data into Google Sheets
- Works with standard bank slips and PromptPay transactions
- Eliminates manual data entry and reduces errors

Setup Instructions:
1. Prerequisites
- A LINE BOT with Messaging API enabled
- A SpaceOCR API key (get one from https://spaceocr.com/)
- A Google Sheets account to store extracted data
- An n8n instance running (Cloud or self-hosted)

2. Set up Google Sheets
Create a Google Sheet with the following columns:
A (Date), B (Time), C (Sender), D (Receiver), E (Bank Name), F (Amount), G (Transaction ID)
Ensure your Google Sheets API is enabled and connected to n8n. For an example of the required format, check this Google Sheets template: Google Sheets Template

3. Configure the n8n workflow
1. Webhook node (receives the bank slip from the LINE BOT): set the method and path.
2. HTTP Request node (downloads the image from the LINE message): retrieves the image URL from the LINE message payload.
3. SpaceOCR node (extracts text from the bank slip): set the input and your API key.
4. Google Sheets node (saves the transaction data): select your Google Sheet and map the extracted data (sender, receiver, amount, etc.) to the respective columns.

4. Deploy & Test
- Activate the workflow in n8n
- Set the Webhook URL in the LINE Developer Console
- Send a test bank slip image to the LINE BOT
- Check Google Sheets for the extracted transaction data
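A minimal sketch of the image-download URL and the column mapping is below. The content endpoint is the standard LINE Messaging API one; the OCR field names are placeholders, since the structure returned by SpaceOCR depends on your setup.

```typescript
// Sketch: derive the LINE content-download URL from the incoming webhook and
// map OCR output onto the sheet columns A-G described above.

interface LineWebhookEvent {
  message: { id: string; type: string };
}

function imageContentUrl(event: LineWebhookEvent): string {
  // Requires the channel access token in an "Authorization: Bearer ..." header.
  return `https://api-data.line.me/v2/bot/message/${event.message.id}/content`;
}

// Assumed OCR output shape -- adapt the keys to what SpaceOCR actually returns.
interface SlipFields {
  date: string;
  time: string;
  sender: string;
  receiver: string;
  bank: string;
  amount: number;
  transactionId: string;
}

function toSheetRow(slip: SlipFields) {
  return {
    Date: slip.date,
    Time: slip.time,
    Sender: slip.sender,
    Receiver: slip.receiver,
    "Bank Name": slip.bank,
    Amount: slip.amount,
    "Transaction ID": slip.transactionId,
  };
}
```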
by n8n custom workflows
Introduction
The Namesilo Bulk Domain Availability workflow is a powerful automation solution designed to check the registration status of multiple domains simultaneously using the Namesilo API. This workflow efficiently processes large lists of domains by splitting them into manageable batches, adhering to API rate limits, and compiling the results into a convenient Excel spreadsheet. It eliminates the tedious process of manually checking domains one by one, saving significant time for domain investors, web developers, and digital marketers.

The workflow is particularly valuable during brainstorming sessions for new projects, when conducting domain portfolio audits, or when preparing domain acquisition strategies. By automating the domain availability check process, users can quickly identify available domains for registration without the hassle of navigating through multiple web interfaces.

Who is this for?
This workflow is ideal for:
- Domain investors and flippers who need to check multiple domains quickly
- Web developers and agencies evaluating domain options for client projects
- Digital marketers researching domain availability for campaigns
- Business owners exploring domain options for new ventures
- IT professionals managing domain portfolios

Users should have basic familiarity with n8n workflow concepts and a Namesilo account to obtain an API key. No coding knowledge is required, though an understanding of domain name systems is beneficial.

What problem is this workflow solving?
Checking domain availability one by one is a time-consuming and tedious process, especially when dealing with dozens or hundreds of potential domains. This workflow solves several key challenges:
- Manual Inefficiency: Eliminates the need to individually search for each domain through registrar websites.
- Rate Limiting: Handles API rate limits automatically with built-in waiting periods.
- Data Organization: Compiles availability results into a structured Excel file rather than scattered notes or multiple browser tabs.
- Bulk Processing: Processes up to 200 domains per batch, with the ability to handle unlimited domains across multiple batches.
- Time Management: Frees up valuable time that would otherwise be spent on repetitive manual checks.

What this workflow does
Overview
The workflow takes a list of domains, processes them in batches of up to 200 domains per request (to comply with API limitations), checks their availability using the Namesilo API, and compiles the results into an Excel spreadsheet showing which domains are available for registration and which are already taken.

Process
1. Input Setup: The workflow begins with a manual trigger and uses the "Set Data" node to collect the list of domains to check and your Namesilo API key.
2. Domain Processing: The "Convert & Split Domains" node transforms the input list into batches of up to 200 domains to comply with API limitations.
3. Batch Processing: The workflow loops through each batch of domains.
4. API Integration: For each batch, the "Namesilo Requests" node sends a request to the Namesilo API to check domain availability.
5. Data Parsing: The "Parse Data" node processes the API response, extracting information about which domains are available and which are taken.
6. Rate Limit Management: A 5-minute wait period is enforced between batches to respect Namesilo's API rate limits.
7. Data Compilation: The "Merge Results" node combines all the availability data.
8. Output Generation: Finally, the "Convert to Excel" node creates an Excel file with two columns: Domain and Availability (showing "Available" or "Unavailable" for each domain).

Setup
1. Import the workflow: Download the workflow JSON file and import it into your n8n instance.
2. Get Namesilo API key: Create a free account at Namesilo and obtain your API key from https://www.namesilo.com/account/api-manager
3. Configure the workflow:
   - Open the "Set Data" node
   - Enter your Namesilo API key in the "Namesilo API Key" field
   - Enter your list of domains (one per line) in the "Domains" field
4. Save and activate: Save the workflow and run it using the manual trigger.

How to customize this workflow to your needs
- **Modify domain input format**: You can adjust the code in the "Convert & Split Domains" node if your domain list comes in a different format.
- **Change batch size**: If needed, you can modify the batch size (currently set to 200) in the "Convert & Split Domains" node to accommodate different API limitations.
- **Adjust wait time**: If you have a premium API account with different rate limits, you can modify the wait time in the "Wait" node.
- **Enhance output format**: Customize the "Convert to Excel" node to add additional columns or formatting to the output file.
- **Add domain filtering**: You could add a node before the API request to filter domains based on specific criteria (length, keywords, TLDs).
- **Integrate with other services**: Connect this workflow to domain registrars to automatically register available domains that meet your criteria.
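For orientation, here is a minimal sketch of the batching logic performed by the "Convert & Split Domains" node and the request URL it feeds. The checkRegisterAvailability operation and its query parameters follow Namesilo's public API as I recall it; double-check the parameter names in your account's API manager.

```typescript
// Sketch: split a newline-separated domain list into batches of 200 and
// build one availability-check URL per batch.

function splitIntoBatches(domainList: string, batchSize = 200): string[][] {
  const domains = domainList
    .split("\n")
    .map((d) => d.trim())
    .filter((d) => d.length > 0);
  const batches: string[][] = [];
  for (let i = 0; i < domains.length; i += batchSize) {
    batches.push(domains.slice(i, i + batchSize));
  }
  return batches;
}

function availabilityUrl(apiKey: string, batch: string[]): string {
  return (
    "https://www.namesilo.com/api/checkRegisterAvailability" +
    `?version=1&type=xml&key=${apiKey}&domains=${batch.join(",")}`
  );
}

// With 450 domains this yields three batches (200, 200, 50), each sent as one
// request with a 5-minute wait in between.
console.log(splitIntoBatches("example.com\nexample.net\nexample.org").length);
```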
by phil
This workflow automates the backup of your n8n workflows data to Google Drive every day. It ensures that important configurations and execution logs are securely stored, reducing the risk of data loss and improving workflow resilience.

🔹 Why Use This?
✅ Automates routine backups effortlessly.
✅ Reduces manual intervention and potential data loss.
✅ Securely stores critical workflow configurations in Google Drive.
With this workflow, you can focus on innovation while n8n takes care of your backups. 🔐✨

🚀 How It Works
This workflow operates seamlessly with a combination of scheduled triggers, JSON data transformation, and secure cloud storage.

🛠 Setup Steps
1. Trigger the backup – Choose between manual execution or automated scheduling at 1:30 AM daily.
2. Data preparation – Your workflow parameters define the backup location and organize files effectively.
3. Transformation & Encoding – The data is processed and converted into a JSON file in base64 format.
4. Cloud Storage – The backup is securely uploaded to your designated Google Drive folder.

🔧 Customization Options
You can modify various aspects of the backup workflow to better suit your needs:

1️⃣ Adjusting Backup Frequency
By default, the workflow runs daily at 1:30 AM. To change this:
- Open the Trigger Node in n8n.
- Modify the Cron Expression or select a different frequency (e.g., hourly, weekly, or custom intervals).

2️⃣ Selecting Specific Workflows to Back Up
Instead of backing up all workflows, you can filter which ones to include:
- Add a Filter Node before exporting data.
- Define specific workflow IDs or names to include in the backup.

3️⃣ Changing the Backup Destination
The default destination is Google Drive, but you can change this:
- Replace the Google Drive Node with a different storage provider (e.g., Dropbox, AWS S3, or local storage via FTP/SFTP).
- Configure authentication for the new destination.

4️⃣ Modifying the Data Format
By default, the workflow stores data in JSON format. If you need a different format:
- Convert JSON to CSV using the Spreadsheet File Node.
- Store backups in a compressed format (ZIP) by adding a Compression Node.

5️⃣ Encrypting the Backup for Extra Security
For added protection:
- Use the Crypto Node to encrypt the JSON file before uploading.
- Set up an access-controlled folder in Google Drive with limited permissions.

✅ Verify That Your Backup Works
Before relying on this workflow for your automated backups, make sure it works correctly by performing a quick test:
- Manually trigger the workflow in n8n and check whether the backup file appears in your Google Drive.
- Open Google Drive, navigate to the backup folder, and download the JSON file.
- Verify its content by checking that the data matches your workflow's execution logs.
- Try importing the JSON file back into n8n using the "Import File" function to ensure the workflow structure is intact.
- Alternatively, copy and paste a test file into Google Drive and confirm that it appears correctly in your workflow logs.
This quick test will confirm that your backup is running smoothly and that your data is retrievable whenever needed.

📁 How to Find Your Google Drive Directory ID
To ensure that the backup is uploaded to the correct folder, you need to retrieve your Google Drive Directory ID. Follow these simple steps:
1. Open Google Drive.
2. Navigate to the folder where you want to store your backups.
3. Click on the folder and check the URL in your browser.
4. The Directory ID is the long string of characters at the end of the URL after /folders/.
Example:
📌 If your folder URL is:
https://drive.google.com/drive/folders/14oUlH_LW_NT0Xb2woZWvuzRncV-bhla
Then your Directory ID is:
14oUlH_LW_NT0Xb2woZWvuzRncV-bhla
Copy this Directory ID and use it in the workflow's parameters to ensure the backup is saved in the correct location.

Phil | Inforeole
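For reference, the "Transformation & Encoding" step described above can be sketched as follows. The file-naming convention is an assumption; use whatever suits your backup folder.

```typescript
// Sketch: serialize exported workflow data to JSON and base64-encode it so it
// can be attached as a binary file for the Google Drive upload.

function toBackupFile(workflows: unknown[]): { fileName: string; base64: string } {
  const stamp = new Date().toISOString().slice(0, 10); // e.g. "2024-05-01"
  const json = JSON.stringify(workflows, null, 2);
  return {
    fileName: `n8n-backup-${stamp}.json`,
    base64: Buffer.from(json, "utf-8").toString("base64"),
  };
}

// Example usage:
const file = toBackupFile([{ name: "Demo workflow", nodes: [], connections: {} }]);
console.log(file.fileName, file.base64.slice(0, 20) + "...");
```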
by Agent Studio
Automatically store Retell transcripts in Google Sheets/Airtable/Notion from a webhook

Overview
This workflow stores the results of a Retell voice call (transcript, analysis, etc.) once it has ended and been analyzed. It listens for call_analyzed webhook events from Retell and stores the data in Airtable, Google Sheets, and Notion (choose based on your stack). Useful for anyone building Retell agents who wants to keep a detailed history of analyzed calls in structured tools.

Who is it for
Builders of Retell's Voice Agents who want to store call history and essential analytic data.

Prerequisites
- Have a Retell AI account
- Create a Retell agent
- Associate a phone number with your Retell agent
- Set up one of the following:
  - An Airtable base and table (example: "Transcripts")
  - A Google Sheet with a "Transcripts" tab
  - A Notion database with columns to match the transcript fields
- Templates: Airtable, Google Sheets, Notion

How it works
1. Receives a webhook POST request from Retell when a call has been analyzed.
2. Filters out any event that is not call_analyzed (Retell sends webhooks for call_started, call_ended, and call_analyzed).
3. Extracts useful fields such as: Call ID, start/end time, duration, total cost; transcript, summary, sentiment.
4. Stores this data in your preferred tool: Airtable, Google Sheets, or Notion.

How to use it
1. Copy the webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/poc-retell-analysis) and paste it in your Retell agent under "Webhook settings", then "Agent Level Webhook URL".
2. Make sure your Airtable, Google Sheets, or Notion databases are correctly configured to receive the fields.
3. After each call, once Retell finishes the analysis, this workflow will automatically log the results.

Extension
If you use any "Post-Call Analysis" fields, you can add columns to your Airtable, Google Sheets, or Notion database, then fetch the data from the call.call_analysis.custom_analysis_data object.

Additional Notes
- Phone numbers are extracted depending on the call direction (from_number or to_number).
- Cost is converted from cents to dollars before saving.
- Dates are converted from timestamps to local ISO strings.
- You can remove any of the outputs (Airtable, Google Sheets, Notion) if you're only using one.

👉 Reach out to us if you're interested in analysing your Retell Agent conversations.
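The filtering and extraction described above can be pictured as a small function. The property names below are assumptions based on the fields this workflow stores (cost in cents, millisecond timestamps, summary and sentiment under call_analysis); verify them against your Retell webhook logs before mapping to your Airtable/Sheets/Notion columns.

```typescript
// Sketch: keep only call_analyzed events and shape the fields for storage.

interface RetellWebhook {
  event: "call_started" | "call_ended" | "call_analyzed";
  call: {
    call_id: string;
    direction: "inbound" | "outbound";
    from_number: string;
    to_number: string;
    start_timestamp: number;   // ms since epoch (assumed)
    end_timestamp: number;     // ms since epoch (assumed)
    transcript: string;
    call_cost?: { combined_cost: number };                          // cents (assumed)
    call_analysis?: { call_summary: string; user_sentiment: string }; // assumed names
  };
}

function extractRow(body: RetellWebhook) {
  if (body.event !== "call_analyzed") return null; // ignore other events
  const c = body.call;
  return {
    callId: c.call_id,
    phone: c.direction === "inbound" ? c.from_number : c.to_number,
    startedAt: new Date(c.start_timestamp).toISOString(),
    endedAt: new Date(c.end_timestamp).toISOString(),
    durationSec: Math.round((c.end_timestamp - c.start_timestamp) / 1000),
    costUsd: (c.call_cost?.combined_cost ?? 0) / 100, // cents -> dollars
    transcript: c.transcript,
    summary: c.call_analysis?.call_summary ?? "",
    sentiment: c.call_analysis?.user_sentiment ?? "",
  };
}
```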
by Pavel Duchovny
Building agentic AI workflows often requires multiple moving parts: memory management, document retrieval, vector similarity, and orchestration. Until now, these pieces had to be custom-wired. But with the new native n8n nodes for MongoDB Atlas, we reduce that overhead dramatically.

With just a few clicks you can:
- Store and recall long-term memory from MongoDB
- Query vector embeddings stored in Atlas Vector Search
- Use these results in your LLM chains and automation logic

In this example we present ingestion and AI Agent flows focused on travel planning. The points of interest that we want the agent to know about are ingested into the vector store. The AI Agent uses the vector store tool to fetch relevant context about those points of interest when it needs to.

Prerequisites
- MongoDB Atlas project and cluster
- Valid OpenAI API key for embeddings (can be another provider)
- Gemini API key for the LLM (can be another provider)

How it works
There are two main flows.

The first is the ingestion flow:
- Receives a document from a webhook and uses the MongoDB Atlas vector store to embed the document title and description into the points_of_interest collection.
- Embeddings are stored in a field named embedding.
- The embeddings used are OpenAI's, but any supported embedder can be used.

The second flow is an AI Agent node with chat memory stored in MongoDB Atlas and a Vector Search node as a tool:
- Chat Message Trigger: chatting with the AI Agent triggers the conversation store in the MongoDB Chat Memory node. When data is needed, such as a location search or details, the agent calls the "Vector Search" tool.
- Vector Search Tool: uses an Atlas Vector Search index created on the points_of_interest collection:

// index name: "vector_index"
// If you change the embedding provider, make sure numDimensions matches the model.
{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 1536,
      "similarity": "cosine"
    }
  ]
}

Additional Resources
- MongoDB Atlas Vector Search
- n8n Atlas Vector Search docs
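To tie the ingestion flow to the index definition above, here is a sketch of the document shape stored in points_of_interest. Only the embedding field and its 1536 dimensions are dictated by the index; the other field names are assumptions based on the description.

```typescript
// Sketch: a document as the ingestion flow might write it, with the embedding
// length matching numDimensions in the Atlas Vector Search index (1536).

interface PointOfInterest {
  title: string;
  description: string;
  embedding: number[]; // length must equal numDimensions in the index
}

const doc: PointOfInterest = {
  title: "Sagrada Família",
  description: "Gaudí's landmark basilica in Barcelona; book tickets ahead.",
  embedding: new Array(1536).fill(0), // replaced by the real OpenAI embedding at ingestion time
};

console.log(doc.title, doc.embedding.length); // 1536
```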