by Airtop
## Automating Company Data Enrichment and ICP Calculation

### Use Case

This automation identifies a company's LinkedIn profile, extracts key business data, and calculates an ICP (Ideal Customer Profile) score to qualify and enrich company records. It is perfect for sales teams, data enrichment pipelines, and CRM integrations.

### What This Automation Does

**Input Parameters**
- **Company domain**: The company's website domain (e.g., example.com).
- **Airtop Profile (connected to LinkedIn)**: Your Airtop Profile authenticated for LinkedIn.
- **Company LinkedIn** *(optional)*: If already known, skips the search step.

**Output Includes**
- Verified LinkedIn company URL (if not provided)
- Company profile (name, tagline, website, location, about)
- Scale metrics (employee count and bracket)
- Classification (automation agency status, AI focus, technical level)
- ICP score with justifications
- Structured JSON object with all values merged

### How It Works

1. **LinkedIn Detection**: If not provided, attempts to locate the LinkedIn URL using website scraping or search.
2. **Data Extraction**: Uses Airtop to gather structured data from the company's LinkedIn profile.
3. **ICP Scoring**: Applies a scoring rubric based on AI/tech orientation, scale, agency status, and geography (see the sketch below).
4. **Merge Results**: All data components are merged into a unified output.

### Setup Requirements

- Airtop API Key
- Airtop Profile with LinkedIn authentication

### Next Steps

- **Combine with Person Enrichment**: Pair with workflows that enrich individuals tied to the company.
- **Sync to CRM**: Connect the output to your CRM for record enrichment or scoring fields.
- **Adjust ICP Scoring Logic**: Modify the rubric for your organization's ICP model.

Read more about company data enrichment and ICP scoring.
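To make the ICP scoring step more concrete, below is a minimal sketch of what a rubric like the one described could look like. The field names, point weights, and target geographies are illustrative assumptions, not the workflow's actual configuration.

```typescript
// Hypothetical shape of the enriched company record; all field names are assumptions.
interface CompanyRecord {
  name: string;
  employeeCount: number;
  isAutomationAgency: boolean;
  aiFocus: boolean;
  country: string;
}

// Each dimension contributes points; justifications explain the score.
function scoreIcp(c: CompanyRecord): { icpScore: number; justifications: string[] } {
  let icpScore = 0;
  const justifications: string[] = [];
  if (c.aiFocus) {
    icpScore += 40;
    justifications.push("Company has an explicit AI focus");
  }
  if (c.isAutomationAgency) {
    icpScore += 30;
    justifications.push("Operates as an automation agency");
  }
  if (c.employeeCount >= 10 && c.employeeCount <= 200) {
    icpScore += 20;
    justifications.push("Employee count falls in the target bracket (10-200)");
  }
  if (["US", "GB", "DE"].includes(c.country)) {
    icpScore += 10;
    justifications.push("Located in a target geography");
  }
  return { icpScore, justifications };
}
```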
by Marketing Canopy
## Automate Sports Betting Data with TheOddsAPI

This workflow enables you to create and update a table using TheOddsAPI for sports betting data. It automatically pulls upcoming Ice Hockey games at the start of the day and updates the table with results at the end of the day. You can modify it to retrieve odds and game data for any sport.

This setup is particularly useful for sports betting applications, such as tracking the results of a predictive model. It leverages scheduled triggers to activate HTTP requests, which then create or update fields in Airtable by matching on the game ID.

### Prerequisites

Before implementing this workflow, ensure you have the following:

- **TheOddsAPI account & API key**: Sign up at TheOddsAPI and obtain an API key. Ensure you have the correct API permissions to access sports odds and results.
- **Airtable account & API key**: Create an account at Airtable and set up a database. Obtain an API key from the Account Settings page.
- **API access & rate limits**: Review TheOddsAPI's rate limits and ensure your account tier allows for scheduled API calls. Confirm that Airtable API limits align with your expected data retrieval frequency.

### Step-by-Step Guide to Integrating TheOddsAPI

**1. Schedule API Requests**
Set up a trigger to automatically pull upcoming Ice Hockey games at the start of each day.

**2. Fetch Data from TheOddsAPI**
Retrieve the latest sports betting data, including game details and odds, using TheOddsAPI (see the request sketch below).

**3. Store Data in Airtable**
Insert or update records in Airtable by matching game IDs, ensuring data accuracy.

**Sample Airtable Template Column Setup for Ice Hockey**
(The table can be adjusted depending on the sport and data needs. Reference TheOddsAPI for more documentation.)
- **Game ID**
- **Sport**
- **League**
- **Game Date (UTC)**
- **Home Team**
- **Away Team**
- **Completed** (Boolean: TRUE/FALSE for game completion status)
- **Scores** (JSON or string for final scores)
- **Last Update** (timestamp of the latest update)

**4. Schedule an End-of-Day Update**
Configure another trigger to fetch final game results at the end of the day.

**5. Update Records in Airtable**
Modify existing Airtable records with final scores and game outcomes for complete tracking.

**6. Customize for Other Sports**
Adjust API parameters to retrieve data for different sports and betting odds, making the system flexible for multiple use cases.

This structured workflow automates sports betting data collection and updates, ensuring accurate and real-time tracking of odds and game results. By integrating TheOddsAPI with Airtable, you can build scalable applications for predictive sports analytics and betting insights.
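As an illustration of step 2, here is a minimal sketch of the start-of-day request to TheOddsAPI. The sport key, query parameters, and response fields are assumptions based on TheOddsAPI's v4 conventions; verify them against the current documentation before relying on them.

```typescript
// Sketch only: fetch upcoming Ice Hockey games with head-to-head odds.
// The sport key and parameter names should be checked against TheOddsAPI docs.
const API_KEY = process.env.ODDS_API_KEY ?? "";

async function fetchUpcomingGames(sportKey = "icehockey_nhl"): Promise<Array<Record<string, unknown>>> {
  const url =
    `https://api.the-odds-api.com/v4/sports/${sportKey}/odds` +
    `?apiKey=${API_KEY}&regions=us&markets=h2h&oddsFormat=decimal`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`TheOddsAPI request failed: ${res.status}`);
  // Each returned game typically includes an id, commence_time, home_team and away_team,
  // which map onto the Airtable columns listed above (Game ID, Game Date, Home Team, Away Team).
  return (await res.json()) as Array<Record<string, unknown>>;
}
```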
by ist00dent
This n8n template allows you to instantly fetch a random dog image from the Dog CEO API by simply sending a webhook request. It's a fun and simple way to integrate random dog photos into your projects, whether for websites, applications, or playful automations.

### 🔧 How it works

1. **Trigger Webhook**: This node acts as the entry point for the workflow. It listens for any incoming POST request. No specific data is required in the webhook body, as the workflow fetches a random image.
2. **Fetch Random Dog Image**: This node makes an HTTP GET request to https://dog.ceo/api/breeds/image/random. The API responds with a JSON object containing the URL of a random dog image.
3. **Respond with Image URL**: This node sends the URL of the random dog image back to the service that initiated the webhook (see the client-side sketch below).

### 👤 Who is it for?

This workflow is ideal for:

- **Developers**: Quickly integrate random dog images into web applications, bots, or prototypes.
- **Content creators**: Get fresh, random dog photos for social media, blogs, or presentations.
- **Anyone learning n8n**: A straightforward example of using a webhook to trigger an API call and return data.
- Anyone who loves dogs!

### 📑 Data Structure

When you trigger the webhook, you can send an empty POST request body. The workflow will return a JSON response similar to this (the message URL will vary):

```json
{
  "message": "https://images.dog.ceo/breeds/hound-walker/n02089867_2626.jpg",
  "status": "success"
}
```

### ⚙️ Setup Instructions

1. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path**: Double-click the Trigger Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /get-dog-image).
3. **Activate Workflow**: Save and activate the workflow.

### 📝 Tips

- **Download the image**: Instead of just returning the URL, you can download the image and then process it. Insert another HTTP Request node after Fetch Random Dog Image to download the image binary. Set the HTTP Request node's 'Response Format' to 'Binary'. Use the expression ={{ $json.message }} for the URL.
- **Save to cloud storage**: After downloading the image (as described above), you can save it to various cloud storage services:
  - **Google Drive**: Add a Google Drive node, connect it to the output of the image download node, and configure it to upload the binary data to a specific folder.
  - **Amazon S3**: Add an AWS S3 node and configure it to upload the binary data, specifying your bucket and desired filename.
  - **Dropbox**: Use the Dropbox node to upload the image file.
- **Send as a message**: Share the dog image directly in a chat or email:
  - **Slack/Discord/Telegram**: Use the respective integration node to send the image URL or the downloaded image as an attachment.
  - **Email**: Attach the downloaded image to an email using an Email or Gmail node.
- **Display on a web page**: If you're embedding this into a web application, you can simply use the returned URL in an `<img>` tag to display the image.
- **Error handling**: You can add an Error Trigger node to catch any issues during the image fetching process (e.g., if the Dog CEO API is down) and send notifications.
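For reference, here is a small client-side sketch of calling the webhook and reading the returned URL. It assumes the example path /get-dog-image from the setup steps and a production webhook URL of the form https://your-n8n-host/webhook/PATH; adjust both for your instance.

```typescript
// Minimal sketch: trigger the workflow and extract the dog image URL from the response.
async function getRandomDogImageUrl(n8nBaseUrl: string): Promise<string> {
  const res = await fetch(`${n8nBaseUrl}/webhook/get-dog-image`, { method: "POST" });
  if (!res.ok) throw new Error(`Webhook call failed: ${res.status}`);
  const body = (await res.json()) as { message: string; status: string };
  return body.message; // e.g. "https://images.dog.ceo/breeds/hound-walker/n02089867_2626.jpg"
}
```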
by Yaron Been
### Workflow Overview

This cutting-edge n8n automation is a powerful social media intelligence gathering tool designed to transform Instagram profile research into a seamless, automated process. By intelligently combining web scraping, data formatting, and cloud storage technologies, this workflow:

- **Discovers profile insights**: Automatically scrapes Instagram profile data, captures comprehensive profile metrics, and extracts critical social media intelligence.
- **Captures data intelligently**: Retrieves follower counts, collects biographical information, and captures the profile picture and external links.
- **Logs data seamlessly**: Automatically stores data in Google Sheets, creating a living, updateable database that enables easy analysis and tracking.

### Key Benefits

- 🤖 **Full automation**: Instant profile intelligence
- 💡 **Comprehensive insights**: Detailed social media metrics
- 📊 **Effortless tracking**: Automated data collection
- 🌐 **Multi-purpose research**: Flexible data gathering

### Workflow Architecture

**🔹 Stage 1: Trigger & Input**
- **Form-based trigger**: Manual username submission
- **Webhook support**: Flexible data entry methods
- User-driven initiation

**🔹 Stage 2: Web Scraping**
- **Apify integration**: Robust Instagram data extraction
- **Comprehensive profile scanning**: Followers count, following count, profile biography, profile picture URL

**🔹 Stage 3: Data Formatting**
- Intelligent data mapping
- Standardized data structure
- Preparation for storage (see the sketch below)

**🔹 Stage 4: Cloud Logging**
- Google Sheets integration
- Persistent data storage
- Easy access and analysis

### Potential Use Cases

- **Influencer marketing**: Talent identification
- **Competitive intelligence**: Audience research
- **Social media analysis**: Performance tracking
- **Recruitment**: Talent scouting
- **Brand partnerships**: Collaboration opportunities

### Setup Requirements

- **Apify**: Account, Instagram scraping actor, API token, configured scraping parameters
- **Google Sheets**: Connected Google account, prepared tracking spreadsheet, appropriate sharing settings
- **n8n installation**: Cloud or self-hosted instance, workflow configuration, API credential management

### Future Enhancement Suggestions

- 🤖 Advanced profile scoring
- 📊 Engagement rate calculation
- 🔔 Real-time change alerts
- 🌐 Multi-platform profile tracking
- 🧠 AI-powered insights generation

### Technical Considerations

- Implement robust error handling
- Use exponential backoff for API calls
- Maintain flexible data extraction strategies
- Ensure compliance with platform terms of service

### Ethical Guidelines

- Respect user privacy
- Use data for legitimate research
- Maintain transparent data collection practices
- Provide opt-out mechanisms

### Connect With Me

Ready to unlock social media insights?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your social media research with intelligent, automated workflows!

#InstagramDataScraping #SocialMediaIntelligence #InfluencerMarketing #DataAutomation #AIResearch #MarketingTechnology #SocialMediaAnalytics #ProfileIntelligence #WebScraping #MarketingTech
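To illustrate Stage 3, here is a rough sketch of what the data-formatting step can look like inside an n8n Code node. The Apify output field names (followersCount, followsCount, and so on) are assumptions; check the actual schema of the Instagram actor you use.

```typescript
// Sketch of the formatting step in an n8n Code node ($input is n8n's built-in input object).
// Maps assumed Apify profile fields onto the columns of the Google Sheets tracking spreadsheet.
const rows = $input.all().map((item) => {
  const profile = item.json;
  return {
    json: {
      username: profile.username,
      followers: profile.followersCount,
      following: profile.followsCount,
      biography: profile.biography,
      profilePicUrl: profile.profilePicUrl,
      externalUrl: profile.externalUrl,
      scrapedAt: new Date().toISOString(),
    },
  };
});
return rows;
```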
by Yaron Been
## 🔥 AI Lead Scoring Agent: Smart Contact Form Triager

Automatically score every contact form lead as Hot/Warm/Cold and alert your sales team instantly. This intelligent workflow captures contact form submissions, uses GPT-4 to analyze message content and score lead quality, then sends formatted alerts to Slack, ensuring your sales team always focuses on the hottest prospects first.

### 🚀 What It Does

- **Instant lead capture**: Automatically receives contact form submissions via a webhook endpoint
- **AI-powered scoring**: GPT-4 analyzes message content and classifies leads as Hot 🔥, Warm 🌤, or Cold ❄️ (see the sketch below)
- **Smart data extraction**: Cleanly extracts name, email, and message from form submissions
- **Real-time Slack alerts**: Sends formatted notifications to your sales team with lead details and AI scoring

### 🎯 Key Benefits

- ✅ **Never miss hot prospects**: AI identifies urgent leads automatically
- ✅ **Save sales time**: Focus effort on the highest-probability leads first
- ✅ **Instant team alerts**: Real-time notifications in Slack channels
- ✅ **Smart prioritization**: AI scoring eliminates guesswork in lead quality
- ✅ **Zero manual work**: Complete automation from form to sales alert
- ✅ **Universal integration**: Works with any contact form or landing page

### 🏢 Perfect For

**Sales & Marketing Teams**
- SaaS companies managing inbound leads
- Service businesses qualifying prospects
- E-commerce stores identifying serious buyers
- Agencies prioritizing client inquiries

**Business Applications**
- **Lead qualification**: Identify purchase-ready prospects instantly
- **Sales efficiency**: Focus team effort on the highest-value opportunities
- **Response prioritization**: Handle urgent inquiries first
- **Team coordination**: Keep the entire sales team informed of new leads

### ⚙️ What's Included

- **Complete workflow**: Ready-to-deploy lead scoring automation
- **Webhook endpoint**: Receives submissions from any contact form
- **AI classification**: GPT-4 powered lead interest analysis
- **Slack integration**: Professional team notifications with emojis and formatting
- **Data processing**: Clean extraction and formatting of lead information

### 🔧 Quick Setup Requirements

- **n8n platform**: Cloud or self-hosted instance
- **OpenAI API**: GPT-4 access for lead scoring
- **Slack workspace**: Team channel for lead notifications
- **Contact form**: Any form that can POST to a webhook endpoint

### 📱 Sample Slack Alert

🔥 New Lead: Sarah Johnson (sarah@techstartup.com)
Message: "We're looking for a project management solution for our 50-person team. Need to implement ASAP as we're scaling fast. Can we schedule a demo this week?"
Triage: 🔥 Hot

❄️ New Lead: John Smith (john@email.com)
Message: "Just browsing your website. Might be interested in learning more someday."
Triage: ❄️ Cold

### 🎨 Customization Options

- **Scoring criteria**: Adjust AI prompts for industry-specific lead qualification
- **Team channels**: Route different lead types to specific Slack channels
- **Additional fields**: Capture company size, budget, and timeline data
- **CRM integration**: Connect to Salesforce, HubSpot, or Pipedrive
- **Follow-up automation**: Trigger email sequences based on lead temperature
- **Analytics tracking**: Monitor lead quality trends and conversion rates

### 🏷️ Tags & Categories

#lead-scoring #sales-automation #contact-form-processing #ai-qualification #slack-integration #prospect-management #inbound-marketing #sales-productivity #lead-generation #openai-integration #webhook-automation #crm-automation #sales-alerts #lead-triage #ai-agent

### 💡 Use Case Examples

- **SaaS company**: Score demo requests based on company size and urgency mentions
- **Consulting firm**: Identify clients ready to start projects vs. those still researching
- **E-commerce store**: Spot bulk buyers and wholesale inquiries vs. casual browsers
- **Marketing agency**: Prioritize clients with specific budgets and timelines mentioned

### 📈 Expected Results

- **70% faster** lead response times through smart prioritization
- **3x higher** conversion rates by focusing on Hot leads first
- **50% time savings** on manual lead qualification
- **100% lead coverage**: never miss or ignore a prospect again

### 🛠️ Setup & Support

- **5-minute setup**: Simple webhook configuration with any contact form
- **Universal integration**: Works with WordPress, Webflow, custom forms, landing pages
- **Team training**: Clear Slack notification format anyone can understand
- **Scalable**: Handles unlimited form submissions automatically

### 📞 Get Help & Resources

- 🎥 YouTube: https://www.youtube.com/@YaronBeen/videos
- 💼 LinkedIn: https://www.linkedin.com/in/yaronbeen/
- 📧 Email: Yaron@nofluff.online (response within 24 hours)

Ready to never miss another hot lead? Get this AI Lead Scoring Agent and transform your contact forms into intelligent lead qualification systems. Your sales team will always know which prospects to call first, and you'll never waste time on cold leads again.

Stop treating all leads equally. Start prioritizing the ones ready to buy.
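Below is an illustrative sketch of the AI scoring step using the OpenAI SDK. The prompt wording and model identifier are assumptions rather than the workflow's exact configuration, and the name/email/message fields mirror the webhook payload described above.

```typescript
// Sketch only: classify a lead as Hot, Warm, or Cold from the contact-form message.
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function scoreLead(name: string, email: string, message: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-4", // assumed model name; use whichever GPT-4 variant your account has access to
    messages: [
      {
        role: "system",
        content:
          "You triage contact-form leads. Classify the lead as Hot, Warm, or Cold " +
          "based on buying intent and urgency. Reply with a single word.",
      },
      { role: "user", content: `Name: ${name}\nEmail: ${email}\nMessage: ${message}` },
    ],
  });
  return completion.choices[0].message.content?.trim() ?? "Cold";
}
```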
by Yang
### Who is this for?

This workflow is for social media agencies, influencer marketers, and brand managers who need to automatically qualify TikTok creators based on their follower metrics. It's especially useful for teams managing influencer outreach campaigns or building talent databases.

### What problem is this workflow solving?

Manually tracking TikTok user stats is time-consuming and inconsistent. This automation instantly pulls TikTok profile data and only saves creators who meet a defined follower threshold. It removes manual vetting, reduces spreadsheet work, and makes influencer qualification scalable.

### What this workflow does

This workflow uses Airtable as the trigger, Dumpling AI to scrape TikTok profile information, and a logic condition to check whether the profile has more than 100k followers. Qualified profiles are updated with full metrics and stored back in Airtable (see the sketch below for the shape of this check).

### Setup

1. **Airtable setup**
   - Create a table with a field named "Tik tok username".
   - Connect your Airtable account to n8n using a Personal Access Token.
   - Set up a trigger to run when a new TikTok username is added.
2. **Dumpling AI**
   - Sign up at Dumpling AI.
   - Create a Dumpling AI credential in n8n using your API key.
   - The HTTP node sends the TikTok handle to Dumpling's /get-tiktok-profile endpoint.
3. **Configure filter**
   - The IF node checks whether followerCount is greater than or equal to 100000.
4. **Airtable update**
   - If qualified, the record is updated with: ID (TikTok ID), followerCount, followingCount, heartCount, videoCount.

### How to customize this workflow to your needs

- Change the follower count threshold to fit your campaign (e.g. 10K, 500K, 1M).
- Add fields like engagement rate, niche tags, or scraped bio.
- Chain additional steps like sending approved creators to your CRM or triggering outreach messages.
- Add another filter to exclude private or inactive accounts.
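The qualification logic can be pictured roughly as follows. The /get-tiktok-profile path and the returned metric names come from the description above, but the Dumpling AI base URL, request shape, and authentication header are assumptions; use the values from your Dumpling AI account and documentation.

```typescript
// Sketch only: fetch a TikTok profile from Dumpling AI and keep it if it clears the threshold.
interface TikTokProfile {
  id: string;
  followerCount: number;
  followingCount: number;
  heartCount: number;
  videoCount: number;
}

async function qualifyCreator(
  handle: string,
  apiKey: string,
  threshold = 100_000,
): Promise<TikTokProfile | null> {
  const res = await fetch("https://app.dumplingai.com/api/v1/get-tiktok-profile", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({ handle }),
  });
  if (!res.ok) throw new Error(`Dumpling AI request failed: ${res.status}`);
  const profile = (await res.json()) as TikTokProfile;
  return profile.followerCount >= threshold ? profile : null; // mirrors the IF node check
}
```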
by Anthony
### What this workflow does

LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON list of Chrome extension IDs, extracted from LinkedIn pages, and builds a clean Google Sheet listing these extensions. For each extension ID, it scrapes Google search results and extracts the first result (see the sketch below).

### Setup

1. Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
2. Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
3. Create an n8n header auth credential for the Google SERP API.

### Some context and discussion

https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/

Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
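Conceptually, each extension ID is looked up like this. The SERP API host, endpoint path, and response shape below are placeholders; copy the exact request details from the RapidAPI listing linked above.

```typescript
// Illustrative only: search Google for a Chrome extension id and keep the first organic result.
async function lookupExtension(extensionId: string, rapidApiKey: string) {
  const query = encodeURIComponent(`chrome extension ${extensionId}`);
  const res = await fetch(`https://serp-api1.p.rapidapi.com/search?q=${query}`, {
    headers: { "X-RapidAPI-Key": rapidApiKey }, // header auth, as configured in the n8n credential
  });
  if (!res.ok) throw new Error(`SERP API request failed: ${res.status}`);
  const data = (await res.json()) as { results?: Array<{ title: string; link: string }> };
  const first = data.results?.[0];
  // One row per extension for the Google Sheet: id, resolved name, and store URL (if found).
  return { extensionId, title: first?.title ?? "", url: first?.link ?? "" };
}
```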
by Akram Kadri
### Who is this for?

This workflow is designed for YouTubers who want to update their video descriptions in bulk without manually editing each one. It's especially useful for creators who include a standard set of links in their descriptions and need to insert a new link between existing ones across multiple videos.

### What problem does this workflow solve?

Manually updating video descriptions for multiple videos can be tedious and time-consuming. If you have a section in your video descriptions that contains important links, adding a new one in a specific position (e.g., between two existing links) can be a challenge. This workflow automates that process, allowing you to insert a specific string between two predefined rows in all of your video descriptions at once.

### What this workflow does

1. Fetches all videos from your YouTube channel.
2. Iterates through each video to retrieve its existing description.
3. Identifies two predefined rows in the description.
4. Inserts a new row between the two specified rows (see the sketch below).
5. Updates the video description with the modified text.

### Setup

1. Connect your YouTube account to n8n and grant the necessary permissions.
2. Define your variables in the "Set String to Insert" node:
   - rowBefore: The existing row after which the new row will be inserted.
   - rowToInsert: The new text or link to insert.
   - rowAfter: The existing row before which the new row will be inserted.
3. Run the workflow using the manual trigger.
4. Review the updated descriptions to ensure accuracy.

### How to customize this workflow to your needs

- **Change the insertion criteria** by modifying the rowBefore and rowAfter values.
- **Insert multiple rows** by adjusting the JavaScript code in the Code node.
- **Extend the workflow** by adding conditions (e.g., only updating descriptions of videos with certain tags).
- **Filter specific videos** instead of updating all by modifying the "Get All Videos" node.

This workflow ensures that all your YouTube descriptions stay updated and consistent with minimal effort.
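The core of step 4 is a small string operation. Here is a minimal sketch of that insertion logic, assuming the rowBefore/rowToInsert/rowAfter values from the "Set String to Insert" node; the guard clauses are an added safety measure, not necessarily part of the original Code node.

```typescript
// Sketch: insert rowToInsert between rowBefore and rowAfter in a video description.
function insertRow(
  description: string,
  rowBefore: string,
  rowToInsert: string,
  rowAfter: string,
): string {
  const target = `${rowBefore}\n${rowAfter}`;
  if (!description.includes(target)) return description;     // pair not found: leave untouched
  if (description.includes(rowToInsert)) return description; // already inserted: avoid duplicates on re-runs
  return description.replace(target, `${rowBefore}\n${rowToInsert}\n${rowAfter}`);
}
```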
by Prakash
### Who is this for?

This workflow is ideal for:

- **Developers** who want to stay updated on issues without constantly checking GitHub.
- **Managers** tracking issue progress in a Telegram group.
- **DevOps teams** that need automated notification alerts for new or updated issues.

### What problem does this workflow solve?

Keeping track of GitHub issues manually can be tedious. Teams often miss critical updates because notifications are buried in emails or dashboards. This workflow automates the process by fetching new or open GitHub issues and instantly sending notifications to a specified Telegram chat.

### What this workflow does

This workflow connects GitHub and Telegram to provide real-time issue notifications:

1. **Fetch GitHub issues**: Retrieves new or open issues from a selected GitHub repository.
2. **Format the issue details**: Extracts key information like issue title, number, status, and URL (see the sketch below).
3. **Send to Telegram**: Posts the formatted issue details to a Telegram group or private chat.

### Setup Guide

**Prerequisites**

Before setting up the workflow, ensure you have:

- **GitHub Personal Access Token**: Required to fetch issue details. Generate it under Developer Settings with repo or public_repo permissions.
- **Telegram bot token**: Create a bot via BotFather on Telegram and obtain the token.
- **Telegram chat ID**: Find the chat ID of the group or private chat where the bot should send messages.

**Step-by-step setup**

1. **Set up the GitHub node**
   - Authenticate using your GitHub token.
   - Choose the repository you want to track.
   - Configure filters (e.g., fetch only open issues).
2. **Format issue details**
   - Extract key details like title, issue number, assignee, and status.
   - Customize the message structure for better readability.
3. **Send the message to Telegram**
   - Add the Telegram node and enter your bot token.
   - Use the chat ID to define the recipient.
   - Format the message to include issue details and links.
4. **Schedule the workflow (optional)**
   - Use the Cron node to run this workflow periodically (e.g., every hour).

### How to customize this workflow

- **Filter issues by labels**: Modify the GitHub node to fetch only issues with specific labels.
- **Include additional fields**: Add issue comments, priority, or assignee details in the message.
- **Send alerts based on priority**: Use conditional logic to send high-priority issues to a different chat.
- **Trigger on issue events**: Instead of fetching periodically, use GitHub webhooks (if permitted in the repo) to trigger the workflow on issue creation or updates.

### Why use this workflow?

- **Automates GitHub issue tracking** without manually checking repositories.
- **Instant notifications in Telegram** ensure quick response times.
- **Fully customizable** to fit different team workflows.
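As an example of step 2, the message sent to Telegram can be assembled like this. The field names follow the GitHub REST API issue object; the layout of the message itself is just one possible format, not the workflow's exact output.

```typescript
// Sketch: build a Telegram-friendly message from a GitHub issue.
interface GitHubIssue {
  number: number;
  title: string;
  state: string;
  html_url: string;
  assignee?: { login: string } | null;
}

function formatIssueMessage(issue: GitHubIssue): string {
  const assignee = issue.assignee?.login ?? "unassigned";
  return [
    `#${issue.number}: ${issue.title}`,
    `Status: ${issue.state}`,
    `Assignee: ${assignee}`,
    issue.html_url,
  ].join("\n");
}
```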
by n8n custom workflows
### Introduction

The Namesilo Bulk Domain Availability workflow is a powerful automation solution designed to check the registration status of multiple domains simultaneously using the Namesilo API. This workflow efficiently processes large lists of domains by splitting them into manageable batches, adhering to API rate limits, and compiling the results into a convenient Excel spreadsheet. It eliminates the tedious process of manually checking domains one by one, saving significant time for domain investors, web developers, and digital marketers.

The workflow is particularly valuable during brainstorming sessions for new projects, when conducting domain portfolio audits, or when preparing domain acquisition strategies. By automating the domain availability check process, users can quickly identify available domains for registration without the hassle of navigating through multiple web interfaces.

### Who is this for?

This workflow is ideal for:

- Domain investors and flippers who need to check multiple domains quickly
- Web developers and agencies evaluating domain options for client projects
- Digital marketers researching domain availability for campaigns
- Business owners exploring domain options for new ventures
- IT professionals managing domain portfolios

Users should have basic familiarity with n8n workflow concepts and a Namesilo account to obtain an API key. No coding knowledge is required, though an understanding of domain name systems is beneficial.

### What problem is this workflow solving?

Checking domain availability one by one is a time-consuming and tedious process, especially when dealing with dozens or hundreds of potential domains. This workflow solves several key challenges:

1. **Manual inefficiency**: Eliminates the need to individually search for each domain through registrar websites.
2. **Rate limiting**: Handles API rate limits automatically with built-in waiting periods.
3. **Data organization**: Compiles availability results into a structured Excel file rather than scattered notes or multiple browser tabs.
4. **Bulk processing**: Processes up to 200 domains per batch, with the ability to handle unlimited domains across multiple batches.
5. **Time management**: Frees up valuable time that would otherwise be spent on repetitive manual checks.

### What this workflow does

**Overview**

The workflow takes a list of domains, processes them in batches of up to 200 domains per request (to comply with API limitations), checks their availability using the Namesilo API, and compiles the results into an Excel spreadsheet showing which domains are available for registration and which are already taken.

**Process**

1. **Input setup**: The workflow begins with a manual trigger and uses the "Set Data" node to collect the list of domains to check and your Namesilo API key.
2. **Domain processing**: The "Convert & Split Domains" node transforms the input list into batches of up to 200 domains to comply with API limitations (see the sketch below).
3. **Batch processing**: The workflow loops through each batch of domains.
4. **API integration**: For each batch, the "Namesilo Requests" node sends a request to the Namesilo API to check domain availability.
5. **Data parsing**: The "Parse Data" node processes the API response, extracting information about which domains are available and which are taken.
6. **Rate limit management**: A 5-minute wait period is enforced between batches to respect Namesilo's API rate limits.
7. **Data compilation**: The "Merge Results" node combines all the availability data.
8. **Output generation**: Finally, the "Convert to Excel" node creates an Excel file with two columns: Domain and Availability (showing "Available" or "Unavailable" for each domain).

### Setup

1. **Import the workflow**: Download the workflow JSON file and import it into your n8n instance.
2. **Get a Namesilo API key**: Create a free account at Namesilo and obtain your API key from https://www.namesilo.com/account/api-manager
3. **Configure the workflow**: Open the "Set Data" node, enter your Namesilo API key in the "Namesilo API Key" field, and enter your list of domains (one per line) in the "Domains" field.
4. **Save and activate**: Save the workflow and run it using the manual trigger.

### How to customize this workflow to your needs

- **Modify the domain input format**: Adjust the code in the "Convert & Split Domains" node if your domain list comes in a different format.
- **Change the batch size**: If needed, modify the batch size (currently set to 200) in the "Convert & Split Domains" node to accommodate different API limitations.
- **Adjust the wait time**: If you have a premium API account with different rate limits, modify the wait time in the "Wait" node.
- **Enhance the output format**: Customize the "Convert to Excel" node to add additional columns or formatting to the output file.
- **Add domain filtering**: Add a node before the API request to filter domains based on specific criteria (length, keywords, TLDs).
- **Integrate with other services**: Connect this workflow to domain registrars to automatically register available domains that meet your criteria.
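As a sketch of step 2 of the process, the batching logic boils down to something like the function below: newline-separated input in, comma-joined batches of at most 200 domains out. This is an illustration of the idea, not the exact code shipped in the node.

```typescript
// Sketch: split a newline-separated domain list into comma-joined batches of up to 200 domains,
// the per-request maximum the workflow uses for the Namesilo API.
function splitDomains(raw: string, batchSize = 200): string[] {
  const domains = raw
    .split("\n")
    .map((d) => d.trim().toLowerCase())
    .filter((d) => d.length > 0);
  const batches: string[] = [];
  for (let i = 0; i < domains.length; i += batchSize) {
    batches.push(domains.slice(i, i + batchSize).join(","));
  }
  return batches;
}
```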
by phil
This workflow automates the backup of your n8n workflows data to Google Drive every day. It ensures that important configurations and execution logs are securely stored, reducing the risk of data loss and improving workflow resilience.

### 🔹 Why Use This?

- ✅ Automates routine backups effortlessly.
- ✅ Reduces manual intervention and potential data loss.
- ✅ Securely stores critical workflow configurations in Google Drive.

With this workflow, you can focus on innovation while n8n takes care of your backups. 🔐✨

### 🚀 How It Works

This workflow operates seamlessly with a combination of scheduled triggers, JSON data transformation, and secure cloud storage.

**🛠 Setup Steps**

1. **Trigger the backup**: Choose between manual execution or automated scheduling at 1:30 AM daily.
2. **Data preparation**: Your workflow parameters define the backup location and organize files effectively.
3. **Transformation & encoding**: The data is processed and converted into a JSON file in base64 format (see the sketch below).
4. **Cloud storage**: The backup is securely uploaded to your designated Google Drive folder.

### 🔧 Customization Options

You can modify various aspects of the backup workflow to better suit your needs:

1. **Adjusting backup frequency**: By default, the workflow runs daily at 1:30 AM. To change this, open the Trigger node in n8n and modify the cron expression or select a different frequency (e.g., hourly, weekly, or custom intervals).
2. **Selecting specific workflows to back up**: Instead of backing up all workflows, add a Filter node before exporting data and define specific workflow IDs or names to include in the backup.
3. **Changing the backup destination**: The default destination is Google Drive, but you can replace the Google Drive node with a different storage provider (e.g., Dropbox, AWS S3, or local storage via FTP/SFTP) and configure authentication for the new destination.
4. **Modifying the data format**: By default, the workflow stores data in JSON format. If you need a different format, convert JSON to CSV using the Spreadsheet File node, or store backups in a compressed format (ZIP) by adding a Compression node.
5. **Encrypting the backup for extra security**: Use the Crypto node to encrypt the JSON file before uploading, and set up an access-controlled folder in Google Drive with limited permissions.

### ✅ Verify That Your Backup Works

Before relying on this workflow for your automated backups, make sure it works correctly by performing a quick test:

1. Manually trigger the workflow in n8n and check whether the backup file appears in your Google Drive.
2. Open Google Drive, navigate to the backup folder, and download the JSON file.
3. Verify its content by checking whether the data matches your workflow's execution logs.
4. Try to import the JSON file back into n8n using the "Import File" function to ensure the workflow structure is intact.
5. Alternatively, copy and paste a test file into Google Drive and confirm that it appears correctly in your workflow logs.

This quick test will confirm that your backup is running smoothly and that your data is retrievable whenever needed.

### 📁 How to Find Your Google Drive Directory ID

To ensure that the backup is uploaded to the correct folder, you need to retrieve your Google Drive Directory ID. Follow these simple steps:

1. Open Google Drive.
2. Navigate to the folder where you want to store your backups.
3. Click on the folder and check the URL in your browser. The Directory ID is the long string of characters at the end of the URL, after /folders/.

📌 Example: if your folder URL is https://drive.google.com/drive/folders/14oUlH_LW_NT0Xb2woZWvuzRncV-bhla, then your Directory ID is 14oUlH_LW_NT0Xb2woZWvuzRncV-bhla.

Copy this Directory ID and use it in the workflow's parameters to ensure the backup is saved in the correct location.

Phil | Inforeole
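Step 3 (transformation and encoding) can be pictured as the following n8n Code node sketch, which serializes the exported workflow data and attaches it as a base64-encoded JSON file for the Google Drive node to upload. It is an illustration of the approach, not the exact node contents; the binary property layout follows n8n's usual convention.

```typescript
// Sketch of the encoding step in an n8n Code node ($input is n8n's built-in input object).
const exportedWorkflows = $input.all().map((item) => item.json);
const fileName = `n8n-backup-${new Date().toISOString().slice(0, 10)}.json`;
const payload = JSON.stringify(exportedWorkflows, null, 2);

return [
  {
    json: { fileName },
    binary: {
      data: {
        data: Buffer.from(payload).toString("base64"), // base64-encoded JSON file content
        mimeType: "application/json",
        fileName,
      },
    },
  },
];
```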
by FORK SOFTWARE TECHNOLOGIES INC.
### Overview

This n8n workflow is specifically designed to monitor the USDT ERC-20 balance of a specific wallet. It periodically queries Etherscan's public API to check and process the wallet's token data. This workflow is ideal for users who need an automated solution to track ERC-20 wallet transactions.

### Features

- **Automatic monitoring**: Executes every 5 minutes to capture new transactions.
- **Customizable filters**: Customize tracking based on parameters like transaction duration and wallet addresses.
- **Data aggregation**: Compiles transaction data into a single, structured list.
- **Formatted outputs**: Presents processed data in an organized format.
- **Telegram tracking**: Tracks wallet balances via Telegram notifications using a bot.

### Requirements

- **n8n setup**: Requires a self-hosted or cloud-based n8n instance.
- **Basic understanding**: Basic knowledge of n8n workflows and nodes.

### Installation and Configuration

1. **Import the workflow**: Load the provided JSON workflow into your n8n instance.
2. **Configure the User Data node**:
   - Enter your ERC-20 wallet address in the "Your Wallet Address" field.
   - Enter your Etherscan API key in the "Your Etherscan API Key" field.
   - Enter the USDT ERC-20 contract address in the "Your ERC-20 USDT Contract Address" field (0xdAC17F958D2ee523a2206206994597C13D831ec7). You can also monitor another token by entering a different contract address.
3. **Configure the Telegram node**:
   - Go to Telegram and search for "BotFather".
   - Select /newbot from the BotFather menu to create your bot.
   - Get the API key BotFather provides.
   - Go to Telegram and search for "Get My ChatID". Start the conversation and get your chat ID.
   - Use this information to configure the Telegram node.
4. **Schedule Trigger node**: By default, the workflow is triggered every 5 minutes. Adjust this according to your needs.
5. **Test the workflow**: Execute the workflow manually to ensure everything is working as expected.

### How It Works

1. **Schedule Trigger**: Starts the workflow at predetermined intervals.
2. **Edit Fields**: Sets the wallet address, Etherscan API key, and USDT ERC-20 token contract address.
3. **Edit Telegram settings**: Create a bot via BotFather and configure the API key and Telegram chat ID.
4. **Etherscan data import**: Collects the wallet's token data from Etherscan's public API (a sketch of this request follows below).
5. **Final results**: Organizes and formats the transaction data for review.
6. **Telegram bot message sending**: If there is a balance change, it sends a formatted message about the change; if there is no balance change, it sends a message that your balance has not changed. You can configure it to avoid sending a message when there is no change.
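For reference, the Etherscan call behind step 4 can look roughly like this. The sketch uses the public tokenbalance endpoint with the USDT contract quoted above (USDT uses 6 decimals); the actual workflow may instead pull the transfer list via action=tokentx, so treat this as an illustration rather than the exact request.

```typescript
// Sketch: read a wallet's USDT balance from Etherscan.
const USDT_CONTRACT = "0xdAC17F958D2ee523a2206206994597C13D831ec7";

async function getUsdtBalance(wallet: string, etherscanApiKey: string): Promise<number> {
  const url =
    "https://api.etherscan.io/api?module=account&action=tokenbalance" +
    `&contractaddress=${USDT_CONTRACT}&address=${wallet}&tag=latest&apikey=${etherscanApiKey}`;
  const res = await fetch(url);
  const body = (await res.json()) as { status: string; message: string; result: string };
  if (body.status !== "1") throw new Error(`Etherscan error: ${body.result}`);
  return Number(body.result) / 1e6; // raw value is in 6-decimal token units
}
```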