by Ziad Adel
Turn LinkedIn Noise Into Weekly Slack Insights 🚀

What if your team could skim the best of LinkedIn in 2 minutes instead of scrolling for hours? This workflow transforms raw LinkedIn posts into a bite-sized Slack digest — summarized, grouped, and delivered automatically every week.

⚡ What It Does
- **Scrapes Posts Automatically**: Pulls fresh posts from LinkedIn profiles you specify (via Apify).
- **Summarizes with AI**: Condenses each post into 2–3 bullets (≤15 words).
- **Keeps It Lean**: Digest capped at 500 words total.
- **Organized by Author**: See exactly who said what, without searching.
- **Delivers to Slack**: Neatly formatted digest drops into your channel on schedule, with post links in thread replies.

🛠 How It Works
1. **Google Sheets → Profile URLs**: Add LinkedIn profile URLs into a Google Sheet tab; this is your watchlist.
2. **Apify Scraper → Posts**: Fetches up to 10 posts per profile from the past 7 days.
3. **Clean & Format**: Code nodes strip out clutter (hashtags, broken links, escapes).
4. **OpenAI Summarizer**: AI rewrites posts into concise bullets and trims the digest to under 500 words.
5. **Slack Delivery**: The digest posts directly in Slack every Sunday morning, with original links attached as thread replies.

✅ Pre-conditions / Requirements
- **Google Sheets API credentials** connected in n8n.
- **Apify account + API token** for the LinkedIn profile posts actor.
- **OpenAI API key** for summarization.
- **Slack bot token** with permission to post messages in your chosen channel.
- Profiles you want to track must be publicly viewable or accessible to Apify.

🎛 Customization Options
- **Schedule**: Change the Cron node if you prefer daily or monthly digests.
- **Batch Size**: Default is 5 profiles per batch; increase or decrease for performance.
- **Summaries**: Adjust the OpenAI system prompt to allow longer or shorter bullet points.
- **Filters**: Extend extendOutputFunction to exclude reposts, sponsored posts, or keep specific authors only.
- **Slack Output**: Change formatting, channel, or send as a direct message instead of posting in a channel.

💡 Why This Is Valuable
- Saves your team 3–5 hours/week of scrolling.
- Keeps everyone updated with actionable insights, not filler.
- Turns a chaotic LinkedIn feed into a signal-only digest.
- Positions you as the one who always brings the smartest highlights to the table.

🎯 Best For
- Founders who want LinkedIn insights without endless scrolling.
- Marketing and growth teams tracking thought leaders.
- Operators who want signal over noise, delivered straight to Slack.

No more mindless scrolling. Just sharp insights, automatically packaged. ✅
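The repost/sponsored-post filter described above can be sketched in a few lines of Code-node JavaScript. The field names (`isRepost`, `isSponsored`, `postedAt`, `author`) are assumptions about the scraper's output shape, not the Apify actor's documented schema — adjust them to your actual payload:

```javascript
// Hypothetical post shape from the scraper; field names are assumptions.
// Drops reposts/sponsored posts, stale posts, and (optionally) unlisted authors.
function filterPosts(posts, { keepAuthors = null, maxAgeDays = 7 } = {}) {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  return posts.filter((p) => {
    if (p.isRepost || p.isSponsored) return false;              // reposts/ads out
    if (new Date(p.postedAt).getTime() < cutoff) return false;  // too old
    if (keepAuthors && !keepAuthors.includes(p.author)) return false;
    return true;
  });
}
```

In an n8n Code node you would run the items through this function before handing them to the summarizer.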
by Pixcels Themes
Who’s it for
This template is designed for recruiters, lead-generation teams, agency owners, and sales professionals who collect LinkedIn profile data and need to automate finding verified company domains and email addresses. It is ideal for teams looking to eliminate manual research and streamline prospect enrichment.

What it does / How it works
This workflow reads contact records from a Google Sheet, including name, position, and description. An AI agent analyzes each profile to determine the company domain. If the domain is already identifiable from the description, it is used directly. If not, the workflow generates an intelligent search term and performs a Google Custom Search; a second AI agent then extracts the most accurate domain from the real web results. Once the domain is confirmed, the workflow queries Hunter.io for the best-matching email address for the contact. Finally, the enriched data (email and company domain) is appended back into the Google Sheet, updating each row automatically.

Requirements
- Google Sheets OAuth2 credentials
- Google Gemini (PaLM) API credentials
- Hunter.io API key
- Google Custom Search API key and CSE ID
- A Google Sheet with columns for name, position, description, and domain

How to set up
1. Connect your Google Sheets, Gemini, Hunter.io, and Google Search credentials.
2. Replace the Google Sheet ID and sheet name with your own.
3. Add your API keys to the designated nodes.
4. Ensure column names match your sheet structure.
5. Execute the workflow to begin enrichment.

How to customize the workflow
- Modify AI prompts for better domain inference.
- Add additional enrichment steps (social profiles, industry tags).
- Add fallback email providers (Snov, Apollo, etc.).
- Change the update logic to support multiple sheets or batch processing.
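The "domain already identifiable from the description" branch can be fronted by a cheap regex pre-check before any AI or search call. This heuristic is an assumption for illustration — the template itself uses an AI agent for this step — but a pre-check like this can save model calls:

```javascript
// Pull the first plausible company domain out of a free-text description.
// Regex heuristic is an assumption, not the template's actual AI logic.
function extractDomain(description) {
  const match = description.match(
    /\b([a-z0-9][a-z0-9-]*\.)+(com|io|co|net|org|ai|dev)\b/i
  );
  if (!match) return null;
  // Strip a leading "www." so Hunter.io receives the bare domain.
  return match[0].toLowerCase().replace(/^www\./, "");
}
```

Returning `null` signals the workflow to fall through to the Google Custom Search branch.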
by Yaron Been
Monitor CRM accounts for hiring spikes by enriching HubSpot companies with PredictLeads job data and alerting your team via Slack.

This workflow pulls all companies from your HubSpot CRM, checks each one against the PredictLeads Job Openings API for target roles (sales, engineering, marketing, product, data), compares the current count to historical data stored in Google Sheets, and flags any company where hiring jumped more than 50%. Flagged companies get updated in HubSpot with a hiring signal and trigger a Slack alert so your sales team can act fast.

How it works:
1. A Schedule Trigger runs the workflow daily at 9 AM.
2. Retrieves all companies from HubSpot CRM (domain, name, ID).
3. Loops through each company and fetches job openings from PredictLeads.
4. Filters jobs to target roles (sales, engineering, marketing, product, data).
5. Reads the previous job count for that company from Google Sheets.
6. Calculates the percentage change between current and historical counts.
7. If hiring increased more than 50%, flags it as a spike.
8. Updates the HubSpot company record with a hiring-signal property.
9. Sends a Slack alert with the company name, role count, and percentage change.
10. Updates Google Sheets with the latest count regardless of spike status.

Setup:
- Connect your HubSpot CRM (OAuth2) with company read/write access.
- Create a Google Sheet with a "HistoricalCounts" tab containing the columns: domain, company_name, job_count, previous_count, percent_change, check_date.
- Connect a Slack bot to the channel where you want hiring alerts.
- Add your PredictLeads API credentials (X-Api-Key and X-Api-Token headers).

Requirements:
- HubSpot CRM account with OAuth2 credentials.
- Google Sheets OAuth2 credentials.
- Slack OAuth2 credentials (bot with chat:write permission).
- PredictLeads API account (https://docs.predictleads.com).

Notes:
- The 50% spike threshold can be adjusted in the IF node.
- Target roles are configured in the Filter Target Roles code node; add or remove roles as needed.
- The workflow updates historical data on every run, so spike detection improves over time.
- PredictLeads Job Openings API docs: https://docs.predictleads.com
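The spike check itself is simple arithmetic. A minimal sketch of the percentage-change calculation and the 50% threshold; how a zero/missing previous count is handled is an assumption (here, any new openings at a previously quiet company count as a spike):

```javascript
// Percentage change between previous and current job counts, plus the
// spike decision. Threshold of 50 mirrors the template's default IF node.
function detectSpike(previousCount, currentCount, thresholdPct = 50) {
  if (!previousCount) {
    // No history yet: treat any current openings as a spike (assumption).
    return { percentChange: null, isSpike: currentCount > 0 };
  }
  const percentChange = ((currentCount - previousCount) / previousCount) * 100;
  return { percentChange, isSpike: percentChange > thresholdPct };
}
```

Raising `thresholdPct` corresponds to editing the IF node mentioned in the notes above.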
by Jovan
AI-Powered PLG Revenue Engine: Segment, Attio & Outreach Sync

This workflow bridges the gap between raw product data and revenue sales tools. It automates the entire Product Qualified Lead (PQL) lifecycle, from real-time intent routing to churn prevention, reducing SalesOps overhead by 80%.

Who’s it for
- **B2B SaaS Teams** looking to automate PQL outreach based on product usage.
- **Revenue Ops** needing to sync Attio CRM, ActiveCampaign, and Lemlist.
- **Growth Teams** requiring real-time AI classification of user intent.

How it works
- **Real-Time Intent Routing**: Segment webhooks trigger Claude AI to classify PQLs. High-intent users are instantly moved to Lemlist for outreach.
- **Deal Progression Sync**: Changes in Attio deal stages automatically update ActiveCampaign nurture lists and Segment profiles.
- **Intercom Revenue Signals**: AI scans Intercom conversations for buying signals or churn risks, creating Attio deals or churn-prevention tasks.
- **Daily PQL Scoring**: Every morning at 7 AM, the workflow scores trial users across 5 dimensions, enrolling "Hot" leads into conversion sequences.

How to set up
1. **Node Configuration**: Manually enter your specific Campaign IDs in the Lemlist nodes and List IDs in the ActiveCampaign nodes.
2. **Credentials**: Set up official n8n credentials for Attio, Segment, Anthropic (Claude), Lemlist, Intercom, and ActiveCampaign.
3. **Webhook Mapping**: Connect your Segment and Intercom webhook URLs to the respective Trigger nodes.
4. **Attio Schema**: Ensure your Attio workspace includes custom attributes for pql_score, pql_tier, and churn_risk_score.

Requirements
- **n8n version**: 1.0+
- **AI Credits**: Anthropic (Claude) and OpenAI (for Intercom analysis).
- **Tech Stack**: Segment, Attio CRM, ActiveCampaign, Lemlist, and Intercom.

Results
- **65% increase** in trial-to-PQL conversion rates.
- **Outreach speed** improved from 3 days to under 10 minutes.
- **80% reduction** in manual SalesOps and Revenue Ops overhead.
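To make the "scores trial users across 5 dimensions" step concrete, here is a toy weighted scorer. The dimension names, weights, and tier cutoffs are purely illustrative assumptions — the template's real scoring lives in its own Code/AI nodes and will differ:

```javascript
// Toy PQL scorer: five usage dimensions, each pre-normalized to 0..100.
// Weights and cutoffs are illustrative assumptions, not the template's values.
const WEIGHTS = {
  logins: 0.2, seatsInvited: 0.25, featureDepth: 0.25,
  apiCalls: 0.15, billingPageViews: 0.15,
};

function scorePql(usage) {
  const score = Object.entries(WEIGHTS).reduce(
    (sum, [dim, w]) => sum + w * (usage[dim] ?? 0), 0);
  const tier = score >= 75 ? "Hot" : score >= 40 ? "Warm" : "Cold";
  return { score: Math.round(score), tier };
}
```

"Hot" leads returned by a scorer like this are the ones the workflow would enroll into conversion sequences.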
by Avkash Kakdiya
How it works
This workflow automatically discovers and qualifies local business leads using structured inputs. It runs on a schedule, reads search queries from Google Sheets, and fetches business data via an API. The data is cleaned and enriched before being analyzed by AI for lead scoring and categorization. Finally, all enriched leads are stored back in Google Sheets for outreach and tracking.

Step-by-step
1. **Trigger workflow automatically**: Schedule Trigger runs the workflow at defined time intervals.
2. **Fetch lead search inputs**: Read Search Requests retrieves keywords and locations from Google Sheets.
3. **Collect business data from API**: Search Businesses API queries RapidAPI to find local businesses with contact details.
4. **Clean and structure results**: Format Business Results extracts and formats business name, email, phone, website, and address.
5. **Analyze and score leads with AI**: Message a model uses OpenAI to classify businesses, assign lead scores, and generate outreach lines.
6. **Store enriched leads**: Write to Business Results saves all processed and scored leads into Google Sheets.

Why use this?
- Automates manual lead research and data collection.
- Improves lead quality with AI-based scoring and classification.
- Centralizes all lead data in a structured Google Sheets database.
- Generates ready-to-use outreach messages for faster sales execution.
- Scales easily by adding more keywords and locations.
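The "Clean and structure results" step boils down to flattening a raw API record into the columns written to Google Sheets. A minimal sketch — the input field names (`business_name`, `phone_number`, `full_address`, etc.) are assumptions about the RapidAPI response shape, so map them to your actual payload:

```javascript
// Sketch of the Format Business Results step: raw API record -> sheet row.
// Input field names are assumptions about the RapidAPI response.
function formatBusiness(raw) {
  const clean = (v) => (typeof v === "string" ? v.trim() : "");
  return {
    name: clean(raw.business_name),
    email: clean(raw.email).toLowerCase(),
    phone: clean(raw.phone_number).replace(/[^\d+]/g, ""), // digits and + only
    website: clean(raw.website).replace(/\/+$/, ""),       // drop trailing slash
    address: clean(raw.full_address),
  };
}
```

Normalizing email case and phone formatting here keeps the downstream AI-scoring and dedup steps consistent.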
by Jitesh Dugar
Process invoices with UploadToURL, AWS Textract, and Google Sheets

Eliminate manual data entry from your accounts payable process. This workflow transforms raw invoice scans into structured financial records by combining UploadToURL for hosting, AWS Textract for OCR data extraction, and Google Sheets for centralized logging.

🎯 What This Workflow Does
Turns any paper or PDF invoice into a verified spreadsheet entry in seconds:
- 📝 **Captures Invoice Scans**: Receives invoices via mobile upload (binary) or remote URL via Webhook.
- ☁️ **Instant CDN Hosting**: UploadToURL hosts the scan and provides a permanent link for your financial audit trail.
- 👁️ **Intelligent OCR**: AWS Textract analyzes the document to extract Vendor Name, Invoice Number, Amount, and Due Date.
- 🚦 **Validation & Duplicate Check**: Sanitizes currency formats and searches your Google Sheet to prevent double-paying the same invoice.
- 📊 **Automated Logging**: Records the extracted data directly into Google Sheets and pings the finance team in Slack.

✨ Key Features
- **UploadToURL Integration**: Hosts your financial evidence on a public CDN, making it accessible directly from your spreadsheet cells.
- **High-Accuracy Extraction**: Uses AWS Textract's specialized invoice processing to handle complex table layouts.
- **Audit-Ready Records**: Every entry includes the original file URL, upload timestamp, and department metadata.
- **Smart Formatting**: Automatically normalizes dates and coerces pricing into a standard float format for easy accounting.
- **Instant Notifications**: Keeps the finance team in the loop with real-time Slack alerts for high-priority bills.

💼 Perfect For
- **Finance Teams**: Processing 50+ vendor invoices monthly without manual typing.
- **Small Business Owners**: Managing utility bills and receipts on the go via smartphone uploads.
- **Operations Managers**: Tracking departmental spending with a standardized, automated log.
- **Agencies**: Organizing reimbursable expenses across multiple client projects.

🔧 What You'll Need
Required Integrations
- **UploadToURL**: To host invoice scans and provide audit links.
- **n8n Community Node**: n8n-nodes-uploadtourl must be installed.
- **AWS Account**: Credentials for AWS Textract (OCR).
- **Google Sheets**: OAuth2 credentials to write to your finance spreadsheet.

Optional Integrations
- **Slack**: To receive real-time notifications for new invoice entries.
- **Gmail**: To trigger the workflow automatically when an invoice arrives in your inbox.

🚀 Quick Start
1. **Import Template**: Copy the JSON and import it into your n8n canvas.
2. **Install Node**: Ensure the UploadToURL community node is installed.
3. **Set Credentials**: Link your UploadToURL, AWS, Google Sheets, and Slack accounts.
4. **Configure Spreadsheet**: Create a sheet with columns: Invoice No, Vendor, Amount, Due Date, and File URL.
5. **Set Variables**: Add your GSHEET_SPREADSHEET_ID and SLACK_FINANCE_CHANNEL to n8n variables.
6. **Deploy**: Switch the workflow to "Active" to begin your automated bookkeeping.

🎨 Customization Options
- **Approval Workflow**: Add a "Wait for Approval" node so a manager must click a Slack button before the row is finalized.
- **Tax Calculation**: Insert a Code node to automatically calculate VAT or Sales Tax based on the extracted total.
- **ERP Sync**: Replace Google Sheets with QuickBooks, Xero, or NetSuite to push data directly into your accounting software.
- **Multi-Currency Support**: Add a Currency Conversion node to normalize all totals into your base company currency.

📈 Expected Results
- **90% reduction** in manual data entry time (from minutes to seconds per bill).
- **Improved Accuracy**: Elimination of typos and transcription errors in financial totals.
- **Better Audit Compliance**: Every line item in your sheet is permanently linked to the original scan.
- **Faster Payment Cycles**: Bills are recorded the moment they are received, preventing late fees.

🏆 Use Cases
Accounts Payable: Automate the entry of monthly recurring bills (utilities, rent, software) directly into your tracking sheet.
Employee Reimbursements: Staff can snap a photo of a business dinner receipt and upload it to the webhook to start the reimbursement process instantly.
Bulk Document Digitization: Upload a folder of historical PDF invoices; the workflow will categorize and log them all in one batch.

💡 Pro Tips
- **Clear Scans**: Ensure invoices are well-lit and flat; high contrast helps AWS Textract achieve 99% accuracy.
- **Folder IDs**: Keep your Google Sheet in a shared Drive folder so your entire finance team can access the linked UploadToURL files.
- **Webhook Security**: Use a unique path or basic auth on your webhook to ensure only authorized devices can submit invoices.

Ready to automate your bookkeeping? Import this template and connect UploadToURL to build a hands-free finance pipeline today.

Questions about AWS Textract mapping? The workflow includes detailed sticky notes explaining how to extract custom fields from your specific vendor layouts.
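The "Smart Formatting" feature above — coercing extracted prices into floats and normalizing dates — could be sketched in a Code node like this. The input formats handled (US-style and European-style amounts, MM/DD/YYYY dates) are assumptions about typical invoice text, not the template's exact logic:

```javascript
// Coerce a Textract currency string into a float.
// Handles "$1,234.56" (US) and "1.234,56 EUR" (European) styles.
function normalizeAmount(text) {
  const stripped = text.replace(/[^\d.,-]/g, "");
  const lastComma = stripped.lastIndexOf(",");
  const lastDot = stripped.lastIndexOf(".");
  const decimalSep = lastComma > lastDot ? "," : ".";
  const thousandsRe = decimalSep === "," ? /\./g : /,/g;
  return parseFloat(stripped.replace(thousandsRe, "").replace(",", "."));
}

// Normalize a US-style date to ISO; pass anything else through unchanged.
function normalizeDate(text) {
  const us = text.match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})$/);
  if (us) {
    const [, m, d, y] = us;
    return `${y}-${m.padStart(2, "0")}-${d.padStart(2, "0")}`;
  }
  return text;
}
```

Storing normalized floats and ISO dates keeps the Google Sheets duplicate check and any downstream tax-calculation node reliable.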
by Madame AI
Scrape industry growth signals from BrowserAct to Slack reports

Introduction
This workflow automates the monitoring of market growth signals, such as funding rounds, for specific target industries. It scrapes data using BrowserAct, uses AI to filter and format the results for the current month, and delivers a consolidated report directly to Slack.

Target Audience
Account-based marketing (ABM) managers, sales development representatives (SDRs), and venture capital researchers looking to track competitor or prospect activity.

How it works
1. **Scheduling**: A Schedule Trigger initiates the workflow once a month to gather recent data.
2. **Configuration**: A Set node defines the Target_Industry variable (e.g., "Property Management"), which controls what data counts as a signal.
3. **Data Extraction**: The BrowserAct node runs the "ABM Signal Monitor" template to scrape raw company data and growth lists from the web.
4. **AI Analysis**: An AI Agent (powered by OpenRouter/GPT-4o) processes the raw text. It filters companies to ensure they match the target industry and have funding dates within the current month.
5. **Data Structuring**: The AI formats the valid leads into a structured JSON report, splitting the data if the list exceeds Slack's character limits.
6. **Data Splitting**: A Split Out node separates the message array into individual items to ensure reliable delivery.
7. **Notification**: A Slack node posts the curated "Growth Signal Report" to a specified channel.

How to set up
1. **Configure Credentials**: Connect your BrowserAct, OpenRouter, and Slack accounts in the n8n credentials section.
2. **Prepare BrowserAct**: Ensure you have the ABM Signal Monitor template saved in your BrowserAct account.
3. **Define Industry**: Open the Set Target Industry node and update the value field to the specific industry you want to track (e.g., "Fintech", "Healthcare").
4. **Select Slack Channel**: Open the Send the report to the channel node and select the Slack channel where you want the report to appear.

Requirements
- **BrowserAct Account**: You must have the ABM Signal Monitor template active in your library.
- **OpenRouter Account**: Required to access the GPT-4o model for data filtering and formatting.
- **Slack Account**: Required for receiving the final reports.

How to customize the workflow
- **Change the Trigger Frequency**: Update the Monthly Trigger node to run weekly or daily if you require more frequent updates on market signals.
- **Modify the AI Logic**: Edit the System Message in the AI Agent node to change the filtering criteria (e.g., filter by hiring volume instead of funding).
- **Add a Database**: Insert a Google Sheets or Notion node before the Slack step to archive the growth signals for long-term tracking.

Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- Workflow Guidance and Showcase Video

🚀 Automate ABM Sales Signals: Track Startup Funding with n8n & AI
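The "split the data if the list exceeds Slack's character limits" step can be done deterministically in code rather than by the AI. A minimal sketch that chunks a long report on line boundaries — the 4,000-character budget is an assumption about a safe per-message size, not a figure from the template:

```javascript
// Chunk a long report into Slack-sized pieces, breaking on newlines.
// maxLen of 4000 is an assumed safe budget; tune to your limit.
function splitForSlack(report, maxLen = 4000) {
  const chunks = [];
  let current = "";
  for (const line of report.split("\n")) {
    if (current && current.length + line.length + 1 > maxLen) {
      chunks.push(current); // flush before this line would overflow
      current = "";
    }
    current = current ? current + "\n" + line : line;
  }
  if (current) chunks.push(current);
  return chunks;
}
```

The resulting array is exactly what a Split Out node then separates into individual Slack messages.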
by Roshan Ramani
Who's it for
This workflow is perfect for:
- Content creators who need to stay on top of trending topics
- Marketers tracking industry discussions and competitor mentions
- Community managers monitoring relevant subreddits
- Researchers gathering trending content in specific niches
- Anyone who wants curated Reddit updates without manual browsing

What it does
This automated workflow:
- Monitors multiple subreddits for viral posts daily
- Filters posts based on engagement metrics (upvotes and recency)
- Generates concise AI summaries of trending content
- Delivers formatted updates directly to your Telegram chat
- Runs completely hands-free once configured

How it works
Step 1: Configuration & Scheduling
- Triggers daily at 8 AM (customizable)
- Loads your configured subreddit niches and Telegram settings

Step 2: Data Collection
- Loops through each subreddit in your niche list
- Fetches the 50 newest posts from each subreddit
- Extracts key data: title, URL, upvotes, timestamp, subreddit name

Step 3: Smart Filtering
- Applies viral post criteria: posts with 500+ upvotes, OR posts with 70+ upvotes created within the last 24 hours
- Ensures only high-engagement content passes through

Step 4: AI Summarization
- Aggregates all filtered posts into a single batch
- Sends to GPT-4o-mini for analysis
- Generates concise 100–200 word summaries
- Formats output for Telegram markdown

Step 5: Delivery
- Sends all summaries to your Telegram chat
- Includes post links and engagement metrics
- Delivers in a clean, readable format

Setup steps
1. Configure Reddit credentials
   - Connect your Reddit OAuth2 API credentials in the "Get Reddit Viral Posts" node
   - Ensure you have API access enabled on your Reddit account
2. Configure Telegram credentials
   - Add your Telegram bot token in the "Send to Telegram" node
   - Get your chat ID by messaging your bot and checking updates
3. Customize your niches
   - Open the "Workflow Configuration" node
   - Edit the niches array with your target subreddits
   - Default niches: technology, programming, science, gaming
4. Set your Telegram chat ID
   - Replace the default chat ID (7917193308) in "Workflow Configuration"
   - Use your personal chat ID or group chat ID
5. Adjust the schedule (optional)
   - Modify the "Daily 8 AM Trigger" to your preferred time
   - Change the frequency if you want multiple updates per day
6. Test before activating
   - Run the workflow manually using the "Test workflow" button
   - Verify summaries arrive in Telegram correctly
   - Check that the filtering logic works as expected

Requirements
Required credentials:
- Reddit OAuth2 API access (free)
- Telegram bot token (free via @BotFather)
- OpenAI API key for GPT-4o-mini (paid)

Platform requirements:
- n8n instance (self-hosted or n8n Cloud)
- Active internet connection
- Sufficient API rate limits for your usage

Technical knowledge:
- Basic understanding of n8n workflows
- Ability to generate API credentials
- Familiarity with Telegram bots (helpful but not required)

How to customize
Adjust subreddit monitoring:
- Add or remove subreddits in the niches array
- Format: ["subreddit1", "subreddit2", "subreddit3"]
- Example: ["machinelearning", "datascience", "artificial"]

Modify viral post criteria:
- Edit the "Filter" node conditions
- Change upvote thresholds (default: 500+, or 70+ within 24h)
- Adjust the time window for recency checks

Customize AI summaries:
- Update the system prompt in the "AI Summarizer" node
- Change summary length (default: 100–200 words)
- Modify tone, style, or focus areas
- Switch to different OpenAI models if needed

Change scheduling:
- Modify the trigger time in "Daily 8 AM Trigger"
- Options: hourly, twice daily, weekly, custom cron
- Consider API rate limits when increasing frequency

Adjust data collection:
- Change the limit parameter in "Get Reddit Viral Posts"
- Default: 50 posts per subreddit
- Higher limits = more comprehensive but slower execution

Enhance filtering logic:
- Add additional criteria (comment count, awards, etc.)
- Create category-specific thresholds
- Filter by post type (text, link, image)

Format Telegram output:
- Modify parse_mode in the "Send to Telegram" node
- Options: Markdown, HTML, or plain text
- Customize the message structure and styling
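The viral-post criteria above (500+ upvotes, or 70+ upvotes within 24 hours) map directly onto a small predicate. The field names `ups` and `created_utc` follow Reddit's API conventions for listing items; a sketch of the Filter node's logic in code form:

```javascript
// Default template thresholds: 500+ upvotes outright, or 70+ within 24h.
// post.ups and post.created_utc (seconds) follow Reddit's listing schema.
function isViral(post, now = Date.now()) {
  const ageHours = (now / 1000 - post.created_utc) / 3600;
  return post.ups >= 500 || (post.ups >= 70 && ageHours <= 24);
}
```

Adjusting the two numeric thresholds here is equivalent to editing the Filter node conditions described in the customization section.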
by Anoop
Who’s it for
Solo founders and spreadsheet gremlins who track everything in Notion and want crisp Telegram pings without opening a single page.

What it does
This workflow runs on daily, weekly, and monthly schedules, queries multiple Notion databases, summarizes key numbers, formats human‑readable messages, and sends them to Telegram. Out of the box it sends:
- **Daily**: Total Expenses Today (sum of Debit in **Financial Transaction** since the start of the day).
- **Weekly**: Total Expenses This Week, Monthly Budget Left/Spent per budget item, and Financial Obligations due (from **Scheduler**).
- **Month End (28th by default)**: Total Expenses This Month, Total Income This Month, and Funds status.
- **Month Start**: Liquidity snapshot — balances for Liquid and Semi Liquid assets (from **Assets and Liabilities**).

Messages are built via Code nodes (simple JS) and delivered to Telegram.

How it works (nodes & flow)
- **Schedule Triggers**: Daily, Weekly, Monthly (start & end).
- **Notion queries** (selected DBs):
  - Financial Transaction: filters on Created time, Type = Debit/Invoice.
  - Budget: Currently Applicable = true, Payment Schedule Type = Monthly, formula: Monthly Budget Left.
  - Income: month-to-date Created time filter.
  - Funds: reads Amount Left, Amount Spent, Amount Needed.
  - Scheduler: Next Date on or before now, Type = Financial, Repeat Type != off.
  - Assets and Liabilities: Liquidity = Liquid or Semi Liquid.
- **Summarize nodes**: sum property_cost / property_balance fields.
- **Set/Code nodes**: reshape Notion properties (e.g., property_next_date.start → next-date) and format text blocks like:
  - Total Expenses Today - Rs X
  - Monthly Budget Left - <list>
  - Invoices still to pay - <list>
  - Funds Info - spent/needed
  - Liquidity Balance - <list>
- **Telegram**: sends the composed message to chatId.

> Tip: If your Notion property names differ, adjust the filters and Set node mappings accordingly.

Requirements
- n8n (Cloud or self‑hosted)
- Notion workspace with the Personal Finance System Notion template cloned into your workspace
- Telegram account (for bot + chat)

Setup (quick)
Telegram
1. Create a bot via @BotFather → get the Bot Token.
2. Get your Chat ID (run the n8n Telegram Trigger once, then message your bot and copy chat.id).
3. In the Telegram Send node, set chatId (or use an env var/secret).

Notion
1. Create an Internal Integration, copy the token, and share each DB with the integration.
2. In the Notion nodes, select your Notion credential and map the DB IDs (already present in the JSON).

n8n Credentials
- Notion API credential: paste the integration token.
- Telegram API credential: paste the Bot Token and set chatId in the node or via expression.

Time windows
- Daily: on_or_after: $now.startOf('day')
- Weekly: on_or_after: $now.startOf('week')
- Monthly: on_or_after: $now.startOf('month')
- The month-end trigger runs on day 28 by default — change this in the Schedule node.

Customization
- Change the date ranges, add a currency symbol, or swap summaries for tables.
- Add more filters (labels, categories) to the Notion nodes.
- Replace Telegram with Slack/Email by swapping the final node.
- To avoid "expects dateTime but got object", convert $now to a string: {{$now.toISO()}}, or parse Notion dates with DateTime.fromISO(...) as needed.

Example messages
Total Expenses Today - Rs 1,840

Monthly Budget Left - 3
1) Groceries: Rs 4,500
2) Dining Out: Rs 1,200
3) Utilities: Rs 800

Invoices still to pay - 2
1) Figma Pro: Rs 3,000
2) AWS: Rs 2,450

Why this is useful
- Keeps your spend & cash visibility tight without opening Notion.
- Turns your financial system into low‑effort telemetry — you just look at Telegram.

Credentials you’ll likely name in n8n
- **Notion**: Notion account
- **Telegram**: Accountant AI

> Works great with "Personal Finance System"-style schemas. Adjust property keys (property_*) if your Notion columns differ.
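The Code-node formatting step that produces blocks like "Monthly Budget Left - 3" followed by a numbered list can be sketched as below. The item shape (`name`, `left`) is an assumption about the reshaped Set-node output, not the template's exact property keys:

```javascript
// Build one numbered Telegram text block from summarized Notion rows.
// Item shape { name, left } is an assumption about the Set-node output.
function formatBudgetBlock(title, items) {
  const lines = items.map(
    (it, i) => `${i + 1}) ${it.name}: Rs ${it.left.toLocaleString("en-IN")}`
  );
  return [`${title} - ${items.length}`, ...lines].join("\n");
}
```

The same shape works for the "Invoices still to pay" and "Liquidity Balance" blocks by swapping the title and input rows.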
by Fahmi Fahreza
This template sets up a weekly ETL (Extract, Transform, Load) pipeline that pulls financial data from QuickBooks Online into Google BigQuery. It not only transfers data, but also cleans, classifies, and enriches each transaction using your own business logic.

Who It's For
- **Data Analysts & BI Developers**: Need structured financial data in a warehouse to build dashboards (e.g., Looker Studio, Tableau) and run complex queries.
- **Financial Analysts & Accountants**: Want to run custom SQL queries beyond QuickBooks’ native capabilities.
- **Business Owners**: Need a permanent, historical archive of transactions for reporting and tracking.

What the Workflow Does
1. **Extract**: Every Monday, fetches the previous week's transactions from your QuickBooks Online account.
2. **Transform**: Applies custom business logic: cleans up text fields, generates stable transaction IDs, and classifies transactions (income, expense, internal transfer).
3. **Format**: Prepares the cleaned data as a bulk-insert-ready SQL statement.
4. **Load**: Inserts the structured and enriched data into a Google BigQuery table.

Setup Guide
1. **Prepare BigQuery**: Create a dataset (e.g., quickbooks) and table (e.g., transactions). The table schema must match the SQL query in the "Load Data to BigQuery" node.
2. **Add Credentials**: Add QuickBooks Online and Google BigQuery credentials to your n8n instance.
3. **Configure Business Logic**: Open the Clean & Classify Transactions node and update the JavaScript arrays: internalTransferAccounts, expenseCategories, incomeCategories. Ensure these match your QuickBooks Chart of Accounts exactly.
4. **Configure BigQuery Node**: Open the Load Data to BigQuery node, select the correct Google Cloud project, and ensure the SQL query references the correct dataset and table.
5. **Activate the Workflow**: Save and activate it; the workflow will now run weekly.

Requirements
- A running n8n instance (Cloud or Self-Hosted)
- A QuickBooks Online account
- A Google Cloud Platform project with BigQuery enabled
- A BigQuery table with a matching schema

Customization Options
- **Change Schedule**: Modify the schedule node to run daily, monthly, or at a different time.
- **Adjust Date Range**: Change the date macro in the Get Last Week's Transactions node.
- **Refine Classification Rules**: Add custom logic in the Clean & Classify Transactions node to handle specific edge cases.
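The three arrays the setup guide asks you to edit drive a simple lookup-based classifier. A minimal sketch of that step — the sample account names are placeholders, so mirror your actual QuickBooks Chart of Accounts:

```javascript
// The three lookup arrays from the Clean & Classify Transactions node.
// Account names below are placeholders -- match your Chart of Accounts exactly.
const internalTransferAccounts = ["Owner Transfers", "Savings Sweep"];
const incomeCategories = ["Sales", "Consulting Income"];
const expenseCategories = ["Software", "Rent", "Payroll"];

function classifyTransaction(accountName) {
  if (internalTransferAccounts.includes(accountName)) return "internal_transfer";
  if (incomeCategories.includes(accountName)) return "income";
  if (expenseCategories.includes(accountName)) return "expense";
  return "uncategorized"; // surfaces gaps in your account lists
}
```

Checking the internal-transfer list first matters: a transfer account should never be mislabeled as income or expense even if it also appears in another list.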
by Alex Berman
Who is this for
This workflow is for sales teams, growth hackers, and lead generation agencies who want to build a targeted list of Shopify store owners -- complete with emails, phone numbers, and social profiles -- and receive those contacts directly in Slack for immediate follow-up.

How it works
1. A manual trigger starts the workflow and passes your search parameters (country, platform, lead count) to the ScraperCity store-leads API.
2. The API returns a runId immediately. Because scrapes can take 10--60 minutes, the workflow enters an async polling loop, waiting 60 seconds between each status check.
3. Once the scrape status returns SUCCEEDED, the workflow downloads the full CSV result set.
4. A Code node parses the CSV, removes duplicates, and formats each lead into a clean Slack message block.
5. Each contact (store name, email, phone, social links) is posted as a structured Slack message so your team can act on leads in real time.

How to set up
1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, add an HTTP Header Auth credential named ScraperCity API Key with the header name Authorization and value Bearer YOUR_KEY.
3. Add a Slack credential (OAuth2) and connect it to the Post Lead to Slack node.
4. In the Configure Search Parameters node, update countryCode, totalLeads, and slackChannel to match your needs.
5. Click Execute workflow to run.

Requirements
- ScraperCity account and API key (scrapercity.com)
- n8n instance (cloud or self-hosted)
- Slack workspace with a bot token and target channel

How to customize the workflow
- Change platform in Configure Search Parameters from shopify to woocommerce to target WooCommerce stores instead.
- Increase totalLeads up to 5000 per run.
- Add a Filter node after CSV parsing to keep only leads with verified emails.
- Replace the Slack node with a Google Sheets or HubSpot node to store leads in a CRM.
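The CSV-parse-and-dedupe Code node step can be sketched as follows. This is a naive split-based parser (it assumes no quoted commas inside fields), and the column names are assumptions about the ScraperCity export format:

```javascript
// Parse the result CSV and drop duplicate leads by email (or store name).
// Naive parser: assumes no quoted commas in fields. Column names are
// assumptions about the ScraperCity export.
function parseLeads(csv) {
  const [headerLine, ...rows] = csv.trim().split("\n");
  const headers = headerLine.split(",");
  const seen = new Set();
  const leads = [];
  for (const row of rows) {
    const values = row.split(",");
    const lead = Object.fromEntries(headers.map((h, i) => [h, values[i] ?? ""]));
    const key = (lead.email || lead.store_name || "").toLowerCase();
    if (key && !seen.has(key)) { // skip duplicates
      seen.add(key);
      leads.push(lead);
    }
  }
  return leads;
}
```

For production CSVs with quoted fields, swap the naive `split(",")` for a proper CSV parser.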
by Jitesh Dugar
Eliminate the manual chaos of HR and legal document management. This workflow automates the transition from a raw document upload to a structured, audit-ready archive by combining UploadToURL for instant CDN hosting, Google Drive for long-term storage, and Airtable for status tracking and database management.

🎯 What This Workflow Does
Transforms loose document scans into a structured corporate filing system:
- 📝 **Captures Legal Assets**: Receives signed contracts or IDs via mobile scan (binary) or remote URL.
- 🛡️ **Duplicate Prevention**: Checks Airtable first to ensure a contract isn't already filed for that specific Employee ID.
- ☁️ **Instant CDN Hosting**: UploadToURL hosts the document to provide a high-speed link for immediate HR review.
- 📁 **Smart Folder Logic**: Automatically navigates or creates a structured Google Drive path: HR/Contracts/{Year}/{Department}/{EmployeeName}/.
- 🗃️ **Database Synchronization**: Updates (or creates) an Airtable record to tick "Contract Received," logging both the Drive URL and the CDN backup.
- 📧 **Automated Confirmation**: Sends a professional HTML email to HR and the employee with access links and filing metadata.

✨ Key Features
- **UploadToURL Integration**: Provides a redundant, accessible CDN link stored alongside your primary Drive storage for total data reliability.
- **Auto-Nomenclature**: Renames files using a strict audit-ready format: {EmployeeID}_{LastName}_{Type}_{Date}.pdf.
- **Intelligent Folder Creation**: Never manually create a folder again; the workflow builds the entire hierarchy on the fly.
- **Audit Trail Generation**: Captures "Filed By," "Filed At," and unique "Upload IDs" for every document.
- **Conflict Handling**: Built-in 409 Conflict logic prevents accidental overwrites or double-filing of critical legal papers.

💼 Perfect For
- **HR Teams**: Managing onboarding documents and employment contracts at scale.
- **Legal Departments**: Archiving NDAs, vendor agreements, and compliance certifications.
- **Small Businesses**: Moving away from "loose files in folders" to a searchable, automated database.
- **Remote Teams**: Enabling employees to "upload and forget" their paperwork via a simple link.

🔧 What You'll Need
Required Integrations
- **UploadToURL**: To host documents and provide public CDN backup links.
- **n8n Community Node**: n8n-nodes-uploadtourl must be installed.
- **Google Drive**: OAuth2 credentials for secure document storage.
- **Airtable**: Personal Access Token to manage your employee/document database.
- **Gmail / SMTP**: To send automated filing confirmations.

Configuration Variables
- GDRIVE_ROOT_FOLDER_ID: The ID of your main HR folder in Google Drive.
- AIRTABLE_BASE_ID: Your specific Airtable base for HR/Legal tracking.

🚀 Quick Start
1. **Import Template**: Copy the JSON and import it into your n8n workspace.
2. **Install Node**: Ensure the UploadToURL community node is active.
3. **Set Credentials**: Link your UploadToURL, Google Drive, Airtable, and Gmail accounts.
4. **Define Variables**: Set your Root Folder ID and Airtable Base details in n8n variables.
5. **Test the Pipeline**: Send a test POST with a sample PDF to the Webhook URL.
6. **Activate**: Enable the workflow to begin hands-free archiving.

🎨 Customization Options
- **Expiration Alerts**: Add a node to calculate 1-year expiry dates and set an automated reminder in Slack.
- **OCR Processing**: Integrate an OCR step to read the content of scans and verify names automatically.
- **Watermarking**: Add a "Confidential" or "Draft" watermark to documents before they are uploaded to the CDN.
- **Multi-Base Routing**: Route documents to different Airtable bases depending on the "Department" field.

📈 Expected Results
- **100% Consistency** in file naming and folder structures across the entire organization.
- **Zero manual data entry**: Employee records and checkboxes update automatically.
- **Audit-ready in minutes**: Every file has a timestamped trail and redundant storage links.
- **Instant Accessibility**: HR can view documents via the CDN link before Drive permissions even propagate.

🏆 Use Cases
High-Growth Onboarding: A startup hiring 20 people a month can automate all contract filings, ensuring the "Contract Received" flag is always accurate for payroll.
Compliance Audits: When auditors ask for specific contracts, use the Airtable "Structured Filename" column to find and share the relevant Drive or CDN links in seconds.
Field Service Scans: Technicians in the field can upload signed site reports via a mobile app; the workflow handles the filing and notifies the office immediately.

💡 Pro Tips
- **Folder IDs**: You can find your GDRIVE_ROOT_FOLDER_ID by looking at the last string in the URL when you are inside that folder in your browser.
- **Structured JSON**: Use the returned auditTrail object to build a log of all uploads in a separate "Master Audit" spreadsheet.
- **Employee IDs**: If no ID is provided, the workflow generates a temporary one using a timestamp so the archive never breaks.

Ready to secure your document pipeline? Import this template and connect UploadToURL to build a world-class archiving system in under 20 minutes.

Need help with Airtable field mapping? The workflow includes detailed sticky notes explaining the exact field names required for the automation to run.
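The auto-nomenclature step — building the {EmployeeID}_{LastName}_{Type}_{Date}.pdf filename, with a timestamp-based temporary ID when none is provided — can be sketched like this. The exact sanitization rules are assumptions, not the template's verbatim code:

```javascript
// Build the audit-ready filename; falls back to a timestamp-based temp ID
// (as the Pro Tips describe). Sanitization rules here are assumptions.
function buildFilename({ employeeId, lastName, docType, date }) {
  const id = employeeId || `TMP${Date.now()}`;
  const safe = (s) => String(s).replace(/[^A-Za-z0-9-]/g, ""); // strip unsafe chars
  return `${safe(id)}_${safe(lastName)}_${safe(docType)}_${date}.pdf`;
}
```

Keeping the sanitization strict means the filename stays greppable in both the Drive hierarchy and the Airtable "Structured Filename" column.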