by osama goda
How it works

This workflow automatically uploads YouTube Shorts from a Google Drive folder. It picks one video at each run, generates a YouTube-optimized title, description, and hashtags using an AI model, uploads the video through YouTube’s resumable upload API, and finally moves the processed video to a “Posted” folder.

Key steps

- Run on a schedule (daily/hourly/custom CRON)
- Fetch one video from a selected Google Drive folder
- Generate a title + description + hashtags using an LLM
- Prepare YouTube metadata
- Upload the video via resumable upload
- Move the file to a “Posted” folder to avoid duplicates

Setup instructions

- Connect your Google Drive credentials
- Connect your YouTube OAuth2 credentials
- Update the Drive folder IDs (input + posted folders)
- Edit the “Set variables” node to change the store name, country, coupon code, and tone
- Review the prompt in the AI node if you want to customize the content style

All technical details are documented inside the sticky notes within the workflow.
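The "Prepare YouTube metadata" step can be sketched as a small Code-node function. This is a minimal sketch, assuming the AI node returned `{ title, description, hashtags }` (field names are illustrative); the `snippet`/`status` shape follows the YouTube Data API v3 `videos.insert` request body.

```javascript
// Sketch of a "Prepare YouTube metadata" Code node.
// Assumption: the previous AI node produced { title, description, hashtags }.
function buildYouTubeMetadata(ai) {
  const title = ai.title.slice(0, 100); // YouTube caps titles at 100 characters
  // Append hashtags to the description so they render under the Short
  const description = `${ai.description}\n\n${ai.hashtags.join(' ')}`;
  return {
    snippet: {
      title,
      description,
      tags: ai.hashtags.map(h => h.replace(/^#/, '')), // API tags carry no '#'
      categoryId: '22', // "People & Blogs" (adjust to taste)
    },
    status: { privacyStatus: 'public', selfDeclaredMadeForKids: false },
  };
}

const meta = buildYouTubeMetadata({
  title: 'Summer Sale Haul',
  description: 'Top picks from our store.',
  hashtags: ['#shorts', '#sale'],
});
```

The returned object can then be sent as the JSON body that initiates the resumable upload session.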
by Ziad Adel
Turn LinkedIn Noise Into Weekly Slack Insights

🚀 What if your team could skim the best of LinkedIn in 2 minutes instead of scrolling for hours? This workflow transforms raw LinkedIn posts into a bite-sized Slack digest — summarized, grouped, and delivered automatically every week.

⚡ What It Does

- **Scrapes Posts Automatically**: Pulls fresh posts from LinkedIn profiles you specify (via Apify).
- **Summarizes with AI**: Condenses each post into 2–3 bullets (≤15 words each).
- **Keeps It Lean**: Digest capped at 500 words total.
- **Organized by Author**: See exactly who said what, without searching.
- **Delivers to Slack**: A neatly formatted digest drops into your channel on schedule, with post links in thread replies.

🛠 How It Works

1. **Google Sheets → Profile URLs**: Add LinkedIn profile URLs to a Google Sheet tab — this is your watchlist.
2. **Apify Scraper → Posts**: Fetches up to 10 posts per profile from the past 7 days.
3. **Clean & Format**: Code nodes strip out clutter (hashtags, broken links, escapes).
4. **OpenAI Summarizer**: AI rewrites posts into concise bullets and trims the digest to under 500 words.
5. **Slack Delivery**: The digest posts directly in Slack every Sunday morning, with original links attached as thread replies.

✅ Pre-conditions / Requirements

- **Google Sheets API credentials** connected in n8n.
- **Apify account + API token** for the LinkedIn profile posts actor.
- **OpenAI API key** for summarization.
- **Slack bot token** with permission to post messages in your chosen channel.
- Profiles you want to track must be publicly viewable or accessible to Apify.

🎛 Customization Options

- **Schedule**: Change the Cron node if you prefer daily or monthly digests.
- **Batch Size**: Default is 5 profiles per batch — increase or decrease for performance.
- **Summaries**: Adjust the OpenAI system prompt to allow longer or shorter bullet points.
- **Filters**: Extend extendOutputFunction to exclude reposts or sponsored posts, or to keep specific authors only.
- **Slack Output**: Change the formatting or channel, or send a direct message instead of posting in a channel.

💡 Why This Is Valuable

- Saves your team 3–5 hours/week of scrolling.
- Keeps everyone updated with actionable insights, not filler.
- Turns a chaotic LinkedIn feed into a signal-only digest.
- Positions you as the one who always brings the smartest highlights to the table.

🎯 Best For

- Founders who want LinkedIn insights without endless scrolling.
- Marketing and growth teams tracking thought leaders.
- Operators who want signal over noise, delivered straight to Slack.

No more mindless scrolling. Just sharp insights, automatically packaged. ✅
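The Clean & Format step above can be sketched as a short Code-node function. This is an illustrative sketch, assuming each item carries the raw post text; the exact patterns in the template may differ.

```javascript
// Sketch of the clean-up step: strip hashtags, truncated links, and escapes.
function cleanPost(text) {
  return text
    .replace(/\\n/g, '\n')            // unescape literal "\n" sequences
    .replace(/https?:\/\/\S*…/g, '')  // drop links truncated with an ellipsis
    .replace(/#\w+/g, '')             // strip hashtags
    .replace(/[ \t]{2,}/g, ' ')       // collapse runs of spaces
    .trim();
}
```

In an n8n Code node this would run once per item, writing the cleaned text back onto `item.json` before the summarizer.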
by Pixcels Themes
Who’s it for

This template is designed for recruiters, lead-generation teams, agency owners, and sales professionals who collect LinkedIn profile data and need to automate finding verified company domains and email addresses. It is ideal for teams looking to eliminate manual research and streamline prospect enrichment.

What it does / How it works

This workflow reads contact records from a Google Sheet, including name, position, and description. An AI agent analyzes each profile to determine the company domain. If the domain is already identifiable from the description, it is used directly. If no domain is found, the workflow generates an intelligent search term and performs a Google Custom Search, and a second AI agent extracts the most accurate domain from the real web results. Once the domain is confirmed, the workflow queries Hunter.io to find the best-matching email address for the contact. Finally, the enriched data (email and company domain) is appended back into the Google Sheet, updating each row automatically.

Requirements

- Google Sheets OAuth2 credentials
- Google Gemini (PaLM) API credentials
- Hunter.io API key
- Google Custom Search API key and CSE ID
- A Google Sheet with columns for name, position, description, and domain

How to set up

1. Connect your Google Sheets, Gemini, Hunter.io, and Google Search credentials.
2. Replace the Google Sheet ID and sheet name with your own.
3. Add your API keys to the designated nodes.
4. Ensure column names match your sheet structure.
5. Execute the workflow to begin enrichment.

How to customize the workflow

- Modify AI prompts for better domain inference
- Add additional enrichment steps (social profiles, industry tags)
- Add fallback email providers (Snov, Apollo, etc.)
- Change the update logic to support multiple sheets or batch processing
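Before querying Hunter.io, the domain extracted from a search-result URL usually needs normalizing. A minimal sketch of that post-processing step (not part of the template itself, but the kind of cleanup the AI agent's output may need):

```javascript
// Sketch: normalize a search-result URL into a bare domain for Hunter.io,
// which expects "acme.com" rather than a full URL or "www." host.
function extractDomain(url) {
  const host = new URL(url).hostname.toLowerCase();
  return host.replace(/^www\./, '');
}
```

Running the AI agent's chosen URL through a step like this guards against malformed domains reaching the Hunter.io node.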
by Yaron Been
Monitor CRM accounts for hiring spikes by enriching HubSpot companies with PredictLeads job data and alerting your team via Slack.

This workflow pulls all companies from your HubSpot CRM, checks each one against the PredictLeads Job Openings API for target roles (sales, engineering, marketing, product, data), compares the current count to historical data stored in Google Sheets, and flags any company where hiring jumped more than 50%. Flagged companies get updated in HubSpot with a hiring signal and trigger a Slack alert so your sales team can act fast.

How it works:

1. A schedule trigger runs the workflow daily at 9 AM.
2. Retrieves all companies from HubSpot CRM (domain, name, ID).
3. Loops through each company and fetches job openings from PredictLeads.
4. Filters jobs to target roles (sales, engineering, marketing, product, data).
5. Reads the previous job count for that company from Google Sheets.
6. Calculates the percentage change between current and historical counts.
7. If hiring increased more than 50%, flags it as a spike.
8. Updates the HubSpot company record with a hiring-signal property.
9. Sends a Slack alert with the company name, role count, and percentage change.
10. Updates Google Sheets with the latest count regardless of spike status.

Setup:

- Connect your HubSpot CRM (OAuth2) with company read/write access.
- Create a Google Sheet with a "HistoricalCounts" tab containing the columns: domain, company_name, job_count, previous_count, percent_change, check_date.
- Connect a Slack bot to the channel where you want hiring alerts.
- Add your PredictLeads API credentials (X-Api-Key and X-Api-Token headers).

Requirements:

- HubSpot CRM account with OAuth2 credentials.
- Google Sheets OAuth2 credentials.
- Slack OAuth2 credentials (bot with chat:write permission).
- PredictLeads API account (https://docs.predictleads.com).

Notes:

- The 50% spike threshold can be adjusted in the IF node.
- Target roles are configured in the Filter Target Roles code node; add or remove roles as needed.
- The workflow updates historical data on every run, so spike detection improves over time.
- PredictLeads Job Openings API docs: https://docs.predictleads.com
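The spike calculation in steps 6–7 can be sketched as a single function (a sketch only; the template implements this across an expression and an IF node):

```javascript
// Sketch of the spike check: compare the current filtered job count
// against the previous count read from the "HistoricalCounts" sheet.
function detectSpike(current, previous, thresholdPct = 50) {
  // A company with no prior data can't be compared; treat it as no spike.
  if (!previous) return { percentChange: null, spike: false };
  const percentChange = ((current - previous) / previous) * 100;
  return { percentChange, spike: percentChange > thresholdPct };
}
```

Raising or lowering `thresholdPct` here corresponds to editing the IF node threshold mentioned in the notes.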
by Roshan Ramani
Who's it for

This workflow is perfect for:

- Content creators who need to stay on top of trending topics
- Marketers tracking industry discussions and competitor mentions
- Community managers monitoring relevant subreddits
- Researchers gathering trending content in specific niches
- Anyone who wants curated Reddit updates without manual browsing

What it does

This automated workflow:

- Monitors multiple subreddits for viral posts daily
- Filters posts based on engagement metrics (upvotes and recency)
- Generates concise AI summaries of trending content
- Delivers formatted updates directly to your Telegram chat
- Runs completely hands-free once configured

How it works

Step 1: Configuration & Scheduling
- Triggers daily at 8 AM (customizable)
- Loads your configured subreddit niches and Telegram settings

Step 2: Data Collection
- Loops through each subreddit in your niche list
- Fetches the 50 newest posts from each subreddit
- Extracts key data: title, URL, upvotes, timestamp, subreddit name

Step 3: Smart Filtering
- Applies viral-post criteria: posts with 500+ upvotes, OR posts with 70+ upvotes created within the last 24 hours
- Ensures only high-engagement content passes through

Step 4: AI Summarization
- Aggregates all filtered posts into a single batch
- Sends them to GPT-4o-mini for analysis
- Generates concise 100–200-word summaries
- Formats the output for Telegram markdown

Step 5: Delivery
- Sends all summaries to your Telegram chat
- Includes post links and engagement metrics
- Delivers in a clean, readable format

Setup steps

1. Configure Reddit credentials
- Connect your Reddit OAuth2 API credentials in the "Get Reddit Viral Posts" node
- Ensure you have API access enabled on your Reddit account

2. Configure Telegram credentials
- Add your Telegram bot token in the "Send to Telegram" node
- Get your chat ID by messaging your bot and checking updates

3. Customize your niches
- Open the "Workflow Configuration" node
- Edit the niches array with your target subreddits
- Default niches: technology, programming, science, gaming

4. Set your Telegram chat ID
- Replace the default chat ID (7917193308) in "Workflow Configuration"
- Use your personal chat ID or a group chat ID

5. Adjust the schedule (optional)
- Modify the "Daily 8 AM Trigger" to your preferred time
- Change the frequency if you want multiple updates per day

6. Test before activating
- Run the workflow manually using the "Test workflow" button
- Verify summaries arrive in Telegram correctly
- Check that the filtering logic works as expected

Requirements

Required credentials:
- Reddit OAuth2 API access (free)
- Telegram bot token (free via @BotFather)
- OpenAI API key for GPT-4o-mini (paid)

Platform requirements:
- n8n instance (self-hosted or n8n Cloud)
- Active internet connection
- Sufficient API rate limits for your usage

Technical knowledge:
- Basic understanding of n8n workflows
- Ability to generate API credentials
- Familiarity with Telegram bots (helpful but not required)

How to customize

Adjust subreddit monitoring:
- Add or remove subreddits in the niches array
- Format: ["subreddit1", "subreddit2", "subreddit3"]
- Example: ["machinelearning", "datascience", "artificial"]

Modify viral-post criteria:
- Edit the "Filter" node conditions
- Change upvote thresholds (default: 500+, or 70+ within 24h)
- Adjust the time window for recency checks

Customize AI summaries:
- Update the system prompt in the "AI Summarizer" node
- Change the summary length (default: 100–200 words)
- Modify the tone, style, or focus areas
- Switch to a different OpenAI model if needed

Change scheduling:
- Modify the trigger time in "Daily 8 AM Trigger"
- Options: hourly, twice daily, weekly, custom cron
- Consider API rate limits when increasing the frequency

Adjust data collection:
- Change the limit parameter in "Get Reddit Viral Posts"
- Default: 50 posts per subreddit
- Higher limits are more comprehensive but slow execution

Enhance filtering logic:
- Add additional criteria (comment count, awards, etc.)
- Create category-specific thresholds
- Filter by post type (text, link, image)

Format Telegram output:
- Modify parse_mode in the "Send to Telegram" node
- Options: Markdown, HTML, or plain text
- Customize the message structure and styling
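The smart-filtering criteria can be sketched in a few lines. This is a sketch assuming Reddit's standard listing fields `ups` (upvotes) and `created_utc` (epoch seconds); the template's Filter node expresses the same conditions declaratively.

```javascript
// Sketch of the viral-post filter: 500+ upvotes at any age,
// or 70+ upvotes within the last 24 hours.
function isViral(post, nowMs = Date.now()) {
  const ageHours = (nowMs / 1000 - post.created_utc) / 3600;
  return post.ups >= 500 || (post.ups >= 70 && ageHours <= 24);
}
```

Tightening either threshold here mirrors editing the Filter node conditions described above.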
by Madame AI
Scrape industry growth signals from BrowserAct to Slack reports

Introduction

This workflow automates the monitoring of market growth signals, such as funding rounds, for specific target industries. It scrapes data using BrowserAct, uses AI to filter and format the results for the current month, and delivers a consolidated report directly to Slack.

Target Audience

Account-based marketing (ABM) managers, sales development representatives (SDRs), and venture capital researchers looking to track competitor or prospect activity.

How it works

1. **Scheduling**: A Schedule Trigger initiates the workflow once a month to gather recent data.
2. **Configuration**: A Set node defines the Target_Industry variable (e.g., "Property Management"), which controls what data counts as a signal.
3. **Data Extraction**: The BrowserAct node runs the "ABM Signal Monitor" template to scrape raw company data and growth lists from the web.
4. **AI Analysis**: An AI Agent (powered by OpenRouter/GPT-4o) processes the raw text. It filters companies to ensure they match the target industry and have funding dates within the current month.
5. **Data Structuring**: The AI formats the valid leads into a structured JSON report, splitting the data if the list exceeds Slack's character limits.
6. **Data Splitting**: A Split Out node separates the message array into individual items to ensure reliable delivery.
7. **Notification**: A Slack node posts the curated "Growth Signal Report" to a specified channel.

How to set up

1. **Configure credentials**: Connect your BrowserAct, OpenRouter, and Slack accounts in the n8n credentials section.
2. **Prepare BrowserAct**: Ensure you have the ABM Signal Monitor template saved in your BrowserAct account.
3. **Define the industry**: Open the Set Target Industry node and update the value field to the specific industry you want to track (e.g., "Fintech", "Healthcare").
4. **Select the Slack channel**: Open the "Send the report to the channel" node and select the Slack channel where you want the report to appear.

Requirements

- **BrowserAct account**: You must have the **ABM Signal Monitor** template active in your library.
- **OpenRouter account**: Required to access the GPT-4o model for data filtering and formatting.
- **Slack account**: Required for receiving the final reports.

How to customize the workflow

- **Change the trigger frequency**: Update the Monthly Trigger node to run weekly or daily if you require more frequent updates on market signals.
- **Modify the AI logic**: Edit the System Message in the AI Agent node to change the filtering criteria (e.g., filter by hiring volume instead of funding).
- **Add a database**: Insert a Google Sheets or Notion node before the Slack step to archive the growth signals for long-term tracking.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video

🚀 Automate ABM Sales Signals: Track Startup Funding with n8n & AI
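The splitting behavior in the Data Structuring and Data Splitting steps can be sketched as follows. This is an illustrative sketch (the template has the AI emit a pre-split message array); `3500` is an assumed safety margin under Slack's roughly 4,000-character message limit.

```javascript
// Sketch: split a long report (one lead per line) into Slack-sized chunks.
function chunkForSlack(lines, limit = 3500) {
  const chunks = [];
  let current = '';
  for (const line of lines) {
    // Start a new chunk if appending this line would exceed the limit.
    if (current && current.length + line.length + 1 > limit) {
      chunks.push(current);
      current = '';
    }
    current += (current ? '\n' : '') + line;
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk then becomes one item for the Split Out node, so every Slack message stays under the limit.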
by Anoop
Who’s it for

Solo founders and spreadsheet gremlins who track everything in Notion and want crisp Telegram pings without opening a single page.

What it does

This workflow runs on daily, weekly, and monthly schedules, queries multiple Notion databases, summarizes key numbers, formats human-readable messages, and sends them to Telegram. Out of the box it sends:

- **Daily**: Total Expenses Today (sum of Debit in **Financial Transaction** since the start of the day).
- **Weekly**: Total Expenses This Week, Monthly Budget Left/Spent per budget item, Financial Obligations due (from **Scheduler**).
- **Month End (28th by default)**: Total Expenses This Month, Total Income This Month, Funds status.
- **Month Start**: Liquidity snapshot — balances for Liquid and Semi Liquid assets (from **Assets and Liabilities**).

Messages are built via Code nodes (simple JS) and delivered to Telegram.

How it works (nodes & flow)

- **Schedule Triggers**: Daily, Weekly, Monthly (start & end).
- **Notion queries** (selected DBs):
  - Financial Transaction: filters on Created time, Type = Debit/Invoice.
  - Budget: Currently Applicable = true, Payment Schedule Type = Monthly, formula: Monthly Budget Left.
  - Income: month-to-date Created time filter.
  - Funds: reads Amount Left, Amount Spent, Amount Needed.
  - Scheduler: Next Date on or before now, Type = Financial, Repeat Type != off.
  - Assets and Liabilities: Liquidity = Liquid or Semi Liquid.
- **Summarize nodes**: sum property_cost / property_balance fields.
- **Set/Code nodes**: reshape Notion properties (e.g., property_next_date.start → next-date) and format text blocks like: Total Expenses Today - Rs X; Monthly Budget Left - <list>; Invoices still to pay - <list>; Funds Info - spent/needed; Liquidity Balance - <list>.
- **Telegram**: sends the composed message to chatId.

> Tip: If your Notion property names differ, adjust the filters and Set node mappings accordingly.

Requirements

- n8n (Cloud or self-hosted)
- Notion workspace with the Personal Finance System Notion template cloned into your workspace
- Telegram account (for bot + chat)

Setup (quick)

1. Telegram
   - Create a bot via @BotFather → get the Bot Token.
   - Get your Chat ID (run the n8n Telegram Trigger once, then message your bot and copy chat.id).
   - In the Telegram Send node, set chatId (or use an env var/secret).
2. Notion
   - Create an Internal Integration, copy the token, and share each DB with the integration.
   - In the Notion nodes, select your Notion credential and map the DB IDs (already present in the JSON).
3. n8n Credentials
   - Notion API credential: paste the integration token.
   - Telegram API credential: paste the Bot Token and set chatId in the node or via an expression.
4. Time windows
   - Daily: on_or_after: $now.startOf('day')
   - Weekly: on_or_after: $now.startOf('week')
   - Monthly: on_or_after: $now.startOf('month')
   - The monthly end trigger runs on day 28 by default — change this in the Schedule node.

Customization

- Change the date ranges, add a currency symbol, or swap summaries for tables.
- Add more filters (labels, categories) to the Notion nodes.
- Replace Telegram with Slack/Email by swapping the final node.
- To avoid "expects dateTime but got object" errors, convert $now to a string with {{$now.toISO()}}, or parse Notion dates with DateTime.fromISO(...) as needed.

Example messages

Total Expenses Today - Rs 1,840

Monthly Budget Left - 3
1) Groceries: Rs 4,500
2) Dining Out: Rs 1,200
3) Utilities: Rs 800

Invoices still to pay - 2
1) Figma Pro: Rs 3,000
2) AWS: Rs 2,450

Why this is useful

- Keeps your spend and cash visibility tight without opening Notion.
- Turns your financial system into low-effort telemetry — you just look at Telegram.

Credentials you’ll likely name in n8n

- **Notion**: Notion account
- **Telegram**: Accountant AI

> Works great with “Personal Finance System”-style schemas. Adjust property keys (property_*) if your Notion columns differ.
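A Code-node sketch of how the "Monthly Budget Left" block in the example messages could be formatted. This assumes each summarized row carries a name and a property_cost field (keys are illustrative and should match your Notion schema):

```javascript
// Sketch: render budget rows into the numbered Telegram block shown above.
// Simple thousands-separator, avoiding locale-dependent formatting.
const fmt = n => n.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ',');

function formatBudgetMessage(rows) {
  const lines = rows.map((r, i) => `${i + 1}) ${r.name}: Rs ${fmt(r.property_cost)}`);
  return [`Monthly Budget Left - ${rows.length}`, ...lines].join('\n');
}
```

The other blocks (invoices, funds, liquidity) follow the same pattern with different fields.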
by Jitesh Dugar
Process invoices with UploadToURL, AWS Textract, and Google Sheets

Eliminate manual data entry from your accounts payable process. This workflow transforms raw invoice scans into structured financial records by combining UploadToURL for hosting, AWS Textract for OCR data extraction, and Google Sheets for centralized logging.

🎯 What This Workflow Does

Turns any paper or PDF invoice into a verified spreadsheet entry in seconds:

- 📝 **Captures Invoice Scans** - Receives invoices via mobile upload (binary) or remote URL via Webhook.
- ☁️ **Instant CDN Hosting** - UploadToURL hosts the scan and provides a permanent link for your financial audit trail.
- 👁️ **Intelligent OCR** - AWS Textract analyzes the document to extract the Vendor Name, Invoice Number, Amount, and Due Date.
- 🚦 **Validation & Duplicate Check** - Sanitizes currency formats and searches your Google Sheet to prevent double-paying the same invoice.
- 📊 **Automated Logging** - Records the extracted data directly into Google Sheets and pings the finance team in Slack.

✨ Key Features

- **UploadToURL Integration**: Hosts your financial evidence on a public CDN, making it accessible directly from your spreadsheet cells.
- **High-Accuracy Extraction**: Uses AWS Textract's specialized invoice processing to handle complex table layouts.
- **Audit-Ready Records**: Every entry includes the original file URL, upload timestamp, and department metadata.
- **Smart Formatting**: Automatically normalizes dates and coerces pricing into a standard float format for easy accounting.
- **Instant Notifications**: Keeps the finance team in the loop with real-time Slack alerts for high-priority bills.

💼 Perfect For

- **Finance Teams**: Processing 50+ vendor invoices monthly without manual typing.
- **Small Business Owners**: Managing utility bills and receipts on the go via smartphone uploads.
- **Operations Managers**: Tracking departmental spending with a standardized, automated log.
- **Agencies**: Organizing reimbursable expenses across multiple client projects.

🔧 What You'll Need

Required Integrations

- **UploadToURL** - To host invoice scans and provide audit links.
- **n8n Community Node** - n8n-nodes-uploadtourl must be installed.
- **AWS Account** - Credentials for **AWS Textract** (OCR).
- **Google Sheets** - OAuth2 credentials to write to your finance spreadsheet.

Optional Integrations

- **Slack** - To receive real-time notifications for new invoice entries.
- **Gmail** - To trigger the workflow automatically when an invoice arrives in your inbox.

🚀 Quick Start

1. **Import Template** - Copy the JSON and import it into your n8n canvas.
2. **Install Node** - Ensure the UploadToURL community node is installed.
3. **Set Credentials** - Link your UploadToURL, AWS, Google Sheets, and Slack accounts.
4. **Configure Spreadsheet** - Create a sheet with the columns: Invoice No, Vendor, Amount, Due Date, and File URL.
5. **Set Variables** - Add your GSHEET_SPREADSHEET_ID and SLACK_FINANCE_CHANNEL to n8n variables.
6. **Deploy** - Switch the workflow to "Active" to begin your automated bookkeeping.

🎨 Customization Options

- **Approval Workflow**: Add a "Wait for Approval" node so a manager must click a Slack button before the row is finalized.
- **Tax Calculation**: Insert a Code node to automatically calculate VAT or Sales Tax based on the extracted total.
- **ERP Sync**: Replace Google Sheets with **QuickBooks, Xero, or NetSuite** to push data directly into your accounting software.
- **Multi-Currency Support**: Add a currency-conversion node to normalize all totals into your base company currency.

📈 Expected Results

- **90% reduction** in manual data-entry time (from minutes to seconds per bill).
- **Improved Accuracy**: Elimination of typos and transcription errors in financial totals.
- **Better Audit Compliance**: Every line item in your sheet is permanently linked to the original scan.
- **Faster Payment Cycles**: Bills are recorded the moment they are received, preventing late fees.

🏆 Use Cases

Accounts Payable
Automate the entry of monthly recurring bills (utilities, rent, software) directly into your tracking sheet.

Employee Reimbursements
Staff can snap a photo of a business dinner receipt and upload it to the webhook to start the reimbursement process instantly.

Bulk Document Digitization
Upload a folder of historical PDF invoices; the workflow will categorize and log them all in one batch.

💡 Pro Tips

- **Clear Scans**: Ensure invoices are well lit and flat; high contrast helps AWS Textract achieve 99% accuracy.
- **Folder IDs**: Keep your Google Sheet in a shared Drive folder so your entire finance team can access the linked UploadToURL files.
- **Webhook Security**: Use a unique path or basic auth on your webhook to ensure only authorized devices can submit invoices.

Ready to automate your bookkeeping? Import this template and connect UploadToURL to build a hands-free finance pipeline today.

Questions about AWS Textract mapping? The workflow includes detailed sticky notes explaining how to extract custom fields from your specific vendor layouts.
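The "coerces pricing into a standard float" step can be sketched as a small helper. A minimal sketch assuming US-style amounts like "$1,234.50" (European decimal-comma formats would need extra handling):

```javascript
// Sketch: sanitize Textract's raw amount string into a two-decimal float.
function parseAmount(raw) {
  const cleaned = raw.replace(/[^0-9.\-]/g, ''); // drop "$", letters, commas
  const value = parseFloat(cleaned);
  if (Number.isNaN(value)) throw new Error(`Unparseable amount: ${raw}`);
  return Math.round(value * 100) / 100; // keep two decimals for accounting
}
```

Throwing on unparseable input lets the workflow surface bad scans instead of silently logging a blank amount.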
by Fahmi Fahreza
This template sets up a weekly ETL (Extract, Transform, Load) pipeline that pulls financial data from QuickBooks Online into Google BigQuery. It not only transfers data but also cleans, classifies, and enriches each transaction using your own business logic.

Who It's For

- **Data Analysts & BI Developers**: Need structured financial data in a warehouse to build dashboards (e.g., Looker Studio, Tableau) and run complex queries.
- **Financial Analysts & Accountants**: Want to run custom SQL queries beyond QuickBooks’ native capabilities.
- **Business Owners**: Need a permanent, historical archive of transactions for reporting and tracking.

What the Workflow Does

1. **Extract**: Every Monday, fetches the previous week's transactions from your QuickBooks Online account.
2. **Transform**: Applies custom business logic: cleans up text fields, generates stable transaction IDs, and classifies transactions (income, expense, internal transfer).
3. **Format**: Prepares the cleaned data as a bulk-insert-ready SQL statement.
4. **Load**: Inserts the structured and enriched data into a Google BigQuery table.

Setup Guide

1. **Prepare BigQuery**
   - Create a dataset (e.g., quickbooks) and table (e.g., transactions).
   - The table schema must match the SQL query in the "Load Data to BigQuery" node.
2. **Add Credentials**
   - Add QuickBooks Online and Google BigQuery credentials to your n8n instance.
3. **Configure Business Logic**
   - Open the Clean & Classify Transactions node.
   - Update the JavaScript arrays: internalTransferAccounts, expenseCategories, incomeCategories.
   - Ensure these match your QuickBooks Chart of Accounts exactly.
4. **Configure the BigQuery Node**
   - Open the Load Data to BigQuery node.
   - Select the correct Google Cloud project.
   - Ensure the SQL query references the correct dataset and table.
5. **Activate the Workflow**
   - Save and activate it. The workflow will now run weekly.

Requirements

- A running n8n instance (Cloud or self-hosted)
- A QuickBooks Online account
- A Google Cloud Platform project with BigQuery enabled
- A BigQuery table with a matching schema

Customization Options

- **Change the schedule**: Modify the Schedule node to run daily, monthly, or at a different time.
- **Adjust the date range**: Change the date macro in the Get Last Week's Transactions node.
- **Refine classification rules**: Add custom logic in the Clean & Classify Transactions node to handle specific edge cases.
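The classification step in Clean & Classify Transactions can be sketched as follows. The account names below are placeholders; per the setup guide, the real arrays must match your QuickBooks Chart of Accounts exactly.

```javascript
// Sketch of the Clean & Classify logic with placeholder account arrays.
const internalTransferAccounts = ['Savings', 'Checking'];
const expenseCategories = ['Office Supplies', 'Software'];
const incomeCategories = ['Sales', 'Consulting'];

function classify(txn) {
  // Internal transfers are checked first so they are never
  // miscounted as income or expense.
  if (internalTransferAccounts.includes(txn.account)) return 'internal_transfer';
  if (expenseCategories.includes(txn.account)) return 'expense';
  if (incomeCategories.includes(txn.account)) return 'income';
  return 'unclassified';
}
```

An 'unclassified' fallback is useful here: rows that reach BigQuery with that label reveal gaps between the arrays and your actual Chart of Accounts.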
by Milo Bravo
RSVP Lead Scoring for Events with GPT, HubSpot & Slack

Who is this for?

Event organizers, RevOps teams, sales managers, and marketers running conferences, webinars, or meetups who want to automatically qualify RSVPs and turn attendees into revenue opportunities.

What problem is this workflow solving?

RSVP lead qualification is manual and slow:

- Reviewing 1,000+ registrations takes hours
- High-fit prospects (directors, founders) get missed
- No instant sales handoff or nurture segmentation
- Leads leak between events and the CRM

This workflow auto-scores RSVPs 0–100 and routes high fits to sales instantly.

What this workflow does

- **Trigger**: Google Forms/Typeform RSVP webhook
- **AI Scoring**: GPT-4o-mini analyzes job title, company, and intent → 0–100 fit score
- **High Scorers (80+)**: HubSpot CRM contact + Slack alert: "#sales: Director @ Acme (87) → Book now [Calendly]"
- **Low Scorers**: Nurture sequence (email lists)
- **Dashboard**: Google Sheets with scores and conversions tracked

Setup (3 minutes)

1. **Forms**: Google Forms/Typeform → n8n webhook (copy/paste the URL)
2. **AI**: OpenAI API key (GPT-4o-mini: cheap and fast)
3. **CRM**: HubSpot API key (works on the Free tier)
4. **Slack**: #sales channel + bot token

Fully configurable—no code changes needed.

How to customize to your needs

- **Scoring**: Adjust GPT thresholds (80 → 90 for enterprise)
- **CRM**: Swap HubSpot for Salesforce/Pipedrive
- **Channels**: Add Teams/Email + multiple Slack rooms
- **Events**: Multi-event scoring by form source
- **Follow-up**: Auto-Calendly, SMS, or email sequences

ROI:

- **3x qualified leads** to the sales team
- **5x faster sales response**
- **40% less lead leakage** (proven over 500+ runs)
- **Zero manual qualification**

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: event RSVP, AI lead scoring, conference lead qualification, event lead gen RevOps, lead fit scoring.
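The routing logic after scoring can be sketched in a few lines. This is an illustrative sketch of the 80+ threshold split, with the Slack message shaped like the example above; the lead fields are assumptions about the form payload.

```javascript
// Sketch: route a scored RSVP to sales or nurture at the 80+ threshold.
function routeLead(lead, threshold = 80) {
  if (lead.score >= threshold) {
    return {
      route: 'sales',
      message: `#sales: ${lead.title} @ ${lead.company} (${lead.score}) → Book now`,
    };
  }
  return { route: 'nurture' };
}
```

Raising `threshold` to 90, as suggested for enterprise events, only changes the default argument.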
by Madame AI
Automate B2B lead research from Google Sheets to Airtable with BrowserAct

This workflow automates the time-consuming process of B2B market research. It takes a list of company website URLs from a Google Sheet, uses BrowserAct to scrape their profiles and news sections, analyzes the data using AI to determine strategic focus and key activities, and saves a comprehensive executive summary into an Airtable database.

Target Audience

Sales operations managers, SDRs (sales development representatives), market researchers, and venture capital analysts who need to process large volumes of company data efficiently.

How it works

1. **Ingest Data**: The workflow retrieves a list of target company URLs from a Google Sheet.
2. **Scrape Content**: It loops through each URL and triggers BrowserAct (using the "B2B Contact Research" template) to scrape company profiles, about pages, and news sections.
3. **Analyze & Stage**: An AI Agent (using OpenRouter/GPT-5) processes the raw scraped text to identify whether it is news or profile data, extracting key insights such as strategic focus and recent updates. This raw analysis is staged back into the Google Sheet.
4. **Synthesize**: The workflow retrieves the staged data and aggregates it.
5. **Final Summary**: A second AI Agent compiles all data points into a cohesive "Research Record", writing a high-impact executive summary and formatting notes with Markdown.
6. **Database Entry**: The final structured data (Name, Notes, Strategic Focus) is created as a new record in Airtable.

How to set up

1. **Configure credentials**: Connect your BrowserAct, Google Sheets, Airtable, and OpenRouter accounts in n8n.
2. **BrowserAct template**: Ensure you have the B2B Contact Research template saved in your BrowserAct account.
3. **Prepare the Google Sheet**: Create a Google Sheet with a tab named "DataBase". Add the headers listed below. Populate the Page URL column with the companies you want to research.
4. **Configure nodes**: Open the Google Sheets nodes and select your file. Open the Airtable node and select your Base and Table.
5. **Run**: Execute the workflow to start processing the list.

Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers:

- Page URL (input: put your links here)
- Page Data (output: leave blank, populated by the bot)
- row_number (output: leave blank, populated by the bot)

Requirements

- **BrowserAct** account with the **B2B Contact Research** template.
- **Google Sheets** account.
- **Airtable** account.
- **OpenRouter** account (or credentials for a specific LLM such as GPT-4o or Gemini).

How to customize the workflow

- **Change the CRM**: Replace the final Airtable node with HubSpot, Salesforce, or Pipedrive nodes to inject research directly into your CRM deals.
- **Adjust AI prompts**: Modify the system prompt in the "Analyze the Company Page" agent to focus on criteria relevant to your business (e.g., look specifically for "pricing models" or "hiring trends").
- **Email alerts**: Add a Gmail or Slack node at the end of the workflow to notify your sales team immediately when a high-value prospect has been researched.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video

How to Structure Airtable for Automated Company Research (n8n Tutorial)
by Jitesh Dugar
Eliminate the manual chaos of HR and legal document management. This workflow automates the transition from a raw document upload to a structured, audit-ready archive by combining UploadToURL for instant CDN hosting, Google Drive for long-term storage, and Airtable for status tracking and database management.

🎯 What This Workflow Does

Transforms loose document scans into a structured corporate filing system:

- 📝 **Captures Legal Assets** - Receives signed contracts or IDs via mobile scan (binary) or remote URL.
- 🛡️ **Duplicate Prevention** - Checks Airtable first to ensure a contract isn't already filed for that specific Employee ID.
- ☁️ **Instant CDN Hosting** - UploadToURL hosts the document to provide a high-speed link for immediate HR review.
- 📁 **Smart Folder Logic** - Automatically navigates or creates a structured Google Drive path: HR/Contracts/{Year}/{Department}/{EmployeeName}/.
- 🗃️ **Database Synchronization** - Updates (or creates) an Airtable record to tick "Contract Received," logging both the Drive URL and the CDN backup.
- 📧 **Automated Confirmation** - Sends a professional HTML email to HR and the employee with access links and filing metadata.

✨ Key Features

- **UploadToURL Integration**: Provides a redundant, accessible CDN link stored alongside your primary Drive storage for total data reliability.
- **Auto-Nomenclature**: Renames files using a strict audit-ready format: `{EmployeeID}_{LastName}_{Type}_{Date}.pdf`.
- **Intelligent Folder Creation**: Never manually create a folder again; the workflow builds the entire hierarchy on the fly.
- **Audit Trail Generation**: Captures "Filed By," "Filed At," and unique "Upload IDs" for every document.
- **Conflict Handling**: Built-in 409 Conflict logic prevents accidental overwrites or double-filing of critical legal papers.

💼 Perfect For

- **HR Teams**: Managing onboarding documents and employment contracts at scale.
- **Legal Departments**: Archiving NDAs, vendor agreements, and compliance certifications.
- **Small Businesses**: Moving away from "loose files in folders" to a searchable, automated database.
- **Remote Teams**: Enabling employees to "upload and forget" their paperwork via a simple link.

🔧 What You'll Need

Required Integrations

- **UploadToURL** - To host documents and provide public CDN backup links.
- **n8n Community Node** - n8n-nodes-uploadtourl must be installed.
- **Google Drive** - OAuth2 credentials for secure document storage.
- **Airtable** - Personal Access Token to manage your employee/document database.
- **Gmail / SMTP** - To send automated filing confirmations.

Configuration Variables

- GDRIVE_ROOT_FOLDER_ID: The ID of your main HR folder in Google Drive.
- AIRTABLE_BASE_ID: Your specific Airtable base for HR/Legal tracking.

🚀 Quick Start

1. **Import Template** - Copy the JSON and import it into your n8n workspace.
2. **Install Node** - Ensure the UploadToURL community node is active.
3. **Set Credentials** - Link your UploadToURL, Google Drive, Airtable, and Gmail accounts.
4. **Define Variables** - Set your Root Folder ID and Airtable Base details in n8n variables.
5. **Test the Pipeline** - Send a test POST with a sample PDF to the Webhook URL.
6. **Activate** - Enable the workflow to begin hands-free archiving.

🎨 Customization Options

- **Expiration Alerts**: Add a node to calculate 1-year expiry dates and set an automated reminder in Slack.
- **OCR Processing**: Integrate an OCR step to read the content of scans and verify names automatically.
- **Watermarking**: Add a "Confidential" or "Draft" watermark to documents before they are uploaded to the CDN.
- **Multi-Base Routing**: Route documents to different Airtable bases depending on the "Department" field.

📈 Expected Results

- **100% consistency** in file naming and folder structures across the entire organization.
- **Zero manual data entry**: employee records and checkboxes update automatically.
- **Audit-ready in minutes**: every file has a timestamped trail and redundant storage links.
- **Instant accessibility**: HR can view documents via the CDN link before Drive permissions even propagate.

🏆 Use Cases

High-Growth Onboarding
A startup hiring 20 people a month can automate all contract filings, ensuring the "Contract Received" flag is always accurate for payroll.

Compliance Audits
When auditors ask for specific contracts, use the Airtable "Structured Filename" column to find and share the relevant Drive or CDN links in seconds.

Field Service Scans
Technicians in the field can upload signed site reports via a mobile app; the workflow handles the filing and notifies the office immediately.

💡 Pro Tips

- **Folder IDs**: You can find your GDRIVE_ROOT_FOLDER_ID in the last string of the URL when you are inside that folder in your browser.
- **Structured JSON**: Use the returned auditTrail object to build a log of all uploads in a separate "Master Audit" spreadsheet.
- **Employee IDs**: If no ID is provided, the workflow generates a temporary one using a timestamp so the archive never breaks.

Ready to secure your document pipeline? Import this template and connect UploadToURL to build a world-class archiving system in under 20 minutes.

Need help with Airtable field mapping? The workflow includes detailed sticky notes explaining the exact field names required for the automation to run.
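The auto-nomenclature rule can be sketched as a small helper. This is a sketch only: the sanitization pattern and the timestamp-based temporary ID (mentioned in the Pro Tips) are illustrative, and the field names are assumptions about the webhook payload.

```javascript
// Sketch: build the audit-ready filename {EmployeeID}_{LastName}_{Type}_{Date}.pdf
function buildFilename({ employeeId, lastName, type, date }) {
  const safe = s => s.replace(/[^A-Za-z0-9-]/g, ''); // strip unsafe characters
  // Fall back to a timestamp-based temporary ID when none is provided,
  // so the archive never breaks on a missing Employee ID.
  const id = employeeId || `TMP${Date.now()}`;
  return `${safe(id)}_${safe(lastName)}_${safe(type)}_${date}.pdf`;
}
```

Keeping the sanitization in one place guarantees the same naming convention no matter which upload path (mobile scan or remote URL) produced the document.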