by Preston Zeller
## How It Works
This workflow automates real estate lead qualification by enriching leads with property data from BatchData. The automation follows these steps:

1. When a new lead is received through your CRM webhook, the workflow captures their address information.
2. It makes an API call to BatchData to retrieve comprehensive property details.
3. A scoring algorithm evaluates the lead based on property characteristics (a sketch of this logic appears below):
   - Property value (higher values earn more points)
   - Square footage (larger properties score higher)
   - Property age (newer constructions score higher)
   - Investment status (non-owner-occupied properties earn bonus points)
   - Lot size (larger lots receive additional points)
4. Leads are automatically classified into categories (high-value, qualified, potential, or unqualified).
5. The workflow updates your CRM with enriched property data and qualification scores.
6. High-value leads trigger immediate follow-up tasks for your team.
7. Notifications are sent to your preferred channel (Slack in this example).

The entire process completes within seconds of receiving a new lead, so your sales team can prioritize the most valuable opportunities immediately.

## Who It's For
This workflow is perfect for:
- Real estate agents and brokers looking to prioritize high-value property leads
- Mortgage lenders who need to qualify borrowers based on property assets
- Home service providers (renovators, contractors, solar installers) targeting specific property types
- Property investors seeking specific investment opportunities
- Real estate marketers who want to segment audiences by property value
- Home insurance agents qualifying leads based on property characteristics

Any business that bases lead qualification on property details will benefit from this automated qualification system.

## About BatchData
BatchData is a comprehensive property data provider offering detailed information about residential and commercial properties across the United States. Their API provides:
- Property valuation and estimates
- Ownership information
- Property characteristics (size, age, bedrooms, bathrooms)
- Tax assessment data
- Transaction history
- Occupancy status (owner-occupied vs. investment)
- Lot details and dimensions

By integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling lead scoring and routing based on actual property characteristics rather than just contact information. This workflow demonstrates how to turn manual property research into an automated, data-driven qualification system that ensures high-value leads receive immediate attention.
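A minimal sketch of the scoring logic, as it might appear in an n8n Code node. The point weights, thresholds, and field names (`estimatedValue`, `squareFeet`, `yearBuilt`, `ownerOccupied`, `lotSizeAcres`) are illustrative assumptions, not BatchData's actual response schema:

```javascript
// n8n Code node: score a lead from enriched property fields.
// Field names and point weights are illustrative assumptions.
const p = $json.property;
let score = 0;

if (p.estimatedValue > 1000000) score += 40;      // high-value property
else if (p.estimatedValue > 500000) score += 25;
else if (p.estimatedValue > 250000) score += 10;

if (p.squareFeet > 3000) score += 15;             // larger homes score higher
else if (p.squareFeet > 1500) score += 8;

if (p.yearBuilt >= 2010) score += 10;             // newer construction
if (p.ownerOccupied === false) score += 15;       // investment-property bonus
if (p.lotSizeAcres > 0.5) score += 10;            // larger lots

const category =
  score >= 70 ? 'high-value' :
  score >= 45 ? 'qualified'  :
  score >= 20 ? 'potential'  : 'unqualified';

return [{ json: { ...$json, score, category } }];
```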
by Bastien Laval
## Description
Boost your productivity and keep your Asana workspace clutter-free with this n8n workflow. It automatically scans for tasks whose due dates have passed and reschedules them to the current date, ensuring no important to-dos slip through the cracks. Additionally, any completed tasks in Asana with an overdue date are removed, maintaining a clear, organized task list.

## Key Benefits
- **Streamline Task Management**: No more manual updates; the workflow reschedules overdue tasks for you.
- **Optimize Workspace Organization**: Eliminate finished tasks to focus on active priorities and reduce clutter.
- **Save Time and Effort**: Automate repetitive maintenance, freeing you to concentrate on what truly matters.

## Configuration Steps
1. Add your Asana credentials.
2. Schedule the workflow to run at desired intervals (e.g., daily or weekly).
3. Select your Workspace Name and your Assignee Name (user) in the Get user tasks node.
4. (Optional) Tailor the filtering conditions to match your preferred due-date rules and removal criteria (see the sketch below).
5. Activate the workflow and watch your Asana workspace stay up to date and clutter-free.
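A minimal sketch of the due-date filtering an n8n Code node could apply to the tasks returned by the Get user tasks node. The `due_on` and `completed` fields match Asana's task schema; the split into reschedule/delete branches is an assumption about how the downstream nodes are wired:

```javascript
// Split Asana tasks into those to reschedule and those to delete.
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

const reschedule = [];
const remove = [];

for (const item of $input.all()) {
  const task = item.json;
  if (!task.due_on || task.due_on >= today) continue; // not overdue
  if (task.completed) {
    remove.push(item);        // completed + overdue -> delete
  } else {
    task.due_on = today;      // open + overdue -> move to today
    reschedule.push(item);
  }
}

return reschedule; // wire a second branch for `remove` in the real workflow
```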
by WeblineIndia
This smart automation workflow, created by the AI development team at WeblineIndia, handles the daily collection and storage of weather data. Using the OpenWeatherMap API and Airtable, it gathers vital weather details such as temperature, humidity, and wind speed. The automation runs daily, building a dependable historical record of weather patterns for future reference and analysis.

## Steps
1. **Set Schedule Trigger**: Configure a Cron node to trigger the workflow daily, for example at 7 AM.
2. **Fetch Weather Data (HTTP Request)**: Use the HTTP Request node to retrieve weather data from the OpenWeatherMap API. Include your API key and query parameters (e.g., q=London, units=metric) to specify the city and desired units.
3. **Parse Weather Data**: Extract key weather details, such as temperature, humidity, and wind speed, from the API response (a Code-node sketch appears below).
4. **Store Data in Airtable**: Use the Airtable node to insert the parsed data into the designated Airtable table. Ensure proper mapping of fields like temperature, humidity, and wind speed.
5. **Save and Execute**: Save the workflow and activate it so weather data is fetched and stored automatically every day.

## Outcome
This solution, developed by WeblineIndia, reliably collects and archives daily weather data, providing businesses and individuals with an accessible record of weather trends for analysis and decision-making.

## About WeblineIndia
We specialize in creating custom automation solutions and innovative software workflows that help businesses streamline operations and achieve efficiency. This weather data fetcher is just one example of our expertise in delivering value through technology.
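A minimal sketch of the parsing step, assuming the standard response shape of OpenWeatherMap's current-weather endpoint (`main.temp`, `main.humidity`, `wind.speed`):

```javascript
// Flatten the OpenWeatherMap response into the fields Airtable expects.
const w = $json;

return [{
  json: {
    city: w.name,
    date: new Date().toISOString().slice(0, 10),
    temperature: w.main.temp,        // °C with units=metric
    humidity: w.main.humidity,       // %
    windSpeed: w.wind.speed,         // m/s with units=metric
    conditions: w.weather?.[0]?.description ?? '',
  },
}];
```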
by Encoresky
This workflow automates the process of handling conversation transcriptions and distributing key information across your organization. Here's what it does:

- **Trigger**: The workflow is initiated via a webhook that receives a transcription (e.g., from a call or meeting).
- **Summarization & Extraction**: Using AI, the transcription is summarized and key information is extracted, such as action items, departments involved, and client details (one possible output shape is sketched below).
- **Department Notifications**: The relevant summarized information is automatically routed to specific departments via email based on content classification.
- **CRM Sync**: The summary is saved to the associated contact or deal in HubSpot for future reference and visibility.
- **Multi-Channel Alerts**: The summary is also sent via WhatsApp and Slack to keep internal teams instantly informed, regardless of platform.

**Use Case**: Ideal for sales, customer service, or operations teams who manage client conversations and want to ensure seamless cross-departmental communication, documentation, and follow-up.

**Apps Used**:
- Webhook (Trigger)
- OpenAI (or other AI/NLP for summarization)
- HubSpot
- Email
- Slack
- WhatsApp (via Twilio or third-party provider)
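One possible structured-output shape for the extraction step, useful both as the target schema for the AI node and as the routing key downstream. All field names here are illustrative assumptions:

```javascript
// Example extraction result the AI node could be prompted to return.
// Every field name and value here is illustrative, not a fixed schema.
const example = {
  summary: "Client wants the Q3 proposal revised with updated pricing.",
  client: { name: "Acme Corp", contactEmail: "jane@acme.example" },
  departments: ["sales", "finance"],          // drives email routing
  actionItems: [
    { task: "Revise Q3 pricing", owner: "sales", due: "2024-07-15" },
  ],
};

// A Switch node (or an IF chain) can then route on `departments`.
return [{ json: example }];
```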
by Robert Breen
# Email Unsubscribe Handler for Outlook

## Description
This n8n workflow automatically scans recent email replies from your Outlook inbox and identifies unsubscribe requests. If a contact replies with any variation of "unsubscribe" within the past 7 days, the system performs two key actions:

1. Saves the contact's email address in a BigQuery unsubscribes table (for compliance and tracking).
2. Deletes that contact from the active leads table in BigQuery (to stop future outreach).

This flow can be triggered on a schedule (e.g., every 4 hours) or run manually as needed.

## Key Features
- **Email Parsing from Outlook**: Automatically monitors for replies that contain unsubscribe language (see the detection sketch below).
- **Smart Filtering**: Captures unsubscribes based on message content, not just subject lines.
- **BigQuery Integration**: Logs unsubscribed emails and removes them from your leads dataset.

## Connect with Me
I'm Robert Breen, founder of Ynteractive, a consulting firm that helps businesses automate operations using n8n, AI agents, and custom workflows. I've helped clients build everything from intelligent chatbots to complex sales automations, and I'm always excited to collaborate or support new projects. If you found this workflow helpful or want to talk through an idea, I'd love to hear from you.

**Links**
- Website: https://www.ynteractive.com
- YouTube: @ynteractivetraining
- LinkedIn: https://www.linkedin.com/in/robert-breen
- Email: rbreen@ynteractive.com
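A minimal sketch of the detection step, assuming the Outlook node outputs Microsoft Graph-shaped messages (`receivedDateTime`, `bodyPreview`, `from.emailAddress.address`); the regex of unsubscribe variations is an illustrative assumption:

```javascript
// Keep only replies from the last 7 days whose content mentions unsubscribing.
const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;
const pattern = /unsubscribe|opt[\s-]?out|remove me/i; // illustrative variations

return $input.all()
  .filter((item) => {
    const msg = item.json;
    const recent = new Date(msg.receivedDateTime).getTime() >= cutoff;
    const text = `${msg.subject ?? ''} ${msg.bodyPreview ?? ''}`;
    return recent && pattern.test(text);
  })
  .map((item) => ({
    json: { email: item.json.from?.emailAddress?.address }, // -> BigQuery
  }));
```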
by Arlin Perez
Make your n8n instance faster, cleaner, and more efficient by deleting old workflow executions while keeping only the most recent ones you actually need. Whether you're using n8n Cloud or self-hosted, this lightweight workflow helps reduce database/storage usage and improves UI responsiveness, using only official n8n nodes.

## Description
Automatically clean up old executions in your n8n instance using only official nodes, with no external database queries required. Whether you're on the Cloud version or running self-hosted, this workflow helps you optimize performance and keep your instance tidy by keeping only the most recent executions per workflow.

Ideal for users managing dozens or hundreds of workflows, this solution reduces storage usage and improves the responsiveness of the n8n UI, especially in environments where execution logs accumulate quickly.

## What It Does
- Retrieves up to 250 recent executions across all workflows
- Groups executions by workflow
- Keeps only the most recent N executions per workflow (value is configurable)
- Deletes all older executions, regardless of their status (success, error, etc.)
- Works entirely with native n8n nodes; no external database access required
- Optionally: set the number of executions to keep to 0 to delete all past executions from your instance in a single run

## How to Set Up
1. Create a Personal API Key in your n8n instance: go to Settings > API Keys > Create a new key.
2. Create a new n8n API credential (used by both nodes) in your n8n credentials panel:
   - Name: anything you like (e.g., "Internal API Access")
   - API Key: paste the Personal API Key you just created
   - Base URL: your full n8n instance URL with the /api/v1 path, e.g. https://your-n8n-instance.com/api/v1
3. Use this credential in both the Get Many Executions node (to fetch recent executions) and the Delete Many Executions node (to remove outdated executions).
4. In the "Set Executions to Keep" node, edit the executionsToKeep variable and set the number of most recent executions to retain per workflow (e.g., 10). Tip: set it to 0 to delete all executions.

Note: the Get Many Executions node retrieves up to 250 executions per run; this is the maximum allowed by the n8n API. No further setup is required: the filtering and grouping logic is handled inside the Code node automatically (a sketch of that logic appears below).

## Included Nodes Overview
- Schedule Trigger: set to run daily, weekly, etc.
- Get Many Executions: fetches past executions via the n8n API
- Set Executions to Keep: sets how many recent executions to keep
- Code Node: filters out executions to delete per workflow
- Delete Executions: deletes outdated executions

## Why Use This?
- Reduce clutter and improve performance in your n8n instance
- Keep execution logs only when they're useful
- Avoid bloating your storage or database with obsolete data
- Compatible with both n8n Cloud and self-hosted setups
- Uses only official, supported n8n nodes; no SQL, no extra setup

Caution: this workflow modifies and deletes execution data. Always review and test it first on a staging instance or on a limited set of workflows before using it in production.
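A minimal sketch of what the Code node's grouping logic could look like. It assumes each execution item carries `id`, `workflowId`, and `startedAt` fields (as the n8n API returns) and that `executionsToKeep` was set by the previous node:

```javascript
// Group executions by workflow, keep the newest N, emit the rest for deletion.
const keep = $('Set Executions to Keep').first().json.executionsToKeep ?? 10;

const byWorkflow = {};
for (const item of $input.all()) {
  const e = item.json;
  (byWorkflow[e.workflowId] ??= []).push(e);
}

const toDelete = [];
for (const execs of Object.values(byWorkflow)) {
  execs.sort((a, b) => new Date(b.startedAt) - new Date(a.startedAt)); // newest first
  toDelete.push(...execs.slice(keep)); // everything past the newest N
}

return toDelete.map((e) => ({ json: { id: e.id } })); // feeds the Delete node
```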
by Sankalp Dev
This automation workflow transforms Meta advertising data into executive-ready presentation decks, eliminating manual report creation while ensuring stakeholders receive consistent performance insights. It generates professional Google Slides presentations from your ad campaigns and delivers them automatically via email to designated recipients. By combining scheduled data extraction with AI-powered analysis and automated presentation building, you'll receive polished, actionable reports that support strategic advertising decisions and client communication.

## Key Features
- Scheduled automated summary deck generation (daily, weekly, or monthly)
- AI-powered data analysis using advanced language models
- Intelligent presentation generation with actionable recommendations
- Direct email delivery of formatted summary decks

## Prerequisites
- GoMarble MCP account and API access
- Anthropic account
- Google Slides, Google Drive & Gmail accounts
- n8n instance (cloud or self-hosted)

Configuration time: ~15-20 minutes

## Step-by-Step Setup
1. **Connect GoMarble MCP to n8n**: follow the integration guide (GoMarble MCP Setup) and configure your Meta Ads account credentials in the GoMarble platform.
2. **Configure the Schedule Trigger.**
3. **Customize the ad account settings**: update the account name to match your ad account name.
4. **Customize the report prompt** (the workflow includes a pre-configured template): define specific metrics and KPIs to track, and set analysis parameters and report format preferences.
5. **Set up the AI Agent**: configure the Anthropic Claude model with your API credentials and connect the GoMarble MCP tools for Meta advertising data.
6. **Configure Google services**: set up the Google Slides OAuth2 API for presentation creation, the Google Drive OAuth2 API for file management, and Gmail OAuth2 for automated email delivery.
7. **Customize email delivery**: set recipient email addresses for stakeholders and customize the email subject line and message content.

## Advanced Configuration
- Modify the report prompt to include specific metrics and KPIs
- Adjust the slide content structure (5-slide format: Executive Snapshot, Channel KPIs, Top Campaigns, Under-performers, Action Recommendations)

## What You'll Get
- **Automated Presentation Creation**: weekly Google Slides decks generated without manual intervention
- **Professional Ads Analysis**: executive-ready performance summaries with key metrics and insights
- **Structured Intelligence**: a consistent 5-slide format covering spend, ROAS, campaign performance, and strategic recommendations
- **Direct Stakeholder Delivery**: presentations automatically emailed as attachments to specified recipients
- **Data-Driven Insights**: AI-powered analysis of campaign performance with actionable next steps
- **Scalable Reporting**: easy to modify timing, recipients, or content structure as business needs evolve

Perfect for marketing teams, agencies, and business owners who need regular Meta advertising performance updates delivered professionally without manual report creation.
by Hemanth Arety
Automatically fetch, curate, and distribute Reddit content digests using AI-powered filtering. This workflow monitors multiple subreddits, ranks posts by relevance, removes spam and duplicates, then delivers beautifully formatted digests to Telegram, Discord, or Slack.

## Who's it for
Perfect for content creators tracking trends, marketers monitoring discussions, researchers following specific topics, and community managers staying informed. Anyone who wants high-quality Reddit updates without manually browsing multiple subreddits.

## How it works
The workflow fetches top posts from your chosen subreddits using Reddit's JSON API (no authentication required; see the sketch below). Posts are cleaned, deduplicated, and filtered by upvote threshold and custom keywords. An AI model (Google Gemini, OpenAI, or Claude) then ranks the remaining posts by relevance, filters out low-quality content, and generates a formatted digest. The final output is delivered to your preferred messaging platform on a schedule or on demand.

## Setup requirements
- n8n version 1.0+
- AI provider API key (Google Gemini recommended; it has a free tier)
- At least one messaging platform configured:
  - Telegram bot token + chat ID
  - Discord webhook URL
  - Slack OAuth token + channel access

## How to set up
1. Open the Configuration node and edit the subreddit list, post counts, and keywords.
2. Configure the Schedule Trigger or use manual execution.
3. Add your AI provider credentials in the AI Content Curator node.
4. Enable and configure your preferred delivery platform (Telegram/Discord/Slack).
5. Test with manual execution, then activate the workflow.

## Customization options
- **Subreddits**: add unlimited subreddits to monitor (comma-separated)
- **Time filters**: choose from hour, day, week, month, year, or all-time top posts
- **Keywords**: set focus keywords to prioritize and exclude keywords to filter out
- **Post count**: adjust how many posts to fetch vs. how many appear in the final digest
- **AI prompt**: customize ranking criteria and output format in the AI node
- **Schedule**: use cron expressions for hourly, daily, or weekly digests
- **Output format**: modify the formatting code to match your brand style

Add email notifications, database storage, or RSS feed generation by extending the workflow with additional nodes.
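A minimal sketch of the fetch-and-clean step against Reddit's public JSON endpoint (`/r/<subreddit>/top.json`), which works without authentication; the subreddit list and upvote threshold are illustrative assumptions:

```javascript
// Fetch top posts for each configured subreddit and apply a score filter.
const subreddits = ['selfhosted', 'automation']; // illustrative list
const minUpvotes = 50;                           // illustrative threshold
const posts = [];

for (const sub of subreddits) {
  const res = await this.helpers.httpRequest({
    url: `https://www.reddit.com/r/${sub}/top.json?t=day&limit=25`,
    headers: { 'User-Agent': 'n8n-digest-bot' }, // Reddit expects a User-Agent
    json: true,
  });
  for (const child of res.data.children) {
    const p = child.data;
    if (p.ups >= minUpvotes && !p.stickied) {
      posts.push({ json: { title: p.title, url: p.url, ups: p.ups, subreddit: sub } });
    }
  }
}

return posts;
```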
by Jimmy Gay
# AI-Powered Auto-Maintenance System for n8n

Transform your n8n instance management with this advanced automation system featuring AI-driven workflow selection. This template provides comprehensive maintenance operations with smart filtering capabilities.

## Key Features

**Artificial Intelligence Engine**
- Multi-criteria scoring system for intelligent workflow selection
- Semantic analysis for business-critical pattern recognition
- Automated decision-making with configurable thresholds

**Core Maintenance Operations**
- **Security Audits**: automated vulnerability scanning with Google Sheets reporting
- **Smart Pause/Resume**: intelligent workflow suspension during maintenance windows
- **AI Backup Creation**: selective duplication of high-value workflows
- **Intelligent Export**: comprehensive system backups with metadata

**Enterprise Security**
- Token-based authentication with request validation
- Protected workflow safeguards (never modifies critical systems)
- Comprehensive error handling and logging

**Automation & Scheduling**
- Configurable maintenance schedules (daily, weekly, monthly)
- Webhook-driven operations for external integration
- Real-time monitoring and statistics

## Perfect For
- **DevOps Teams**: streamline n8n maintenance operations
- **Enterprise Users**: manage large-scale workflow environments
- **System Administrators**: automated security and backup management
- **Advanced Users**: leverage AI for intelligent workflow management

## Quick Setup
1. Import the template.
2. Configure 4 credentials (n8n API, Google Sheets, Google Drive, Webhook Auth).
3. Set your security token and Google Sheet ID.
4. Activate and enjoy automated maintenance!

## AI Intelligence Highlights
The system evaluates workflows using 6+ criteria, including activity status, complexity, priority tags, business criticality, and recent updates. Workflows are automatically scored and selected based on configurable thresholds (a sketch of this scoring appears below).

Selection logic:
- Duplicate threshold: ≥ 3 points (smart backup selection)
- Export threshold: ≥ 5 points (comprehensive backup)
- System workflows are always protected

## Includes
- 25+ configured nodes with emoji naming
- 4 detailed markdown documentation cards
- Pre-configured schedules and examples
- Comprehensive error handling
- Statistical reporting and monitoring

Perfect for organizations looking to implement intelligent, automated n8n maintenance with minimal manual intervention.
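A minimal sketch of how a multi-criteria score like this could be computed in a Code node. The individual point weights and tag names are illustrative assumptions; only the two thresholds (≥ 3 to duplicate, ≥ 5 to export) come from the description:

```javascript
// Score each workflow against maintenance criteria; weights are illustrative.
const THIRTY_DAYS = 30 * 24 * 60 * 60 * 1000;

return $input.all().map((item) => {
  const wf = item.json;
  const tags = (wf.tags ?? []).map((t) => t.name ?? t);
  let score = 0;

  if (wf.active) score += 2;                                          // activity status
  if ((wf.nodes?.length ?? 0) > 15) score += 1;                       // complexity
  if (tags.includes('priority')) score += 2;                          // priority tag
  if (tags.includes('business-critical')) score += 3;                 // criticality
  if (Date.now() - new Date(wf.updatedAt) < THIRTY_DAYS) score += 1;  // recent update

  return {
    json: {
      ...wf,
      score,
      duplicate: score >= 3 && !tags.includes('system'), // system workflows protected
      export: score >= 5 && !tags.includes('system'),
    },
  };
});
```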
by Didac Fernandez
# AI-Powered Financial Document Processing with Google Gemini

This comprehensive workflow automates the complete financial document processing pipeline using AI. Upload invoices via chat, drop expense receipts into a folder, or add bank statements: the system automatically extracts, categorizes, and organizes all your financial data into structured Google Sheets.

## What this workflow does
Processes three types of financial documents automatically:
- **Invoice Processing**: upload PDF invoices through a chat interface and get structured data extraction with automatic file organization
- **Expense Management**: monitor a Google Drive folder for new receipts and automatically categorize expenses using AI
- **Bank Statement Processing**: extract and organize transaction data from bank statements with multi-transaction support
- **Financial Analysis**: query all your financial data using natural language with an AI agent

## Key Features
- **Multi-AI Persona System**: four specialized AI personas (Mark, Donna, Victor, Andrew) handle different financial functions
- **Google Gemini Integration**: advanced document understanding and data extraction from PDFs
- **Smart Expense Categorization**: automatic classification into 17 business expense categories using an LLM
- **Real-time Monitoring**: continuous folder watching for new documents with automatic processing
- **Natural Language Queries**: ask questions about your financial data in plain English
- **Automatic File Management**: intelligent file naming and organization in Google Drive
- **Comprehensive Error Handling**: robust processing that continues even when individual documents fail

## How it works

**Invoice Processing Flow**
1. User uploads a PDF invoice via the chat interface.
2. The file is saved to the Google Drive "Invoices" folder.
3. Google Gemini extracts structured data (vendor, amounts, line items, dates).
4. Data is parsed and saved to the "Invoice Records" Google Sheet.
5. The file is renamed as "{Vendor Name} - {Invoice Number}".
6. A confirmation message is sent to the user.

**Expense Processing Flow**
1. User drops a receipt PDF into the "Expense Receipts" Google Drive folder.
2. The system detects the new file within 1 minute.
3. Google Gemini extracts expense data (merchant, amount, payment method).
4. An OpenRouter LLM categorizes the expense into the appropriate business category.
5. All data is saved to the "Expenses Recording" Google Sheet.

**Bank Statement Processing Flow**
1. User uploads a bank statement to the "Bank Statements" folder.
2. Google Gemini extracts multiple transactions from the statement.
3. A custom JavaScript parser handles various bank formats (a sketch appears below).
4. Individual transactions are saved to the "Bank Transactions Record" Google Sheet.

**Financial Analysis**
1. Enable the analysis trigger when needed.
2. Ask questions in natural language about your financial data.
3. The AI agent accesses all three spreadsheets to provide insights.
4. Get reports, summaries, and trend analysis.

## What you need to set up

**Required APIs and Credentials**
- **Google Drive API**: for file storage and monitoring
- **Google Sheets API**: for data storage and retrieval
- **Google Gemini API**: for document processing and data extraction
- **OpenRouter API**: for expense categorization (supports multiple LLM providers)

**Google Drive Folder Structure**
Create these folders in your Google Drive:
- "Invoices": processed invoice storage
- "Expense Receipts": drop zone for expense receipts (monitored)
- "Bank Statements": drop zone for bank statements (monitored)

**Google Sheets Setup**
Create three spreadsheets with these column headers:
- **Invoice Records Sheet**: Vendor Name, Invoice Number, Invoice Date, Due Date, Total Amount, VAT Amount, Line Item Description, Quantity, Unit Price, Total Price
- **Expenses Recording Sheet**: Merchant Name, Transaction Date, Total Amount, Tax Amount, Payment Method, Line Item Description, Quantity, Unit Price, Total Price, Category
- **Bank Transactions Record Sheet**: Transaction ID, Date, Description/Payee, Debit (-), Credit (+), Currency, Running Balance, Notes/Category

## Use Cases
- **Small Business Accounting**: automate invoice and expense tracking for bookkeeping
- **Freelancer Financial Management**: organize client invoices and business expenses
- **Corporate Expense Management**: streamline employee expense report processing
- **Financial Data Analysis**: generate insights from historical financial data
- **Bank Reconciliation**: automate transaction recording and account reconciliation
- **Tax Preparation**: maintain organized records with proper categorization

## Technical Highlights
- **Expense Categories**: 17 predefined business expense categories (Cost of Goods Sold, Marketing, Payroll, etc.)
- **Multi-format Support**: handles various PDF layouts and bank statement formats
- **Scalable Processing**: processes multiple documents simultaneously
- **Error Recovery**: continues processing even when individual documents fail
- **Natural Language Interface**: no technical knowledge required for financial queries
- **Real-time Processing**: documents processed within minutes of upload

## Benefits
- **Time Savings**: eliminates manual data entry from financial documents
- **Accuracy**: AI-powered extraction reduces human error
- **Organization**: automatic file naming and categorization
- **Insights**: query financial data using natural language
- **Compliance**: maintains organized records for accounting and audit purposes
- **Scalability**: handles growing document volumes without additional overhead

This workflow transforms tedious financial document processing into an automated, intelligent system that grows with your business needs.
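A minimal sketch of what the bank-statement parsing step could look like, assuming Gemini has been prompted to return a JSON array of transactions; the field names mirror the "Bank Transactions Record" columns, and all fallbacks and normalization rules are illustrative:

```javascript
// Normalize extracted transactions into one n8n item per sheet row.
// Assumes the model was prompted to return a JSON array under `transactions`;
// field names and fallbacks here are illustrative.
const toNumber = (v) =>
  v == null || v === '' ? 0 : parseFloat(String(v).replace(/[^0-9.-]/g, '')) || 0;

const raw = $json.transactions ?? JSON.parse($json.text ?? '[]');

return raw.map((t, i) => {
  const amount = toNumber(t.amount); // strips "$" and thousands separators
  return {
    json: {
      transactionId: t.id ?? `stmt-${Date.now()}-${i}`,
      date: t.date,
      payee: t.description ?? t.payee ?? '',
      debit: amount < 0 ? Math.abs(amount) : toNumber(t.debit),
      credit: amount > 0 ? amount : toNumber(t.credit),
      currency: t.currency ?? 'USD',
      runningBalance: toNumber(t.balance),
    },
  };
});
```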
by Luka Zivkovic
# Complete Telegram Trivia Bot with AI Question Generation

Build a fully featured Telegram trivia bot that automatically generates fresh questions daily using OpenAI and tracks user progress with NocoDB. Perfect for communities, education, or entertainment!

## Key Features
- **AI Question Generation**: automatically creates 40+ new trivia questions daily across 8 categories
- **Smart User Management**: tracks scores, prevents question repeats, maintains leaderboards
- **Game Mechanics**: star-based difficulty scoring, answer history, progress tracking
- **Competitive Elements**: real-time leaderboards with emoji rankings and user positioning
- **Robust Architecture**: error handling, state management, and data validation

## Perfect For
- **Community Engagement**: keep Telegram groups active with daily trivia challenges
- **Educational Content**: create learning experiences with categorized questions
- **Business Applications**: employee training, customer engagement, lead generation
- **Personal Projects**: learn n8n automation while building something fun

## Supported Commands
- /start - Welcome new users with setup instructions
- /question - Get personalized trivia questions (never repeats correctly answered ones)
- /score - View current points and statistics
- /leaderboard - See top 10 players with rankings
- /stats - Detailed accuracy and performance metrics
- /help - Complete command reference

## How It Works

**User Journey**
1. User sends the /question command to the bot.
2. The system checks their answer history to avoid repeats.
3. A fresh question is displayed with multiple-choice options.
4. The answer is processed and the score updated based on difficulty stars (see the sketch below).
5. The complete answer history is saved for future filtering.

**AI Content Pipeline**
1. A daily scheduler triggers question generation.
2. OpenAI creates 5 questions per category (8 categories total).
3. Questions are automatically saved to NocoDB with difficulty ratings.
4. Content includes explanations and proper formatting.

## Set Up Steps

**Prerequisites**
- n8n instance (cloud or self-hosted)
- NocoDB database (free tier works)
- OpenAI API key (not required if you want to add questions yourself)
- Telegram bot token

**Database Setup**: create 3 NocoDB tables with the exact field specifications provided in the sticky notes. The workflow includes complete schema documentation.

**Configuration Time**: ~15 minutes for database setup + API keys

**Detailed Setup Instructions**: all setup steps, database schemas, and configuration details are documented in the workflow's sticky notes for easy implementation.

## Advanced Features
- **Question History Tracking**: users never see correctly answered questions again
- **Difficulty-Based Scoring**: 1-5 star rating system with corresponding points
- **Category Management**: 8 different trivia categories for variety
- **State Management**: proper game flow with idle/waiting states
- **Error Handling**: graceful fallbacks for all edge cases
- **Scalable Architecture**: supports unlimited concurrent users

## Business Applications
- **Lead Generation**: capture user data through engaging trivia
- **Employee Training**: create custom questions for onboarding
- **Customer Engagement**: keep users active in your Telegram community
- **Educational Tools**: subject-specific learning with progress tracking
- **Event Activation**: conferences, workshops, or team building

## Customization Options
- Modify question categories for your niche
- Adjust scoring systems and difficulty levels
- Add custom commands and features
- Integrate with other platforms or APIs
- Create specialized question sets

## Get Started
Ready to build your own AI-powered trivia bot? Start with n8n and follow the comprehensive setup guide included in this workflow template.

**Next Steps**
1. Import this workflow template.
2. Follow the database setup instructions in the sticky notes.
3. Configure your API credentials.
4. Test with sample questions.
5. Launch your trivia bot!

Turn your friend group into trivia champions with AI-generated questions that spark friendly competition!
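A minimal sketch of the answer-processing step, assuming the question record stores its correct option and a 1-5 `stars` difficulty, and that points simply equal the star count; both assumptions are illustrative:

```javascript
// Grade a user's answer and compute points from the question's star rating.
// Field names (`correctOption`, `stars`, `explanation`) are illustrative.
const question = $json.question;   // current question loaded from NocoDB
const answer = $json.userAnswer;   // option the user tapped in Telegram

const correct = answer === question.correctOption;
const points = correct ? question.stars : 0; // 1-5 stars -> 1-5 points

return [{
  json: {
    userId: $json.userId,
    questionId: question.id,
    correct,
    points,
    reply: correct
      ? `Correct! +${points} point${points === 1 ? '' : 's'} ⭐`
      : `Not quite. ${question.explanation ?? ''}`,
  },
}];
```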
by Anna Bui
# LinkedIn ICP Lead Qualification Automation

Automatically identify and qualify ideal customer prospects from LinkedIn post reactions using AI-powered profile analysis and intelligent data enrichment. Perfect for sales teams and marketing professionals who want to convert LinkedIn engagement into qualified leads without manual research. This workflow transforms post reactions into actionable prospect data with AI-driven ICP classification.

## Good to know
- **LinkedIn safety**: only use cookie-free Apify actors to avoid account detection and suspension risks.
- **Daily processing limits**: scrape a maximum of 1 page of reactions per day (50-100 profiles) to stay under LinkedIn's radar.
- Apify actors cost approximately $0.01-0.05 per profile scraped; budget accordingly for daily processing.
- Includes intelligent rate limiting to prevent API restrictions and maintain LinkedIn account safety (see the sketch below).
- AI classification requires a clear definition of your Ideal Customer Profile criteria.
- Processing too many profiles or running too frequently will trigger LinkedIn's anti-scraping measures.
- Always monitor your LinkedIn account health and Apify usage patterns for warning signs.

## How it works
1. Scrapes LinkedIn post reactions using Apify's specialized actor to identify engaged users.
2. Extracts and cleans profile data, including names, job titles, and LinkedIn URLs.
3. Checks against existing Airtable records to prevent duplicate processing and save costs.
4. Creates new prospect records with basic information for tracking purposes.
5. Enriches profiles with comprehensive LinkedIn data, including company details and experience.
6. Aggregates and formats profile data for AI analysis and classification.
7. Uses AI to analyze prospects against your ICP criteria with detailed reasoning.
8. Updates records with ICP classification results and extracted email addresses.
9. Implements smart batching and delays to respect API rate limits throughout the process.

## How to use
- **Important**: select cookie-free Apify actors only, to avoid LinkedIn account suspension.
- Set up Apify API credentials in both HTTP Request nodes for safe LinkedIn scraping.
- Configure Airtable OAuth2 authentication and select your prospect tracking base.
- Replace the LinkedIn post URL with your target post in the initial scraper node.
- **Daily usage**: process only 1 page of reactions per day (typically 50-100 profiles) maximum.
- Customize the AI classification prompt with your specific ICP criteria and job titles.
- Test with a small batch first to verify setup, and monitor both API costs and LinkedIn account health.
- Schedule the workflow to run daily rather than processing large batches, to maintain account safety.

## Requirements
- Apify account with API access and sufficient credits for profile scraping
- Airtable account with OAuth2 authentication configured
- OpenAI or compatible AI model credentials for prospect classification
- LinkedIn post URL with reactions to analyze (minimum 10+ reactions recommended)
- Clear definition of your Ideal Customer Profile criteria for accurate AI classification

## Customising this workflow
- **Safety first**: always verify Apify actors are cookie-free before configuring, to protect your LinkedIn account.
- Modify the ICP classification criteria in the AI prompt to match your specific target customer profile.
- Set up daily scheduling (not hourly or more frequent) to respect LinkedIn's usage patterns and avoid detection.
- Adjust rate-limiting delays based on your comfort level with LinkedIn scraping frequency.
- Add additional data fields to the Airtable schema for storing custom prospect information.
- Integrate with CRM systems like HubSpot or Salesforce for automatic lead import.
- Set up Slack notifications for new qualified prospects or daily summary reports.
- Create email marketing sequences in tools like Mailchimp for nurturing qualified leads.
- Add lead scoring based on company size, industry, or engagement level for prioritization.
- Consider rotating between different LinkedIn posts to diversify your prospect sources while maintaining daily limits.
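A minimal sketch of the kind of batching-with-delay logic the workflow applies between enrichment calls, written as a Code node; the batch size and delay values are illustrative assumptions:

```javascript
// Process profiles in small batches with a pause between batches,
// staying well under Apify/LinkedIn rate limits. Values are illustrative.
const BATCH_SIZE = 5;
const DELAY_MS = 30_000; // 30 s between batches

const items = $input.all();
const out = [];

for (let i = 0; i < items.length; i += BATCH_SIZE) {
  const batch = items.slice(i, i + BATCH_SIZE);
  out.push(...batch.map((item, j) => ({
    json: { ...item.json, batch: Math.floor(i / BATCH_SIZE), indexInBatch: j },
  })));
  if (i + BATCH_SIZE < items.length) {
    await new Promise((r) => setTimeout(r, DELAY_MS)); // throttle
  }
}

return out;
```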