by Abdullah Alshiekh
## What Problem Does It Solve?

We've all been there: you want to check if a product is cheaper on Amazon or Jumia, but opening a dozen tabs is a pain. Building a bot to do this usually fails because big e-commerce sites love to block scrapers with CAPTCHAs. This workflow fixes that headache by:

- Taking a product name from a chat message.
- Using Decodo to handle the hard part—searching Google and scraping the product pages without getting blocked.
- Using AI to read the messy HTML and pull out just the price and product name.
- Sending a clean "Best Price" summary back to the user instantly.

## How to Configure It

### Telegram Setup
- Create a bot with BotFather and paste your token into the Telegram node.
- Make sure your webhook is set up so the bot actually "hears" the messages.

### Decodo
- This is the engine that makes the workflow reliable. You'll need to add your Decodo API key in the credentials.
- We used Decodo here specifically because it handles the proxies and browser fingerprinting for you—so your Amazon requests actually go through instead of failing.

### AI Setup
- Plug in your OpenAI API key (or swap the node for Claude/Gemini if you prefer).
- The system prompt is already set up to ignore ads and find the real price, but feel free to tweak the tone.

## How It Works

1. **Trigger:** You text the bot a product name (e.g., "Sony XM5").
2. **Search:** The workflow asks Decodo to Google that specific term on sites like Amazon.eg.
3. **Scrape:** It grabs the URLs and passes them back to Decodo to fetch the page content safely.
4. **Extract:** The AI reads through the text, finds the lowest price, and ignores the clutter.
5. **Reply:** The bot texts you back with the best deal found.

## Customization Ideas

- **Go wider:** Edit the search query to check other stores like Noon or Carrefour.
- **Track trends:** Connect a Google Sheet to log what people are searching for—great for market research.

If you need any help, Get In Touch.
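The "Extract" and "Reply" steps can be sketched as an n8n Code node. This is a minimal illustration: in n8n the `items` array is provided by the runtime, so a sample is defined here to keep the snippet self-contained, and the field names (`store`, `product`, `price`) are assumptions rather than the template's exact schema.

```javascript
// Sample input: one item per scraped store page, as the AI node might emit.
// In a real n8n Code node, `items` comes from the previous node.
const items = [
  { json: { store: "Amazon.eg", product: "Sony XM5", price: "EGP 8,499" } },
  { json: { store: "Jumia", product: "Sony XM5", price: "EGP 7,999" } },
];

const offers = items
  .map((item) => ({
    ...item.json,
    // Strip currency symbols and thousands separators before comparing
    numericPrice: parseFloat(String(item.json.price).replace(/[^\d.]/g, "")),
  }))
  .filter((offer) => Number.isFinite(offer.numericPrice))
  .sort((a, b) => a.numericPrice - b.numericPrice); // cheapest first

const best = offers[0];
const reply = best
  ? `Best price: ${best.product} at ${best.store} for ${best.price}`
  : "No prices found for that product.";
// In the real Code node you would end with: return [{ json: { reply } }];
```

The parsing is deliberately tolerant: currency prefixes and comma separators are stripped before the numeric comparison, so "EGP 8,499" and "8499" compare equal.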
by vinci-king-01
# Sales Pipeline Automation Dashboard with AI Lead Intelligence

## 🎯 Target Audience

- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

## 🚀 Problem Statement

Manual lead qualification and sales pipeline management is inefficient and often leads to missed opportunities or poor lead prioritization. This template solves the challenge of automatically scoring, qualifying, and routing leads using AI-powered intelligence to maximize conversion rates and sales team productivity.

## 🔧 How it Works

This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

### Key Components

- **Dual Trigger System** - Scheduled monitoring and webhook triggers for real-time lead processing
- **AI-Powered Lead Intelligence** - Advanced scoring algorithm based on 7 key factors
- **Multi-Source Data Enrichment** - LinkedIn and Crunchbase integration for comprehensive lead profiles
- **Automated Sales Actions** - Intelligent routing, task creation, and follow-up sequences
- **Multi-Platform Integration** - HubSpot CRM, Slack notifications, and Google Sheets dashboard

## 📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |

## 🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

### Step-by-Step Configuration

#### 1. Install Community Nodes

```bash
# Install required community nodes
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

#### 2. Configure ScrapeGraphAI Credentials

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up HubSpot CRM Integration

- Add HubSpot API credentials
- Grant necessary permissions for contacts, deals, and tasks
- Configure custom properties for lead scoring and qualification
- Test the connection to ensure it's working

#### 4. Set up Google Sheets Connection

- Add Google Sheets OAuth2 credentials
- Grant necessary permissions for spreadsheet access
- Create a new spreadsheet for sales pipeline data
- Configure the sheet name (default: "Sales Pipeline")

#### 5. Configure Lead Scoring Parameters

- Update the lead scoring weights in the Code node
- Customize ideal customer profile criteria
- Set automation trigger thresholds
- Adjust sales rep assignment logic

#### 6. Set up Notification Channels

- Configure Slack webhook or API credentials
- Set up email service credentials for welcome emails
- Define notification preferences for different lead grades
- Test notification delivery

#### 7. Configure Triggers

- Set up the webhook endpoint for real-time lead capture
- Configure the scheduled trigger for periodic monitoring
- Choose appropriate time zones for your business hours
- Test both trigger mechanisms

#### 8. Test and Validate

- Run the workflow manually with sample lead data
- Check HubSpot for proper contact and deal creation
- Verify Google Sheets data formatting
- Test all notification channels

## 🔄 Workflow Customization Options

### Modify Lead Scoring Algorithm

- Adjust scoring weights for different factors
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

### Extend Data Enrichment

- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent signal detection

### Customize Sales Automation

- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

### Output Customization

- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion rate tracking and analysis

## 📈 Use Cases

- **Lead Qualification**: Automatically score and qualify incoming leads
- **Sales Pipeline Management**: Streamline the entire sales process
- **Lead Routing**: Intelligently assign leads to appropriate sales reps
- **Follow-up Automation**: Ensure consistent and timely follow-up
- **Sales Intelligence**: Provide comprehensive lead insights
- **Performance Tracking**: Monitor sales team and pipeline performance

## 🚨 Important Notes

- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

## 🔧 Troubleshooting

Common issues:

- **ScrapeGraphAI connection errors**: Verify API key and account status
- **HubSpot API errors**: Check API key and permissions
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Lead scoring errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust request frequency and implement delays

Support resources:

- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
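The weighted-scoring Code node described in step 5 can be sketched as follows. The factor names, weights, and grade cutoffs below are illustrative placeholders, not the template's shipped values; the template bases its score on 7 factors, and you adjust the actual weights in its Code node.

```javascript
// Hypothetical scoring weights (sum to 100). The template's 7 factors
// and weights may differ; edit them in the Code node as per step 5.
const weights = {
  jobTitleMatch: 25,    // decision-maker titles score higher
  companySize: 15,
  industryFit: 15,
  fundingStage: 15,
  emailQuality: 10,     // corporate domain vs. free mail
  engagement: 10,
  dataCompleteness: 10,
};

// Each factor is a 0..1 signal computed from the enriched lead profile
function scoreLead(factors) {
  let score = 0;
  for (const [factor, weight] of Object.entries(weights)) {
    score += weight * (factors[factor] ?? 0);
  }
  return Math.round(score); // 0-100, matching the lead_score column
}

// Map the numeric score onto the grade scale from the sheet spec
function gradeFor(score) {
  if (score >= 90) return "A+";
  if (score >= 80) return "A";
  if (score >= 70) return "B+";
  if (score >= 60) return "B";
  return "C+";
}

const factors = {
  jobTitleMatch: 1, companySize: 0.8, industryFit: 1,
  fundingStage: 0.6, emailQuality: 1, engagement: 0.5, dataCompleteness: 0.9,
};
const score = scoreLead(factors);
const lead = { lead_score: score, grade: gradeFor(score) };
```

Keeping the weights in one object makes step 5 ("update the lead scoring weights in the Code node") a single-edit change.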
by Amirul Hakimi
# 🚀 Enrich CRM Leads with LinkedIn Company Data Using AI

## Who's it for

Sales teams, marketers, and business development professionals who need to automatically enrich their CRM records with detailed company information from LinkedIn profiles. Perfect for anyone doing B2B outreach who wants to personalize their messaging at scale.

## What it does

This workflow transforms bare-bones lead records into rich, personalized prospect profiles by:

- Automatically scraping LinkedIn company profiles
- Using AI (GPT-4) to extract key business intelligence
- Generating 15+ email-ready personalization variables
- Updating your CRM with structured, actionable data

The workflow pulls company overviews, products/services, funding information, and recent posts, and converts everything into natural-language variables that can be dropped directly into your outreach templates.

## How it works

1. **Trigger**: Workflow starts when a new lead is added to Airtable (or on a schedule)
2. **Fetch**: Retrieves the lead record containing the LinkedIn company URL
3. **Scrape**: Pulls the raw HTML from the company's LinkedIn profile
4. **Clean**: Strips HTML tags and formats the content for AI processing
5. **Analyze**: GPT-4 extracts structured company intelligence (overview, products, market presence, recent posts)
6. **Transform**: Converts the analysis into 15+ email-ready variables with natural phrasing
7. **Update**: Writes the enriched data back to your CRM

## Setup Requirements

- **Airtable account** (free tier works fine)
- **OpenAI API key** (GPT-4o-mini recommended for cost-effectiveness)
- **LinkedIn company URLs** stored in your CRM
- **5 minutes** for initial configuration

## How to set up

1. **Configure the Airtable Connection**
   - Replace YOUR_AIRTABLE_BASE_ID with your base ID
   - Replace YOUR_TABLE_ID with your leads table ID
   - Ensure your table has a "LinkedIn Organization URL" field
   - Add your Airtable API credentials
2. **Add OpenAI Credentials**
   - Click on both OpenAI nodes and add your OpenAI API key
   - GPT-4o-mini is recommended (cost-effective and fast)
3. **Set Up the Trigger**
   - Add a trigger node (Schedule, Webhook, or Airtable trigger)
   - Configure it to run when new leads are added, or on a daily schedule
4. **Test the Workflow**
   - Add a test lead with a LinkedIn company URL
   - Execute
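The "Clean" step above (strip HTML tags, format for AI processing) can be sketched with a few regex passes. This is a minimal illustration, not the template's exact node; real LinkedIn markup is messy, and a proper HTML parser may be safer, but the idea is to shrink the page to plain text before it hits the GPT-4 prompt.

```javascript
// Strip tags and collapse whitespace so scraped HTML fits in the AI prompt.
function cleanHtml(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop inline styles
    .replace(/<[^>]+>/g, " ")                    // strip remaining tags
    .replace(/&amp;/g, "&")                      // decode a few common entities
    .replace(/&nbsp;/g, " ")
    .replace(/\s+/g, " ")                        // collapse runs of whitespace
    .trim();
}

const sample =
  "<div><h1>Acme Corp</h1><p>We build&nbsp;robots &amp; tools.</p></div>";
const cleaned = cleanHtml(sample);
// cleaned === "Acme Corp We build robots & tools."
```

Cutting scripts, styles, and markup before the AI call also reduces token usage, which matters when running GPT-4o-mini over many leads.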
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🚀 Transform Your Job Hunt with AI-Powered Telegram Bot

Turn job searching into a conversational experience! This intelligent Telegram bot automatically scrapes job postings from LinkedIn, Indeed, and Monster, filters for sales & marketing positions, and delivers personalized results directly to your chat.

## ✨ Key Features

- **Interactive Telegram Commands**: Simple /jobs [keyword] [location] searches
- **Multi-Platform Scraping**: Simultaneous data collection from 3 major job boards
- **AI-Powered Filtering**: Smart relevance detection and experience-level classification
- **Real-Time Notifications**: Instant job alerts delivered to Telegram
- **Automated Data Storage**: Saves results to Google Sheets and Airtable
- **Duplicate Removal**: Advanced deduplication across platforms
- **Mobile-First Experience**: Full job search functionality through Telegram

## 🎯 Perfect For

- **Sales Professionals**: Account managers, sales representatives, business development
- **Marketing Experts**: Digital marketers, marketing managers, growth specialists
- **Recruiters**: Streamlined candidate sourcing and job market analysis
- **Job Seekers**: Hands-free job discovery with instant notifications

## 🛠️ Setup Requirements

### Required Credentials

- **Telegram Bot Token**: Create a bot via @BotFather
- **Bright Data API**: Professional web scraping service (LinkedIn/Indeed datasets)
- **Google Sheets OAuth2**: For spreadsheet integration
- **Airtable Token**: Database storage and management

### Prerequisites

- n8n instance with HTTPS enabled (required for Telegram webhooks)
- Valid domain name with SSL certificate
- Basic understanding of Telegram bot commands

## 🔧 How It Works

### User Experience

1. Send /start to activate the bot and see available commands
2. Use /jobs sales manager New York to search for specific positions
3. Receive formatted job results instantly in Telegram
4. Click "Apply Now" links to go directly to job postings
5. All jobs are automatically saved to your connected spreadsheets

### Behind the Scenes

- **Command Processing**: Bot parses user input for keywords and location
- **Parallel Scraping**: Simultaneous API calls to LinkedIn, Indeed, and Monster
- **AI Processing**: Intelligent filtering, experience-level detection, remote-work identification
- **Data Enhancement**: Salary extraction, duplicate removal, relevance scoring
- **Multi-Format Storage**: Automatic saving to Google Sheets, Airtable, and JSON export
- **Real-Time Response**: Formatted results delivered back to the Telegram chat

## 🎨 Telegram Bot Commands

- /start - Welcome message and command overview
- /jobs [keyword] [location] - Search for jobs (e.g., /jobs marketing manager remote)
- /help - Show detailed help information
- /status - Check bot status and recent activity

## 📊 Sample Output

The bot delivers beautifully formatted job results:

🎯 Job Search Results 🎯
Found 7 relevant opportunities
Platforms: linkedin, indeed, monster
Remote jobs: 3
───────────────────
💼 Senior Sales Manager
🏢 TechCorp Industries
📍 New York, NY
💰 $80,000 - $120,000
🌐 Remote Available
📊 senior level
🔗 Apply Now

## 🔒 Security & Best Practices

- **Rate Limiting**: Built-in Telegram API compliance (30 requests/second)
- **Error Handling**: Graceful failure recovery with user-friendly messages
- **Input Validation**: Sanitized user input to prevent injection attacks
- **Credential Management**: Secure API key storage using the n8n credentials system
- **HTTPS Enforcement**: Required for production Telegram webhook integration

## 📈 Benefits & ROI

- **95% Time Reduction**: Automated job discovery vs. manual searching
- **Multi-Source Coverage**: Access 3 major job platforms simultaneously
- **Mobile Accessibility**: Search jobs anywhere using the Telegram mobile app
- **Real-Time Alerts**: Never miss new opportunities with instant notifications
- **Data Organization**: Automatic spreadsheet management for job tracking
- **Market Intelligence**: Comprehensive job market analysis and trends

## 🚀 Advanced Customization

- **Custom Keywords**: Modify filtering logic for specific industries
- **Location Targeting**: Adjust geographic search parameters
- **Experience Levels**: Fine-tune senior/mid/entry level detection
- **Additional Platforms**: Easily add more job boards via HTTP requests
- **Notification Scheduling**: Set up periodic automated job alerts
- **Team Integration**: Deploy for multiple users or team channels

## 💡 Use Cases

- **Individual Job Seekers**: Personal job hunting assistant
- **Recruitment Agencies**: Streamlined candidate sourcing
- **Sales Teams**: Territory-specific opportunity monitoring
- **Marketing Departments**: Industry trend analysis and competitor tracking
- **Career Coaches**: Client job market research and opportunity identification

Ready to revolutionize your job search? Deploy this workflow and start receiving personalized job opportunities directly in Telegram!
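The cross-platform duplicate removal mentioned under "Data Enhancement" can be sketched as keying each posting on a normalized title + company pair. This is an assumption about the dedup key (the template's actual logic may use URLs or IDs); the point is that light normalization catches the same job listed on LinkedIn, Indeed, and Monster with slightly different formatting.

```javascript
// Lowercase and strip punctuation so near-identical listings collide.
function normalize(s) {
  return String(s).toLowerCase().replace(/[^a-z0-9]+/g, " ").trim();
}

// Keep the first occurrence of each (title, company) pair across platforms.
function dedupe(jobs) {
  const seen = new Set();
  return jobs.filter((job) => {
    const key = `${normalize(job.title)}|${normalize(job.company)}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const jobs = [
  { title: "Senior Sales Manager", company: "TechCorp Industries", platform: "linkedin" },
  { title: "Senior Sales Manager!", company: "TechCorp  Industries", platform: "indeed" },
  { title: "Marketing Lead", company: "Growthly", platform: "monster" },
];
const unique = dedupe(jobs);
// unique.length === 2 (the Indeed posting duplicates the LinkedIn one)
```

Because the first occurrence wins, ordering the merged list by preferred platform before deduping decides which source's copy survives.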
by Omer Fayyaz
An intelligent web scraping workflow that automatically routes URLs to site-specific extraction logic, normalizes data across multiple sources, and filters content by freshness to build a unified article feed.

## What Makes This Different

- **Intelligent Source Routing** - Uses a Switch node to route URLs to specialized extractors based on a source identifier, enabling custom CSS selectors per publisher for maximum accuracy
- **Universal Fallback Parser** - An advanced regex-based extractor handles unknown sources automatically, extracting title, description, author, date, and images from meta tags and HTML patterns
- **Freshness Filtering** - A built-in 45-day freshness threshold filters outdated content before saving, with configurable date validation logic
- **Tier-Based Classification** - Automatically categorizes articles into Tier 1 (0-7 days), Tier 2 (8-14 days), Tier 3 (15-30 days), or Archive based on publication date
- **Rate Limiting & Error Handling** - A built-in 3-second delay between requests prevents server overload, with comprehensive error handling that continues processing even if individual URLs fail
- **Status Tracking** - Updates the source spreadsheet with processing status, enabling easy monitoring and retry logic for failed extractions

## Key Benefits of Multi-Source Content Aggregation

- **Scalable Architecture** - Easily add new sources by adding a Switch rule and an extraction node; no code changes needed for most sites
- **Data Normalization** - Standardizes extracted data across all sources into a consistent format (title, description, author, date, image, canonical URL)
- **Automated Processing** - Schedule-based execution (every 4 hours) or manual triggers keep your feed updated without manual intervention
- **Quality Control** - Freshness filtering ensures only recent, relevant content enters your feed, reducing noise from outdated articles
- **Flexible Input** - Reads from Google Sheets, making it easy to add URLs in bulk or integrate with other systems
- **Comprehensive Metadata** - Captures full article metadata, including canonical URLs, publication dates, author information, and featured images

## Who's it for

This template is designed for content aggregators, news monitoring services, content marketers, SEO professionals, researchers, and anyone who needs to collect and normalize articles from multiple websites. It's perfect for organizations that need to monitor competitor content, aggregate industry news, build content databases, track publication trends, or create unified article feeds without manually scraping each site or writing custom scrapers for every source.

## How it works / What it does

This workflow creates a unified article aggregation system that reads URLs from Google Sheets, routes them to site-specific extractors, normalizes the data, filters by freshness, and saves the results to a feed. The system:

1. **Reads Pending URLs** - Fetches URLs with source identifiers from Google Sheets, filtering for entries with "Pending" status
2. **Processes with Rate Limiting** - Loops through URLs one at a time with a 3-second delay between requests to respect server resources
3. **Fetches HTML Content** - Downloads page HTML with proper browser headers (User-Agent, Accept, Accept-Language) to avoid blocking
4. **Routes by Source** - A Switch node directs URLs to specialized extractors (Site A, B, C, D) or the universal fallback parser based on the Source field
5. **Extracts Article Data** - Site-specific HTML nodes use custom CSS selectors, while the fallback uses regex patterns to extract title, description, author, date, image, and canonical URL
6. **Normalizes Data** - Standardizes all extracted fields into a consistent format, handling missing values and trimming whitespace
7. **Filters by Freshness** - Validates publication dates and filters out articles older than 45 days (configurable threshold)
8. **Calculates Tier & Status** - Assigns a tier classification and freshness status based on article age
9. **Saves to Feed** - Appends normalized articles to the Article Feed sheet with all metadata
10. **Updates Status** - Marks processed URLs as complete in the source sheet for tracking

**Key Innovation: Source-Based Routing** - Unlike generic scrapers that use one-size-fits-all extraction, this workflow uses intelligent routing to apply site-specific CSS selectors. This dramatically improves extraction accuracy while maintaining a universal fallback for unknown sources, making it both precise and extensible.

## How to set up

### 1. Prepare Google Sheets

- Create a Google Sheet with two tabs: "URLs to Process" and "Article Feed"
- In the "URLs to Process" sheet, create columns: URL, Source, Status
- Add sample data: URLs in the URL column, source identifiers (e.g., "Site A", "Site B") in the Source column, and "Pending" in the Status column
- In the "Article Feed" sheet, the workflow will automatically create columns: Title, Description, Author, datePublished, imageUrl, canonicalUrl, source, sourceUrl, tier, freshnessStatus, extractedAt
- Verify your Google Sheets credentials are set up in n8n (OAuth2 recommended)

### 2. Configure Google Sheets Nodes

- Open the "Read Pending URLs" node and select your spreadsheet from the document dropdown
- Set the sheet name to "URLs to Process"
- Configure the "Save to Article Feed" node: select the same spreadsheet, set the sheet name to "Article Feed", and set the operation to "Append or Update"
- Configure the "Update URL Status" node: same spreadsheet, "URLs to Process" sheet, operation "Update"
- Test the connection by running the "Read Pending URLs" node manually to verify it can access your sheet

### 3. Customize Source Routing

- Open the "Source Router" (Switch node) to see the current routing rules for Site A, B, C, D, and the fallback
- To add a new source: click "Add Rule" and set the condition {{ $('Loop Over URLs').item.json.Source }} equals your source name
- Create a new HTML extraction node for your source with appropriate CSS selectors
- Connect the new extractor to the "Normalize Extracted Data" node
- Update the Switch node to route to your new extractor

Example CSS selectors for common sites:

```
// WordPress sites
title: "h1.entry-title, .post-title"
author: ".author-name, .byline a"
date: "time.entry-date, time[datetime]"

// Modern CMS
title: "h1.article__title, article h1"
author: ".article__byline a, a[rel='author']"
date: "time[datetime], meta[property='article:published_time']"
```

### 4. Configure Freshness Threshold

- Open the "Freshness Filter (45 days)" IF node
- The current threshold is 45 days (configurable in the condition expression)
- To change the threshold, modify the expression cutoffDate.setDate(cutoffDate.getDate() - 45) to your desired number of days
- The filter marks articles as "Fresh" (within threshold) or routes them to the "Outdated" handler
- Test with sample URLs to verify that date parsing works correctly for your sources

### 5. Set Up Scheduling & Test

- The workflow includes both a Manual Trigger (for testing) and a Schedule Trigger (runs every 4 hours)
- To customize the schedule, open the "Schedule (Every 4 Hours)" node and adjust the interval
- For initial testing, use the Manual Trigger and add 2-3 test URLs to your sheet with Status="Pending"
- Verify execution: check that URLs are fetched, routed correctly, extracted, and saved to the Article Feed
- Monitor the "Completion Summary" node output to see processing statistics
- Check the execution logs for any errors in HTML extraction or date parsing
- Common issues: missing CSS selectors (update the extractor), date format mismatches (adjust date parsing), or rate limiting (increase the wait time if needed)

## Requirements

- **Google Sheets Account** - Active Google account with OAuth2 credentials configured in n8n for reading and writing spreadsheet data
- **Source Spreadsheet** - Google Sheet with "URLs to Process" and "Article Feed" tabs, properly formatted with the required columns
- **n8n Instance** - Self-hosted or cloud n8n instance with access to external websites (the HTTP Request node needs internet connectivity)
- **Source Knowledge** - Understanding of the target websites' HTML structure to configure CSS selectors for site-specific extractors (or use the fallback parser for unknown sources)
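The tier and freshness logic described above can be sketched in a few lines. The thresholds mirror the stated tiers (0-7, 8-14, 15-30 days, then Archive) and the 45-day freshness cutoff; the exact expression in the template's IF and Code nodes may be written differently.

```javascript
const FRESHNESS_DAYS = 45;

// Bucket an article by its age in days relative to `now`.
function classify(datePublished, now = new Date()) {
  const ageDays = Math.floor((now - new Date(datePublished)) / 86400000);
  if (Number.isNaN(ageDays)) return { tier: "Unknown", fresh: false };
  let tier;
  if (ageDays <= 7) tier = "Tier 1";
  else if (ageDays <= 14) tier = "Tier 2";
  else if (ageDays <= 30) tier = "Tier 3";
  else tier = "Archive";
  return { tier, fresh: ageDays <= FRESHNESS_DAYS, ageDays };
}

const now = new Date("2024-03-01T00:00:00Z");
const recent = classify("2024-02-25T00:00:00Z", now); // 5 days old
const old = classify("2023-12-01T00:00:00Z", now);    // ~91 days old
// recent → Tier 1, fresh; old → Archive, filtered out before saving
```

Note the guard for unparseable dates: sources with unusual date formats fall into "Unknown" rather than silently passing the filter, which matches the setup advice to test date parsing per source.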
by Luis Hernandez
## Overview

This comprehensive n8n workflow automates the generation and distribution of detailed monthly technical support reports from GLPI (an IT Service Management platform). The workflow intelligently calculates SLA compliance, analyzes technician performance, and delivers professionally formatted HTML reports via email.

## ✨ Key Features

### Intelligent SLA Calculation

- **Business Hours Tracking**: Automatically calculates resolution time counting only working hours (excludes weekends and lunch breaks)
- **Configurable Schedule**: Customizable work hours (default: 8 AM - 12 PM, 1 PM - 6 PM)
- **Dynamic SLA Monitoring**: Real-time compliance tracking with configurable thresholds (default: 24 hours)
- **Visual Indicators**: Color-coded alerts for critical SLA breaches and high-volume warnings

### Comprehensive Reporting

- **General Summary**: Total cases, plus open, in-progress, resolved, and closed tickets
- **Performance Metrics**: Total and average resolution hours in both decimal and formatted (hours/minutes) display
- **Technician Breakdown**: Individual performance analysis per technician, including case distribution and SLA compliance
- **Smart Alerts**: Automatic warnings for high case volumes (>100 in progress) and critical SLA levels (<50%)

### Professional Email Delivery

- **Responsive HTML Design**: Mobile-optimized email templates with elegant styling
- **Dynamic Content**: Conditional formatting based on performance metrics
- **Automatic Scheduling**: Monthly execution on the 6th day to ensure accurate SLA measurement

## 💼 Business Benefits

### Time Savings

- **Eliminates Manual Work**: Saves 2-4 hours per month previously spent compiling reports manually
- **Automated Data Collection**: No more exporting CSVs or copying data between systems
- **One-Click Setup**: Configure once and receive reports automatically every month

### Improved Decision Making

- **Real-Time Insights**: Identify bottlenecks and performance issues immediately
- **Technician Accountability**: Clear visibility into individual and team performance
- **SLA Compliance Tracking**: Proactively manage service level agreements before they become critical

### Enhanced Communication

- **Stakeholder Ready**: Professional reports suitable for management presentations
- **Consistent Format**: Standardized metrics ensure month-over-month comparability
- **Instant Distribution**: Automatic email delivery to relevant stakeholders

## 🔧 Technical Specifications

### Requirements

- n8n instance (self-hosted or cloud)
- GLPI server with API access enabled
- Gmail account (or any SMTP-compatible email service)
- GLPI API credentials (App-Token and user credentials)

### Configuration Points

- **Variables Node**: Server URL, API tokens, entity name, work hours, SLA limits
- **Schedule Trigger**: Monthly execution timing (default: 6th of each month)
- **Email Recipient**: Target email address for report delivery
- **Date Range Logic**: Automatic previous-month calculation

### Data Processing

- Retrieves up to 999 tickets per execution (configurable)
- Filters by entity and date range
- Excludes weekends and non-business hours from calculations
- Groups data by technician for detailed analysis

## 📋 Setup Instructions

### Prerequisites

1. **GLPI Configuration**: Enable the API and configure the Tickets panel with the required fields (ID, Title, Status, Opening Date, Closing Date, Resolution Date, Priority, Requester, Assigned To)
2. **API Credentials**: Create Basic Auth credentials in n8n for GLPI API access
3. **Email Authentication**: Set up Gmail OAuth2 or SMTP credentials in n8n

### Implementation Steps

1. Import the workflow JSON into your n8n instance
2. Configure the Variables node with your GLPI server details and business hours
3. Set up GLPI API credentials in the HTTP Request nodes
4. Configure email credentials in the Gmail node
5. Update the recipient email address
6. Test the workflow manually before enabling the schedule
7. Activate the workflow for automatic monthly execution

## 🎯 Use Cases

- **IT Support Teams**: Track helpdesk performance and SLA compliance
- **Service Managers**: Monitor team productivity and identify training needs
- **Executive Reporting**: Provide high-level summaries to stakeholders
- **Resource Planning**: Identify workload distribution and capacity issues
- **Compliance Auditing**: Maintain historical records of SLA performance

## 📈 ROI Impact

- **Time Savings**: 24-48 hours of manual reporting eliminated annually
- **Error Reduction**: Eliminates human calculation errors in SLA tracking
- **Faster Response**: Early alerts enable proactive issue resolution
- **Better Visibility**: Data-driven insights improve team management
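The business-hours SLA calculation can be sketched as follows. The working windows match the template's stated defaults (8 AM-12 PM and 1 PM-6 PM, weekends excluded), but the iteration approach is purely illustrative; the template's Code node may compute this more efficiently, and real deployments should mind time zones.

```javascript
// Count only minutes inside the working windows on weekdays.
// Iterates in 15-minute steps for clarity, not performance.
const WORK_WINDOWS = [[8, 12], [13, 18]]; // [startHour, endHour) pairs
const STEP_MIN = 15;

function isBusinessTime(d) {
  const day = d.getUTCDay();
  if (day === 0 || day === 6) return false; // Sunday or Saturday
  const h = d.getUTCHours() + d.getUTCMinutes() / 60;
  return WORK_WINDOWS.some(([a, b]) => h >= a && h < b);
}

function businessHoursBetween(openedAt, resolvedAt) {
  let minutes = 0;
  for (
    let t = new Date(openedAt);
    t < resolvedAt;
    t = new Date(t.getTime() + STEP_MIN * 60000)
  ) {
    if (isBusinessTime(t)) minutes += STEP_MIN;
  }
  return minutes / 60;
}

// Ticket opened Monday 09:00, resolved Monday 14:00:
// 3h of the morning window + 1h of the afternoon window = 4 business hours
const opened = new Date("2024-01-08T09:00:00Z");
const resolved = new Date("2024-01-08T14:00:00Z");
const hours = businessHoursBetween(opened, resolved);
```

A ticket is then SLA-compliant when `businessHoursBetween(opened, resolved)` stays under the configured threshold (24 hours by default).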
by Intuz
This n8n template from Intuz provides a complete solution for automating your entire invoicing process. It intelligently syncs confirmed sales orders from your Airtable base to QuickBooks, automatically creating new customers if they don't exist before generating a perfectly matched invoice. It then logs all invoice details back into Airtable, creating a flawless, end-to-end financial workflow.

## Use Cases

1. **Accounting & Finance Teams**: Automatically generate QuickBooks invoices from new orders confirmed in Airtable. Keep all invoices and customer details synced across systems in real time.
2. **Sales & Operations Teams**: Track order status and billing progress directly from Airtable without switching platforms. Ensure every confirmed sale automatically triggers an invoice in QuickBooks.
3. **Business Owners / Admins**: Eliminate double entry between Airtable and QuickBooks. Maintain accurate, audit-ready financial records with minimal effort.

## How it works

1. **Trigger from Airtable**: The workflow starts instantly when a sales order is ready to be invoiced in your Airtable base (triggered via a webhook).
2. **Check for Customer in QuickBooks**: It searches your QuickBooks account to see if the customer from the sales order already exists.
3. **Create New Customer (If Needed)**: If the customer is not found, it automatically creates a new customer record in QuickBooks using the details from your Airtable Customers table.
4. **Create QuickBooks Invoice**: Using the correct customer record (either existing or newly created), it gathers all order line items from Airtable and generates a detailed invoice in QuickBooks.
5. **Log Invoice Back to Airtable**: After the invoice is successfully created, the workflow updates your Airtable base by adding a new record to your Invoices & Payments table and updating the original Confirmed Orders record with the new QuickBooks Invoice ID, marking it as synced.

## Key Requirements to Use This Template

1. **n8n Instance**: An active n8n account (Cloud or self-hosted).
2. **Airtable Base**: An Airtable base on a "Pro" plan or higher, with tables for Confirmed Orders, Customers, Order Lines, Product & Service, and Invoices & Payments. Field names must match those in the setup guide.
3. **QuickBooks Online Account**: An active QuickBooks Online account with API access.

## Step-by-Step Setup Instructions

### Step 1: Import and Configure the n8n Workflow

- **Import Workflow:** In n8n, import the Client-Quickbook-Invoices-via-AirTable.json file.
- **Get Webhook URL:** Click on the first node, "Webhook". Copy the "Test URL". Keep this n8n tab open.
- **Configure Airtable Nodes:** There are six Airtable nodes. For each one, connect your Airtable credentials and select the correct Base and Table.
- **Configure QuickBooks Nodes:** There are four QuickBooks-related nodes. For each one, connect your QuickBooks Online credentials.
- **CRITICAL:** Click on the "Create Invoice URL" (HTTP Request) node. You must edit the URL and replace the placeholder number (9341455145770046) with your own QuickBooks Company ID. (Find this in your QuickBooks account settings under "Billing & Subscription".)
- **Save and Activate:** Click "Save", then toggle the workflow to "Active". After activating, copy the new "Production URL" from the Webhook node.

## Customization Guide

You can adapt this template for various workflows by tweaking a few nodes:

- **Use a different Airtable Base:** Update the Base ID and Table ID in all Airtable nodes (Get Orders Records, Get Customer Details, Get Products, etc.).
- **Switch from Sandbox to Live QuickBooks:** Replace the Sandbox company ID and endpoint in the "Create Invoice URL" node with your production QuickBooks company ID.
- **Add more invoice details:** Edit the Code and Parse nodes in the HTTP step to include additional fields (such as Tax, Shipping, or Notes).
- **Support multiple currencies:** Add a "Currency" field mapping in both the Airtable and QuickBooks nodes.
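The payload assembled before the "Create Invoice URL" HTTP call can be sketched as a mapping from Airtable order-line records to QuickBooks Online invoice lines (`SalesItemLineDetail`). The Airtable-side field names below (`quantity`, `unitPrice`, `quickbooksItemId`) are illustrative assumptions; map them to your own Order Lines table.

```javascript
// Build a QuickBooks Online invoice body from Airtable order lines.
function buildInvoicePayload(customerId, orderLines) {
  return {
    CustomerRef: { value: String(customerId) },
    Line: orderLines.map((line) => ({
      DetailType: "SalesItemLineDetail",
      Amount: line.quantity * line.unitPrice, // QBO expects the line total
      Description: line.description,
      SalesItemLineDetail: {
        ItemRef: { value: String(line.quickbooksItemId) }, // QBO item, not Airtable ID
        Qty: line.quantity,
        UnitPrice: line.unitPrice,
      },
    })),
  };
}

const payload = buildInvoicePayload("42", [
  { description: "Widget", quantity: 2, unitPrice: 25, quickbooksItemId: "7" },
  { description: "Setup fee", quantity: 1, unitPrice: 100, quickbooksItemId: "8" },
]);
// payload is POSTed to .../v3/company/{companyId}/invoice
```

Note that each line references a QuickBooks Item ID, which is why the template keeps a Product & Service table in Airtable: it maps your products to their QuickBooks counterparts.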
## Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started
by Abdullah Alshiekh
What Problem Does It Solve?
Brands and marketers spend hours manually searching Google for product reviews. Reading through multiple websites to gauge general sentiment is tedious and inefficient. It is difficult to spot recurring customer complaints or praises without aggregating data. This workflow solves these problems by:
Instantly searching and scraping review content from the web.
Using AI to read and score the sentiment of every review found.
Generating a consolidated "Executive Summary" with key quotes and actionable advice.
How to Configure It
Telegram Setup
Connect your Telegram Bot credentials in n8n. Set the Get Message node to watch for text messages.
Search & Scraping (Decodo)
Connect your Decodo credentials (requires a Web Scraping API plan). This handles both the Google Search and the content extraction.
AI Setup
Add your Google Gemini API key. The prompts are pre-configured to act as a "Strict Data Analyst," but you can edit the system prompt in the AI Agent node to match your preferred tone.
How It Works
**Trigger:** You send a company or product name (e.g., "XQ Pharma") to your Telegram bot.
**Search:** The workflow uses Decodo to run a Google search for "[Name] reviews" and extracts the top URL results.
**Scrape:** It visits the review pages and strips away the HTML code to get clean text.
**Analyze (Loop):** The first AI Agent reads the text and determines the sentiment (Positive/Neutral/Negative) and key topics.
**Report:** A second AI Agent collects all the analysis pieces and writes a final summary containing a **Sentiment Score**, **Customer Voice** (direct quotes), and an **Actionable Verdict**.
**Delivery:** The final report is sent back to you as a Telegram message.
Customization Ideas
**Change the Source:** Modify the search query to target specific platforms (e.g., "site:reddit.com [Product] reviews").
**Change the Output:** Send the final report to a **Slack channel** or **Email** for your team to see.
**Database Logging:** Save the "Actionable Verdict" and sentiment scores into **Notion** or **Airtable** to track brand reputation over time.
**Competitor Analysis:** Use it to research competitor products instead of your own to find their weaknesses.
If you need any help
Get in Touch
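For reference, the roll-up from per-review labels to the single Sentiment Score in the final report could be sketched as follows. This is a minimal illustration, not the template's actual code, and the review object shape (`{ sentiment: "Positive" | "Neutral" | "Negative" }`) is an assumption.

```javascript
// Sketch: aggregate the per-review labels produced by the Analyze loop
// into one 0-100 sentiment score for the Executive Summary.
function sentimentScore(reviews) {
  const weight = { Positive: 1, Neutral: 0, Negative: -1 };
  if (reviews.length === 0) return 0;
  const total = reviews.reduce((sum, r) => sum + (weight[r.sentiment] ?? 0), 0);
  // Average is in [-1, 1]; shift and scale so the report reads like a percentage.
  return Math.round(((total / reviews.length) + 1) * 50);
}
```

A balanced mix of positive and negative reviews lands at 50; all-positive input scores 100.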
by Khairul Muhtadin
Stop wasting hours manually hunting for business leads. This workflow automates the entire process, from scraping Google Maps to extracting contact emails, all triggered from your phone via Telegram.
What It Does
Send a single message to your Telegram bot (Sector; Limit; MapsURL) and the system takes over. It scrapes business data from Google Maps using Apify, generates AI-powered company summaries via OpenAI, hunts for contact emails from business websites using Jina AI, then stores everything neatly in Google Sheets.
Who It's For
Sales reps building cold outreach lists, marketing agencies prospecting new clients, or anyone who needs targeted local business data fast, without paying for overpriced lead databases.
Why It's Worth It
Manual research that takes 4 hours gets done in under 5 minutes for 50 leads. Pay only for what you use (Apify + OpenAI) instead of fixed monthly subscriptions. AI deduplication keeps your CRM clean and consistent.
What You'll Need
| Tool | Purpose |
|------|---------|
| n8n | Workflow engine |
| Apify | Google Maps scraper |
| OpenAI API | Summaries & email extraction |
| Google Sheets | Lead storage |
| Telegram Bot | Mobile trigger interface |
| Jina AI | Website-to-text conversion |
Quick Setup
Import the JSON workflow into your n8n instance.
Connect credentials: Telegram bot token, Apify API key, OpenAI key, Google account.
Set up your Sheet with the matching column headers.
Test with: Coffee Shops; 5; https://www.google.com/maps/search/coffee+shops+london
How the Logic Works
The workflow runs a two-stage loop per business. First it saves core data (name, phone, address). If a website exists, it then attempts email enrichment. This way, you never lose basic lead data even if a website crawl fails.
Extend It Further
Swap Google Sheets for HubSpot or Pipedrive, push results to a Slack sales channel, or chain a Gmail node to auto-send intro emails the moment a lead is found.
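The "Sector; Limit; MapsURL" trigger message has to be split into usable fields before the scrape starts. A minimal sketch of that parsing step, as it might appear in an n8n Code node (the output field names are assumptions, not the template's exact keys):

```javascript
// Sketch: parse the Telegram trigger message "Sector; Limit; MapsURL"
// into the three values the downstream nodes need.
function parseTrigger(text) {
  const [sector, limit, mapsUrl] = text.split(";").map((s) => s.trim());
  return { sector, limit: parseInt(limit, 10), mapsUrl };
}

// Example using the test message from Quick Setup:
const lead = parseTrigger(
  "Coffee Shops; 5; https://www.google.com/maps/search/coffee+shops+london"
);
```

Splitting on semicolons (rather than commas) keeps the Maps URL intact, since Google Maps URLs routinely contain commas but not semicolons.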
Created by: Khaisa Studio
Category: Marketing | Tags: Lead Gen, AI, Google Maps, Telegram
Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.
How it works?
The Content Strategy AI Pipeline is an intelligent, multi-stage content creation system that transforms simple user prompts into polished, ready-to-publish content. The system extracts platform requirements, audience insights, and brand tone from user requests, then develops strategic reasoning and emotional connection strategies before crafting compelling content outlines and final publication-ready posts or articles. It supports both social media platforms (Instagram, LinkedIn, X, Facebook, TikTok) and blog content.
Key Differentiators: Strategic thinking approach, emotional intelligence integration, platform-native optimization, zero-editing-required output, and professional content-strategist-level quality through multi-model AI orchestration.
Technical Points
Multi-model AI orchestration for specialized tasks
Emotional psychology integration for audience connection
Platform algorithm optimization built in
Industry-standard content strategy methodology, automated
Enterprise-grade reliability with session management and memory
API-ready architecture for integration into existing workflows
Test Inputs
Sample Request: "Create an Instagram post for a fitness coach targeting busy moms, tone should be motivational and relatable"
Expected Flow: Platform: Instagram → Niche: Fitness → Audience: Busy Moms → Tone: Motivational → Output: 125-150 word post with hashtags
by vinci-king-01
Product Price Monitor with Mailgun and MongoDB
⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.
This workflow automatically scrapes multiple e-commerce sites, records weekly product prices in MongoDB, analyzes seasonal trends, and emails a concise report to retail stakeholders via Mailgun. It helps retailers make informed inventory and pricing decisions by providing up-to-date pricing intelligence.
Pre-conditions/Requirements
Prerequisites
n8n instance (self-hosted, desktop, or n8n.cloud)
ScrapeGraphAI community node installed and activated
MongoDB database (Atlas or self-hosted)
Mailgun account with a verified domain
Publicly reachable n8n Webhook URL (if self-hosted)
Required Credentials
**ScrapeGraphAI API Key** – Enables web scraping across target sites
**MongoDB Credentials** – Connection string (MongoDB URI) with read/write access
**Mailgun API Key & Domain** – To send summary emails
MongoDB Collection Schema
| Field | Type | Example Value | Notes |
|-------|------|---------------|-------|
| productId | String | SKU-12345 | Unique identifier you define |
| productName | String | Women's Winter Jacket | Human-readable name |
| timestamp | Date | 2024-09-15T00:00:00Z | Ingest date (automatically added) |
| price | Number | 79.99 | Scraped price |
| source | String | example-shop.com | Domain where price was scraped |
How it works
Key Steps:
**Webhook Trigger:** Starts the workflow on a scheduled HTTP call or manual trigger.
**Code (Prepare Products):** Defines the list of SKUs/URLs to monitor.
**Split In Batches:** Processes products in manageable chunks to respect rate limits.
**ScrapeGraphAI (Scrape Price):** Extracts price, availability, and currency from each product URL.
**Merge (Combine Results):** Re-assembles all batch outputs into one dataset.
**MongoDB (Upsert Price History):** Stores each price point for historical analysis.
**If (Seasonal Trend Check):** Compares the current price against the historical average to detect anomalies.
**Set (Email Payload):** Formats the trend report for email.
**Mailgun (Send Email):** Emails the weekly summary to the specified recipients.
**Respond to Webhook:** Returns a "200 OK – Report Sent" response for logging.
Set up steps
Setup Time: 15-20 minutes
Install Community Node
In n8n, go to "Settings → Community Nodes" and install @n8n-community/nodes-scrapegraphai.
Create Credentials
Add your ScrapeGraphAI API key under Credentials.
Add MongoDB credentials (type: MongoDB).
Add Mailgun credentials (type: Mailgun).
Import Workflow
Download the JSON template, then in n8n click "Import" and select the file.
Configure Product List
Open the Code (Prepare Products) node and replace the example array with your product objects { id, name, url }.
Adjust Cron/Schedule
If you prefer a fully automated schedule, replace the Webhook with a Cron node (e.g., every Monday at 09:00).
Verify MongoDB Collection
Ensure the collection (default: productPrices) exists, or let n8n create it on first run.
Set Recipients
In the Mailgun node, update the to, from, and subject fields.
Execute Test Run
Manually trigger the Webhook URL or run the workflow once to verify data flow and email delivery.
Activate
Toggle the workflow to "Active" so it runs automatically each week.
Node Descriptions
Core Workflow Nodes:
**Webhook** – Entry point that accepts a GET/POST call to start the job.
**Code (Prepare Products)** – Outputs an array of products to monitor.
**Split In Batches** – Limits scraping to N products per request to avoid bans.
**ScrapeGraphAI** – Scrapes the HTML of a product page and parses pricing data.
**Merge** – Re-combines batch results for streamlined processing.
**MongoDB** – Inserts or updates each product's price history document.
**If** – Determines whether the price deviates more than X% from the seasonal average.
**Set** – Builds an HTML/text email body containing the findings.
**Mailgun** – Sends the email via the Mailgun REST API.
**Respond to Webhook** – Returns an HTTP response for logging/monitoring.
**Sticky Notes** – Provide in-workflow documentation (no execution).
Data Flow:
Webhook → Code → Split In Batches
Split In Batches → ScrapeGraphAI → Merge
Merge → MongoDB → If
If (true) → Set → Mailgun → Respond to Webhook
Customization Examples
Change Scraping Frequency (Cron)
// Cron node settings
{
  "mode": "custom",
  "cronExpression": "0 6 * * 1,4" // Monday & Thursday 06:00
}
Extend Data Points (Reviews Count, Stock)
// In ScrapeGraphAI extraction config
{
  "price": "css:span.price",
  "inStock": "css:div.availability",
  "reviewCount": "regex:\"(\\d+) reviews\""
}
Data Output Format
The workflow outputs structured JSON data:
{
  "productId": "SKU-12345",
  "productName": "Women's Winter Jacket",
  "timestamp": "2024-09-15T00:00:00Z",
  "price": 79.99,
  "currency": "USD",
  "source": "example-shop.com",
  "trend": "5% below 3-month average"
}
Troubleshooting
Common Issues
ScrapeGraphAI returns empty data – Confirm selectors/XPath are correct; test with the ScrapeGraphAI playground.
MongoDB connection fails – Verify IP whitelisting for Atlas or network connectivity for a self-hosted instance.
Mail not delivered – Check Mailgun logs for bounce or spam rejection, and ensure the from domain is verified.
Performance Tips
Use smaller batch sizes (e.g., 5 URLs) to avoid target-site rate-limit blocks.
Cache static product info; scrape only fields that change (price, stock).
Pro Tips:
Integrate the If node with n8n's Slack node to push urgent price drops to a channel.
Add a Function node to calculate moving averages for deeper analysis.
Store raw HTML snapshots in S3/MinIO for auditability and debugging.
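The seasonal trend check (the If node's "deviates more than X% from the average" condition) can be sketched as a small helper, for instance in a Function node. This is an illustration under assumed field shapes, not the template's actual expression:

```javascript
// Sketch: does the current price deviate more than thresholdPct percent
// from the trailing average of the stored price history?
function priceDeviates(history, currentPrice, thresholdPct) {
  if (history.length === 0) return false; // no history yet: nothing to compare
  const avg = history.reduce((sum, p) => sum + p, 0) / history.length;
  const deviationPct = Math.abs(currentPrice - avg) / avg * 100;
  return deviationPct > thresholdPct;
}
```

With a 5% threshold, a jacket whose history averages 100 and now sells for 90 (a 10% drop) would trigger the email branch, while a move to 102 would not.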
by Meak
LinkedIn Job-Based Cold Email System
Most outreach tools rely on generic lead lists and recycled contact data. This workflow builds a live, personalized lead engine that scrapes new LinkedIn job posts, finds company decision-maker emails, and generates custom cold emails using GPT, all fully automated through n8n.
Benefits
Automated daily scraping of "Marketing Manager" jobs in Belgium
Real-time leads from companies currently hiring for marketing roles
Filters out HR and staffing agencies to keep only real businesses
Enriches each company with verified CEO, Sales, and Marketing emails
Generates unique, human-like cold emails and subject lines with GPT-4o
Saves clean data to Google Sheets and drafts personalized Gmail messages
How It Works
1. Schedule Trigger runs every morning at 08:00.
2. Apify LinkedIn Scraper collects new "Marketing Manager" jobs in Belgium.
3. Remove Duplicates ensures each company appears only once.
4. Filter Staffing excludes recruiters, HR agencies, and interim firms.
5. Save Useful Infos extracts core company data: name, domain, size, description.
6. Filter Domain & Size keeps valid websites and companies under 100 employees.
7. Anymailfinder API looks up CEO, Sales, and Marketing decision-maker emails.
8. Merge + If Node validates email results and removes invalid entries.
9. Split Out + Deduplicate ensures unique, verified contacts.
10. Extract Lead Name (Code Node) separates first and last names.
11. Google Sheets Node appends all enriched lead data to your master sheet.
12. GPT-4o (LangChain) writes a 100-120 word personalized cold email.
13. GPT-4o (LangChain) creates a short, casual subject line.
14. Gmail Draft Node builds a ready-to-send email using both outputs.
15. Wait Node loops until all leads are processed.
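The Extract Lead Name step (step 10 above) can be sketched as a simple Code-node helper. This is a minimal version assuming plain "First Last" style names; the template's actual node may handle more edge cases:

```javascript
// Sketch: split a full contact name into first and last name fields
// for the Google Sheets columns. Everything after the first token is
// treated as the last name, which covers multi-part surnames.
function splitName(fullName) {
  const parts = fullName.trim().split(/\s+/);
  return {
    firstName: parts[0] ?? "",
    lastName: parts.slice(1).join(" "),
  };
}
```

For example, "Anne Marie Van Damme" becomes firstName "Anne" with lastName "Marie Van Damme"; single-token names keep the last name empty rather than failing.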
Who Is This For
B2B agencies targeting Belgian SMEs
Outbound marketers using job postings as purchase-intent signals
Freelancers or founders running lean, automated outreach systems
Growth teams building scalable cold email engines
Setup
**Apify:** use the curious_coder~linkedin-jobs-scraper actor + API token
**Anymailfinder:** header auth with decision-maker categories (ceo, sales, marketing)
**Google Sheets:** connect a sheet named "LinkedIn Job Scraper" and map columns
**OpenAI (GPT-4o):** insert your API key into both LangChain nodes
**Gmail:** OAuth2 connection; resource set to draft
**n8n:** store all credentials securely; set HTTP nodes to continue on error
ROI & Results
Save 1-3 hours per day on manual research and outreach prep
Contact active hiring companies when they need marketing help most
Scale to multiple industries or regions by changing search URLs
Outperform paid lead databases with fresh, verified data
Strategy Insights
Add funding or tech-stack data for better lead scoring
A/B test GPT subject lines and log open rates in Sheets
Schedule GPT follow-ups 3 and 7 days later for full automation
Push all enriched data to your CRM for advanced segmentation
Use hiring signals to trigger ad audiences or retargeting campaigns
Check Out My Channel
For more advanced automation workflows that generate real client results, check out my YouTube channel, where I share the exact systems I use to automate outreach, scale agency pipelines, and close deals faster.