by Khairul Muhtadin
Decodo Amazon Product Recommender delivers instant, AI-powered shopping recommendations directly through Telegram. Send any product name and receive an Amazon product analysis featuring price comparisons, ratings, sales data, and categorized recommendations (budget, premium, best value) in under 40 seconds, eliminating hours of manual research.

## Why Use This Workflow?

- **Time Savings:** Reduce product research from 45+ minutes to under 30 seconds
- **Decision Quality:** Compare 20+ products automatically with AI-curated recommendations
- **Zero Manual Work:** Complete automation from message input to formatted recommendations

## Ideal For

- **E-commerce Entrepreneurs:** Quickly research competitor products, pricing strategies, and market trends for inventory decisions
- **Smart Shoppers & Deal Hunters:** Get instant product comparisons with sales volume data and discount tracking before purchasing
- **Product Managers & Researchers:** Analyze Amazon marketplace positioning, customer sentiment, and pricing ranges for competitive intelligence

## How It Works

1. **Trigger:** User sends a product name via Telegram (e.g., "iPhone 15 Pro Max case")
2. **AI Validation:** Gemini 2.5 Flash extracts core product keywords and validates input authenticity
3. **Data Collection:** Decodo API scrapes Amazon search results, extracting prices, ratings, reviews, sales volume, and product URLs
4. **Processing:** A JavaScript node cleans data, removes duplicates, calculates value scores, and categorizes products (top picks, budget, premium, best value, most popular) — see the sketch below
5. **Intelligence Layer:** AI generates personalized recommendations with Telegram-optimized markdown formatting, shortened product names, and clean Amazon URLs
6. **Output & Delivery:** Formatted recommendations are sent to the user with categorized options and direct purchase links
7. **Error Handling:** Admin notifications via a separate Telegram channel for workflow monitoring
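The template ships this logic inside its Code node; the exact implementation isn't published here, so the following is only a minimal sketch of how such a Processing step could look. The field names (price, rating, reviewCount) and the score formula are illustrative assumptions, not the template's actual logic:

```javascript
// Hypothetical sketch of the "Processing" Code node.
// Field names and the scoring formula are assumptions for illustration.
const seen = new Set();
const products = [];

for (const item of $input.all()) {
  const p = item.json;
  if (!p.url || seen.has(p.url)) continue; // remove duplicates by URL
  seen.add(p.url);

  const price = Number(p.price);
  if (!Number.isFinite(price) || price <= 0) continue; // skip unpriced items

  // Simple value score: favor high ratings and review counts, penalize price
  const rating = Number(p.rating) || 0;
  const reviews = Number(p.reviewCount) || 0;
  p.valueScore = (rating * Math.log10(reviews + 1)) / Math.sqrt(price);
  products.push(p);
}

const byPrice = [...products].sort((a, b) => a.price - b.price);
const byScore = [...products].sort((a, b) => b.valueScore - a.valueScore);

return [{ json: {
  topPicks:  byScore.slice(0, 3),
  budget:    byPrice.slice(0, 3),
  premium:   byPrice.slice(-3).reverse(),
  bestValue: byScore[0],
} }];
```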
## Setup Guide

### Prerequisites

| Requirement | Type | Purpose |
|-------------|------|---------|
| n8n instance | Essential | Workflow execution platform |
| Decodo Account | Essential | Amazon product data scraping |
| Telegram Bot Token | Essential | Chat interface for user interactions |
| Google Gemini API | Essential | AI-powered product validation and recommendations |
| Telegram Account | Optional | Admin error notifications |

### Installation Steps

1. Import the JSON file to your n8n instance
2. Configure credentials:
   - **Decodo API:** Sign up at decodo.com → Dashboard → Scraping APIs → Web Advanced → Copy BASIC AUTH TOKEN
   - **Telegram Bot:** Message @BotFather on Telegram → /newbot → Copy HTTP API token (format: 123456789:ABCdefGHI...)
   - **Google Gemini:** Obtain an API key from Google AI Studio for the Gemini 2.5 Flash model
3. Update environment-specific values:
   - Replace YOUR-CHAT-ID in the "Notify Admin" node with your Telegram chat ID for error notifications
   - Verify Telegram webhook IDs are properly configured
4. Customize settings:
   - Adjust the AI prompt in the "Generate Recommendations" node for different output formats
   - Set character limits (default: 2500) for Telegram message length
5. Test execution:
   - Send a test message to your Telegram bot: "iPhone 15 Pro"
   - Verify processing status messages appear
   - Confirm recommendations arrive with properly formatted links

## Customization Options

**Basic Adjustments:**

- **Character Limit:** Modify 2500 in the AI prompt to adjust response length (Telegram max: 4096)

**Advanced Enhancements:**

- **Multi-language Support:** Add language detection and translation nodes for international users
- **Price Tracking:** Integrate Google Sheets to log historical prices and trigger alerts on drops
- **Image Support:** Enable Telegram photo messages with product images from scraping results

## Troubleshooting

| Problem | Cause | Solution |
|---------|-------|----------|
| "No product detected" for valid inputs | AI validation too strict or ambiguous query | Add specific product details (model number, brand) to the user input |
| Empty recommendations returned | Decodo API rate limit or Amazon blocking | Wait 60 seconds between requests; verify Decodo account status |
| Telegram message formatting broken | Special characters in product names | Ensure Telegram markdown mode is set to "Markdown" (legacy), not "MarkdownV2" |

## Use Case Examples

**Scenario 1: E-commerce Store Owner**

- **Challenge:** Needs to quickly assess competitor pricing and product positioning for new inventory decisions without spending hours browsing Amazon
- **Solution:** Sends "wireless earbuds" to the bot and receives a categorized analysis of 20+ products with price ranges ($15-$250), top sellers, and discount opportunities
- **Result:** Identifies a $35-$50 price gap in the market, sources a comparable product, and achieves a 40% profit margin

**Scenario 2: Smart Shopping Enthusiast**

- **Challenge:** Wants to buy a laptop backpack but is overwhelmed by 200+ Amazon options with varying prices and unclear value propositions
- **Solution:** Messages "laptop backpack" to the bot and gets AI recommendations sorted by budget ($30), premium ($50+), best value (highest discount + good ratings), and most popular (by sales volume)
- **Result:** Purchases the "Best Value" recommendation with a 35% discount, saving $18 and 45 minutes of research time

Created by: Khaisa Studio
Category: AI | Productivity | E-commerce
Tags: amazon, telegram, ai, product-research, shopping, automation, gemini
Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
by Ranjan Dailata
## Who this is for

This workflow is designed for:

- Automation engineers building AI-powered data pipelines
- Product managers & analysts needing structured insights from web pages
- Researchers & content teams extracting summaries from documentation or articles
- HR, compliance, and knowledge teams converting unstructured web content into structured records
- n8n self-hosted users leveraging advanced scraping and LLM enrichment

It is ideal for anyone who wants to transform any public URL into structured data + clean summaries automatically.

## What problem this workflow solves

Web content is often unstructured, verbose, and inconsistent, making it difficult to:

- Extract structured fields reliably
- Generate consistent summaries
- Reuse data across spreadsheets, dashboards, or databases
- Eliminate manual copy-paste and interpretation

This workflow solves the problem of turning arbitrary web pages into machine-readable JSON and human-readable summaries, without custom scrapers or manual parsing logic.

## What this workflow does

The workflow integrates Decodo, Google Gemini, and Google Sheets to perform automated extraction of structured data. Here’s how it works step-by-step:

**Input Setup**

The workflow begins when the user executes it manually or passes a valid URL. The input includes `url`.

**Profile Extraction with Decodo**

- Accepts any valid URL as input
- Scrapes the page content using Decodo
- Uses Google Gemini to extract structured data in JSON format and generate a concise, factual summary
- Cleans and parses AI-generated JSON safely
- Merges structured data and summary output
- Stores the final result in Google Sheets for reporting or downstream automation

**JSON Parsing & Merging**

The Code Node cleans and parses the JSON output from the AI for reliable downstream use (a sketch follows below). The Merge Node combines both structured data and the AI-generated summary.
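The template's exact parsing code isn't reproduced here, but a minimal sketch of such a Code node might look like this; the input field name (`text`), the fence-stripping regex, and the fallback shape are assumptions:

```javascript
// Hypothetical sketch of the JSON-cleaning Code node.
// LLMs often wrap JSON in markdown fences or add stray text around it.
const raw = $input.first().json.text ?? ''; // field name is an assumption

// Strip ```json ... ``` fences and grab the outermost JSON object
const cleaned = raw.replace(/```(?:json)?/gi, '').trim();
const start = cleaned.indexOf('{');
const end = cleaned.lastIndexOf('}');

let parsed;
try {
  parsed = JSON.parse(cleaned.slice(start, end + 1));
} catch (err) {
  // Fail soft so downstream IF nodes can detect and route malformed output
  parsed = { parseError: true, message: err.message, raw };
}

return [{ json: parsed }];
```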
**Data Storage in Google Sheets**

The Google Sheets Node appends or updates the record, storing the structured JSON and summary in a connected spreadsheet.

**End Output**

Unified, machine-readable data in JSON plus an executive-level summary suitable for data analysis or downstream automation.

## Setup Instructions

**Prerequisites**

- **n8n account** with workflow editor access
- **Decodo API credentials** - register, log in, and obtain the Basic Authentication Token via the Decodo Dashboard
- **Google Gemini (PaLM) API access**
- **Google Sheets OAuth credentials**

**Setup Steps**

1. Import the workflow into your n8n instance.
2. Configure credentials:
   - Add your Decodo API credentials in the Decodo node.
   - Connect your Google Gemini (PaLM) credentials for both AI nodes.
   - Authenticate your Google Sheets account.
3. Edit the input node: in the Set the Input Fields node, replace the default URL with your desired profile or dynamic data source.
4. Run the workflow: trigger it manually or via webhook integration for automation, then verify that structured profile data and the summary are written to the linked Google Sheet.

## How to customize this workflow to your needs

You can easily extend or adapt this workflow:

**Modify Structured Output**

- Change the Gemini extraction prompt to match your own JSON schema
- Add required fields such as authors, dates, entities, or metadata

**Improve Summarization**

- Adjust summary length or tone (technical, executive, simplified)
- Add multi-language summarization using Gemini

**Change Output Destination**

Replace Google Sheets with:

- Databases (Postgres, MySQL)
- Notion
- Slack / Email
- File storage (JSON, CSV)

**Add Validation or Filtering**

Insert IF nodes to:

- Reject incomplete data
- Detect errors or hallucinated output
- Trigger alerts for malformed JSON

**Scale the Workflow**

Replace the manual trigger with:

- Webhook
- Scheduled trigger
- Batch URL processing

## Summary

This workflow provides a powerful, generic solution for converting unstructured web pages into structured, AI-enriched datasets. By combining Decodo for scraping, Google Gemini for intelligence, and Google Sheets for persistence, it enables repeatable, scalable, and production-ready data extraction without custom scrapers or brittle parsing logic.
by vinci-king-01
# Sales Pipeline Automation Dashboard with AI Lead Intelligence

## 🎯 Target Audience

- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

## 🚀 Problem Statement

Manual lead qualification and sales pipeline management is inefficient and often leads to missed opportunities or poor lead prioritization. This template solves the challenge of automatically scoring, qualifying, and routing leads using AI-powered intelligence to maximize conversion rates and sales team productivity.

## 🔧 How it Works

This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

### Key Components

1. **Dual Trigger System** - Scheduled monitoring and webhook triggers for real-time lead processing
2. **AI-Powered Lead Intelligence** - Advanced scoring algorithm based on 7 key factors (see the sketch after the column table)
3. **Multi-Source Data Enrichment** - LinkedIn and Crunchbase integration for comprehensive lead profiles
4. **Automated Sales Actions** - Intelligent routing, task creation, and follow-up sequences
5. **Multi-Platform Integration** - HubSpot CRM, Slack notifications, and Google Sheets dashboard

## 📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |
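The scoring weights live in the workflow's Code node, but the seven factors themselves aren't listed in this description. The sketch below is therefore a hypothetical illustration; the factor names, weights, and grade thresholds are all assumptions:

```javascript
// Hypothetical lead-scoring sketch; weights and thresholds are assumptions.
const lead = $input.first().json;

const weights = {
  jobTitleMatch: 25,   // decision-maker titles score higher
  companySize: 15,
  industryFit: 15,
  fundingStage: 10,
  emailQuality: 10,    // corporate domain vs. free mailbox
  engagement: 15,
  dataCompleteness: 10,
};

// Assume each factor was normalized to 0..1 upstream; multiply by its weight
let score = 0;
for (const [factor, weight] of Object.entries(weights)) {
  score += (lead.factors?.[factor] ?? 0) * weight;
}
score = Math.round(Math.min(score, 100));

const grade =
  score >= 90 ? 'A+' :
  score >= 80 ? 'A'  :
  score >= 70 ? 'B+' :
  score >= 60 ? 'B'  : 'C+';

return [{ json: { ...lead, lead_score: score, grade } }];
```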
## 🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

### Step-by-Step Configuration

**1. Install Community Nodes**

Install the required community nodes:

- `npm install n8n-nodes-scrapegraphai`
- `npm install n8n-nodes-slack`

**2. Configure ScrapeGraphAI Credentials**

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

**3. Set up HubSpot CRM Integration**

- Add HubSpot API credentials
- Grant the necessary permissions for contacts, deals, and tasks
- Configure custom properties for lead scoring and qualification
- Test the connection to ensure it's working

**4. Set up Google Sheets Connection**

- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for sales pipeline data
- Configure the sheet name (default: "Sales Pipeline")

**5. Configure Lead Scoring Parameters**

- Update the lead scoring weights in the Code node
- Customize the ideal customer profile criteria
- Set automation trigger thresholds
- Adjust the sales rep assignment logic

**6. Set up Notification Channels**

- Configure Slack webhook or API credentials
- Set up email service credentials for welcome emails
- Define notification preferences for different lead grades
- Test notification delivery

**7. Configure Triggers**

- Set up the webhook endpoint for real-time lead capture
- Configure the scheduled trigger for periodic monitoring
- Choose appropriate time zones for your business hours
- Test both trigger mechanisms

**8. Test and Validate**

- Run the workflow manually with sample lead data
- Check HubSpot for proper contact and deal creation
- Verify Google Sheets data formatting
- Test all notification channels

## 🔄 Workflow Customization Options

**Modify Lead Scoring Algorithm**

- Adjust scoring weights for different factors
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

**Extend Data Enrichment**

- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent signal detection

**Customize Sales Automation**

- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

**Output Customization**

- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion rate tracking and analysis

## 📈 Use Cases

- **Lead Qualification:** Automatically score and qualify incoming leads
- **Sales Pipeline Management:** Streamline the entire sales process
- **Lead Routing:** Intelligently assign leads to appropriate sales reps
- **Follow-up Automation:** Ensure consistent and timely follow-up
- **Sales Intelligence:** Provide comprehensive lead insights
- **Performance Tracking:** Monitor sales team and pipeline performance

## 🚨 Important Notes

- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

## 🔧 Troubleshooting

**Common Issues:**

- **ScrapeGraphAI connection errors:** Verify the API key and account status
- **HubSpot API errors:** Check the API key and permissions
- **Google Sheets permission errors:** Check the OAuth2 scope and permissions
- **Lead scoring errors:** Review the Code node's JavaScript logic
- **Rate limiting:** Adjust request frequency and implement delays

**Support Resources:**

- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
by WeblineIndia
# Real-Time WooCommerce Return Surge Detection with Slack Alerts & Airtable Logging

This n8n workflow monitors WooCommerce refund activity to detect unusual spikes in product returns at the SKU level. It compares return volumes across rolling 24-hour windows, alerts teams in Slack when defined thresholds are exceeded, and logs all detected events into Airtable for tracking and analysis.

## 🚀 Quick Start – Get This Running Fast

1. Import the workflow into n8n.
2. Connect your WooCommerce API credentials.
3. Configure Slack and Airtable credentials.
4. Set your preferred schedule interval.
5. Activate the workflow and start monitoring returns automatically.

## What It Does

This workflow is designed to automatically detect abnormal return behavior in a WooCommerce store. On every scheduled run, it fetches recent orders and refunds directly from the WooCommerce REST API. Refund records are mapped back to their original orders to accurately identify affected SKUs.

Using a rolling time-window comparison, the workflow calculates current versus previous return counts per SKU. It identifies significant increases, either large percentage spikes or unusually high absolute return volumes. This ensures early detection of potential product quality, packaging, or fulfillment issues.

When a return surge is detected, the workflow sends a structured alert to a Slack channel and stores the alert data in Airtable. This creates a searchable, historical log that supports investigations, trend analysis, and operational decision-making.

## Who’s It For

This workflow is ideal for:

- eCommerce operations teams.
- Quality assurance and product managers.
- Customer support leads.
- Supply chain and fulfillment teams.
- Store owners running WooCommerce at scale.

## Requirements to Use This Workflow

To use this workflow, you will need:

- An active WooCommerce store with REST API access.
- **WooCommerce API credentials** (Consumer Key & Secret).
- An active Slack workspace with permission to post messages.
- An Airtable base and table for logging alerts.
- An n8n instance (self-hosted or cloud).

## How It Works & How To Set Up

**Workflow Execution Flow**

1. **Schedule Trigger** runs the workflow at a fixed interval.
2. **Time Window** node defines the current and previous 24-hour comparison windows.
3. **HTTP Orders** fetches recent WooCommerce orders.
4. **HTTP Refunds** fetches refund records.
5. **Orders_Fetch (Code)** maps refunds to parent orders and extracts SKU-level data.
6. **Refund_details (Code)** aggregates returns, compares windows, and calculates increases (see the sketch below).
7. **IF Node** checks surge conditions: ≥100% increase OR ≥25 current returns.
8. **Set Fields** enriches data with status, run date, and a cooldown key.
9. **Slack Node** sends a formatted alert message.
10. **Code Node** normalizes Slack output into structured fields.
11. **Airtable Node** stores alert records for future reference.
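As a reference for the window comparison above, here is a minimal sketch of how the Refund_details Code node might aggregate and compare returns per SKU. The input shape (`sku`, `refundedAt` per item) is an assumption:

```javascript
// Hypothetical sketch of the rolling-window comparison; input shape assumed.
const WINDOW_HOURS = 24;
const now = Date.now();
const currentStart = now - WINDOW_HOURS * 3600 * 1000;
const previousStart = now - 2 * WINDOW_HOURS * 3600 * 1000;

const counts = {}; // sku -> { current, previous }
for (const item of $input.all()) {
  const { sku, refundedAt } = item.json;
  const t = new Date(refundedAt).getTime();
  if (!counts[sku]) counts[sku] = { current: 0, previous: 0 };
  if (t >= currentStart) counts[sku].current++;
  else if (t >= previousStart) counts[sku].previous++;
}

// Emit one item per SKU; the IF node can then test
// pctIncrease >= 100 || current >= 25
return Object.entries(counts).map(([sku, c]) => {
  const pctIncrease = c.previous === 0
    ? (c.current > 0 ? 100 : 0)
    : ((c.current - c.previous) / c.previous) * 100;
  return { json: { sku, ...c, pctIncrease } };
});
```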
**Setup Instructions**

1. Replace {your_woocommerce_domain} with your actual store domain.
2. Verify WooCommerce API permissions allow order and refund access.
3. Select the correct Slack channel in the Slack node.
4. Ensure Airtable column names match the workflow mappings.

## How To Customize Nodes

You can easily adapt this workflow by:

- Changing the schedule frequency in the Schedule Trigger.
- Adjusting WINDOW_HOURS in the Code nodes.
- Modifying alert thresholds in the IF node.
- Customizing the Slack message format.
- Adding or removing Airtable fields for reporting needs.

## Add-ons (Optional Enhancements)

This workflow can be extended with:

- Email or Microsoft Teams notifications.
- Jira or Linear ticket creation.
- Product auto-pause for extreme return spikes.
- Dashboard reporting using BI tools.
- Cooldown logic to prevent repeated alerts per SKU.

## Use Case Examples

Common use cases include:

- Detecting defective product batches early.
- Identifying packaging or shipping damage trends.
- Monitoring supplier quality issues.
- Supporting refund root-cause analysis.
- Improving customer satisfaction metrics.

There can be many more operational and analytical use cases based on your business needs.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|------|---------------|----------|
| No Slack alerts | Threshold not met | Lower the IF condition limits |
| Empty SKU values | Missing SKU in WooCommerce | Use product name or ID fallback |
| No data in Airtable | Column mismatch | Verify field names and types |
| API errors | Invalid credentials | Re-authorize the WooCommerce API |
| Duplicate alerts | Frequent schedule | Add cooldown or deduplication logic |

## Need Help?

Need assistance setting this up or customizing it for your business? WeblineIndia can help you implement, extend, or build similar automation workflows tailored to your operational needs. Whether you want advanced alerting, deeper analytics, or cross-system integrations, our team is ready to help you get the most out of n8n automation.
by Amirul Hakimi

# 🚀 Enrich CRM Leads with LinkedIn Company Data Using AI

## Who's it for

Sales teams, marketers, and business development professionals who need to automatically enrich their CRM records with detailed company information from LinkedIn profiles. Perfect for anyone doing B2B outreach who wants to personalize their messaging at scale.

## What it does

This workflow transforms bare-bones lead records into rich, personalized prospect profiles by:

- Automatically scraping LinkedIn company profiles
- Using AI (GPT-4) to extract key business intelligence
- Generating 15+ email-ready personalization variables
- Updating your CRM with structured, actionable data

The workflow pulls company overviews, products/services, funding information, and recent posts, and converts everything into natural-language variables that can be dropped directly into your outreach templates.

## How it works

1. **Trigger:** The workflow starts when a new lead is added to Airtable (or on a schedule)
2. **Fetch:** Retrieves the lead record containing the LinkedIn company URL
3. **Scrape:** Pulls the raw HTML from the company's LinkedIn profile
4. **Clean:** Strips HTML tags and formats the content for AI processing (see the sketch below)
5. **Analyze:** GPT-4 extracts structured company intelligence (overview, products, market presence, recent posts)
6. **Transform:** Converts the analysis into 15+ email-ready variables with natural phrasing
7. **Update:** Writes the enriched data back to your CRM
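A minimal sketch of the Clean step in an n8n Code node might look like the following; the input field name (`data`) and the character budget are assumptions:

```javascript
// Hypothetical sketch of the "Clean" step; field name and limits are assumptions.
const html = $input.first().json.data ?? ''; // raw HTML from the HTTP node

const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, '') // drop scripts
  .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop styles
  .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
  .replace(/&(nbsp|amp|quot|#39|lt|gt);/g, ' ') // remove common HTML entities
  .replace(/\s+/g, ' ')                       // collapse whitespace
  .trim()
  .slice(0, 12000); // keep the prompt within the model's context budget

return [{ json: { cleanedText: text } }];
```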
## Setup Requirements

- **Airtable account** (free tier works fine)
- **OpenAI API key** (GPT-4o-mini recommended for cost-effectiveness)
- **LinkedIn company URLs** stored in your CRM
- **5 minutes** for initial configuration

## How to set up

1. **Configure Airtable Connection**
   - Replace YOUR_AIRTABLE_BASE_ID with your base ID
   - Replace YOUR_TABLE_ID with your leads table ID
   - Ensure your table has a "LinkedIn Organization URL" field
   - Add your Airtable API credentials
2. **Add OpenAI Credentials**
   - Click on both OpenAI nodes
   - Add your OpenAI API key
   - GPT-4o-mini is recommended (cost-effective and fast)
3. **Set Up Trigger**
   - Add a trigger node (Schedule, Webhook, or Airtable trigger)
   - Configure it to run when new leads are added or on a daily schedule
4. **Test the Workflow**
   - Add a test lead with a LinkedIn company URL
   - Execute the workflow
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🚀 Transform Your Job Hunt with AI-Powered Telegram Bot

Turn job searching into a conversational experience! This intelligent Telegram bot automatically scrapes job postings from LinkedIn, Indeed, and Monster, filters for sales & marketing positions, and delivers personalized results directly to your chat.

## ✨ Key Features

- **Interactive Telegram Commands:** Simple /jobs [keyword] [location] searches
- **Multi-Platform Scraping:** Simultaneous data collection from 3 major job boards
- **AI-Powered Filtering:** Smart relevance detection and experience level classification
- **Real-Time Notifications:** Instant job alerts delivered to Telegram
- **Automated Data Storage:** Saves results to Google Sheets and Airtable
- **Duplicate Removal:** Advanced deduplication across platforms
- **Mobile-First Experience:** Full job search functionality through Telegram

## 🎯 Perfect For

- **Sales Professionals:** Account managers, sales representatives, business development
- **Marketing Experts:** Digital marketers, marketing managers, growth specialists
- **Recruiters:** Streamlined candidate sourcing and job market analysis
- **Job Seekers:** Hands-free job discovery with instant notifications

## 🛠️ Setup Requirements

**Required Credentials:**

- **Telegram Bot Token:** Create a bot via @BotFather
- **Bright Data API:** Professional web scraping service (LinkedIn/Indeed datasets)
- **Google Sheets OAuth2:** For spreadsheet integration
- **Airtable Token:** Database storage and management

**Prerequisites:**

- n8n instance with HTTPS enabled (required for Telegram webhooks)
- Valid domain name with SSL certificate
- Basic understanding of Telegram bot commands

## 🔧 How It Works

**User Experience:**

1. Send /start to activate the bot and see available commands
2. Use /jobs sales manager New York to search for specific positions
3. Receive formatted job results instantly in Telegram
4. Click "Apply Now" links to go directly to job postings
5. All jobs are automatically saved to your connected spreadsheets

**Behind the Scenes:**

1. **Command Processing:** The bot parses user input for keywords and location
2. **Parallel Scraping:** Simultaneous API calls to LinkedIn, Indeed, and Monster
3. **AI Processing:** Intelligent filtering, experience level detection, remote work identification
4. **Data Enhancement:** Salary extraction, duplicate removal, relevance scoring (see the sketch below)
5. **Multi-Format Storage:** Automatic saving to Google Sheets, Airtable, and JSON export
6. **Real-Time Response:** Formatted results delivered back to the Telegram chat
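The deduplication and relevance logic isn't published as code in this description; a minimal sketch, assuming each scraped job carries `title`, `company`, and `description` fields and a simple keyword-based relevance score, could look like this:

```javascript
// Hypothetical dedup/relevance sketch; field names and keywords are assumptions.
const KEYWORDS = ['sales', 'marketing', 'account', 'growth', 'business development'];
const seen = new Set();
const results = [];

for (const item of $input.all()) {
  const job = item.json;
  // Cross-platform dedup key: same title + company, case-insensitive
  const key = `${job.title}|${job.company}`.toLowerCase().replace(/\s+/g, ' ');
  if (seen.has(key)) continue;
  seen.add(key);

  const haystack = `${job.title} ${job.description ?? ''}`.toLowerCase();
  job.relevance = KEYWORDS.filter(k => haystack.includes(k)).length;
  job.isRemote = /remote|work from home/i.test(haystack);
  if (job.relevance > 0) results.push({ json: job });
}

// Most relevant jobs first
return results.sort((a, b) => b.json.relevance - a.json.relevance);
```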
## 🎨 Telegram Bot Commands

- /start - Welcome message and command overview
- /jobs [keyword] [location] - Search for jobs (e.g., /jobs marketing manager remote)
- /help - Show detailed help information
- /status - Check bot status and recent activity

## 📊 Sample Output

The bot delivers beautifully formatted job results:

    🎯 Job Search Results 🎯
    Found 7 relevant opportunities
    Platforms: linkedin, indeed, monster
    Remote jobs: 3
    ───────────────────
    💼 Senior Sales Manager
    🏢 TechCorp Industries
    📍 New York, NY
    💰 $80,000 - $120,000
    🌐 Remote Available
    📊 senior level
    🔗 Apply Now

## 🔒 Security & Best Practices

- **Rate Limiting:** Built-in Telegram API compliance (30 requests/second)
- **Error Handling:** Graceful failure recovery with user-friendly messages
- **Input Validation:** Sanitized user input to prevent injection attacks
- **Credential Management:** Secure API key storage using the n8n credentials system
- **HTTPS Enforcement:** Required for production Telegram webhook integration

## 📈 Benefits & ROI

- **95% Time Reduction:** Automated job discovery vs. manual searching
- **Multi-Source Coverage:** Access 3 major job platforms simultaneously
- **Mobile Accessibility:** Search jobs anywhere using the Telegram mobile app
- **Real-Time Alerts:** Never miss new opportunities with instant notifications
- **Data Organization:** Automatic spreadsheet management for job tracking
- **Market Intelligence:** Comprehensive job market analysis and trends

## 🚀 Advanced Customization

- **Custom Keywords:** Modify the filtering logic for specific industries
- **Location Targeting:** Adjust geographic search parameters
- **Experience Levels:** Fine-tune senior/mid/entry level detection
- **Additional Platforms:** Easily add more job boards via HTTP requests
- **Notification Scheduling:** Set up periodic automated job alerts
- **Team Integration:** Deploy for multiple users or team channels

## 💡 Use Cases

- **Individual Job Seekers:** Personal job hunting assistant
- **Recruitment Agencies:** Streamlined candidate sourcing
- **Sales Teams:** Territory-specific opportunity monitoring
- **Marketing Departments:** Industry trend analysis and competitor tracking
- **Career Coaches:** Client job market research and opportunity identification

Ready to revolutionize your job search? Deploy this workflow and start receiving personalized job opportunities directly in Telegram!
by Intuz
This n8n template from Intuz provides a complete solution to automate your entire invoicing process. It intelligently syncs confirmed sales orders from your Airtable base to QuickBooks, automatically creating new customers if they don't exist before generating a perfectly matched invoice. It then logs all invoice details back into Airtable, creating a flawless, end-to-end financial workflow.

## Use Cases

1. **Accounting & Finance Teams:** Automatically generate QuickBooks invoices from new orders confirmed in Airtable. Keep all invoices and customer details synced across systems in real time.
2. **Sales & Operations Teams:** Track order status and billing progress directly from Airtable without switching platforms. Ensure every confirmed sale automatically triggers an invoice in QuickBooks.
3. **Business Owners / Admins:** Eliminate double-entry between Airtable and QuickBooks. Maintain accurate, audit-ready financial records with minimal effort.

## How it works

1. **Trigger from Airtable:** The workflow starts instantly when a sales order is ready to be invoiced in your Airtable base (triggered via a webhook).
2. **Check for Customer in QuickBooks:** It searches your QuickBooks account to see if the customer from the sales order already exists.
3. **Create New Customer (If Needed):** If the customer is not found, it automatically creates a new customer record in QuickBooks using the details from your Airtable Customers table.
4. **Create QuickBooks Invoice:** Using the correct customer record (either existing or newly created), it gathers all order line items from Airtable and generates a detailed invoice in QuickBooks (a payload sketch appears after the setup steps).
5. **Log Invoice Back to Airtable:** After the invoice is successfully created, the workflow updates your Airtable base by adding a new record to your Invoices & Payments table and updating the original Confirmed Orders record with the new QuickBooks Invoice ID, marking it as synced.

## Key Requirements to Use This Template

1. **n8n Instance:** An active n8n account (Cloud or self-hosted).
2. **Airtable Base:** An Airtable base on a "Pro" plan or higher with tables for Confirmed Orders, Customers, Order Lines, Product & Service, and Invoices & Payments. Field names must match those in the setup guide.
3. **QuickBooks Online Account:** An active QuickBooks Online account with API access.

## Step-by-Step Setup Instructions

### Step 1: Import and Configure the n8n Workflow

- **Import Workflow:** In n8n, import the Client-Quickbook-Invoices-via-AirTable.json file.
- **Get Webhook URL:** Click on the first node, "Webhook". Copy the "Test URL". Keep this n8n tab open.
- **Configure Airtable Nodes:** There are six Airtable nodes. For each one, connect your Airtable credentials and select the correct Base and Table.
- **Configure QuickBooks Nodes:** There are four QuickBooks-related nodes. For each one, connect your QuickBooks Online credentials.
- **CRITICAL:** Click on the "Create Invoice URL" (HTTP Request) node. You must edit the URL and replace the placeholder number (9341455145770046) with your own QuickBooks Company ID. (Find this in your QuickBooks account settings under "Billing & Subscription".)
- **Save and Activate:** Click "Save", then toggle the workflow to "Active". After activating, copy the new "Production URL" from the Webhook node.
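For orientation, here is a rough sketch of the invoice body the "Create Invoice URL" HTTP node sends to QuickBooks Online. The Line/SalesItemLineDetail structure follows the QuickBooks invoice API, but the Airtable field names and the upstream node name are assumptions, not the template's exact wiring:

```javascript
// Hypothetical Code-node sketch building the QuickBooks invoice body.
// Airtable field names (Quantity, Unit Price, QB Item ID) are assumptions.
const orderLines = $input.all().map(item => item.json);

const invoice = {
  // 'Check Customer' is a hypothetical node name standing in for the
  // customer-lookup/creation step described above
  CustomerRef: { value: $('Check Customer').first().json.customerId },
  Line: orderLines.map(line => ({
    DetailType: 'SalesItemLineDetail',
    Amount: line['Quantity'] * line['Unit Price'],
    SalesItemLineDetail: {
      ItemRef: { value: line['QB Item ID'] },
      Qty: line['Quantity'],
      UnitPrice: line['Unit Price'],
    },
  })),
};

// POSTed to .../v3/company/<YOUR_COMPANY_ID>/invoice by the HTTP Request node
return [{ json: invoice }];
```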
## Customization Guide

You can adapt this template for various workflows by tweaking a few nodes:

- **Use a different Airtable Base:** Update the Base ID and Table ID in all Airtable nodes (Get Orders Records, Get Customer Details, Get Products, etc.).
- **Switch from Sandbox to Live QuickBooks:** Replace the Sandbox company ID and endpoint in the "Create Invoice URL" node with your production QuickBooks company ID.
- **Add more invoice details:** Edit the Code and Parse in HTTP nodes to include additional fields (like Tax, Shipping, or Notes).
- **Support multiple currencies:** Add a "Currency" field mapping in both the Airtable and QuickBooks nodes.

## Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For Custom Workflow Automation, click here: Get Started
by Luis Hernandez
## Overview

This comprehensive n8n workflow automates the generation and distribution of detailed monthly technical support reports from GLPI (an IT Service Management platform). The workflow intelligently calculates SLA compliance, analyzes technician performance, and delivers professionally formatted HTML reports via email.

## ✨ Key Features

**Intelligent SLA Calculation**

- **Business Hours Tracking:** Automatically calculates resolution time considering only working hours, excluding weekends and lunch breaks (see the sketch after the technical specifications)
- **Configurable Schedule:** Customizable work hours (default: 8 AM - 12 PM, 1 PM - 6 PM)
- **Dynamic SLA Monitoring:** Real-time compliance tracking with configurable thresholds (default: 24 hours)
- **Visual Indicators:** Color-coded alerts for critical SLA breaches and high-volume warnings

**Comprehensive Reporting**

- **General Summary:** Total cases, open, in-progress, resolved, and closed tickets
- **Performance Metrics:** Total and average resolution hours in both decimal and formatted (hours/minutes) display
- **Technician Breakdown:** Individual performance analysis per technician, including case distribution and SLA compliance
- **Smart Alerts:** Automatic warnings for high case volumes (>100 in-progress) and critical SLA levels (<50%)

**Professional Email Delivery**

- **Responsive HTML Design:** Mobile-optimized email templates with elegant styling
- **Dynamic Content:** Conditional formatting based on performance metrics
- **Automatic Scheduling:** Monthly execution on the 6th day to ensure accurate SLA measurement

## 💼 Business Benefits

**Time Savings**

- **Eliminates Manual Work:** Saves 2-4 hours per month previously spent compiling reports manually
- **Automated Data Collection:** No more exporting CSVs or copying data between systems
- **One-Click Setup:** Configure once and receive reports automatically every month

**Improved Decision Making**

- **Real-Time Insights:** Identify bottlenecks and performance issues immediately
- **Technician Accountability:** Clear visibility into individual and team performance
- **SLA Compliance Tracking:** Proactively manage service level agreements before they become critical

**Enhanced Communication**

- **Stakeholder Ready:** Professional reports suitable for management presentations
- **Consistent Format:** Standardized metrics ensure month-over-month comparability
- **Instant Distribution:** Automatic email delivery to relevant stakeholders

## 🔧 Technical Specifications

**Requirements**

- n8n instance (self-hosted or cloud)
- GLPI server with API access enabled
- Gmail account (or any SMTP-compatible email service)
- GLPI API credentials (App-Token and User credentials)

**Configuration Points**

- **Variables Node:** Server URL, API tokens, entity name, work hours, SLA limits
- **Schedule Trigger:** Monthly execution timing (default: 6th of each month)
- **Email Recipient:** Target email address for report delivery
- **Date Range Logic:** Automatic previous-month calculation

**Data Processing**

- Retrieves up to 999 tickets per execution (configurable)
- Filters by entity and date range
- Excludes weekends and non-business hours from calculations
- Groups data by technician for detailed analysis
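A simplified sketch of the business-hours calculation follows, assuming the default 8 AM-12 PM and 1 PM-6 PM schedule and iterating hour by hour for clarity; the ticket field names are assumptions, and the workflow's actual node is likely more granular:

```javascript
// Hypothetical business-hours calculator; field names are assumptions.
function businessHoursBetween(openedAt, resolvedAt) {
  const end = new Date(resolvedAt);
  const cursor = new Date(openedAt);
  cursor.setMinutes(0, 0, 0); // count whole hours for simplicity
  let hours = 0;

  while (cursor < end) {
    const day = cursor.getDay();           // 0 = Sunday, 6 = Saturday
    const h = cursor.getHours();
    const isWeekday = day >= 1 && day <= 5;
    const isWorkHour = (h >= 8 && h < 12) || (h >= 13 && h < 18); // lunch 12-1 excluded
    if (isWeekday && isWorkHour) hours++;
    cursor.setHours(cursor.getHours() + 1);
  }
  return hours;
}

// Example: a ticket opened Friday 16:00 and resolved Monday 10:00 counts
// 2h (Friday) + 2h (Monday) = 4 business hours against the 24h SLA.
return $input.all().map(item => {
  const t = item.json;
  return { json: {
    ...t,
    resolutionHours: businessHoursBetween(t.openingDate, t.resolutionDate),
  } };
});
```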
## 📋 Setup Instructions

### Prerequisites

1. **GLPI Configuration:** Enable the API and configure the Tickets panel with the required fields (ID, Title, Status, Opening Date, Closing Date, Resolution Date, Priority, Requester, Assigned To)
2. **API Credentials:** Create Basic Auth credentials in n8n for GLPI API access
3. **Email Authentication:** Set up Gmail OAuth2 or SMTP credentials in n8n

### Implementation Steps

1. Import the workflow JSON into your n8n instance
2. Configure the Variables node with your GLPI server details and business hours
3. Set up GLPI API credentials in the HTTP Request nodes
4. Configure email credentials in the Gmail node
5. Update the recipient email address
6. Test the workflow manually before enabling the schedule
7. Activate the workflow for automatic monthly execution

## 🎯 Use Cases

- **IT Support Teams:** Track helpdesk performance and SLA compliance
- **Service Managers:** Monitor team productivity and identify training needs
- **Executive Reporting:** Provide high-level summaries to stakeholders
- **Resource Planning:** Identify workload distribution and capacity issues
- **Compliance Auditing:** Maintain historical records of SLA performance

## 📈 ROI Impact

- **Time Savings:** 24-48 hours annually in manual reporting eliminated
- **Error Reduction:** Eliminates human calculation errors in SLA tracking
- **Faster Response:** Early alerts enable proactive issue resolution
- **Better Visibility:** Data-driven insights improve team management
by Omer Fayyaz
An intelligent web scraping workflow that automatically routes URLs to site-specific extraction logic, normalizes data across multiple sources, and filters content by freshness to build a unified article feed.

## What Makes This Different

- **Intelligent Source Routing** - Uses a Switch node to route URLs to specialized extractors based on a source identifier, enabling custom CSS selectors per publisher for maximum accuracy
- **Universal Fallback Parser** - An advanced regex-based extractor handles unknown sources automatically, extracting title, description, author, date, and images from meta tags and HTML patterns (see the sketch below)
- **Freshness Filtering** - A built-in 45-day freshness threshold filters outdated content before saving, with configurable date validation logic
- **Tier-Based Classification** - Automatically categorizes articles into Tier 1 (0-7 days), Tier 2 (8-14 days), Tier 3 (15-30 days), or Archive based on publication date
- **Rate Limiting & Error Handling** - Built-in 3-second delays between requests prevent server overload, with comprehensive error handling that continues processing even if individual URLs fail
- **Status Tracking** - Updates the source spreadsheet with processing status, enabling easy monitoring and retry logic for failed extractions

## Key Benefits of Multi-Source Content Aggregation

- **Scalable Architecture** - Easily add new sources by adding a Switch rule and an extraction node; no code changes are needed for most sites
- **Data Normalization** - Standardizes extracted data across all sources into a consistent format (title, description, author, date, image, canonical URL)
- **Automated Processing** - Schedule-based execution (every 4 hours) or manual triggers keep your feed updated without manual intervention
- **Quality Control** - Freshness filtering ensures only recent, relevant content enters your feed, reducing noise from outdated articles
- **Flexible Input** - Reads from Google Sheets, making it easy to add URLs in bulk or integrate with other systems
- **Comprehensive Metadata** - Captures full article metadata including canonical URLs, publication dates, author information, and featured images

## Who's it for

This template is designed for content aggregators, news monitoring services, content marketers, SEO professionals, researchers, and anyone who needs to collect and normalize articles from multiple websites. It's perfect for organizations that need to monitor competitor content, aggregate industry news, build content databases, track publication trends, or create unified article feeds without manually scraping each site or writing custom scrapers for every source.

## How it works / What it does

This workflow creates a unified article aggregation system that reads URLs from Google Sheets, routes them to site-specific extractors, normalizes the data, filters by freshness, and saves results to a feed.
The system:

1. **Reads Pending URLs** - Fetches URLs with source identifiers from Google Sheets, filtering for entries with "Pending" status
2. **Processes with Rate Limiting** - Loops through URLs one at a time with a 3-second delay between requests to respect server resources
3. **Fetches HTML Content** - Downloads page HTML with proper browser headers (User-Agent, Accept, Accept-Language) to avoid blocking
4. **Routes by Source** - A Switch node directs URLs to specialized extractors (Site A, B, C, D) or the universal fallback parser based on the Source field
5. **Extracts Article Data** - Site-specific HTML nodes use custom CSS selectors, while the fallback uses regex patterns to extract title, description, author, date, image, and canonical URL
6. **Normalizes Data** - Standardizes all extracted fields into a consistent format, handling missing values and trimming whitespace
7. **Filters by Freshness** - Validates publication dates and filters out articles older than 45 days (configurable threshold)
8. **Calculates Tier & Status** - Assigns tier classification and freshness status based on article age
9. **Saves to Feed** - Appends normalized articles to the Article Feed sheet with all metadata
10. **Updates Status** - Marks processed URLs as complete in the source sheet for tracking

**Key Innovation: Source-Based Routing** - Unlike generic scrapers that use one-size-fits-all extraction, this workflow uses intelligent routing to apply site-specific CSS selectors. This dramatically improves extraction accuracy while maintaining a universal fallback for unknown sources, making it both precise and extensible.
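For unknown sources, the universal fallback parser relies on meta tags. A minimal sketch of that regex approach, assuming standard Open Graph and article markup (the real node likely covers more patterns):

```javascript
// Hypothetical fallback extractor; attribute order and tag coverage simplified.
const html = $input.first().json.data ?? '';

function meta(prop) {
  const re = new RegExp(
    `<meta[^>]+(?:property|name)=["']${prop}["'][^>]+content=["']([^"']*)["']`, 'i');
  return (html.match(re) || [])[1] || '';
}

const title = meta('og:title') ||
  ((html.match(/<title[^>]*>([^<]*)<\/title>/i) || [])[1] || '').trim();

return [{ json: {
  Title: title,
  Description: meta('og:description') || meta('description'),
  Author: meta('author') || meta('article:author'),
  datePublished: meta('article:published_time'),
  imageUrl: meta('og:image'),
  canonicalUrl: (html.match(
    /<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']*)["']/i) || [])[1] || '',
} }];
```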
## How to set up

### 1. Prepare Google Sheets

- Create a Google Sheet with two tabs: "URLs to Process" and "Article Feed"
- In the "URLs to Process" sheet, create columns: URL, Source, Status
- Add sample data: URLs in the URL column, source identifiers (e.g., "Site A", "Site B") in the Source column, and "Pending" in the Status column
- In the "Article Feed" sheet, the workflow will automatically create columns: Title, Description, Author, datePublished, imageUrl, canonicalUrl, source, sourceUrl, tier, freshnessStatus, extractedAt
- Verify your Google Sheets credentials are set up in n8n (OAuth2 recommended)

### 2. Configure Google Sheets Nodes

- Open the "Read Pending URLs" node and select your spreadsheet from the document dropdown
- Set the sheet name to "URLs to Process"
- Configure the "Save to Article Feed" node: select the same spreadsheet, set the sheet name to "Article Feed"; the operation should be "Append or Update"
- Configure the "Update URL Status" node: same spreadsheet, "URLs to Process" sheet, operation "Update"
- Test the connection by running the "Read Pending URLs" node manually to verify it can access your sheet

### 3. Customize Source Routing

- Open the "Source Router" (Switch node) to see the current routing rules for Site A, B, C, D, and the fallback
- To add a new source: click "Add Rule" and set the condition {{ $('Loop Over URLs').item.json.Source }} equals your source name
- Create a new HTML extraction node for your source with appropriate CSS selectors
- Connect the new extractor to the "Normalize Extracted Data" node
- Update the Switch node to route to your new extractor

Example CSS selectors for common sites:

    // WordPress sites
    title: "h1.entry-title, .post-title"
    author: ".author-name, .byline a"
    date: "time.entry-date, time[datetime]"

    // Modern CMS
    title: "h1.article__title, article h1"
    author: ".article__byline a, a[rel='author']"
    date: "time[datetime], meta[property='article:published_time']"

### 4. Configure Freshness Threshold

- Open the "Freshness Filter (45 days)" IF node
- The current threshold is 45 days (configurable in the condition expression)
- To change the threshold, modify the expression cutoffDate.setDate(cutoffDate.getDate() - 45) to your desired number of days
- The filter marks articles as "Fresh" (within threshold) or routes them to the "Outdated" handler (see the sketch below)
- Test with sample URLs to verify date parsing works correctly for your sources

### 5. Set Up Scheduling & Test

- The workflow includes both a Manual Trigger (for testing) and a Schedule Trigger (runs every 4 hours)
- To customize the schedule, open the "Schedule (Every 4 Hours)" node and adjust the interval
- For initial testing, use the Manual Trigger and add 2-3 test URLs to your sheet with Status = "Pending"
- Verify execution: check that URLs are fetched, routed correctly, extracted, and saved to the Article Feed
- Monitor the "Completion Summary" node output to see processing statistics
- Check the execution logs for any errors in HTML extraction or date parsing
- Common issues: missing CSS selectors (update the extractor), date format mismatches (adjust date parsing), or rate limiting (increase the wait time if needed)

## Requirements

- **Google Sheets Account** - Active Google account with OAuth2 credentials configured in n8n for reading and writing spreadsheet data
- **Source Spreadsheet** - Google Sheet with "URLs to Process" and "Article Feed" tabs, properly formatted with the required columns
- **n8n Instance** - Self-hosted or cloud n8n instance with access to external websites (the HTTP Request node needs internet connectivity)
- **Source Knowledge** - Understanding of the target websites' HTML structure to configure CSS selectors for site-specific extractors (or use the fallback parser for unknown sources)
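A minimal sketch of the combined freshness filter and tier classification, mirroring the thresholds described above (Tier 1: 0-7 days, Tier 2: 8-14, Tier 3: 15-30, Archive beyond that, Outdated past 45 days); the `datePublished` field name follows the Article Feed columns:

```javascript
// Hypothetical tier/freshness sketch mirroring the documented thresholds.
const FRESHNESS_DAYS = 45;

return $input.all().map(item => {
  const a = item.json;
  const ageDays = (Date.now() - new Date(a.datePublished).getTime()) / 86400000;

  // Unparseable or too-old dates are routed to the "Outdated" branch
  if (Number.isNaN(ageDays) || ageDays > FRESHNESS_DAYS) {
    return { json: { ...a, freshnessStatus: 'Outdated' } };
  }

  const tier =
    ageDays <= 7  ? 'Tier 1' :
    ageDays <= 14 ? 'Tier 2' :
    ageDays <= 30 ? 'Tier 3' : 'Archive';

  return { json: { ...a, tier, freshnessStatus: 'Fresh' } };
});
```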
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works

The Content Strategy AI Pipeline is an intelligent, multi-stage content creation system that transforms simple user prompts into polished, ready-to-publish content. The system intelligently extracts platform requirements, audience insights, and brand tone from user requests, then develops strategic reasoning and emotional connection strategies before crafting compelling content outlines and final publication-ready posts or articles. It supports both social media platforms (Instagram, LinkedIn, X, Facebook, TikTok) and blog content.

**Key Differentiators:** Strategic thinking approach, emotional intelligence integration, platform-native optimization, zero-editing-required output, and professional content strategist-level quality through multi-model AI orchestration.

## Technical Points

- Multi-model AI orchestration for specialized tasks
- Emotional psychology integration for audience connection
- Platform algorithm optimization built-in
- Industry-standard content strategy methodology automated
- Enterprise-grade reliability with session management and memory
- API-ready architecture for integration into existing workflows

## Test Inputs

Sample Request: "Create an Instagram post for a fitness coach targeting busy moms, tone should be motivational and relatable"

Expected Flow: Platform: Instagram → Niche: Fitness → Audience: Busy Moms → Tone: Motivational → Output: 125-150 word post with hashtags
by Meak

# LinkedIn Job-Based Cold Email System

Most outreach tools rely on generic lead lists and recycled contact data. This workflow builds a live, personalized lead engine that scrapes new LinkedIn job posts, finds company decision-maker emails, and generates custom cold emails using GPT, all fully automated through n8n.

## Benefits

- Automated daily scraping of “Marketing Manager” jobs in Belgium
- Real-time leads from companies currently hiring for marketing roles
- Filters out HR and staffing agencies to keep only real businesses
- Enriches each company with verified CEO, Sales, and Marketing emails
- Generates unique, human-like cold emails and subject lines with GPT-4o
- Saves clean data to Google Sheets and drafts personalized Gmail messages

## How It Works

1. **Schedule Trigger** runs every morning at 08:00.
2. **Apify LinkedIn Scraper** collects new “Marketing Manager” jobs in Belgium.
3. **Remove Duplicates** ensures each company appears only once.
4. **Filter Staffing** excludes recruiters, HR agencies, and interim firms.
5. **Save Useful Infos** extracts core company data: name, domain, size, description.
6. **Filter Domain & Size** keeps valid websites and companies under 100 employees.
7. **Anymailfinder API** looks up CEO, Sales, and Marketing decision-maker emails.
8. **Merge + If Node** validates email results and removes invalid entries.
9. **Split Out + Deduplicate** ensures unique, verified contacts.
10. **Extract Lead Name (Code Node)** separates first and last names (see the sketch below).
11. **Google Sheets Node** appends all enriched lead data to your master sheet.
12. **GPT-4o (LangChain)** writes a 100-120 word personalized cold email.
13. **GPT-4o (LangChain)** creates a short, casual subject line.
14. **Gmail Draft Node** builds a ready-to-send email using both outputs.
15. **Wait Node** loops until all leads are processed.
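A minimal sketch of the Extract Lead Name Code node; the `fullName` input field is an assumption about what the enrichment step returns:

```javascript
// Hypothetical sketch of the "Extract Lead Name" Code node.
// The fullName field name is an assumption, not the template's actual schema.
return $input.all().map(item => {
  const fullName = (item.json.fullName || '').trim();
  const parts = fullName.split(/\s+/); // split on any whitespace
  return { json: {
    ...item.json,
    firstName: parts[0] || '',
    lastName: parts.slice(1).join(' ') || '', // keep multi-word surnames intact
  } };
});
```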
## Who Is This For

- B2B agencies targeting Belgian SMEs
- Outbound marketers using job postings as purchase-intent signals
- Freelancers or founders running lean, automated outreach systems
- Growth teams building scalable cold email engines

## Setup

- **Apify:** use the curious_coder~linkedin-jobs-scraper actor + API token
- **Anymailfinder:** header auth with decision-maker categories (ceo, sales, marketing)
- **Google Sheets:** connect a sheet named “LinkedIn Job Scraper” and map columns
- **OpenAI (GPT-4o):** insert your API key into both LangChain nodes
- **Gmail:** OAuth2 connection; resource set to draft
- **n8n:** store all credentials securely; set HTTP nodes to continue on error

## ROI & Results

- Save 1-3 hours per day on manual research and outreach prep
- Contact active hiring companies when they need marketing help most
- Scale to multiple industries or regions by changing search URLs
- Outperform paid lead databases with fresh, verified data

## Strategy Insights

- Add funding or tech-stack data for better lead scoring
- A/B test GPT subject lines and log open rates in Sheets
- Schedule GPT follow-ups 3 and 7 days later for full automation
- Push all enriched data to your CRM for advanced segmentation
- Use hiring signals to trigger ad audiences or retargeting campaigns

## Check Out My Channel

For more advanced automation workflows that generate real client results, check out my YouTube channel, where I share the exact systems I use to automate outreach, scale agency pipelines, and close deals faster.