by Harshil Agrawal
This workflow allows you to create, update, and get a monitor using the UptimeRobot node.
- **UptimeRobot node:** Creates a new monitor of the type HTTP(S).
- **UptimeRobot1 node:** Updates the monitor created in the previous node.
- **UptimeRobot2 node:** Gets the information of the monitor created in the previous node.
by Harshil Agrawal
This workflow allows you to create a group, add members to the group, and get the members of the group.
- **Bitwarden node:** Creates a new group called documentation in Bitwarden.
- **Bitwarden1 node:** Gets all the members from Bitwarden.
- **Bitwarden2 node:** Updates the members in the group created earlier.
- **Bitwarden3 node:** Gets all the members in the group created earlier.
by David Olusola
**Auto-Create Airtable CRM Records for Zoom Attendees**

This workflow automatically logs every Zoom meeting attendee into an Airtable CRM, capturing their details for sales follow-up, reporting, or onboarding.

**How It Works**
1. **Zoom Webhook** - Captures the participant-joined event.
2. **Normalize Data** - Extracts attendee name, email, and join/leave times.
3. **Airtable** - Saves/updates the record with meeting and contact info.

**Setup Steps**
1. **Zoom:** Create a Zoom App with the `meeting.participant_joined` event and paste the workflow webhook URL.
2. **Airtable:** Create a base called CRM with a table named Attendees and these columns: Meeting ID, Topic, Name, Email, Join Time, Leave Time, Duration, Tag.
3. **n8n:** Replace `YOUR_AIRTABLE_BASE_ID` and `YOUR_AIRTABLE_TABLE_ID` in the workflow and connect your Airtable API key.

**Example Airtable Row**

| Meeting ID | Topic | Name | Email | Join Time | Duration | Tag |
|------------|-------|------|-------|-----------|----------|-----|
| 999-123-456 | Sales Demo | Sarah L. | sarah@email.com | 2025-08-30T10:02:00Z | 45 min | New Lead |

With this workflow, every Zoom attendee becomes a structured CRM record automatically.
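The "Normalize Data" step above can be sketched as an n8n Code-node snippet. This is a minimal sketch, not the template's exact code: the field paths follow Zoom's documented `meeting.participant_joined` payload, and the default `tag` value is an assumption, so adjust both to your actual webhook body.

```javascript
// Hypothetical "Normalize Data" Code node: flatten a Zoom
// meeting.participant_joined webhook body into one Airtable-ready record.
function normalizeZoomEvent(body) {
  const object = body.payload.object;     // meeting-level data
  const participant = object.participant; // attendee-level data
  return {
    meetingId: object.id,
    topic: object.topic,
    name: participant.user_name,
    email: participant.email || '',
    joinTime: participant.join_time,
    tag: 'New Lead', // assumed default; replace with your own tagging rule
  };
}

// Simulated webhook body (in n8n this arrives on the Webhook node's output).
const sample = {
  payload: {
    object: {
      id: '999-123-456',
      topic: 'Sales Demo',
      participant: {
        user_name: 'Sarah L.',
        email: 'sarah@email.com',
        join_time: '2025-08-30T10:02:00Z',
      },
    },
  },
};
console.log(normalizeZoomEvent(sample).meetingId); // → 999-123-456
```

In an actual Code node you would end with `return [{ json: normalizeZoomEvent($json.body) }];` so the Airtable node receives the flattened fields.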
by vinci-king-01
**Amazon Keyboard Product Scraper with AI and Google Sheets Integration**

**Target Audience**
- E-commerce analysts and researchers
- Product managers tracking competitor keyboards
- Data analysts monitoring Amazon keyboard market trends
- Business owners conducting market research
- Developers building product comparison tools

**Problem Statement**
Manual monitoring of Amazon keyboard products is time-consuming and error-prone. This template solves the challenge of automatically collecting, structuring, and storing keyboard product data for analysis, enabling data-driven decision making in the competitive keyboard market.

**How it Works**
This workflow automatically scrapes Amazon keyboard products using AI-powered web scraping and stores them in Google Sheets for comprehensive analysis and tracking.

Key Components:
1. **Scheduled Trigger** - Runs the workflow at specified intervals to keep data fresh and up-to-date
2. **AI-Powered Scraping** - Uses ScrapeGraphAI to intelligently extract product information from Amazon search results with natural language processing
3. **Data Processing** - Transforms and structures the scraped data for optimal spreadsheet compatibility
4. **Google Sheets Integration** - Automatically saves product data to your spreadsheet with proper column mapping

**Google Sheets Column Specifications**
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| title | String | Product name and model | "Logitech MX Keys Advanced Wireless Illuminated Keyboard" |
| url | URL | Direct link to Amazon product page | "https://www.amazon.com/dp/B07S92QBCX" |
| category | String | Product category classification | "Electronics" |

**Setup Instructions**
Estimated setup time: 10-15 minutes

Prerequisites:
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access

Step-by-Step Configuration:
1. **Install Community Nodes** - Install the ScrapeGraphAI community node: `npm install n8n-nodes-scrapegraphai`
2. **Configure ScrapeGraphAI Credentials** - Navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials, enter your API key from the ScrapeGraphAI dashboard, and test the connection to ensure it's working.
3. **Set up Google Sheets Connection** - Add Google Sheets OAuth2 credentials, grant the necessary permissions for spreadsheet access, select or create a target spreadsheet for data storage, and configure the sheet name (default: "Sheet1").
4. **Customize Amazon Search Parameters** - Update the websiteUrl parameter in the ScrapeGraphAI node, modify search terms, filters, or categories as needed, and adjust the user prompt to extract additional fields if required.
5. **Configure Schedule Trigger** - Set your preferred execution frequency (daily, weekly, etc.), choose appropriate time zones for your business hours, and consider Amazon's rate limits when setting frequency.
6. **Test and Validate** - Run the workflow manually to verify all connections, check Google Sheets for proper data formatting, and validate that all required fields are being captured.

**Workflow Customization Options**
- Modify Search Criteria: Change the Amazon URL to target specific keyboard categories; add price filters, brand filters, or rating requirements; update search terms for different product types.
- Extend Data Collection: Modify the user prompt to extract additional fields (price, rating, reviews); add data processing nodes for advanced analytics; integrate with other data sources for comprehensive market analysis.
- Output Customization: Change the Google Sheets operation from "append" to "upsert" for deduplication; add data validation and cleaning steps; implement error handling and retry logic.

**Use Cases**
- **Competitive Analysis**: Track competitor keyboard pricing and features
- **Market Research**: Monitor trending keyboard products and categories
- **Inventory Management**: Keep track of available keyboard options
- **Price Monitoring**: Track price changes over time
- **Product Development**: Research market gaps and opportunities

**Important Notes**
- Respect Amazon's terms of service and rate limits
- Consider implementing delays between requests for large datasets
- Regularly review and update your scraping parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly

**Troubleshooting**
Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Data formatting issues: Review the Code node's JavaScript logic
- Rate limiting: Adjust schedule frequency and implement delays

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
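The "Data Processing" step described above can be illustrated with a small Code-node sketch. The `products` key and the fallback category are assumptions; match them to whatever structure your ScrapeGraphAI prompt actually returns.

```javascript
// Hypothetical "Data Processing" Code node: flatten the ScrapeGraphAI result
// into one object per product so the Google Sheets node can map the
// title/url/category columns directly.
function toSheetRows(scrapeResult) {
  const products = scrapeResult.products || [];
  return products.map((p) => ({
    title: p.title || '',
    url: p.url || '',
    category: p.category || 'Electronics', // assumed fallback when missing
  }));
}

const rows = toSheetRows({
  products: [
    { title: 'Logitech MX Keys', url: 'https://www.amazon.com/dp/B07S92QBCX' },
  ],
});
console.log(rows.length); // → 1
```

In an n8n Code node this would end with `return rows.map((json) => ({ json }));` to emit one item per row.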
by vinci-king-01
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**News Article Scraping and Analysis with AI and Google Sheets Integration**

**Target Audience**
- News aggregators and content curators
- Media monitoring professionals
- Market researchers tracking industry news
- PR professionals monitoring brand mentions
- Journalists and content creators
- Business analysts tracking competitor news
- Academic researchers collecting news data

**Problem Statement**
Manual news monitoring is time-consuming and often misses important articles. This template solves the challenge of automatically collecting, structuring, and storing news articles from any website for comprehensive analysis and tracking.

**How it Works**
This workflow automatically scrapes news articles from websites using AI-powered extraction and stores them in Google Sheets for analysis and tracking.

Key Components:
- **Scheduled Trigger**: Runs automatically at specified intervals to collect fresh content
- **AI-Powered Scraping**: Uses ScrapeGraphAI to intelligently extract article titles, URLs, and categories from any news website
- **Data Processing**: Formats extracted data for optimal spreadsheet compatibility
- **Automated Storage**: Saves all articles to Google Sheets with metadata for easy filtering and analysis

**Google Sheets Column Specifications**
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| title | String | Article headline and title | "'My friend died right in front of me' - Student describes moment air force jet crashed into school" |
| url | URL | Direct link to the article | "https://www.bbc.com/news/articles/cglzw8y5wy5o" |
| category | String | Article category or section | "Asia" |

**Setup Instructions**
Estimated setup time: 10-15 minutes

Prerequisites:
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access

Step-by-Step Configuration:
1. **Install Community Nodes** - Install the ScrapeGraphAI community node: `npm install n8n-nodes-scrapegraphai`
2. **Configure ScrapeGraphAI Credentials** - Navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials, enter your API key from the ScrapeGraphAI dashboard, and test the connection to ensure it's working.
3. **Set up Google Sheets Connection** - Add Google Sheets OAuth2 credentials, grant the necessary permissions for spreadsheet access, select or create a target spreadsheet for data storage, and configure the sheet name (default: "Sheet1").
4. **Customize News Source Parameters** - Update the websiteUrl parameter in the ScrapeGraphAI node, modify the target news website URL as needed, adjust the user prompt to extract additional fields if required, and test with a small website before scaling to larger news sites.
5. **Configure Schedule Trigger** - Set your preferred execution frequency (daily, hourly, etc.), choose appropriate time zones for your business hours, and consider the news website's update frequency when setting intervals.
6. **Test and Validate** - Run the workflow manually to verify all connections, check Google Sheets for proper data formatting, and validate that all required fields are being captured.

**Workflow Customization Options**
- Modify News Sources: Change the website URL to target different news sources; add multiple news websites for comprehensive coverage; implement filters for specific topics or categories.
- Extend Data Collection: Modify the user prompt to extract additional fields (author, date, summary); add sentiment analysis for article content; integrate with other data sources for comprehensive analysis.
- Output Customization: Change the Google Sheets operation from "append" to "upsert" for deduplication; add data validation and cleaning steps; implement error handling and retry logic.

**Use Cases**
- **Media Monitoring**: Track mentions of your brand, competitors, or industry keywords
- **Content Curation**: Automatically collect articles for newsletters or content aggregation
- **Market Research**: Monitor industry trends and competitor activities
- **News Aggregation**: Build custom news feeds for specific topics or sources
- **Academic Research**: Collect news data for research projects and analysis
- **Crisis Management**: Monitor breaking news and emerging stories

**Important Notes**
- Respect the target website's terms of service and robots.txt
- Consider implementing delays between requests for large datasets
- Regularly review and update your scraping parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly

**Troubleshooting**
Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Data formatting issues: Review the Code node's JavaScript logic
- Rate limiting: Adjust schedule frequency and implement delays

Pro Tips:
- Keep detailed configuration notes in the sticky notes within the workflow
- Test with a small website first before scaling to larger news sites
- Consider adding filters in the Code node to exclude certain article types or categories
- Monitor execution logs for any issues and adjust parameters accordingly

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
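The Code-node filter suggested in the pro tips could look like the sketch below. The excluded category names are examples only; use whatever categories your target site actually emits.

```javascript
// Hypothetical Code-node filter: drop scraped articles whose category is on
// an exclusion list and discard items without a usable URL.
const EXCLUDED_CATEGORIES = ['Sport', 'Celebrity']; // example exclusions

function filterArticles(articles) {
  return articles.filter(
    (a) => a.url && !EXCLUDED_CATEGORIES.includes(a.category)
  );
}

const kept = filterArticles([
  { title: 'Jet crash', url: 'https://www.bbc.com/news/articles/cglzw8y5wy5o', category: 'Asia' },
  { title: 'Match report', url: 'https://example.com/x', category: 'Sport' },
  { title: 'No link', category: 'Asia' },
]);
console.log(kept.length); // → 1
```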
by PDF Vector
**Overview**
Organizations struggle to make their document repositories searchable and accessible. Users waste time searching through lengthy PDFs, manuals, and documentation to find specific answers. This workflow creates a powerful API service that instantly answers questions about any document or image, perfect for building customer support chatbots, internal knowledge bases, or interactive documentation systems.

**What You Can Do**
This workflow creates a RESTful webhook API that accepts questions about documents and returns intelligent, contextual answers. It processes various document formats, including PDFs, Word documents, text files, and images, using OCR when needed. The system maintains conversation context through session management, caches responses for performance, provides source references with page numbers, handles multiple concurrent requests, and integrates seamlessly with chatbots, support systems, or custom applications.

**Who It's For**
Perfect for developer teams building conversational interfaces, customer support departments creating self-service solutions, technical writers making documentation interactive, organizations with extensive knowledge bases, and SaaS companies wanting to add document Q&A features. Ideal for anyone who needs to make large document repositories instantly searchable through natural language queries.

**The Problem It Solves**
Traditional document search returns entire pages or sections, forcing users to read through irrelevant content to find answers. Support teams repeatedly answer the same questions that are already documented. This template creates an intelligent Q&A system that provides precise, contextual answers to specific questions, reducing support tickets by up to 60% and improving user satisfaction.
**Setup Instructions**
1. Install the PDF Vector community node from the n8n marketplace
2. Configure your PDF Vector API key
3. Set up the webhook URL for your API endpoint
4. Configure Redis or a database for session management
5. Set response caching parameters
6. Test the API with sample documents and questions

**Key Features**
- **RESTful API Interface**: Easy integration with any application
- **Multi-Format Support**: Handle PDFs, Word docs, text files, and images
- **OCR Processing**: Extract text from scanned documents and screenshots
- **Contextual Answers**: Provide relevant responses with source citations
- **Session Management**: Enable conversational follow-up questions
- **Response Caching**: Improve performance for frequently asked questions
- **Analytics Tracking**: Monitor usage patterns and popular queries
- **Error Handling**: Graceful fallbacks for unsupported documents

**API Usage Example**

```
POST https://your-n8n-instance.com/webhook/doc-qa
Content-Type: application/json

{
  "documentUrl": "https://example.com/user-manual.pdf",
  "question": "How do I reset my password?",
  "sessionId": "user-123",
  "includePageNumbers": true
}
```

**Customization Options**
Add authentication and rate limiting for production use, implement multi-document search across entire repositories, create specialized prompts for technical documentation or legal documents, add automatic language detection and translation, build conversation history tracking for better context, integrate with Zendesk, Intercom, or other support systems, and enable direct file upload support for local documents.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
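A client calling this webhook could be sketched as below. The base URL is a placeholder, and the response shape (an `answer` field) is an assumption; check what your workflow's respond node actually returns.

```javascript
// Sketch of a client call to the document Q&A webhook, using the request
// shape from the API usage example. All names here are illustrative.
async function askDocument(baseUrl, documentUrl, question, sessionId) {
  const res = await fetch(`${baseUrl}/webhook/doc-qa`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      documentUrl,
      question,
      sessionId,
      includePageNumbers: true,
    }),
  });
  if (!res.ok) throw new Error(`Q&A request failed: ${res.status}`);
  return res.json(); // assumed to contain e.g. { answer, sources }
}
```

Reusing the same `sessionId` across calls is what lets the workflow's session management answer follow-up questions in context.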
by Entech Solutions
**Short Description**
Automatically exports customer records from NetSuite and syncs them into Salesforce, creating or updating Accounts and Contacts depending on whether the NetSuite record represents a company or an individual.

**Who is this for / Use case**
This template is designed for businesses and integration developers who:
- Use NetSuite as their ERP or CRM system.
- Need to automatically sync customer data (both companies and individuals) into Salesforce.
- Want a reliable, scalable solution that supports pagination and upserts.
- Prefer a ready-to-use, easily customizable workflow built on n8n.

**How it works**
1. Initialize the pagination offset.
2. Fetch customer records from NetSuite in batches via the REST API.
3. Split the retrieved array into individual items.
4. Check the record type: company or individual.
5. Upsert to Salesforce: companies become Account records; individuals become both an Account and a Contact linked to that Account.
6. Merge results from both branches.
7. Update the pagination offset for the next batch.
8. Repeat until all records are processed, then end the workflow.

**Setup / Configuration**
1. Import the workflow into your n8n instance.
2. Set your NetSuite and Salesforce credentials.
3. Ensure your Salesforce Account/Contact objects have a matching External ID field.
4. (Optional) Adjust mapping fields or filters to your data needs.
5. Execute manually or trigger on a schedule (e.g., daily sync).
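The pagination loop in steps 1, 7, and 8 can be sketched as below. This is a minimal sketch assuming NetSuite's REST list responses, which carry `items`, `offset`, and `hasMore` fields; verify against your instance before relying on it.

```javascript
// Minimal sketch of the offset-update logic: advance by one page size while
// NetSuite reports more records, and signal the end of the loop with null.
function nextOffset(response, pageSize) {
  // Stop when NetSuite reports no further pages.
  if (!response.hasMore) return null;
  return response.offset + pageSize;
}

console.log(nextOffset({ items: [], offset: 0, hasMore: true }, 100)); // → 100
console.log(nextOffset({ items: [], offset: 900, hasMore: false }, 100)); // → null
```

In the workflow, a `null` result is where the loop exits instead of scheduling another fetch.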
by vinci-king-01
**How it works**
This workflow automatically analyzes real estate market sentiment by scraping investment forums and news sources, then provides AI-powered market predictions and investment recommendations.

Key Steps:
1. **Scheduled Trigger** - Runs on a cron schedule to regularly monitor market sentiment.
2. **Multi-Source Scraping** - Uses ScrapeGraphAI to extract discussions from BiggerPockets forums and real estate news articles.
3. **Sentiment Analysis** - JavaScript nodes analyze text content for bullish/bearish keywords and calculate sentiment scores.
4. **Market Prediction** - Generates investment recommendations (buy/sell/hold) based on sentiment analysis with confidence levels.
5. **Timing Optimization** - Provides optimal timing recommendations considering seasonal factors and market urgency.
6. **Investment Advisor Alerts** - Formats comprehensive reports with actionable investment advice.
7. **Telegram Notifications** - Sends formatted alerts directly to your Telegram channel for instant access.

**Set up steps**
Setup time: 10-15 minutes
1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for web scraping.
2. **Set up a Telegram bot** - Create a bot via @BotFather and add your bot token and chat ID.
3. **Customize data sources** - Update the URLs to target specific real estate forums or news sources.
4. **Adjust schedule frequency** - Modify the cron expression based on how often you want sentiment updates.
5. **Test sentiment analysis** - Run manually first to ensure the analysis logic works for your market.
6. **Configure alert preferences** - Customize the alert formatting and priority levels as needed.

**Technologies Used**
- **ScrapeGraphAI** - For extracting structured data from real estate forums and news sites
- **JavaScript Code Nodes** - For sentiment analysis, market prediction, and timing optimization
- **Schedule Trigger** - For automated execution using cron expressions
- **Telegram Integration** - For instant mobile notifications and team alerts
- **JSON Data Processing** - For structured sentiment analysis and market intelligence
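The keyword-based sentiment scoring in the JavaScript Code nodes can be illustrated as follows. The keyword lists, thresholds, and scoring formula are examples, not the template's exact logic.

```javascript
// Illustrative keyword-based sentiment scorer: count bullish and bearish
// keywords, then compute a score in [-1, 1] and a label.
const BULLISH = ['appreciation', 'growth', 'hot market', 'rising', 'demand'];
const BEARISH = ['crash', 'bubble', 'declining', 'foreclosure', 'oversupply'];

function sentimentScore(text) {
  const lower = text.toLowerCase();
  const count = (words) =>
    words.reduce((n, w) => n + (lower.includes(w) ? 1 : 0), 0);
  const bulls = count(BULLISH);
  const bears = count(BEARISH);
  const total = bulls + bears;
  // 0 keywords found means we have no signal either way.
  const score = total === 0 ? 0 : (bulls - bears) / total;
  const label = score > 0.2 ? 'bullish' : score < -0.2 ? 'bearish' : 'neutral';
  return { score, label };
}

console.log(sentimentScore('Rising demand and strong growth').label); // → bullish
```

A downstream node would then map the label and score to buy/sell/hold recommendations with confidence levels.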
by vinci-king-01
**Influencer Content Monitor with ScrapeGraphAI Analysis and ROI Tracking**

**Target Audience**
- Marketing managers and brand managers
- Influencer marketing agencies
- Social media managers
- Digital marketing teams
- Brand partnerships coordinators
- Marketing analysts and strategists
- Campaign managers
- ROI and performance analysts

**Problem Statement**
Manual monitoring of influencer campaigns is time-consuming and often misses critical performance insights, brand mentions, and ROI calculations. This template solves the challenge of automatically tracking influencer content, analyzing engagement metrics, detecting brand mentions, and calculating campaign ROI using AI-powered analysis and automated workflows.

**How it Works**
This workflow automatically monitors influencer profiles and content using ScrapeGraphAI for intelligent analysis, tracks brand mentions and sponsored content, calculates performance metrics, and provides comprehensive ROI analysis for marketing campaigns.

Key Components:
1. **Daily Schedule Trigger** - Runs automatically every day at 9:00 AM to monitor influencer campaigns
2. **ScrapeGraphAI - Influencer Profiles** - Uses AI to extract profile data and recent posts from Instagram
3. **Content Analyzer** - Analyzes post content for engagement rates and quality scoring
4. **Brand Mention Detector** - Identifies brand mentions and sponsored-content indicators
5. **Campaign Performance Tracker** - Tracks campaign metrics and KPIs
6. **Marketing ROI Calculator** - Calculates return on investment for campaigns

**Data Analysis Specifications**
The template analyzes and tracks the following metrics:

| Metric Category | Data Points | Description | Example |
|----------------|-------------|-------------|---------|
| Profile Data | Username, Followers, Following, Posts Count, Bio, Verification Status | Basic influencer profile information | "@influencer", "100K followers", "Verified" |
| Post Analysis | Post URL, Caption, Likes, Comments, Date, Hashtags, Mentions | Individual post performance data | "5,000 likes", "150 comments" |
| Engagement Metrics | Engagement Rate, Content Quality Score, Performance Tier | Calculated performance indicators | "3.2% engagement rate", "High performance" |
| Brand Detection | Brand Mentions, Sponsored Content, Mention Count | Brand collaboration tracking | "Nike mentioned", "Sponsored post detected" |
| Campaign Performance | Total Reach, Total Engagement, Average Engagement, Performance Score | Overall campaign effectiveness | "50K total reach", "85.5 performance score" |
| ROI Analysis | Total Investment, Estimated Value, ROI Percentage, Cost per Engagement | Financial performance metrics | "$2,500 investment", "125% ROI" |

**Setup Instructions**
Estimated setup time: 20-25 minutes

Prerequisites:
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Instagram accounts to monitor (influencer usernames)
- Campaign budget and cost data for ROI calculations

Step-by-Step Configuration:
1. **Install Community Nodes** - Install the required community node: `npm install n8n-nodes-scrapegraphai`
2. **Configure ScrapeGraphAI Credentials** - Navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials, enter your API key from the ScrapeGraphAI dashboard, and test the connection to ensure it's working.
3. **Set up the Schedule Trigger** - Configure the daily schedule (default: 9:00 AM UTC), adjust the timezone to match your business hours, and set an appropriate frequency for your monitoring needs.
4. **Configure Influencer Monitoring** - Update the websiteUrl parameter with target influencer usernames, customize the user prompt to extract specific profile data, set up monitoring for multiple influencers if needed, and configure brand keywords for mention detection.
5. **Customize Brand Detection** - Update brand keywords in the Brand Mention Detector node, add sponsored-content indicators (#ad, #sponsored, etc.), configure brand-mention sensitivity levels, and set up competitor brand monitoring.
6. **Configure ROI Calculations** - Update cost estimates in the Marketing ROI Calculator, set value-per-engagement and reach metrics, configure campaign management costs, and adjust ROI calculation parameters.
7. **Test and Validate** - Run the workflow manually with test data, verify all analysis steps complete successfully, check data accuracy and calculation precision, and validate ROI calculations against actual campaign data.

**Workflow Customization Options**
- Modify Monitoring Parameters: Adjust monitoring frequency (hourly, daily, weekly); add more social media platforms (TikTok, YouTube, etc.); customize engagement-rate calculations; modify content-quality scoring algorithms.
- Extend Brand Detection: Add more sophisticated brand-mention detection; implement sentiment analysis for brand mentions; include competitor brand monitoring; add automated alert systems for brand mentions.
- Customize Performance Tracking: Modify performance-tier thresholds; add more detailed engagement metrics; implement trend analysis and forecasting; include audience demographic analysis.
- Output Customization: Add integration with marketing dashboards; implement automated reporting systems; create alert systems for performance drops; add campaign comparison features.

**Use Cases**
- **Influencer Campaign Monitoring**: Track performance of influencer partnerships
- **Brand Mention Detection**: Monitor brand mentions across influencer content
- **ROI Analysis**: Calculate return on investment for marketing campaigns
- **Competitive Intelligence**: Monitor competitor brand mentions
- **Performance Optimization**: Identify top-performing content and influencers
- **Campaign Reporting**: Generate automated reports for stakeholders

**Important Notes**
- Respect Instagram's terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update brand keywords and detection parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and compliance requirements
- Ensure accurate cost data for ROI calculations

**Troubleshooting**
Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Instagram access issues: Check account accessibility and rate limits
- Brand detection false positives: Adjust keyword sensitivity
- ROI calculation errors: Verify cost and value parameters
- Schedule trigger failures: Check timezone and cron expression
- Data parsing errors: Review the Code node's JavaScript logic

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Instagram API documentation and best practices
- Influencer marketing analytics best practices
- ROI calculation methodologies and standards
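The ROI arithmetic behind the "Marketing ROI Calculator" step can be sketched as below. The value-per-engagement figure and the cost fields are placeholders; substitute your real campaign numbers. With the sample inputs, the sketch reproduces the "$2,500 investment, 125% ROI" figures from the table above.

```javascript
// Simplified ROI calculation: investment vs. estimated value of engagement.
// valuePerEngagement (default $0.50) is an assumed placeholder rate.
function calculateRoi(
  { totalEngagement, influencerFee, managementCost },
  valuePerEngagement = 0.5
) {
  const totalInvestment = influencerFee + managementCost;
  const estimatedValue = totalEngagement * valuePerEngagement;
  const roiPercent = ((estimatedValue - totalInvestment) / totalInvestment) * 100;
  const costPerEngagement = totalInvestment / totalEngagement;
  return {
    totalInvestment,
    estimatedValue,
    roiPercent: Math.round(roiPercent * 10) / 10,
    costPerEngagement: Math.round(costPerEngagement * 100) / 100,
  };
}

const roi = calculateRoi({
  totalEngagement: 11250,
  influencerFee: 2000,
  managementCost: 500,
});
console.log(roi.totalInvestment); // → 2500
console.log(roi.roiPercent); // → 125
```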
by Piotr Sobolewski
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How it works**
This workflow keeps you perfectly informed about your upcoming week's schedule. Every Sunday evening, it automatically:
1. Fetches all your scheduled meetings for the upcoming week (Monday to Sunday) from your connected calendar.
2. Compiles a concise summary of these meetings, including dates, times, and event titles.
3. Sends you a single, organized email so you can quickly review your agenda and prepare for the week ahead.

Stay ahead of your schedule with this proactive meeting notification system.

**Set up steps**
Setting up this workflow is quick and easy, typically taking less than 10 minutes. You'll primarily need to:
1. Authenticate your preferred calendar service (e.g., Google Calendar).
2. Connect your email sending service (e.g., Gmail).
3. Specify the email address where you want to receive the summary.

All detailed setup instructions and specific configuration guidance are provided within the workflow itself using sticky notes.
by Chris Rudy
**Who's it for**
Service businesses that handle installation appointments and need an efficient approval process. Perfect for HVAC companies, internet providers, appliance installers, or any business that requires team coordination before confirming customer appointments.

**How it works**
This workflow automates your entire installation booking process from form submission to customer communication. When a customer submits a booking request, your team gets notified via Slack with easy approve/reject buttons. Based on the team's decision, customers automatically receive either a confirmation email with appointment details or a friendly reschedule request with a new booking link.

**How to set up**
1. **Connect your accounts**: Link your Slack workspace and Gmail account to n8n
2. **Configure the Slack channel**: Update the SLACK_CHANNEL_ID in the "Set Fields" node with your desired channel
3. **Customize your branding**: Update the COMPANY_NAME, CONTACT_PERSON, and RESCHEDULE_LINK variables
4. **Deploy the form**: Use the form trigger URL on your website or share it directly with customers
5. **Test the workflow**: Submit a test booking to ensure everything works smoothly

**Requirements**
- Active Slack workspace with a designated channel for booking notifications
- Gmail account for sending automated emails
- Basic n8n account (free tier works perfectly)

**How to customize the workflow**
The workflow is designed for easy customization through the "Set Fields" node. You can modify:
- **Time slots**: Edit the dropdown options in the form trigger
- **Email templates**: Customize confirmation and reschedule email content
- **Slack notifications**: Adjust the message format and approval options
- **Form fields**: Add or remove customer information fields as needed

The configuration variables in the "Set Fields" node make it easy to adapt this workflow to your specific business needs without touching the core logic.
by Rahul Joshi
**Description**
Never leave your leads waiting! This n8n workflow template ensures every inquiry gets a timely and professional response, whether it's business hours or after hours. By checking submission times, the automation sends tailored email replies, updates your team instantly, and ensures no lead goes unnoticed.

This workflow monitors a Google Sheet for new form responses, waits briefly to capture complete data, checks whether the submission falls within business hours, and then sends either a standard business reply or a polite after-hours acknowledgment. It also alerts your team via Telegram with lead details for quick follow-up. Perfect for sales, support, and operations teams managing inbound leads.

**Features**
- Monitors Google Sheets for new or updated lead form responses
- Adds a short delay to ensure clean, complete data capture
- Detects whether submissions fall within business hours (9 AM-6 PM, Mon-Fri)
- Sends automated Gmail replies (business-hours vs. after-hours messaging)
- Notifies your team instantly on Telegram with lead details
- Fully automated: hands-free lead acknowledgment plus team alerts

**How It Works**
1. **Google Sheets Trigger** - Watches for new form submissions in your lead sheet.
2. **Data Validation** - Waits 5 minutes to ensure a complete entry.
3. **Business Hours Check** - Determines whether the inquiry arrived within working hours.
4. **Email Response** - Sends a tailored Gmail reply (business-hours or after-hours).
5. **Telegram Notification** - Instantly notifies your team with lead details.

**Setup Instructions**
1. **Google Sheets Setup** - Create a Google Sheet with these columns: Name, Email Address, Phone Number, Message, Submission Time (timestamp). Connect your sheet to n8n using Google Sheets credentials.
2. **Gmail Setup** - Connect your Gmail account in n8n credentials and prepare two email templates:
   - Business Hours Reply: "Hi {{Name}}, thank you for reaching out! Our team will get back to you shortly."
   - After-Hours Reply: "Hi {{Name}}, thank you for contacting us! Our team will get back to you tomorrow."
3. **Telegram Setup** - Create a Telegram bot via @BotFather, add your bot to the target group or chat, and store the bot token and chatId securely in n8n credentials.
4. **Workflow Configuration** - Import the workflow into your n8n instance, replace hardcoded values with n8n credential references, rename the "Edit Fields" node to "Format Lead Data for Notification" for clarity, and test by submitting a sample lead form entry.

**Customization**
- **Business Hours**: Adjust the time window (e.g., 8 AM-8 PM, Mon-Sat) in the workflow logic.
- **Email Templates**: Personalize subject lines, add signatures, or include links to resources.
- **Notification Details**: Choose which fields (e.g., phone number, notes) appear in the Telegram alert.
- **Delay Time**: Change the default 5-minute buffer to suit your form's response timing.

**Security Best Practices**
- Do not hardcode Gmail or Telegram credentials; always use n8n credentials.
- Remove private data (chatIds, sheet IDs) before sharing templates.
- Restrict credential access to authorized team members.

**Requirements**
- Google Sheets (with structured form responses)
- Gmail account for automated replies
- Telegram bot and chat for notifications
- n8n instance (self-hosted or cloud)

This workflow is perfect for:
- Sales teams handling high lead volume
- Support teams ensuring fast first responses
- Businesses offering 24/7 responsiveness without manual effort
- Operations teams needing structured alerts and accountability
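The business-hours check (9 AM-6 PM, Mon-Fri) can be sketched as a small Code-node function. This is a minimal sketch using the server's local time; in production, pin the timezone explicitly (e.g., via the Intl API or a library such as luxon) so the check matches your business's timezone, not the server's.

```javascript
// Returns true when the submission timestamp falls Mon-Fri, 9:00-18:00.
function isBusinessHours(date) {
  const day = date.getDay();   // 0 = Sunday ... 6 = Saturday
  const hour = date.getHours();
  const isWeekday = day >= 1 && day <= 5;
  return isWeekday && hour >= 9 && hour < 18;
}

console.log(isBusinessHours(new Date(2025, 0, 6, 10, 30))); // Mon 10:30 → true
console.log(isBusinessHours(new Date(2025, 0, 4, 10, 30))); // Sat 10:30 → false
```

Changing the window to, say, 8 AM-8 PM Mon-Sat only means adjusting the `day` and `hour` bounds, which mirrors the customization note above.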