by slow-groovin@api2o.com
# AI Comprehensive Research on User's Query with Gemini and Web Search

## What is this?

Perform comprehensive research on a user's query by dynamically generating search terms, querying the web with Google Search (via Gemini), reflecting on the results to identify knowledge gaps, and iteratively refining the search until it can provide a well-supported answer with citations (like Perplexity).

This workflow is a reproduction of gemini-fullstack-langgraph-quickstart in n8n. The gemini-fullstack-langgraph-quickstart is a demo by the Google Gemini team that showcases how to build a powerful full-stack AI agent using Gemini and LangGraph.

## How It Works

- **Generate Query 💬** – Generates one or more search query tasks based on the user's question. Uses Gemini 2.0 Flash.
- **Web Research 🌐** – Executes web search tasks using the native Google Search API tool in combination with Gemini 2.0 Flash.
- **Reflection 📚** – Identifies knowledge gaps and generates potential follow-up queries.

## Setup

1. Configure API credentials:
   - Create a Google Gemini (PaLM) API credential using your own Gemini key.
   - Connect the credential to three nodes: Google Gemini Chat Model, GeminiSearch, and reflection.
2. Configure the Redis source:
   - Prepare a Redis service that your n8n instance can reach.
   - Create a Redis credential and connect it to all Redis nodes.

## Customize

- Try using different Gemini models.
- Try modifying the parameters number_of_initial_queries and max_research_loops.

## Why use Redis?

Redis acts as external storage for the global variables (counter, search results, etc.) that this workflow's loop needs – the equivalent of State in LangGraph. Managing global variables without external storage is difficult in n8n.
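For illustration, here is a minimal sketch of how loop state might be kept in Redis between iterations, mirroring LangGraph's State. The key names, run ID, and stop condition are assumptions for the example, not the workflow's exact keys:

```typescript
// Sketch: global loop state in Redis (key names are illustrative).
import { createClient } from 'redis';

const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

const runId = 'research-run-42'; // hypothetical per-execution ID
const MAX_RESEARCH_LOOPS = 3;    // mirrors the max_research_loops parameter

// Initialize state at the start of a run.
await redis.set(`${runId}:loop_count`, '0');
await redis.del(`${runId}:search_results`);

// Each loop iteration: append results and bump the counter.
await redis.rPush(
  `${runId}:search_results`,
  JSON.stringify({ query: 'example query', summary: 'example summary' }),
);
const loops = await redis.incr(`${runId}:loop_count`);

if (loops >= MAX_RESEARCH_LOOPS) {
  // Exit the loop and generate the final, cited answer.
}
```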
by Dvir Sharon
# 🎯 Automated TikTok Influencer Discovery & Analysis

A complete n8n automation that discovers TikTok influencers using Bright Data, evaluates their fit using Claude AI, and sends personalized outreach emails. Designed for marketing teams and brands that need a scalable, intelligent way to find and connect with relevant creators.

## 📋 Overview

This workflow provides a full-service influencer discovery pipeline: it finds TikTok profiles using search keywords, uses AI to assess alignment with your brand, and initiates contact with qualified influencers. Ideal for influencer marketing, brand partnerships, and campaign planning.

## ✨ Key Features

- 🔍 **Keyword-Based Discovery** – Locate TikTok influencers by specific niche-related keywords.
- 📊 **Bright Data Integration** – Access accurate TikTok profile data from Bright Data's datasets.
- 🤖 **AI-Powered Analysis** – Claude AI evaluates each profile's fit with your brand based on bio, content, and more.
- 📧 **Smart Email Notifications** – Sends tailored outreach emails to creators deemed highly relevant.
- 📈 **Data Storage** – Google Sheets stores profile details, AI evaluation results, and outreach status.
- 🎯 **Intelligent Filtering** – Processes only influencers who meet your criteria (e.g., 5,000+ followers, industry match).
- ⚡ **Fast & Reliable** – Uses professional scraping with robust error handling.
- 🔄 **Batch Processing** – Supports bulk influencer processing through a single automated flow.

## 🎯 What This Workflow Does

### Input

- **Search Keywords** – TikTok terms for finding niche creators
- **Business Info** – Brand description and industry
- **Collaboration Criteria** – Minimum follower count, niche alignment

### Processing Steps

1. Form Submission
2. TikTok Discovery via Bright Data
3. Data Extraction and Normalization
4. Save to Google Sheets
5. Relevance Scoring via Claude AI
6. Filtering Based on AI Score + Follower Count
7. Personalized Email Outreach

### Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Profile ID | TikTok profile identifier | tiktoker123456 |
| Username | TikTok handle | @creativecreator |
| URL | Profile link | https://tiktok.com/@creativecreator |
| Description | Creator bio | "Fashion & lifestyle content..." |
| Followers | Total follower count | 50,000 |
| Collaboration | AI assessment of brand fit | "Highly Relevant" |
| Analysis | 50-word Claude AI relevance summary | "Strong alignment with fashion..." |

## 🚀 Setup Instructions

### Prerequisites

- n8n (cloud or self-hosted)
- Bright Data account with TikTok access
- Google Sheets + Gmail
- Anthropic Claude API key
- 10–15 minutes setup time

### Step-by-Step Setup

1. Import the workflow via JSON in n8n
2. Configure Bright Data – add API credentials and the dataset ID
3. Google Sheets – set up credentials and map columns
4. Claude AI – insert your API key and select the desired model
5. Gmail – authenticate Gmail and update mail node settings
6. Update variables – replace placeholders with your business info
7. Test & launch – submit a sample form and verify all outputs

## 📖 Usage Guide

### Adding Search Keywords

Submit the form with search terms, business description, and industry category to trigger the workflow.
### Understanding AI Analysis

Emails are sent only if:

- Collaboration status = Highly Relevant
- Follower count ≥ 5,000
- Industry alignment confirmed

Claude AI returns a 50-word analysis justifying the match. (A Code-node sketch of this filter appears at the end of this section.)

### Customizing Filters

Edit the "Find the Collaborator" prompt to adjust:

- Follower thresholds
- Industry relevance
- Additional metrics (e.g., engagement rate)

### Viewing Results

The Google Sheets log includes:

- Influencer metadata
- AI scores and rationale
- Collaboration status
- Email delivery timestamp

## 🔧 Customization Options

- **Add More Fields:** Engagement rate, contact email, content themes
- **Email Personalization:** Customize message templates or integrate other mail services
- **Enhanced Filtering:** Use engagement rates, region, content frequency

## 🚨 Troubleshooting

| Issue | Fix |
|-------|-----|
| Bright Data fails | Recheck API key and dataset ID |
| No influencer data | Adjust keywords or dataset scope |
| Sheets permission error | Re-authenticate and check sharing |
| Claude fails | Validate API key and prompt |
| Emails not sent | Re-auth Gmail or update recipient field |
| Form not triggering | Reconfirm webhook URL and permissions |

### Advanced Debugging

- Check n8n execution logs
- Run individual nodes to pinpoint failures
- Confirm all data formats
- Handle API rate limits
- Add error-catch nodes for retries

## 📊 Use Cases & Examples

- **Brand Discovery:** Fashion, tech, fitness creators
- **Competitor Insights:** Find influencers used by rival brands
- **Campaign Planning:** Build targeted influencer lists
- **Market Research:** Identify creator trends across regions

## ⚙️ Advanced Configuration

- **Batch Execution:** Process multiple keywords with delay logic
- **Engagement Metrics:** Scrape and calculate likes-to-follower ratios
- **CRM Integration:** Sync qualified profiles to HubSpot, Salesforce, or Slack

## 📈 Performance & Limits

- **Processing Time:** 3–5 minutes per keyword
- **Concurrency:** 3–5 simultaneous fetches (depends on plan)
- **Accuracy:** >95% influencer data reliability
- **Success Rate:** 90%+ for outreach and processing
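For reference, the filtering step described above boils down to logic like the following, sketched as an n8n Code node body. The field names (`collaboration`, `followers`) are assumptions – match them to the columns your Claude AI analysis node actually produces:

```typescript
// Sketch of the influencer filter (n8n Code node body; `$input` and the
// top-level return are provided by n8n). Field names are assumed.
const MIN_FOLLOWERS = 5000;

return $input.all().filter((item) => {
  const { collaboration, followers } = item.json;
  return collaboration === 'Highly Relevant' && (followers || 0) >= MIN_FOLLOWERS;
});
```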
by Roman Rozenberger
## How it works

- **Extract AI Overviews from Google Search** – Receives data from the browser extension via webhook
- **Convert HTML to Markdown** – Automatically processes and cleans AI Overview content
- **Store in Google Sheets** – Archives all extracted AI Overviews with metadata and sources
- **Generate SEO Guidelines** – AI analyzes page content against the AI Overview to suggest improvements
- **Automate Analysis** – Batch-process multiple URLs and schedule regular checks

## Set up steps

- **Import workflow** – Load the JSON template into your n8n instance (2 minutes)
- **Configure Google Sheets** – Set up the OAuth connection and create a spreadsheet with the required columns (5 minutes)
- **Set up AI provider** – Add OpenRouter API credentials for Gemini 2.5 Pro (3 minutes)
- **Install browser extension** – Deploy the companion Chrome/Firefox extension for data extraction (5 minutes)
- **Test webhook endpoint** – Verify the connection between the extension and the n8n workflow (2 minutes)

Total setup time: ~15 minutes

What you'll need:

- Google account for Sheets integration
- Google Sheet template with the required columns
- OpenRouter API key for Gemini 2.5 Pro model access
- Browser extension: Chrome Extension or Firefox Add-on
- n8n instance (local or cloud)

Use cases:

- **SEO agencies** – Monitor AI Overview presence for client keywords
- **Content marketers** – Analyze what content gets featured in AI Overviews
- **E-commerce** – Track AI Overview coverage for product-related searches
- **Research** – Build datasets of AI Overview content across different topics

The workflow comes with a free browser extension (Chrome | Firefox) that automatically extracts AI Overview content from Google Search and sends it via webhook to your n8n workflow for processing and analysis.

GitHub repository: https://github.com/romek-rozen/ai-overview-extractor/

## Detailed Setup Instructions – AI Overview Extractor

### Prerequisites

- **n8n instance** (local or cloud) – version 1.95.3+
- **Google account** for Sheets integration
- **OpenRouter API account** for Gemini 2.5 Pro access
- **Browser** (Chrome/Firefox) for the extension

### Step 1: Import the Workflow

1. Open n8n and navigate to Workflows
2. Click "Add workflow" → "Import from JSON"
3. Upload the AI_OVERVIES_EXTRACTOR_TEMPLATE.json file
4. Save the workflow

### Step 2: Configure Google Sheets

Create a new Google Sheets document with these columns:

extractedAt | searchQuery | sources | markdown | myURL | task | guidelines | key

A public Google Sheets template is available here: https://docs.google.com/spreadsheets/d/15xqZ2dTiLMoyICYnnnRV-HPvXfdgVeXowr8a7kU4uHk/edit?gid=0#gid=0

Copy the Google Sheets URL (you'll need it for the workflow).

Set up Google Sheets credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "Google Sheets OAuth2 API"
3. Follow the OAuth setup to authorize n8n access to Google Sheets
4. Name the credential (e.g., "Google Sheets AI Overview")

Configure the Google Sheets nodes. Update these nodes with your Google Sheets URL:

- Get URLs to Analyze
- Save AI Overview to Sheets
- Save SEO Guidelines to Sheets

In each node:

- Set documentId to your Google Sheets URL
- Set sheetName to your Google Sheets URL
- Select your Google Sheets credential

### Step 3: Configure AI Provider (OpenRouter)

Get an OpenRouter API key:

1. Sign up at https://openrouter.ai/
2. Generate an API key in your account settings
3. Add credits to your account

Set up OpenRouter credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "OpenRouter API"
3. Enter your API key
4. Name the credential (e.g., "OpenRouter AI Overview")

Configure the OpenRouter node:

- Select the Gemini 2.5 Pro Model node
- Choose your credential from the dropdown
- Verify the model (default: google/gemini-2.5-pro-preview)

### Step 4: Install Browser Extension

Install in Chrome (official extension, recommended):

1. Visit: https://chromewebstore.google.com/detail/ai-overview-extractor/cbkdfibgmhicgnmmdanlhnebbgonhjje
2. Click "Add to Chrome"

Install in Firefox (official add-on):

1. Visit: https://addons.mozilla.org/en-US/firefox/addon/ai-overview-extractor/
2. Click "Add to Firefox"

### Step 5: Configure Webhook Connection

Get the webhook URL:

1. In the n8n workflow, click on the Webhook node
2. Copy the webhook URL (it should look like: http://localhost:5678/webhook/ai-overview-extractor-template-123456789)

Configure the extension:

1. Go to Google Search and perform any search that shows an AI Overview
2. Click the browser extension button (AI Overview Extractor)
3. In the webhook configuration section, paste your webhook URL
4. Click "Test" – it should show ✅ Test successful (a script-based smoke test is also sketched at the end of this section)
5. Click "Save" to store the configuration

### Step 6: Activate and Test

Activate the workflow:

1. In n8n, toggle the workflow to "Active" (top-right switch)
2. Verify all nodes are properly configured

Test end-to-end:

1. Go to Google Search
2. Search for something that shows an AI Overview
3. Use the extension to extract the AI Overview
4. Send it via webhook – check your Google Sheets for the data
5. Verify the markdown conversion worked correctly

### Optional: Batch Analysis Setup

For the SEO analysis features:

1. In your Google Sheets, add URLs in the myURL column
2. Set the task column to "create guidelines"
3. Run the workflow manually or wait for the 15-minute scheduler
4. Check the guidelines column for AI-generated SEO recommendations

### Troubleshooting

Webhook issues:

- Ensure n8n is running on port 5678
- Check whether the workflow is activated
- Verify the webhook URL format

Google Sheets errors:

- Confirm the OAuth credentials are working
- Check the sheet URL format
- Verify the column names match exactly
- Ensure the nodes Get URLs to Analyze, Save AI Overview to Sheets, and Save SEO Guidelines to Sheets are properly configured

OpenRouter issues:

- Check API key validity
- Ensure sufficient account credits
- Try different models if Gemini 2.5 Pro fails
- Verify the Gemini 2.5 Pro Model node is properly connected

Extension problems:

- Check the browser console for errors
- Verify the extension is properly installed
- Ensure you're on google.com/search pages
- Confirm the webhook URL is correctly configured in the extension

### Next Steps

- **Customize AI prompts** in the Generate SEO Recommendations node for your specific needs
- **Adjust scheduler frequency** (default: 15 minutes)
- **Add more URL analysis** by populating Google Sheets
- **Monitor usage** and API costs

### Support

- **GitHub Issues**: https://github.com/romek-rozen/ai-overview-extractor/issues
- **n8n Community**: https://community.n8n.io/
- **Template Documentation**: Check the included README files
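Once the webhook is active, you can smoke-test it without the extension. The sketch below posts a sample payload whose field names are assumed from the sheet columns listed above – the extension's actual payload may differ:

```typescript
// Hypothetical webhook smoke test (field names assumed from the sheet columns).
const webhookUrl =
  'http://localhost:5678/webhook/ai-overview-extractor-template-123456789';

const res = await fetch(webhookUrl, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    extractedAt: new Date().toISOString(),
    searchQuery: 'what is an ai overview',
    sources: ['https://example.com/article'],
    markdown: '## AI Overview\n\nExample extracted content...',
  }),
});
console.log('Webhook responded with status', res.status);
```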
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for

This workflow automates the real-time extraction of job descriptions and salary information from job listing pages using Bright Data MCP, and analyzes the content using OpenAI GPT-4o mini. It is ideal for:

- **Recruiters & HR Tech Startups** – Automate job data collection from public listings
- **Market Intelligence Teams** – Analyze compensation trends across companies or geographies
- **Job Boards & Aggregators** – Power search results with structured, enriched listings
- **AI Workflow Builders** – Extend to other career platforms or automate resume–job match analysis
- **Analysts & Researchers** – Track hiring signals and salary benchmarks in real time

## What problem is this workflow solving?

Traditional scraping of job portals is challenging due to cluttered content, anti-scraping measures, and inconsistent formatting, and manually analyzing salary ranges and job descriptions is tedious and error-prone. This workflow solves the problem by:

- Simulating user behavior with the Bright Data MCP Client to bypass anti-scraping systems
- Extracting structured, clean job data in Markdown format
- Using OpenAI GPT-4o mini to analyze and extract precise salary details and refined job descriptions
- Merging and formatting the result for easy consumption
- Delivering the final output via webhook, Google Sheets, or the file system

## What this workflow does

### Input Nodes

- job_search_url: the job listing or search result URL
- job_role: the title or role being searched for (used in logging/formatting)

### MCP Client Operations

- **MCP Salary Data Extractor** – Simulates browser behavior and scrapes salary-related content (if available)
- **MCP Job Description Extractor** – Extracts the full job description as structured Markdown content

### OpenAI GPT-4o mini Nodes

- **Salary Information Extractor** – Uses GPT-4o mini to detect, clean, and standardize salary range data (if any)
- **Job Description Refiner** – Extracts role responsibilities, qualifications, and benefits from unstructured text
- **Company Information Extractor** – Uses Bright Data MCP and GPT-4o mini to extract company information

### Merge Node

Combines the refined job description and extracted salary information into a unified JSON response object.

### Aggregate Node

Aggregates the job description and salary information into a single JSON response object.

### Final Output Handling

The output is handled in three different formats depending on your downstream needs:

- **Save to Disk** – Output stored with a filename that includes the timestamp and job role
- **Google Sheet Update** – Adds a new row with job role, salary, summary, and link
- **Webhook Notification** – Pushes the merged response to an external system

## Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

## Setup

1. Set up n8n locally with MCP servers by following n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>. A sketch of this credential configuration follows after the customization notes below.

## How to customize this workflow to your needs

### Modify Input Source

- Change job_search_url to point to any job board or aggregator
- Customize job_role to reflect the type of jobs being analyzed

### Tweak LLM Prompts (Optional)

- Refine the GPT-4o mini prompts to extract additional fields like benefits, tech stacks, or remote eligibility

### Change Output Format

- Customize the merged object to output JSON, CSV, or Markdown based on downstream needs
- Add additional destinations (e.g., Slack, Airtable, Notion) via n8n nodes
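For reference, the MCP Client (STDIO) credential typically ends up holding values like these – a sketch assuming the standard npx launch of @brightdata/mcp; check the n8n-nodes-mcp credential form for the exact field labels:

```typescript
// Sketch of the MCP Client (STDIO) credential values (labels may differ by version).
const mcpClientStdioCredential = {
  command: 'npx',                         // launches the MCP server process
  args: '@brightdata/mcp',                // Bright Data MCP Server package
  environments: 'API_TOKEN=<your-token>', // Bright Data API token, as noted above
};
```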
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for

The DNB Company Search & Extract workflow is designed for professionals who need to gather structured business intelligence from Dun & Bradstreet (DNB). It is ideal for:

- Market Researchers
- B2B Sales & Lead Generation Experts
- Business Analysts
- Investment Analysts
- AI Developers Building Financial Knowledge Graphs

## What problem is this workflow solving?

Gathering business information from the DNB website usually involves manual browsing, copying company details, and organizing them in spreadsheets. This workflow automates the entire data collection pipeline – from searching DNB via Google, to scraping relevant pages, to structuring the data and saving it in usable formats.

## What this workflow does

This workflow performs automated search, scraping, and structured extraction of DNB company profiles using Bright Data's MCP search agents and OpenAI's GPT-4o mini model. Here's what it includes:

1. **Set Input Fields**: Provide search_query and webhook_notification_url.
2. **Bright Data MCP Client (Search)**: Performs a Google search for the DNB company URL.
3. **Markdown Scrape from DNB**: Scrapes the company page using Bright Data and returns it as markdown.
4. **OpenAI LLM Extraction**: Transforms the markdown into clean structured data and extracts business information (company name, size, address, industry, etc.).
5. **Webhook Notification**: Sends the structured response to your provided webhook.
6. **Save to Disk**: Persists the structured data locally for logging or auditing.

## Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

## Setup

1. Set up n8n locally with MCP servers by following n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>.
8. Update the Set input fields for search_query and webhook_notification_url.
9. Update the file name and path to persist on disk.

## How to customize this workflow to your needs

- **Search Engine**: The default is Google, but you can change the MCP client engine to Bing or Yandex if needed.
- **Company Scope**: Modify the search query logic for niche filtering, e.g., "biotech startups site:dnb.com".
- **Structured Fields**: Customize the LLM prompt to extract additional fields like CEO name, revenue, or ratings (see the sketch below).
- **Integrations**: Push output to Notion, Airtable, or CRMs like HubSpot using additional n8n nodes.
- **Formatting**: Convert output to PDF or CSV using the built-in File and Spreadsheet nodes.
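To make the "Structured Fields" customization concrete, the extraction step can target a shape like the one below. The interface mirrors the fields named above; the prompt wording and anything beyond those fields are assumptions:

```typescript
// Sketch of a structured-output target for the LLM extraction step.
interface DnbCompanyProfile {
  companyName: string;
  companySize: string; // e.g. "51-200 employees"
  address: string;
  industry: string;
}

// Illustrative prompt for the OpenAI node:
const extractionPrompt = `From the markdown below, extract the company name,
size, address, and industry. Respond with JSON only, matching the
DnbCompanyProfile shape. If a field is missing, use an empty string.`;
```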
by InfyOm Technologies
## ✅ What problem does this workflow solve?

Automatically monitor multiple websites every 5 minutes, log downtime, notify your team instantly via multiple channels, and track uptime/downtime in a Google Sheet – without relying on expensive monitoring tools.

## ⚙️ What does this workflow do?

- Triggers every 5 minutes to monitor website health.
- Fetches a list of website URLs from a Google Sheet.
- Checks the status of each website one by one.
- Sends instant alerts if a website is down (Email, Slack, Telegram, Voice Call).
- Logs downtime events in Google Sheets.
- Tracks when websites are back up and updates the log.
- Sends recovery notifications when a site is live again (Email, Slack, Telegram).

## 🔧 Setup

### 📄 Google Sheets Setup

- Sheet 1: List of website URLs to monitor.
- Sheet 2: Log to store uptime/downtime records.

Sample format: https://docs.google.com/spreadsheets/d/1_VVpkIvpYQigw5q0KmPXUAC2aV2rk1nRQLQZ7YK2KwY/edit?usp=sharing

### ✉️ Gmail, Slack & Telegram Setup

- Connect Gmail, Slack, and Telegram to n8n.
- Configure each service with proper credentials or OAuth.

### 📞 Vapi (Voice Call) Setup

- Create a Vapi account.
- Generate an API key.
- Configure the API parameters (vapi_api_key, assistant_id, number, phone_number_id) on the VAPI node.
- Insert the First Message specified in the workflow.

## 🧠 How it Works

### ⏱ 1. Scheduled Monitoring

A Schedule Trigger runs the workflow every 5 minutes. It reads the list of URLs from the Google Sheet and loops through each one.

### 🌍 2. Website Health Check

Each website is pinged to check if it's online (a sketch of this check follows at the end of this section).

### 🔴 3. If a Website is Down

The workflow verifies whether a downtime record already exists. If not, it:

- Adds a new row to the Google Sheet with the timestamp.
- Sends notifications via: 📧 Email, 💬 Slack, 📲 Telegram, 📞 Voice Call via Vapi

### 🟢 4. If a Website is Back Up

The workflow fetches the matching downtime record and updates the sheet with:

- ✅ Uptime timestamp
- ⏱ Total downtime duration

It then sends recovery notifications via: 📧 Email, 💬 Slack, 📲 Telegram. (No phone call is made for uptime.)

## 👤 Who can use it?

This is perfect for:

- 🚀 Startups
- 👨‍💻 Freelance Developers
- 🛠 SaaS Product Owners
- 🖥 IT/DevOps Teams

If you're looking to replace tools like UptimeRobot, Pingdom, or StatusCake, this no-code solution gives you full control, customization, and cost-efficiency.
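The health check in step 2 can be expressed in a few lines of an n8n Code node (assuming a recent n8n where fetch is available). This is a sketch, not the template's exact node: the `url` field name is assumed to match your Google Sheet column, and anything under HTTP 400 is treated as "up":

```typescript
// Sketch of the per-site health check (n8n Code node body; `$input` and
// the top-level return are provided by n8n).
const results = [];
for (const item of $input.all()) {
  const url = item.json.url; // assumed column name
  let isUp = false;
  try {
    const res = await fetch(url, { redirect: 'follow' });
    isUp = res.status < 400; // 2xx/3xx counts as online
  } catch (e) {
    isUp = false; // DNS failure, timeout, connection refused, etc.
  }
  results.push({ json: { url, isUp, checkedAt: new Date().toISOString() } });
}
return results;
```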
by Cecilia
Enable smart, real-time answers in your WhatsApp groups using a custom webhook, a Pinecone vector database, and no Facebook Business setup.

> 🟡 Note: This template uses a custom WhatsApp webhook. It does not use the official WhatsApp Business API.

## 👥 Who is this for?

This workflow is designed for individuals and teams who want to enable smart WhatsApp group automation – without going through Meta's official WhatsApp Business API. Ideal for small businesses, internal teams, communities, and personal power users.

## ❓ What problem is this solving?

Setting up WhatsApp bots with intelligent responses often requires approval from Meta and a verified business account. This workflow removes those barriers by using a self-hosted webhook to handle incoming messages and respond using a document-trained AI via Pinecone.

## ⚙️ What this workflow does

- Connects a regular WhatsApp number to a custom webhook
- Adds the bot to any group chat (it stays silent unless mentioned)
- Indexes documents from Google Drive into Pinecone
- Responds with intelligent, context-aware answers from your custom knowledge base
- Auto-updates its knowledge every minute as the document changes

## 🛠️ Setup

### Step 1: Connect Google Drive

Set up your Google Drive credentials in n8n.

### Step 2: Configure Pinecone

1. Create an index in Pinecone (dimension: 1536 – see the sketch at the end of this section)
2. Select this index in both Pinecone nodes
3. Click Test Workflow to ingest your document into Pinecone

### Step 3: Get Access to the WhatsApp Webhook

1. Fill out this form to request access
2. You'll receive a WhatsApp confirmation for linking

### Step 4: Test WhatsApp Integration

- ✅ One-on-one test: Send a message from another number
- 👥 Group test: Add the bot to a group; it will only respond when tagged

## 🧩 How to customize this workflow

- Modify the system prompt inside the AI agent node to control tone and behavior
- Update the connected Google Doc to match your specific domain (e.g., FAQs, SOPs, product manuals)
- Adjust the Pinecone sync frequency if you want updates more or less often

## 📚 Use cases

- **Customer Support**: Instant, intelligent replies in WhatsApp without live agents
- **Team Knowledge Bot**: Tag the bot for quick access to SOPs and internal docs
- **Community Groups**: Automate common questions while keeping noise low
- **Personal AI Assistant**: A WhatsApp chatbot trained on your notes and files

## 📝 Sticky Note Suggestion

💬 What this template does:

> Enables an AI bot in your WhatsApp group that answers questions based on a Google Doc you provide. It uses a custom webhook, Google Drive, and Pinecone.

🔧 Requirements:

> Google Drive account
> Pinecone account with an index (dimension 1536)
> Access to the custom WhatsApp webhook (see setup steps)
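If you prefer to create the Pinecone index from code rather than the dashboard, it might look like the sketch below. The index name and serverless region are placeholders; only the dimension (1536) comes from this template:

```typescript
// Sketch: creating the 1536-dimension index this workflow expects.
import { Pinecone } from '@pinecone-database/pinecone';

const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });

await pc.createIndex({
  name: 'whatsapp-knowledge-base', // hypothetical name
  dimension: 1536,                 // matches OpenAI text-embedding models
  metric: 'cosine',
  spec: { serverless: { cloud: 'aws', region: 'us-east-1' } }, // placeholder spec
});
```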
by Praveena
## Why

Teachers now spend 3–4 hours per lesson creating materials and resources from scratch. With additional/special needs in the classroom, creating extra materials becomes even harder. This is unsustainable and takes their time away from teaching. The workflow is tailored for UK teachers but can be expanded globally with prompt and form enhancements.

## How it works

I built a system with three specialized AI agents that creates complete lesson packages, automatically uploads a document to Google Drive, and adds a calendar appointment to review the document.

## Features

- A research agent pulls specific information, including special education needs and curriculums.
- A scoring and assessment agent generates tailored assessment plans, assignments, and grading mechanisms based on the chosen requirements.
- An integration agent provides ideas for expanding to other tools. In the future, there is an opportunity to add Kahoot or other tools to create quizzes.
- Finally, the enriched document is emailed and a calendar invite is sent for review.

## What you need

- n8n
- Any LLM API key (I used OpenAI)
- Google Drive integration
- Google Calendar integration
- Change the email address from XXX@gmail.com to your own in the email component.

## Support

Watch this video for an intro on how it works. Contact me at info@pankstr.com for any queries.
by Dr. Firas
# AI-powered WhatsApp booking system with instant SMS confirmations

## Who is this for?

This workflow is designed for solo entrepreneurs, consultants, coaches, clinics, or any business that handles client appointments and wants to automate the entire scheduling experience via WhatsApp – without the need for live agents.

## What problem is this workflow solving?

Responding to inbound messages, collecting booking details, suggesting available times, and sending reminders can be a huge time drain. This workflow eliminates manual handling by:

- Automating WhatsApp conversations with an AI assistant
- Booking appointments directly into Cal.com
- Sending timely SMS reminders before appointments

It ensures you never miss a lead or a follow-up – even while you sleep.

## What this workflow does

From a single WhatsApp message, the workflow:

1. Triggers via a WhatsApp webhook
2. Uses GPT-4 to handle the conversation flow and qualify the prospect
3. Collects name, email, and selected service
4. Calls the Cal.com API to fetch available time slots (see the sketch at the end of this section)
5. Books the appointment and stores it in Google Sheets
6. Sends a confirmation message via WhatsApp
7. Periodically scans for upcoming appointments
8. Sends SMS reminders to clients 2 hours before their session

## Setup

1. Connect your Webhook node to a WhatsApp API (e.g., 360dialog, Twilio, or Ultramsg)
2. Add your OpenAI API key for the GPT-4 nodes
3. Configure your Cal.com API key and set your calendar ID
4. Link your Google Sheets with fields like: name, email, date, time, status, reminder_sent
5. Connect your SMS service (e.g., sms77) with API credentials
6. Adjust the schedule in the reminder node as needed

## How to customize this workflow to your needs

- **Change the language or tone of the AI assistant** by editing the system prompt in the GPT node
- **Filter available time slots** by service, team member, or duration
- **Modify the reminder timing** (e.g., 1 hour before, 24 hours before, etc.)
- **Add conditional logic** to route users to different booking flows based on their responses
- **Integrate additional CRMs** or notification channels like email or Slack

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
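Step 4's slot lookup is a plain HTTP call. Below is a sketch against Cal.com's v1 slots endpoint; the event type ID and date range are placeholders, and the parameter names should be verified against the Cal.com API version you use:

```typescript
// Sketch: fetching open Cal.com slots before booking (v1-style endpoint).
const params = new URLSearchParams({
  apiKey: process.env.CALCOM_API_KEY ?? '',
  eventTypeId: '123456', // hypothetical event type
  startTime: '2025-07-01T00:00:00Z',
  endTime: '2025-07-07T23:59:59Z',
});

const res = await fetch(`https://api.cal.com/v1/slots?${params}`);
const { slots } = await res.json(); // available times, grouped by date
console.log(slots);
```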
by Jimleuk
This n8n template demonstrates one approach to more natural, less frustrating conversations with AI agents: reducing interruptions by predicting the end of user utterances.

When we text or chat casually, it's not uncommon to break our sentences over multiple messages – or, with voice, to break our speech with the odd pause or the occasional "umm" and "ahh". If an agent replies to every message, it's likely to interrupt us before we finish our thoughts, and that can get very annoying!

Previously, I demonstrated a simple technique of buffering each incoming message by 5 seconds, but that approach still suffers in scenarios where more time is needed. This technique has no arbitrary time limit; instead it uses AI to figure out when it's the agent's turn based on the user's messages, allowing the user to take all the time they need.

## How it works

- Telegram messages are received, but no reply is generated for them by default. Instead they are sent to the prediction subworkflow, which determines whether a reply should be generated.
- The prediction subworkflow begins by checking Redis for the current user's prediction session state.
- If this is a new "utterance", it kicks off the "predict end of utterance" loop – the purpose of which is to buffer messages in a smart way (sketched at the end of this template)!
- New user messages continue to be accepted by the workflow until enough has been collected for the prediction classifier to determine that the end of the utterance has been reached.
- The loop is then broken; the buffered chat messages are combined and sent to the AI agent to generate a response, which is sent to the user via the Telegram node.
- The prediction session state is then deleted to signal that the workflow is ready to start again with a new message.

## How to use

- This system sits between your preferred chat platform and the AI agent, so all you need to do is replace the Telegram nodes as required.
- Where LLM-only prediction isn't working well enough, consider more traditional code-based heuristic checks to improve detection.
- Ideally you'll want a fast but accurate LLM so your user isn't waiting longer than they have to – at the time of writing, Gemini 2.5 Flash-Lite was the fastest in testing, but keep a lookout for smaller and more powerful LLMs in the future.

## Requirements

- Gemini for LLM
- Redis for session management
- Telegram for chat platform
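In sketch form, the buffer-and-predict loop looks roughly like this. The Redis key scheme and the classifier are assumptions – in the template the classification is an LLM call, stubbed out here with a trivial heuristic:

```typescript
// Sketch of the buffer-and-predict loop (keys and classifier are illustrative).
import { createClient } from 'redis';

async function onIncomingMessage(chatId: string, text: string): Promise<string | null> {
  const redis = createClient({ url: process.env.REDIS_URL });
  await redis.connect();

  const sessionKey = `utterance:${chatId}`;

  // Buffer the new message with the rest of the in-progress utterance.
  await redis.rPush(sessionKey, text);
  const buffered = (await redis.lRange(sessionKey, 0, -1)).join('\n');

  // A fast LLM decides whether the buffered messages read as a finished thought.
  if (await isUtteranceComplete(buffered)) {
    await redis.del(sessionKey); // reset state for the next utterance
    return buffered;             // hand off to the agent for a single reply
  }
  return null; // keep waiting; no reply yet
}

// Stand-in for the LLM classifier ("has the user finished their thought?").
async function isUtteranceComplete(utterance: string): Promise<boolean> {
  const t = utterance.trim();
  return t.endsWith('.') || t.endsWith('?') || t.endsWith('!');
}
```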
by Incrementors
# 🛒 Google Maps Business Phone Number Scraper Using Bright Data API & Google Sheets Integration

This template requires a self-hosted n8n instance to run.

An automated workflow that extracts business information – including phone numbers – from Google Maps using Bright Data's API and saves the data to Google Sheets for easy access and analysis.

## 📋 Overview

This workflow provides an automated solution for extracting business contact information from Google Maps based on location and keyword searches. Perfect for lead generation, market research, competitor analysis, and business directory creation.

## ✨ Key Features

- 🎯 **Form-Based Input**: Easy-to-use form for location and keyword submission
- 🗺️ **Google Maps Integration**: Uses Bright Data's Google Maps dataset for accurate business data
- 📊 **Comprehensive Data Extraction**: Extracts business names, addresses, phone numbers, ratings, and more
- 📧 **Automated Processing**: Handles the entire scraping process automatically
- 📈 **Google Sheets Storage**: Automatically saves extracted data to organized spreadsheets
- 🔄 **Smart Status Checking**: Monitors scraping progress with automatic retry logic
- ⚡ **Fast & Reliable**: Professional scraping with built-in error handling
- 🎯 **Customizable Output**: Configurable data fields for specific business needs

## 🎯 What This Workflow Does

### Input

- **Location:** Geographic area to search (city, state, country)
- **Keywords:** Business type or industry keywords

### Processing

1. Form Submission: User submits location and keywords through the web form
2. API Request: Sends a scraping request to Bright Data's Google Maps dataset (see the sketch at the end of this section)
3. Status Monitoring: Continuously checks scraping progress
4. Data Retrieval: Fetches the completed business data when ready
5. Data Storage: Saves the extracted information to Google Sheets
6. Error Handling: Implements retry logic for failed requests

### Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Business Name | Official business name from Google Maps | "Joe's Pizza Restaurant" |
| Phone Number | Contact phone number | "+1-555-123-4567" |
| Address | Complete business address | "123 Main St, New York, NY 10001" |
| Rating | Google Maps rating score | 4.5 |
| URL | Google Maps listing URL | "https://maps.google.com/..." |
## 🚀 Setup Instructions

### Prerequisites

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- 5–10 minutes for setup

### Step 1: Import the Workflow

1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

### Step 2: Configure Bright Data

Set up Bright Data credentials:

1. In n8n: Credentials → + Add credential → HTTP Request Auth
2. Enter your Bright Data API key
3. Test the connection

Configure the dataset:

- Ensure you have access to the Google Maps dataset (gd_m8ebnr0q2qlklc02fz)
- Verify the dataset permissions in the Bright Data dashboard

### Step 3: Configure Google Sheets Integration

Create a Google Sheet:

1. Go to Google Sheets
2. Create a new spreadsheet named "Business Data" or similar
3. Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit

Set up Google Sheets credentials:

1. In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
2. Complete the OAuth setup and test the connection

Prepare your data sheet with columns:

- Column A: Name
- Column B: Address
- Column C: Rating
- Column D: Phone Number
- Column E: URL

### Step 4: Update Workflow Settings

Update the Google Sheets node:

1. Open the "Save to Google Sheets" node
2. Replace the document ID with your Sheet ID
3. Select your Google Sheets credential
4. Choose the correct sheet/tab name

Update the Bright Data nodes:

1. Open the HTTP Request nodes
2. Replace BRIGHT_DATA_API_KEY with your actual API key
3. Verify the dataset ID matches your subscription

### Step 5: Test & Activate

1. Activate the workflow (toggle switch)
2. Submit a test form with location "New York" and keywords "restaurants"
3. Verify data appears in the Google Sheet
4. Check for proper phone number extraction

## 📖 Usage Guide

### Submitting Search Requests

1. Access the form URL provided by n8n
2. Enter the desired location (city, state, or country)
3. Enter relevant keywords (business type, industry, etc.)
4. Submit the form and wait for processing

### Understanding the Results

Your Google Sheet will populate with business data including:

- Complete business contact information
- Verified phone numbers from Google Maps
- Accurate addresses and ratings
- Direct links to Google Maps listings

## 🔧 Customization Options

### Adding More Data Points

Edit the "Bright Data API - Request Business Data" node to capture additional fields:

- Business descriptions
- Operating hours
- Review counts
- Website URLs
- Photos and videos

### Modifying Search Parameters

Customize the search behavior:

- Adjust "limit_per_input" for more or fewer results
- Modify the search type and discovery method
- Add geographical coordinates for precise targeting

## 🚨 Troubleshooting

### Common Issues & Solutions

1. "Bright Data connection failed"
   - **Cause:** Invalid API credentials or dataset access
   - **Solution:** Verify credentials in the Bright Data dashboard and check dataset permissions
2. "No business data extracted"
   - **Cause:** Invalid search parameters or no results found
   - **Solution:** Try broader keywords or different locations; verify dataset availability
3. "Google Sheets permission denied"
   - **Cause:** Incorrect credentials or sheet permissions
   - **Solution:** Re-authenticate Google Sheets and check sheet sharing settings
4. "Workflow execution timeout"
   - **Cause:** Large search results or slow API response
   - **Solution:** Reduce the search scope, increase timeout settings, check your internet connection

## 📊 Use Cases & Examples

### 1. Lead Generation

**Goal:** Find potential customers in specific areas

- Search for businesses by industry and location
- Extract contact information for outreach campaigns
- Build targeted prospect lists
### 2. Market Research

**Goal:** Analyze the local business landscape

- Study competitor density in target markets
- Identify market gaps and opportunities
- Gather business intelligence for strategic planning

### 3. Directory Creation

**Goal:** Build comprehensive business directories

- Create industry-specific business listings
- Maintain updated contact databases
- Support local business communities

## 📈 Performance & Limits

### Expected Performance

- **Processing time:** 1–5 minutes per search depending on results
- **Data accuracy:** 95%+ for active Google Maps listings
- **Success rate:** 90%+ for accessible businesses
- **Concurrent requests:** Depends on Bright Data plan limits

### Resource Usage

- **Memory:** ~50 MB per execution
- **Storage:** Minimal (data stored in Google Sheets)
- **API calls:** 2–3 Bright Data calls + 1 Google Sheets call per search
- **Bandwidth:** ~1–2 MB per search request
- **Execution time:** 2–5 minutes for typical searches

### Scaling Considerations

- **Rate limiting:** Respect Bright Data API limits
- **Error handling:** Implement retry logic for failed requests
- **Data validation:** Add checks for incomplete business data
- **Cost optimization:** Monitor API usage to control expenses
- **Batch processing:** Group multiple searches for efficiency

## 🤝 Support & Community

### Getting Help

- **n8n Community Forum:** community.n8n.io
- **Documentation:** docs.n8n.io
- **Bright Data Support:** Contact through your dashboard
- **GitHub Issues:** Report bugs and feature requests

### Contributing

- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

## 🎯 Ready to Use!

This workflow provides a solid foundation for automated Google Maps business data extraction. Customize it to fit your specific needs and use cases.

Your workflow URL: https://your-n8n-instance.com/workflow/google-maps-scraper

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
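For reference, the request the HTTP Request nodes send to Bright Data looks roughly like the sketch below. The endpoint shape follows Bright Data's dataset-trigger API and the dataset ID comes from this template, but the input field names are illustrative – confirm them in your Bright Data dashboard:

```typescript
// Sketch: triggering a Google Maps dataset collection on Bright Data.
const DATASET_ID = 'gd_m8ebnr0q2qlklc02fz'; // from this workflow
const API_KEY = process.env.BRIGHT_DATA_API_KEY ?? '';

const res = await fetch(
  `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    // One input object per search; field names are illustrative.
    body: JSON.stringify([{ location: 'New York', keyword: 'restaurants' }]),
  },
);

const { snapshot_id } = await res.json(); // poll this ID until the data is ready
console.log('Started snapshot', snapshot_id);
```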
by Agent Studio
Restore backed-up workflows from GitHub to your n8n workspace.

This workflow was inspired by the one that lets you back up your n8n workflows to GitHub. It lets you restore your backed-up workflows to your workspace without creating duplicates. If something goes wrong with your instance, it will save you a lot of time restoring them.

## How it works

1. It retrieves the workflows saved in a GitHub repository.
2. It compares these saved workflows with the ones in your n8n workspace, based on the name.
3. It only creates workflows that don't already exist (see the sketch at the end of this template).

## Set up steps

1. Open the "Global" node and set your own information (see Configuration below).
2. Click on "Test workflow".
3. It will run through all the workflows in the GitHub repository, check whether the name already exists in your workspace, and if not, create it.

## Configuration

- repo.owner: your GitHub owner name
- repo.name: your GitHub repository name
- repo.path: the path within the GitHub repository
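The duplicate check at the heart of this workflow boils down to logic like the sketch below, written against n8n's public REST API (the instance URL is a placeholder; the saved workflow JSON comes from your GitHub repository):

```typescript
// Sketch: create a workflow only if no workflow with the same name exists.
const N8N_URL = 'https://your-n8n-instance.com'; // placeholder
const headers = {
  'X-N8N-API-KEY': process.env.N8N_API_KEY ?? '',
  'Content-Type': 'application/json',
};

type SavedWorkflow = { name: string; nodes: unknown[]; connections: object };

async function restoreIfMissing(saved: SavedWorkflow): Promise<void> {
  const res = await fetch(`${N8N_URL}/api/v1/workflows`, { headers });
  const existing = await res.json();

  const exists = existing.data.some((wf: { name: string }) => wf.name === saved.name);
  if (!exists) {
    await fetch(`${N8N_URL}/api/v1/workflows`, {
      method: 'POST',
      headers,
      body: JSON.stringify({ ...saved, settings: {} }),
    });
  }
}
```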