by Basil Irfan
Streamline restaurant reservations on WhatsApp

## Overview

This n8n template automates table bookings via WhatsApp, letting users request, confirm, and manage reservations without manual intervention. It leverages AI to parse messages, apply group discounts, check availability, and send natural confirmations, all within a single, reusable workflow.

## Key Features

- **AI-powered parsing & responses**: Extracts guest name, date, time, and party size from free-form WhatsApp messages and generates friendly confirmations.
- **Availability lookup**: Integrates with Google Sheets, Airtable, or MySQL to verify slot availability in real time.
- **Automated reminders**: Optionally schedules follow-up messages 24 hours before the booking.
- **Modular design**: Swap triggers, storage, or messaging nodes to fit your infrastructure.

## How It Works

1. **Trigger**: Incoming WhatsApp message via the WhatsApp Business Cloud API.
2. **Parse & Validate**: An AI Function node extracts intent and guest details.
3. **Calculate Discount**: A custom Function node computes the group discount (see the sketch below).
4. **Compose Confirmation**: An OpenAI text model generates a personalized response.
5. **Send Message**: An HTTP Request node posts the reply back to WhatsApp.
6. **Optional Reminder**: A Wait node plus an HTTP Request handle the pre-booking follow-up.

## Requirements

- WhatsApp Business Cloud API access
- n8n Cloud or self-hosted instance
- Reservation datastore (Google Sheets, Airtable, or MySQL)
- OpenAI API key for AI text generation

## Customization Tips

- **Menu Attachments**: Add media nodes to send PDFs or images.
- **Alternate Slot Suggestions**: Use AI to propose new times if a slot is full.
- **Upsell Offers**: Follow up with add-on suggestions (e.g., wine pairings).
- **Localization**: Extend prompts for multilingual support.
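The template doesn't prescribe a discount formula, so here is a minimal sketch of what the Calculate Discount Code node could contain, assuming a simple tiered scheme keyed on party size (the thresholds and percentages are illustrative placeholders, not part of the template):

```javascript
// n8n Code node: compute a group discount from the parsed party size.
// Tiers below are illustrative; adjust to your pricing rules.
const DISCOUNT_TIERS = [
  { minGuests: 10, percent: 15 },
  { minGuests: 6, percent: 10 },
  { minGuests: 4, percent: 5 },
];

return $input.all().map((item) => {
  const partySize = Number(item.json.partySize) || 1;
  const tier = DISCOUNT_TIERS.find((t) => partySize >= t.minGuests);
  return {
    json: {
      ...item.json,
      discountPercent: tier ? tier.percent : 0,
    },
  };
});
```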
by Max Mitcham
An intelligent automation workflow that processes website demo requests, qualifies leads using AI-powered analysis, and automatically nurtures prospects through personalized follow-up sequences to maximize conversion rates.

## Overview

This workflow transforms raw website leads into qualified prospects through intelligent filtering, enrichment, and personalized nurturing. It combines AI-powered qualification with automated follow-up to ensure high-quality leads receive immediate attention while nurturing those needing additional touchpoints.

## 🔄 Workflow Process

1. **Entry Point - Webhook**: Website form submission capture. Receives demo requests from website forms in real time and captures lead data including LinkedIn URL, email, use case, and referral source.
2. **Initial Routing Filter**: Source-based lead classification. Filters out low-quality leads from "lead_capture_box" sources and routes qualified submissions to the enrichment process.
3. **Lead Enrichment**: Comprehensive data enhancement. Enriches LinkedIn profile data via the Trigify API and gathers additional company and professional intelligence.
4. **AI Qualification Engine**: Intelligent prospect evaluation. Uses Claude AI to assess lead quality across multiple criteria (an illustrative code version appears at the end of this description):
   - B2B company validation
   - Geographic filtering (US, UK, Europe, Australia)
   - Senior-level job titles or strategic keywords
   - Current employment verification
5. **Booking Verification Check**: Conversion status validation. Checks the Cal.com API to verify demo scheduling, routing booked leads to completion and non-booked leads to nurturing.
6. **AI-Powered Follow-up Research**: Personalized nurturing preparation. Researches the prospect's company using AI and web search, then generates personalized follow-up messaging based on use case and company context.
7. **Email Campaign Integration**: Automated nurturing execution. Adds qualified, non-booking leads to Instantly.ai email campaigns, including the personalized research for tailored outreach.

## 🛠️ Technology Stack

- **n8n**: Workflow orchestration
- **Trigify API**: Lead enrichment
- **Claude AI**: Qualification and personalized research
- **Clay**: CRM integration
- **Cal.com API**: Booking verification
- **Instantly.ai**: Email campaign automation

## ✨ Key Features

- Real-time lead processing and AI-powered qualification
- Geographic and demographic filtering for market focus
- Automated booking verification and conversion tracking
- Personalized follow-up research and content generation
- Multi-platform integration for seamless lead management

## 🎯 Ideal Use Cases

Perfect for B2B companies with demo-driven sales processes:

- SaaS companies requiring product demonstrations
- B2B service providers needing qualified prospect identification
- Sales teams managing high-volume inbound lead qualification
- Organizations with international markets requiring geographic focus

## 📈 Business Impact

Transform website visitors into qualified sales opportunities:

- **Lead Quality Enhancement**: AI filtering ensures only qualified prospects reach sales
- **Conversion Optimization**: Systematic follow-up increases demo booking rates
- **Sales Efficiency**: Automated qualification frees teams for high-value activities
- **Personalized Engagement**: Research-driven follow-up increases response rates

## 💡 Strategic Advantage

This workflow creates a sophisticated qualification funnel that combines automation with personalization. By using AI-powered assessment and research-driven follow-up, it ensures qualified prospects receive appropriate attention while preventing resource waste on unqualified leads.
The system maximizes the value extracted from every website visitor by focusing sales efforts on highest-probability opportunities while automatically nurturing prospects who need additional touchpoints to convert.
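In the template itself, the routing and qualification rules are enforced by Claude. Purely to illustrate the shape of those rules, here is a deterministic n8n Code node equivalent (field names such as `source`, `region`, and `jobTitle` are assumptions, not the template's actual schema):

```javascript
// Illustrative pre-filter mirroring the routing and qualification criteria.
// The workflow delegates this judgment to Claude AI; this sketch only
// shows the rule shape. Field names are assumed.
const ALLOWED_REGIONS = ['US', 'UK', 'Europe', 'Australia'];
const SENIOR_KEYWORDS = ['head', 'director', 'vp', 'chief', 'founder', 'lead'];

return $input.all().filter((item) => {
  const lead = item.json;
  if (lead.source === 'lead_capture_box') return false; // low-quality source
  if (!ALLOWED_REGIONS.includes(lead.region)) return false; // geographic focus
  const title = (lead.jobTitle || '').toLowerCase();
  return SENIOR_KEYWORDS.some((kw) => title.includes(kw)); // seniority check
});
```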
by Hossein Karami
## Who's it for

Teams that track absences in Everhour and want a shared Google Calendar view for quick planning. Ideal for managers, HR/OPS, and teammates who need instant visibility into approved time off.

## What it does

- Pulls approved time-off from Everhour on a schedule
- Creates/updates all-day events per day of absence in Google Calendar
- Removes stale events if a request changes or is canceled
- Uses a stable external key (`everhour:ASSIGNMENT_ID:YYYY-MM-DD`) to avoid duplicates

## How it works

A Schedule Trigger runs periodically → the workflow fetches Everhour assignments, filters approved time-off, expands multi-day requests into single-day items (see the sketch below), then searches by external key to either create or update events. Separate cleanup steps list calendar events and delete any that are no longer present in Everhour.

## How to set up

1. In n8n, create an HTTP Header Auth credential for Everhour with header `X-Api-Key: {YOUR_EVERHOUR_API_KEY}`.
2. Add a Google Calendar OAuth credential.
3. Open the Config node and set your calendarId (e.g., team@group.calendar.google.com).
4. Enable the workflow and choose your schedule.

## Requirements

- Everhour account with API access
- Google Calendar (workspace or personal)
- n8n Cloud or self-hosted

## How to customize the workflow

- Adjust the schedule (hourly/daily).
- Filter by user or time-off type.
- Tweak the event title/description templates.
- Point to multiple calendars (duplicate the create/update branch per calendar).
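A minimal sketch of the expansion step, assuming each Everhour assignment exposes an `id` plus `startDate`/`endDate` strings (the property names are assumptions; check the actual API payload):

```javascript
// n8n Code node: expand each multi-day time-off assignment into one item
// per calendar day, each carrying the stable external key.
const out = [];

for (const item of $input.all()) {
  const { id, startDate, endDate } = item.json; // assumed field names
  const day = new Date(startDate);
  const last = new Date(endDate);

  while (day <= last) {
    const ymd = day.toISOString().slice(0, 10); // YYYY-MM-DD
    out.push({
      json: {
        ...item.json,
        date: ymd,
        externalKey: `everhour:${id}:${ymd}`, // dedupe key from the template
      },
    });
    day.setDate(day.getDate() + 1);
  }
}

return out;
```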
by David Olusola
🌤️ Weather Alerts via SMS (OpenWeather + Twilio)

This workflow checks the current weather and forecast every 6 hours using the OpenWeather API, and automatically sends an SMS alert via Twilio if severe conditions are detected. It's great for keeping teams, family, or field workers updated about extreme heat, storms, or snow.

## ⚙️ How It Works

1. **Check Every 6 Hours**: A Cron node triggers the workflow every 6 hours. The frequency can be adjusted based on your needs.
2. **Fetch Current Weather & Forecast**: Calls the OpenWeather API for both current conditions and the 24-hour forecast. Retrieves temperature, precipitation, wind speed, and weather descriptions.
3. **Analyze Weather Data**: A Code node normalizes the weather data and detects alert conditions such as:
   - Extreme heat (≥95°F)
   - Extreme cold (≤20°F)
   - Severe storms (thunderstorm, tornado)
   - Rain or snow
   - High winds (≥25 mph)
   It also checks the upcoming forecast for severe weather in the next 24 hours (a sketch of this node follows below).
4. **Alert Needed?**: If no severe conditions are found, the workflow stops. If alerts exist, it proceeds to SMS formatting.
5. **Format SMS Alert**: Prepares a compact, clear SMS message with current conditions, detected alerts, and a preview of the next 3 hours. Example:

   ```
   🌤️ WEATHER ALERT - New York, US
   NOW: 98°F, clear sky
   🚨 ALERTS (1):
   🔥 EXTREME HEAT: 98°F (feels like 103°F)
   📅 NEXT 3 HOURS:
   1 PM: 99°F, sunny
   2 PM: 100°F, sunny
   3 PM: 100°F, partly cloudy
   ```

6. **Send Weather SMS**: Twilio sends the SMS to the configured phone numbers. Supports multiple recipients.
7. **Log Alert Sent**: Logs the alert type, urgency, and timestamp. Useful for auditing and troubleshooting.

## 🛠️ Setup Steps

1. **OpenWeather API**: Sign up at openweathermap.org, get a free API key (1000 calls/day), and update the API key and location (city or lat/long) in the HTTP Request nodes.
2. **Twilio Setup**: Sign up at twilio.com, get your Account SID & Auth Token, buy a Twilio phone number (≈ $1/month), and add the Twilio credentials in n8n.
3. **Recipients**: In the Send Weather SMS node, update the phone numbers (format: +1234567890). You can add multiple recipients.
4. **Customize Alert Conditions**: Default alerts cover rain, snow, storms, extreme temperatures, and high winds. Modify the Analyze Weather Data node to fine-tune conditions.

## 📲 Example SMS Output

```
🌤️ WEATHER ALERT - New York, US
NOW: 35°F, light snow
🚨 ALERTS (2):
❄️ SNOW ALERT: light snow
💨 HIGH WINDS: 28 mph
📅 NEXT 3 HOURS:
10 AM: 34°F, snow
11 AM: 33°F, snow
12 PM: 32°F, overcast
⏰ Alert sent: 08/29/2025, 09:00 AM
```

⚡ With this workflow, you'll always know when bad weather is on the way, keeping you, your team, or your customers safe and informed.
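A condensed sketch of the alert-detection logic in the Analyze Weather Data Code node, using the thresholds listed above. The field names assume the standard OpenWeather current-weather response with `units=imperial`:

```javascript
// n8n Code node: derive alert flags from an OpenWeather "current weather"
// response (units=imperial). Thresholds match the template's defaults.
const w = $input.first().json;
const alerts = [];

const temp = w.main.temp;
const feelsLike = w.main.feels_like;
const windMph = w.wind.speed;
const condition = (w.weather?.[0]?.main || '').toLowerCase();

if (temp >= 95) alerts.push(`🔥 EXTREME HEAT: ${temp}°F (feels like ${feelsLike}°F)`);
if (temp <= 20) alerts.push(`🥶 EXTREME COLD: ${temp}°F`);
if (['thunderstorm', 'tornado'].includes(condition)) alerts.push(`⛈️ SEVERE STORM: ${condition}`);
if (['rain', 'snow'].includes(condition)) alerts.push(`❄️ PRECIPITATION: ${w.weather[0].description}`);
if (windMph >= 25) alerts.push(`💨 HIGH WINDS: ${Math.round(windMph)} mph`);

return [{ json: { ...w, alerts, alertNeeded: alerts.length > 0 } }];
```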
by Nitin Garg
## How it works

1. **Form Trigger** accepts a question and optional settings (folder ID, search depth)
2. **Cookie Validation** checks if the Skool session is still active
3. **BuildId Extraction** dynamically extracts Skool's build ID from the homepage (see the sketch below)
4. **Keyword Extraction** uses Claude Haiku to extract 1-2 search keywords
5. **Multi-Page Search** fetches 1-10 pages of Skool search results
6. **Post Aggregation** collects all posts with content and comments
7. **AI Analysis** uses Claude Sonnet to analyze posts and answer your question
8. **Google Doc Report** creates a detailed research document in your Drive
9. **HTML Response** returns a beautiful summary page

## Key Features

- **Auto BuildId Detection** - No manual updates when Skool changes
- **Cookie Expiration Handling** - Clear error messages when the session expires
- **Configurable Search Depth** - Search 1-10 pages (default: 5)
- **Token Protection** - Limits content to control API costs
- **Dual AI Models** - Haiku for keywords (cheap), Sonnet for analysis (powerful)

## Set up steps

Time required: 10-15 minutes

1. Get your Skool session cookie from the browser DevTools
2. Get an Anthropic API key from console.anthropic.com
3. Set up a Google Docs OAuth2 credential in n8n
4. Create a Google Drive folder for research docs
5. Update the Config node with your values:
   - COOKIE - Your Skool session cookie
   - ANTHROPIC_API_KEY - Your Claude API key
   - DEFAULT_FOLDER_ID - Your Google Drive folder ID
   - COMMUNITY - Your Skool community slug

## Who is this for?

- Members of Skool communities searching past discussions
- Community managers researching common questions
- Anyone building knowledge bases from Skool content

## Estimated costs

- **Per search**: $0.02-0.10 (Claude Haiku + Sonnet)
- Skool cookies expire every 7-14 days (requiring a refresh)

## 🏷️ Suggested Tags

skool, community, search, ai, claude, anthropic, google-docs, research, knowledge-base, form
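A sketch of the build-ID extraction step, assuming Skool serves a Next.js-style page that embeds `"buildId"` in its homepage HTML (the regex and the `data` field name are assumptions about that markup and the preceding HTTP Request node's output):

```javascript
// n8n Code node: pull the build ID out of the fetched homepage HTML.
// Assumes the previous HTTP Request node returned the page body in `data`.
const html = $input.first().json.data || '';

const match = html.match(/"buildId"\s*:\s*"([^"]+)"/);
if (!match) {
  // Surface a clear error, mirroring the template's cookie-expiry handling.
  throw new Error('Could not find buildId in Skool homepage HTML');
}

return [{ json: { buildId: match[1] } }];
```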
by Hashir Bin Waseem
AI-powered Meeting Summaries and Action Items to Slack and ClickUp

## How it Works

1. **Webhook Trigger**: The workflow starts when Fireflies notifies that a transcription has finished.
2. **Transcript Retrieval**: The transcript is pulled from Fireflies based on the meeting ID.
3. **Pre-processing**: The transcript is split into sentences and then aggregated into a raw text block.
4. **AI Summarization**: The aggregated transcript is sent to Google Gemini, which generates a short summary and a structured list of action items.
5. **Post-processing**: The AI response is cleaned and formatted into JSON; action items are mapped to titles and descriptions (a sketch follows below).
6. **Distribution**: The meeting summary is posted to Slack, and action items are created as tasks in ClickUp.

## Use Case

This workflow is designed for teams that want to reduce the manual effort of writing meeting notes and extracting action items.

- Automatically generate a clear and concise meeting summary
- Share the summary instantly with your team on Slack
- Ensure action items are not lost by automatically creating tasks in ClickUp
- Ideal for distributed teams, project managers, and product teams managing recurring meetings

## Requirements

- **n8n instance** set up and running
- **Fireflies.ai account** with API access to meeting transcripts
- **Google Gemini API (via PaLM credentials)** for AI-powered summarization
- **Slack account** with OAuth2 credentials connected in n8n
- **ClickUp account** with OAuth2 credentials connected in n8n
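A minimal sketch of the post-processing step, assuming Gemini was prompted to return JSON with `summary` and `actionItems` fields (the exact schema and the `text` output field are assumptions; models often wrap JSON in code fences, which this strips):

```javascript
// n8n Code node: clean the Gemini response and map action items to
// title/description pairs ready for the ClickUp node.
const raw = $input.first().json.text || '';

// Strip markdown code fences the model may wrap around the JSON.
const cleaned = raw.replace(/`{3}(?:json)?/g, '').trim();
const parsed = JSON.parse(cleaned); // assumed shape: { summary, actionItems: [...] }

return parsed.actionItems.map((task) => ({
  json: {
    summary: parsed.summary,
    title: task.title,
    description: task.description || '',
  },
}));
```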
by David Olusola
🌟 Send Daily Motivational Quote to Slack

This workflow automatically posts an inspiring motivational quote to your Slack channel every morning at 8 AM. It uses the free ZenQuotes.io API (no API key required) to fetch quotes and delivers them to your team in Slack.

## ⚙️ How It Works

1. **Trigger at 8 AM**: A Cron node runs daily at 8 AM EST (America/New_York timezone by default).
2. **Fetch a Random Quote**: The workflow sends an HTTP Request to the ZenQuotes.io API to retrieve a motivational quote.
3. **Format the Message**: A Code node structures the quote into a Slack-friendly message, adding styling, emojis, and the author's name (see the sketch below).
4. **Post to Slack**: Finally, the Slack node sends the motivational message to your chosen Slack channel (default: #general).

## 🛠️ Setup Steps

1. **Connect Slack App**
   - Go to api.slack.com → Create a new app.
   - Add OAuth scopes: chat:write, channels:read
   - Install the app to your Slack workspace.
   - Copy the credentials into n8n.
2. **Configure Slack Channel**: Default is #general. Update the Slack node if you want to post to another channel.
3. **Adjust Timezone (Optional)**: The workflow is set to the America/New_York timezone. Change it under workflow → settings → timezone if needed.

## ✅ Example Slack Output

```
🌟 Daily Motivation 🌟
"Success is not final, failure is not fatal: it is the courage to continue that counts."
— Winston Churchill
```

⚡ Once enabled, your team will receive a motivational quote in Slack every morning at 8 AM: simple, automatic, and uplifting!
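A minimal sketch of the Format the Message Code node. ZenQuotes returns an array of objects with `q` (quote) and `a` (author) fields, which n8n's HTTP Request node splits into items:

```javascript
// n8n Code node: turn the ZenQuotes response into a Slack-ready message.
// ZenQuotes returns [{ q: "<quote>", a: "<author>", h: "<html>" }].
const quote = $input.first().json;

const message = [
  '🌟 *Daily Motivation* 🌟',
  '',
  `"${quote.q}"`,
  `— ${quote.a}`,
].join('\n');

return [{ json: { text: message } }];
```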
by ObisDev
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

LinkedIn Automation Outreach Workflow Documentation

## Inline Notes for Each Node

1. **On form submission** (Trigger Node - Manual Start)
   📝 Note: "Manual trigger to start the LinkedIn scraping and outreach process. This node initiates the workflow when you want to begin lead processing."
2. **Scrape profiles from a linkedin search** (HTTP Request/Browserflow Node)
   📝 Note: "Scrapes LinkedIn profiles based on search criteria (e.g., automation specialists in Lagos). Returns a JSON array with profile data including names, URLs, taglines, locations, and summaries. Uses the scrapeProfilesFromSearch.linkedinSearch() function."
3. **Split Out1** (Split Out Node)
   📝 Note: "Converts the JSON array of profiles into individual items for processing. Each profile becomes a separate execution path. Field to split: 'data'. This enables personalized message generation for each contact."
4. **Limit** (Limit Node)
   📝 Note: "Controls batch size for processing (currently set to 3 items). Prevents overwhelming the AI agent and helps with rate limiting. Adjust max items based on your subscription limits and testing needs."
5. **AI Agent** (LangChain AI Agent Node)
   📝 Note: "Generates personalized LinkedIn and email outreach messages using profile data. Uses the Groq Chat Model (llama3-8b-8192) for cost-effective text generation. Input: individual profile data. Output: structured JSON with personalized messages. The system prompt focuses on a networking approach rather than sales."
6. **Code1** (JavaScript Code Node)
   📝 Note: "Processes AI-generated messages and formats data for LinkedIn automation. Extracts the connection message and profile URL, and adds automation parameters. Includes error handling for malformed AI responses and random delay generation. Prepares a data structure compatible with Browserflow LinkedIn automation." (A sketch of this node follows below.)
7. **Send a linkedin message1** (Browserflow/HTTP Node)
   📝 Note: "Automates LinkedIn connection requests with personalized messages. Uses formatted data from the Code1 node, including the target URL and message content. Includes built-in delays and retry logic to avoid LinkedIn rate limiting. ⚠️ Currently shows an error - check the Browserflow configuration and credentials."

## Workflow Architecture Overview

- **Flow Type**: Sequential processing with batch control
- **Purpose**: Automated LinkedIn networking outreach for automation professionals
- **Target Audience**: Lagos-based automation specialists and similar professionals

## Detailed Workflow Description

🎯 LinkedIn Automation Outreach Workflow for Networking

This sophisticated n8n workflow automates the entire process of discovering, analyzing, and reaching out to potential networking contacts in the automation industry. It is designed specifically for automation agency owners and professionals looking to build meaningful connections within their local tech community.

### 🔄 Workflow Process

**Stage 1: Data Collection.** The workflow begins with a manual trigger that initiates a comprehensive LinkedIn profile scraping operation. Using advanced web scraping techniques, it searches for automation specialists, particularly focusing on the Lagos tech ecosystem. The scraping function targets professionals with expertise in tools like n8n, Make.com, AI automation, and workflow optimization.

**Stage 2: Data Processing & Segmentation.** Once the profile data is collected, the Split Out node transforms the bulk JSON response into individual processing items. This crucial step enables personalized treatment of each contact. The Limit node provides batch control, allowing you to test with smaller groups (currently 3 profiles) before scaling to larger datasets.

**Stage 3: AI-Powered Personalization.** The AI Agent is the workflow's intelligence core, utilizing Groq's LLaMA model for cost-effective, high-quality text generation. Each profile receives a customized analysis that identifies:

- Specific technical expertise and tools
- Geographic and industry connections
- Potential collaboration opportunities
- Shared professional interests

The AI generates both LinkedIn connection messages and email alternatives, ensuring multiple touchpoint options. Messages focus on genuine networking value rather than sales pitches, emphasizing knowledge sharing, collaboration opportunities, and community building.

**Stage 4: Message Optimization & Formatting.** The JavaScript Code node serves as the workflow's data orchestrator, transforming AI-generated content into automation-ready formats. It handles:

- Response validation and error recovery
- LinkedIn-specific message formatting
- Automation parameter injection (delays, retry logic)
- Fallback email preparation
- Metadata tracking for campaign analysis

**Stage 5: Automated Outreach Execution.** The final Browserflow integration automates the actual LinkedIn connection process. It navigates to each target profile, sends personalized connection requests, and implements intelligent delays to maintain LinkedIn compliance. Built-in error handling ensures workflow resilience even when individual requests fail.

### 🎖️ Key Features

- **Intelligent Batch Processing**: Controlled processing prevents rate limiting
- **Dual-Channel Approach**: LinkedIn + email backup ensures message delivery
- **Geographic Targeting**: Lagos-focused networking for local community building
- **AI-Driven Personalization**: Each message uniquely crafted based on profile analysis
- **Error Resilience**: Comprehensive error handling maintains workflow stability
- **Compliance-First Design**: Built-in delays and limits respect platform policies

### 🎯 Use Cases

- Building local automation professional networks
- Identifying potential collaboration partners
- Market research on automation service providers
- Community building for tech meetups and events
- Knowledge sharing network development

### ⚡ Technical Specifications

- **Model**: Groq LLaMA3-8B for cost-effective AI generation
- **Processing Capacity**: 3-item batches (scalable)
- **Message Types**: LinkedIn connections + email alternatives
- **Automation Platform**: Browserflow for LinkedIn interaction
- **Error Handling**: Multi-layer validation and recovery
- **Personalization Depth**: 3-5 specific talking points per contact

This workflow represents a sophisticated approach to professional networking automation, balancing efficiency with authentic relationship building. It's particularly valuable for automation professionals who understand the importance of genuine connections over mass outreach tactics.
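A condensed sketch of what the Code1 node's description implies, assuming the AI Agent returns its structured message as a JSON string in an `output` field (the field names, fallback message, and delay range are assumptions):

```javascript
// n8n Code node: validate the AI Agent's JSON, extract the connection
// message, and attach the parameters Browserflow needs.
return $input.all().map((item) => {
  let parsed;
  try {
    parsed = JSON.parse(item.json.output); // assumed AI Agent output field
  } catch (err) {
    // Fallback for malformed AI responses, as the node note describes.
    parsed = {
      connectionMessage: 'Hi, I would love to connect!',
      profileUrl: item.json.profileUrl,
    };
  }

  return {
    json: {
      profileUrl: parsed.profileUrl,
      message: (parsed.connectionMessage || '').slice(0, 300), // LinkedIn invite note limit
      delaySeconds: 30 + Math.floor(Math.random() * 90), // randomized anti-rate-limit delay
    },
  };
});
```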
by Yang
## Who is this for?

This workflow is perfect for content strategists, SEO specialists, marketing agencies, and virtual assistants who need to quickly audit and collect blog content from client websites into a structured Google Sheet without doing manual crawling and copy-pasting.

## What problem is this workflow solving?

Manually visiting a website, finding blog posts, and copying content into a spreadsheet is time-consuming and prone to errors. This workflow automates the process: it crawls a website, filters only blog-related pages, scrapes the article content, and stores everything neatly in Google Sheets for easy analysis and content strategy planning.

## What this workflow does

The workflow starts when a client submits their website URL through a form. A Google Sheet is automatically created and headers are added for organizing the audit. Dumpling AI then crawls the website to discover all available pages, while the automation filters out only blog-related URLs. Each blog page is scraped for content, and the structured results (URL, crawled page, and website content) are appended row by row into the Google Sheet.

## Nodes Overview

1. **Form Trigger - Form Submission (Client URL)**: Captures the client's website URL to start the workflow.
2. **Google Sheets - Create Blog Audit Sheet**: Creates a new Google Sheet with a title based on the submitted URL.
3. **Set - Set Sheet Headers**: Defines the headers: Url, Crawled_pages, website_content.
4. **Code - Format Header Row**: Formats the headers properly before sending them to the sheet.
5. **HTTP Request - Insert Headers into Sheet**: Updates the Google Sheet with the prepared header row.
6. **HTTP Request - Dumpling AI: Crawl Website**: Crawls the submitted URL to discover internal pages.
7. **Code - Extract Blog URLs**: Filters the crawl results and keeps only URLs that match common blog patterns (e.g., /blog/, /articles/, /posts/); see the sketch below.
8. **HTTP Request - Dumpling AI: Scrape Blog Pages**: Scrapes the text content from each filtered blog page.
9. **Set - Prepare Row Data**: Maps the URL, blog page link, and scraped content into structured fields.
10. **Google Sheets - Save Blog Data to Google Sheets**: Appends the structured data into the audit sheet row by row.

## 📝 Notes

- Set up Dumpling AI and generate your API key from: Dumpling AI
- Google Sheets must be connected with write permissions enabled.
- You can change the crawl depth or limit (currently set to 10 pages) in the Dumpling AI: Crawl Website node.
- The Extract Blog URLs node uses regex patterns to detect blog content. You can customize these patterns to match your website's URL structure.
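A minimal sketch of the Extract Blog URLs Code node, assuming the crawl step returns a list of page URLs in a `pages` field (the field name is an assumption; only the /blog/, /articles/, /posts/ patterns come from the listing):

```javascript
// n8n Code node: keep only URLs that look like blog content.
const BLOG_PATTERNS = [/\/blog\//i, /\/articles?\//i, /\/posts?\//i];

const pages = $input.first().json.pages || []; // assumed crawl-result field

return pages
  .filter((url) => BLOG_PATTERNS.some((re) => re.test(url)))
  .map((url) => ({ json: { url } }));
```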
by EmailListVerify
## How to scrape emails from websites

This workflow will:

1. Try to find emails by scraping the website via HTTP request (see the sketch below)
2. If no result is found, use the EmailListVerify email finder API to guess an email address

Scraping emails via HTTP request is a cost-effective way to find email addresses, so it can save you a few bucks to use it before calling any email finder API.

## Who it's for

This workflow will help you transform a list of websites into a list of leads with email addresses. This is a handy workflow for any lead generation specialist. Pay attention that this workflow will usually return only generic emails like "contact@". Those generic emails are useful when you target small businesses, since the owner usually monitors those inboxes. However, I don't advise using this workflow to target enterprise customers.

## Requirements

In order to use this workflow, you will need to:

1. Copy this Google Sheet template
2. Get an API key for EmailListVerify

You then need to edit the setup of the 3 stages highlighted with a yellow sticky note, and you will be good to go.
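The scraping step boils down to fetching the page HTML and pattern-matching addresses. A minimal n8n Code node sketch (the `data` input field assumes the HTTP Request node returned the raw page body):

```javascript
// n8n Code node: extract email addresses from raw page HTML.
const html = $input.first().json.data || '';

// Simple, deliberately permissive email pattern.
const EMAIL_RE = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;

const emails = [...new Set(html.match(EMAIL_RE) || [])]
  // Drop common false positives like image filenames.
  .filter((e) => !/\.(png|jpg|jpeg|gif|svg|webp)$/i.test(e));

return [{ json: { emails, found: emails.length > 0 } }];
```

When `found` is false, the workflow falls through to the EmailListVerify email finder call described above.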
by Eumentis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## What It Does

This workflow automatically discovers recently seed-funded startups by monitoring RSS feeds for funding announcements. It uses Bright Data to scrape full article content, then extracts structured company information using OpenAI (GPT). The data is exported to an Excel sheet on OneDrive, providing sales teams with a real-time list of qualified leads without any manual effort.

## How It Works

1. **Trigger & Article Discovery**: Monitors curated RSS feeds for articles mentioning seed funding and triggers the workflow on new article detection.
2. **Content Scraping & Preparation**: Scrapes full article content and converts it into clean markdown format for AI processing.
3. **Data Extraction with AI**: Uses OpenAI to extract structured details like company name, website, LinkedIn profile, founders, and funding amount.
4. **Structured Data Output & Storage**: Appends the extracted data to an Excel sheet on OneDrive via the Microsoft Graph API.

## Prerequisites

- **RSS Feed URL**: A valid RSS feed source that provides seed funding articles for startups.
- **Bright Data Credentials**: Active Bright Data account with access credentials (API token) to enable article scraping.
- **OpenAI API Key**: An OpenAI account with an API key and access to the gpt-4.1-mini model for data extraction.
- **Microsoft OAuth2 API Credentials**: OAuth2 credentials (Client ID, Secret, Tenant ID) with access scopes to use the Microsoft Graph API for Excel integration.
- **Excel Sheet in SharePoint**: A pre-created Excel file hosted on OneDrive or SharePoint with the following column headers: createdAt, companyName, companyWebsite, companyLinkedIn, fundingAmount, founderName, founderLinkedIn, articleLink
- **Excel File & Sheet Identifiers**: The Drive ID, File ID, and Sheet ID of your Excel sheet stored on OneDrive or SharePoint, required by the Microsoft Graph API for appending rows using the HTTP node in n8n.

Need help with the setup? Feel free to contact us.

## How to Set It Up

Follow these steps to configure and run the workflow:

1. **Import the Workflow**: Copy the provided n8n workflow template. In your n8n instance, go to the Editor UI and paste this workflow.
2. **Configure the RSS Feed Node**: Open the RSS trigger node, replace the default URL with your RSS feed URL, and ensure the polling interval matches your desired frequency (e.g., every 15 minutes or 1 hour).
3. **Set Up the Bright Data Node**: Add your Bright Data credentials. Follow the documentation to complete the setup.
4. **Configure the OpenAI Integration**: Add your OpenAI API key as a credential in n8n and ensure the model is set to gpt-4.1-mini. Follow the documentation to complete the setup.
5. **Configure Excel File Integration**: Open the HTTP node responsible for sending data to the Excel sheet via the Microsoft Graph API. Replace the placeholder values in the API endpoint URL with your actual Drive ID, File ID, and Sheet ID from the Excel file stored on OneDrive or SharePoint:

   `https://graph.microsoft.com/v1.0/drives/{{drive-id}}/items/{{file-id}}/workbook/tables/{{sheet-id}}/rows`

   This URL is used to append rows to the specified Excel table (a sketch of the request body follows below). Next, set up Microsoft OAuth2 credentials in n8n: go to n8n > Credentials > Microsoft OAuth2 API and provide the required values (Client ID, Client Secret, Tenant ID, Scope). Follow the documentation to complete the setup. Once the credential is saved, connect it to the HTTP node making the Graph API call.
6. **Activate the Workflow**: Set the workflow status to Active in n8n so it runs automatically when a new article appears in the RSS feed.
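For reference, the Graph API's add-rows endpoint expects a JSON body whose `values` array holds one inner array per row, in column order. A small Code node sketch that shapes the extracted data into that body (the input field names mirror the column headers listed above and are assumptions about the extraction node's output):

```javascript
// n8n Code node: shape the extracted company data into the body expected
// by POST .../workbook/tables/{sheet-id}/rows on the Microsoft Graph API.
const lead = $input.first().json;

return [{
  json: {
    values: [[
      new Date().toISOString(), // createdAt
      lead.companyName,
      lead.companyWebsite,
      lead.companyLinkedIn,
      lead.fundingAmount,
      lead.founderName,
      lead.founderLinkedIn,
      lead.articleLink,
    ]],
  },
}];
```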
Need Help? Contact us for support and custom workflow development.
by n8n Team
This workflow sends a file to a Notion database of your choosing when a new file is created in a specific Google Drive folder.

## Prerequisites

- Notion account and Notion credentials.
- Google account and Google credentials.
- Google Drive folder to monitor for new files.

## How it works

When a Google Drive file is created in the folder you specified, the workflow sends the file to the Notion database you created. The workflow uses the On file upload node to trigger the workflow when a new file is created in the folder. The Create database page node creates a new page in the Notion database you created.

## Setup

1. Create a Notion database called "My Google Drive Files" with the following columns: Filename, Google Drive File.
2. Share the database to n8n.
3. In the n8n workflow, click on the Create database page node and select the database you created in step 1.
4. In Google Drive, create a folder and navigate to it.
5. Copy the URL of the Google Drive folder you are currently in.
6. In the n8n workflow, add the folder URL to the On file upload node.