by Oneclick AI Squad
This n8n template demonstrates how to create a comprehensive voice-powered restaurant assistant that handles table reservations, food orders, and restaurant information requests through natural language processing. The system uses VAPI for voice interaction and PostgreSQL for data management, making it perfect for restaurants looking to automate customer service with voice AI technology.

**Good to know**

- Voice processing requires an active VAPI subscription with per-minute billing
- Database operations are handled in real-time with immediate confirmations
- The system can handle multiple simultaneous voice requests
- All customer data is stored securely in PostgreSQL with proper indexing

**How it works**

*Table Booking & Order Handling Workflow*

- Voice requests are captured through VAPI triggers when customers make booking or ordering requests
- The system processes natural language commands and extracts relevant details (party size, time, food items)
- Customer data is immediately saved to the bookings and orders tables in PostgreSQL
- Voice confirmations are sent back through VAPI with booking details and estimated wait times
- All transactions are logged with timestamps for restaurant management tracking

*Restaurant Info Provider Workflow*

- Info requests trigger when customers ask about hours, menu, location, or services
- Restaurant details are retrieved from the restaurant_info table containing current information
- Wait nodes ensure proper data loading before voice response generation
- Structured restaurant information is delivered via VAPI in a natural, conversational format

**Database Schema**

*Bookings Table* (a code sketch of this record structure appears at the end of this listing)

- booking_id (PRIMARY KEY) - Unique identifier for each reservation
- customer_name - Customer's full name
- phone_number - Contact number for confirmation
- party_size - Number of guests
- booking_date - Requested reservation date
- booking_time - Requested time slot
- special_requests - Dietary restrictions or special occasions
- status - Booking status (confirmed, pending, cancelled)
- created_at - Timestamp of booking creation

*Orders Table*

- order_id (PRIMARY KEY) - Unique order identifier
- customer_name - Customer's name
- phone_number - Contact for order updates
- order_items - JSON array of food items and quantities
- total_amount - Calculated order total
- order_type - Delivery, pickup, or dine-in
- special_instructions - Cooking preferences or allergies
- status - Order status (received, preparing, ready, delivered)
- created_at - Order timestamp

*Restaurant_Info Table*

- info_id (PRIMARY KEY) - Information entry identifier
- category - Type of info (hours, menu, location, contact)
- title - Information title
- description - Detailed information content
- is_active - Whether info is currently valid
- updated_at - Last modification timestamp

**How to use**

- The manual trigger can be replaced with webhook triggers for integration with existing restaurant systems
- Import the workflow into your n8n instance and configure VAPI credentials
- Set up the PostgreSQL database with the required tables using the schema provided above
- Configure restaurant information in the restaurant_info table
- Test voice commands such as "Book a table for 4 people at 7 PM" or "What are your opening hours?"
- Customize voice responses in VAPI nodes to match your restaurant's tone and branding
- The system can handle multiple concurrent voice requests and scales with your restaurant's needs

**Requirements**

- VAPI account for voice processing and natural language understanding
- PostgreSQL database for storing booking, order, and restaurant information
- n8n instance with database and VAPI integrations enabled

**Customising this workflow**

- Voice AI automation can be adapted for various restaurant types - from quick service to fine dining establishments
- Try popular use cases such as multi-location booking management, dietary restriction handling, or integration with existing POS systems
- The workflow can be extended to include payment processing, SMS notifications, and third-party delivery platform integration
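For reference, here is a minimal TypeScript sketch (not part of the original template) of how a parsed booking could be represented before it is written to the bookings table. The field names mirror the schema above; the confirmation helper and sample values are illustrative assumptions.

```typescript
// Minimal sketch: shape of a booking record mirroring the bookings table above.
// The buildConfirmation helper and the sample values are illustrative
// assumptions, not part of the original template.

interface Booking {
  customer_name: string;
  phone_number: string;
  party_size: number;
  booking_date: string;      // e.g. "2024-06-01"
  booking_time: string;      // e.g. "19:00"
  special_requests?: string;
  status: "confirmed" | "pending" | "cancelled";
}

// Build the voice confirmation text sent back through VAPI.
function buildConfirmation(b: Booking): string {
  return `Thanks ${b.customer_name}, your table for ${b.party_size} ` +
         `on ${b.booking_date} at ${b.booking_time} is ${b.status}.`;
}

// Example usage with data extracted from a voice request.
const booking: Booking = {
  customer_name: "Jane Doe",
  phone_number: "+1 555 0100",
  party_size: 4,
  booking_date: "2024-06-01",
  booking_time: "19:00",
  status: "confirmed",
};

console.log(buildConfirmation(booking));
```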
by Yaron Been
📰 **AI News Digest Agent: Auto News Summarizer & Email Newsletter**

Create an intelligent news curation system that automatically fetches breaking headlines, generates AI-powered summaries, and delivers personalized news digests to your subscriber list. Perfect for newsletter creators, team leaders, and content curators who want to keep their audience informed without the manual effort of news monitoring and summarization.

🔄 **How It Works**

This streamlined 5-step automation delivers fresh news insights around the clock (a minimal code sketch of the news fetch and summarization calls appears at the end of this listing):

- Step 1: Automated News Collection - The workflow runs on a configurable schedule (default: every 10 minutes) to fetch the latest headlines from NewsAPI, ensuring your content stays current with breaking developments.
- Step 2: Intelligent Content Curation - The system pulls top headlines from reliable news sources, filtering by country, category, and relevance to deliver the most important stories of the day.
- Step 3: AI-Powered Summarization - GPT-4 processes the collected headlines and creates:
  - Concise 5-bullet point summaries
  - Key insights and implications
  - Easy-to-digest news overviews
  - Professional formatting for email distribution
- Step 4: Subscriber Management - The workflow accesses your Google Sheets subscriber list, retrieving names and email addresses for personalized delivery.
- Step 5: Automated Email Distribution - Personalized news digests are automatically sent to each subscriber via Gmail, with custom greetings and professionally formatted content.

⚙️ **Setup Steps**

Prerequisites:

- NewsAPI account (free tier available)
- OpenAI API access for content summarization
- Google Sheets for subscriber management
- Gmail account for email distribution
- n8n instance (cloud or self-hosted)

Required Google Sheets Structure - Create a simple subscriber database:

| Name | Email |
|---------------|--------------------------|
| John Smith | john@example.com |
| Sarah Johnson | sarah@company.com |
| Mike Chen | mike.chen@startup.co |

Configuration Steps:

- Credential Setup
  - NewsAPI Key: Sign up at newsapi.org for free headline access
  - OpenAI API Key: Required for AI-powered news summarization
  - Google Sheets OAuth2: Access your subscriber spreadsheet
  - Gmail OAuth2: Enable automated email sending
- News Source Configuration
  - Country Selection: Choose target region (US, UK, CA, AU, etc.)
  - Category Filters: Focus on specific topics (technology, business, health)
  - Source Selection: Prefer certain news outlets or avoid others
  - Language Settings: Configure for international audiences
- AI Summarization Customization - The default prompt creates 5-bullet summaries, but can be tailored for:
  - Industry Focus: Technology, finance, healthcare, politics
  - Audience Type: General public, professionals, executives
  - Content Depth: Brief overviews vs. detailed analysis
  - Tone & Style: Formal, conversational, or technical
- Email Template Personalization
  - Subject Line Formatting: Include date, breaking news indicators
  - Greeting Customization: Use subscriber names for a personal touch
  - Content Layout: Professional formatting with clear sections
  - Branding Elements: Add your organization's signature or logo
- Delivery Schedule Optimization
  - Frequency Settings: Every 10 minutes, hourly, or daily
  - Time Zone Considerations: Optimize for subscriber locations
  - Breaking News Alerts: Immediate delivery for urgent stories
  - Digest Compilation: Collect multiple stories for periodic summaries

🚀 **Use Cases**

Newsletter Publishers
- Content Automation: Generate newsletter content without manual curation
- Consistent Publishing: Maintain regular delivery schedules automatically
- Audience Growth: Provide value that encourages subscriptions and shares
- Time Savings: Eliminate hours of daily news monitoring and writing

Corporate Communications
- Employee Updates: Keep teams informed about industry developments
- Executive Briefings: Deliver curated news summaries to leadership
- Client Communications: Share relevant industry insights with customers
- Stakeholder Relations: Maintain informed investor and partner networks

Educational Institutions
- Student Resources: Provide current events for academic discussions
- Faculty Updates: Keep educators informed about relevant developments
- Research Support: Deliver news related to specific academic fields
- Parent Communications: Share educational policy and school-related news

Professional Services
- Client Value Addition: Provide industry-specific news as a service benefit
- Thought Leadership: Position your firm as an informed industry expert
- Business Development: Share insights that demonstrate market knowledge
- Team Knowledge Sharing: Keep the entire organization current on industry trends

Community Organizations
- Member Engagement: Keep community members informed and engaged
- Local News Focus: Customize for regional or local news coverage
- Event Planning: Stay informed about developments affecting your community
- Advocacy Support: Monitor news relevant to your organization's mission

🔧 **Advanced Customization Options**

Multi-Source News Aggregation - Expand beyond NewsAPI with additional sources:
- RSS Feed Integration: Add specialized industry publications
- Social Media Monitoring: Include trending topics from Twitter/LinkedIn
- Government Sources: Official announcements and policy updates
- International Coverage: Global perspectives on major stories

Intelligent Content Filtering - Implement smart curation features:
- Sentiment Analysis: Filter positive, negative, or neutral news
- Relevance Scoring: Prioritize stories based on subscriber interests
- Duplicate Detection: Avoid sending repetitive story coverage
- Quality Assessment: Ensure content meets editorial standards

Subscriber Segmentation - Create targeted news experiences:
- Interest Categories: Technology, business, sports, entertainment
- Geographic Preferences: Local, national, or international focus
- Delivery Preferences: Frequency and format customization
- Engagement Tracking: Monitor opens, clicks, and subscriber behavior

Enhanced Email Features - Professional newsletter capabilities:
- HTML Templates: Rich formatting with images and links
- Call-to-Action Buttons: Drive engagement with your content or services
- Social Sharing: Enable easy sharing of newsletter content
- Analytics Integration: Track email performance and subscriber engagement

📊 **Content Generation Examples**

Sample Email Output:

Subject: 📰 Your Daily News Digest - March 15, 2024

Hi John,

Please find today's top news headlines summarized below:

📈 BUSINESS & TECHNOLOGY
- Federal Reserve signals potential rate cuts following inflation data
- Major tech companies announce AI partnership for healthcare applications
- Renewable energy sector sees record investment levels in Q1 2024
- Cryptocurrency markets stabilize after regulatory clarity announcement
- Supply chain disruptions ease as global shipping routes normalize

💡 These developments suggest growing economic optimism and continued technology sector innovation. The healthcare AI partnership particularly signals significant advances in medical technology accessibility.

Stay informed and have a great day!

Powered by AI News Digest Agent
Unsubscribe | Update Preferences

Breaking News Alert Format:

Subject: 🚨 Breaking News Alert - Major Development

Hi Sarah,

BREAKING: [Headline]

Key Details:
- [Critical point 1]
- [Critical point 2]
- [Impact analysis]

Full coverage in your next scheduled digest.

AI News Digest Agent

🛠️ **Troubleshooting & Best Practices**

Common Issues & Solutions

API Rate Limiting
- Monitor NewsAPI quota usage and upgrade your plan if needed
- Implement intelligent caching to reduce redundant requests
- Stagger requests during high-traffic periods
- Set up alerts for approaching rate limits

Email Delivery Challenges
- Monitor Gmail sending limits and implement delays if needed
- Use professional email authentication (SPF, DKIM)
- Maintain clean subscriber lists to avoid spam flags
- Implement unsubscribe functionality for compliance

Content Quality Control
- Review AI summaries periodically for accuracy and bias
- Implement feedback loops for continuous prompt improvement
- Create editorial guidelines for consistent tone and style
- Monitor subscriber feedback and engagement metrics

Optimization Strategies

Performance Enhancement
- Use parallel processing for multiple news sources
- Implement intelligent caching for repeated content
- Optimize AI prompts for faster processing and better results
- Monitor workflow execution time and resource usage

Subscriber Growth
- Create compelling value propositions for newsletter signups
- Implement referral systems for organic growth
- Share sample newsletters on social media and websites
- Collect feedback to continuously improve content quality

Content Strategy
- A/B test different summary formats and lengths
- Analyze which news categories generate the most engagement
- Experiment with sending times for optimal open rates
- Create themed newsletters for special events or topics

📈 **Success Metrics**

Engagement Indicators
- Open Rates: Percentage of subscribers reading newsletters
- Click-Through Rates: Engagement with linked news sources
- Subscriber Growth: New signups and retention rates
- Forward/Share Rates: Viral coefficient of your content

Content Quality Measurements
- Relevance Scores: Subscriber feedback on content usefulness
- Timeliness: How quickly breaking news reaches subscribers
- Accuracy: Verification of AI-summarized content
- Completeness: Coverage of important stories in your focus areas

📞 **Questions & Support**

Need assistance with your AI News Digest Agent setup or optimization?

📧 Technical Support
- Email: Yaron@nofluff.online
- Response Time: Within 24 hours on business days
- Specialization: NewsAPI integration, AI content optimization, email deliverability

🎥 Educational Resources
- YouTube Channel: https://www.youtube.com/@YaronBeen/videos
- Complete setup and configuration tutorials
- Advanced customization techniques for different industries
- Email marketing best practices for automated newsletters
- Troubleshooting common integration issues
- Scaling strategies for growing subscriber lists

🤝 Professional Community
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Connect for ongoing newsletter automation support
- Share your news curation success stories
- Access exclusive templates and workflow variations
- Join discussions about content automation trends

💬 Support Request Best Practices

Include in your support message:
- Your target audience and newsletter focus
- Current subscriber count and growth goals
- Specific news categories or geographic regions of interest
- Any technical errors or integration challenges
- Current content creation workflow and pain points
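As referenced in the How It Works section above, here is a minimal TypeScript sketch of the two external calls the workflow makes: fetching top headlines from NewsAPI and asking the OpenAI API for a 5-bullet summary. The endpoint paths follow the public NewsAPI and OpenAI documentation; the model identifier, prompt wording, and environment variable names are illustrative assumptions, not the template's exact configuration.

```typescript
// Sketch of the News Digest data flow: fetch headlines, then summarize.
// API keys are read from environment variables (names are assumptions).

interface Article { title: string; description: string | null; url: string; }

async function fetchTopHeadlines(country = "us"): Promise<Article[]> {
  const res = await fetch(
    `https://newsapi.org/v2/top-headlines?country=${country}&pageSize=10&apiKey=${process.env.NEWSAPI_KEY}`
  );
  const body = await res.json();
  return body.articles as Article[];
}

async function summarizeHeadlines(articles: Article[]): Promise<string> {
  const headlines = articles.map((a) => `- ${a.title}`).join("\n");
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4",
      messages: [
        {
          role: "system",
          content:
            "Summarize today's headlines in 5 concise bullet points, followed by one sentence on key implications.",
        },
        { role: "user", content: headlines },
      ],
    }),
  });
  const body = await res.json();
  return body.choices[0].message.content;
}

// Usage: build the digest body that would be emailed to each subscriber.
fetchTopHeadlines("us")
  .then(summarizeHeadlines)
  .then((digest) => console.log(digest));
```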
by LukaszB
This workflow is designed for freelancers, solopreneurs, and business owners who receive a high volume of irrelevant messages in their Gmail inbox — from cold offers to spammy promotions — and want to automatically filter and delete them using AI. Its main purpose is to scan new emails with the help of OpenAI, classify their content, and automatically delete those considered marketing (OFFER) or junk (SPAM). The result is a cleaner inbox without the need to manually sift through low-value messages.

The classification logic uses a detailed system prompt with practical examples, so even complex or borderline messages are categorized accurately. Important emails — such as payment confirmations, shipping updates, or genuine business inquiries — remain untouched. This helps maintain a professional inbox with only valuable and relevant communication. The entire process runs automatically in the background and can be customized further — for example, to archive instead of delete, or to log deleted emails for review.

**How it works**

- When triggered (every hour), the workflow fetches new Gmail messages using the Gmail Trigger node.
- Each message is passed to an AI classifier powered by OpenAI, which reads the message body (email snippet) and returns one of three labels:
  - SPAM: Obvious junk messages, scams, or low-effort bulk messages
  - OFFER: Cold outreach, discount promotions, cart reminders, or generic advertising
  - IMPORTANT: Valuable information for the user, even if commercial (e.g., invoices, order updates, personal inquiries)
- The workflow then routes the result through an IF node. If the message is marked as SPAM or OFFER, it is immediately deleted from Gmail via the Gmail Delete node. Emails marked as IMPORTANT are ignored and remain in the inbox.
- The classification is entirely AI-driven based on message content — sender address, headers, or metadata are not used.

**How to set up**

To get started, simply connect two credentials:

- A Gmail account using OAuth2 (via the Gmail Trigger and Gmail Delete nodes)
- An OpenAI API key (used by the AI classifier node)

No advanced setup is needed beyond these two connections. Optionally, you can review or modify the system prompt used for classification — it's available inside the workflow's LangChain AI Agent node. The prompt is in English, so it's recommended to use this workflow with English-language emails for best results.

By default, the workflow deletes matching emails immediately. If you prefer safer testing, you can modify the Gmail node to archive, label, or log emails instead of deleting them. The full workflow takes around 5–10 minutes to configure and includes a sticky note with additional instructions and warnings.
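To illustrate the classification step, here is a minimal TypeScript sketch of how an email snippet could be sent to the OpenAI API and routed based on the returned label. The short system prompt and model name are illustrative stand-ins for the template's detailed prompt and configured model.

```typescript
// Sketch: classify an email snippet as SPAM, OFFER, or IMPORTANT,
// then decide whether it should be deleted. The short system prompt and
// "gpt-4o-mini" model are stand-ins for the template's actual configuration.

type Label = "SPAM" | "OFFER" | "IMPORTANT";

async function classifyEmail(snippet: string): Promise<Label> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "Classify the email as exactly one of: SPAM, OFFER, IMPORTANT. Reply with the label only.",
        },
        { role: "user", content: snippet },
      ],
    }),
  });
  const body = await res.json();
  return body.choices[0].message.content.trim() as Label;
}

// Mirrors the IF node: only SPAM and OFFER are deleted.
function shouldDelete(label: Label): boolean {
  return label === "SPAM" || label === "OFFER";
}

classifyEmail("Last chance! 50% off everything in our store today only.")
  .then((label) => console.log(label, "-> delete:", shouldDelete(label)));
```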
by Marcelo Abreu
**Who is this workflow for?**

If you're using Meta Ads to generate new leads for your sales pipeline, this workflow is for you! 🙌🏻

**What this workflow does**

- Triggers every time you have a new calendar event on a chosen Google Account
- Filters only events with the same name as your "Schedule a demo" event
- Formats and sends the event to the Meta Conversion API (see the payload sketch below)

**What events can I send?**

Any event you'd like! It's preconfigured with the "Schedule" event, but you can change it to "Purchase", "InitiateCheckout", "Lead", or custom events.

**Setup Guide**

- Connect Google OAuth2 to n8n
- Get your Pixel ID and Access Token from Meta
- Set your configuration node with Pixel ID, Access Token, source_url, and event_name

**Requirements**

- Meta Access Token + Pixel ID (via Meta Conversion API): Documentation
- Google Access (via OAuth2): Documentation

This free template was created by pdforge. Feel free to contact us via the founder's LinkedIn if you have any questions! 👋🏻
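For reference, here is a minimal TypeScript sketch of the request the final step sends to the Meta Conversions API. The payload shape follows Meta's public Conversions API documentation; the Graph API version, placeholder credentials, and hashing helper are illustrative assumptions (Meta requires user identifiers such as email to be normalized and SHA-256 hashed).

```typescript
import { createHash } from "node:crypto";

// Sketch of a "Schedule" event sent to the Meta Conversions API.
// Pixel ID, access token, and the Graph API version are placeholders.

const PIXEL_ID = "YOUR_PIXEL_ID";
const ACCESS_TOKEN = "YOUR_ACCESS_TOKEN";

// Meta expects user identifiers (e.g. email) to be normalized and SHA-256 hashed.
const hash = (value: string) =>
  createHash("sha256").update(value.trim().toLowerCase()).digest("hex");

async function sendScheduleEvent(attendeeEmail: string, sourceUrl: string) {
  const payload = {
    data: [
      {
        event_name: "Schedule",
        event_time: Math.floor(Date.now() / 1000),
        action_source: "website",
        event_source_url: sourceUrl,
        user_data: { em: [hash(attendeeEmail)] },
      },
    ],
  };

  const res = await fetch(
    `https://graph.facebook.com/v19.0/${PIXEL_ID}/events?access_token=${ACCESS_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    }
  );
  console.log(await res.json());
}

sendScheduleEvent("lead@example.com", "https://example.com/demo");
```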
by Audun
**Send structured logs to BetterStack from any workflow using HTTP Request**

**Who is this for?**

This workflow is perfect for automation builders, developers, and DevOps teams using n8n who want to send structured log messages to BetterStack Logs. Whether you're monitoring mission-critical workflows or simply want centralized visibility into process execution, this reusable log template makes integration easy.

**What problem is this workflow solving?**

Logging failures or events across multiple workflows typically requires duplicated logic. This workflow solves that by acting as a shared log sender, letting you forward consistent log entries from any other workflow using the Execute Workflow node.

**What this workflow does**

- Accepts level (e.g., "info", "warn", "error") and message fields via the Execute Workflow Trigger
- Sends the structured log to your BetterStack ingestion endpoint via HTTP Request
- Uses HTTP Header Auth for secure delivery
- Includes a manual trigger for testing and a sample call to demonstrate usage
- Comes with clear sticky notes to help you get started

**Setup**

1. Copy your BetterStack Logs ingestion URL.
2. Create a Header Auth credential in n8n with your Authorization: Bearer YOUR_API_KEY.
3. Replace the URL in the HTTP Request node with your BetterStack endpoint.
4. Optionally modify the test data or log levels for custom scenarios.
5. Use Execute Workflow in any of your workflows to send logs here.
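Here is a minimal TypeScript sketch of the HTTP call the shared log sender performs. The ingestion URL shown is a placeholder taken from your own BetterStack source settings, and the extra source field is an illustrative addition rather than part of the original template.

```typescript
// Sketch of the log delivery step: POST a structured entry to BetterStack Logs.
// The ingestion URL and token are placeholders from your BetterStack source.

const INGEST_URL = "https://in.logs.betterstack.com"; // replace with your ingestion URL
const API_KEY = process.env.BETTERSTACK_TOKEN;

async function sendLog(level: "info" | "warn" | "error", message: string) {
  await fetch(INGEST_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      dt: new Date().toISOString(), // timestamp
      level,
      message,
      source: "n8n", // illustrative extra context field
    }),
  });
}

// Example call, equivalent to invoking the sub-workflow with level + message.
sendLog("error", "Order sync workflow failed at the payment step");
```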
by Gerald Denor
**AI-Powered Proposal Generator - Sales Automation Workflow**

**Overview**

This n8n workflow automates the entire proposal generation process using AI, transforming client requirements into professional, customized proposals delivered via email in seconds.

**Use Case**

Perfect for agencies, consultants, and sales teams who need to generate high-quality proposals quickly. Instead of spending hours writing proposals manually, this workflow captures client information through a web form and uses GPT-4 to generate contextually relevant, professional proposals.

**How It Works**

1. Form Trigger - Captures client information through a customizable web form
2. OpenAI Integration - Processes form data and generates structured proposal content
3. Google Drive - Creates a copy of your proposal template
4. Google Slides - Populates the template with AI-generated content
5. Gmail - Automatically sends the completed proposal to the client

**Key Features**

- **AI Content Generation**: Uses GPT-4 to create personalized proposal content
- **Professional Templates**: Integrates with Google Slides for polished presentations
- **Automated Delivery**: Sends proposals directly to clients via email
- **Form Integration**: Captures all necessary client data through web forms
- **Customizable Output**: Generates structured proposals with multiple sections

**Template Sections Generated**

- Proposal title and description
- Problem summary analysis
- Three-part solution breakdown
- Project scope details
- Milestone timeline with dates
- Cost integration

**Requirements**

- **n8n instance** (cloud or self-hosted)
- **OpenAI API key** for content generation
- **Google Workspace account** for Slides and Gmail
- **Basic n8n knowledge** for setup and customization

**Setup Complexity**

Intermediate - Requires API credentials setup and basic workflow customization

**Benefits**

- **Time Savings**: Reduces proposal creation from hours to minutes
- **Consistency**: Ensures all proposals follow the same professional structure
- **Personalization**: AI analyzes client needs for relevant content
- **Automation**: Eliminates manual copy-paste and formatting work
- **Scalability**: Handle multiple proposal requests simultaneously

**Customization Options**

- Modify AI prompts for different industries or services
- Customize the Google Slides template design
- Adjust form fields for specific information needs
- Personalize email templates and signatures
- Configure milestone templates for different project types

**Error Handling**

Includes basic error handling for API failures and form validation to ensure reliable operation.

**Security Notes**

All credentials have been removed from this template. Users must configure their own:

- OpenAI API credentials
- Google OAuth2 connections for Slides, Drive, and Gmail
- Form webhook configuration

This workflow demonstrates practical AI integration in business processes and showcases n8n's capabilities for complex automation scenarios.
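As an illustration of the "structured proposal content" the OpenAI step produces for the Slides template, here is a hedged TypeScript sketch of one possible shape, with fields mirroring the "Template Sections Generated" list above. The exact field names used by the original template are not published here, so treat these as hypothetical placeholders.

```typescript
// Illustrative shape for AI-generated proposal content that gets mapped
// onto Google Slides placeholders. Field names are assumptions that mirror
// the "Template Sections Generated" list, not the template's exact keys.

interface Milestone {
  name: string;
  dueDate: string; // e.g. "2024-07-01"
}

interface ProposalContent {
  title: string;
  description: string;
  problemSummary: string;
  solutionParts: [string, string, string]; // three-part solution breakdown
  projectScope: string;
  milestones: Milestone[];
  totalCost: string;
}

// A prompt can ask GPT-4 to return JSON matching this shape, which is then
// easy to map onto slide placeholders one field at a time.
const example: ProposalContent = {
  title: "Website Redesign Proposal",
  description: "A modern, conversion-focused rebuild of the marketing site.",
  problemSummary: "The current site is slow and does not convert mobile traffic.",
  solutionParts: ["Discovery & UX audit", "Design system & rebuild", "Launch & optimization"],
  projectScope: "Up to 12 page templates, CMS migration, analytics setup.",
  milestones: [
    { name: "Kickoff", dueDate: "2024-07-01" },
    { name: "Design approval", dueDate: "2024-07-22" },
    { name: "Launch", dueDate: "2024-08-30" },
  ],
  totalCost: "$18,500",
};

console.log(JSON.stringify(example, null, 2));
```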
by Obsidi8n
I am submitting this workflow for the Obsidian community to showcase the potential of integrating Obsidian with n8n. While straightforward, it serves as a compelling demonstration of what that integration unlocks.

**How it works**

This workflow lets you retrieve specific Airtable data you need in seconds, directly within your Obsidian note, using n8n. By highlighting a question in Obsidian and sending it to a webhook via the Post Webhook Plugin, you can fetch specific data from your Airtable base and instantly insert the response back into your note. The workflow leverages OpenAI's GPT model to interpret your query, extract relevant data from Airtable, and format the result for seamless integration into your note.

**Set up steps**

1. Install the Post Webhook Plugin: Add this plugin to your Obsidian vault from the plugin store or GitHub.
2. Set up the n8n Webhook: Copy the webhook URL generated in this workflow and insert it into the Post Webhook Plugin's settings in Obsidian.
3. Configure Airtable Access: Link your Airtable account and specify the desired base and table to pull data from.
4. Test the Workflow: Highlight a question in your Obsidian note, use the "Send Selection to Webhook" command, and verify that data is returned as expected.
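If you want to test the n8n side without Obsidian, here is a minimal TypeScript sketch that plays the role of the Post Webhook Plugin: it posts a "highlighted question" to the webhook and prints whatever text comes back. The URL is a placeholder, and sending the selection as a plain-text body is an assumption about the plugin's payload format; check the plugin's settings and documentation for the exact shape it sends.

```typescript
// Stand-in for the Post Webhook Plugin: send a highlighted question to the
// n8n webhook and print the response that would be inserted into the note.
// The URL is a placeholder, and sending the selection as plain text is an
// assumption about the plugin's payload format.

const WEBHOOK_URL = "https://your-n8n-instance/webhook/obsidian-airtable";

async function askAirtable(question: string): Promise<string> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "text/plain" },
    body: question,
  });
  return res.text();
}

askAirtable("What is the current status of the Q3 marketing project?")
  .then((answer) => console.log(answer));
```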
by Khaled
🌐 **Web Server Monitor & Alert System**

This automation pings web servers at regular intervals, logs their status, and sends email alerts if a server goes down. It's perfect for maintaining visibility over server uptime — without complex monitoring tools.

🧠 **How It Works**

This workflow performs minute-by-minute checks on all listed servers in a Google Sheet and:

- ✅ Logs all reachable servers in an "Alive" log.
- 🔻 Sends an email alert if a server is unreachable.
- 📄 Logs failed servers in a "Down" sheet with timestamps.

🧩 **Key Components**

⏰ 1. Schedule Trigger
Runs the workflow every minute for real-time monitoring.

📄 2. Web Servers List (Google Sheets)
Pulls server IPs or hostnames from a Google Sheet named Server_List. Each row = one server to monitor. This makes adding/removing servers effortless — just update the sheet.

🌐 3. Servers Alive Check (HTTP Request)
Performs an HTTP GET request to each server (e.g., http://your-server.com). If the request fails, it automatically triggers the error path (handled via continueOnFail). A minimal code sketch of this check appears at the end of this listing.

✅ 4. Web Server Alive Log (Google Sheets)
Records successful pings in Server_Status_Alive with:
- Timestamp
- Server IP
- Status = Alive
This log can be used for uptime reports or audits.

📧 5. Server Down Notification (Gmail)
If a server fails, this node sends an email to the admin. It includes:
- Server address
- Timestamp
- Suggested action

📄 6. Web Server Down Log (Google Sheets)
Logs failed pings in a separate sheet for historical tracking and debugging.

✅ **Main Advantages**

- Live Server Monitoring: Stay informed about server health in near real-time.
- No-Code Configuration: Add/remove servers from the Google Sheet — no need to touch the workflow.
- Email Alerts on Failure: Proactively notifies you before users report the issue.
- Audit-Ready Logging: Maintains logs for both healthy and failed checks for documentation or reporting.
- Flexible & Scalable: Monitor 1 or 100 servers with the same template — just scale the list.

⚙️ **Setup Steps**

🔑 Prerequisites
- Google Sheet with server list (column name = "Server")
- Gmail OAuth2 connection for alerts
- n8n instance running regularly

🛠 Configuration
- Google Sheets
  - Sheet 1 (Server_List): Your list of servers.
  - Sheet 2 (Server_Status_Alive): Log for reachable servers.
  - Sheet 3 (Server_Status_Down): Log for unreachable servers.
- Gmail Integration
  - Connect your Gmail account in the Server Down Notification node.
  - Edit the recipient email and message content as needed.
- HTTP Check
  - Adjust the HTTP request URL template if using port numbers or paths (e.g., http://{{Server}}:8080/status).
- Schedule
  - Default is every 1 minute. Change via the Schedule Trigger if needed.

🧪 **Testing**

1. Input a reachable server (e.g., example.com) and an unreachable IP.
2. Run the workflow manually or wait for the next scheduled run.
3. Check that:
   - The Alive log updates correctly.
   - The Down log records failures.
   - The email alert is received.

🚀 **Deployment**

Activate the workflow, and it will quietly run in the background, notifying you of any server downtime instantly while keeping logs for future review.
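As referenced in the Servers Alive Check component above, here is a minimal TypeScript sketch of the alive check itself, equivalent to the HTTP Request node with continueOnFail: a failed or non-responsive request is recorded as "Down" instead of stopping the run. The timeout value and result shape are illustrative choices.

```typescript
// Sketch of the per-server alive check. Any network error or timeout is
// caught and reported as "Down", mirroring the continueOnFail behaviour
// of the HTTP Request node. The 10-second timeout is an arbitrary choice.

interface CheckResult {
  server: string;
  status: "Alive" | "Down";
  checkedAt: string;
}

async function checkServer(server: string): Promise<CheckResult> {
  const checkedAt = new Date().toISOString();
  try {
    const res = await fetch(`http://${server}`, {
      signal: AbortSignal.timeout(10_000),
    });
    return { server, status: res.ok ? "Alive" : "Down", checkedAt };
  } catch {
    return { server, status: "Down", checkedAt };
  }
}

// Example: check every server from the Server_List sheet.
const servers = ["example.com", "192.0.2.10"];
Promise.all(servers.map(checkServer)).then((results) => {
  for (const r of results) {
    console.log(`${r.checkedAt}  ${r.server}  ${r.status}`);
  }
});
```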
by Cameron Wills
**Who is this for?**

Content creators, social media managers, digital marketers, and researchers who need to download original TikTok videos without watermarks for analysis, repurposing, or archiving purposes.

**What problem does this workflow solve?**

Downloading TikTok videos without watermarks typically requires using questionable third-party websites that may have limitations, ads, or privacy concerns. This workflow provides a clean, automated solution that can be integrated into your own systems and processes.

**What this workflow does**

This workflow automates the process of downloading TikTok videos without watermarks in three simple steps:

1. Fetch the TikTok video page by providing the video URL
2. Extract the raw video URL from the page's HTML data
3. Download the original video file without watermark
4. (Optional) Upload to Google Drive with public sharing link generation

The workflow uses web scraping techniques to extract the original video source directly from TikTok's own servers, maintaining the highest possible quality without any added watermarks or branding.

**Setup (Est. time: 5-10 minutes)**

Before getting started, you'll need:

- n8n installation
- The URL of a TikTok you want to download
- (Optional) Google Drive API enabled in Google Cloud Console with OAuth Client ID and Client Secret credentials if you want to use the upload feature

**How to customize this workflow to your needs**

- Replace the example TikTok URL with your desired video links
- Modify the file naming convention for downloaded videos
- Integrate with other nodes to process videos after downloading
- Create a webhook to trigger the workflow from external applications
- Set up a schedule to regularly download videos from specific accounts

This workflow can be extended to support various use cases like trending content analysis, competitor research, creating compilation videos, or building a content library for inspiration. It provides a foundation that can be customized to fit into larger automated workflows for content creation and social media management.
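To illustrate the scraping approach, here is a hedged TypeScript sketch that fetches a TikTok video page and looks for a direct video URL in the HTML. TikTok's markup changes frequently and often requires browser-like headers, so the regex, header values, and example link are assumptions for illustration rather than a guaranteed extraction method.

```typescript
// Sketch of the extraction step: fetch the video page and search the HTML
// for a direct video URL. The regex and User-Agent header are illustrative
// assumptions; TikTok's page structure changes often, so expect to adapt them.

async function extractVideoUrl(pageUrl: string): Promise<string | null> {
  const res = await fetch(pageUrl, {
    headers: {
      // A browser-like User-Agent makes it more likely to receive full HTML.
      "User-Agent":
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    },
  });
  const html = await res.text();

  // Look for an escaped "playAddr"-style URL inside the embedded page data.
  const match = html.match(/"playAddr":"(https:[^"]+)"/);
  if (!match) return null;

  // Unescape the JSON-encoded URL (e.g. \u002F -> /).
  return JSON.parse(`"${match[1]}"`);
}

extractVideoUrl("https://www.tiktok.com/@user/video/1234567890").then((url) =>
  console.log(url ?? "No video URL found - the page layout may have changed.")
);
```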
by Guillaume Duvernay
**Description**

This template provides a simple and powerful backend for adding speech-to-text capabilities to any application. It creates a dedicated webhook that receives an audio file, transcribes it using OpenAI's gpt-4o-mini model, and returns the clean text. To help you get started immediately, you'll find a complete, ready-to-use HTML code example right inside the workflow in a sticky note. This code creates a functional recording interface you can use for testing or as a foundation for your own design.

**Who is this for?**

- **Developers:** Quickly add a transcription feature to your application by calling this webhook from your existing frontend or backend code.
- **No-code/Low-code builders:** Embed a functional audio recorder and transcription service into your projects by using the example code found inside the workflow.
- **API enthusiasts:** A lean, practical example of how to use n8n to wrap a service like OpenAI into your own secure and scalable API endpoint.

**What problem does this solve?**

- **Provides a ready-made API:** Instantly gives you a secure webhook to handle audio file uploads and transcription processing without any server setup.
- **Decouples frontend from backend:** Your application only needs to know about one simple webhook URL, allowing you to change the backend logic in n8n without touching your app's code.
- **Offers a clear implementation pattern:** The included example code provides a working demonstration of how to send an audio file from a browser and handle the response — a pattern you can replicate in any framework.

**How it works**

This solution works by defining a clear API contract between your application (the client) and the n8n workflow (the backend).

The client-side technique:

1. Your application's interface records or selects an audio file.
2. It then makes a POST request to the n8n webhook URL, sending the audio file as multipart/form-data.
3. It waits for the response from the webhook, parses the JSON body, and extracts the value of the Transcript key.
4. You can see this exact pattern in action in the example code provided in the workflow's sticky note.

The n8n workflow (backend):

1. The Webhook node catches the incoming POST request and grabs the audio file.
2. The HTTP Request node sends this file to the OpenAI API.
3. The Set node isolates the transcript text from the API's response.
4. The Respond to Webhook node sends a clean JSON object ({"Transcript": "your text here..."}) back to your application.

**Setup**

Configure the n8n workflow:

1. In the Transcribe with OpenAI node, add your OpenAI API credentials.
2. Activate the workflow to enable the endpoint.
3. Click the "Copy" button on the Webhook node to get your unique Production Webhook URL.

Integrate with the frontend:

1. Inside the workflow, find the sticky note labeled "Example Frontend Code Below".
2. Copy the complete HTML from the note below it.
3. ⚠️ Important: In the code you just copied, find the line const WEBHOOK_URL = 'YOUR WEBHOOK URL'; and replace the placeholder with the Production Webhook URL from n8n.
4. Save the code as an HTML file and open it in your browser to test.

**Taking it further**

- **Save transcripts:** Add an Airtable or Google Sheets node to log every transcript that comes through the workflow.
- **Error handling:** Enhance the workflow to catch potential errors from the OpenAI API and respond with a clear error message.
- **Analyze the transcript:** Add a Language Model node after the transcription step to summarize the text, classify its sentiment, or extract key entities before sending the response.
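The client-side pattern described above can also be reproduced outside the browser. Here is a minimal TypeScript sketch that posts an audio file to the webhook as multipart/form-data and reads the Transcript key from the JSON response; the webhook URL and the form field name are placeholders to adapt to your own setup.

```typescript
import { readFile } from "node:fs/promises";

// Sketch of the client side of the API contract: upload an audio file as
// multipart/form-data and read back { "Transcript": "..." }.
// The webhook URL and the "audio" field name are placeholders.

const WEBHOOK_URL = "https://your-n8n-instance/webhook/transcribe";

async function transcribe(path: string): Promise<string> {
  const bytes = await readFile(path);

  const form = new FormData();
  form.append("audio", new Blob([bytes], { type: "audio/webm" }), "recording.webm");

  const res = await fetch(WEBHOOK_URL, { method: "POST", body: form });
  const body = await res.json();
  return body.Transcript;
}

transcribe("./recording.webm").then((text) => console.log("Transcript:", text));
```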
by Ria
This is a very simple workflow that lets you subscribe to any GitHub repository for the latest release (using n8n as an example).

**How it works:**

- Daily poll of the GitHub repository for the latest (stable) version of n8n
- Parses the content to HTML
- Sends an email via Gmail

**Setup steps:**

- Add your Gmail credentials (or use another email node of your choice)
- Change the URL to the GitHub repository you want to check regularly
- Change the To email address to the address that should receive the updates

**Feedback & Questions**

If you have any questions or feedback about this workflow, feel free to get in touch at ria@n8n.io
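For reference, the polling step can be reproduced with a single call to GitHub's public releases endpoint. Below is a minimal TypeScript sketch using the n8n repository as in the template; the repository path is the value you would change for your own subscription.

```typescript
// Sketch of the daily poll: fetch the latest published release for a repo
// from GitHub's public REST API (no authentication needed for public repos,
// subject to rate limits).

interface Release {
  tag_name: string;
  name: string;
  html_url: string;
  published_at: string;
  body: string; // release notes in Markdown
}

async function latestRelease(repo: string): Promise<Release> {
  const res = await fetch(`https://api.github.com/repos/${repo}/releases/latest`, {
    headers: { Accept: "application/vnd.github+json" },
  });
  return (await res.json()) as Release;
}

// Using the n8n repository, as in the template.
latestRelease("n8n-io/n8n").then((r) =>
  console.log(`${r.name} (${r.tag_name}) published ${r.published_at}\n${r.html_url}`)
);
```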
by Airtop
**Automating LinkedIn Company URL Verification**

**Use Case**

This automation verifies that a given LinkedIn URL actually belongs to a company by comparing the website listed on their LinkedIn page against the expected company domain. It is essential for ensuring data accuracy in lead qualification, enrichment, and CRM updates.

**What This Automation Does**

Input parameters:

- **Company LinkedIn**: The LinkedIn URL to be verified.
- **Company Domain**: The expected domain (e.g., example.com) for validation.
- **Airtop Profile (connected to LinkedIn)**: Airtop Profile with LinkedIn authentication.

Output:

- Confirmation whether the LinkedIn page corresponds to the provided domain.
- Returns the verified LinkedIn URL if the match is confirmed.

**How It Works**

1. Extracts the website URL from the specified LinkedIn company profile.
2. Compares the extracted URL with the provided company domain.
3. If the domain is contained in the extracted website, the LinkedIn profile is confirmed as valid.
4. Returns the original LinkedIn URL if the match is successful.

**Setup Requirements**

- Airtop API Key
- LinkedIn-authenticated Airtop Profile

**Next Steps**

- **Use for LinkedIn Discovery Validation**: Ensure correctness after automated LinkedIn page discovery.
- **Combine with CRM Updates**: Prevent incorrect LinkedIn links from being stored in CRM.
- **Automate in Data Pipelines**: Use this as a validation gate before enrichment or scoring steps.
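The comparison step is simple enough to show directly. Here is a minimal TypeScript sketch of the domain-containment check described above; normalizing away the protocol, a leading www. prefix, and any path is an added assumption to make the match less brittle.

```typescript
// Sketch of the validation gate: does the website extracted from the
// LinkedIn page match the expected company domain? Stripping the protocol,
// "www." prefix, and path before comparing is an illustrative normalization.

function normalize(urlOrDomain: string): string {
  return urlOrDomain
    .toLowerCase()
    .replace(/^https?:\/\//, "")
    .replace(/^www\./, "")
    .replace(/\/.*$/, ""); // drop any path
}

function isVerified(extractedWebsite: string, expectedDomain: string): boolean {
  return normalize(extractedWebsite).includes(normalize(expectedDomain));
}

// Example usage with the workflow's inputs.
const companyLinkedIn = "https://www.linkedin.com/company/example-co";
const extractedWebsite = "https://www.example.com/about";
const companyDomain = "example.com";

console.log(isVerified(extractedWebsite, companyDomain) ? companyLinkedIn : "No match");
```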