by Oneclick AI Squad
This comprehensive n8n workflow automates the entire travel business call management process, from initial customer inquiries to trip bookings and marketing outreach. The system handles incoming calls, validates trip details, processes bookings, captures leads, and manages outbound marketing campaigns to promote trip organizer services. It streamlines the complete sales cycle while maintaining organized data records for business intelligence.

**Essential Information**
- The system operates across four distinct workflows to handle different aspects of travel call management.
- All call data is automatically captured and stored in organized spreadsheets for analysis and follow-up.
- The workflow validates trip details before processing to ensure data accuracy and prevent booking errors.
- Outbound marketing campaigns are automatically triggered based on lead detection and formatting.

**System Architecture**
- **Call Handling Pipeline**: The Detect Incoming Call node captures all incoming customer calls, followed by the Validate Trip Details node which verifies and processes trip information, and the Deliver Organizer Info node that provides relevant trip organizer details to callers.
- **Booking Management Flow**: The Capture Voice Input node records customer booking requests, the Update Booking Record node processes and stores booking information, and the Send Booking Confirmation node delivers confirmation details to customers.
- **Lead Generation Process**: The Detect New Lead node identifies potential customers from call data, the Format Lead Information node structures the lead data for marketing use, and the Initiate Marketing Outreach node launches targeted marketing campaigns.
- **Data Management System**: The Receive Call Response node collects call interaction data, the Log User Input node records customer information in spreadsheets, and the Relay Response to System node ensures data synchronization across all components.

**Implementation Guide**
1. Import the workflow into n8n and configure phone system integration for call detection and voice capture.
2. Set up spreadsheet connections for booking records, lead management, and call logging.
3. Configure marketing automation tools for outbound campaign management.
4. Test each workflow section independently before enabling the complete system.
5. Monitor call handling accuracy and adjust validation rules as needed.
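Since the actual Validate Trip Details logic ships inside the workflow, the following is only a minimal sketch of what a rule-based check in an n8n Code node could look like, assuming the captured call data exposes `destination`, `travelDates`, and `groupSize` fields (all hypothetical names):

```typescript
// Minimal sketch of rule-based trip validation for an n8n Code node
// ("Run Once for All Items"). Field names are hypothetical; map them
// to whatever your phone-system integration actually produces.
const results = [];

for (const item of $input.all()) {
  const trip = item.json;
  const errors = [];

  if (!trip.destination) errors.push('Missing destination');

  const travelDate = new Date(String(trip.travelDates ?? ''));
  if (Number.isNaN(travelDate.getTime())) errors.push('Invalid or missing travel dates');

  const groupSize = Number(trip.groupSize);
  if (!Number.isInteger(groupSize) || groupSize < 1) errors.push('Group size must be a positive integer');

  // Pass the validation result downstream so later nodes can route on it.
  results.push({ json: { ...trip, isValid: errors.length === 0, validationErrors: errors } });
}

return results;
```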
**Technical Dependencies**
- Phone system API or telephony service for call detection and voice processing
- Spreadsheet service (Google Sheets, Excel Online) for data storage and management
- Marketing automation platform for outbound campaign execution
- Voice recognition service for capturing and processing customer input
- CRM integration for lead management and customer tracking

**Database & Sheet Structure**
- **Call Tracking Sheet**: Columns should include Call_ID, Customer_Phone, Call_Time, Call_Duration, Call_Status, Trip_Interest, Organizer_Assigned
- **Booking Records Sheet**: Required columns are Booking_ID, Customer_Name, Customer_Phone, Destination, Travel_Dates, Group_Size, Booking_Status, Confirmation_Sent
- **Lead Management Sheet**: Essential columns include Lead_ID, Customer_Name, Phone_Number, Email, Trip_Preference, Lead_Source, Lead_Status, Marketing_Campaign_Sent
- **Trip Organizer Database**: Contains Organizer_ID, Organizer_Name, Specialization, Contact_Info, Availability_Status, Performance_Rating
- **Marketing Outreach Log**: Tracks Campaign_ID, Lead_ID, Campaign_Type, Send_Date, Response_Status, Follow_up_Required

**Customization Possibilities**
- Adjust the Validate Trip Details node to include specific travel validation rules or partner requirements.
- Modify the Format Lead Information node to match your CRM system's data structure and marketing campaign formats.
- Configure the Initiate Marketing Outreach node to integrate with your preferred marketing platforms and campaign templates.
- Customize the data logging structure in the Log User Input node to capture additional customer information or booking details.
- Add additional validation steps or approval workflows between booking capture and confirmation sending.
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors local event platforms (Eventbrite, Meetup, Facebook Events) and aggregates upcoming events that match your criteria. Never miss a networking or sponsorship opportunity again.

**Overview**
A scheduled trigger scrapes multiple event sites via Bright Data, filtering by location, date range, and keywords. OpenAI classifies each event (conference, meetup, workshop) and extracts key details such as venue, organizers, and ticket price. Updates are posted to Slack and archived in Airtable for quick lookup. (An illustrative filter sketch appears at the end of this description.)

**Tools Used**
- **n8n** – Core automation engine
- **Bright Data** – Reliable multi-site scraping
- **OpenAI** – NLP-based event categorization
- **Slack** – Delivers daily event digests
- **Airtable** – Stores enriched event records

**How to Install**
1. Import the Workflow: Add the .json file to n8n.
2. Configure Bright Data: Provide your account credentials.
3. Set Up OpenAI: Insert your API key.
4. Connect Slack & Airtable: Authorize both services.
5. Customize Filters: Edit the initial Set node to adjust city, radius, and keywords.

**Use Cases**
- **Community Managers**: Curate a calendar of relevant events.
- **Sales Teams**: Identify trade shows and meetups for prospecting.
- **Event Planners**: Track competing events when choosing dates.
- **Marketers**: Spot speaking or sponsorship opportunities.

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #eventmonitoring #brightdata #openscraping #openai #slackalerts #n8nworkflow #nocode #meetup #eventbrite
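The keyword and date filtering described in the Overview could live in the Set node or a small Code node; the following is only an illustration, with field names assumed rather than taken from the actual workflow:

```typescript
// Illustrative filter for scraped events in an n8n Code node; not the
// template's actual implementation. Adjust field names to the Bright Data output.
const keywords = ['networking', 'startup', 'sponsorship']; // example criteria
const maxDaysAhead = 30;

const now = Date.now();
const horizon = now + maxDaysAhead * 24 * 60 * 60 * 1000;

return $input.all().filter((item) => {
  const event = item.json;
  const starts = new Date(String(event.startDate ?? '')).getTime(); // field name assumed
  const text = `${event.title ?? ''} ${event.description ?? ''}`.toLowerCase();

  const inWindow = !Number.isNaN(starts) && starts >= now && starts <= horizon;
  const matchesKeyword = keywords.some((k) => text.includes(k));

  return inWindow && matchesKeyword;
});
```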
by Paul-François GORIAUX
This workflow acts as your personal AI-powered analyst for Meta Ads. It's pretty straightforward:

1. First, it grabs a list of Facebook Ad Library URLs you want to check out from a Google Sheet.
2. Then, it automatically scrapes the active ads from those pages.
3. Here's the cool part: it sends each ad's image and text to Google Gemini, which analyzes it like an expert marketer would.
4. Finally, Gemini's full analysis (strengths, weaknesses, actionable suggestions, and a performance score) gets dropped neatly into another Google Sheet for you.

**Set up steps**
You should be ready to roll in about 5 minutes. There are no complex configurations; you just need to:

- Connect your accounts: The workflow has placeholders waiting for your credentials for Google (for Sheets and the Gemini API) and ScrapingFlash.
- Link your Google Sheets: Just point the first Google Sheets node to the sheet with your URLs, and tell the last node where you want to save the results.

All the nitty-gritty details and expressions are explained in the sticky notes inside the workflow itself!
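Those sticky notes define the exact prompt and output schema used by the workflow; purely as an illustration, a Code node could flatten a structured Gemini response into one row per ad before the final Google Sheets node (all field names here are hypothetical):

```typescript
// Hypothetical post-processing step: flatten a structured Gemini response
// into one flat row per ad for the results sheet. The real schema lives in
// the workflow's sticky notes; these property names are only an example.
return $input.all().map((item) => {
  const analysis = item.json.analysis ?? {};
  return {
    json: {
      adUrl: item.json.adUrl ?? '',
      strengths: (analysis.strengths ?? []).join('; '),
      weaknesses: (analysis.weaknesses ?? []).join('; '),
      suggestions: (analysis.suggestions ?? []).join('; '),
      performanceScore: analysis.performanceScore ?? null,
    },
  };
});
```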
by Sk developer
🚀 LinkedIn Video to MP4 Automation with Google Drive & Sheets | RapidAPI Integration

This n8n workflow automatically converts LinkedIn video URLs into downloadable MP4 files using the LinkedIn Video Downloader API, uploads them to Google Drive with public access, and logs both the original URL and the Google Drive link into Google Sheets. It leverages the LinkedIn Video Downloader service for fast and secure video extraction.

**📝 Node Explanations (Single-Line)**
1️⃣ On form submission → Captures the LinkedIn video URL from the user via a web form.
2️⃣ HTTP Request → Calls LinkedIn Video Downloader to fetch downloadable MP4 links.
3️⃣ If → Checks for API errors and routes the workflow accordingly (see the sketch at the end of this description).
4️⃣ Download mp4 → Downloads the MP4 video file from the API response URL.
5️⃣ Upload To Google Drive → Uploads the downloaded MP4 file to Google Drive.
6️⃣ Google Drive Set Permission → Makes the uploaded file publicly accessible.
7️⃣ Google Sheets → Logs successful conversions with the LinkedIn URL and sharable Drive link.
8️⃣ Wait → Delays execution before logging failed attempts.
9️⃣ Google Sheets Append Row → Logs failed video downloads with an N/A Drive link.

**📄 Google Sheets Columns**
- **URL** → Original LinkedIn video URL entered in the form.
- **Drive_URL** → Publicly sharable Google Drive link to the converted MP4 file. For failed downloads, Drive_URL will display N/A.

**💡 Use Case**
Automate LinkedIn video downloading and sharing using LinkedIn Video Downloader for social media managers, marketers, and content creators without manual file handling.

**✅ Benefits**
**Time-saving** (auto-download & upload), **centralized tracking** in Sheets, **easy sharing** via Drive links, and **error logging** for failed downloads—all powered by the **RapidAPI LinkedIn Video Downloader**.
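The exact JSON returned by the RapidAPI endpoint isn't documented here, so purely as a sketch, a Code node between the HTTP Request and the If node might normalize the response like this (every property name below is an assumption):

```typescript
// Hypothetical normalization of the downloader API response before routing.
// The real response structure depends on the RapidAPI endpoint you configured;
// treat every property name as a placeholder.
return $input.all().map((item) => {
  const response = item.json;
  const mp4Url = response.videoUrl ?? response.download_url ?? null;

  return {
    json: {
      sourceUrl: response.sourceUrl ?? '',
      mp4Url,
      hasError: mp4Url === null || Boolean(response.error),
      errorMessage: response.error ?? '',
    },
  };
});
```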
by Robert Breen
Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing—eliminating manual copy-paste and speeding up reporting.

**Who's it for**
- Teams that publish recurring reports as Excel with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

**How it works**
1. Manual Trigger starts the run.
2. Code nodes emit JSON rows for each table (e.g., People, Locations); see the sketch after the Troubleshooting section.
3. Convert to File nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your names).
4. Merge combines both binaries into a single Excel workbook with multiple tabs.
5. Google Sheets (optional) appends the JSON rows to a live spreadsheet for collaboration.

**Setup (only 2 connections)**

1️⃣ Connect Google Sheets (OAuth2)
- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
- In the workflow's Google Sheets node, select your Spreadsheet and Worksheet
- https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

2️⃣ Connect Google Drive (OAuth2)
- In n8n → Credentials → New → Google Drive (OAuth2)
- Sign in with the Google account that will store your Excel outputs and allow access
- In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

**Customize the workflow**
- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

**Best practices (per template guidelines)**
- **Rename nodes** to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow Sticky Note at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

**Sticky Note (copy-paste)**
> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

**Troubleshooting**
- **Missing columns / wrong order:** Use **Manual mapping** in the Google Sheets node
- **Binary not found:** Ensure each **Convert to File** node's binaryPropertyName matches what **Merge** expects
- **Permissions errors:** Re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder
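As a concrete illustration of the Code → Convert to File pattern, a dataset Code node can be as small as this (the sample rows are placeholders; replace them with your real data source):

```typescript
// Sketch of a "Build People Sheet" Code node: return one n8n item per row.
// The columns below are sample data only; swap them for your own
// API or database results.
const people = [
  { name: 'Ada Lovelace', role: 'Analyst', location: 'London' },
  { name: 'Grace Hopper', role: 'Engineer', location: 'New York' },
];

// n8n expects an array of { json: {...} } objects; each object becomes
// one row in the worksheet produced by the Convert to File node.
return people.map((person) => ({ json: person }));
```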
**📬 Contact**
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?

📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
by dirogar
Telegram Tasker Bot is an n8n workflow that receives voice messages in Telegram, automatically transcribes them to text, extracts the key task fields from that text, and creates a card on the appropriate Trello board. The user simply dictates a task, and the bot formats it and replies with a link to the finished card.

To use it you will need:
- A Telegram bot, which you can create via the BotFather bot.
- Access to the ChatGPT API. It is used only for transcribing audio to text; you can use any other transcription service of your choice.
- A Trello account with API access.

**Note:** The Trello board ID can be taken from the board URL. The ID of a list (column) on the Trello board can be obtained via the browser's developer tools (at least that is how I retrieved it).
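The description doesn't spell out how the task fields are extracted from the transcript; one simple approach, shown purely as an illustration, is a Code node that splits the transcribed text into a card title and description before the Trello node (the input field name is assumed):

```typescript
// Illustrative only: split a transcribed voice message into a Trello card
// title and description. The real workflow may use an LLM prompt instead.
return $input.all().map((item) => {
  const transcript = String(item.json.text ?? '').trim(); // field name assumed

  // First sentence becomes the card title, the rest becomes the description.
  const [title, ...rest] = transcript.split(/(?<=[.!?])\s+/);

  return {
    json: {
      cardName: title || 'Untitled task',
      cardDescription: rest.join(' '),
    },
  };
});
```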
by Nabin Bhandari
This template uses VAPI and Cal.com to book appointments through a voice conversation. It detects whether the user wants to check availability or book an appointment, then responds naturally with real-time scheduling options.

**Who is this for?**
This workflow is perfect for:
- Voice assistant developers
- AI receptionists and smart concierge tools
- Service providers (salons, clinics, coaches) needing hands-free scheduling
- Anyone building voice-based customer experiences

**What does it do?**
This workflow turns a natural voice conversation into a working appointment system.
- It starts with a Webhook connected to your VAPI voice agent.
- The Set node extracts user intent (like "check availability" or "book now"); see the sketch at the end of this description.
- A Switch node branches logic based on the intent.
- If the user wants to check availability, the workflow fetches available times from Cal.com.
- If the user wants to book, it creates a new event using Cal.com's API.
- The final result is sent back to VAPI as a conversational voice response.

**How to use it**
1. Import this workflow into your n8n instance.
2. Set up a Webhook node and connect it to your VAPI voice agent.
3. Add your Cal.com API token as a credential (use HTTP Header Auth).
4. Deploy and test using VAPI's simulator or real phone input.
5. (Optional) Customize the OpenAI prompt if you're using it to process or moderate inputs.

**Requirements**
- A working VAPI agent
- A Cal.com account with API access
- n8n (cloud or self-hosted)
- An understanding of how to configure webhook and API credentials in n8n

**Customization Ideas**
- Swap out Cal.com with another booking API (like Calendly)
- Add a Google Sheets or Supabase node to log appointments
- Use OpenAI to summarize or sanitize voice inputs before proceeding
- Build multi-turn conversations in VAPI for more complex bookings
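For orientation, here is a rough sketch of the kind of intent normalization that could sit between the Webhook and the Switch node. The actual VAPI payload shape depends on how your assistant's tool call is configured, so all property paths below are placeholders:

```typescript
// Rough sketch of intent normalization before the Switch node.
// Property paths are assumptions, not the template's guaranteed payload shape.
return $input.all().map((item) => {
  const body = item.json.body ?? item.json;
  const utterance = String(body.intent ?? body.message ?? '').toLowerCase();

  let intent = 'unknown';
  if (/\b(book|schedule|reserve)\b/.test(utterance)) intent = 'book';
  else if (/\b(available|availability|open|free)\b/.test(utterance)) intent = 'check_availability';

  return { json: { ...body, intent } };
});
```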
by Yaron Been
This workflow automatically scrapes customer reviews from Trustpilot and performs sentiment analysis to extract valuable customer insights. It saves you time by eliminating the need to manually read through reviews and provides structured data on customer feedback, sentiment, and pain points.

**Overview**
This workflow automatically scrapes the latest customer reviews from any Trustpilot company page and uses AI to analyze each review for sentiment, extract key complaints or praise, and identify recurring customer pain points. It stores all structured review data in Google Sheets for easy analysis and reporting. (An illustrative prompt sketch appears at the end of this description.)

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping Trustpilot review pages without being blocked
- **OpenAI**: AI agent for intelligent review analysis and sentiment extraction
- **Google Sheets**: For storing structured review data and analysis results

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your review tracking spreadsheet
5. Customize: Enter target Trustpilot company URLs and adjust review analysis parameters

**Use Cases**
- **Product Teams**: Identify customer pain points and feature requests from reviews
- **Customer Support**: Monitor customer satisfaction and recurring issues
- **Marketing Teams**: Extract positive testimonials and understand customer sentiment
- **Business Intelligence**: Track brand reputation and customer feedback trends

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #trustpilot #reviewscraping #sentimentanalysis #brightdata #webscraping #customerreviews #n8nworkflow #workflow #nocode #reviewautomation #customerinsights #brandmonitoring #reviewanalysis #customerfeedback #reputationmanagement #reviewmonitoring #customersentiment #productfeedback #trustpilotscraping #reviewdata #customerexperience #businessintelligence #feedbackanalysis #reviewtracking #customervoice #aianalysis #reviewmining #customerinsights
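The template's actual analysis prompt lives inside its OpenAI/agent node; as a sketch of one way to phrase the same task, a Code node could build a per-review prompt like this (the input field name is assumed):

```typescript
// Hypothetical prompt builder for the review-analysis step. This is not the
// template's prompt, just an illustration of the task it describes.
return $input.all().map((item) => {
  const reviewText = String(item.json.reviewText ?? ''); // field name assumed

  const prompt = [
    'You are analyzing a Trustpilot review.',
    'Return JSON with keys: sentiment (positive|neutral|negative),',
    'key_complaints (array of strings), key_praise (array of strings),',
    'pain_points (array of strings).',
    '',
    `Review: """${reviewText}"""`,
  ].join('\n');

  return { json: { ...item.json, prompt } };
});
```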
by Yaron Been
**Description**
This workflow automatically finds trending headlines and content from various sources and posts them to your social media accounts. It helps maintain an active social media presence without the daily manual effort of content curation.

**Overview**
This workflow automatically scrapes trending headlines and content from various sources and posts them to your social media accounts. It uses Bright Data to access content and n8n to schedule and post to platforms like Twitter, LinkedIn, or Facebook. (An example post-composer sketch appears at the end of this description.)

**Tools Used**
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping trending content from news sites, blogs, or other sources without getting blocked.
- **Social Media APIs:** To post content to your accounts.

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Connect Social Media: Authenticate your social media accounts.
4. Customize: Set your content preferences, posting schedule, and hashtag strategy.

**Use Cases**
- **Social Media Managers:** Automate content curation and posting.
- **Content Creators:** Share trending topics in your niche.
- **Businesses:** Maintain an active social media presence with minimal effort.

**Connect with Me**
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #socialmedia #brightdata #contentcuration #scheduling #socialmediaautomation #contentmarketing #socialmediamanagement #autoposting #trendingcontent #n8nworkflow #workflow #nocode #socialmediatools #digitalmarketing #contentcalendar #socialmediapresence #headlinecuration #trendalerts #socialmediaschedule #contentautomation #socialmediamarketing #contentdistribution #automatedposting #socialmediastrategy
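The posting step is platform-specific, but as a small illustration of the "hashtag strategy" customization, a Code node could trim each headline into a post and append your tags. The field names and the 280-character limit are assumptions, not part of the template:

```typescript
// Example post composer. The 280-character limit matches Twitter/X;
// adjust or remove it for LinkedIn and Facebook.
const hashtags = '#trending #news'; // your hashtag strategy

return $input.all().map((item) => {
  const headline = String(item.json.headline ?? '').trim(); // field name assumed
  const url = String(item.json.url ?? '');

  const maxHeadline = Math.max(0, 280 - (url.length + hashtags.length + 2));
  const text = headline.length > maxHeadline
    ? `${headline.slice(0, Math.max(0, maxHeadline - 1))}…`
    : headline;

  return { json: { post: `${text} ${url} ${hashtags}`.trim() } };
});
```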
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors customer churn indicators and early warning signals to help reduce customer attrition and improve retention rates. It saves you time by eliminating the need to manually track customer behavior and provides proactive insights for preventing customer churn.

**Overview**
This workflow automatically scrapes customer data sources, support tickets, usage analytics, and engagement metrics to identify patterns that indicate potential customer churn. It uses Bright Data to access customer data and AI to intelligently analyze behavior patterns and predict churn risk. (A simple illustration of rule-based churn indicators appears at the end of this description.)

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping customer data and analytics platforms without being blocked
- **OpenAI**: AI agent for intelligent churn prediction and pattern analysis
- **Google Sheets**: For storing churn indicators and customer retention data

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your churn monitoring spreadsheet
5. Customize: Define customer data sources and churn indicator parameters

**Use Cases**
- **Customer Success**: Proactively identify at-risk customers for retention efforts
- **Account Management**: Prioritize customer outreach based on churn probability
- **Product Teams**: Identify product issues that contribute to customer churn
- **Revenue Operations**: Reduce churn rates and improve customer lifetime value

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #churnprediction #customerretention #brightdata #webscraping #customeranalytics #n8nworkflow #workflow #nocode #churnindicators #customersuccess #retentionanalysis #customerchurn #customerinsights #churnprevention #retentionmarketing #customerdata #churnmonitoring #customerlifecycle #retentionmetrics #churnanalysis #customerbehavior #retentionoptimization #churnreduction #customerengagement #retentionstrategy #churnmanagement #customerhealth #retentiontracking
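The AI handles the actual prediction in this template; as an intuition for what "churn indicators" can look like, a simple rule-based score could be computed in a Code node. All field names, weights, and thresholds below are illustrative assumptions:

```typescript
// Illustrative rule-based churn score; the template itself relies on
// OpenAI for the real analysis. Field names and weights are assumptions.
return $input.all().map((item) => {
  const c = item.json;
  let score = 0;

  if ((c.daysSinceLastLogin ?? 0) > 14) score += 30;   // disengagement
  if ((c.openSupportTickets ?? 0) >= 3) score += 25;   // unresolved issues
  if ((c.usageTrendPct ?? 0) < -20) score += 30;       // declining usage
  if (c.contractRenewalWithinDays != null && c.contractRenewalWithinDays < 60) score += 15;

  return {
    json: {
      ...c,
      churnScore: Math.min(score, 100),
      churnRisk: score >= 60 ? 'high' : score >= 30 ? 'medium' : 'low',
    },
  };
});
```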
by George Zargaryan
AI Real Estate Agent with OpenRouter and SerpAPI to chat about property listings from propertyfinder.ae

This n8n template demonstrates a simple AI Agent that can:
- Scrape information from a provided propertyfinder.ae listing link.
- Answer questions about a specific property using the scraped information.
- Use SerpAPI to find details that are missing from the scraped data.
- Answer general real-estate questions using SerpAPI.

**Use Case**
This workflow serves as a starting point for building complex AI assistants for real estate or other domains. See the demo video.

**Potential Enhancements**
- **Expand Knowledge:** Augment the workflow with your own knowledge base using a vector database (RAG approach).
- **Add More Sources:** Adapt the scraper to support other real estate websites.
- **Optimize Speed:** Add a cache for scraped data to reduce response latency.
- **Improve Context Handling:** Implement reliable persistence to track the current listing instead of iterating through conversation history.
- **Customize Prompts:** Write more tailored prompts for your specific needs (the current one is for demonstration only).
- **Integrate Channels:** Connect the workflow to communication channels like Instagram, Telegram, or WhatsApp.

**How It Works**
1. The workflow is triggered by a "When chat message received" node for simple demonstration.
2. The Chat Memory Manager node extracts the last 30 messages for the current session.
3. A Code node finds the property link, first by checking the most recent user message and then by searching the conversation history (a rough sketch of this step appears at the end of this description).
4. If a link is found, an HTTP Request node scrapes the HTML content from the listing page.
5. The Summarize code node parses the HTML, retrieves key information, and passes it to the AI Agent as a temporary knowledge base.
6. The final AI Agent node answers user queries using the scraped knowledge base and falls back to the SerpAPI tool when information is missing.

**How to Use**
- You can test this workflow directly in n8n or integrate it into any social media channel or your website.
- The AI Agent node is configured to use OpenRouter. Add your OpenRouter credentials, or replace the node with your preferred LLM provider.
- Add your SerpAPI key to the SerpAPI tool within the AI Agent node.

**Requirements**
- An API key for OpenRouter (or credentials for your preferred LLM provider).
- A SerpAPI key. You can get one from their website; a free plan is available for testing.

**Need Help Building Something More?**
Contact me on:
- **Telegram:** @ninesfork
- **LinkedIn:** George Zargaryan

Happy Hacking! 🚀
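Here is a rough sketch of the link-finding Code node described in step 3 of "How It Works": check the latest user message first, then fall back to older messages. The chat-memory field names are assumptions; adapt them to the Chat Memory Manager output:

```typescript
// Rough sketch of the property-link finder. Field names (e.g. `text`) are
// assumptions about the Chat Memory Manager output, not guaranteed by the template.
const LINK_PATTERN = /https?:\/\/(?:www\.)?propertyfinder\.ae\/\S+/i;

const messages = $input.all().map((item) => item.json);
// Assume the newest message is last; search from newest to oldest.
const ordered = [...messages].reverse();

let propertyLink = null;
for (const message of ordered) {
  const match = String(message.text ?? '').match(LINK_PATTERN);
  if (match) {
    propertyLink = match[0];
    break;
  }
}

return [{ json: { propertyLink, linkFound: propertyLink !== null } }];
```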
by Romuald Członkowski
Social Media Intelligence Workflow with Bright Data and OpenAI

Get a 360° social media presence report on a person.

**Who's it for**
Business development professionals, recruiters, sales teams, and market researchers who need comprehensive social media intelligence on individuals for lead qualification, due diligence, partnership evaluation, or candidate assessment.

**How it works**
1. Enter the target person's details through the web form (name, company, location).
2. The AI Discovery Agent searches across selected platforms using name variations.
3. The profile validator verifies discovered profiles with confidence scoring (an illustrative threshold check appears at the end of this description).
4. Platform-specific agents analyze each profile using Bright Data MCP tools.
5. GPT-4 synthesizes all data into a comprehensive intelligence report.
6. The report is automatically generated as a formatted Google Doc with a direct link.

**Requirements**
- Bright Data MCP account with PRO access (Get your Bright Data API key here)
- OpenAI API key (or alternative LLM provider)
- Google Drive OAuth connection for report delivery
- n8n self-hosted instance or cloud account

**How to set up**
1. Update Bright Data credentials:
   - Find the "Bright Data MCP" node (look for the red warning note).
   - Replace YOUR_BRIGHT_DATA_TOKEN_HERE with your actual token.
   - Update UNLOCKER_CODE_HERE with your unlocker code.
2. Update Google Drive settings:
   - Find the "Create Empty Google Doc" node and select the target folder there.
3. Configure your LLM credentials (OpenAI or alternative).
4. Test with your own name using "Basic" search depth.
5. Watch the YouTube tutorial.

**How to customize the workflow**
- **Add platforms**: Extend the Switch node with new cases and create corresponding prompt builders.
- **Modify analysis depth**: Edit the platform-specific prompt builders to focus on different metrics.
- **Change report format**: Update the final LLM Chain prompt to adjust the report structure.
- **Add notifications**: Insert Slack or email nodes after report generation.
- **Adjust confidence thresholds**: Modify the validators to change profile verification requirements.
- **Alternative outputs**: Replace Google Docs with PDF, Excel, or a webhook to your CRM.
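The validator's scoring logic lives inside the workflow itself; as a sketch of the kind of threshold you might adjust when customizing it, a Code node filter could look like this (the 0.7 cutoff and all field names are assumptions):

```typescript
// Hypothetical confidence gate between the discovery and analysis stages.
// The real validator's fields and thresholds are defined in the workflow;
// these values are placeholders for illustration only.
const MIN_CONFIDENCE = 0.7;

return $input.all()
  .filter((item) => Number(item.json.confidence ?? 0) >= MIN_CONFIDENCE)
  .map((item) => ({
    json: {
      platform: item.json.platform ?? 'unknown',
      profileUrl: item.json.profileUrl ?? '',
      confidence: item.json.confidence,
    },
  }));
```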