by Yaron Been
This workflow automatically scrapes customer reviews from Trustpilot and performs sentiment analysis to extract valuable customer insights. It saves you time by eliminating the need to read through reviews manually and provides structured data on customer feedback, sentiment, and pain points.

Overview
This workflow automatically scrapes the latest customer reviews from any Trustpilot company page and uses AI to analyze each review for sentiment, extract key complaints or praise, and identify recurring customer pain points. It stores all structured review data in Google Sheets for easy analysis and reporting.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: Scrapes Trustpilot review pages without being blocked
- **OpenAI**: AI agent for intelligent review analysis and sentiment extraction
- **Google Sheets**: Stores structured review data and analysis results

How to Install
1. Import the workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your review tracking spreadsheet
5. Customize: Enter target Trustpilot company URLs and adjust review analysis parameters

Use Cases
- **Product Teams**: Identify customer pain points and feature requests from reviews
- **Customer Support**: Monitor customer satisfaction and recurring issues
- **Marketing Teams**: Extract positive testimonials and understand customer sentiment
- **Business Intelligence**: Track brand reputation and customer feedback trends

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #trustpilot #reviewscraping #sentimentanalysis #brightdata #webscraping #customerreviews #n8nworkflow #workflow #nocode #reviewautomation #customerinsights #brandmonitoring #reviewanalysis #customerfeedback #reputationmanagement #reviewmonitoring #customersentiment #productfeedback #trustpilotscraping #reviewdata #customerexperience #businessintelligence #feedbackanalysis #reviewtracking #customervoice #aianalysis #reviewmining
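As a rough illustration of the analysis step described in the overview, here is a minimal TypeScript sketch of how an AI response for one review could be flattened into a Google Sheets row. The field names are assumptions, not the template's actual schema, and n8n Code nodes run plain JavaScript, so drop the type annotations there.

```typescript
// Hypothetical shape of the AI analysis for one review; adjust to your prompt's output.
interface ReviewAnalysis {
  reviewer: string;
  rating: number;
  sentiment: "positive" | "neutral" | "negative";
  keyPoints: string[];
  painPoints: string[];
}

// Flatten one analyzed review into the row format a Google Sheets node could append.
function toSheetRow(analysis: ReviewAnalysis) {
  return {
    reviewer: analysis.reviewer,
    rating: analysis.rating,
    sentiment: analysis.sentiment,
    key_points: analysis.keyPoints.join("; "),
    pain_points: analysis.painPoints.join("; "),
    analyzed_at: new Date().toISOString(),
  };
}

console.log(
  toSheetRow({
    reviewer: "Jane D.",
    rating: 2,
    sentiment: "negative",
    keyPoints: ["Slow support response"],
    painPoints: ["Waited 5 days for a refund"],
  })
);
```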
by Harshil Agrawal
This workflow allows you to send position updates of the ISS every minute to a table in Google BigQuery.

- Cron node: Triggers the workflow every minute.
- HTTP Request node: Makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
- Set node: Ensures that only the data we set in this node gets passed on to the next nodes in the workflow.
- Google BigQuery node: Sends the data from the previous node to the position table in Google BigQuery. If you have created a table with a different name, use that table instead.
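For readers who want to see what the HTTP Request and Set steps do in plain code, here is a minimal TypeScript sketch. It assumes the response fields match the public wheretheiss.at schema; verify the field names against the live API before relying on them.

```typescript
// Fetch the ISS position and keep only the fields the BigQuery table needs,
// mirroring the HTTP Request node followed by the Set node.
async function fetchIssPosition() {
  // The positions endpoint expects a timestamps parameter; here we ask for "now".
  const now = Math.floor(Date.now() / 1000);
  const res = await fetch(
    `https://api.wheretheiss.at/v1/satellites/25544/positions?timestamps=${now}`
  );
  const positions = (await res.json()) as Array<{
    name: string;
    latitude: number;
    longitude: number;
    timestamp: number;
  }>;

  // Equivalent of the Set node: pass along only the selected fields.
  return positions.map((p) => ({
    name: p.name,
    latitude: p.latitude,
    longitude: p.longitude,
    timestamp: p.timestamp,
  }));
}

fetchIssPosition().then((rows) => console.log(rows));
```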
by Sidetool
This workflow is a supporting automation for a common Airtable situation that, as of this writing, has no direct solution but is in great demand. Interfaces are your secret weapon for managing a variety of tasks – from sales funnels and task tracking to creating dynamic dashboards. But here's a common situation: how do you efficiently bulk upload records (like contacts, leads, or clients) from an interface with just a click? Once set up, you'll be able to upload CSV files directly to your tables from Interfaces with ease.

Workflow Key Points:
1. Bulk Upload Functionality: Say goodbye to the limitations of standard Airtable interfaces. Now you can upload multiple leads or contacts simultaneously, making your work swift and efficient.
2. Customizable Fields: Tailor the base to meet your specific data needs. This ensures seamless integration with your existing systems and simplifies data management.

Perfect for teams in e-commerce, CRM, or any sector where managing a high volume of leads or contacts is key. Our Airtable base is designed to eliminate the tediousness of importing contacts. It makes large-scale data management straightforward, saving you precious time and hassle. Get ready to streamline your operations and boost your productivity! 🚀💡
by Sk developer
🚀 LinkedIn Video to MP4 Automation with Google Drive & Sheets | RapidAPI Integration

This n8n workflow automatically converts LinkedIn video URLs into downloadable MP4 files using the LinkedIn Video Downloader API, uploads them to Google Drive with public access, and logs both the original URL and the Google Drive link in Google Sheets. It leverages the LinkedIn Video Downloader service for fast and secure video extraction.

📝 Node Explanations
1️⃣ On form submission → Captures the LinkedIn video URL from the user via a web form.
2️⃣ HTTP Request → Calls LinkedIn Video Downloader to fetch downloadable MP4 links.
3️⃣ If → Checks for API errors and routes the workflow accordingly.
4️⃣ Download mp4 → Downloads the MP4 video file from the API response URL.
5️⃣ Upload To Google Drive → Uploads the downloaded MP4 file to Google Drive.
6️⃣ Google Drive Set Permission → Makes the uploaded file publicly accessible.
7️⃣ Google Sheets → Logs successful conversions with the LinkedIn URL and sharable Drive link.
8️⃣ Wait → Delays execution before logging failed attempts.
9️⃣ Google Sheets Append Row → Logs failed video downloads with an N/A Drive link.

📄 Google Sheets Columns
- **URL** → Original LinkedIn video URL entered in the form.
- **Drive_URL** → Publicly sharable Google Drive link to the converted MP4 file. For failed downloads, Drive_URL displays N/A.

💡 Use Case
Automate LinkedIn video downloading and sharing using LinkedIn Video Downloader for social media managers, marketers, and content creators, without manual file handling.

✅ Benefits
**Time-saving** (auto-download & upload), **centralized tracking** in Sheets, **easy sharing** via Drive links, and **error logging** for failed downloads, all powered by the **RapidAPI LinkedIn Video Downloader**.
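Below is a small TypeScript sketch of the routing logic the If node performs. The response shape is a placeholder, since the exact RapidAPI payload is not shown in this listing; adjust the field names to the real API response.

```typescript
// Hypothetical response shape; the actual RapidAPI LinkedIn Video Downloader
// payload may differ, so verify the field names before using this.
interface DownloaderResponse {
  error?: string;
  url?: string; // direct MP4 link on success
}

// Rough equivalent of the If node: decide between the success and failure branches.
function routeResponse(body: DownloaderResponse): { ok: boolean; mp4Url: string } {
  if (body.error || !body.url) {
    return { ok: false, mp4Url: "N/A" }; // failure branch -> Wait -> append row with N/A
  }
  return { ok: true, mp4Url: body.url }; // success branch -> download -> Drive -> Sheets
}

console.log(routeResponse({ url: "https://example.com/video.mp4" }));
console.log(routeResponse({ error: "Invalid LinkedIn URL" }));
```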
by Harshil Agrawal
This workflow allows you to compress binary files to zip format.

- HTTP Request node: Fetches files from the internet. If you want to fetch files from your local machine, replace it with the Read Binary File or Read Binary Files node.
- Compression node: Compresses the file into a zip archive. If you want to compress the files to gzip, select the gzip format instead.

Based on your use case, you may want to write the files to your disk or upload them to Google Drive or Box. If you want to write the compressed file to your disk, replace the Dropbox node with the Write Binary File node; if you want to upload the file to a different service, use the respective node.
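As a rough illustration of what the gzip option does under the hood, here is a minimal Node.js/TypeScript sketch using the built-in zlib module. It is not part of the workflow itself, just a standalone example of compressing an in-memory buffer.

```typescript
import { gzipSync, gunzipSync } from "node:zlib";

// Compress an in-memory buffer, similar to what the Compression node's gzip option does.
const original = Buffer.from("Example file contents fetched by the HTTP Request node");
const compressed = gzipSync(original);

console.log(`original: ${original.length} bytes, gzipped: ${compressed.length} bytes`);

// Round-trip to confirm the data survives compression.
const restored = gunzipSync(compressed);
console.log(restored.toString() === original.toString()); // true
```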
by Zacharia Kimotho
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

What does this workflow do?
This workflow speeds up the analysis of the top-ranking titles and meta descriptions to identify patterns and styles that will help us rank on Google for a given keyword.

How does it work?
We provide a keyword we are interested in on our Google Sheet. When executed, we scrape the top 10 pages using the Bright Data SERP API, analyse the style and patterns of the top-ranking pages, and generate a new title and meta description.

Technical setup
- Make a copy of this Google Sheet
- Update your desired keywords on the cell/row
- Set your Bright Data credentials
- Update the zone to your preset zone
- We are getting the results as JSON. You can change this by removing the brd_json=1 query from the URL https://www.google.com/search?q={{ $json.search_term.replaceAll(" ", "+") }}&start=0&brd_json=1
- Store the generated results on the duplicated sheet
- Run the workflow

Setting up the SERP scraper in Bright Data
- On Bright Data, go to the Proxies & Scraping tab
- Under SERP API, create a new zone
- Give it a suitable name and description. The default is serp_api
- Add this to your account
- Add your credentials as a header credential and rename to Bright Data API
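Here is a minimal TypeScript sketch of the URL expression used in the HTTP Request node, showing how a keyword from the sheet becomes the SERP request URL and what removing brd_json=1 changes.

```typescript
// Build the Google SERP URL from a keyword in the sheet.
// Setting brdJson to false drops the brd_json=1 parameter if you prefer raw HTML over JSON.
function buildSerpUrl(searchTerm: string, brdJson = true): string {
  const query = searchTerm.trim().replaceAll(" ", "+");
  const base = `https://www.google.com/search?q=${query}&start=0`;
  return brdJson ? `${base}&brd_json=1` : base;
}

console.log(buildSerpUrl("best project management software"));
// https://www.google.com/search?q=best+project+management+software&start=0&brd_json=1
```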
by Emmanuel Bernard
🎉 Do you want to master AI automation, so you can save time and build cool stuff? I've created a welcoming Skool community for non-technical yet resourceful learners. 👉🏻 Join the AI Atelier 👈🏻

Unlock streamlined Zoom meeting organization and exclusive access management with this n8n workflow. Designed for educators, event organizers, and businesses, this tool automates your event logistics so you can focus on delivering valuable content.

Features
- **Zoom Meetings Creation:** Instantly generate new Zoom meetings with the n8n built-in form.
- **Collect Payments Using Stripe:** Effortlessly monetize your events with secure, automatically created Stripe payment pages for each meeting.
- **Exclusive Gated Access:** Keep your content exclusive by sending Zoom meeting passwords only to verified subscribers who have completed their payment through Stripe.
- **Participant Email Notifications:** Automate the distribution of Zoom meeting details post-payment, eliminating manual email management and ensuring participants are promptly informed.
- **Instant and Easy Participant Overview:** Manage and track your event registrations with ease. All related data is stored in a Google Sheets document that you own, and you're notified via email with each new subscription, simplifying participant management.

Set Up Steps
1. Connect your Zoom, Stripe, Gmail and Google Sheets credentials.
2. Create an empty Google Sheet in your Google Drive.
3. Fill the config node (Sheet URL, email and currency).
4. Edit the email text.

This n8n workflow template is designed to minimize setup time and maximize efficiency, allowing you to focus on delivering value to your subscribers. With just a few clicks, you can automate the entire process of organizing and monetizing your Zoom meetings.

Created by the n8ninja.
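For step 3 of the setup, the config node holds three values. The snippet below is only an illustration of what those values look like; the exact field names in the template may differ.

```typescript
// Placeholder values for the config node (Sheet URL, email, currency).
const config = {
  sheetUrl: "https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID/edit", // the empty sheet you created
  notificationEmail: "you@example.com", // where new-subscription alerts are sent
  currency: "usd",                      // currency code passed to Stripe
};

console.log(config);
```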
by Oneclick AI Squad
This automated n8n workflow performs daily forecasting of sales and raw material needs for a restaurant. By analyzing historical data and predicting future usage with AI, businesses can minimize food waste, optimize inventory, and improve operational efficiency. The forecast is stored in Google Sheets and sent via email for easy review by staff and management.

What is the AI Forecast Generator?
The AI Forecast Generator is a machine learning component that analyzes historical sales data, weather patterns, and seasonal trends to predict future food demand and recommend optimal inventory levels to minimize waste.

Good to Know
- AI forecasting accuracy improves over time with more historical data
- Weather and seasonal factors significantly impact food demand predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications help ensure timely review of daily forecasts
- The system works with two main data sources: historical food wastage data and predicted low-waste food requirements

How It Works
1. Daily Trigger - Initiates the workflow every day to perform food waste prediction
2. Fetch Historical Sales Data - Reads past food usage and sales data from Google Sheets to understand trends
3. Format Data for AI Forecasting - Cleans and organizes raw data into a structured format for AI processing
4. AI Forecast Generator - Uses Gemini AI to forecast food demand and recommend waste reduction strategies
5. Clean & Structure AI Output - Parses the AI response into a structured, actionable format for reporting (see the sketch after this listing)
6. Log Forecast to Google Sheets - Stores the AI-generated forecast back into Google Sheets for historical tracking
7. Create Email Summary - Creates a concise, human-friendly summary of the forecast findings
8. Send Email Forecast Report - Delivers the forecast report via email to decision makers and management

Data Sources
The workflow utilizes two Google Sheets:

Food Wastage Data Sheet - Contains historical data with columns:
- Date (date)
- Food Item (text)
- Quantity Wasted (number)
- Cost Impact (currency)
- Category (text)
- Reason for Waste (text)

Predicted Food Data Sheet - Contains AI predictions with columns:
- Date (date)
- Food Item (text)
- Predicted Demand (number)
- Recommended Order Quantity (number)
- Waste Risk Level (text)
- Optimization Notes (text)

How to Use
1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up email credentials for forecast report delivery
4. Create the two required Google Sheets with the specified column structures
5. Configure the AI model credentials (Gemini API key)
6. Test with sample historical data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

Requirements
- Google Sheets API access
- Email service credentials (Gmail, SMTP, etc.)
- AI model API credentials (Gemini AI)
- Historical food wastage data for initial training

Customizing This Workflow
Modify the AI Forecast Generator prompts to focus on specific food categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences and add additional data sources like supplier information or menu planning data.
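The sketch below illustrates the "Clean & Structure AI Output" step: parsing the Gemini response into rows matching the Predicted Food Data Sheet columns. The output structure is an assumption about what the prompt asks for, not the template's exact code.

```typescript
// Hypothetical structure of the Gemini forecast output; adapt it to your actual prompt.
interface ForecastRow {
  date: string;
  foodItem: string;
  predictedDemand: number;
  recommendedOrderQuantity: number;
  wasteRiskLevel: "Low" | "Medium" | "High";
  optimizationNotes: string;
}

// Strip the markdown code fences LLMs often wrap around JSON, then parse it.
function parseForecast(raw: string): ForecastRow[] {
  const cleaned = raw.replace(/`{3}(json)?/gi, "").trim();
  return JSON.parse(cleaned) as ForecastRow[];
}

const sample =
  '[{"date":"2024-06-01","foodItem":"Tomatoes","predictedDemand":24,' +
  '"recommendedOrderQuantity":26,"wasteRiskLevel":"Low","optimizationNotes":"Stable demand"}]';
console.log(parseForecast(sample));
```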
by Joachim Hummel
This workflow connects a USB scanner to Nextcloud via ScanservJS and its integrated API. It checks for new scans at a scheduled interval (e.g., every 5 minutes). If there are any, they are automatically retrieved via HTTP request and then saved to the desired Nextcloud folder. Ideal for home offices, offices, or maker projects with a Raspberry Pi and network scanners.

Nodes used:
- Schedule Trigger – starts the flow cyclically
- HTTP Request – retrieves document data from ScanservJS
- Nextcloud node – uploads the file directly to your Nextcloud account

Requirements:
- Local installation of ScanservJS (e.g., on a Raspberry Pi)
- Configured USB scanner
- Nextcloud access with write permissions in the target folder
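The sketch below shows the polling-and-upload idea outside n8n. The ScanservJS endpoint paths are assumptions that vary by version, so check your instance's API first; the Nextcloud upload uses the standard WebDAV path.

```typescript
// Poll ScanservJS for finished scans and push them into a Nextcloud folder.
const SCANSERV = "http://raspberrypi.local:8080";   // your ScanservJS instance
const NEXTCLOUD = "https://cloud.example.com";      // your Nextcloud server
const NC_USER = "scanner";
const NC_PASS = "app-password";

async function syncScans() {
  // 1. Ask ScanservJS for finished scans (assumed endpoint; verify on your install).
  const list = await fetch(`${SCANSERV}/api/v1/files`).then((r) => r.json());

  for (const file of list as Array<{ name: string }>) {
    // 2. Download the scan (assumed endpoint).
    const data = await fetch(
      `${SCANSERV}/api/v1/files/${encodeURIComponent(file.name)}`
    ).then((r) => r.arrayBuffer());

    // 3. Upload to Nextcloud via WebDAV into the target folder ("Scans" here).
    await fetch(`${NEXTCLOUD}/remote.php/dav/files/${NC_USER}/Scans/${file.name}`, {
      method: "PUT",
      headers: {
        Authorization: "Basic " + Buffer.from(`${NC_USER}:${NC_PASS}`).toString("base64"),
      },
      body: Buffer.from(data),
    });
  }
}

syncScans().catch(console.error);
```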
by Ari Nakos
This n8n workflow automates lead generation by searching Reddit for relevant posts based on keywords, filtering them, using OpenRouter AI to analyze and summarize content, and logging the findings (link, summary, etc.) to Google Sheets.

Watch the full setup tutorial on how I set up this ETL pipeline using n8n: https://youtu.be/F3-fbU3UmYQ

Required Authentication
To run this workflow, you need to set up credentials in n8n for:
- Reddit: Uses OAuth 2.0. Requires creating an app on Reddit to get a Client ID & Secret. (YT tutorial for Reddit app creation: https://youtu.be/zlGXtW4LAK8)
- OpenRouter: Uses an API key. Generate this key directly from your OpenRouter account settings. (YT tutorial: https://youtu.be/Cq5Y3zpEhlc)
- Google Sheets: Uses OAuth 2.0. Requires setup in Google Cloud Console (enable the Sheets API, create an OAuth Client ID with the n8n redirect URI) to get a Client ID & Secret.

Ensure these credentials are created and selected in the respective n8n nodes (Get Posts, OpenRouter Chat Model nodes, Output The Results).
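Here is a small TypeScript sketch of the filtering step that sits between the Reddit search and the AI summary. The keyword list and upvote threshold are illustrative, not the template's exact values; the post fields match the standard Reddit API response.

```typescript
// Keep only posts that mention a target keyword and have some engagement.
interface RedditPost {
  title: string;
  selftext: string;
  ups: number;
  permalink: string;
}

const KEYWORDS = ["automation", "lead generation"]; // example keywords

function filterPosts(posts: RedditPost[]): RedditPost[] {
  return posts.filter((p) => {
    const text = `${p.title} ${p.selftext}`.toLowerCase();
    const matchesKeyword = KEYWORDS.some((k) => text.includes(k));
    return matchesKeyword && p.ups >= 5; // skip low-engagement posts
  });
}

const example: RedditPost[] = [
  { title: "Looking for an automation tool", selftext: "Any recommendations?", ups: 12, permalink: "/r/smallbusiness/abc" },
];
console.log(filterPosts(example));
```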
by Nazmy
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

OAuth Token Generator and Validator

This n8n template helps you generate, validate, and store tokens for your customers securely using:
- **n8n** as your backend automation engine
- **Airtable** as your lightweight client and token store

🚀 What It Does
- Accepts client_id and client_secret via POST webhook.
- Validates client credentials against Airtable.
- Generates a long token on success.
- Stores the generated token in Airtable with metadata.
- Responds with a JSON containing the token, expiry, and type.
- Returns clear error messages if validation fails.

How It Works
1. Webhook node receives client_id and client_secret.
2. Validator (Code node) checks that the body contains only client_id and client_secret, and rejects missing or extra fields.
3. Airtable search: looks up the client_id and rejects it if not found.
4. Secret validation (If node): compares the provided client_secret with the stored value and rejects it if incorrect.
5. Token generation (Code node): generates a 128-character secure token (see the sketch after this listing).
6. Airtable create: stores the token, client ID, creation date, and type.
7. Webhook response: returns JSON { access_token, expires_in, token_type } on success, and appropriate JSON error messages on failure.

Related Workflow
You can also use it with the published Bearer Token Validation workflow:
👉 Validate API Requests with Bearer Token Authentication and Airtable to securely validate tokens you generate with this workflow across your protected endpoints.

Why Use This
- Provides OAuth-like flows without a complex backend.
- Uses n8n + Airtable for client management and token storage.
- Clean, modular, and ready for your SaaS or internal API automations.
- Extendable for token expiry, refresh, and rotation handling.

Enjoy building secure token-based APIs using n8n + Airtable! 🚀

Built by: Nazmy
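Below is a minimal sketch of the two Code-node steps described above, the body validator and the token generator. It is not the template's exact code, just one way to implement the same checks; 64 random bytes hex-encoded gives the 128-character token mentioned in the flow.

```typescript
import { randomBytes } from "node:crypto";

// Validator: accept only a body with exactly client_id and client_secret.
function validateBody(body: Record<string, unknown>): string | null {
  const keys = Object.keys(body).sort();
  if (keys.length !== 2 || keys[0] !== "client_id" || keys[1] !== "client_secret") {
    return "Request body must contain exactly client_id and client_secret";
  }
  return null; // valid
}

// Token generation: 64 random bytes hex-encoded = 128 characters.
function generateToken(): string {
  return randomBytes(64).toString("hex");
}

console.log(validateBody({ client_id: "abc", client_secret: "xyz" })); // null (valid)
console.log(generateToken().length); // 128
```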
by Preston Zeller
How It Works
This n8n workflow creates an automated system for discovering high-potential real estate investment opportunities. The workflow runs on a customizable schedule to scan the market for properties that match your specific criteria, then alerts your team about the most promising leads.

The process follows these steps:
1. Connects to the BatchData API on a regular schedule to search for properties matching your parameters
2. Compares new results with previous scans to identify new listings and property changes
3. Applies intelligent filtering to focus on high-potential opportunities (high equity, absentee owners, etc.)
4. Retrieves comprehensive property details and owner information for qualified leads
5. Delivers formatted alerts through multiple channels (email and Slack/Teams)

Each email alert includes detailed property information, owner details, equity percentage, and a direct Google Maps link to view the property location. The workflow also posts concise notifications to your team's communication channels for quick updates.

Who It's For
This workflow is designed for:
- Real Estate Investors: Find off-market properties with high equity and motivated sellers
- Real Estate Agents: Identify potential listing opportunities before they hit the market
- Property Acquisition Teams: Streamline the lead generation process with automated scanning
- Real Estate Wholesalers: Discover properties with significant equity spreads for potential deals
- REITs and Property Management Companies: Monitor market changes and expansion opportunities

The workflow is especially valuable for professionals who want to:
- Save hours of manual market research time
- Get early notifications about high-potential properties
- Access comprehensive property and owner information in one place
- Focus their efforts on the most promising opportunities

About BatchData
BatchData is a powerful property data platform for real estate professionals. Their API provides access to comprehensive property and owner information across the United States, including:
- Property details (bedrooms, bathrooms, square footage, year built, etc.)
- Valuation and equity estimates
- Owner information (name, mailing address, contact info)
- Transaction history and sales data
- Foreclosure and distressed property status
- Demographic and neighborhood data

The platform specializes in providing accurate, actionable property data that helps real estate professionals make informed decisions and identify opportunities efficiently. BatchData's extensive database covers millions of properties nationwide and is regularly updated to ensure data accuracy. The API's flexible search capabilities allow you to filter properties based on numerous criteria, making it an ideal data source for automated lead generation workflows like this one.
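The sketch below illustrates the filtering step (step 3) and the Google Maps link included in each alert. The property fields and thresholds are hypothetical placeholders, not BatchData's actual response schema.

```typescript
// Illustrative version of the high-potential filter; adjust fields to the real API response.
interface PropertyLead {
  address: string;
  equityPercent: number;
  absenteeOwner: boolean;
  lat: number;
  lng: number;
}

function filterHighPotential(leads: PropertyLead[]): PropertyLead[] {
  // Keep high-equity, absentee-owner properties; thresholds are examples only.
  return leads.filter((p) => p.equityPercent >= 50 && p.absenteeOwner);
}

function mapsLink(p: PropertyLead): string {
  // Direct Google Maps link included in each alert email.
  return `https://www.google.com/maps/search/?api=1&query=${p.lat},${p.lng}`;
}

const sample: PropertyLead[] = [
  { address: "123 Main St", equityPercent: 72, absenteeOwner: true, lat: 33.45, lng: -112.07 },
];
filterHighPotential(sample).forEach((p) => console.log(p.address, mapsLink(p)));
```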