by Teddy
Retrieve 20 Latest TechCrunch Articles

**Who is this for?**
This workflow is designed for developers, content creators, and data analysts who need to scrape recent articles from TechCrunch. It’s perfect for anyone looking to aggregate news articles or create custom feeds for analysis, reporting, or integration into other systems.

**What problem is this workflow solving?**
This workflow automates the process of scraping recent articles from TechCrunch. Manually collecting article data is time-consuming and inefficient; with this workflow, you can quickly gather up-to-date news articles with relevant metadata, saving time and effort.

**What this workflow does**
This workflow retrieves the latest 20 news articles from TechCrunch’s “Recent” page. It extracts the article URLs, metadata (such as titles and publication dates), and main content for each article, allowing you to access the information you need without any manual effort.

**Setup**
1. Clone or download the workflow template.
2. Ensure you have a working n8n environment.
3. Configure the HTTP Request nodes with your desired parameters to connect to TechCrunch.
4. (Optional) Customize the workflow to target specific sections or topics of interest.
5. Run the workflow to retrieve the latest 20 articles.

**How to customize this workflow to your needs**
- Modify the HTTP request to pull articles from different pages or sections of TechCrunch.
- Adjust the number of articles to retrieve by changing the selection criteria.
- Add additional processing steps to further filter or analyze the article data.

**Workflow Steps**
1. Send an HTTP request to the TechCrunch “Recent” page.
2. Parse the posts box that holds the list of articles.
3. Parse all posts to extract every article.
4. Split out the posts so each article is processed individually (see the sketch below).
5. Extract the URL and metadata from each article.
6. Send an HTTP request for each article using its URL.
7. Locate and parse the main content of each article.

Note: Be sure to update the HTTP Request nodes with any necessary headers or authentication to work with TechCrunch’s website.
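For illustration, the URL/metadata extraction step (step 5) might look roughly like this in an n8n Code node. This is a minimal sketch assuming the raw HTML of the “Recent” page is available as `$json.data`; the CSS class in the pattern is an assumption and must be checked against TechCrunch’s current markup.

```javascript
// Minimal sketch for an n8n Code node: pull article URLs and titles out of
// the raw HTML of TechCrunch's "Recent" page (available here as $json.data).
// The "loop-card__title-link" class is an assumption; inspect the live page
// and adjust the pattern to the current markup.
const html = $input.first().json.data;

const articles = [];
const linkPattern = /<a[^>]*class="[^"]*loop-card__title-link[^"]*"[^>]*href="([^"]+)"[^>]*>([^<]+)<\/a>/g;

let match;
while ((match = linkPattern.exec(html)) !== null && articles.length < 20) {
  articles.push({
    url: match[1],
    title: match[2].trim(),
  });
}

// One n8n item per article, ready for per-item HTTP requests downstream.
return articles.map(article => ({ json: article }));
```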
by Mauricio Perera
n8n Workflow: Calculate the Centroid of a Set of Vectors

**Overview**
This workflow receives an array of vectors in JSON format, validates that all vectors have the same dimensions, and computes the centroid. It is designed to be reusable across different projects.

**Workflow Structure**

Nodes and their functions:

1. **Receive Vectors (Webhook)**: Accepts a GET request containing an array of vectors in the vectors parameter.
   - Expected input: vectors parameter in JSON format.
   - Example request: /webhook/centroid?vectors=[[2,3,4],[4,5,6],[6,7,8]]
   - Output: passes the received data to the next node.
2. **Extract & Parse Vectors (Set node)**: Converts the input string into a proper JSON array for processing and ensures vectors is a valid array. If the parameter is missing, it may generate an error.
   - Expected output example: { "vectors": [[2,3,4],[4,5,6],[6,7,8]] }
3. **Validate & Compute Centroid (Code node)**: Validates vector dimensions and calculates the centroid.
   - Validation: ensures all vectors have the same number of dimensions.
   - Computation: averages each dimension to determine the centroid.
   - If validation fails, it returns an error message indicating inconsistent dimensions.
   - Successful output example: { "centroid": [4,5,6] }
   - Error output example: { "error": "Vectors have inconsistent dimensions." }
4. **Return Centroid Response (Respond to Webhook node)**: Sends the final response back to the client. If the computation is successful, it returns the centroid; if an error occurs, it returns a descriptive error message.
   - Example response: { "centroid": [4, 5, 6] }

**Inputs**
A JSON array of vectors, where each vector is an array of numerical values.

Example input:

```json
{
  "vectors": [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
  ]
}
```

**Setup Guide**
1. Create a new workflow in n8n.
2. Add a Webhook node (Receive Vectors) to receive JSON input.
3. Add a Set node (Extract & Parse Vectors) to extract and convert the data.
4. Add a Code node (Validate & Compute Centroid) to validate dimensions and compute the centroid.
5. Add a Respond to Webhook node (Return Centroid Response) to return the result.

**Function Node Script Example**

```javascript
const input = items[0].json;
const vectors = input.vectors;

if (!Array.isArray(vectors) || vectors.length === 0) {
  return [{ json: { error: "Invalid input: Expected an array of vectors." } }];
}

const dimension = vectors[0].length;
if (!vectors.every(v => v.length === dimension)) {
  return [{ json: { error: "Vectors have inconsistent dimensions." } }];
}

const centroid = new Array(dimension).fill(0);
vectors.forEach(vector => {
  vector.forEach((val, index) => {
    centroid[index] += val;
  });
});

for (let i = 0; i < dimension; i++) {
  centroid[i] /= vectors.length;
}

return [{ json: { centroid } }];
```

**Testing**
Use a tool like Postman or the n8n UI to send sample inputs and verify the responses. Modify the input vectors to test different scenarios.

This workflow provides a simple yet flexible solution for vector centroid computation, ensuring validation and reliability.
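For a quick end-to-end check from a script rather than Postman, a request along these lines should return the centroid (the base URL is a placeholder for your own n8n instance; the webhook path is whatever n8n assigns):

```javascript
// Quick test of the centroid webhook. The base URL is a placeholder
// for your own n8n instance.
const vectors = [[1, 2, 3], [4, 5, 6], [7, 8, 9]];
const url = 'https://your-n8n-instance/webhook/centroid?vectors='
  + encodeURIComponent(JSON.stringify(vectors));

const response = await fetch(url);
console.log(await response.json()); // expected: { "centroid": [4, 5, 6] }
```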
by Zacharia Kimotho
This workflow takes over the chore of regularly backing up your workflows, which is usually done on GitHub, and instead uses Google Drive as the main tool to host the backups. This can be a good way to keep track of your workflows so that you never lose any in case your n8n instance goes down.

**How does it work**
- Creates a new folder, inside a specified parent folder, named with the time of the backup.
- Loops over all workflows, converts each one to a JSON file, and uploads them to the created folder (see the sketch below).
- Fetches the previous backups and deletes them.

This keeps things clean and simple: the backup stays current without accumulating a cache of old workflow files on your drive.

**Setup**
1. Create a new folder in Google Drive.
2. Create new service account credentials.
3. Share the folder with the service account email.
4. Upload this workflow to your canvas and map the credentials.
5. Set the schedule on which your workflows should be backed up and manage your backups.
6. Activate the workflow.

Happy Productivity!
@Imperol
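As an illustration, the convert-to-JSON-file step could look like this in an n8n Code node. This is a minimal sketch assuming each incoming item holds one workflow object fetched via the n8n API node; the file-naming scheme is an assumption.

```javascript
// Minimal sketch of the "convert each workflow to a JSON file" step in an
// n8n Code node, assuming each incoming item holds one workflow object
// fetched via the n8n API node.
return $input.all().map(item => {
  const workflow = item.json;
  const fileName = `${workflow.name || 'workflow'}-${workflow.id}.json`;

  return {
    json: { fileName },
    binary: {
      data: {
        // Base64-encode the workflow definition so the Google Drive node
        // can upload it as a file.
        data: Buffer.from(JSON.stringify(workflow, null, 2)).toString('base64'),
        fileName,
        mimeType: 'application/json',
      },
    },
  };
});
```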
by Baptiste Fort
**Who is it for?**
This workflow is for marketers, sales teams, and local businesses who want to quickly collect leads (business name, phone, website, and email) from Google Maps and store them in Airtable. You can use it for real estate agents, restaurants, therapists, or any local niche.

**How it works**
1. Scrape Google Maps with the Apify Google Maps Extractor.
2. Clean and structure the data (name, address, phone, website).
3. Visit each website and retrieve the raw HTML.
4. Use GPT to extract the most relevant email from the site content.
5. Save everything to Airtable for easy filtering and future outreach.

It works for any location or keyword – just adapt the input in Apify.

**Requirements**
Before running this workflow, you’ll need:
- ✅ An Apify account (to use the Google Maps Extractor)
- ✅ An OpenAI API key (for GPT email extraction)
- ✅ An Airtable account & base with the following fields: Business Name, Address, Website, Phone Number, Email, Google Maps URL

**Airtable Structure**
Your Airtable base should contain these columns:

| Title | Street | Website | Phone Number | Email | URL |
|-------------------------|-------------------------|--------------------|-----------------|------------------------|----------------------|
| Paris Real Estate Agency| 10 Rue de Rivoli, Paris | https://agency.fr | +33 1 23 45 67 | contact@agency.fr | maps.google.com/... |
| Example Business 2 | 25 Avenue de l’Opéra | https://example.fr | +33 1 98 76 54 | info@example.fr | maps.google.com/... |
| Example Business 3 | 8 Boulevard Haussmann | https://demo.fr | +33 1 11 22 33 | contact@demo.fr | maps.google.com/... |

**Error Handling**
- **Missing websites:** If a business has no website, the flow skips the scraping step.
- **No email found:** GPT returns Null if no email is detected.
- **API rate limits:** Add a Wait node between requests to avoid Apify/OpenAI throttling.

Now let’s take a detailed look at how to set up this automation, using real estate agencies in Paris as an example.

**Step 1 – Launch the Google Maps Scraper**
Start with a When clicking ‘Execute workflow’ trigger to launch the flow manually. Then, add an HTTP Request node with the method set to POST.

👉 Head over to the Apify Google Maps Extractor page: https://apify.com/compass/google-maps-extractor

On the page:
- Enter your business keyword (e.g., real estate agency, hairdresser, restaurant)
- Set the location you want to target (e.g., Paris, France)
- Choose how many results to fetch (e.g., 50)
- Optionally, use filters (only places with a website, by category, etc.)

⚠️ No matter your industry, this works — just adapt the keyword and location.

Once everything is filled in:
1. Click Run to test.
2. Go to the top right → click on API.
3. Select the API endpoints tab.
4. Choose Run Actor synchronously and get dataset items.
5. Copy the URL and paste it into your HTTP Request node (in the URL field).

Then enable:
- ✅ Body Content Type → JSON
- ✅ Specify Body Using JSON

Go back to Apify, click on the JSON tab, copy the entire code, and paste it into the JSON body field of your HTTP Request.

At this point, if you run your workflow, you should see a structured output with fields such as title, subTitle, price, categoryName, address, neighborhood, street, city, postalCode, and so on.

**Step 2 – Clean and structure the data**
Once the raw data is fetched from Apify, we clean it up using the Edit Fields node.
In this step, we manually select and rename the fields we want to keep:
- Title → {{ $json.title }}
- Address → {{ $json.address }}
- Website → {{ $json.website }}
- Phone → {{ $json.phone }}
- URL → {{ $json.url }}

This node lets us keep only the essentials in a clean format, ready for the next steps. On the right: a clear and usable table, easy to work with.

**Step 3 – Loop Over Items**
Now that our data is clean (see step 2), we’ll go through it item by item to handle each contact individually. The Loop Over Items node does exactly that: it takes each row from the table (each contact pulled from Apify) and runs the next steps on them, one by one.

👉 Just set a Batch Size of 20 (or more, depending on your needs). Nothing tricky here, but this step is essential to keep the flow dynamic and scalable.

**Step 4 – Edit Fields (again)**
After looping through each contact one by one (thanks to Loop Over Items), we’re refining the data a bit more. This time, we only want to keep the website. We use the Edit Fields node again, in Manual Mapping mode, with just:
- Website → {{ $json.website }}

The result on the right? A clean list with only the URLs extracted from Google Maps.
🔧 This simple step helps isolate the websites so we can scrape them one by one in the next part of the flow.

**Step 5 – Scrape Each Website with an HTTP Request**
Let’s continue the flow: in the previous step, we isolated the websites into a clean list. Now we’re going to send a request to each URL to fetch the content of the site.

➡️ To do this, we add an HTTP Request node, using the GET method, and set the URL as {{ $json.website }} (this value comes from the previous Edit Fields node).

This node will simply “visit” each website automatically and return the raw HTML code (as shown on the right).
📄 That’s the material we’ll use in the next step to extract email addresses (and any other useful info). We’re not reading this code manually — the next step scans through it to detect the patterns that matter to us. This is a technical but crucial step: it’s how we turn a URL into real, usable data.

**Step 6 – Extract the Email with GPT**
Now that we’ve retrieved all the raw HTML from the websites using the HTTP Request node, it’s time to analyze it.
💡 Goal: detect the most relevant email address on each site (ideally the main contact or owner).
👉 To do that, we’ll use an OpenAI node (Message a Model). Here’s how to configure it:

⚙️ Key parameters:
- Model: GPT-4.1-mini (or any GPT-4+ model available)
- Operation: Message a Model
- Resource: Text
- Simplify Output: ON
- Prompt (message you provide):

> Look at this website content and extract only the email I can contact this business. In your output, provide only the email and nothing else. Ideally, this email should be of the business owner, so if you have 2 or more options, try for most authoritative one. If you don't find any email, output 'Null'. Exemplary output of yours: name@examplewebsite.com
> {{ $json.data }}

**Step 7 – Save the Data in Airtable**
Once we’ve collected everything — the business name, address, phone number, website… and most importantly the email extracted via ChatGPT — we need to store all of this somewhere clean and organized.
👉 The best place in this workflow is Airtable.

📦 Why Airtable? Because it allows you to:
- Easily view and sort the leads you’ve scraped
- Filter, tag, or enrich them later
- And most importantly… reuse them in future automations

⚙️ What we’re doing here:
We add an Airtable → Create Record node to insert each lead into our database.
Inside this node, we manually map each field with the data collected in the previous steps:

| Airtable Field | Description | Value from n8n |
| -------------- | ------------------------ | ------------------------------------------ |
| Title | Business name | {{ $('Edit Fields').item.json.Title }} |
| Street | Full address | {{ $('Edit Fields').item.json.Address }} |
| Website | Website URL | {{ $('Edit Fields').item.json.Website }} |
| Phone Number | Business phone number | {{ $('Edit Fields').item.json.Phone }} |
| Email | Email found by ChatGPT | {{ $json.message.content }} |
| URL | Google Maps listing link | {{ $('Edit Fields').item.json.URL }} |

🧠 Reminder: we’re keeping only clean, usable data — ready to be exported, analyzed, or used in cold outreach campaigns (email, CRM, enrichment, etc.).
➡️ And the best part? You can rerun this workflow automatically every week or month to keep collecting fresh leads 🔁.
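If you want to test the Apify call outside n8n first, a sketch like the following mirrors what the Step 1 HTTP Request node does. The token is a placeholder, and the input keys are illustrative assumptions: copy the real input object from the actor’s JSON tab in Apify.

```javascript
// Sketch of the Apify "run actor synchronously and get dataset items" call
// performed by the HTTP Request node. The token is a placeholder, and the
// input keys below are illustrative assumptions; copy the real input
// object from the actor's JSON tab in Apify.
const response = await fetch(
  'https://api.apify.com/v2/acts/compass~google-maps-extractor/run-sync-get-dataset-items?token=YOUR_APIFY_TOKEN',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      searchStringsArray: ['real estate agency'],
      locationQuery: 'Paris, France',
      maxCrawledPlacesPerSearch: 50,
    }),
  }
);

const places = await response.json();
console.log(places.map(p => ({ title: p.title, website: p.website, phone: p.phone })));
```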
by Halfbit 🚀
Jura Coffee Counter: Webhook API & Google Sheets Logger ☕️

Track how many coffees your Jura E8 espresso machine makes — fully automated via webhook and Google Sheets.

This workflow exposes a custom API endpoint that can be called by smart devices, such as an ESP8266 or ESP32 reading data from a Jura E8 coffee machine via Bluetooth Low Energy (BLE). The incoming data (including the total coffee count) is timestamped and appended to a Google Sheet, making it easy to visualize or analyze your machine usage.

☕ Originally built for a Jura E8, based on the AlexxIT/Jura reverse-engineering project.

> 📝 This workflow uses Google Sheets as a logging backend. You can easily switch it to Airtable, Notion, or a database of your choice.

Live example available at: https://halfbitstudio.com/o-nas/

> 🖥️ In our setup, this workflow is used to provide real-time coffee consumption stats displayed directly on our website.
> 🔌 Some Jura machines require an accessory Bluetooth transmitter to enable connectivity. Communication is based on the Bluetooth Low Energy (BLE) protocol.

**Use Case**
- Tracking usage of a Jura coffee machine
- Logging IoT sensor data into Google Sheets
- Creating dashboards for daily consumption
- Smart office setups with coffee stats!

**Features**
- ☁️ Two webhook endpoints:
  - POST /{{WEBHOOK_POST_PATH}} — receives JSON from the ESP (coffee machine reader)
  - GET /{{WEBHOOK_GET_PATH}} — returns the latest records as JSON
- 📅 Timestamping via the Date & Time node
- 🔹 Coffee counter extraction from the incoming JSON
- 🧾 Appends structured rows to Google Sheets
- 📤 Webhook response for external status or dashboards

**Setup Instructions**

Jura Coffee Machine Integration (Hardware)
1. Use an ESP device (e.g. ESP8266 or ESP32) to connect to the Jura E8 via Bluetooth Low Energy (BLE).
2. Send POST requests with a JSON payload like `{ "total_coffees": 123 }` (see the sketch below).
3. Reverse-engineered protocol reference: AlexxIT/Jura

Google Sheets Configuration
1. Create a new Google Sheet with column headers like: date | time | coffee counter
2. Connect your Google account in n8n and authorize access to this sheet.
3. Replace the documentId and sheetName fields in the Google Sheets nodes:
   - Use the full URL of your spreadsheet
   - Use the actual sheet name (e.g. Sheet1)

**Environment Variables & Placeholders**

| Placeholder | Description |
| ------------------------ | ----------------------------------------------- |
| {{WEBHOOK_POST_PATH}} | Endpoint to receive coffee counter data |
| {{WEBHOOK_GET_PATH}} | Endpoint to return latest data (for dashboards) |
| {{SHEET_ID}} | Google Spreadsheet ID |
| {{GOOGLE_CREDENTIALS}} | OAuth2 credentials for Google Sheets |
| {{DATA_COLUMNS}} | Column names in the target sheet |

**Testing the Workflow**
1. Send a test request: use Postman or the ESP to send a POST request to /{{WEBHOOK_POST_PATH}}; the body should include a total_coffees value.
2. Check the Google Sheet: open your sheet and verify that a new row was appended.
3. Test the GET endpoint: access the second webhook URL (e.g. /{{WEBHOOK_GET_PATH}}) in a browser or fetch it via API.
4. Optional: use the Respond to Webhook output in a dashboard or frontend.

**Customization Tips**
- **Sheet format**: Add more columns if you want to track additional data (e.g. machine temperature, errors)
- **Output format**: Replace Google Sheets with any other storage (e.g. MySQL, Notion)
- **Auth layer**: Add basic auth or token verification if needed for public exposure
- **Notifications**: Send alerts to Discord/Slack when reaching thresholds (e.g. 200 coffees brewed)

Tags: google-sheets, iot, webhook, jura, coffee, api, automation
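A quick way to exercise the POST endpoint without the hardware is a small script like this (the URL is a placeholder for your n8n instance and webhook path):

```javascript
// Simulate the ESP's report without the hardware: POST a coffee count to
// the n8n webhook. The URL is a placeholder for your own instance/path.
const res = await fetch('https://your-n8n-instance/webhook/jura-coffee', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ total_coffees: 123 }),
});

console.log(res.status); // 200 if the workflow accepted the payload
```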
by Emmanuel Bernard
🎥 AI Video Generator with HeyGen

🚀 **Create AI-Powered Videos in n8n with HeyGen**
This workflow enables you to generate realistic AI videos using HeyGen, an advanced AI platform for video automation. Simply input your text, choose an AI avatar and voice, and let HeyGen generate a high-quality video for you – all within n8n!

✅ Ideal for:
- Content creators & marketers 🏆
- Automating personalized video messages 📩
- AI-powered video tutorials & training materials 🎓

🔧 **How It Works**
1️⃣ Provide a text script – this will be spoken in the AI-generated video.
2️⃣ Select an avatar & voice – choose from a variety of AI-generated avatars and voices.
3️⃣ Run the workflow – HeyGen processes your request and generates a video.
4️⃣ Download your video – get the direct link to your AI-powered video!

⚡ **Setup Instructions**
1️⃣ Get your HeyGen API key
- Sign up for a HeyGen account.
- Go to your account settings and retrieve your API key.
2️⃣ Configure n8n credentials
- In n8n, create new credentials and select "Custom Auth" as the authentication type.
- In the Name field, enter X-Api-Key, and in the Value field, paste your HeyGen API key.
- Update the two HTTP Request nodes with these credentials.
3️⃣ Select an AI avatar & voice
- Browse the available avatars & voices in your HeyGen account.
- Copy the Avatar ID and Voice ID for your video.
4️⃣ Run the workflow
- Enter your text, avatar ID, and voice ID.
- Execute the workflow – your video will be generated automatically!

🎯 **Why Use This Workflow?**
✔️ Fully automated – no manual editing required!
✔️ Realistic AI avatars – choose from a variety of digital avatars.
✔️ Seamless integration – works directly within your n8n workflow.
✔️ Scalable & fast – generate multiple videos in minutes.

🔗 Start automating AI-powered video creation today with n8n & HeyGen!
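For orientation, the request made by the first HTTP node looks roughly like the sketch below. The payload shape follows HeyGen’s v2 “generate video” endpoint as commonly documented; treat the field names as assumptions and verify them against HeyGen’s current API reference.

```javascript
// Sketch of the video-generation request. The payload shape follows
// HeyGen's v2 "generate video" endpoint; verify field names against the
// current HeyGen API reference before relying on it. All IDs and the key
// are placeholders.
const res = await fetch('https://api.heygen.com/v2/video/generate', {
  method: 'POST',
  headers: {
    'X-Api-Key': 'YOUR_HEYGEN_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    video_inputs: [
      {
        character: { type: 'avatar', avatar_id: 'YOUR_AVATAR_ID' },
        voice: { type: 'text', input_text: 'Hello from n8n!', voice_id: 'YOUR_VOICE_ID' },
      },
    ],
    dimension: { width: 1280, height: 720 },
  }),
});

const { data } = await res.json();
console.log(data.video_id); // poll HeyGen's status endpoint with this ID
```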
by Oneclick AI Squad
This n8n template demonstrates how to create a comprehensive marketing automation and booking system that combines Excel-based lead management with voice-powered customer interactions. The system uses VAPI for voice communication and Excel/Google Sheets for data management, making it ideal for restaurants seeking to automate marketing campaigns and streamline booking processes through intelligent voice AI technology.

**Good to know**
- Voice processing requires an active VAPI subscription with per-minute billing
- Excel operations are handled in real time with immediate data synchronization
- The system can handle multiple simultaneous voice calls and lead processing
- All customer data is stored securely in Excel with proper formatting and validation
- Marketing campaigns can be scheduled and automated based on lead data

**How it works**

Lead Management & Marketing Automation Workflow
1. New Lead Trigger: Excel triggers capture new leads when customers are added to the lead management spreadsheet
2. Lead Preparation: The system processes and formats lead data, extracting relevant details (name, phone, preferences, booking history)
3. Campaign Loop: An automated loop processes multiple leads for batch marketing campaigns
4. Voice Marketing Call: VAPI initiates personalized voice calls to leads with tailored restaurant offers and booking invitations
5. Response Tracking: All call results and lead responses are logged back to Excel for campaign analysis

Booking & Order Processing Workflow
1. Voice Response Capture: A VAPI webhook triggers when customers respond to marketing calls or make direct booking requests
2. Response Storage: Customer responses and booking preferences are immediately saved to Excel sheets
3. Information Extraction: The system processes natural-language responses to extract booking details (party size, preferred times, special requests); a minimal sketch of this step appears at the end of this template
4. Calendar Integration: Booking information is automatically scheduled in restaurant management systems
5. Confirmation Loop: Automated follow-up voice messages confirm bookings and provide additional restaurant information

**Excel Sheet Structure**

Lead Management Sheet

| Column | Description |
|--------|-------------|
| lead_id | Unique identifier for each lead |
| customer_name | Customer's full name |
| phone_number | Primary contact number |
| email | Customer email address |
| last_visit_date | Date of last restaurant visit |
| preferred_cuisine | Customer's food preferences |
| party_size_typical | Usual number of guests |
| preferred_time_slot | Preferred dining times |
| marketing_consent | Permission for marketing calls |
| lead_source | How customer was acquired |
| lead_status | Current status (new, contacted, converted, inactive) |
| last_contact_date | Date of last marketing contact |
| notes | Additional customer information |
| created_at | Lead creation timestamp |

Booking Responses Sheet

| Column | Description |
|--------|-------------|
| response_id | Unique response identifier |
| customer_name | Customer's name from call |
| phone_number | Contact number used for call |
| booking_requested | Whether customer wants to book |
| party_size | Number of guests requested |
| preferred_date | Requested booking date |
| preferred_time | Requested time slot |
| special_requests | Dietary restrictions or special occasions |
| call_duration | Length of VAPI call |
| call_outcome | Result of marketing call |
| follow_up_needed | Whether additional contact is required |
| booking_confirmed | Final booking confirmation status |
| created_at | Response timestamp |

Campaign Tracking Sheet

| Column | Description |
|--------|-------------|
| campaign_id | Unique campaign identifier |
| campaign_name | Descriptive campaign title |
| target_audience | Lead segments targeted |
| total_leads | Number of leads contacted |
| successful_calls | Calls that connected |
| bookings_generated | Number of bookings from campaign |
| conversion_rate | Percentage of leads converted |
| campaign_cost | Total VAPI usage cost |
| roi | Return on investment |
| start_date | Campaign launch date |
| end_date | Campaign completion date |
| status | Campaign status (active, completed, paused) |

**How to use**
1. Setup: Import the workflow into your n8n instance and configure VAPI credentials
2. Excel Configuration: Set up Excel/Google Sheets with the required sheet structure provided above
3. Lead Import: Populate the Lead Management sheet with customer data from various sources
4. Campaign Setup: Configure marketing message templates in the VAPI nodes to match your restaurant's branding
5. Testing: Test voice commands such as "I'd like to book a table for tonight" or "What are your specials?"
6. Automation: Enable triggers to automatically process new leads and schedule marketing campaigns
7. Monitoring: Track campaign performance through the Campaign Tracking sheet and adjust strategies accordingly

The system can handle multiple concurrent voice calls and scales with your restaurant's marketing needs.

**Requirements**
- **VAPI account** for voice processing and natural language understanding
- **Excel/Google Sheets** for storing lead, booking, and campaign data
- **n8n instance** with Excel/Sheets and VAPI integrations enabled
- **Valid phone numbers** for lead contact and compliance with local calling regulations

**Customising this workflow**
- **Multi-location Support**: Adapt voice AI automation for restaurant chains with location-specific offers
- **Seasonal Campaigns**: Try popular use cases such as holiday promotions, special event marketing, or loyalty program outreach
- **Integration Options**: The workflow can be extended to include CRM integration, SMS follow-ups, and social media campaign coordination
- **Advanced Analytics**: Add nodes for detailed campaign performance analysis and customer segmentation
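As promised above, here is a minimal sketch of the "Information Extraction" step as an n8n Code node. The payload paths (e.g. `$json.body.transcript`) and the regex patterns are assumptions; adapt them to the actual VAPI webhook message your account sends.

```javascript
// Minimal sketch of the "Information Extraction" step in an n8n Code node.
// The payload shape ($json.body.transcript) is an assumption; adapt the
// paths to the actual VAPI webhook message you receive.
const transcript = ($input.first().json.body?.transcript || '').toLowerCase();

// Naive pattern matching as a stand-in for the voice AI's structured output.
const partySize = transcript.match(/(?:table|party) (?:of|for) (\d+)/)?.[1] ?? null;
const wantsBooking = /book|reservation|table/.test(transcript);

return [{
  json: {
    booking_requested: wantsBooking,
    party_size: partySize ? Number(partySize) : null,
    special_requests: /vegan|vegetarian|gluten/.test(transcript)
      ? 'dietary restriction mentioned'
      : '',
    created_at: new Date().toISOString(), // matches the sheet's timestamp column
  },
}];
```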
by Yaron Been
Automated pipeline that extracts job listings from Upwork and exports them to Google Sheets for better organization, analysis, and team collaboration.

🚀 **What It Does**
- Fetches job postings based on saved searches
- Extracts key job details (title, budget, description)
- Organizes data in Google Sheets
- Updates in real time
- Supports multiple search criteria

🎯 **Perfect For**
- Freelancers tracking opportunities
- Teams managing multiple projects
- Agencies monitoring client needs
- Market researchers
- Business analysts

⚙️ **Key Benefits**
- ✅ Centralized job board
- ✅ Easy sharing with team members
- ✅ Advanced filtering and sorting
- ✅ Historical data tracking
- ✅ Customizable data points

🔧 **What You Need**
- Upwork account
- Google account
- n8n instance
- Google Sheets setup

📊 **Data Exported**
- Job title and description
- Budget and hourly rate
- Client information
- Posted date
- Required skills
- Job URL

🛠️ **Setup & Support**
Quick setup: get started in 15 minutes with our step-by-step guide.
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help

Streamline your job search and opportunity tracking with automated data collection and organization.
by bangank36
This workflow restores all n8n instance credentials from GitHub backups using the n8n API node. It complements the Backup Your Credentials to GitHub template by allowing users to seamlessly restore previously saved credentials.

**How It Works**
The workflow fetches credentials stored in a GitHub repository and imports them into your n8n instance.

**Setup Instructions**
To configure the workflow, update the Globals node with the following values:
- **repo.owner** – Your GitHub username
- **repo.name** – The name of the GitHub repository storing the credentials
- **repo.path** – The folder path within the repository where credentials are stored

For example, if your GitHub username is john-doe, your repository is named n8n-backups, and credentials are stored in a credentials/ folder, you would set:
- repo.owner → john-doe
- repo.name → n8n-backups
- repo.path → credentials/

**Required Credentials**
- **GitHub API** – Access to your repository
- **n8n API** – To import credentials into your n8n instance

**Who Is This For?**
This template is ideal for users who want to restore their credentials from GitHub backups, ensuring easy migration and recovery in case of data loss.

Check out my other templates: 👉 My n8n Templates
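Under the hood, the fetch step amounts to listing the backup folder via the GitHub contents API and decoding each file. A sketch, assuming the path is a folder of one-credential-per-file JSON backups (owner, repo, path, and token are placeholders matching the Globals example above):

```javascript
// Sketch of the restore fetch: list the backup folder via the GitHub
// contents API and decode each credential file. Assumes the path is a
// folder of one-credential-per-file JSON backups; owner/repo/path/token
// are placeholders.
const owner = 'john-doe', repo = 'n8n-backups', path = 'credentials';
const headers = {
  Authorization: 'Bearer YOUR_GITHUB_TOKEN',
  Accept: 'application/vnd.github+json',
};

const listing = await fetch(
  `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
  { headers }
).then(r => r.json());

for (const file of listing) {
  const blob = await fetch(file.url, { headers }).then(r => r.json());
  const credential = JSON.parse(Buffer.from(blob.content, 'base64').toString('utf8'));
  console.log(credential.name); // each object can be handed to the n8n API node
}
```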
by bangank36
This workflow converts an exported CSV from Squarespace profiles into a Shopify-compatible format for customer import.

**How It Works**
Clone this Google Sheets template, which includes two sheets:
- **Squarespace Profiles** (input)
  - Go to Squarespace Dashboard → Contacts
  - Click the three-dot icon → select Export all Contacts
- **Shopify Customers** (output)
  - This sheet formats the data to match Shopify's customer import CSV.
  - Shopify Dashboard → Customers → Import customers by CSV

The workflow can run on demand or be triggered via webhook.

**Via webhook**
1. Set up the Webhook node to expect a POST request.
2. Trigger the webhook using this (pseudo) code – replace {webhook-url} with the actual URL:

```javascript
const formData = new FormData();
formData.append('file', blob, 'profiles_export.csv'); // add the file to FormData

fetch('{webhook-url}', { // replace with your target URL
  method: 'POST',
  mode: 'no-cors',
  body: formData
});
```

3. The data is processed into the Shopify Customers sheet.

**Manual trigger**
1. Import the Squarespace profiles into the sheet.
2. Run the workflow to convert and populate the Shopify Customers sheet.
3. Once the workflow is done, export the Shopify Customers sheet to CSV and import it into Shopify customers.

**Requirements**
To use this template, you need:
- Google Sheets API credentials

**Google Sheets Setup**
Use this sample Google Sheets template to get started quickly.

**Who Is This For?**
For anyone looking to automate Squarespace contact exports into a Shopify-compatible format—no more manual conversion!

**Explore More Templates**
Check out my other n8n templates: 👉 n8n.io/creators/bangank36
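The heart of the conversion is a column mapping. A sketch of it as an n8n Code-node transform follows; the input and output column names are assumptions based on typical Squarespace contact exports and Shopify's customer CSV template, so align them with your actual sheet headers.

```javascript
// Sketch of the Squarespace → Shopify column mapping as an n8n Code node.
// The column names on both sides are assumptions based on typical
// Squarespace exports and Shopify's customer CSV template; align them
// with your actual sheet headers.
return $input.all().map(item => {
  const row = item.json;
  return {
    json: {
      'First Name': row['First Name'] || '',
      'Last Name': row['Last Name'] || '',
      'Email': row['Email'] || '',
      'Accepts Email Marketing': row['Subscribed'] === 'Yes' ? 'yes' : 'no',
      'Phone': row['Phone'] || '',
      'Tags': 'squarespace-import', // makes imported customers easy to filter
    },
  };
});
```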
by Adrian Bent
This workflow takes two inputs: a YouTube video URL (required) and a description of what information to extract from the video. If the description ("what you want") field is left empty, the default prompt will generate a detailed summary and description of the video's contents. However, you can ask for something more specific using this field/input.

++ Don't forget to make the workflow Active and use the production URL from the form node.

**Benefits**
- Instant Summary Generation – convert hours of YouTube video into familiar, structured paragraphs and sentences in less than a minute
- Live Integration – generate a summary or extract information from a YouTube video whenever, wherever
- Virtually Complete Automation – all that needs to be done is to add the video URL and describe what you want to know from the video
- Presentation – you can ask for a specific structure or tone to better help you understand or study the contents of the video

**How It Works**
1. Smart Form Interface: a simple n8n form captures the video URL and a description of what's to be extracted, designed for rapid and repeated completion anywhere and anytime.
2. Description Check: uses JavaScript to determine whether the description was filled in or left empty. If it was left empty, the default prompt is "Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary." If it is filled, the input is used to describe what information to extract from the video.
3. HTTP Request: we're using the Gemini API, specifically the video understanding endpoint. We make a POST HTTP request passing the video URL and the description of what information to extract.

**Setup Instructions**
- HTTP Request setup: sign up for a Google Cloud account, join the Developer Program, and get your Gemini API key. Get the curl command for the Gemini Video Understanding API.
- The video understanding relies on the inputs from the form, Code, and HTTP Request nodes, so correct mapping is essential for the workflow to function correctly.

Feel free to reach out for additional help or clarification at my Gmail: terflix45@gmail.com, and I'll get back to you as soon as I can.

**Setup Steps**
1. Code node setup: the Code node is used as a filter to ensure a description prompt is always passed on. Use the JavaScript code below for that effect (the field key must match your form field's label exactly):

```javascript
// Ensure a description prompt is always passed on. The key
// 'What do you want?' must match the form field's label exactly.
for (const item of $input.all()) {
  if ((item.json['What do you want?'] || '').trim() === '') {
    item.json['What do you want?'] =
      'Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary';
  }
}
return $input.all();
```

2. HTTP Request: to use Gemini Video Understanding, you'll need your Gemini API key. Go to https://ai.google.dev/gemini-api/docs/video-understanding#youtube. This link will take you directly to the snippet. Select REST as the programming language, copy the curl command, and paste it into the HTTP Request node. Then:
   - Replace "Please summarize the video in 3 sentences." with the Code node's output, which should be either the default description or the one entered by the user (the second output field variable).
   - Replace "https://www.youtube.com/watch?v=9hE5-98ZeCg" with the n8n form node's first output field, which should be the YouTube video URL variable.
   - Replace $GEMINI_API_KEY with your API key.
3. Redirect: use an n8n Form node with page type "Final Ending" to redirect the user to the initial n8n form for another analysis, or to a preferred destination.
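For reference, the REST call the HTTP Request node ends up making looks roughly like this, mirroring the curl snippet from the Gemini video-understanding docs. The model name and exact field casing should be checked against the current docs.

```javascript
// Sketch of the REST call made by the HTTP Request node, mirroring the
// curl snippet from the Gemini video-understanding docs. Check the model
// name and field casing against the current documentation.
const apiKey = process.env.GEMINI_API_KEY; // placeholder
const res = await fetch(
  `https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${apiKey}`,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      contents: [{
        parts: [
          { text: 'Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary.' },
          { file_data: { file_uri: 'https://www.youtube.com/watch?v=9hE5-98ZeCg' } },
        ],
      }],
    }),
  }
);

const data = await res.json();
console.log(data.candidates?.[0]?.content?.parts?.[0]?.text);
```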
by Anurag
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Description**
This workflow automates document processing and structured table extraction using the Nanonets API. You can submit a PDF file via an n8n form trigger or webhook—the workflow then forwards the document to Nanonets, waits for asynchronous parsing to finish, retrieves the results (including header fields and line items/tables), and returns the output as an Excel file. Ideal for automating invoice, receipt, or order data extraction with downstream business use.

**How It Works**
1. A document is uploaded (via n8n form or webhook).
2. The PDF is sent to the Nanonets Workflow API for parsing.
3. The workflow waits until processing is complete.
4. Parsed results are fetched.
5. Both top-level fields and any table rows/line items are extracted and restructured.
6. Data is exported to Excel format and delivered to the requester.

**Setup Steps**
1. Nanonets account: register for a Nanonets account and set up a workflow for your specific document type (e.g., invoice, receipt).
2. Credentials in n8n: add HTTP Basic Auth credentials in n8n for the Nanonets API (never store credentials directly in node parameters).
3. Webhook/form configuration:
   - Option 1: configure and enable the included n8n Form Trigger node for document uploads.
   - Option 2: use the included Webhook node to accept external POSTs with a PDF file.
4. Adjust the workflow: update any HTTP nodes to use your credential profile, and insert your Nanonets Workflow ID in all relevant nodes.
5. Test the workflow: enable the workflow and try it with a sample document.

**Features**
- Accepts documents via n8n Form Trigger or direct webhook POST.
- Securely sends files to Nanonets for document parsing (credentials stored in the n8n credentials manager).
- Automatically waits for async processing, polling Nanonets until results are ready.
- Extracts both header data and all table/line items into a tabular format.
- Exports results as an Excel file download.
- Modular nodes allow easy customization or extension.

**Prerequisites**
- **Nanonets account** with a workflow configured for your document type.
- **n8n** instance with HTTP Request, Webhook/Form, Code, and Excel/Spreadsheet nodes enabled.
- **Valid HTTP Basic Auth credentials** saved in n8n for API access.

**Example Use Cases**

| Scenario | Benefit |
|-----------------------|--------------------------------------------------|
| Invoice Processing | Automated extraction of line items and totals |
| Receipt Digitization | Parse amounts and charges for expense reports |
| Purchase Orders | Convert scanned POs into structured Excel sheets |

**Notes**
- You must set up credentials in the n8n credentials manager—do not store API keys directly in nodes.
- All configuration and endpoints are clearly explained with inline sticky notes in the workflow editor.
- Easily adaptable for other document types or similar APIs—just modify endpoints and result mapping.
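The "waits until processing is complete" step is a simple poll loop between the Wait and HTTP Request nodes. A sketch under assumed endpoint names follows; Nanonets' actual status route and response fields depend on your workflow setup, so substitute the real paths from their API documentation.

```javascript
// Sketch of the async polling pattern the workflow implements. The status
// endpoint and response fields are assumptions; substitute the actual
// Nanonets API routes for your workflow ID from their documentation.
async function waitForParsing(documentId, basicAuth) {
  const url = `https://app.nanonets.com/api/v4/workflows/YOUR_WORKFLOW_ID/documents/${documentId}`;

  for (let attempt = 0; attempt < 30; attempt++) {
    const res = await fetch(url, {
      headers: { Authorization: `Basic ${basicAuth}` },
    });
    const doc = await res.json();

    if (doc.status === 'completed') return doc;   // parsed fields + tables
    if (doc.status === 'failed') throw new Error('Nanonets parsing failed');

    await new Promise(r => setTimeout(r, 10_000)); // poll every 10 seconds
  }
  throw new Error('Timed out waiting for Nanonets');
}
```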