by higashiyama
This workflow automates the process of converting audio meeting recordings into a structured to-do list. It listens for new audio files in a Google Drive folder, transcribes them, extracts action items using AI, and sends a formatted list to a designated Slack channel.

## Who's it for

This template is perfect for project managers, teams, and anyone who wants to save time on post-meeting administrative tasks. If you record your meetings and use Google Drive for storage and Slack for team communication, this workflow will streamline your follow-up process and ensure no action item is missed.

## What it does

This workflow automates the entire process of turning spoken words from a meeting into actionable tasks for your team.

1. **Trigger on New Audio**: The workflow starts automatically when you upload a new audio file (e.g., MP3, M4A, WAV) to a specific folder in your Google Drive.
2. **Transcribe Audio**: It takes the audio file and uses Google Gemini to generate a full text transcript of the recording.
3. **Extract To-Do Items**: The transcript is then passed to another Google Gemini node with a specialized prompt. This prompt instructs the AI to carefully analyze the text and extract all action items.
4. **Format Output**: The AI formats the extracted tasks into a clean JSON array. Each task includes a description, the assigned person, a deadline, and its priority.
5. **Send to Slack**: Finally, the workflow sends the structured to-do list as a message to your specified Slack channel, making it easy for the whole team to see and act upon.

## How to set up

- **Configure Credentials**: Ensure you have configured your credentials for Google Drive, Google Gemini, and Slack in n8n.
- **Set Google Drive Folder**: In the "Looking for uploading file" node, select the Google Drive folder you want the workflow to monitor.
- **Set Slack Channel**: In the "Send a message" node, choose the correct Slack account and select the channel where you want the to-do list to be posted.
- **Activate Workflow**: Save your changes and activate the workflow using the toggle at the top right.
- **Test It**: Upload a meeting recording to the designated Google Drive folder to see the magic happen!

## How to customize the workflow

- **Change AI Model**: You can easily swap the Google Gemini nodes for other AI models like OpenAI or Anthropic to handle transcription and analysis based on your preference.
- **Modify the AI Prompt**: Adjust the prompt in the "Analyze document" node to change the output format. For example, you could ask for a meeting summary in addition to the to-do list.
- **Change Notification Service**: Replace the Slack node with another notification service like Discord, Microsoft Teams, or an email node.
- **Archive Results**: Add a node (e.g., Google Sheets, Notion, Airtable) after the "Analyze document" node to save a history of all meeting transcripts and their corresponding action items.
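As a rough sketch of the hand-off between the "Format Output" and "Send to Slack" steps, an intermediate Code node could turn the AI's task array into a Slack-ready message. The field names below follow the fields described above (description, assigned person, deadline, priority), but the exact keys your prompt produces may differ:

```javascript
// Hypothetical shape of the AI output, based on the fields described above.
const tasks = [
  { description: "Send revised budget", assignee: "Dana", deadline: "2024-06-07", priority: "High" },
  { description: "Book venue", assignee: "Lee", deadline: "2024-06-10", priority: "Medium" },
];

// Build a Slack-friendly text block from the task array.
function formatTodoList(tasks) {
  const lines = tasks.map(
    (t, i) => `${i + 1}. *${t.description}* — ${t.assignee} (due ${t.deadline}, ${t.priority})`
  );
  return ["*Action items from today's meeting:*", ...lines].join("\n");
}
```

The resulting string can be mapped straight into the Slack node's message field.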
by Ajay Yadav
A production-ready n8n workflow that automatically analyzes websites to detect e-commerce platforms, frameworks, payment gateways, and technology stacks. Perfect for lead generation, competitive analysis, and market research.

## 🎯 Use Cases

- **Lead Generation**: Identify potential e-commerce clients
- **Competitive Analysis**: Analyze competitor technology stacks
- **Market Research**: Understand technology adoption trends
- **Sales Intelligence**: Qualify prospects based on their tech stack
- **Agency Services**: Automated technology audits for clients

## ⚡ Key Features

### Comprehensive Detection

- **50+ E-commerce Platforms**: Magento, Shopify, WooCommerce, BigCommerce, Squarespace, Wix, etc.
- **Modern Frameworks**: React, Vue.js, Next.js, Angular, WordPress, Gatsby
- **Payment Gateways**: Stripe, PayPal, Square, Klarna, Razorpay, Braintree
- **E-commerce Features**: Cart, Catalog, Checkout, Wishlist, PWA capabilities
- **Custom E-commerce**: Detects custom-built e-commerce solutions

### Production-Ready Features

- **Intelligent Error Handling**: Specific error types (DNS, SSL, timeout, 404, 500, etc.)
- **Rate Limiting**: Respectful 2-second delays between requests
- **Batch Processing**: Processes domains in chunks of 5 for optimal performance
- **Retry Logic**: 3 attempts with exponential backoff for failed requests
- **SSL Handling**: Ignores certificate issues for broader compatibility

### Smart Domain Processing

- **Multiple Detection Methods**: 8 different approaches to extract domain names
- **Protocol Auto-Addition**: Automatically adds https:// to bare domains
- **Domain Cleaning**: Removes www, paths, and query parameters
- **HTML Meta Extraction**: Fallback domain detection from og:url and canonical tags

### Advanced Analysis

- **Confidence Scoring**: 0–100% accuracy rating for each detection
- **Comprehensive Logging**: Detailed console output for debugging
- **Multiple Triggers**: Manual execution or scheduled automation
- **Flexible Output**: Updates Google Sheets with structured results
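The "Smart Domain Processing" steps (protocol auto-addition and domain cleaning) can be sketched in a Code node like this. This is a minimal illustration, not the workflow's exact logic, which also falls back to og:url and canonical tags:

```javascript
// Normalize a raw domain entry: add a protocol so bare domains parse as
// URLs, then strip "www.", paths, and query parameters.
function normalizeDomain(input) {
  let value = input.trim();
  if (!/^https?:\/\//i.test(value)) value = "https://" + value;
  const url = new URL(value);
  return url.hostname.replace(/^www\./i, "").toLowerCase();
}
```

For example, `WWW.Example.com/shop?ref=1` normalizes to `example.com`, while a subdomain such as `shop.example.co.uk` is kept intact.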
by Pramod Kumar Rathoure
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## 🚀 Automating Google Maps Lead Generation with n8n + Apify

Finding quality leads can be time-consuming. What if you could scrape restaurant data from Google Maps, filter the best ones, and email a Morning Brew–style newsletter automatically? That's exactly what this n8n workflow does.

## 🔎 What This Workflow Does

- Takes a location input (Bangkok or Bareilly in this case)
- Runs a Google Maps scraper via an Apify Actor
- Extracts restaurant essentials (name, category, rating, reviews, address, phone, Google Maps link)
- Sorts & filters results (only high-review, highly rated places)
- Saves data to Airtable for lead management
- Uses AI to generate a newsletter in a Morning Brew–style HTML email
- Emails the newsletter automatically to your chosen recipients

## 🛠️ Workflow Breakdown

### 1. Form Trigger

- User selects a location from a dropdown (Bangkok or Bareilly)
- Submits the form to kickstart the process

### 2. Google Maps Scraper

- Powered by Apify
- Collects up to 1,000 restaurants with details: Name, Category, Price Range, Rating, Reviews, Address, Phone, Google Maps URL
- Skips closed places and pulls detailed contact data

### 3. Extract & Transform Data

- An n8n Set node extracts only the essentials
- Formats them into a clean text block (Restaurant_Data)

### 4. Sort & Filter

- **Sorted by**: Review Count (descending), then Rating (descending)
- **Filter**: Only restaurants with **500+ reviews**

### 5. Airtable Lead Storage

- Each record is saved to the "Google Map Leads - Restaurants" Airtable table
- Fields include: Title, Category, Price Range, Rating, Review Count, Address, Phone, Location

### 6. AI-Powered Newsletter

- n8n's LangChain + OpenAI node generates an HTML newsletter
- **Tone**: Breezy, witty, like Morning Brew
- **Content**: Sorted restaurant picks with ratings, reviews, and contact links
- Output is JSON with "Subject" and "Body"

### 7. Automatic Email

- Gmail node sends the newsletter directly to your inbox
- Example recipient: prath002@gmail.com

## 🎯 Why This Workflow Rocks

- **End-to-End Automation**: From scraping → filtering → emailing, no manual effort
- **Lead Enrichment**: Only keeps high-quality restaurants with strong social proof
- **Scalable**: Works for any city you plug into the form
- **Engaging Output**: AI crafts the results into a ready-to-send newsletter

## 🔮 Next Steps

- Add more cities to the dropdown for multi-location scraping
- Customize the email template for branding
- Integrate with CRM tools for automated outreach

👉 With just a few clicks, you can go from raw Google Maps data → polished newsletter → fresh leads in Airtable. That's the power of n8n + Apify + AI.
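The Sort & Filter step described above (most-reviewed first, rating as tiebreaker, 500-review floor) can be sketched in a Code node. The field names here are assumptions; the actual Apify output uses its own keys:

```javascript
// Sample restaurant records; in the workflow these come from the scraper.
const restaurants = [
  { title: "A", rating: 4.2, reviewCount: 1200 },
  { title: "B", rating: 4.8, reviewCount: 300 },
  { title: "C", rating: 4.8, reviewCount: 2500 },
];

// Keep only well-reviewed places, then sort by review count, then rating.
const topPicks = restaurants
  .filter((r) => r.reviewCount >= 500)
  .sort((a, b) => b.reviewCount - a.reviewCount || b.rating - a.rating);
```

With the sample data, "B" is dropped (only 300 reviews) and "C" sorts ahead of "A".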
by Maksym Brashenko
# Advanced n8n Workflow Sync with GitHub

A robust workflow to back up and synchronize your n8n workflows to a GitHub repository, with intelligent change detection and support for file renames.

## 🎯 Who's it for?

This workflow is for n8n administrators, developers, and power users who want a reliable, automated way to:

- Keep a version-controlled history of their workflows.
- Collaborate on workflows using GitHub's infrastructure.
- Prevent data loss and have a disaster recovery plan for their n8n instance.

## ✨ Key Features (What it does)

- **Intelligent Sync**: Backs up all your n8n workflows to a designated GitHub repository.
- **Human-Readable Filenames**: Saves workflows with filenames based on their actual names in n8n (e.g., My Awesome Workflow.json).
- **Reliable Matching**: Uses the unique n8n workflow ID to reliably track files, even if their names change.
- **Rename Detection**: If you rename a workflow in n8n, it intelligently deletes the old file and creates a new one in a single logical operation.
- **Efficient Commits**: Commits changes to GitHub only when there are actual modifications to a workflow's logic or structure. It performs a deep comparison, ignoring metadata changes.
- **Clear Commit History**: Generates clean, informative commit messages: `create: workflowName`, `update: workflowName`, `rename: oldName - newName`

## ⚙️ How It Works (Simple Steps)

1. **Get n8n Workflows**: The workflow starts by fetching all your current workflows from n8n.
2. **Get GitHub Files**: At the same time, it lists all existing workflow files from your GitHub repository.
3. **Compare & Decide**: It then compares each n8n workflow with its GitHub counterpart. It checks if anything changed, if it was renamed, or if it's new.
4. **Take Action**:
   - If a workflow is new, it's created on GitHub.
   - If a workflow is updated, the file content is changed on GitHub.
   - If a workflow was renamed, the old file is deleted and a new one is created.
   - If nothing changed, the workflow is skipped.
5. **Send Report**: Finally, it can send a summary report to Telegram about what happened.

## 🚀 How to Set Up

1. **Credentials**:
   - **GitHub**: Go to Credentials > New and add your GitHub credentials. You'll need a token with repo permissions.
   - **n8n API**: In the same Credentials section, create n8n API credentials. You'll need your n8n instance's Base URL and an API key (you can create one in your n8n user settings).
   - **Telegram (Optional)**: If you want notifications, add your Telegram Bot credentials.
2. **Configure the Workflow**: Open the Configuration node (the green one at the start) and fill in the following values:
   - `repo.owner`: Your GitHub username or organization name.
   - `repo.name`: The name of the repository for backups.
   - `repo.path`: The folder inside the repository to store workflows (e.g., `workflows/`).
   - `report.tg.chatID` (Optional): Your Telegram chat ID for notifications. Set to 0 to disable.
   - `report.verbose`: Set to true to receive a report even if there were no changes.
3. **Connect Credentials**: Select your newly created credentials in the following nodes:
   - Get all workflows: Select your n8n API credentials.
   - All GitHub nodes (e.g., List files, Create new file): Select your GitHub credentials.
   - Send a message (Telegram): Select your Telegram credentials.
4. **Set the Schedule**: In the Schedule Trigger node, configure how often you want the backup to run (e.g., every hour, once a day).
5. **Activate the Workflow**: Save the workflow and toggle it to "Active".

## 🔧 How to Customize

- **Change Report Destination**: The final part of the workflow sends a report to Telegram. You can easily replace the Send a message node with a node for Slack, Discord, or email to change where notifications go. The message is pre-formatted in the Render summary node.

## 💡 What's Next? (Future Updates)

This workflow is actively maintained! Here's a sneak peek at what's planned for future versions:

- **Automatic Archive Handling**: The next major update will introduce logic to automatically detect when a workflow is archived in n8n and move it to a dedicated `archived/` folder in your GitHub repository, keeping your main backup directory clean.
- **Performance Optimizations**: I'm exploring ways to reduce API traffic by intelligently checking for changes before fetching full workflow data.

To get the latest version with these features when it's released, be sure to follow my profile for new workflow publications!
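The "deep comparison, ignoring metadata" behind Efficient Commits can be sketched as below. Which fields count as metadata (`updatedAt`, `createdAt`, `versionId`) is an assumption here, and a stringify-based comparison is key-order-sensitive, which is acceptable when both sides come from the same export shape:

```javascript
// Drop fields that change without the workflow's logic changing.
function stripMetadata(workflow) {
  const { updatedAt, createdAt, versionId, ...rest } = workflow;
  return rest;
}

// True only if something other than metadata differs.
function hasRealChanges(localWorkflow, remoteWorkflow) {
  return (
    JSON.stringify(stripMetadata(localWorkflow)) !==
    JSON.stringify(stripMetadata(remoteWorkflow))
  );
}
```

A workflow whose only difference is a newer `updatedAt` timestamp would be skipped, so no empty commit is created.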
by Calistus Christian
## What this workflow does

- Pulls free security/tech headlines from multiple RSS feeds (e.g., CISA, BleepingComputer, Krebs, SecurityWeek, Ars Technica, TechCrunch, Hacker News).
- De-duplicates stories, keeps only the last 24 hours, and limits the total to a manageable number.
- Uses OpenAI to write a concise brief with sections and "Why it matters."
- Sends a clean HTML email via Gmail.

- **Category**: Security / News
- **Time to set up**: ~10–15 minutes
- **Difficulty**: Beginner–Intermediate
- **Cost**: Mostly free (OpenAI tokens + Gmail)

## What you'll need

- n8n (recent version)
- OpenAI credentials
- Gmail (or SMTP) credentials
- A few free RSS feed URLs (swap in/out as you like)

## Set up steps

1. **Trigger** – Add a Cron node to run daily (pick your time and timezone).
2. **Fetch** – Add one RSS Read node per source and connect all to a Merge node (append mode).
3. **De-duplicate (this run)** – Add a Remove Duplicates node and compare on a stable key (prefer the article URL).
4. **Freshness** – Add an IF node to pass only items published in the last 24 hours.
5. **Limit** – Add a Limit node to cap the total items (e.g., 25).
6. **Summarize** – Add OpenAI → Message a model to produce a JSON brief with subject + HTML body.
7. **Email** – Add Gmail → Send to deliver the brief to your inbox.

## Tips & troubleshooting

- If everything gets discarded at de-duplication while testing, switch to "within current input" or reset the node's stored values.
- If no items pass the IF node, widen the date window temporarily (some feeds publish late).
- If the email arrives blank, ensure the Gmail email type is set to HTML and the subject/body fields map to the model's output.

## Sources you can start with (swap freely)

CISA, BleepingComputer, KrebsOnSecurity, SecurityWeek, Ars Technica (Security), TechCrunch (Security), Hacker News (front page).
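The freshness check in step 4 can also be done in a Code node rather than an IF node. A minimal sketch, using the `isoDate`/`pubDate` fields that n8n's RSS Read node emits:

```javascript
// Keep only items published within the last 24 hours.
function isFresh(item, now = new Date()) {
  const published = new Date(item.isoDate || item.pubDate);
  const ageMs = now - published;
  return ageMs >= 0 && ageMs <= 24 * 60 * 60 * 1000;
}
```

Widening the window for testing is then a one-number change (e.g., `48 * 60 * 60 * 1000` for two days).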
by Rakin Jakaria
# 📄 AI Invoice Agent

The AI Invoice Agent automates the invoice creation, email delivery, and status tracking process for client billing. It ensures invoices are generated, sent professionally, and updated in Google Sheets with minimal manual work.

## 🔹 How It Works

1. **Trigger** – Activated manually (Execute Workflow) when you want to process invoices.
2. **Fetch Invoices** – Reads client invoice data from a Google Sheet (Client Invoices).
3. **Filter Pending Invoices** – Passes through only invoices with Status = Pending.
4. **Prepare Invoice Data** – Collects and formats details: Invoice ID, Client Name & Address, Project Name, Amount (USD), Invoice Date (today's date), Due Date (7 days later).
5. **Loop Over Invoices** – Processes each invoice one by one.
6. **AI Email Draft** – Uses GPT-4.1-mini to generate a polite, professional email. Tone: friendly but business-oriented. Signed as Upward Engine Team.
7. **Extract Email Parts** – Separates subject and body from the AI output using an Information Extractor.
8. **Generate Invoice PDF** – Uses CraftMyPDF to create a formatted invoice PDF with company details (Upward Engine), client details, Invoice ID, Date, Due Date, amount due, and a footer message.
9. **Send Email to Client** – Sends the invoice email via Gmail, attaching the PDF invoice.
10. **Update Invoice Status** – Updates Google Sheets to mark the invoice as Completed, saving the Invoice ID, Date, Due Date, and updated status.
11. **Loop Continuation** – Continues until all pending invoices are processed.

## 🔹 Tools & Integrations

- **Google Sheets** → Stores client & invoice data
- **Filter Node** → Selects only Pending invoices
- **GPT-4.1-mini (OpenAI)** → Generates professional emails
- **Information Extractor** → Separates subject & body
- **CraftMyPDF** → Creates PDF invoices
- **Gmail** → Sends invoice emails with PDF attachments

## 🔹 Example Workflow

✅ Google Sheets: Invoice marked as Pending
➡️ AI generates email → "Invoice INV-1023 for Web Design Project – Due Sep 5"
➡️ PDF invoice created & attached
➡️ Email sent to client with subject + body
➡️ Status updated in Google Sheet → Completed

⚡ This agent ensures zero missed invoices, professional client communication, and up-to-date tracking — fully automated for agencies and small businesses.
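The date logic in the "Prepare Invoice Data" step (invoice date is today, due date is seven days later) can be sketched in a Code node. UTC methods are used here so the result doesn't shift with the server's timezone; the workflow's actual formatting may differ:

```javascript
// Compute invoice and due dates as YYYY-MM-DD strings.
function invoiceDates(today = new Date()) {
  const due = new Date(today);
  due.setUTCDate(due.getUTCDate() + 7); // due date: 7 days later
  const fmt = (d) => d.toISOString().slice(0, 10);
  return { invoiceDate: fmt(today), dueDate: fmt(due) };
}
```

For example, an invoice prepared on 2024-09-01 gets a due date of 2024-09-08.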
by Davide
This workflow automates the creation of realistic multi-speaker podcasts using the ElevenLabs v3 API by reading a script from Google Sheets and saving the final MP3 file to Google Drive.

1. **Data Source** – Dialogue scripts are stored in a Google Sheet. Each row contains: Speaker name (optional), Voice ID (from ElevenLabs), and the text to be spoken.
2. **Data Preparation** – The workflow transforms the spreadsheet content into the proper JSON format required by the ElevenLabs API.
3. **Podcast Generation** – ElevenLabs' Eleven v3 model converts the prepared text into expressive, natural-sounding dialogue. It supports not only speech but also non-verbal cues and audio effects (e.g., [laughs], [whispers], [clapping]).
4. **File Storage** – The generated audio file is automatically uploaded to Google Drive, organized by timestamped filenames.

## Key Advantages

- **Seamless Automation** – From dialogue writing to final audio upload, everything runs automatically in one workflow.
- **Multi-Speaker Support** – Easily assign different voices to multiple characters for dynamic conversations.
- **Expressive & Realistic Output** – Supports emotions, speech styles, and ambient effects, making podcasts more immersive.
- **Flexible Content Input** – Scripts can be collaboratively written and edited in Google Sheets, with no technical knowledge required.
- **Scalable & Reusable** – Can generate multiple podcast episodes in seconds; ideal for content creators, educators, or businesses.
- **Cloud Integration** – Final audio files are securely stored in Google Drive, ready to be shared or published.

## How It Works

The workflow processes a structured script from a spreadsheet and uses AI to generate a realistic conversation.

1. **Manual Trigger**: The workflow is started manually by a user clicking "Execute workflow" in n8n.
2. **Get Dialogue**: The "Get dialogue" node fetches the podcast script data from a specified Google Sheet. The sheet should contain columns for Speaker (optional), Voice ID, and the dialogue Input/Text.
3. **Prepare Dialogue**: The "Code" node transforms the raw sheet data into the precise JSON format required by the ElevenLabs API. It creates an array of objects where each object contains the text and the corresponding voice_id for each line of dialogue.
4. **Generate Podcast**: The "HTTP Request" node sends a POST request to the ElevenLabs Text-to-Dialogue API endpoint (/v1/text-to-dialogue). It sends the transformed dialogue array in the request body, instructing the API to generate a single audio file with a conversation between the specified voices.
5. **Upload File**: The "Upload file" node takes the audio file response from ElevenLabs and saves it to a designated folder in Google Drive.

## Set Up Steps

To use this workflow, you must complete the following configuration steps:

1. **Prepare the Google Sheet**:
   - Clone the template: duplicate the provided Google Sheet template into your own Google Drive.
   - Fill in the script:
     - Column A (Speaker): Optional. Add speaker names for your reference (e.g., "Host", "Guest").
     - Column B (Voice ID): Mandatory. Enter the unique Voice ID for each line from ElevenLabs.
     - Column C (Input): Mandatory. Write the dialogue text for each speaker. You can use non-speech audio events like [laughs] or [whispers] to add expression.
2. **Configure ElevenLabs API Credentials**:
   - Log in to or create a free account on ElevenLabs.
   - Edit the "Generate podcast" node's credentials.
   - Create an HTTP Header Auth credential named "ElevenLabs API". Set the Name to xi-api-key and the Value to your actual ElevenLabs API key.
3. **Configure Google Services**:
   - Google Sheets: Ensure the "Get dialogue" node has valid OAuth credentials and that the documentId points to your copy of the script sheet.
   - Google Drive: Ensure the "Upload file" node has valid OAuth credentials and that the folderId points to the correct Google Drive folder where you want the audio files saved.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
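The "Prepare Dialogue" transformation can be sketched as follows. The row keys mirror the sheet template's columns (Speaker, Voice ID, Input); the `inputs` wrapper key follows ElevenLabs' Text to Dialogue request body, and the voice IDs below are placeholders:

```javascript
// Rows as returned by the Google Sheets node (placeholder values).
const rows = [
  { Speaker: "Host", "Voice ID": "voice_abc", Input: "[laughs] Welcome back!" },
  { Speaker: "Guest", "Voice ID": "voice_xyz", Input: "Great to be here." },
];

// Build the request body for POST /v1/text-to-dialogue: one object per
// dialogue line, each with its text and the voice that should speak it.
const body = {
  inputs: rows.map((row) => ({
    text: row.Input,
    voice_id: row["Voice ID"],
  })),
};
```

The `body` object is then sent as JSON by the "Generate podcast" HTTP Request node.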
by Rosh Ragel
# Automatically Send a Square Summary Report for Yesterday's Sales via Gmail

## What It Does

This workflow automatically connects to the Square API and generates a daily sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run daily, pull the previous day's sales into a CSV file, and send that file to a manager or finance team for analysis.

This workflow builds on my previous template, which allows users to automatically pull data from the Square API into n8n for processing (see here: https://n8n.io/workflows/6358).

## Prerequisites

To use this workflow, you'll need:

- A Square API credential (configured as a Header Auth credential)
- A Gmail credential

## How to Set Up Square Credentials

1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

## How It Works

1. **Trigger**: The workflow runs every day at 4:00 AM
2. **Fetch Locations**: An HTTP request retrieves all Square locations linked to your account
3. **Fetch Orders**: For each location, an HTTP request pulls completed orders for the specified report_date
4. **Filter Empty Locations**: Locations with no sales are ignored
5. **Aggregate Sales Data**: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report
6. **Create CSV File**: A CSV file is created containing the relevant data
7. **Send Email**: An email is sent to the chosen third party

## Example Use Cases

- Automatically send Square sales data to management to improve the quality of planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent, who is paid via commission
- Automatically send data to a bookkeeper for entry into QuickBooks

## How to Use

1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

## Customization Options

- Add pagination to handle locations with more than 1,000 orders per day
- Instead of a daily summary, modify this workflow to produce a weekly summary once a week

## Why It's Useful

This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data — whether for operations, finance, or performance monitoring.
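As a rough sketch of the "Aggregate Sales Data" step: Square order objects report money in the smallest currency unit (e.g., cents) under `total_money.amount`, so a Code node can total them as below. The summary fields here are illustrative, not the workflow's exact Sales Summary output:

```javascript
// Sum order totals (in cents) and report a per-location summary.
function summarizeOrders(orders) {
  const totalCents = orders.reduce(
    (sum, order) => sum + (order.total_money?.amount ?? 0),
    0
  );
  return {
    orderCount: orders.length,
    grossSales: (totalCents / 100).toFixed(2), // e.g. "33.45"
  };
}
```

Per-location summaries like this can then feed directly into the CSV node.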
by Yang
## Who is this for?

This workflow is perfect for content marketers, bloggers, SEO professionals, and virtual assistants who need to transform keyword research into complete blog posts without spending hours writing and formatting.

## What problem is this workflow solving?

Writing a blog post from scratch requires research, summarizing content, and structuring it into a polished article. This workflow automates that process by taking a single keyword, fetching related news articles, cleaning the data, and generating a professional blog draft automatically in Google Docs.

## What this workflow does

The workflow begins when a keyword is submitted through a form. It expands the keyword into trending suggestions using Dumpling AI Autocomplete, then fetches recent news articles with Dumpling AI Google News. Articles are filtered to only include those published within the last 1–2 days, then scraped and cleaned for quality text. The aggregated content is sent to OpenAI, which generates a polished blog draft with a clear title. Finally, the draft is saved directly into Google Docs for easy editing and publishing.

## Nodes Overview

1. **Form Trigger – Form Submission (Keywords)**: Starts the workflow when a keyword is submitted through a form.
2. **HTTP Request – Dumpling AI Autocomplete**: Expands the keyword into multiple trending search suggestions.
3. **Split Out – Split Autocomplete Suggestions**: Breaks the list of autocomplete suggestions into individual items for processing.
4. **Loop – Loop Suggestions**: Iterates through each suggestion to process articles separately.
5. **Wait – Delay Between Requests**: Adds a pause to avoid sending too many requests at once.
6. **HTTP Request – Dumpling AI Google News**: Fetches recent news articles for each suggestion.
7. **Split Out – Split News Articles**: Splits the returned news results into individual articles.
8. **Code – Filter Articles (1–2 Days Old)**: Keeps only articles that are between 1 and 2 days old for fresh content.
9. **Limit – Limit Articles**: Restricts the workflow to the top 2 articles for each suggestion.
10. **HTTP Request – Dumpling AI Scraper**: Scrapes and cleans the full text content from the article URLs.
11. **Code – Clean & Prepare Article Content**: Removes clutter like links, images, and unrelated sections to ensure clean input.
12. **Aggregate – Aggregate Articles**: Combines the cleaned article content into one dataset.
13. **OpenAI – Generate Blog Draft**: Uses OpenAI to create a polished blog post draft and title in Markdown format.
14. **Google Docs – Create Blog File**: Creates a new Google Doc with the generated blog title.
15. **Google Docs – Insert Blog Content**: Inserts the full blog draft into the created document.

## 📝 Notes

- Set up Dumpling AI and generate your API key from: Dumpling AI
- OpenAI must be connected with an active API key for blog generation.
- Google Docs must be connected with write permissions to create and update blog posts.
- You can adjust the article filter (currently set to 1–2 days old) in the code node depending on your needs.
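The "Filter Articles (1–2 Days Old)" Code node can be sketched as below. The date field name on the Dumpling AI news results is an assumption; adjust the 24–48 hour window here if you need fresher or older content:

```javascript
// Keep only articles whose age falls between 1 and 2 days.
function isOneToTwoDaysOld(article, now = new Date()) {
  const ageHours = (now - new Date(article.date)) / 36e5; // ms → hours
  return ageHours >= 24 && ageHours <= 48;
}
```

An article published 30 hours ago passes; one from 10 hours ago (too fresh) or 60 hours ago (too old) is dropped.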
by Baris Cem Ant
## Workflow Objective

This n8n workflow automates the entire content creation process by monitoring specified RSS feeds for new articles. It then leverages Google Gemini AI to generate comprehensive, SEO-optimized blog posts inspired by these articles, creates unique cover images, and distributes the final content as a JSON file to stakeholders via Telegram. The primary goal is to automate the end-to-end content pipeline, saving significant time and ensuring a consistent output of high-quality content.

## Step-by-Step Breakdown

1. **Monitor News Sources (RSS Triggers)**: The workflow is triggered periodically (e.g., hourly, weekly) by multiple RSS Feed nodes that monitor sources like "Search Engine Journal" and "TechCrunch" for new publications.
2. **Prevent Duplicate Content (Deduplication)**: For each new article fetched from the RSS feeds, the workflow checks an AWS DynamoDB database to see if the article's URL has been processed before. If the link already exists in the database, the process for that item is halted, and a debug notification is sent to Telegram via the "Telegram Debugger" node. This prevents the generation of duplicate content.
3. **AI-Powered Content Generation (Gemini Content Generation)**: If the article is new, its link is passed to a Google Gemini node. Using a highly detailed and structured prompt, Gemini generates a complete blog post in a specific JSON format. This output includes a title, meta description, SEO-friendly slug, a descriptive prompt for generating a cover image, and the full markdown body of the article (including an introduction, subheadings, conclusion, FAQ section, etc.).
4. **Data Cleaning and Parsing (JSON Parser)**: The raw text response from the AI is processed by a "Code" node. This custom script cleans the output—removing markdown code blocks, fixing potential syntax errors—and reliably parses it into a valid JSON object, ensuring the data is clean for subsequent steps.
5. **Image Generation and Cloud Storage**: The image_generation_prompt from the parsed JSON is sent to another Google Gemini node configured for image generation, creating a 1200x630 cover image for the blog post. The newly created image is renamed using the slug. Finally, the image is uploaded to a cloud storage service like Cloudflare R2. If the upload fails, an error message is sent to Telegram.
6. **Final Data Assembly and Distribution**: The generated text content is merged with the URL of the uploaded image to create the final, complete blog post data object. This entire data structure is converted into a JSON file named in the format [slug].json. In the final step, this JSON file is sent as a document to the designated recipients via the Telegram nodes.

## Technologies and Services Used

- **Trigger**: RSS Feed Reader
- **Artificial Intelligence**: Google Gemini (for both text and image generation)
- **Database**: AWS DynamoDB (for content deduplication)
- **Cloud Storage**: Cloudflare R2 (S3-compatible)
- **Notification & Distribution**: Telegram
- **Data Processing**: n8n's native nodes (Merge, If, Set, Code)
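The core of the "JSON Parser" step, stripping markdown code fences from the model's raw text and parsing the remainder, can be sketched like this (a minimal version; the workflow's script also repairs other syntax issues):

```javascript
// Strip markdown code fences from an LLM response, then parse as JSON.
function parseModelJson(raw) {
  const fence = new RegExp("`{3}(?:json)?", "gi"); // matches triple-backtick fences, optionally tagged "json"
  const cleaned = raw.replace(fence, "").trim();
  return JSON.parse(cleaned); // throws if the cleaned text still isn't valid JSON
}
```

Wrapping the `JSON.parse` call in a try/catch lets the workflow route malformed responses to the Telegram Debugger instead of failing silently.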
by Sridevi Edupuganti
## Description

This workflow sanitizes any uploaded n8n workflow JSON by removing credentials, webhook IDs, and sensitive metadata. Using AI and structured comparison, it generates a clean, secure workflow version, creates a downloadable sanitized file, and emails a detailed change-log report to the user.

## Key Features

• AI-powered sanitization of workflow JSON
• Automatic removal of secrets, credentials, webhook IDs, and metadata
• Node-level change detection and comparison
• Generates a sanitized workflow file (JSON)
• Sends a formatted HTML email report with attachment
• Supports customization for additional filtering rules

## How It Works

The user uploads a workflow JSON file, which is extracted and formatted. AI then sanitizes the workflow and returns a secure version. Both the original and sanitized workflows are merged for analysis, and a structured change-log is generated. A sanitized JSON file is created and emailed to the user with the report.

## How to Use

Upload your workflow JSON via the form. The workflow processes it automatically, generates a sanitized version, creates a change-log, and emails both the report and the sanitized JSON file to you.

## Requirements

• OpenAI credentials
• Gmail or SMTP credentials
• An n8n workflow JSON exported from the editor

## Customising This Workflow

Modify sanitization rules, formatting logic, or email templates inside the JS and AI nodes to suit organizational security policies or custom metadata filtering.

**Support**: Join the n8n Discord (https://discord.com/invite/n8n) or Community Forum (https://community.n8n.io/). A README file is available at https://bit.ly/GeneratesanitizedJSONfile
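The mechanical part of the sanitization (as opposed to the AI pass) can be sketched in a JS node like this. Exported n8n workflows carry credential references under `node.credentials` and webhook identifiers under `node.webhookId`; treating `pinData` as sensitive is an assumption, since it may embed captured run data:

```javascript
// Remove credential references, webhook IDs, and pinned run data from an
// exported workflow JSON, without mutating the original object.
function sanitizeWorkflow(workflow) {
  const clean = JSON.parse(JSON.stringify(workflow)); // deep copy
  for (const node of clean.nodes ?? []) {
    delete node.credentials;
    delete node.webhookId;
  }
  delete clean.pinData; // pinned data may contain captured payloads
  return clean;
}
```

Extending the organization's filtering rules then means adding more `delete` lines (or a configurable field list) in one place.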
by Ryan Nolan
This template and the accompanying YouTube video go over 8 different examples of how we can utilize binary data within n8n. We start with bringing in binary data with Google Drive, FTP, or a form submission. Then we jump into how to extract binary data, analyze an image, convert files, and use base64. This lesson also covers the recent update for grabbing binary data in later nodes.

YouTube video: https://youtu.be/0Vefm8vXFxE
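One of the techniques the lesson touches on, converting binary data to and from base64, looks like this in a Node.js/Code-node context (a minimal sketch, not taken from the template itself):

```javascript
// Encode bytes to a base64 string, then decode them back.
const buffer = Buffer.from("hello n8n");                      // raw binary data
const base64 = buffer.toString("base64");                     // encode
const roundTrip = Buffer.from(base64, "base64").toString("utf8"); // decode
```

The same pattern applies to file contents: n8n exposes binary payloads as Buffers in Code nodes, so encoding to base64 is a single `toString("base64")` call.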