by Abideen Bello
# Automate welcome emails with discount codes via Mailchimp and Gmail

## Who's it for

Perfect for e-commerce businesses, SaaS companies, course creators, and service providers who want to automatically nurture new subscribers with personalized welcome emails and discount codes. If you're looking to boost conversions from your website signup forms and create a professional onboarding experience, this workflow is your solution.

## How it works

This workflow creates a seamless subscriber onboarding process:

- **Webhook** receives signup data from your website form (name, email, timestamp, source)
- **Mailchimp** integration automatically adds the subscriber to your email list with their name
- **Gmail** sends a personalized welcome email with a discount code and branded content
- **Error handling** ensures the welcome email sends even if Mailchimp fails

The workflow is triggered instantly when someone submits your website signup form, creating a professional first impression that can significantly improve customer engagement and conversion rates.

## How to set up

### Requirements

- **Mailchimp account** with an active audience/list
- **Gmail account** with OAuth2 access
- Website or landing page with a signup form
- **Basic HTML/CSS knowledge** for email customization (optional)

### Step-by-step setup

**1. Configure Mailchimp Integration**

- Create or identify your Mailchimp audience
- Replace `YOUR_MAILCHIMP_LIST_ID` with your actual list ID
- Add your Mailchimp API credentials in n8n
- Set up any custom merge fields you need (`FNAME` is included by default)

**2. Set Up Gmail Credentials**

- Add your Gmail OAuth2 credentials in n8n
- Ensure the sending email account has appropriate permissions
- Test email delivery to avoid spam folder issues

**3. Customize the Welcome Email**

- Replace `[Your Business Name]` with your actual business name
- Update the discount code (`WELCOME15`) with your preferred offer
- Modify the shop URL (`https://your-website.com/shop`) to your store
- Update social media links with your actual profiles
- Customize colors, fonts, and branding to match your business

**4. Deploy Your Webhook**

- Copy the webhook URL from the n8n workflow
- Add this URL to your website signup form as the POST endpoint
- Ensure your form sends JSON data with `name` and `email` fields

**5. Test the Complete Flow**

- Submit a test signup through your website form
- Verify the contact appears in Mailchimp
- Check that the welcome email arrives with proper personalization

## How to customize the workflow

### Advanced Email Personalization

- **Dynamic content blocks:** Add conditional sections based on signup source or user preferences
- **Custom merge fields:** Capture additional data like company name, phone number, or interests in Mailchimp
- **Segmented messaging:** Create different email templates for different subscriber types
- **Multi-language support:** Detect user language from form data and send localized emails

### Webhook Integration Examples

**Google Forms Integration:**

- Use Google Apps Script to POST form responses to your n8n webhook
- Map form fields to the expected JSON structure (`name`, `email`, `source`)

**Typeform Integration:**

- Configure Typeform webhooks in the Connect panel
- Set the payload to include question responses in the required format

**Custom HTML Forms:**

```js
// Example form submission code
fetch('YOUR_N8N_WEBHOOK_URL', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({
    name: document.getElementById('name').value,
    email: document.getElementById('email').value,
    source: 'website'
  })
});
```

**WordPress Contact Form 7:**

- Use CF7 hooks to send form data to your webhook endpoint
- Install REST API plugins for seamless integration

### Workflow Logic Enhancements

- **Data validation:** Add If nodes to check email format and required fields before processing
- **Duplicate prevention:** Query Mailchimp first to avoid adding existing subscribers
- **Source-based routing:** Send different welcome emails based on signup source (blog, product page, etc.)
- **Lead scoring:** Assign scores based on signup source and send to appropriate lists
- **Follow-up sequences:** Add Wait nodes to create multi-step email campaigns

### Advanced Integrations

- **CRM sync:** Connect to Salesforce, HubSpot, or Pipedrive to create leads automatically
- **Analytics tracking:** Log conversions to Google Sheets or send events to Google Analytics
- **Slack notifications:** Alert your team about high-value signups or VIP customers
- **SMS follow-up:** Add Twilio integration for multi-channel welcome sequences

## Troubleshooting

### Common Issues and Solutions

**Emails going to spam folder:**

- Configure SPF and DKIM records for your sending domain
- Use Gmail's "Send as" feature to authenticate your sending address
- Start with low volume and gradually increase to build sender reputation
- Include unsubscribe links and proper email headers

**Mailchimp API errors:**

- Check your API key permissions and rate limits
- Verify the list ID is correct (found in Audience settings)
- Ensure required fields are properly mapped
- Review Mailchimp's compliance requirements for your region

**Webhook not triggering:**

- Test the webhook URL directly using tools like Postman
- Check that your form sends POST requests with proper Content-Type headers
- Verify the JSON payload structure matches the expected format
- Review n8n execution logs for error details

**Personalization not working:**

- Confirm form field names match the n8n node references
- Check that data is properly passed between workflow nodes
- Test with sample data to isolate mapping issues
- Use n8n's data inspection tools to debug payload structure

### Performance Optimization

**High-volume handling:**

- Consider using Mailchimp's batch operations for multiple signups
- Implement queue systems for processing during traffic spikes
- Monitor workflow execution times and optimize slow nodes
- Set up error notifications to catch issues quickly

**Delivery improvements:**

- Use dedicated email services like SendGrid or Mailgun for better deliverability
- Implement email warmup procedures for new sending domains
- A/B test subject lines and send times for better engagement
- Monitor bounce rates and remove invalid emails promptly
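The "data validation" enhancement above can be sketched as logic for an n8n Code node (or mirrored with If-node conditions). The `name`/`email` field names follow the form payload shown in the custom-form example; the email pattern is a simple illustrative check, not a full RFC 5322 validator.

```javascript
// Sketch of the "data validation" enhancement: reject incomplete or
// malformed signups before they reach Mailchimp. Field names match the
// example form payload above; the regex is deliberately loose.
function validateSignup(payload) {
  const errors = [];
  const emailPattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/; // simple sanity check

  if (!payload.name || payload.name.trim() === '') {
    errors.push('name is required');
  }
  if (!payload.email || !emailPattern.test(payload.email)) {
    errors.push('email is missing or malformed');
  }
  return { valid: errors.length === 0, errors };
}

console.log(validateSignup({ name: 'Ada', email: 'ada@example.com', source: 'website' }));
console.log(validateSignup({ name: '', email: 'not-an-email' }));
```

In a real workflow you would route the invalid branch to a logging or notification node instead of silently dropping the signup.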
by David Olusola
# Overview: Automated WordPress Post Archiving

This workflow is designed to maintain your blog's health and SEO by automatically moving old, published posts into a "draft" or "archive" state. This prevents outdated or low-traffic content from negatively impacting your site's performance and allows you to easily review and update them later.

## How It Works

1. **Quarterly Trigger:** The workflow is set to run automatically on a recurring schedule, specifically on the 1st day of every 3rd month (quarterly). This ensures that your content is regularly audited without any manual intervention.
2. **Find Old Posts:** The workflow connects to your WordPress site and fetches all published posts that are older than a specified time frame (in this case, 12 months). It uses the WordPress API's filtering capabilities to efficiently find the right content.
3. **Check if Posts Found:** An If node checks if the previous step found any posts. This prevents the workflow from running further steps if there's nothing to archive. If no posts are found, the workflow ends and logs this.
4. **Archive Post:** If posts are found, the workflow proceeds to update each one. It changes the post's status from `publish` to `draft` and automatically adds tags like `archived` and `old-content` for easy identification within your WordPress dashboard.
5. **Send Notification:** After the archiving process is complete, the workflow sends an email notification to the administrator. This provides a summary of the activity, letting you know that the task has been completed.

## Setup Steps

- **Configure WordPress Credentials:** In both the Find Old Posts and Archive Post nodes, you need to add your WordPress credentials. This typically involves entering your site URL and creating an application password in your WordPress admin dashboard for secure API access.
- **Set Up Email Credentials:** In the Send Notification node, add your email service credentials (like SMTP or a Gmail account) to enable the workflow to send you the completion notification.
- **Adjust the Archiving Period:** In the Find Old Posts node, the current expression is `{{ $now.minus({ months: 12 }).toISO() }}`, which archives posts older than 12 months. You can change the number of months to fit your content strategy (e.g., 24 for two years).
- **Customize Tags:** In the Archive Post node, you can customize the tags to better suit your needs. You can change or add new tags that will be applied to the archived posts.
- **Activate the Workflow:** Once all credentials and settings are configured, make sure to activate the workflow to set the quarterly schedule in motion.
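The cutoff expression uses Luxon's `minus`/`toISO`, which n8n exposes in expressions. A rough plain-JavaScript equivalent, useful for checking which posts would be caught, might look like the sketch below. The `date` field mirrors the WordPress REST API's post `date` property; note that plain `Date` month arithmetic is approximate at month-end edge cases, which Luxon handles more carefully.

```javascript
// Compute the archiving cutoff (N months back) and test posts against it.
// Plain-Date approximation of {{ $now.minus({ months: 12 }).toISO() }}.
function archiveCutoff(months = 12, now = new Date()) {
  const cutoff = new Date(now);
  cutoff.setMonth(cutoff.getMonth() - months);
  return cutoff;
}

// `date` mirrors the WordPress REST API post `date` field (ISO 8601)
function shouldArchive(post, cutoff) {
  return new Date(post.date) < cutoff;
}

const cutoff = archiveCutoff(12, new Date('2025-06-01T00:00:00Z'));
console.log(shouldArchive({ date: '2023-11-15T10:00:00Z' }, cutoff)); // old post
console.log(shouldArchive({ date: '2025-03-01T10:00:00Z' }, cutoff)); // recent post
```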
by n8nwizard
## 🧾 Overview

This n8n workflow automates the process of fetching user data from an API, verifying its validity, transforming the response, and then saving it to Google Sheets for team collaboration. Additionally, it generates a CSV backup file of the same data for offline access or external integrations. Perfect for developers, analysts, or teams who want an automated, no-code data ingestion and backup solution.

## ⚙️ Key Features

- 🔌 Fetches data from any REST API endpoint (e.g., RandomUser API)
- ✅ Validates successful API responses before processing
- 🧠 Transforms the JSON response into simple key-value pairs (name and country)
- 📊 Appends data directly into Google Sheets
- 💾 Generates a downloadable CSV backup file
- 🧱 Modular design — easily customizable and extendable

## 🧱 Workflow Steps

### 1. Start Workflow Manually (Manual Trigger Node)

The workflow starts manually by clicking Execute Workflow. You can later replace this with a Cron or Webhook trigger for automation.

### 2. Fetch User Data from API (HTTP Request Node)

Makes an HTTP GET request to the configured API endpoint defined in the environment variable `BASE_URL`.

Example: `https://randomuser.me/api/?results=10`

This node fetches raw user data in JSON format.

### 3. Verify API Response Success (If Node)

Checks if the API response returned an HTTP 200 status code.

- ✅ If status = 200 → Continue processing data
- ❌ If status ≠ 200 → Trigger Stop and Error node to halt execution

This prevents saving invalid or failed responses.

### 4. Transform API Data to Name and Country (Function Node)

Formats the raw JSON data to extract key details (name and country) from each user record.

Input example:

```json
{
  "results": [
    {
      "name": { "first": "John", "last": "Doe" },
      "location": { "country": "United States" }
    }
  ]
}
```

Output example:

```json
[
  { "name": "John Doe", "country": "United States" }
]
```

This step makes the data compatible with Google Sheets.

### 5. Append Data to Google Sheets (Google Sheets Node)

Appends the formatted data to your specified Google Sheet.
Environment variables required:

- `GOOGLE_SHEET_ID` → ID of your target Google Sheet

Configuration:

- **Range:** A:B
- **Columns:** Name (A) and Country (B)

Example Google Sheet:

| Name       | Country       |
| ---------- | ------------- |
| John Doe   | United States |
| Jane Smith | Canada        |

### 6. Create CSV Backup File (Spreadsheet File Node)

Generates a `.csv` file named `users_backup_export.csv` containing all saved user data. This file can be:

- Stored locally
- Sent via email
- Uploaded to cloud storage (e.g., Google Drive, Dropbox)
- Used for external analytics tools

## ⚠️ Error Handling

If the API response is invalid (non-200), the Stop on API Failure node halts the workflow and logs the error:

> ❌ API request failed — status code not 200. Workflow stopped.

This ensures only valid data is stored.

## 🧰 Setup Instructions

1. Add environment variables:
   - `BASE_URL=https://randomuser.me/api/?results=10`
   - `GOOGLE_SHEET_ID=<your_google_sheet_id>`
2. Add credentials:
   - Google Sheets OAuth2 credentials
   - API credentials (if authentication is required)
3. Run the workflow: start manually or configure a Cron node to run periodically
4. Check the output: data appears in your Google Sheet, and the CSV file is created in n8n’s file system

## 🧩 Customization Options

| Goal                    | How to Modify                                                        |
| ----------------------- | -------------------------------------------------------------------- |
| Change API fields       | Edit the Transform API Data function to extract desired fields       |
| Add columns             | Expand the output object and update the Google Sheets range (e.g., A:D) |
| Automate execution      | Replace the manual trigger with a Cron or Webhook node               |
| Filter users            | Add an If node after transformation to include/exclude data          |
| Send email notification | Add a Gmail or SMTP node after CSV creation                          |

## 🧠 Example Use Case

A recruiter fetches random candidate data daily from an HR API. Data (Name + Country) is saved to Google Sheets. A CSV backup is automatically generated for offline analysis.
## ✅ Benefits

- Hands-free automated data collection
- Centralized storage in Google Sheets for team access
- Built-in CSV export for reporting and backups
- Protects data integrity with API validation
- Fully customizable for any API format

✨ **Tip:** Add a Slack or Telegram node at the end to notify your team whenever new data is added successfully!
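A minimal sketch of the Transform API Data step (step 4), assuming the RandomUser response shape shown in the input/output examples. In an actual n8n Function/Code node you would read the response from `$input` and return `[{ json: ... }]` items; this shows just the transformation itself.

```javascript
// Sketch of step 4: flatten RandomUser-style records to { name, country }.
// In a real n8n Code node, wrap each object as { json: ... } before returning.
function transformUsers(apiResponse) {
  return apiResponse.results.map(user => ({
    name: `${user.name.first} ${user.name.last}`,
    country: user.location.country
  }));
}

const sample = {
  results: [
    { name: { first: 'John', last: 'Doe' }, location: { country: 'United States' } },
    { name: { first: 'Jane', last: 'Smith' }, location: { country: 'Canada' } }
  ]
};
console.log(transformUsers(sample));
```

To extract different fields (the "Change API fields" customization), adjust the object returned inside `map` and widen the Google Sheets range accordingly.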
by Anan
# 📢 Monitor n8n releases and get notifications for new versions 🆕

This workflow automatically monitors n8n’s release channels (latest and beta) and sends you email notifications whenever a new version is published. It also reads the version of your current n8n instance, allowing you to integrate automatic updates and ensure you never miss a release.

## Who is this for

This workflow is designed for n8n users who want to stay informed and up to date with new releases and features without manually checking for updates, especially those managing their own instances who need to plan upgrades and review release notes.

## How it works

The workflow performs the following steps:

- **Fetches version information from the npm registry** (latest and beta releases)
- **Identifies only new versions** by deduplication
- **Retrieves release notes from GitHub** for any newly detected version
- **Converts Markdown to HTML** for email template formatting
- **Sends a styled email notification** including the release name, version tag, your current version, and the complete release notes

## Setup

1. Configure your n8n instance URL (Set my_n8n_url) to detect your current version (optional — can be left blank)
2. Connect and authorize the Gmail account used to send emails
3. Update the recipient email address in the Gmail node

## Requirements

- A Gmail account for sending emails

## Customization tips

- Adjust the schedule trigger if hourly checks are too frequent
- Modify the release channel (e.g., “latest” or “beta”) if you want to track a different tag
- Change the npm registry link if you want to monitor a different package
- Customize the email template/styling in the Gmail node
- Add additional notification channels (Slack, Discord, etc.) alongside or instead of email
- Extend this workflow to automatically update your n8n instance when a new release becomes available

## Need help?

If you're facing any issues using this workflow, join the community discussion on the n8n forum.
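The deduplication step can be sketched as follows. The npm registry's package metadata (e.g., `GET https://registry.npmjs.org/n8n`) includes a `dist-tags` object mapping channels to versions; here we compare it against the channel/version pairs already seen in a previous run (how you persist `seen` — static data, a sheet, a file — is up to your workflow).

```javascript
// Sketch of the deduplication step: given the dist-tags object from the
// npm registry (e.g. { latest: "...", beta: "..." }), keep only the
// channel/version pairs we have not notified about before.
function findNewVersions(distTags, seen) {
  return Object.entries(distTags)
    .filter(([channel, version]) => seen[channel] !== version)
    .map(([channel, version]) => ({ channel, version }));
}

// Illustrative version numbers, not real releases
const distTags = { latest: '1.45.1', beta: '1.46.0' };
const alreadyNotified = { latest: '1.45.1', beta: '1.45.2' };
console.log(findNewVersions(distTags, alreadyNotified));
// only the beta channel changed, so only it is reported
```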
by Yassin Zehar
## Description

This workflow sends a personalized email when a task in a Google Sheet is marked as Urgent, but only once per task. It prevents duplicate notifications by updating the sheet after the email is sent. Ideal for collaborative task tracking where multiple people edit the same spreadsheet.

## Context

When working with shared task lists in Google Sheets, it’s easy to miss critical updates — or worse, trigger multiple alerts for the same task. This workflow ensures that each "Urgent" task only sends one email notification, and then marks it as “Notified” to avoid duplicates.

## Target Users

- Project Managers using Google Sheets
- Operations or support teams managing collaborative task boards
- Anyone who needs alert automation with built-in anti-spam logic

## Technical Requirements

- Google Sheets account with edit access
- Gmail account for sending notifications
- Google Sheet with columns: Priority, Notified, Task, Owner, Deadline, Status, Next Step

## Workflow Steps

1. **Trigger:** Watches for changes in Google Sheets (e.g., edits to the "Priority" column)
2. **IF Node** checks that:
   - Priority = Urgent
   - Notified is empty
   - the row exists (required for update)
3. **Send Email:** Sends a personalized message with task details
4. **Update Row:** Writes “Yes” in the Notified column to avoid duplicate alerts

## Setup Instructions

To set up this workflow:

1. Connect your Google Sheets and Gmail credentials in n8n.
2. Copy the spreadsheet structure or use your own.
3. Import the workflow, select your Sheet (and the column to check if you use a different Google Sheets template), and test by marking a task as “Urgent”.
Then check that an email is sent and the “Notified” column updates to “Yes”.

## Key Features

- ✅ One email per urgent task — prevents duplicates
- 📧 Dynamic email content with task info
- 🧠 Built-in anti-spam logic
- 📋 Simple to configure and reuse
- 💬 Customizable for any team’s needs

## Expected Output

- An email alert is sent only once per task marked as Urgent
- The Notified field is updated in the Google Sheet
- A clean and scalable alert system with no duplicates

**Tutorial video:** Watch the YouTube tutorial video

## About me

I’m Yassin, a Project & Product Manager scaling tech products with data-driven project management.

📬 Feel free to connect with me on LinkedIn
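The IF-node conditions described above boil down to a simple predicate. A sketch, using the column names from the sheet structure (Priority, Notified):

```javascript
// Sketch of the IF-node logic: notify only when a row is Urgent and has
// not been notified yet. Column names match the sheet described above.
function shouldNotify(row) {
  return Boolean(
    row &&                                  // row exists (required for update)
    row.Priority === 'Urgent' &&            // Priority = Urgent
    (!row.Notified || row.Notified === '')  // Notified is still empty
  );
}

console.log(shouldNotify({ Priority: 'Urgent', Notified: '' }));    // eligible
console.log(shouldNotify({ Priority: 'Urgent', Notified: 'Yes' })); // already sent
console.log(shouldNotify({ Priority: 'Normal', Notified: '' }));    // not urgent
```

Writing “Yes” back to the Notified column after sending is what makes the predicate false on the next trigger, which is the anti-spam guarantee.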
by Yuki Hirota
# Task Deadline Reminder Workflow (Today / 3-Day / 7-Day)

Managing task deadlines manually is inefficient and leads to missed deadlines—especially when teams rely on spreadsheets and individual reminders. This workflow automates the entire follow-up process by reading a centralized task sheet in Google Sheets every morning, checking the deadline for each task, and sending automatic email notifications to the responsible person based on urgency. Tasks due today, within three days, or within one week are identified and routed to customized Gmail notifications, ensuring that every team member is aware of upcoming deadlines without manual checking.

## Who’s it for

This workflow is ideal for teams and organizations that manage multiple tasks across departments and need a reliable way to stay on top of deadlines. It is especially useful for:

- Project managers coordinating many deadlines
- Back-office teams monitoring routine operational tasks
- Organizations with distributed members
- Anyone who relies on spreadsheets but needs automated follow-up

By integrating Google Sheets, n8n, and Gmail, you gain a proactive notification system that keeps everyone aligned and reduces the risk of forgotten tasks.

## How it works

### 1. Daily trigger

The workflow runs every morning at 9:00 using a Schedule Trigger.

### 2. Load task list from Google Sheets

The workflow retrieves all rows from the designated spreadsheet, including task name, deadline, responsible person, and email address.

### 3. Process tasks individually

A loop node evaluates each task one by one.

### 4. Evaluate deadline conditions

- **Due today:** Deadline matches today’s date
- **Due within 3 days:** Deadline falls between today and three days ahead
- **Due within 7 days:** Deadline falls between today and one week ahead
### 5. Send notifications

Depending on urgency, a different message is sent:

- “本日が締め切りです” (“Today is the deadline”) for tasks due today
- “タスク期限が三日前となりました” (“Your task deadline is three days away”) for tasks due within 3 days
- “タスクの期限が一週間以内です” (“Your task deadline is within one week”) for tasks due within 7 days

Each email is automatically sent to the responsible person based on the “メールアドレス” (email address) field in the sheet.

### 6. Complete processing

The loop continues until all task rows have been checked.

## How to set up

1. Import the workflow into your n8n instance
2. Authenticate Google Sheets and select the task spreadsheet
3. Authenticate Gmail as the sender account
4. Confirm the required columns: タスク (task), 期限 (deadline), 担当 (assignee), メールアドレス (email address)
5. Adjust the time, message text, or conditions based on your internal rules

## Requirements

- Active n8n instance
- Google Sheets access with permission to read the task list
- Gmail OAuth connection for email sending
- Spreadsheet with at least: task name, deadline, responsible person, email address

## How to customize

You can expand and refine this workflow to match your company’s processes:

- Add Slack, Chatwork, or LINE notifications
- Add overdue task detection
- Add task priority sorting (High / Medium / Low)
- Log notifications back into the spreadsheet
- Send daily summary reports to managers

This workflow provides a flexible foundation for building a complete automated task governance system.
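The deadline evaluation in step 4 can be sketched as a classifier over the number of days remaining. Dates are compared at day granularity to mirror the today / 3-day / 7-day buckets; the extra `overdue` label is an assumption matching the "overdue task detection" customization, not part of the base workflow.

```javascript
// Sketch of step 4: classify each task by days remaining until its deadline.
// Comparison is at whole-day (UTC) granularity.
function classifyDeadline(deadline, today = new Date()) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const toDay = d => Math.floor(new Date(d).getTime() / msPerDay);
  const daysLeft = toDay(deadline) - toDay(today);

  if (daysLeft === 0) return 'due-today';
  if (daysLeft > 0 && daysLeft <= 3) return 'due-within-3-days';
  if (daysLeft > 3 && daysLeft <= 7) return 'due-within-7-days';
  // Assumption: extra bucket for the "overdue detection" customization
  return daysLeft < 0 ? 'overdue' : 'not-yet';
}

const today = new Date('2025-06-01T09:00:00Z');
console.log(classifyDeadline('2025-06-01T00:00:00Z', today));
console.log(classifyDeadline('2025-06-03T00:00:00Z', today));
console.log(classifyDeadline('2025-06-07T00:00:00Z', today));
```

Each bucket then routes to the matching Gmail message shown above.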
by Javier Rieiro
## Description

Automates daily CVE-driven scanning against bug bounty scopes. It fetches bug-bounty domains, pulls newly published Project Discovery templates, converts them to Nuclei rules, runs targeted scans, and emails findings.

## Objective

Help security researchers and bug bounty hunters discover exploitable instances quickly by automatically running the latest public templates from Project Discovery against a consolidated bug-bounty scope. Reduce manual steps and maintain continuous reconnaissance.

## How it works

1. The workflow accepts or fetches a domain list that covers HackerOne, Bugcrowd, Intigriti, and YesWeHack.
2. It downloads the latest public templates from Project Discovery.
3. For each new template published since the last run it: creates a file, uploads it to a remote host, and converts it to a Nuclei-compatible YAML.
4. It uploads a consolidated domains wordlist to the remote host.
5. It executes Nuclei with the new templates against the domains list using configured flags (concurrency, rate limits, severity tags).
6. It collects and deduplicates Nuclei output.
7. If results exist, it sends the findings via Gmail.

## Requirements

- SSH access (root or equivalent) to a VPS or host.
- Nuclei installed on the remote host.
- Gmail OAuth2 credentials for sending notifications.
- Recommended: a VPS with enough CPU and network capacity for concurrent scanning when the scope is large.
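The collect-and-deduplicate step (step 6) might be sketched as below, assuming Nuclei's JSON-lines export, whose records carry fields such as `template-id`, `host`, and `matched-at`. The composite key chosen here is an illustrative assumption; drop `matched-at` from it if you prefer one finding per host per template.

```javascript
// Sketch of step 6: parse Nuclei JSONL output and drop duplicate findings.
// The dedup key (template-id | host | matched-at) is an illustrative choice.
function dedupeFindings(lines) {
  const seen = new Set();
  const unique = [];
  for (const line of lines) {
    if (!line.trim()) continue;
    const f = JSON.parse(line);
    const key = `${f['template-id']}|${f.host}|${f['matched-at'] || ''}`;
    if (!seen.has(key)) {
      seen.add(key);
      unique.push(f);
    }
  }
  return unique;
}

const raw = [
  '{"template-id":"CVE-2024-0001","host":"a.example.com","matched-at":"https://a.example.com/login"}',
  '{"template-id":"CVE-2024-0001","host":"a.example.com","matched-at":"https://a.example.com/login"}',
  '{"template-id":"CVE-2024-0002","host":"b.example.com"}'
];
console.log(dedupeFindings(raw).length);
```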
by Oneclick AI Squad
This n8n workflow ensures instant notifications to parents and staff during school emergencies. It processes incoming alerts via webhooks, filters active emergencies, and sends notifications through email and Slack.

## Key Features

- **Instant Alerts:** Triggers notifications immediately upon detecting emergencies.
- **Multi-Channel:** Sends alerts via email and Slack for broad reach.
- **Automated Filtering:** Identifies and processes only active emergency alerts.
- **Reliable Delivery:** Ensures notifications reach parents and staff swiftly.
- **No Action Needed:** Skips inactive alerts without further processing.

## Workflow Process

1. **Webhook Trigger:** Receives POST requests with emergency data.
2. **Filter Emergency Alerts:** Checks and validates active emergency alerts.
3. **Send Email Alert:** Delivers email notifications to parents and staff.
4. **Send Slack Alert:** Posts real-time messages to a Slack channel.
5. **No Action for Inactive:** Ignores and stops for inactive alerts.

## Setup Instructions

1. **Import Workflow:** Load the workflow into n8n using the import feature.
2. **Configure Webhook:** Set up a webhook URL to receive emergency data.
3. **Set Up Notifications:** Add email (e.g., Gmail) and Slack credentials.
4. **Activate:** Save and enable the workflow in n8n.
5. **Test:** Simulate an alert to ensure notifications work.

## Requirements

- **n8n Instance:** Hosted or cloud-based n8n environment.
- **Webhook Source:** System to send emergency data via POST.
- **Email Service:** SMTP setup for email alerts.
- **Slack Integration:** Configured Slack workspace for alerts.

## Customization Options

- **Add Channels:** Include SMS or other platforms for alerts.
- **Adjust Filters:** Modify criteria for active alerts.
- **Custom Messages:** Tailor email/Slack content for clarity.
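The "Filter Emergency Alerts" step could look like the sketch below. The payload field names (`status`, `type`, `message`) are hypothetical, since the template does not specify its webhook schema; map them to whatever your alert source actually sends.

```javascript
// Sketch of the "Filter Emergency Alerts" step. Field names are
// hypothetical — adjust to your webhook source's real schema.
function isActiveEmergency(payload) {
  return Boolean(
    payload &&
    payload.status === 'active' &&   // only active alerts proceed
    payload.type &&                  // an emergency type is present
    payload.message                  // and there is a message to forward
  );
}

console.log(isActiveEmergency({ status: 'active', type: 'fire-drill', message: 'Evacuate now' }));
console.log(isActiveEmergency({ status: 'resolved', type: 'fire-drill', message: 'All clear' }));
```

Alerts that fail the predicate take the "No Action for Inactive" branch and the workflow simply stops for them.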
by Oneclick AI Squad
This automated n8n workflow enables launching AWS EC2 instances directly from a Google Sheets document. Users can specify instance details (e.g., region, instance type, key pair) in a Google Sheet, triggering the workflow to create EC2 instances via the AWS API. The workflow updates the sheet with instance information and sends confirmation emails.

## Fundamental Aspects

- **Google Sheets Trigger:** Initiates the workflow when a new row is added or updated in the Google Sheet.
- **Extract Instance Details:** Parses region, instance type, key pair name, and instance name from the sheet.
- **Validate Inputs:** Checks for required fields and valid AWS configurations.
- **Launch EC2 Instance:** Uses the AWS EC2 API to launch the specified instance.
- **Update Google Sheet:** Adds the instance ID and status to the sheet.
- **Send Confirmation Email:** Notifies the user via email with instance details.

## Setup Instructions

1. **Import the Workflow into n8n:** Download the workflow JSON and import it via the n8n interface.
2. **Configure API Credentials:**
   - Set up Google Sheets API credentials with appropriate permissions.
   - Configure AWS IAM credentials with EC2 launch permissions.
   - Configure SMTP credentials for email notifications.
3. **Prepare Google Sheet:** Create a sheet with columns for region, instance type, key pair name, instance name, instance ID, and status.
4. **Run the Workflow:** Activate the Google Sheets trigger and test by adding a row with instance details.
5. **Verify Responses:** Check the Google Sheet for updated instance IDs and emails for confirmation.
6. **Adjust Parameters:** Fine-tune AWS region settings or email templates as needed.

## Technical Dependencies

- **Google Sheets API:** For reading and writing data.
- **AWS EC2 API:** For launching and managing instances.
- **SMTP Service:** For sending confirmation emails.
- **n8n:** For workflow automation and integration.

## Customization Possibilities

- **Add Instance Types:** Support additional EC2 instance types.
- **Enhance Validation:** Add checks for AWS limits or quotas.
- **Support Tags:** Include custom tags for launched instances.
- **Add Logging:** Integrate with a logging service for workflow tracking.
- **Customize Emails:** Adjust email content or add attachments.
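The "Validate Inputs" step for a sheet row might look like the sketch below. The column names (`region`, `instanceType`, `keyPairName`, `instanceName`) are assumptions based on the sheet layout described above, and the region pattern is only a loose sanity check (e.g., `us-east-1`), not a definitive list of valid AWS regions.

```javascript
// Sketch of the "Validate Inputs" step for one sheet row.
// Column names are assumptions; the region regex is a loose sanity check.
function validateRow(row) {
  const errors = [];
  for (const field of ['region', 'instanceType', 'keyPairName', 'instanceName']) {
    if (!row[field]) errors.push(`missing required field: ${field}`);
  }
  if (row.region && !/^[a-z]{2}-[a-z]+-\d$/.test(row.region)) {
    errors.push(`region "${row.region}" does not look like an AWS region`);
  }
  return { valid: errors.length === 0, errors };
}

console.log(validateRow({ region: 'us-east-1', instanceType: 't3.micro', keyPairName: 'my-key', instanceName: 'web-1' }));
console.log(validateRow({ region: 'nowhere', instanceType: 't3.micro' }));
```

Routing invalid rows to a "write error back to the status column" branch is a natural extension, so spreadsheet users see why a launch was refused.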
by Wessel Bulte
# TenderNed Public Procurement

## What This Workflow Does

This workflow automates the collection of public procurement data from TenderNed (the official Dutch tender platform). It:

- Fetches the latest tender publications from the TenderNed API
- Retrieves detailed information in both XML and JSON formats for each tender
- Parses and extracts key information like organization names, titles, descriptions, and reference numbers
- Filters results based on your custom criteria
- Stores the data in a database for easy querying and analysis

## Setup Instructions

This template comes with sticky notes providing step-by-step instructions in Dutch and various query options you can customize.

### Prerequisites

- **TenderNed API Access:** Register at TenderNed for API credentials

### Configuration Steps

1. **Set up TenderNed credentials:**
   - Add HTTP Basic Auth credentials with your TenderNed API username and password
   - Apply these credentials to the three HTTP Request nodes: "Tenderned Publicaties", "Haal XML Details", and "Haal JSON Details"
2. **Customize filters:**
   - Modify the "Filter op ..." node to match your specific requirements
   - Examples: specific organizations, contract values, regions, etc.

## How It Works

### Step 1: Trigger

The workflow can be triggered either manually for testing or automatically on a daily schedule.

### Step 2: Fetch Publications

Makes an API call to TenderNed to retrieve a list of recent publications (up to 100 per request).

### Step 3: Process & Split

Extracts the tender array from the response and splits it into individual items for processing.

### Step 4: Fetch Details

For each tender, the workflow makes two parallel API calls:

- **XML endpoint** retrieves the complete tender documentation in XML format
- **JSON endpoint** fetches metadata including reference numbers and keywords

### Step 5: Parse & Merge

Parses the XML data and merges it with the JSON metadata and batch information into a single data structure.
### Step 6: Extract Fields

Maps the raw API data to clean, structured fields including:

- Publication ID and date
- Organization name
- Tender title and description
- Reference numbers (kenmerk, TED number)

### Step 7: Filter

Applies your custom filter criteria to focus on relevant tenders only.

### Step 8: Store

Inserts the processed data into your database for storage and future analysis.

## Customization Tips

### Modify API Parameters

In the "Tenderned Publicaties" node, you can adjust:

- `offset`: starting position for pagination
- `size`: number of results per request (max 100)
- Additional query parameters for date ranges, status filters, etc.

### Add More Fields

Extend the "Splits Alle Velden" node to extract additional fields from the XML/JSON data, such as:

- Contract value estimates
- Deadline dates
- CPV codes (procurement classification)
- Contact information

### Integrate Notifications

Add a Slack, Email, or Discord node after the filter to get notified about new matching tenders.

### Incremental Updates

Modify the workflow to only fetch new tenders by:

- Storing the last execution timestamp
- Adding date filters to the API query
- Only processing publications newer than the last run

## Troubleshooting

No data returned?

- Verify your TenderNed API credentials are correct
- Check that you have set up your filter properly

Need help setting this up, or interested in a complete tender analysis solution? Get in touch: 🔗 LinkedIn – Wessel Bulte
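The "Incremental Updates" customization can be sketched as below: persist the timestamp of the last run and keep only publications newer than it. The `publicatieDatum` field name is an assumption for illustration; use whatever date field the TenderNed response actually exposes.

```javascript
// Sketch of incremental updates: process only publications newer than
// the last run. `publicatieDatum` is an assumed field name.
function selectNewPublications(publications, lastRunIso) {
  const lastRun = new Date(lastRunIso);
  return publications.filter(p => new Date(p.publicatieDatum) > lastRun);
}

const pubs = [
  { id: 'a', publicatieDatum: '2025-05-30T08:00:00Z' },
  { id: 'b', publicatieDatum: '2025-06-02T08:00:00Z' }
];
const fresh = selectNewPublications(pubs, '2025-06-01T00:00:00Z');
console.log(fresh.map(p => p.id));
```

Combined with a date filter in the API query itself, this keeps both the request size and the downstream XML/JSON detail calls small.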
by Shahrear
# 🧾 Image Extraction Pipeline (Google Drive + VLM Run + n8n)

## ⚙️ What This Workflow Does

This workflow automates the process of extracting images from uploaded documents in Google Drive using the VLM Run Execute Agent, then downloads and saves those extracted images into a designated Drive folder.

## 🧩 Requirements

- **Google Drive OAuth2 credentials**
- **VLM Run API credentials** with Execute Agent access
- A reachable n8n Webhook URL (e.g., `/image-extract-via-agent`)

## ⚡ Quick Setup

1. Configure Google Drive OAuth2 and create an upload folder and a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.

## ⚙️ How It Works

1. **Monitor Uploads** – The workflow watches a specific Google Drive folder for new file uploads (e.g., receipts, reports, or PDFs).
2. **Download File** – When a file is created, it’s automatically downloaded in binary form.
3. **Extract Images (VLM Run)** – The file is sent to the VLM Run Execute Agent, which analyzes the document and extracts image URLs via its callback.
4. **Receive Image Links (Webhook)** – The workflow’s Webhook node listens for the agent’s response containing extracted image URLs.
5. **Split & Download** – The Split Out node processes each extracted link, and the HTTP Request node downloads each image.
6. **Save Image** – Finally, each image is uploaded to your chosen Google Drive folder for storage or further processing.

## 💡 Why Use This Workflow

Manual image extraction from PDFs and scanned files is repetitive and error-prone. This pipeline automates it using VLM Run, a vision-language AI service that:

- Understands document layout and structure
- Handles multi-page and mixed-content files
- Extracts accurate image data with minimal setup
For example, the output contains URLs to extracted images:

```json
{
  "image_urls": [
    "https://vlm.run/api/files/img1.jpg",
    "https://vlm.run/api/files/img2.jpg"
  ]
}
```

It works with both images and PDFs.

## 🧠 Perfect For

- Extracting photos or receipts from multi-page PDFs
- Archiving embedded images from reports or invoices
- Preparing image datasets for labeling or ML model training

## 🛠️ How to Customize

You can extend this workflow by:

- Adding naming conventions or folder structures based on upload type
- Integrating Slack/Email notifications when extraction completes
- Including metadata logging (file name, timestamp, source) into Google Sheets or a database
- Chaining with classification or OCR workflows using VLM Run’s other agents

## ⚠️ Community Node Disclaimer

This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
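The Split & Download step can be sketched as follows: turn the callback payload shown above into one n8n-style item per image URL, ready for an HTTP Request node to fetch. The payload shape follows the example response; wrapping each value as `{ json: ... }` is how n8n Code nodes emit items.

```javascript
// Sketch of the Split & Download step: one item per extracted image URL.
// Payload shape follows the example callback response above.
function splitImageUrls(payload) {
  return (payload.image_urls || []).map(url => ({ json: { url } }));
}

const callback = {
  image_urls: [
    'https://vlm.run/api/files/img1.jpg',
    'https://vlm.run/api/files/img2.jpg'
  ]
};
console.log(splitImageUrls(callback));
```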
by EoCi - Mr.Eo
## 🎯 What This Does

Automatically finds PDF files in Google Drive and extracts their information. Use it to pull out clean output. It then formats the output into a clean JSON object.

## 🔄 How It Works

1. **Manual Trigger** starts the process.
2. 🔎 **Find File:** The Google Drive node finds the PDF file(s) in a specified folder and downloads them.
3. 📝 **Extract Raw Text:** The Extract From File node pulls the text content from the retrieved file(s).
4. ✅ **Output Clean Data:** The Code node refines the extracted content and runs custom code for cleaning and final formatting.

## 🚀 Setup Guidelines

### Setup Requirements

- **Google Drive Account:** A Google Drive with an empty folder, or a folder that contains the PDF file(s) you want to process.
- **API Keys:** Gemini, Google Drive.

### Set up steps

Setup time: < 5 minutes

1. **Add Credentials in n8n:** Ensure your Google Drive OAuth2 and Google Gemini (PaLM) API credentials are created and connected. Go to Credentials > New to add them if you haven't created them yet.
2. **Configure the Search Node (Get PDF Files/File):**
   - Open the node and select your Google Drive credential.
   - In the "Resource" field, choose File/Folder.
   - In the "Search Method" field, select "Search File/Folder Name", and in "Search Query" type `*.pdf`.
   - Add two filters: in the "Folder" filter, click the dropdown, choose "From List", and connect to the folder you created on your Google Drive; in the "What to Search" filter, select file.
   - Add "Options" (optional): click "Add option" and choose "ID" and "Name".
3. **Define Extraction Rules (Extract Files/File's Data):** Open the node and, in the dropdown below the "Operation" section, choose "Extract From PDF". Next, in the "Input Binary Field" section, keep the default `data`.
4. **Clean & Format Data (Optional):** Adjust the Get PDF Data Only node to keep only the fields you need and give them friendly names. Modify the Data Parser & Cleaner node if you need to perform custom transformations.
5. **Activate and Run:** Save and activate the workflow.
Click "Execute Workflow" to run it manually and check the output. That’s it! Once configured, this workflow becomes your personal data assistant. Run it anytime you need to extract information quickly and accurately, saving you hours of manual work and ensuring your data is always ready to use.
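The "Output Clean Data" Code-node step might look like the sketch below: tidy the raw text pulled from a PDF and wrap it in a clean JSON object. The specific cleanup rules (whitespace collapsing) and field names (`fileName`, `characters`, `text`) are illustrative assumptions, not the template's exact code.

```javascript
// Sketch of the "Output Clean Data" step: normalize raw PDF text and
// return a clean JSON object. Cleanup rules and field names are
// illustrative assumptions.
function cleanExtractedText(fileName, rawText) {
  const text = rawText
    .replace(/\r\n/g, '\n')       // normalize line endings
    .replace(/[ \t]+/g, ' ')      // collapse runs of spaces/tabs
    .replace(/\n{3,}/g, '\n\n')   // collapse excessive blank lines
    .trim();
  return { fileName, characters: text.length, text };
}

const raw = 'Invoice   #42\r\n\r\n\r\n\r\nTotal:\t $100.00   ';
console.log(cleanExtractedText('invoice.pdf', raw));
```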