by System Admin
Define URLs in an array, then call the Firecrawl scrape endpoint:

```shell
curl -X POST https://api.firecrawl.dev/v1/scrape \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer YOUR_API_KEY' \
  -d '{ "url": "https://docs.firecrawl.dev", ...
```
by Dahiana
AI Content Summarizer Suite

This n8n template collection demonstrates how to build a comprehensive AI-powered content summarization system that handles multiple input types: URLs, raw text, and PDF files. It is built as 4 separate workflows for maximum flexibility.

Use cases: research workflows, content curation, document processing, meeting prep, social media content creation, or integrating smart summarization into any app or platform.

How it works
- Multi-input handling: separate workflows for URLs (web scraping), direct text input, and PDF file processing
- Smart PDF processing: attempts text extraction first, then falls back to OCR.Space for image-based PDFs
- AI summarization: uses OpenAI's GPT-4.1-mini with customizable length (brief/standard/detailed) and focus areas (key points/numbers/conclusions/action items)
- Language support: multi-language summaries with automatic language detection
- Flexible output: returns clean, markdown-formatted summaries via webhook responses
- Unified option: the all-in-one workflow automatically detects the input type and routes accordingly

How to use
- Replace the webhook triggers with your preferred trigger (manual, form, API endpoint)
- Each workflow accepts different parameters: URL, text content, or file upload
- Customize summary length and focus in the AI prompt nodes
- Authentication is optional; switch it to "none" if running internally
- Perfect for integration with Bubble, Zapier, or any platform that can make HTTP requests

Requirements
- OpenAI API key or OpenRouter key
- OCR.Space API key (for PDF fallback processing)
- n8n instance (cloud or self-hosted)
- Any platform that can make HTTP requests

Setup Steps
1. Replace "Dummy OpenAI" with your OpenAI credentials
2. Add your OCR.Space API key in the OCR nodes (optional; only needed for image-based PDF fallback)
3. Update webhook authentication as needed
4. Test each workflow path individually
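The "unified option" mentioned above routes by input type. A minimal sketch of that detection step in a Code node might look like this; the field names (`url`, `text`, `file`) are illustrative assumptions, not the template's actual payload schema:

```javascript
// Hypothetical input-type detector for the all-in-one workflow:
// decide whether an incoming webhook payload is a URL, raw text, or a PDF upload.
function detectInputType(payload) {
  // PDF uploads are checked first so a file wins over any stray text field
  if (payload.file && payload.file.mimeType === 'application/pdf') return 'pdf';
  if (payload.url && /^https?:\/\//i.test(payload.url.trim())) return 'url';
  if (payload.text && payload.text.trim().length > 0) return 'text';
  return 'unknown';
}

// Example payloads as a webhook might deliver them
console.log(detectInputType({ url: 'https://example.com/article' }));    // 'url'
console.log(detectInputType({ text: 'Summarize this paragraph...' }));   // 'text'
console.log(detectInputType({ file: { mimeType: 'application/pdf' } })); // 'pdf'
```

Each branch would then feed the corresponding summarization workflow.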
by Fahmi Fahreza
TikTok Trend Analyzer with Apify + Gemini + Airtable Automatically scrape trending TikTok videos, analyze their virality using Gemini AI, and store insights directly into Airtable for creative research or content planning. Who’s it for? Marketing analysts, creators, and creative agencies looking to understand why videos go viral and how to replicate successful hooks and formats. How it works A scheduled trigger runs the Apify TikTok Trends Scraper weekly. The scraper collects trending video metadata. Data is stored in Airtable (views, likes, captions, sounds, etc.). When a specific video is submitted via webhook, the workflow fetches it from Airtable. Gemini AI analyzes the video and extracts structured insights: summary, visual hook, audio, and subtitle analysis. The workflow updates the Airtable record with these AI insights. How to set up Connect Apify and Airtable credentials, link Gemini or OpenAI keys, and adjust the schedule frequency. Add your Airtable base and table IDs. You can trigger analysis manually via the webhook endpoint.
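The step that stores scraped metadata in Airtable amounts to a field mapping. A sketch of that mapping is below; the input keys mirror typical Apify TikTok scraper output, but both the input and the Airtable column names are assumptions:

```javascript
// Illustrative mapping from one scraped TikTok item to an Airtable record.
function toAirtableFields(video) {
  return {
    'Video URL': video.webVideoUrl || '',
    'Caption': video.text || '',
    'Views': Number(video.playCount) || 0,   // counts may arrive as strings
    'Likes': Number(video.diggCount) || 0,
    'Sound': (video.musicMeta && video.musicMeta.musicName) || 'unknown',
  };
}

const sample = {
  webVideoUrl: 'https://tiktok.com/@a/video/1',
  text: 'hook demo',
  playCount: '12000',
  diggCount: 900,
  musicMeta: { musicName: 'trending-sound' },
};
console.log(toAirtableFields(sample).Views); // 12000
```

The Gemini analysis step would later update the same record with summary, hook, audio, and subtitle fields.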
by Atik
Automate multi-document handling with AI-powered extraction that adapts to any format and organizes it instantly.

What this workflow does
- Monitors Google Drive for new uploads (receipts, resumes, claims, physician orders, blueprints, or any doc type)
- Automatically downloads and prepares files for analysis
- Identifies the document type using Google Gemini
- Parses structured data via the trusted VLM Run node with OCR + layout parsing
- Stores records in Google Sheets; the AI Agent maps values to the correct sheet dynamically

Setup
Prerequisites: Google Drive & Google Sheets accounts, VLM Run API credentials, n8n instance. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can integrate it directly for high-accuracy data extraction.

Quick Setup:
1. Configure Google Drive OAuth2 and select a folder for uploads
2. Add VLM Run API credentials
3. Create a Master Reference Google Sheet with the following structure:

| Document_Name          | Spreadsheet_ID                |
| ---------------------- | ----------------------------- |
| Receipt                | your-receipt-sheet-id         |
| Resume                 | your-resume-sheet-id          |
| Physician Order        | your-physician-order-sheet-id |
| Claims Processing      | your-claims-sheet-id          |
| Construction Blueprint | your-blueprint-sheet-id       |

The first column holds the document type, and the second column holds the target sheet ID where extracted data should be appended.
In the AI Agent node, edit the agent prompt to:
- Analyze the JSON payload from VLM Run
- Look up the document type in the Master Reference Sheet
- If a matching sheet exists → fetch headers, then append data accordingly
- If headers don’t exist → create them from JSON keys, then insert values
- If no sheet exists → add the new type to the Master Reference with an empty Spreadsheet ID

Test with a sample upload and activate the workflow.

How to customize this workflow to your needs
Extend functionality by:
- Adjusting the AI Agent prompt to support any new document schema (just update field mappings)
- Adding support for multi-language OCR or complex layouts in VLM Run
- Linking Sheets data to BI dashboards or reporting tools
- Triggering notifications when new entries are stored

This workflow leverages the VLM Run node for flexible, precision extraction and the AI Agent for intelligent mapping, creating a powerful system that adapts to any document type with minimal setup changes.
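The "create headers from JSON keys, then insert values" behavior the prompt describes can be sketched as two small helpers. This assumes a flat JSON payload from VLM Run; nested payloads would need flattening first:

```javascript
// Derive sheet headers from an extracted document payload.
function headersFromPayload(payload) {
  return Object.keys(payload);
}

// Build one sheet row in header order; missing keys become empty cells
// so the row always matches the header width.
function rowForHeaders(payload, headers) {
  return headers.map((h) => (payload[h] !== undefined ? String(payload[h]) : ''));
}

const receipt = { vendor: 'Acme', total: 42.5, date: '2025-01-15' };
const headers = headersFromPayload(receipt);
console.log(headers);                         // ['vendor', 'total', 'date']
console.log(rowForHeaders(receipt, headers)); // ['Acme', '42.5', '2025-01-15']
```

In the real workflow the AI Agent performs this mapping itself; the sketch only shows the shape of the transformation.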
by Arlin Perez
🔍 Description: Effortlessly delete unused or inactive workflows from your n8n instance while automatically backing them up as .json files into your Google Drive. Keep your instance clean, fast, and organized — no more clutter slowing you down. This workflow is ideal for users managing large self-hosted n8n setups, or anyone who wants to maintain optimal performance while preserving full workflow backups. ✅ What it does: Accepts a full n8n Workflow URL via a form Retrieves workflow info automatically Converts the workflow’s full JSON definition into a file Uploads that file to Google Drive Deletes the workflow safely using the official n8n API Sends a Telegram notification confirming backup and deletion ⚙️ How it works: 📝 Form – Collects the full workflow URL from the user 🔍 n8n Node (Get Workflow) – Uses the URL to fetch workflow details 📦 Code Node ("JSON to File") – Converts the workflow JSON into a properly formatted .json file with UTF-8 encoding, ready to be uploaded to Google Drive. ☁️ Google Drive Upload – Uploads the .json backup file to your selected Drive folder 🗑️ n8n Node (Delete Workflow) – Deletes the workflow from your instance using its ID 📬 Telegram Notification – Notifies you that the workflow was backed up and deleted, showing title, ID, and date 📋 Requirements: Google Drive connected to your n8n account Telegram Bot connected to n8n An n8n instance with API access (self-hosted or Cloud) Your n8n API Key (Create one in the settings) 🛠️ How to Set Up: ✅ Add your Google Drive credentials ✅ Add your Telegram Bot credentials 🧾 In the “JSON to File” Code node, no additional setup is required — it automatically converts the workflow JSON into a downloadable .json file using the correct encoding and filename format. ☁️ In the Google Drive node: Binary Property: data Folder ID: your target folder in Google Drive 🔑 Create a new credential for the n8n node using: API Key: your personal n8n API key Base URL: your full n8n instance API path (e.g. 
https://your-n8n-instance.com/api/v1) ⚙️ Use this credential in both the Get Workflow and Delete Workflow n8n nodes 📬 In the Telegram node, use this message template: 🗑️ Workflow "{{ $json.name }}" (ID: {{ $json.id }}) was backed up to Google Drive and deleted from n8n. 📅 {{ $now }} 🔒 Important: This workflow backs up the entire workflow data to Google Drive. Please be careful with the permissions of your Google Drive folder and avoid sharing it publicly, as the backups may contain sensitive information. Ensuring proper security and access control is essential to protect your data. 🚀 Activate the workflow and you're ready to safely back up and remove workflows from your n8n instance
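As an illustration of the "JSON to File" Code node described above, here is a minimal sketch: serialize the fetched workflow and expose it as UTF-8 data for the Google Drive upload. The filename pattern and return shape are assumptions, not the template's exact code:

```javascript
// Convert a workflow object into a .json backup file representation.
function workflowToFile(workflow) {
  const json = JSON.stringify(workflow, null, 2);
  return {
    fileName: `${workflow.name || 'workflow'}-${workflow.id || 'backup'}.json`,
    mimeType: 'application/json',
    // Base64 here stands in for n8n's binary 'data' property
    base64: Buffer.from(json, 'utf-8').toString('base64'),
  };
}

const file = workflowToFile({ id: '123', name: 'My Flow', nodes: [], connections: {} });
console.log(file.fileName); // 'My Flow-123.json'
```

The round trip is lossless: decoding the base64 and parsing the JSON recovers the full workflow definition, which is what makes the Drive backup restorable.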
by Meak
Firecrawl Web Search Agent → Google Sheets Logger with OpenRouter + n8n Most teams craft search operators by hand and copy results into spreadsheets. This workflow automates query generation, multi-operator searches, scraping, and logging — from a single webhook call. Benefits Auto-generate Firecrawl queries from natural language (OpenRouter Agent) Use pro operators: site:, inurl:, intitle:, exclusions, related Run parallel searches (site match, in-URL, exclusions, YouTube/intitle) Append titles/URLs/results to Google Sheets automatically Return results to the caller via webhook response Optional scraping of markdown + full-page screenshots How It Works Webhook receives a natural-language search request OpenRouter-powered Agent converts it to a Firecrawl query (+ limit) Firecrawl Search runs with scrapeOptions (markdown, screenshot) Parallel queries: site:, inurl:, negative filters, YouTube intitle:automation Collect results (title, url, data fields) from each call Append rows to Google Sheets (one per result) Respond to the webhook with the aggregated payload Ready to chain into alerts, enrichment, or CRM sync Who Is This For Researchers and content teams building source lists Growth/SEO teams needing precise operator queries Agencies automating discovery, monitoring, and logging Setup Connect OpenRouter (select your LLM; e.g., GPT-4.1-mini) Add Firecrawl API key and endpoint (/v1/search) Connect Google Sheets (Document ID + Sheet/Tab) Set webhook path and allow POST from your app Define default limit (fallback = 5) and scrapeOptions ROI & Monetization Save 3–6 hours/week on manual searching & copy/paste Offer as a $500–$2k/month research automation for clients Upsell alerts (cron/webhook) and data enrichment for premium retainers Strategy Insights In the full walkthrough, I show how to: Prompt the Agent to produce flawless site:/inurl/intitle/-exclusions Map Firecrawl data fields cleanly into Sheets Handle rate limits, empty results, and retries Extend with 
dedupe, domain filtering, and Slack/Telegram alerts Check Out My Channel For more advanced AI automation systems that generate real business results, check out my YouTube channel where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
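The parallel operator fan-out described above (site match, in-URL, exclusions, YouTube intitle) can be sketched as a query builder. The operator syntax is standard search syntax; the exact shape of the Agent's output is an assumption here:

```javascript
// Hypothetical helper: expand one natural-language topic into the
// parallel operator queries the workflow runs against Firecrawl Search.
function buildOperatorQueries(topic, { domain, exclude = [] } = {}) {
  const negatives = exclude.map((t) => `-${t}`).join(' ');
  const firstWord = topic.split(' ')[0];
  return {
    site: domain ? `site:${domain} ${topic}` : topic,
    inurl: `inurl:${firstWord} ${topic}`,
    filtered: `${topic} ${negatives}`.trim(),
    youtube: `site:youtube.com intitle:${firstWord}`,
  };
}

console.log(buildOperatorQueries('automation agency', { domain: 'n8n.io', exclude: ['jobs'] }));
// site: 'site:n8n.io automation agency', filtered: 'automation agency -jobs'
```

Each query would then run as its own Firecrawl Search call, with results aggregated before the Google Sheets append.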
by Rapiwa
Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

Who is this for?
This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It’s perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don’t use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

What this Workflow Does
This n8n workflow connects to a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers, checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

Key Features
- **Runs Every 5 Minutes**: Automatically triggers the workflow
- **Google Sheets Integration**: Reads and writes data from a specific sheet
- **Phone Number Validation**: Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message Sending**: Sends a message using Rapiwa's /send-message endpoint
- **Status Update**: The sheet is updated with a success or failure status
- **Safe API Usage**: Delays are added between requests to prevent rate limits
- **Batch Limit**: Processes a maximum of 60 rows per cycle
- **Conditional Checks**: Skips rows without a "check" value

Requirements
- A Google Sheet with the necessary columns
- **Rapiwa account** with an active subscription (you get 200 free messages)
- Your WhatsApp number connected to Rapiwa
- A valid Bearer Token
- **n8n instance** (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

How to Use: Step-by-Step Setup
1. Webhook: receives the Shopify webhook (discount creation) via an HTTP POST request.
This is triggered when a discount is created in your Shopify store.
2. Configure Google Sheets in n8n: use the Google Sheets node with OAuth2 access.
3. Get your Rapiwa API token: create an account on Rapiwa, connect your WhatsApp number, and copy your Bearer Token from the Rapiwa dashboard.
4. Set up the HTTP Request nodes:
- Validate numbers via: https://app.rapiwa.com/api/verify-whatsapp
- Send messages via: https://app.rapiwa.com/api/send-message
- Add your bearer token to the headers

Google Sheet Column Structure
A Google Sheet formatted like this ➤ Sample:

| discount_code | created_at                | shop_domain      | name         | number        | verify     | status   |
| ------------- | ------------------------- | ---------------- | ------------ | ------------- | ---------- | -------- |
| V8ZGVRDFP5TB  | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB  | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified   | sent     |

Support & Help
- **Rapiwa Website**: https://rapiwa.com
- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
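The number-cleaning step that runs before the Rapiwa verify call could look like this. The rules here (strip non-digit characters, require 10 to 15 digits) are reasonable assumptions for international numbers, not Rapiwa's documented validation:

```javascript
// Normalize a raw phone value from the sheet before verification.
// Returns a digits-only string, or null for rows that can't be a number.
function cleanNumber(raw) {
  const digits = String(raw).replace(/\D/g, '');
  if (digits.length < 10 || digits.length > 15) return null; // reject junk rows
  return digits;
}

console.log(cleanNumber('+880 132-282-7798')); // '8801322827798'
console.log(cleanNumber('n/a'));               // null
```

Rows that return null would be marked as failed in the status column instead of being sent to the API, which also conserves your message quota.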
by Evoort Solutions
🚀 Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the powerful SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

⚙️ Node-by-Node Workflow Breakdown

1. 🟢 On Form Submission
- **Trigger**: The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
- **Use Case**: When you want to track multiple websites and monitor their performance over time.

2. 🌐 Website Traffic Checker API
- **Request**: The workflow makes a POST request to the **SEMrush Website Traffic Checker API** via RapidAPI using the submitted website URL.
- **API Data**: The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

3. 🔄 Reformat
- **Parsing**: The raw API response is parsed to extract the relevant data under trafficSummary.
- **Data Structure**: The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

4. 📄 Google Sheets
- **Logging Data**: The traffic data is appended as a new row in your Google Sheet.
- **Google Sheet Setup**: The data is organized and updated in a structured format, allowing you to track website performance over time.

💡 Use Cases
- 📊 **SEO & Digital Marketing Agencies**: Automate client website audits by pulling live traffic data into reports.
- 🌐 **Website Owners & Bloggers**: Monitor traffic growth and analyze content performance automatically.
- 📈 **Data Analysts & Reporting Teams**: Feed traffic data into dashboards and integrate with other KPIs for deeper analysis.
- 🕵️ **Competitor Tracking**: Regularly log competitor site metrics for comparative benchmarking.

🎯 Key Benefits
✅ **Automated Traffic Monitoring** — Run reports automatically on demand or on a scheduled basis.
✅ Real-Time Google Sheets Logging — Easily centralize and structure traffic data for easy sharing and visualization. ✅ Zero Code Required — Powered by n8n’s visual builder, set up workflows quickly without writing a single line of code. ✅ Scalable & Flexible — Extend the workflow to include alerts, additional API integrations, or other automated tasks. 🔐 How to Get Your SEMrush API Key via RapidAPI Visit the API Listing 👉 SEMrush Website Traffic Checker API Sign In or Create an Account Log in to RapidAPI or sign up for a free account. Subscribe to the API Choose the appropriate pricing plan and click Subscribe. Access Your API Key Go to the Endpoints tab. Your API key is located under the X-RapidAPI-Key header. Secure & Use the Key Add your API key to the request headers in your workflow. Never expose the key publicly. 🔧 Step-by-Step Setup Instructions 1. Creating the Form to Capture URL In n8n, create a new workflow and add a Webhook trigger node to capture website URLs. Configure the webhook to accept URL submissions from your form. Add a form to your website or app that triggers the webhook when a URL is submitted. 2. Configure SEMrush API Request Node Add an HTTP Request node after the webhook. Set the method to POST and the URL to the SEMrush API endpoint. Add the necessary headers: X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com X-RapidAPI-Key: [Your API Key] Pass the captured website URL from the webhook as a parameter in the request body. 3. Reformat API Response Add a Set node to parse and structure the API response. Extract only the necessary data, such as: trafficSummary.visits trafficSummary.bounceRate trafficSummary.pageViews trafficSummary.sessions Format the response to be clean and suitable for Google Sheets. 4. Store Data in Google Sheets Add the Google Sheets node to your workflow. Authenticate with your Google account. Select the spreadsheet and worksheet where you want to store the traffic data. 
Configure the node to append new rows with the extracted traffic data.

Google Sheets Columns Setup
- **A**: Website URL
- **B**: Visits
- **C**: Bounce Rate
- **D**: Page Views
- **E**: Sessions
- **F**: Date/Time (optional; you can use a timestamp)

5. Test and Deploy
- Run a test submission through your form to ensure the workflow works as expected.
- Check the Google Sheets document to verify that the data is being logged correctly.
- Set up scheduling or additional workflows as needed (e.g., periodic updates).

📈 Customizing the Template
You can modify the workflow to suit your specific needs:
- **Add more data points**: Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords, etc.).
- **Create separate sheets**: If you're tracking multiple websites, you can create a different sheet for each website or group websites by category.
- **Add alerts**: Set up email or Slack notifications if specific traffic conditions (like sudden drops) are met.
- **Visualize data**: Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

🚀 Start Automating in Minutes
Build your automated website traffic dashboard with n8n today — no coding required. 👉 Start with n8n for Free. Save time, improve accuracy, and supercharge your traffic insights workflow!
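The Reformat step that flattens trafficSummary into the A to F column order can be sketched as a single function. The field names under trafficSummary are assumed from the description above, not taken from the API reference, so verify them against a real response:

```javascript
// Flatten one API response into a Google Sheets row:
// [url, visits, bounceRate, pageViews, sessions, timestamp]
function toSheetRow(url, response) {
  const t = response.trafficSummary || {};
  return [
    url,
    t.visits ?? '',
    t.bounceRate ?? '',
    t.pageViews ?? '',
    t.sessions ?? '',
    new Date().toISOString(), // column F timestamp
  ];
}

const row = toSheetRow('https://example.com', {
  trafficSummary: { visits: 1000, bounceRate: 0.4, pageViews: 3200, sessions: 1100 },
});
console.log(row.slice(0, 5)); // ['https://example.com', 1000, 0.4, 3200, 1100]
```

Missing metrics become empty cells rather than undefined, so the appended row always lines up with the sheet's columns.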
by Daniel
Secure your n8n automations with this comprehensive template that automates periodic backups to Telegram for instant access while enabling flexible restores from Google Drive links or direct file uploads—ensuring quick recovery without data loss. 📋 What This Template Does This dual-branch workflow handles full n8n instance backups and restores seamlessly. The backup arm runs every 3 days, fetching all workflows via the n8n API, aggregating them into a JSON array, converting to a text file, and sending it to Telegram for offsite storage and sharing. The restore arm supports two entry points: manual execution to pull a backup from Google Drive or form-based upload for local files, then parses the JSON, cleans workflows for compatibility, and loops to create missing ones or update existing by name—handling batches efficiently to respect API limits. Scheduled backups with Telegram delivery for easy stakeholder access Dual restore paths: Drive download or direct file upload via form Intelligent create-or-update logic with data sanitization to avoid conflicts Looped processing with existence checks and error continuation 🔧 Prerequisites n8n instance with API enabled (self-hosted or cloud) Telegram account for bot setup Google Drive account (optional, for Drive-based restores) 🔑 Required Credentials n8n API Setup In n8n, navigate to Settings → n8n API Enable the API and generate a new key Add to n8n as "n8n API" credential type, pasting the key in the API Key field Telegram API Setup Message @BotFather on Telegram to create a new bot and get your token Find your chat ID by messaging @userinfobot Add to n8n as "Telegram API" credential type, entering the Bot Token Google Drive OAuth2 API Setup In Google Cloud Console, go to APIs & Services → Credentials Create an OAuth 2.0 Client ID for Web application, enable Drive API Add redirect URI: [your-n8n-instance-url]/rest/oauth2-credential/callback Add to n8n as "Google Drive OAuth2 API" credential type and authorize ⚙️ 
Configuration Steps Import the workflow JSON into your n8n instance Assign the n8n API, Telegram API, and Google Drive credentials to their nodes Update the Telegram chat ID in the "Send Backup to Telegram" node Set the Google Drive file ID in the "Download Backup from Drive" node (from file URL) Activate the workflow and test backup by executing the Schedule node manually Test restore: Run manual trigger for Drive or use the form for upload 🎯 Use Cases Dev teams backing up staging workflows to Telegram for rapid production restores during deployments Solo automators uploading local backups via form to sync across devices after n8n migrations Agencies sharing client workflow archives via Drive links for secure, collaborative restores Educational setups scheduling exports to Telegram for student template distribution and recovery ⚠️ Troubleshooting Backup file empty: Verify n8n API permissions include read access to workflows Restore parse errors: Check JSON validity in backup file; adjust Code node property reference if needed API rate limits hit: Increase Wait node duration or reduce batch size in Loop Form upload fails: Ensure file is valid JSON text; test with small backup first
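The "data sanitization" step mentioned in the restore arm could look like the sketch below. The allow-list of fields is an assumption based on common n8n API behavior (create/update endpoints reject read-only fields like id and tags); check it against your instance's API version:

```javascript
// Strip read-only fields from a backed-up workflow so the n8n API
// accepts it on create-or-update during restore.
function sanitizeWorkflow(wf) {
  const { name, nodes, connections, settings } = wf;
  return {
    name,
    nodes: nodes || [],
    connections: connections || {},
    settings: settings || {},
  };
}

const restored = sanitizeWorkflow({
  id: 'abc', name: 'Backup me', active: true, nodes: [], connections: {}, tags: ['x'],
});
console.log('id' in restored); // false, read-only fields stripped
```

The existence check by name then decides whether the sanitized payload goes to a create call or an update call.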
by Ms. Phuong Nguyen (phuongntn)
An AI Recruiter that screens, scores, and ranks candidates in minutes — directly inside n8n. 🧠 Overview An AI-powered recruiter workflow that compares multiple candidate CVs with a single Job Description (JD). It analyzes text content, calculates fit scores, identifies strengths and weaknesses, and provides automated recommendations. ⚙️ How it works 🔹 Webhook Trigger – Upload one Job Description (JD) and multiple CVs (PDF or text) 🔹 File Detector – Auto-identifies JD vs CV 🔹 Extract & Merge – Reads text and builds candidate dataset 🔹 🤖 AI Recruiter Agent – Compares JD & CVs → returns Fit Score, Strengths, Weaknesses, and Recommendation 🔹 📤 Output Node – Sends structured JSON or summary table for HR dashboards or Chat UI Example: Upload JD.pdf + 3 candidate CVs → get instant JSON report with top match and recommendations. 🧩 Requirements OpenAI or compatible AI Agent connection (no hardcoded API keys). Input files in PDF or text format (English or Vietnamese supported). n8n Cloud or Self-Hosted v1.50+ with AI Agent nodes enabled. 🔸 “OpenAI API Key or n8n AI Agent credential required” 🧱 Customizing this workflow Swap the AI model with Gemini, Claude, or another LLM. Add a Google Sheets export node to save results. Connect to SAP HR or internal employee APIs. Adjust scoring logic or include additional attributes (experience, skills, etc.). 👩💼 Author https://www.linkedin.com/in/nguyen-phuong-17a71a147/ Empowering HR through intelligent, data-driven recruitment.
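In the real workflow the AI Recruiter Agent produces the fit score; as a toy illustration of the structured output shape (score plus strengths), here is a naive keyword-overlap scorer. Everything about it is an assumption for demonstration only:

```javascript
// Toy fit score: fraction of JD keywords (4+ letters) that appear in the CV.
function fitScore(jdText, cvText) {
  const tokens = (s) => new Set(s.toLowerCase().match(/[a-z]{4,}/g) || []);
  const jd = tokens(jdText);
  const cv = tokens(cvText);
  const hits = [...jd].filter((w) => cv.has(w));
  return {
    score: Math.round((hits.length / Math.max(jd.size, 1)) * 100),
    strengths: hits, // overlapping keywords stand in for real strengths
  };
}

console.log(fitScore(
  'Senior Python developer with Docker experience',
  'Python developer, Docker, Kubernetes'
)); // { score: 50, strengths: ['python', 'developer', 'docker'] }
```

An LLM-based agent replaces this with semantic comparison, but the JSON it returns per candidate follows the same score/strengths/weaknesses/recommendation structure.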
by Julian Kaiser
Scan Any Workout Plan into the Hevy App with AI

This workflow automates the creation of workout routines in the Hevy app by extracting exercise information from an uploaded PDF or image using AI.

What problem does this solve?
Tired of manually typing workout plans into the Hevy app? Whether your coach sends them as Google Docs, PDFs, or you have a screenshot of a routine, entering every single exercise, set, and rep is a tedious chore. This workflow ends the madness. It uses AI to instantly scan your workout plan from any file, intelligently extract the exercises, and automatically create the routine in your Hevy account. What used to take 15 minutes of mind-numbing typing now happens in seconds.

How it works
1. Trigger: The workflow starts when a PDF file is submitted through an n8n form.
2. Data Extraction: The PDF is converted to a Base64 string and sent to an AI model to extract the raw text of the workout plan.
3. Context Gathering: The workflow fetches a complete list of available exercises directly from the Hevy API. This list is then consolidated.
4. AI Processing: A Google Gemini model analyzes the extracted text, compares it against the official Hevy exercise list, and transforms the raw text into a structured JSON format that matches the Hevy API requirements.
5. Routine Creation: The final structured data is sent to the Hevy API to create the new workout routine in your account.

Set up steps
**Estimated set up time**: 15 minutes.
1. Configure the On form submission trigger or replace it with your preferred trigger (e.g., Webhook). Ensure it's set up to receive a file upload.
2. Add your API credentials for the AI service (in this case, OpenRouter.ai) and the Hevy app. You will need to create 'Hevy API' and OpenRouter API credentials in your n8n instance.
3. In the Structured Data Extraction node, review the prompt and the JSON schema in the Structured Output Parser. You may need to adjust the prompt to better suit the types of files you are uploading.
4. Activate the workflow.
Test it by uploading a sample workout plan document.
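The step that compares extracted text against the Hevy exercise list amounts to a name lookup before the payload is built. A sketch is below; the field names follow the general shape of Hevy's public API but should be checked against the current docs, and the matching rule (case-insensitive exact name) is an assumption:

```javascript
// Match parsed exercises against the fetched Hevy exercise list
// and shape a routine-creation payload.
function buildRoutine(title, parsed, hevyExercises) {
  const byName = new Map(hevyExercises.map((e) => [e.title.toLowerCase(), e.id]));
  const exercises = parsed
    .map((p) => ({ exercise_template_id: byName.get(p.name.toLowerCase()), sets: p.sets }))
    .filter((e) => e.exercise_template_id); // drop anything we couldn't match
  return { routine: { title, exercises } };
}

const payload = buildRoutine(
  'Push Day',
  [
    { name: 'Bench Press', sets: [{ reps: 8 }, { reps: 8 }] },
    { name: 'Mystery Move', sets: [] }, // no match: silently dropped
  ],
  [{ id: 'abc123', title: 'Bench Press' }]
);
console.log(payload.routine.exercises.length); // 1
```

In the template this matching is delegated to the Gemini model, which can also resolve fuzzy names ("DB press" vs "Dumbbell Bench Press") that an exact lookup would miss.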
by Automate With Marc
Gemini 3 Image & PDF Extractor (Google Drive → Gemini 3 → Summary) Automatically summarize newly uploaded images or PDF reports using Google Gemini 3, triggered directly from a Google Drive folder. Perfect for anyone who needs fast AI-powered analysis of financial reports, charts, screenshots, or scanned documents. 🎥 Watch the full step-by-step video tutorial: https://www.youtube.com/watch?v=UuWYT_uXiw0 What this template does This workflow watches a Google Drive folder for new files and automatically: Detects new uploaded files Uses Google Drive Trigger Watches a specific folder for fileCreated events Filters by MIME type: image/png image/webp application/pdf Downloads the file automatically Depending on the file type: Images → Download via HTTP Request → Send to Gemini 3 Vision PDFs → Download via HTTP Request → Extract content → Send to Gemini 3 Analyzes content using Gemini 3 Two separate processing lanes: 🖼️ Image Lane Image is sent to Gemini 3 (Vision / Image Analyze) Extracts textual + visual meaning from charts, diagrams, or screenshots Passes structured output to an AI Analyst Agent Agent summarizes and highlights top 3 findings 📄 PDF Lane PDF is downloaded Text is extracted using Extract From File Processed using Gemini 3 via OpenRouter Chat Model AI Analyst Agent summarizes charts/tables and extracts insights Why this workflow is useful Save hours manually reading PDFs, charts, and screenshots Convert dense financial or operational documents into digestible insights Great for: Financial analysts Operations teams Market researchers Content & reporting teams Anyone receiving frequent reports via Drive Requirements Before using this template, you will need: Google Drive OAuth credential (for Drive trigger + file download) Gemini 3 / PaLM or OpenRouter API key (Optional) Update folder ID to your own Google Drive target folder ⚠️ No credentials are included in this template. Add them manually after importing it. 
Node Overview Google Drive Trigger Watches a specific Drive folder for newly added files Provides metadata like webContentLink and MIME type Filter by Type (IF Node) Routes files to Image lane or PDF lane png or webp → Image pdf → PDF 🖼️ Image Processing Lane Download Image (HTTP Request) Analyze Image (Gemini Vision) Analyzer Agent Summarizes findings Highlights actionable insights Powered by OpenRouter Gemini 3 📄 PDF Processing Lane Download PDF (HTTP Request) Extract From File → PDF Analyzer Agent (PDF) Summarizes extracted chart/report information Highlights key takeaways Setup Guide Import the template into your n8n workspace Open Google Drive Trigger Select your Drive OAuth credential Replace folder ID with your target folder Open Gemini 3 / OpenRouter AI Model nodes Add your API credentials Test by uploading: A PNG/WebP chart screenshot A multi-page PDF report Check the execution to view summary outputs Customization Ideas Add email delivery (send the summary to yourself daily) Save summaries into: Google Sheets Notion Slack channels n8n Data Tables Add a second agent to convert summaries into: Weekly reports PowerPoint slides Slack-ready bullet points Add classification logic: Revenue reports Marketing analytics Product dashboards Financial charts Troubleshooting Trigger not firing? Confirm your Drive OAuth credential has read access to the folder. Gemini errors? Ensure your model ID matches your API provider: models/gemini-3-pro-preview google/gemini-3-pro-preview PDF extraction empty? Check if the file contains selectable text or only images. (You can add OCR if needed.)
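The Filter by Type routing described above reduces to a small function over the file's MIME type:

```javascript
// Route a Drive file to the image lane, the PDF lane, or skip it entirely.
function routeByMime(mimeType) {
  if (mimeType === 'image/png' || mimeType === 'image/webp') return 'image';
  if (mimeType === 'application/pdf') return 'pdf';
  return 'skip'; // anything else is ignored by the workflow
}

console.log(routeByMime('image/png'));       // 'image'
console.log(routeByMime('application/pdf')); // 'pdf'
console.log(routeByMime('text/plain'));      // 'skip'
```

Extending the workflow to JPEGs, for example, only means adding `image/jpeg` to the first branch and to the IF node's condition.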