by Antonio Gasso
Process multiple invoices automatically using Mistral's dedicated OCR model—at approximately $0.002 per page. Upload batches of PDF, PNG, or JPG invoices through a simple form, extract structured financial data with AI, validate results with confidence scoring, and save everything to Google Sheets.

What this workflow does

- Accepts multiple invoice uploads via an n8n Form Trigger
- Processes files in batches with rate limiting
- Converts each file to base64 and sends it to the Mistral OCR API
- Extracts 9 standard fields using a GPT-4o-mini Information Extractor
- Validates data and assigns confidence scores (high/medium/low)
- Saves all results to Google Sheets with status tracking

Fields extracted

Invoice Number, Date, Vendor Name, Tax ID, Subtotal, Tax Rate, Tax Amount, Total Amount, Currency

Use cases

- Accountants processing client invoices in bulk
- Small businesses digitizing paper receipts
- Bookkeepers automating repetitive data entry
- Finance teams building searchable invoice databases

Setup requirements

- Mistral API Key (console.mistral.ai) — HTTP Header Auth credential
- OpenAI API Key (platform.openai.com)
- Google Sheets OAuth connection
- Google Sheet with 15 columns (template in workflow notes)
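The listing does not include the validation logic itself; as a rough sketch (field names and thresholds are assumptions, not taken from the workflow), the high/medium/low confidence scoring could look like this in an n8n Code node:

```javascript
// Illustrative sketch only — the real scoring lives inside the workflow's
// validation node. Field names and thresholds here are assumptions.
const REQUIRED = ["invoiceNumber", "date", "vendorName", "totalAmount", "currency"];
const OPTIONAL = ["taxId", "subtotal", "taxRate", "taxAmount"];

function scoreConfidence(extracted) {
  // Count how many fields in a list are present and non-empty.
  const filled = (keys) =>
    keys.filter((k) => extracted[k] != null && extracted[k] !== "").length;
  const requiredRatio = filled(REQUIRED) / REQUIRED.length;
  const optionalRatio = filled(OPTIONAL) / OPTIONAL.length;
  if (requiredRatio === 1 && optionalRatio >= 0.75) return "high";
  if (requiredRatio >= 0.8) return "medium";
  return "low";
}
```

A "low" result could then route the row to a manual-review status column in the sheet.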
by Robert Breen
Pull a Dun & Bradstreet Business Information Report (PDF) by DUNS, convert the response into a binary PDF file, extract readable text, and use OpenAI to return a clean, flat JSON with only the key fields you care about (e.g., report date, Paydex, viability score, credit limit). Includes Sticky Notes for quick setup help and guidance.

✅ What this template does

- **Requests** a D&B report (PDF) for a specific **DUNS** via HTTP
- **Converts** the API response into a **binary PDF file**
- **Extracts** the text from the PDF for analysis
- Uses **OpenAI** with a **Structured Output Parser** to return a flat JSON
- Designed to be extended to Sheets, databases, or CRMs

🧩 How it works (node-by-node)

1. **Manual Trigger** — Runs the workflow on demand ("When clicking 'Execute workflow'").
2. **D&B Report (HTTP Request)** — Calls the D&B Reports API for a Business Information Report (PDF).
3. **Convert to PDF File (Convert to File)** — Turns the D&B response payload into a binary PDF.
4. **Extract Binary (Extract from File)** — Extracts text content from the PDF.
5. **OpenAI Chat Model** — Provides the language model context for the analyzer.
6. **Analyze PDF (AI Agent)** — Reads the extracted text and applies strict rules for a flat JSON output.
7. **Structured Output (AI Structured Output Parser)** — Enforces a schema and validates/auto-fixes the JSON shape.
8. **(Optional) Get Bearer Token (HTTP Request)** — Template guidance for OAuth token retrieval (shown as disabled; included for reference if you prefer Bearer flows).

🛠️ Setup instructions (from the JSON)

1) D&B Report (HTTP Request)

- **Auth:** Header Auth (use an n8n **HTTP Header Auth** credential)
- **URL:** https://plus.dnb.com/v1/reports/duns/804735132?productId=birstd&inLanguage=en-US&reportFormat=PDF&orderReason=6332&tradeUp=hq&customerReference=customer%20reference%20text
- **Headers:** Accept: application/json
- **Credential Example:** D&B (HTTP Header Auth)

> Put your Authorization: Bearer <token> header inside this credential, not directly in the node.
2) Convert to PDF File (Convert to File)

- **Operation:** toBinary
- **Source Property:** contents[0].contentObject

> This takes the PDF content from the D&B API response and converts it to a binary file for downstream nodes.

3) Extract Binary (Extract from File)

- **Operation:** pdf

> Produces a text field with the extracted PDF content, ready for AI analysis.

4) OpenAI Model(s)

- **OpenAI Chat Model**
- **Model:** gpt-4o (as configured in the JSON)
- **Credential:** Your stored **OpenAI API** credential (do **not** hardcode keys)
- **Wiring:**
  - Connect OpenAI Chat Model as ai_languageModel to Analyze PDF
  - Connect another OpenAI Chat Model (also gpt-4o) as ai_languageModel to Structured Output

5) Analyze PDF (AI Agent)

- **Prompt Type:** define
- **Text:** ={{ $json.text }}
- **System Message (rules):** You are a precision extractor. Read the provided business report PDF and return only a single flat JSON object with the fields below. No arrays/lists. No prose. If a value is missing, output null. Dates: YYYY-MM-DD. Numbers: plain numerics (no commas or $). Prefer most recent or highest-level overall values if multiple are shown. Never include arrays, nested structures, or text outside of the JSON object.

6) Structured Output (AI Structured Output Parser)

- **JSON Schema Example:**

  {
    "report_date": "",
    "company_name": "",
    "duns": "",
    "dnb_rating_overall": "",
    "composite_credit_appraisal": "",
    "viability_score": "",
    "portfolio_comparison_score": "",
    "paydex_3mo": "",
    "paydex_24mo": "",
    "credit_limit_conservative": ""
  }

- **Auto Fix:** enabled
- **Wiring:** Connect as ai_outputParser to **Analyze PDF**

7) (Optional) Get Bearer Token (HTTP Request) — Disabled example

If you prefer fetching tokens dynamically:

- **Auth:** Basic Auth (D&B username/password)
- **Method:** POST
- **URL:** https://plus.dnb.com/v3/token
- **Body Parameters:** grant_type = client_credentials
- **Headers:** Accept: application/json
- **Downstream usage:** Set header Authorization: Bearer {{$json["access_token"]}} in subsequent calls.
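The agent's rules (single flat object, no arrays or nesting, null for missing values) are easy to double-check in code. A minimal sketch, not part of the template itself, of a guard you could run on the parser output:

```javascript
// Hedged sketch: verifies the "flat JSON only" contract from the system
// message above. In the template this enforcement is done by the
// Structured Output Parser node, not by hand-written code.
function isFlat(obj) {
  if (obj === null || typeof obj !== "object" || Array.isArray(obj)) return false;
  // Every value must be a primitive or null — no arrays, no nested objects.
  return Object.values(obj).every(
    (v) => v === null || ["string", "number", "boolean"].includes(typeof v)
  );
}
```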
> In this template, the D&B Report node uses a Header Auth credential instead. Use one strategy consistently (credentials are recommended for security).

🧠 Output schema (flat JSON)

The analyzer + parser return a single flat object like:

  {
    "report_date": "2024-12-31",
    "company_name": "Example Corp",
    "duns": "123456789",
    "dnb_rating_overall": "5A2",
    "composite_credit_appraisal": "Fair",
    "viability_score": "3",
    "portfolio_comparison_score": "2",
    "paydex_3mo": "80",
    "paydex_24mo": "78",
    "credit_limit_conservative": "25000"
  }

🧪 Test flow

1. Click Execute workflow (Manual Trigger).
2. Confirm D&B Report returns the PDF response.
3. Check Convert to PDF File for a binary file.
4. Verify Extract from File produces a text field.
5. Inspect Analyze PDF → Structured Output for valid JSON.

🔐 Security notes

- Do not hardcode tokens in nodes; use Credentials (HTTP Header Auth or Basic Auth).
- Restrict who can execute the workflow if it's accessible from outside your network.
- Avoid storing sensitive payloads in logs; mask tokens/headers.

🧩 Customize

- Map the structured JSON to Google Sheets, Postgres/BigQuery, or a CRM.
- Extend the schema with additional fields (e.g., number of employees, HQ address) — keep it flat.
- Add validation (Set/IF nodes) to ensure required fields exist before writing downstream.

🩹 Troubleshooting

- **Missing PDF text?** Ensure the **Convert to File** source property is contents[0].contentObject.
- **Unauthorized from D&B?** Refresh/verify the token; confirm the Header Auth credential contains Authorization: Bearer <token>.
- **Parser errors?** Keep the agent output short and flat; the Structured Output node will auto-fix minor issues.
- **Different DUNS/product?** Update the D&B Report URL query params (duns, productId, etc.).
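The "add validation before writing downstream" suggestion can be expressed as a tiny required-field gate, the same check you would build with Set/IF nodes. A sketch, with the required list chosen here for illustration:

```javascript
// Illustrative only: mirrors an IF-node gate on required fields before
// writing to Sheets/CRM. The choice of required fields is an assumption.
function hasRequiredFields(report, required = ["report_date", "company_name", "duns"]) {
  return required.every(
    (k) => report[k] !== null && report[k] !== undefined && report[k] !== ""
  );
}
```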
🗒️ Sticky Notes (included)

- **Overview:** "Fetch D&B Company Report (PDF) → Convert → Extract → Summarize to Structured JSON (n8n)"
- Setup snippets for Data Blocks (optional) and the Auth flow

📬 Contact

Need help customizing this (e.g., routing the PDF to Drive, mapping JSON to your CRM, or expanding the schema)?

📧 robert@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
by sato rio
Generate market research reports from news and competitor sites to Notion and Slack

This workflow automates market research and competitive intelligence by collecting industry news and competitor website updates, analyzing them with AI, and publishing structured insights to Notion and Slack. It replaces manual monitoring and summarization with a repeatable, scalable workflow suitable for daily or weekly use.

Who’s it for

- **Marketing teams** who want to track industry trends and competitor messaging in one place
- **Product managers** looking for early signals to inform roadmap and prioritization decisions
- **Founders and analysts** who need automated market briefings without manual research

How it works

1. A scheduled trigger starts the workflow (daily by default).
2. Industry news is fetched via NewsAPI while competitor websites are scraped in parallel.
3. All collected content is consolidated and sent to OpenAI (GPT-4o) for analysis.
4. The AI generates a structured report including trends, SWOT insights, and recommended actions.
5. The full Markdown report is saved to a Notion database, and an executive summary is posted to Slack.
6. If any API call or scraping step fails, an error notification is sent to Slack.

How to set up

1. Add API credentials for OpenAI, NewsAPI, Notion, and Slack.
2. Configure keywords and competitor URLs in the Research Configuration node.
3. Select your Notion database and Slack channels in the relevant nodes.

Requirements

- OpenAI API key (GPT-4o access)
- NewsAPI account
- Notion and Slack accounts

How to customize the workflow

- Change the trigger to run weekly or on demand
- Modify the AI prompt to focus on pricing, features, or specific competitors
- Add additional sources such as RSS feeds or more competitor sites
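The consolidation step (step 3 above) can be sketched as a small Code-node function that flattens NewsAPI articles and scraped pages into one text block for the GPT-4o prompt. The input shapes are assumptions for illustration, not the workflow's actual field names:

```javascript
// Minimal sketch of "consolidate all collected content" before AI analysis.
// `newsArticles` and `competitorPages` shapes are assumed here.
function consolidate(newsArticles, competitorPages) {
  const news = newsArticles.map((a) => `NEWS: ${a.title} | ${a.description}`);
  const pages = competitorPages.map((p) => `COMPETITOR (${p.url}): ${p.excerpt}`);
  // One newline-separated block keeps the prompt simple and token-countable.
  return [...news, ...pages].join("\n");
}
```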
by Rohit Dabra
Odoo CRM MCP Server Workflow

📖 Overview

This workflow connects an AI Agent with Odoo CRM using the Model Context Protocol (MCP). It allows users to manage CRM data in Odoo through natural language chat commands. The assistant interprets the user’s request, selects the appropriate Odoo action, and executes it seamlessly.

🔹 Key Features

- **Contacts Management**: Create, update, delete, and retrieve contacts.
- **Opportunities Management**: Create, update, delete, and retrieve opportunities.
- **Notes Management**: Create, update, delete, and retrieve notes.
- **Conversational AI Agent**: Understands natural language and maps requests to Odoo actions.
- **Model Used**: OpenAI Chat Model.

This makes it easy for end-users to interact with Odoo CRM without needing technical commands—just plain language instructions.

▶️ Demo Video

Watch the full demo here: 👉 YouTube Demo Video

⚙️ Setup Guide

Follow these steps to set up and run the workflow:

1. Prerequisites

- An Odoo instance configured with CRM enabled.
- An n8n or automation platform account where MCP workflows are supported.
- An OpenAI API key with access to GPT models.
- MCP Server installed and running.

2. Import the Workflow

- Download the provided workflow JSON file.
- In your automation platform (n8n, Langflow, or another MCP-enabled tool), choose Import Workflow.
- Select the JSON file and confirm.

3. Configure MCP Server

- Go to the MCP Server Trigger node in the workflow.
- Configure it to connect with your Odoo instance: set the API endpoint and provide authentication credentials (API key).
- Test the connection to ensure the MCP server can reach Odoo.

4. Configure the OpenAI Model

- Select the OpenAI Chat Model node in the workflow.
- Enter your OpenAI API Key.
- Choose the model (e.g., gpt-5 or gpt-5-mini).

5. AI Agent Setup

- The AI Agent node links the Chat Model, Memory, and MCP Client.
- Ensure the MCP Client is mapped to the correct Odoo tools (Contacts, Opportunities, Notes).
The System Prompt defines assistant behavior—use the tailored system prompt provided earlier.

6. Activate and Test

- Turn the workflow ON (toggle Active).
- Open the chat and type:
  - "Create a contact named John Doe with email john@example.com."
  - "Show me all opportunities."
  - "Add a note to John Doe saying 'Follow-up scheduled for Friday'."
- Verify the results in your Odoo CRM.

✅ Next Steps

- Extend functionality with Tasks, Stages, Companies, and Communication Logs for a complete CRM experience.
- Add confirmation prompts for destructive actions (delete contact/opportunity/note).
- Customize the AI Agent’s system prompt for your organization’s workflows.
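To make the agent's job concrete: each test phrase above ends up routed to one of the MCP-exposed Odoo tools. The real routing is done by the LLM, not by rules, but a hypothetical rule-based approximation shows the mapping the agent must learn:

```javascript
// Purely illustrative — the AI Agent node does this routing with the LLM,
// not with keyword rules. Tool names here are assumptions.
function pickTool(request) {
  const r = request.toLowerCase();
  const entity = r.includes("opportunit") ? "opportunities"
    : r.includes("note") ? "notes"
    : "contacts";
  const action = r.startsWith("create") || r.startsWith("add") ? "create"
    : r.startsWith("update") ? "update"
    : r.startsWith("delete") ? "delete"
    : "get";
  return `${entity}.${action}`;
}
```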
by Cheng Siong Chin
Introduction

Automates AI-driven assignment grading with HTML and CSV output. Designed for educators evaluating submissions with consistent criteria and exportable results.

How It Works

A webhook receives papers; the workflow extracts text, prepares data, loads the answer script, grades submissions with AI, generates a results table, converts it to HTML/CSV, and returns the response.

Workflow Template

Webhook → Extract Text → Prepare Data → Load Answer Script → AI Grade (OpenAI + Output Parser) → Generate Results Table → Convert to HTML + CSV → Format Response → Respond to Webhook

Workflow Steps

1. Input & Preparation: Webhook receives the paper, extracts text, prepares data, loads the answer script.
2. AI Grading: OpenAI evaluates against the answer key; the Output Parser formats scores and feedback.
3. Output & Response: Generates the results table, converts to HTML/CSV, returns a multi-format response.

Setup Instructions

- Trigger & Processing: Configure the webhook URL, set text extraction parameters.
- AI Configuration: Add your OpenAI API key, customize grading prompts, define the Output Parser JSON schema.

Prerequisites

- OpenAI API key
- Webhook platform
- n8n instance

Use Cases

- University exam grading
- Corporate training assessments

Customization

- Modify rubrics and criteria
- Add PDF output
- Integrate an LMS (Canvas, Blackboard)

Benefits

- Consistent AI grading
- Multi-format exports
- Reduces grading time by 90%
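The "Convert to CSV" step can be sketched as a small function that turns the Output Parser's grade records into a CSV string. Column names are assumed for illustration; the template's actual schema is whatever you define in the Output Parser:

```javascript
// Sketch of the CSV conversion step. Record shape {student, score, feedback}
// is an assumption; adapt it to your Output Parser JSON schema.
function toCsv(results) {
  const header = "student,score,feedback";
  // RFC-4180-style escaping: wrap in quotes, double any embedded quotes.
  const escape = (v) => `"${String(v).replace(/"/g, '""')}"`;
  const rows = results.map((r) =>
    [r.student, r.score, r.feedback].map(escape).join(",")
  );
  return [header, ...rows].join("\n");
}
```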
by Milo Bravo
Event Sponsor Matching: Google Sheets, GPT-4o & Gmail Revenue Optimizer

Who is this for?

Event planners, conference organizers, non-profits, and partnership managers who manage sponsor spreadsheets and want AI-powered package recommendations to maximize revenue.

What problem is this workflow solving?

Sponsor matching is manual and suboptimal:

- Hours spent matching 50+ sponsors to Gold/Silver/Bronze packages
- Missed perfect fits (enterprise → Gold, startups → Silver)
- No personalized outreach or tracking
- Revenue leaks from mismatched proposals

This workflow auto-matches sponsors to optimal packages and emails owners instantly.

What this workflow does

- **Trigger**: Google Sheets update (Sponsors + Packages tabs)
- **AI Matching**: GPT-4o scores sponsors and picks the best 1-3 packages (budget/industry fit)
- **Gmail Outreach**: "AI recommends Gold Package for TechCorp ($5k revenue, enterprise perfect fit)"
- **Tracking Log**: Sheets append of matches + scores + status
- **Bonus**: Revenue projections by tier acceptance rates

Setup (5 minutes)

- **Google Sheets**: 2 tabs (Sponsors: Name/Industry/Budget/Goals + Packages: Tier/Price/Benefits)
- **AI**: OpenAI API key (GPT-4o recommended)
- **Email**: Gmail credentials (no hardcoded IDs — use env vars)
- **Configurable**: All via variables; scales to 1000s

Fully configurable, no code changes needed.
How to customize to your needs

- **Packages**: Add Platinum/Diamond tiers or custom benefits
- **Scoring**: Adjust GPT criteria (budget weight, industry focus)
- **Outreach**: Swap Gmail for Outreach/Reply.io sequences
- **Tracking**: HubSpot/Salesforce sync for closed deals
- **Triggers**: Schedule daily or use a webhook for real-time runs

ROI:

- **20% higher sponsor conversion** via perfect-fit recommendations
- **5x faster matching** (minutes vs hours)
- **Revenue optimization** (proven at enterprise events)
- **Zero manual spreadsheet work**

Need help customizing? Contact me for consulting and support: LinkedIn / Message

Keywords: sponsor matching, event sponsorship, conference revenue optimization, sponsor scoring, package matching, sponsorship outreach.
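The template delegates scoring to GPT-4o, but the "best 1-3 packages by budget/industry fit" idea can be sketched deterministically. Field names, weights, and the `industries` list on each package are assumptions for illustration:

```javascript
// Hedged sketch of budget/industry fit scoring — the workflow itself
// asks GPT-4o for this judgment rather than using fixed weights.
function rankPackages(sponsor, packages) {
  const scored = packages.map((p) => {
    let score = 0;
    if (sponsor.budget >= p.price) score += 2;                // tier is affordable
    if (p.industries?.includes(sponsor.industry)) score += 1; // industry fit
    return { tier: p.tier, score };
  });
  // Highest score first; keep at most the top 3 recommendations.
  return scored.sort((a, b) => b.score - a.score).slice(0, 3);
}
```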
by Sridevi Edupuganti
Telegram Voice → AI Summary & Sentiment Analysis via Gmail

This n8n template demonstrates how to capture Telegram voice messages, transcribe them into text using AssemblyAI, analyze the transcript with AI for summary and sentiment insights, and finally deliver a structured email report via Gmail.

Use cases

- Automating meeting or lecture voice note transcriptions.
- Gathering student feedback or training session insights from voice messages.
- Quickly summarizing Telegram-delivered audio inputs into structured reports.
- Reducing manual effort in capturing sentiment and key action items from conversations.

How it works

1. A voice message is sent to a connected Telegram Bot.
2. The workflow fetches the file and uploads it to AssemblyAI.
3. AssemblyAI generates a transcript from the audio.
4. The transcript is analyzed by OpenAI to extract:
   - Executive summary (120–180 words)
   - Sentiment label and score
   - Key points
   - Action items (if any)
   - Notable quotes
   - Topics
5. The formatted analysis is sent as an email report using Gmail.
6. The workflow ends with a clean summary email containing actionable insights.

How to use

1. Import this workflow into your n8n instance.
2. Set up and connect the required credentials:
   - Telegram Bot API token
   - AssemblyAI API key
   - OpenAI API key
   - Gmail OAuth2 account
3. Replace placeholders (e.g., <<YOUR_EMAIL ID>> and <<YOUR_ASSEMBLYAI_API_KEY>>) with your actual values.
4. Start the workflow. Whenever a voice message is received by the Telegram Bot, the workflow will process it end-to-end and deliver a polished email report.

Requirements

- Telegram Bot account (API token)
- AssemblyAI account with API key
- OpenAI account with API key
- Gmail OAuth2 credentials configured in n8n
- Active n8n instance

Customising this workflow

You can customize the email formatting, sentiment thresholds, or extend the workflow to save transcripts into Google Drive, Airtable, or any other connected apps.
Additionally, you can trigger the same workflow from multiple input sources (e.g., local audio files, Google Drive links, or Telegram).
by Tristan V
Who is this for?

Businesses and developers who want to automate customer support or engagement on Facebook Messenger using AI-powered responses.

What does it do?

Creates an intelligent Facebook Messenger chatbot that:

- Responds to messages using OpenAI (gpt-4o-mini)
- Batches rapid-fire messages into a single AI request
- Maintains conversation history (50 messages per user)
- Shows professional UX feedback (seen indicators, typing bubbles)

How it works

1. Webhook Verification - Handles Facebook's GET verification request
2. Message Reception - Receives incoming messages via POST webhook
3. Message Batching - Waits 3 seconds to collect multiple quick messages
4. AI Processing - Sends the combined message to OpenAI with conversation context
5. Response Delivery - Formats and sends the AI response back to Messenger

Setup

1. Configure a Facebook Graph API credential with your Page Access Token
2. Configure an OpenAI API credential with your API key
3. Set your verify token in the "Is Token Valid?" node
4. Register the webhook URL in the Facebook Developer Console

Key Features

- Message Batching: Combines "Hey" + "Can you help" + "with my order?" into one request
- Conversation Memory: Remembers context from previous messages
- Echo Filtering: Prevents responding to your own messages
- Response Formatting: Cleans markdown for Messenger's 2000-char limit
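The Response Formatting step can be sketched as a function that strips common Markdown markers (Messenger renders plain text) and enforces the 2000-character cap. This is an illustrative approximation of that step, not the workflow's exact code:

```javascript
// Sketch of "Cleans markdown for Messenger's 2000-char limit".
function formatForMessenger(text) {
  const plain = text
    .replace(/\*\*(.*?)\*\*/g, "$1") // strip bold markers
    .replace(/\*(.*?)\*/g, "$1")     // strip italic markers
    .replace(/^#+\s*/gm, "");        // strip heading markers
  // Truncate with an ellipsis so the result never exceeds 2000 chars.
  return plain.length > 2000 ? plain.slice(0, 1997) + "..." : plain;
}
```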
by Hussam Muhammad Kazim
How it works

This Telegram automation handles both voice and text messages sent to the Telegram bot: if the input is a voice message, the bot replies with voice; if the input is text, it replies with text.

Use Cases

- Customer Support
- Personal Chatbot

Prerequisites

- OpenAI API Key
- Gemini API Key
- Telegram Bot built with BotFather
- Telegram Bot's API Key

Target Audience

AI automation learners who want to learn how to build and set up a basic Telegram bot using n8n.

How to set up

1. Create a Telegram bot using BotFather; BotFather will give you an API key.
2. Copy the API key and set it up in a Telegram node inside n8n.
3. Get a free Gemini API key from https://aistudio.google.com/
4. Set up the Gemini API key in the Transcribe recording node.
5. Get an OpenAI API key from https://platform.openai.com/docs/overview and make sure to top up your credits.
6. Copy the API key from the OpenAI platform and set it up in any OpenAI Chat Model; n8n will apply it to all other OpenAI nodes automatically.

That's it! Now you can activate the workflow and test it by sending a simple message to your Telegram bot.
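The routing rule above (voice in → voice out, text in → text out) is a one-line check on the incoming Telegram message object, where voice messages carry a `voice` field:

```javascript
// Minimal sketch of the modality routing described above: reply in the
// same form as the incoming Telegram message.
function replyModality(message) {
  return message.voice ? "voice" : "text";
}
```

In the workflow this decision would typically live in an IF or Switch node branching on the same field.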
by Satva Solutions
🟢 Manual Trigger: The workflow starts manually to initiate the reconciliation process on demand.

📄 Fetch Invoices & Bank Statements: Retrieves invoice data and bank statement data from Google Sheets for comparison.

🔀 Merge Data: Combines both datasets into a single structured dataset for processing.

🧩 Format Payload for AI: A Function node prepares and structures the merged data into a clean JSON payload for AI analysis.

🤖 AI Reconciliation: The AI Agent analyzes the invoice and bank statement data to identify matches, discrepancies, and reconciled entries.

🧮 Parse AI Output: Parses the AI response into a structured format suitable for writing back to Google Sheets.

📊 Update Sheets: Adds the reconciled data and reconciliation results to the target Google Sheet for recordkeeping.

🧾 Prerequisites

✅ OpenAI API Credentials: Required for the AI Reconciliation node to process and match transactions. Add your OpenAI API key in n8n → Credentials → OpenAI.

✅ Google Sheets Credentials: Needed to read invoice and bank statement data and to write reconciled results. Add credentials in n8n → Credentials → Google Sheets.

✅ Google Sheets Setup: The connected spreadsheet must contain the following tabs:

- Invoices – for invoice data
- Bank_Statement – for bank transaction data
- Reconciled_Data – for storing the AI-processed reconciliation output

✅ Tab Structure & Required Headers

- Invoices sheet columns: Invoice_ID, Invoice_Date, Customer_Name, Amount, Status
- Bank_Statement sheet columns: Transaction_ID, Transaction_Date, Description, Debit/Credit, Amount
- Reconciled_Data sheet columns: Invoice_ID, Transaction_ID, Matched_Status, Remarks, Confidence_Score

⚙️ n8n Environment Setup

Ensure all nodes are connected correctly and the workflow has permission to access the required sheets. Test each fetch and write operation before running the full workflow.
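The "Format Payload for AI" Function node can be sketched as follows, using the column names from the tab structure above; the exact payload shape the template sends is an assumption:

```javascript
// Illustrative shape of the AI payload built from the two sheet tabs.
// Column names come from the required headers listed above.
function buildPayload(invoices, bankRows) {
  return {
    invoices: invoices.map((i) => ({
      id: i.Invoice_ID,
      date: i.Invoice_Date,
      amount: i.Amount,
    })),
    transactions: bankRows.map((t) => ({
      id: t.Transaction_ID,
      date: t.Transaction_Date,
      amount: t.Amount,
    })),
  };
}
```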
by Avkash Kakdiya
How it works

This workflow captures idea submissions from a webhook and enriches them using AI. It extracts key fields like Title, Tags, Submitted By, and Created date in IST format. The cleaned data is stored in a Notion database for centralized tracking. Finally, a confirmation message is posted in Slack to notify the team.

Step-by-step

1. Capture and process submission

- **Webhook** – Receives idea submissions with text and user ID.
- **AI Agent & OpenAI Model** – Enrich and structure the input into Title, Tags, Submitted By, and Created fields.
- **Code** – Extracts clean data, formats tags, and prepares the entry for Notion.

2. Store in Notion

- **Add to Notion** – Creates a new database entry with mapped fields: Title, Submitted By, Tags, Created.

3. Notify in Slack

- **Send Confirmation (Slack)** – Posts a confirmation message with the submitted idea title.

Why use this?

- Centralizes idea collection directly into Notion for better organization.
- Eliminates manual formatting with AI-powered data structuring.
- Ensures consistency in tags, submitter info, and timestamps.
- Provides instant team-wide visibility via Slack notifications.
- Saves time while keeping idea management streamlined and transparent.
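The Code step (clean data, format tags, IST timestamp) can be sketched like this; function and field names are illustrative, not the workflow's actual code:

```javascript
// Hedged sketch of the Code node: normalize tags and produce a Created
// timestamp in IST (UTC+5:30). Field names are assumptions.
function prepareEntry(title, rawTags, submittedBy, now = new Date()) {
  // Shift the UTC instant by +5h30m, then label the offset explicitly.
  const ist = new Date(now.getTime() + 5.5 * 60 * 60 * 1000);
  return {
    Title: title.trim(),
    Tags: rawTags.map((t) => t.trim().toLowerCase()).filter(Boolean),
    "Submitted By": submittedBy,
    Created: ist.toISOString().replace("Z", "+05:30"),
  };
}
```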
by Bastian Diaz
🎯 Description

Automatically generates, designs, stores, and logs complete Instagram carousel posts. It transforms a simple text prompt into a full post with copy, visuals, rendered images, Google Drive storage, and a record in Google Sheets.

⚙️ Use case / What it does

This workflow enables creators, educators, or community managers to instantly produce polished, on-brand carousel assets for social media. It integrates OpenAI GPT-4.1, Pixabay, Templated.io, Google Drive, and Google Sheets into one continuous content-production chain.

💡 How it works

1️⃣ Form Trigger – Collects the user prompt via a simple web form.
2️⃣ OpenAI GPT-4.1 – Generates structured carousel JSON: titles, subtitles, topic, description, and visual keywords.
3️⃣ Code (Format content) – Parses the JSON output for downstream use.
4️⃣ Google Drive (Create Folder) – Creates a subfolder for the new carousel inside “RRSS”.
5️⃣ HTTP Request (Pixabay) – Searches for a relevant image using GPT’s visual suggestion.
6️⃣ Code (Get first result) – Extracts the top Pixabay result and image URL.
7️⃣ Templated.io – Fills the design template layers (titles/subtitles/topic/image).
8️⃣ HTTP Request (Download renders) – Downloads the rendered PNGs from Templated.io.
9️⃣ Google Drive (Upload) – Uploads the rendered images into the created folder.
🔟 Google Sheets (Save in DB) – Logs metadata (title, topic, folder link, description, timestamp, status).
🔗 Connectors used

- OpenAI GPT-4.1 (via n8n LangChain node)
- Templated.io API (design rendering)
- Pixabay API (stock image search)
- Google Drive (storage + folder management)
- Google Sheets (database / logging)
- Form Trigger (input collection)

🧱 Input / Output

Input: user-submitted “Prompt” (text) via form

Output:

- Generated carousel images stored in Google Drive
- Spreadsheet row in Google Sheets containing title, topic, description, Drive URL, status

⚠️ Requirements / Setup

Valid credentials for:

- OpenAI API (GPT-4.1 access)
- Templated.io API key
- Pixabay API key
- Google Drive + Google Sheets OAuth connections

Also required:

- Existing Google Drive folder ID for RRSS storage
- Spreadsheet with matching column headers (Created At, Title, Topic, Folder URL, Description, Status)
- Published form URL for user prompts

🌍 Example applications / extensions

- Educational themes (mental health, fitness, sustainability).
- Extend to auto-publish to Instagram Business via the Meta API.
- Add Notion logging or automated email notifications.
- Integrate scheduling (Cron node) to batch-generate weekly carousels.
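Step 3️⃣ of the flow (the "Format content" Code node) has one common pitfall: models sometimes wrap their JSON in prose or a Markdown fence. A hedged sketch, with assumed field names, of a tolerant parse:

```javascript
// Illustrative only — not the template's actual Code node. Extracts the
// outermost JSON object from the model output before parsing it.
function parseCarousel(raw) {
  const start = raw.indexOf("{");
  const end = raw.lastIndexOf("}");
  const data = JSON.parse(raw.slice(start, end + 1));
  // Field names (title/topic/keywords) are assumptions for this sketch.
  return { title: data.title, topic: data.topic, keywords: data.keywords ?? [] };
}
```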