by Robert Breen
Pull a Dun & Bradstreet Business Information Report (PDF) by DUNS, convert the response into a binary PDF file, extract readable text, and use OpenAI to return a clean, flat JSON with only the key fields you care about (e.g., report date, Paydex, viability score, credit limit). Includes Sticky Notes for quick setup help and guidance.

## What this template does

- **Requests** a D&B report (PDF) for a specific **DUNS** via HTTP
- **Converts** the API response into a **binary PDF file**
- **Extracts** the text from the PDF for analysis
- Uses OpenAI with a Structured Output Parser to return a flat JSON
- Designed to be extended to Sheets, databases, or CRMs

## How it works (node-by-node)

1. **Manual Trigger** - Runs the workflow on demand ("When clicking 'Execute workflow'").
2. **D&B Report (HTTP Request)** - Calls the D&B Reports API for a Business Information Report (PDF).
3. **Convert to PDF File (Convert to File)** - Turns the D&B response payload into a binary PDF.
4. **Extract Binary (Extract from File)** - Extracts text content from the PDF.
5. **OpenAI Chat Model** - Provides the language model context for the analyzer.
6. **Analyze PDF (AI Agent)** - Reads the extracted text and applies strict rules for a flat JSON output.
7. **Structured Output (AI Structured Output Parser)** - Enforces a schema and validates/auto-fixes the JSON shape.
8. **(Optional) Get Bearer Token (HTTP Request)** - Template guidance for OAuth token retrieval (shown as disabled; included for reference if you prefer Bearer flows).

## Setup instructions (from the JSON)

### 1) D&B Report (HTTP Request)

- **Auth:** Header Auth (use an n8n **HTTP Header Auth** credential)
- **URL:** https://plus.dnb.com/v1/reports/duns/804735132?productId=birstd&inLanguage=en-US&reportFormat=PDF&orderReason=6332&tradeUp=hq&customerReference=customer%20reference%20text
- **Headers:** Accept: application/json
- **Credential Example:** D&B (HTTP Header Auth)

> Put your Authorization: Bearer <token> header inside this credential, not directly in the node.

### 2) Convert to PDF File (Convert to File)

- **Operation:** toBinary
- **Source Property:** contents[0].contentObject

> This takes the PDF content from the D&B API response and converts it to a binary file for downstream nodes.

### 3) Extract Binary (Extract from File)

- **Operation:** pdf

> Produces a text field with the extracted PDF content, ready for AI analysis.

### 4) OpenAI Model(s)

- **OpenAI Chat Model**
  - **Model:** gpt-4o (as configured in the JSON)
  - **Credential:** Your stored **OpenAI API** credential (do **not** hardcode keys)
- **Wiring:**
  - Connect OpenAI Chat Model as ai_languageModel to Analyze PDF
  - Connect another OpenAI Chat Model (also gpt-4o) as ai_languageModel to Structured Output

### 5) Analyze PDF (AI Agent)

- **Prompt Type:** define
- **Text:** `={{ $json.text }}`
- **System Message (rules):** You are a precision extractor. Read the provided business report PDF and return only a single flat JSON object with the fields below. No arrays/lists. No prose. If a value is missing, output null. Dates: YYYY-MM-DD. Numbers: plain numerics (no commas or $). Prefer most recent or highest-level overall values if multiple are shown. Never include arrays, nested structures, or text outside of the JSON object.
### 6) Structured Output (AI Structured Output Parser)

- **JSON Schema Example:**

```json
{
  "report_date": "",
  "company_name": "",
  "duns": "",
  "dnb_rating_overall": "",
  "composite_credit_appraisal": "",
  "viability_score": "",
  "portfolio_comparison_score": "",
  "paydex_3mo": "",
  "paydex_24mo": "",
  "credit_limit_conservative": ""
}
```

- **Auto Fix:** enabled
- **Wiring:** Connect as ai_outputParser to **Analyze PDF**

### 7) (Optional) Get Bearer Token (HTTP Request) - Disabled example

If you prefer fetching tokens dynamically (a standalone sketch appears at the end of this template):

- **Auth:** Basic Auth (D&B username/password)
- **Method:** POST
- **URL:** https://plus.dnb.com/v3/token
- **Body Parameters:** grant_type = client_credentials
- **Headers:** Accept: application/json
- **Downstream usage:** Set header Authorization: Bearer {{$json["access_token"]}} in subsequent calls.

> In this template, the D&B Report node uses Header Auth credential instead. Use one strategy consistently (credentials are recommended for security).

## Output schema (flat JSON)

The analyzer + parser return a single flat object like:

```json
{
  "report_date": "2024-12-31",
  "company_name": "Example Corp",
  "duns": "123456789",
  "dnb_rating_overall": "5A2",
  "composite_credit_appraisal": "Fair",
  "viability_score": "3",
  "portfolio_comparison_score": "2",
  "paydex_3mo": "80",
  "paydex_24mo": "78",
  "credit_limit_conservative": "25000"
}
```

## Test flow

1. Click Execute workflow (Manual Trigger).
2. Confirm D&B Report returns the PDF response.
3. Check Convert to PDF File for a binary file.
4. Verify Extract from File produces a text field.
5. Inspect Analyze PDF → Structured Output for valid JSON.

## Security notes

- Do not hardcode tokens in nodes; use Credentials (HTTP Header Auth or Basic Auth).
- Restrict who can execute the workflow if it's accessible from outside your network.
- Avoid storing sensitive payloads in logs; mask tokens/headers.

## Customize

- Map the structured JSON to Google Sheets, Postgres/BigQuery, or a CRM.
- Extend the schema with additional fields (e.g., number of employees, HQ address); keep it flat.
- Add validation (Set/IF nodes) to ensure required fields exist before writing downstream.

## Troubleshooting

- **Missing PDF text?** Ensure **Convert to File** source property is contents[0].contentObject.
- **Unauthorized from D&B?** Refresh/verify token; confirm Header Auth credential contains Authorization: Bearer <token>.
- **Parser errors?** Keep the agent output short and flat; the Structured Output node will auto-fix minor issues.
- **Different DUNS/product?** Update the D&B Report URL query params (duns, productId, etc.).

## Sticky Notes (included)

- **Overview:** "Fetch D&B Company Report (PDF) → Convert → Extract → Summarize to Structured JSON (n8n)"
- Setup snippets for Data Blocks (optional) and Auth flow

## Contact

Need help customizing this (e.g., routing the PDF to Drive, mapping JSON to your CRM, or expanding the schema)?

- Email: robert@ynteractive.com
- LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
- Website: https://ynteractive.com
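For reference, here is a minimal standalone sketch (Node 18+, outside n8n) of the optional Bearer-token flow from setup step 7. The token endpoint, grant type, header usage, and the contents[0].contentObject path come from the template itself; the request body encoding and the access_token field name are assumptions to verify against the D&B Direct+ documentation, and the key/secret are placeholders.

```javascript
// Standalone sketch of the optional Bearer-token flow (step 7), for reference only.
// Consumer key/secret are placeholders; the token body encoding is an assumption
// to verify against D&B's documentation.
const key = process.env.DNB_KEY;
const secret = process.env.DNB_SECRET;

async function getToken() {
  const res = await fetch('https://plus.dnb.com/v3/token', {
    method: 'POST',
    headers: {
      Accept: 'application/json',
      Authorization: 'Basic ' + Buffer.from(`${key}:${secret}`).toString('base64'),
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ grant_type: 'client_credentials' }),
  });
  const data = await res.json();
  return data.access_token; // used downstream as "Authorization: Bearer <token>"
}

async function getReport(duns) {
  const token = await getToken();
  const url = `https://plus.dnb.com/v1/reports/duns/${duns}` +
    '?productId=birstd&inLanguage=en-US&reportFormat=PDF&orderReason=6332&tradeUp=hq';
  const res = await fetch(url, {
    headers: { Accept: 'application/json', Authorization: `Bearer ${token}` },
  });
  // In the workflow, the PDF content sits under contents[0].contentObject.
  return res.json();
}

getReport('804735132').then(report => console.log(Object.keys(report)));
```

Inside n8n, the same flow is simply two HTTP Request nodes; keeping the header in a credential, as the template recommends, avoids exposing the token in node parameters.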
by Cheng Siong Chin
## Introduction

Automates AI-driven assignment grading with HTML and CSV output. Designed for educators evaluating submissions with consistent criteria and exportable results.

## How It Works

Webhook receives papers, extracts text, prepares data, loads answers, AI grades submissions, generates results table, converts to HTML/CSV, returns response.

## Workflow Template

Webhook → Extract Text → Prepare Data → Load Answer Script → AI Grade (OpenAI + Output Parser) → Generate Results Table → Convert to HTML + CSV → Format Response → Respond to Webhook

## Workflow Steps

- **Input & Preparation:** Webhook receives paper, extracts text, prepares data, loads answer script.
- **AI Grading:** OpenAI evaluates against answer key, Output Parser formats scores and feedback.
- **Output & Response:** Generates results table, converts to HTML/CSV, returns multi-format response (see the sketch after this template).

## Setup Instructions

- **Trigger & Processing:** Configure webhook URL, set text extraction parameters.
- **AI Configuration:** Add OpenAI API key, customize grading prompts, define Output Parser JSON schema.

## Prerequisites

- OpenAI API key
- Webhook platform
- n8n instance

## Use Cases

- University exam grading
- Corporate training assessments

## Customization

- Modify rubrics and criteria
- Add PDF output
- Integrate LMS (Canvas, Blackboard)

## Benefits

- Consistent AI grading
- Multi-format exports
- Reduces grading time by 90%
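The results-table step is straightforward to prototype in a Code node. Below is a minimal sketch (not the template's actual node) that turns parsed grading results into CSV text and an HTML table; the field names student, score, and feedback are assumptions to align with whatever your Output Parser schema defines.

```javascript
// Illustrative n8n Code node: build CSV and HTML from parsed grading results.
// Field names (student, score, feedback) are assumptions; match your parser schema.
const results = $input.all().map(item => item.json);
const headers = ['student', 'score', 'feedback'];

// CSV: quote every cell and escape embedded quotes.
const escapeCsv = v => `"${String(v ?? '').replaceAll('"', '""')}"`;
const csv = [
  headers.join(','),
  ...results.map(r => headers.map(h => escapeCsv(r[h])).join(',')),
].join('\n');

// HTML: simple table, with basic escaping for safety.
const escapeHtml = v => String(v ?? '')
  .replaceAll('&', '&amp;').replaceAll('<', '&lt;').replaceAll('>', '&gt;');
const html =
  `<table><tr>${headers.map(h => `<th>${h}</th>`).join('')}</tr>` +
  results.map(r => `<tr>${headers.map(h => `<td>${escapeHtml(r[h])}</td>`).join('')}</tr>`).join('') +
  '</table>';

// Both formats travel together so the Format Response step can return either.
return [{ json: { csv, html } }];
```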
by Rohit Dabra
## Odoo CRM MCP Server Workflow

### Overview

This workflow connects an AI Agent with Odoo CRM using the Model Context Protocol (MCP). It allows users to manage CRM data in Odoo through natural language chat commands. The assistant interprets the user's request, selects the appropriate Odoo action, and executes it seamlessly.

### Key Features

- **Contacts Management**: Create, update, delete, and retrieve contacts.
- **Opportunities Management**: Create, update, delete, and retrieve opportunities.
- **Notes Management**: Create, update, delete, and retrieve notes.
- **Conversational AI Agent**: Understands natural language and maps requests to Odoo actions.
- **Model Used**: OpenAI Chat Model.

This makes it easy for end-users to interact with Odoo CRM without needing technical commands; just plain language instructions.

### Demo Video

Watch the full demo here: YouTube Demo Video

### Setup Guide

Follow these steps to set up and run the workflow:

1. **Prerequisites**
   - An Odoo instance configured with CRM enabled.
   - An n8n or automation platform account where MCP workflows are supported.
   - An OpenAI API key with access to GPT models.
   - MCP Server installed and running.
2. **Import the Workflow**
   - Download the provided workflow JSON file.
   - In your automation platform (n8n, Langflow, or other MCP-enabled tool), choose Import Workflow.
   - Select the JSON file and confirm.
3. **Configure MCP Server**
   - Go to your MCP Server Trigger node in the workflow.
   - Configure it to connect with your Odoo instance.
   - Set API endpoint.
   - Provide authentication credentials (API key).
   - Test the connection to ensure the MCP server can reach Odoo.
4. **Configure the OpenAI Model**
   - Select the OpenAI Chat Model node in the workflow.
   - Enter your OpenAI API Key.
   - Choose the model (e.g., gpt-5 or gpt-5-mini).
5. **AI Agent Setup**
   - The AI Agent node links the Chat Model, Memory, and MCP Client.
   - Ensure the MCP Client is mapped to the correct Odoo tools (Contacts, Opportunities, Notes).
   - The System Prompt defines assistant behavior; use the tailored system prompt provided earlier.
6. **Activate and Test**
   - Turn the workflow ON (toggle Active).
   - Open chat and type:
     - "Create a contact named John Doe with email john@example.com."
     - "Show me all opportunities."
     - "Add a note to John Doe saying 'Follow-up scheduled for Friday'."
   - Verify the results in your Odoo CRM.

### Next Steps

- Extend functionality with Tasks, Stages, Companies, and Communication Logs for a complete CRM experience.
- Add confirmation prompts for destructive actions (delete contact/opportunity/note).
- Customize the AI Agent's system prompt for your organization's workflows.
by Satva Solutions
## How It Works

1. **Manual Trigger** - Workflow starts manually to initiate the reconciliation process on demand.
2. **Fetch Invoices & Bank Statements** - Retrieves invoice data and bank statement data from Google Sheets for comparison.
3. **Merge Data** - Combines both datasets into a single structured dataset for processing.
4. **Format Payload for AI** - Function node prepares and structures the merged data into a clean JSON payload for AI analysis (see the sketch at the end of this template).
5. **AI Reconciliation** - AI Agent analyzes the invoice and bank statement data to identify matches, discrepancies, and reconciled entries.
6. **Parse AI Output** - Parses the AI response into a structured format suitable for adding back to Google Sheets.
7. **Update Sheets** - Adds the reconciled data and reconciliation results into the target Google Sheet for recordkeeping.

## Prerequisites

- **OpenAI API Credentials** - Required for the AI Reconciliation node to process and match transactions. Add your OpenAI API key in n8n → Credentials → OpenAI.
- **Google Sheets Credentials** - Needed to read invoice and bank statement data and to write reconciled results. Add credentials in n8n → Credentials → Google Sheets.
- **Google Sheets Setup** - The connected spreadsheet must contain the following tabs:
  - Invoices - for invoice data
  - Bank_Statement - for bank transaction data
  - Reconciled_Data - for storing the AI-processed reconciliation output

## Tab Structure & Required Headers

| Sheet | Columns |
| --- | --- |
| Invoices | Invoice_ID, Invoice_Date, Customer_Name, Amount, Status |
| Bank_Statement | Transaction_ID, Transaction_Date, Description, Debit/Credit, Amount |
| Reconciled_Data | Invoice_ID, Transaction_ID, Matched_Status, Remarks, Confidence_Score |

## n8n Environment Setup

- Ensure all nodes are connected correctly and the workflow has permission to access the required sheets.
- Test each fetch and write operation before running the full workflow.
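As a reference for the Format Payload step, here is a minimal Code-node sketch (an illustration, not the template's exact node) that shapes the merged rows into one JSON payload for the AI Agent. It assumes each merged item carries a source field distinguishing invoice rows from bank rows, which you would set when fetching; the column names follow the sheet headers listed above.

```javascript
// Illustrative only: build a single payload for the AI reconciliation step.
// Assumes each merged item has json.source set to 'invoice' or 'bank';
// column names follow the sheet headers described above.
const rows = $input.all().map(item => item.json);

const invoices = rows
  .filter(r => r.source === 'invoice')
  .map(r => ({
    Invoice_ID: r.Invoice_ID,
    Invoice_Date: r.Invoice_Date,
    Customer_Name: r.Customer_Name,
    Amount: Number(r.Amount),
    Status: r.Status,
  }));

const transactions = rows
  .filter(r => r.source === 'bank')
  .map(r => ({
    Transaction_ID: r.Transaction_ID,
    Transaction_Date: r.Transaction_Date,
    Description: r.Description,
    'Debit/Credit': r['Debit/Credit'],
    Amount: Number(r.Amount),
  }));

// One item out: the AI Agent receives both lists in a single JSON payload.
return [{ json: { invoices, transactions } }];
```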
by Bastian Diaz
## Description

Automatically generates, designs, stores, and logs complete Instagram carousel posts. It transforms a simple text prompt into a full post with copy, visuals, rendered images, Google Drive storage, and a record in Google Sheets.

## Use case / What it does

This workflow enables creators, educators, or community managers to instantly produce polished, on-brand carousel assets for social media. It integrates OpenAI GPT-4.1, Pixabay, Templated.io, Google Drive, and Google Sheets into one continuous content-production chain.

## How it works

1. **Form Trigger** - Collects the user prompt via a simple web form.
2. **OpenAI GPT-4.1** - Generates structured carousel JSON: titles, subtitles, topic, description, and visual keywords.
3. **Code (Format content)** - Parses the JSON output for downstream use.
4. **Google Drive (Create Folder)** - Creates a subfolder for the new carousel inside "RRSS".
5. **HTTP Request (Pixabay)** - Searches for a relevant image using GPT's visual suggestion.
6. **Code (Get first result)** - Extracts the top Pixabay result and image URL (a sketch appears at the end of this template).
7. **Templated.io** - Fills the design template layers (titles/subtitles/topic/image).
8. **HTTP Request (Download renders)** - Downloads the rendered PNGs from Templated.io.
9. **Google Drive (Upload)** - Uploads the rendered images into the created folder.
10. **Google Sheets (Save in DB)** - Logs metadata (title, topic, folder link, description, timestamp, status).

## Connectors used

- OpenAI GPT-4.1 (via n8n LangChain node)
- Templated.io API (design rendering)
- Pixabay API (stock image search)
- Google Drive (storage + folder management)
- Google Sheets (database / logging)
- Form Trigger (input collection)

## Input / Output

- **Input:** User-submitted "Prompt" (text) via form
- **Output:**
  - Generated carousel images stored in Google Drive
  - Spreadsheet row in Google Sheets containing title, topic, description, Drive URL, status

## Requirements / Setup

Valid credentials for:

- OpenAI API (GPT-4.1 access)
- Templated.io API key
- Pixabay API key
- Google Drive + Google Sheets OAuth connections

Also required:

- Existing Google Drive folder ID for RRSS storage
- Spreadsheet with matching column headers (Created At, Title, Topic, Folder URL, Description, Status)
- Published form URL for user prompts

## Example applications / extensions

- Educational themes (mental health, fitness, sustainability).
- Extend to auto-publish to Instagram Business via Meta API.
- Add Notion logging or automated email notifications.
- Integrate scheduling (Cron node) to batch-generate weekly carousels.
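For step 6, here is a minimal sketch of what a "Get first result" Code node could look like. It assumes the preceding HTTP Request returns Pixabay's standard response with a hits array; the exact image field you pick (largeImageURL vs. webformatURL) depends on the resolution you want, so treat the field names as assumptions to verify against the Pixabay API docs.

```javascript
// Illustrative sketch: pick the first Pixabay hit and pass its image URL on.
// Assumes the Pixabay HTTP Request node returned { total, totalHits, hits: [...] }.
const response = $input.first().json;
const hits = response.hits ?? [];

if (hits.length === 0) {
  // No match: pass a null URL so a downstream IF node can branch to a fallback image.
  return [{ json: { imageUrl: null } }];
}

const first = hits[0];
return [{
  json: {
    imageUrl: first.largeImageURL ?? first.webformatURL, // field names per Pixabay's response
    pageUrl: first.pageURL,
    tags: first.tags,
  },
}];
```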
by yusan25c
## How It Works

This template is a workflow that registers Jira tickets to Pinecone. By combining it with the Automated Jira Ticket Responses with GPT-4 and Pinecone Knowledge Base template, you can continuously improve the quality of automated responses in Jira.

## Prerequisites

- A Jira account and credentials (API key and email address)
- A Pinecone account and credentials (API key and environment settings)
- OpenAI credentials (API key)

## Setup Instructions

1. **Jira Credentials** - Register your Jira credentials (API key and email address) in n8n.
2. **Vector Database Setup (Pinecone)** - Register your Pinecone credentials (API key and environment variables) in n8n.
3. **AI Node** - Configure the OpenAI node with your credentials (API key).

## Step by Step

1. **Scheduled Trigger** - The workflow runs at regular intervals according to the schedule set in the Scheduled Trigger node.
2. **Jira Trigger (Completed Tickets)** - Retrieves the summary, description, and comments of completed Jira tickets.
3. **Register to Pinecone** - Converts the retrieved ticket information into vectors and registers them in Pinecone (a flattening sketch appears at the end of this template).

## Notes

Configure the Scheduled Trigger interval carefully to avoid exceeding API rate limits.

## Further Reference

- For a detailed walkthrough (in Japanese), see this article: Automating knowledge registration to Pinecone with n8n (Qiita)
- You can find the template file on GitHub here: Template File on GitHub
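Before the ticket text can be embedded and upserted to Pinecone, the summary, description, and comments are typically flattened into one document. The Code-node sketch below shows one way to do that; the Jira field paths (fields.summary, fields.description, fields.comment.comments) are assumptions based on Jira's REST v2 shape and should be checked against what your Jira node actually returns.

```javascript
// Illustrative sketch: flatten a Jira ticket into one text block for embedding.
// Field paths are assumptions (Jira REST v2-style, plain-text description);
// verify against the output of your Jira node.
return $input.all().map(item => {
  const issue = item.json;
  const summary = issue.fields?.summary ?? '';
  const description = issue.fields?.description ?? '';
  const comments = (issue.fields?.comment?.comments ?? [])
    .map(c => c.body)
    .join('\n');

  const text = [
    `Summary: ${summary}`,
    `Description: ${description}`,
    comments ? `Comments:\n${comments}` : '',
  ].filter(Boolean).join('\n\n');

  // Keep the issue key alongside the text so Pinecone metadata can link back to the ticket.
  return { json: { key: issue.key, text } };
});
```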
by Moe Ahad
## How it works

1. The user enters the name of a city for which the most current weather information will be gathered.
2. Custom Python code processes the weather data and generates a custom email about the weather (an equivalent JavaScript sketch appears below the setup instructions).
3. An AI agent further customizes the email and adds a related joke about the weather.
4. The recipient gets the custom email for the city.

## Set up instructions

1. Enter a city to get the weather data.
2. Add the OpenWeather API and replace <your_API_key> with your actual API key.
3. Add your OpenAI API key in the OpenAI Chat Model node.
4. Add your Gmail credentials and specify a recipient for the custom email.
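The weather-processing step is the only custom code in this workflow. The template itself uses a Python Code node, but the same logic looks roughly like this in a JavaScript Code node; treat it as a sketch, with <your_API_key> left as the placeholder the setup instructions mention, the city property name assumed, and the response fields (main.temp, weather[0].description) taken from OpenWeather's current-weather API.

```javascript
// Sketch of the weather-processing step (the template uses Python; the logic is equivalent).
// Assumes the trigger provides the city name on the item's `city` property.
const city = $input.first().json.city;
const apiKey = '<your_API_key>'; // replace per the setup instructions

const url = `https://api.openweathermap.org/data/2.5/weather` +
  `?q=${encodeURIComponent(city)}&appid=${apiKey}&units=metric`;

// this.helpers.httpRequest is n8n's built-in HTTP helper;
// swap in a dedicated HTTP Request node if you prefer.
const weather = await this.helpers.httpRequest({ url, json: true });

// Draft a plain email body; the AI agent downstream rewrites it and adds a joke.
const emailBody =
  `Weather update for ${weather.name}:\n` +
  `Currently ${weather.main.temp}°C with ${weather.weather[0].description}.`;

return [{ json: { city, emailBody, raw: weather } }];
```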
by Raz Hadas
This n8n template demonstrates how to automate stock market technical analysis to detect key trading signals and send real-time alerts to Discord. It's built to monitor for the Golden Cross (a bullish signal) and the Death Cross (a bearish signal) using simple moving averages.

Use cases are many: Automate your personal trading strategy, monitor a portfolio for significant trend changes, or provide automated analysis highlights for a trading community or client group.

## Good to know

- This template relies on the Alpha Vantage API, which has a free tier with usage limits (e.g., API calls per minute and per day). Be mindful of these limits, especially if monitoring many tickers.
- The data provided by free APIs may have a slight delay and is intended for informational and analysis purposes.
- **Disclaimer**: This workflow is an informational tool and does not constitute financial advice. Always do your own research before making any investment decisions.

## How it works

1. The workflow triggers automatically every weekday at 5 PM, after the typical market close.
2. It fetches a list of user-defined stock tickers from the Set node.
3. For each stock, it gets the latest daily price data from Alpha Vantage via an HTTP Request and stores the new data in a PostgreSQL database to maintain a history.
4. The workflow then queries the database for the last 121 days of data for each stock.
5. A Code node calculates two Simple Moving Averages (SMAs): a short-term (60-day) and a long-term (120-day) average for both today and the previous day (see the sketch at the end of this template).
6. Using If nodes, it compares the SMAs to see if a Golden Cross (short-term crosses above long-term) or a Death Cross (short-term crosses below long-term) has just occurred.
7. Finally, a formatted alert message is sent to a specified Discord channel via a webhook.

## How to use

1. Configure your credentials for PostgreSQL and select them in the two database nodes.
2. Get a free Alpha Vantage API Key and add it to the "Fetch Daily History" node. For best practice, create a Header Auth credential for it.
3. Paste your Discord Webhook URL into the final "HTTP Request" node.
4. Update the list of stock symbols in the "Set - Ticker List" node to monitor the assets you care about.
5. The workflow is set to run on a schedule, but you can press "Test workflow" to trigger it manually at any time.

## Requirements

- An Alpha Vantage account for an API key.
- A PostgreSQL database to store historical price data.
- A Discord account and a server where you can create a webhook.

## Customising this workflow

- Easily change the moving average periods (e.g., from 60/120 to 50/200) by adjusting the SMA_SHORT and SMA_LONG variables in the "Compute 60/120 SMAs" Code node.
- Modify the alert messages in the "Set - Golden Cross Msg" and "Set - Death Cross Msg" nodes.
- Swap out Discord for another notification service like Slack or Telegram by replacing the final HTTP Request node.
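To illustrate the SMA logic in step 5, here is a minimal sketch of what a node like "Compute 60/120 SMAs" might contain; it is not the template's exact code. It assumes the database query returns one item per trading day with a numeric close field, ordered oldest to newest, with at least 121 rows so yesterday's long SMA can also be computed.

```javascript
// Sketch of the SMA crossover check. Assumes items are daily rows ordered
// oldest-to-newest, each with a numeric `close`, and at least SMA_LONG + 1 rows.
const SMA_SHORT = 60;
const SMA_LONG = 120;

const closes = $input.all().map(item => Number(item.json.close));

// Average of the last `period` closes ending `offset` days back (0 = today, 1 = yesterday).
const sma = (period, offset) => {
  const slice = closes.slice(closes.length - period - offset, closes.length - offset);
  return slice.reduce((sum, c) => sum + c, 0) / slice.length;
};

const shortToday = sma(SMA_SHORT, 0);
const longToday = sma(SMA_LONG, 0);
const shortYesterday = sma(SMA_SHORT, 1);
const longYesterday = sma(SMA_LONG, 1);

// A cross counts only when the relationship flips between yesterday and today.
const goldenCross = shortYesterday <= longYesterday && shortToday > longToday;
const deathCross = shortYesterday >= longYesterday && shortToday < longToday;

return [{ json: { shortToday, longToday, shortYesterday, longYesterday, goldenCross, deathCross } }];
```

The downstream If nodes then only need to check the goldenCross and deathCross booleans before building the Discord message.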
by Can KURT
## n8n - Outlook AI Categorization & Labeling (Fully Automated)

> Zero manual mapping. The workflow automatically discovers your Outlook folders, understands the context, assigns the correct category, and moves the email into the right folder.

It uses the original Microsoft Outlook nodes plus an AI Agent. You can connect OpenAI or any other LLM provider.

## Features

- **Self-Discovery:** Scans your Outlook folders automatically; no manual mapping required.
- **AI-Powered Decisions:** Considers sender, subject, content, links, attachments, timing, and business context.
- **Label + Move:** Assigns the right Outlook category and moves the email into the correct folder.
- **Dual Category Logic:** Can apply both a primary and a secondary category (e.g., Action + Project).
- **Error Handling:** Captures errors and continues without breaking the workflow.
- **Flexible AI Backend:** Replace OpenAI with your own LLM if preferred.

## Setup (5 Steps)

1. **Connect Outlook** - In n8n → Credentials → Microsoft Outlook, grant at least Mail.ReadWrite.
2. **Connect AI** - In n8n → Credentials, set up OpenAI (or another model). Works best with GPT-4.x or GPT-4o.
3. **Import the Workflow** - n8n → Workflows → Import from File/Clipboard and paste the provided JSON.
4. **Enable Trigger** - Adjust the Schedule Trigger (e.g., every 5 minutes).
5. **Run & Verify** - Test run and watch emails get categorized and moved automatically.

## How It Works

1. Schedule Trigger pulls new emails
2. Loop Over Items processes them one by one
3. Markdown / varEmail cleans the content
4. Get Many Folders fetches Outlook categories and folders
5. Summarize + Code prepare category IDs
6. AI Agent applies deep categorization logic (a sketch of parsing its output follows this template)
7. Update Category applies the Outlook category
8. Move Folder places the email in the right folder
9. Error Handling ensures workflow stability

## System Prompt Example

```
You are an advanced AI email categorization system. Your mission is to intelligently analyze and categorize emails with maximum accuracy and context awareness.

INTELLIGENT CATEGORIZATION ENGINE:
- Parse all available categories: {{ $json.category }}
- Multi-layer analysis: Sender, Subject, Body, Links, Attachments
- Prioritize: Security threats, Action Required, Business Context
- Specialized: SaaS, Hosting, E-commerce, Finance, Support, Corporate
- Anti-Spam: Pattern detection, spoofing, red-flag subjects
- Dual Logic: Primary + Secondary categories when applicable

OUTPUT FORMAT (JSON only):
{
  "subject": "EXACT_EMAIL_SUBJECT",
  "category": "PRIMARY_CATEGORY_FROM_AVAILABLE_LIST",
  "subCategory": "SECONDARY_CATEGORY_IF_APPLICABLE",
  "analysis": "Reasoning",
  "confidence": "HIGH/MEDIUM/LOW"
}

Available Categories: {{ $json.category }}
```

## Parameters & Notes

- Uses only existing Outlook categories (never invents new ones).
- Works with any LLM that supports Chat Completions.
- Requires Mail.ReadWrite permissions.
- Safe fallback: if unsure, it uses the Action category.

## Security

- Processes only what is needed for classification.
- No external logging of email content unless you configure it.
- AI provider can be swapped for self-hosted LLMs for compliance.

## License & Sharing

- **License:** MIT (or your choice).
- **Tags:** n8n, Outlook, Email, AI, Automation, Categorization
- **Import Method:** Copy/paste workflow JSON into n8n.

## Summary

Connect → Import → Run. No manual mapping. AI-powered categorization that labels and organizes your Outlook mailbox automatically.
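The agent's JSON answer usually arrives as plain text on the agent node's output. The sketch below shows one way a small Code node could parse it and fall back to the Action category when parsing fails, as described under Parameters & Notes. The output property name and the exact fallback handling are assumptions to adapt to your workflow.

```javascript
// Sketch: turn the AI Agent's text reply into a usable object.
// Assumes the agent's reply lives on the item's `output` property (a common
// AI Agent default); falls back to the "Action" category if parsing fails.
const raw = $input.first().json.output ?? '';

// Strip Markdown code fences the model sometimes wraps around JSON.
const cleaned = raw.replace(/`{3}(?:json)?/gi, '').trim();

let parsed;
try {
  parsed = JSON.parse(cleaned);
} catch (error) {
  parsed = {
    subject: null,
    category: 'Action',        // safe fallback described in the template notes
    subCategory: null,
    analysis: 'Parsing failed; defaulted to Action.',
    confidence: 'LOW',
  };
}

return [{ json: parsed }];
```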
by Shohani
## Overview

This n8n workflow automatically fetches the latest post from a Telegram channel, translates it using OpenAI, and republishes it to another channel. It supports text, images, and videos.

## Features

- **Works Without Admin Privileges** - Does not require any bot to be an admin in the source channel.
- **Scheduled execution** - Runs daily at a configurable time
- **AI-powered translation** - Uses OpenAI GPT-4o-mini for natural translations
- **Multi-media support** - Handles text, images, and videos
- **Easy configuration** - All settings in one centralized node
- **Automatic content cleaning** - Removes original channel signatures

## Prerequisites

### Required Credentials

**Telegram Bot API**

- Create a bot via @BotFather
- Get your bot token
- Add the bot as an admin to your target channel

**OpenAI API**

- Sign up at OpenAI Platform
- Generate an API key
- Ensure you have sufficient credits

### Channel Requirements

- Source Telegram channel must be public
- Bot must have admin rights in the target channel

## Setup Instructions

### 1. Import the Workflow

- Copy the workflow JSON and import it into your n8n instance
- The workflow will be imported in inactive state

### 2. Configure Credentials

**Telegram Bot Credentials**

- Go to Credentials → Add Credential
- Select Telegram
- Enter your bot token from BotFather
- Test the connection
- Save as "TelegramBot"

**OpenAI Credentials**

- Go to Credentials → Add Credential
- Select OpenAI
- Enter your OpenAI API key
- Save as "OpenAI API"

### 3. Configure Channel Settings

Open the "Set Source Channel" node and modify:

```javascript
sourceChannel: "channel_here",        // Source channel username (without @)
targetChannel: "@your_channel_here",  // Target channel (@channel or chat_id)
targetLanguage: "Persian",            // Target language for translation
channelSignature: "signature text"    // The channel signature to be replaced
```

### 4. Adjust Schedule (Optional)

- Open the "Daily Schedule" node
- Default: Runs daily at 9:00 AM
- Modify triggerAtHour and triggerAtMinute as needed

### 5. Test the Workflow

- Click "Execute Workflow" to test manually
- Check if content appears in your target channel
- Verify translation quality and formatting

### 6. Activate the Workflow

- Toggle the workflow to Active status
- Monitor execution logs for any errors

## Content Filtering

Modify the "Clean Post Content" node to remove specific text patterns:

```javascript
let cleanPost = $input.first().json.post
  .replaceAll('unwanted_text', '')
  .replaceAll(/regex_pattern/g, '')
  .trim();
```

## Multiple Source Channels

To monitor multiple channels:

- Duplicate the workflow
- Change the sourceChannel in each copy
- Use different schedules to avoid conflicts

## Custom Scheduling

The Schedule Trigger supports various patterns:

- **Hourly**: { "triggerAtMinute": 0 }
- **Weekly**: { "triggerAtWeekday": 1, "triggerAtHour": 9 }
- **Multiple times**: Use multiple schedule nodes

## Troubleshooting

### Common Issues

**No content fetched**

- Verify source channel is public
- Check if channel name is correct (without @)
- Ensure channel has recent posts

**Translation fails**

- Verify OpenAI API key is valid
- Check API usage limits and credits
- Ensure content is not empty

**Can't send to target channel**

- Verify bot is admin in target channel
- Check channel username/ID format
- Test bot permissions manually

## Compliance

- Respect copyright and fair use policies
- Add proper attribution when required
- Follow Telegram's Terms of Service
by Sridevi Edupuganti
## Telegram Voice → AI Summary & Sentiment Analysis via Gmail

This n8n template demonstrates how to capture Telegram voice messages, transcribe them into text using AssemblyAI, analyze the transcript with AI for summary and sentiment insights, and finally deliver a structured email report via Gmail.

## Use cases

- Automating meeting or lecture voice note transcriptions.
- Gathering student feedback or training session insights from voice messages.
- Quickly summarizing Telegram-delivered audio inputs into structured reports.
- Reducing manual effort in capturing sentiment and key action items from conversations.

## How it works

1. A voice message is sent to a connected Telegram Bot.
2. The workflow fetches the file and uploads it to AssemblyAI.
3. AssemblyAI generates a transcript from the audio.
4. The transcript is analyzed by OpenAI to extract:
   - Executive summary (120-180 words)
   - Sentiment label and score
   - Key points
   - Action items (if any)
   - Notable quotes
   - Topics
5. The formatted analysis is sent as an email report using Gmail (a formatting sketch appears at the end of this template).
6. The workflow ends with a clean summary email containing actionable insights.

## How to use

1. Import this workflow into your n8n instance.
2. Set up and connect the required credentials:
   - Telegram Bot API token
   - AssemblyAI API key
   - OpenAI API key
   - Gmail OAuth2 account
3. Replace placeholders (e.g., <<YOUR_EMAIL ID>> and <<YOUR_ASSEMBLYAI_API_KEY>>) with your actual values.
4. Start the workflow. Whenever a voice message is received on the Telegram Bot, the workflow will process it end-to-end and deliver a polished email report.

## Requirements

- Telegram Bot account (API token)
- AssemblyAI account with API key
- OpenAI account with API key
- Gmail OAuth2 credentials configured in n8n
- Active n8n instance

## Customising this workflow

You can customize the email formatting, sentiment thresholds, or extend the workflow to save transcripts into Google Drive, Airtable, or any other connected apps. Additionally, you can trigger the same workflow from multiple input sources (e.g., local audio files, Google Drive links, or Telegram).
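To show how the analysis fields from step 4 can become the email body in step 5, here is a minimal Code-node sketch. The field names (summary, sentiment, sentiment_score, key_points, action_items, quotes, topics) mirror the list above but are assumptions; map them to whatever your structured-output schema actually produces.

```javascript
// Illustrative sketch: format the AI analysis into the Gmail body.
// Field names mirror the analysis fields listed above but are assumptions;
// map them to your structured-output schema.
const a = $input.first().json;

// Render an array as a bulleted block, or a single placeholder line if empty.
const list = (items) => (items ?? []).map(i => `- ${i}`).join('\n') || '- None';

const body = [
  `Executive Summary:\n${a.summary ?? ''}`,
  `Sentiment: ${a.sentiment ?? 'n/a'} (score: ${a.sentiment_score ?? 'n/a'})`,
  `Key Points:\n${list(a.key_points)}`,
  `Action Items:\n${list(a.action_items)}`,
  `Notable Quotes:\n${list(a.quotes)}`,
  `Topics: ${(a.topics ?? []).join(', ')}`,
].join('\n\n');

// Pass subject and body on to the Gmail node.
return [{ json: { subject: 'Voice Note Report', body } }];
```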
by Yaron Been
## Create Multi-Channel Content with O3 Director & GPT-4 Specialist Agents

This n8n workflow creates a complete AI-powered content department. It starts when a chat request is received, then a Content Director Agent (powered by OpenAI O3) analyzes the request and delegates tasks to specialized agents (blogs, social, video, email, website, strategy). Each agent is powered by GPT-4.1-mini, keeping costs low and quality high.

## Section 1: Trigger & Director Setup

### Nodes

**1. When Chat Message Received**

- **What it does:** Starts the workflow whenever a user sends a content request.
- **Why it's useful:** Allows real-time or on-demand content creation from chat inputs.

**2. Content Director Agent (O3)**

- **What it does:** Analyzes user request, defines the best content mix, and delegates tasks to specialist agents.
- **Why it's useful:** Keeps your brand voice consistent and ensures all channels align to a unified content strategy.

**Beginner Benefit**

- Single entry point: just type your content idea once
- AI Director coordinates everything for you
- No need to manage multiple tools

## Section 2: Specialist Content Agents

Each request gets routed to one (or several) of these agents, depending on the strategy.

3. **Blog Content Writer** - Long-form articles, editorials, and thought leadership pieces.
4. **Social Media Content Creator** - Social posts, captions, hashtags, and community content.
5. **Video Script Writer** - YouTube scripts, explainer videos, and video marketing content.
6. **Email Newsletter Writer** - Campaigns, nurture sequences, and newsletter copy.
7. **Website Copy Specialist** - Landing pages, product descriptions, and conversion-focused web copy.
8. **Content Strategist & Planner** - Editorial calendars, campaign planning, and audience strategy.

**Beginner Benefit**

- Each agent is an expert in its field
- Powered by GPT-4.1-mini: faster and cheaper
- Parallel execution: all content types can be generated at once

## Section 3: Language Models & Execution Flow

- **O3 Model - Content Director:** Handles analysis, strategy, and delegation.
- **GPT-4.1-mini - All Specialists:** Powers blog, social, video, email, website, and strategy agents.
- **Think Node:** Helps the Content Director organize reasoning before delegating tasks.

**Beginner Benefit**

- AI Director (O3) = smart leadership
- Specialists (GPT-4.1-mini) = cost-efficient execution
- Built-in reasoning = better, more aligned campaigns

## Workflow Overview

| Section | What Happens | Key Benefit |
| --- | --- | --- |
| Trigger & Director Setup | Workflow starts from chat → Content Director interprets request | Centralized control |
| Specialist Agents | Each AI agent produces tailored content | Multi-channel coverage |
| Models & Flow | O3 for Director, GPT-4.1-mini for specialists | Cost-efficient + strategic |

## How You Benefit Overall

- One input → full content campaign
- Consistent brand voice across all platforms
- Cost-effective (O3 only for strategy, GPT-4.1-mini for bulk work)
- Ready-to-publish content in minutes

You've basically built an AI marketing department inside n8n, no extra staff required!