by Krishna Sharma
📄 AI-Powered Document Summarizer & Notifier

This workflow monitors a Google Drive folder for new files (Google Docs or PDFs), extracts text, summarizes content with OpenAI, and sends results to Slack or Email.

🔧 How It Works

1. Monitors a Google Drive folder for new files.
2. Detects file type → Google Doc vs PDF.
3. Extracts text (via the Google Docs API or a PDF extractor).
4. Summarizes & analyzes content using OpenAI.
5. Sends results to Slack and/or Email.

👤 Who Is This For?

- Business teams → quick digests of reports, proposals, contracts.
- Educators / researchers → summaries of long study materials.
- Founders / managers → daily summaries without opening every file.
- Operations teams → compliance and documentation tracking.

💡 Use Case / Problem Solved

- Reading long documents is time-consuming.
- Sharing key points across teams requires manual effort.
- Important context (sentiment, action items) is often missed.

👉 This workflow solves all three by auto-summarizing documents and notifying teams instantly.

⚙️ What This Workflow Does

- Monitors Google Drive for new Google Docs or PDFs.
- Extracts text automatically.
- Uses OpenAI to generate: title, summary, key points, suggested action items, language detection, and sentiment (positive, neutral, negative).
- Pushes output to a Slack channel and/or an email inbox.

🛠️ Setup Instructions

Prerequisites:

- Google Drive (OAuth2)
- Google Docs (OAuth2)
- OpenAI API key
- Slack (OAuth2) or Gmail (OAuth2)

Steps to configure:

1. Connect Google Drive and choose the folder you want to monitor.
2. Set up file type routing with an IF node that splits Docs vs PDFs.
3. For Google Docs: use Google Docs Get → extract text.
4. For PDFs: use Google Drive Download → Extract PDF.
5. Send the text to OpenAI: connect your model and customize the system prompt to generate the title, summary, sentiment, etc.
6. Notify: send the output to a Slack channel or Gmail.
7. Save & activate your workflow.

📌 Notes

- Adjust the OpenAI prompt to suit your context.
- For large PDFs, consider splitting the text into smaller chunks.
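The Docs-vs-PDF routing step can be sketched as a small Code-node function. This is a minimal illustration, not the template's actual node: the `mimeType` field and the branch names are assumptions about what the Drive trigger returns.

```javascript
// Hypothetical sketch of the file-type routing in step 2.
// The mimeType values below are the standard Google Drive MIME types;
// the branch names ('googleDoc', 'pdf', 'skip') are illustrative.
function routeFile(file) {
  if (file.mimeType === 'application/vnd.google-apps.document') {
    return 'googleDoc'; // handled by the Google Docs Get node
  }
  if (file.mimeType === 'application/pdf') {
    return 'pdf';       // handled by Drive Download → Extract PDF
  }
  return 'skip';        // any other file type is ignored
}
```

In the real workflow this decision lives in an IF (or Switch) node comparing the trigger item's MIME type.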
by koichi nagino
AI-Powered Invoice Processing from Gmail to Google Sheets with Slack Approval

This workflow completely automates your invoice processing pipeline. It triggers when a new invoice email arrives in Gmail, uses AI to extract key data from the PDF attachment, logs it in a Google Sheet, and sends a request to Slack with simple links for one-click approval or rejection.

Who's it for?

This template is perfect for small business owners, finance teams, freelancers, and anyone looking to eliminate the manual work of processing invoices. It saves hours of data entry, reduces human error, and streamlines the approval process.

How it works

1. Trigger: The workflow starts when an email with a specific subject (e.g., "Invoice") arrives in Gmail.
2. Extraction: It automatically downloads the first PDF attachment from the email and extracts all its text content.
3. AI Processing: The extracted text is sent to an AI model, which identifies and pulls out key details: invoice number, issue date, company name, total amount, and due date.
4. Logging: The structured data is appended as a new row in your Google Sheet. The status is automatically set to "pending", and the full raw text from the PDF is also saved for easy verification.
5. Approval Request: A formatted message is sent to a designated Slack channel. It includes the key invoice details and two unique links: one to approve and one to reject the invoice.
6. Handle Response: When a user clicks either the "Approve" or "Reject" link, the corresponding Webhook in the workflow is triggered.
7. Update Sheet: The workflow finds the correct row in the Google Sheet using the invoice number and updates its status to "approved" or "rejected".
8. Confirmation: A final confirmation message is sent to the Slack channel, closing the loop and informing the team of the action taken.

How to set up

1. Credentials: Add your credentials for Gmail, OpenAI, Google Sheets, and Slack in the designated nodes.
2. Gmail Trigger: In the "1. New Invoice Email Received" node, change the search query q from "incoice" to the keyword you use for invoice emails (e.g., "invoice").
3. Google Sheets: In the three Google Sheets nodes ("6. Log Invoice...", "9a. Update Status to Approved", and "9b. Update Status to Rejected"), enter your Google Sheet ID and the name of the sheet.
4. Slack: In the three Slack nodes ("7. Send Approval Request...", "10a. Send Approval Confirmation", and "10b. Send Rejection Confirmation"), enter your Slack Channel ID.
5. Webhook URLs: First, activate the workflow using the toggle in the top-right corner. Open the "Webhook (Approve)" node, go to the Production URL tab, and copy the URL. Paste this URL into the "7. Send Approval Request to Slack" node, replacing the https://YOUR_WEBHOOK_BASE_URL/webhook/approval part of the approval link. Repeat this process for the "Webhook (Reject)" node and the rejection link.
6. Activate Workflow: Ensure the workflow stays active so the Webhooks work continuously.

How to customize

- AI Prompt: Modify the prompt in the "5. Extract Invoice Data with AI" node to extract different or additional fields to match your specific invoice formats.
- Slack Messages: Customize the text in all three Slack nodes to better fit your team's tone and communication style.
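Building the two one-click links for the Slack message could look like the sketch below. The base URL placeholder matches the template's `https://YOUR_WEBHOOK_BASE_URL` convention, but the query-parameter name (`invoice`) and the `/webhook/rejection` path are assumptions for illustration.

```javascript
// Hypothetical sketch: build the approve/reject links embedded in the
// Slack approval message. Path and parameter names are illustrative.
function buildApprovalLinks(baseUrl, invoiceNumber) {
  const id = encodeURIComponent(invoiceNumber);
  return {
    approve: `${baseUrl}/webhook/approval?invoice=${id}`,
    reject: `${baseUrl}/webhook/rejection?invoice=${id}`,
  };
}
```

Carrying the invoice number in the link is what lets the webhook branch find and update the right sheet row later.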
by Rahul Joshi
📊 Description

Simplify your social media publishing process by automating post scheduling from Google Sheets directly to Meta (Facebook Pages). 📅💬 This workflow detects pending posts, uploads images with captions to your Facebook Page, updates the sheet status, and sends real-time notifications via Slack and email — keeping your marketing team always in sync.

🚀 What This Template Does

1️⃣ Trigger – Monitors a Google Sheet for new or pending posts every minute. ⏰
2️⃣ Filter – Identifies the latest "pending" entry for publishing. 🔍
3️⃣ Extract – Captures post details like caption, image URL, and ID. 🧾
4️⃣ Publish – Uploads the post to your Meta (Facebook) Page using the Graph API. 📤
5️⃣ Validate – Confirms success or failure of the post operation. ✅
6️⃣ Notify – Sends instant Slack and email updates on publishing status. 💌
7️⃣ Update – Marks the published post as "Completed" in Google Sheets. 📊

Key Benefits

✅ Hands-free publishing from Google Sheets to Meta
✅ Instant Slack and email alerts for post outcomes
✅ Prevents duplicate or failed post uploads
✅ Centralized content tracking and status updates
✅ Improves consistency and speed in social media operations

Features

- Google Sheets trigger for post scheduling
- Facebook Graph API integration for auto-posting
- Slack and Outlook notifications for success/error alerts
- Automatic sheet updates post-publication
- Error handling and reporting for failed posts

Requirements

- Google Sheets OAuth2 credentials
- Facebook Page Access Token via the Graph API
- Slack Bot token for notifications
- Outlook or SMTP credentials for email updates

Target Audience

- Marketing teams managing Facebook content calendars 📆
- Social media managers seeking automated posting 📣
- Agencies coordinating client content delivery 📋
- Teams tracking campaign publishing performance 📊
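The "Filter" step, which picks the latest pending row, can be sketched as a small helper. The column names (`status`, `rowNumber`) are assumptions about the sheet layout, not the template's actual fields.

```javascript
// Hypothetical sketch of step 2️⃣: pick the most recent row whose
// status is "pending". Column names are illustrative.
function latestPending(rows) {
  const pending = rows.filter(r => r.status === 'pending');
  if (pending.length === 0) return null; // nothing to publish this cycle
  return pending.reduce((a, b) => (b.rowNumber > a.rowNumber ? b : a));
}
```

Returning `null` when nothing is pending is what prevents duplicate or empty publish attempts on each minute-by-minute poll.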
by WeblineIndia
Monthly Energy Generation Report (PostgreSQL → PDF → Email)

This workflow automatically collects monthly energy generation data from a PostgreSQL database, converts it into a structured PDF report, and emails it to stakeholders. It eliminates manual report creation and ensures timely delivery of performance summaries.

Who's it for

- Energy companies monitoring solar, wind, or hydro generation
- Operations & maintenance (O&M) teams needing monthly summaries
- Managers and executives who require periodic performance reports
- Data analysts who want automated reporting instead of manual exports

How it works

1. Monthly Trigger → schedules the workflow to run once a month.
2. Postgres node → fetches energy data from the energy_data table.
3. Code node (Transform Data) → structures the raw records into JSON with metadata (date_range, records, note).
4. HTTP Request (PDF.co API) → converts the structured data into a formatted PDF report.
5. Gmail node (Send Report) → sends the PDF report (or a link) via email to the configured recipient.

How to set up

1. Import the workflow JSON into n8n.
2. Configure credentials: the PostgreSQL connection (DB host, user, password, database, schema), Gmail OAuth2 credentials, and a PDF.co API key (for HTML → PDF conversion).
3. Update the database table (energy_data), the email recipients in the Gmail node, and the PDF template (if custom formatting is required).
4. Activate the workflow.

Requirements

- n8n (self-hosted or cloud)
- PostgreSQL database with energy generation data
- PDF.co account with a valid API key
- Gmail account with OAuth2 access
- Internet access for API calls

How to customize

- Change the SQL query in the Postgres node to filter specific plants or date ranges.
- Update the Code node to add extra fields (e.g., average power, anomalies).
- Modify the PDF.co request body to use a custom HTML template for branding.
- Replace Gmail with Outlook, SMTP, or Slack for distribution.

Add-ons

- Add a Slack/Teams node to notify teams when reports are sent.
- Store PDFs in Google Drive or S3 for archival.
- Add a dashboard (e.g., Grafana or Superset) that references the same DB for a real-time view.
- Integrate with Jira to auto-create tasks for underperformance alerts.

Use Case Examples

- Solar company emailing monthly reports to plant owners.
- Wind farm operator generating regulatory compliance reports.
- O&M teams automating KPI summaries for executives.
- Consulting firms monitoring multiple clients' energy production.

Common Troubleshooting

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Workflow does not trigger | Cron not set correctly | Verify the Schedule Trigger node interval is monthly |
| No data returned from Postgres | Wrong schema/table or DB credentials | Check the DB connection and table name |
| PDF not generated | Invalid/missing PDF.co API key | Generate a new key in the PDF.co dashboard |
| Email not sent | Gmail OAuth expired or wrong recipient | Reconnect Gmail credentials and confirm the recipient email |
| PDF output malformed | Incorrect JSON → HTML conversion | Check the Code node formatting and the PDF.co request body |

Need Help?

Our n8n workflow automation experts at WeblineIndia can help you set up the PostgreSQL connection securely, customize the PDF layout with your company branding, add more delivery channels (Slack, Teams, S3), extend the reporting logic (KPIs, charts, anomaly detection), and much more.
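The Transform Data step wraps the raw Postgres rows in the `date_range` / `records` / `note` envelope described above. The sketch below shows the idea; the row field names (`plant_name`, `generated_kwh`) are assumptions about the `energy_data` table, not its actual schema.

```javascript
// Hypothetical sketch of the Code node (Transform Data): wrap raw
// Postgres rows in the metadata envelope sent to PDF.co.
// Row field names are illustrative assumptions.
function buildReportPayload(rows, monthStart, monthEnd) {
  return {
    date_range: `${monthStart} to ${monthEnd}`,
    records: rows.map(r => ({
      plant: r.plant_name,
      kwh: Number(r.generated_kwh), // coerce in case the driver returns strings
    })),
    note: `Auto-generated monthly energy report (${rows.length} records)`,
  };
}
```

The HTTP Request node then renders this object (typically via an HTML template) before calling the PDF.co conversion endpoint.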
by Ali Amin
🎯 Accounting Alerts Automation

Purpose: Automatically track Companies House filing deadlines for UK accounting firms and prevent costly penalties (£150–£1,500 per missed deadline).

How it works:

- Daily automated checks pull live deadline data from the Companies House API.
- Color-coded email alerts (Red/Orange/Yellow/Green) prioritize urgent deadlines.
- Interactive "Yes/No" buttons let recipients confirm completion status.
- All data syncs back to Google Sheets for a complete audit trail.

Value: Saves 2–3 hours/week per firm while eliminating manual tracking errors.

⚙️ Daily Deadline Check & Alert System

Runs: Every weekday at 5 PM (Mon–Fri)

What happens:

1. Read Company Database – fetches all tracked companies from Google Sheets.
2. Get Company Data – pulls live filing deadlines from the Companies House API for each company.
3. Update Due Dates – syncs the latest deadline data back to the tracking sheet.
4. Build Interactive Email – creates an HTML email with color-coded urgency indicators (days remaining), a table sortable by due date, and clickable Yes/No confirmation buttons for each company.
5. Send via Gmail – delivers the consolidated report to the accounting team.

Why automated: Manual deadline checking across 10–50+ companies is time-consuming and error-prone. This ensures nothing falls through the cracks.

✅ Email Response Handler (Webhook Flow)

Triggered when: A recipient clicks the "Yes" or "No" button in the alert email.

What happens:

1. Webhook – receives the confirmation status (company_number, company_name, yes/no).
2. Process Data – extracts the response details from the webhook payload.
3. Update Sheet – records the confirmation status in Google Sheets with a timestamp.
4. Confirmation Page – displays a success message to the user.

Why this matters: Provides instant feedback to the user and creates an audit trail of who confirmed what and when. No separate tracking system is needed; everything updates automatically in the same spreadsheet. Result: accountability without administrative burden.

📋 Setup Requirements

Google Sheets database structure: create a sheet with these columns:

- company_number (manually entered)
- company_name (manually entered)
- accounts_due (auto-updated)
- confirmation_due (auto-updated)
- confirmation_submitted (updated via email clicks)
- last_updated (auto-timestamp)

Required credentials:

- Google Sheets OAuth (for reading/writing data)
- Companies House API key (free from api.company-information.service.gov.uk)
- Gmail OAuth (for sending alerts)

Webhook configuration: Update the webhook URL in the "Build Interactive Email" node to match your n8n instance.

Time to set up: ~15 minutes once credentials are configured.
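The Red/Orange/Yellow/Green urgency banding could be implemented as below. The workflow description names the four colors but not the cut-offs, so the 7/14/30-day thresholds here are assumptions for illustration.

```javascript
// Hypothetical sketch: map days remaining until a filing deadline to the
// Red/Orange/Yellow/Green urgency bands. The day cut-offs are assumptions.
function urgencyColor(daysRemaining) {
  if (daysRemaining <= 7) return 'red';     // imminent: act now
  if (daysRemaining <= 14) return 'orange'; // due soon
  if (daysRemaining <= 30) return 'yellow'; // on the radar
  return 'green';                            // comfortable margin
}
```

The Build Interactive Email node would then use the returned color when styling each company's table row.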
by n8nwizard
📌 Overview

This advanced multi-phase n8n workflow automates the complete research, analysis, and ideation pipeline for a YouTube strategist. It scrapes competitor channels, analyzes top-performing titles and thumbnails, identifies niche trends, gathers audience sentiment from comments, and produces data-driven content ideas, automatically writing them into a structured Google Sheets dashboard.

This system is ideal for:

- YouTube creators
- Agencies
- Content strategists
- Automation engineers
- Anyone who wants to generate YouTube content ideas backed by real data

🧠 High-Level Architecture

The workflow is split into 5 phases, each building on the previous:

- Phase 1 — Niche Outliers (input: user form): scrapes 3 high-quality channels from your niche, extracts their outlier videos, and analyzes why they work.
- Phase 2 — Broad Niche Insights (weekly): scrapes the top trending content in your broad niche (e.g., "AI", "fitness", "personal finance") and logs weekly insights.
- Phase 3 — Niche Insights (daily): scrapes the top videos in your specific micro-niche daily to track content momentum.
- Phase 4 — Comment Analysis: analyzes real comments from your channel to understand what your audience likes, dislikes, and wants more of.
- Phase 5 — Content Ideation: generates 3 highly optimized title + thumbnail concepts using all prior insights.

Everything is automatically logged into a Google Sheets dashboard.

🧩 Phase-by-Phase Breakdown

⭐ Phase 1 — Niche Outliers (Form Trigger)

1. The user enters 3 YouTube channel URLs in a form.
2. The workflow scrapes each channel using the Apify YouTube Scraper.
3. It filters for top-performing videos and extracts: title, views, likes, thumbnail, URL.
4. AI analyzes power words in titles (OpenRouter/GPT-4.1-mini) and thumbnail attention hooks (OpenAI Vision).
5. All insights are appended to the "Niche Outliers" sheet.

Purpose: Understand what the best creators in your niche are doing.
🌐 Phase 2 — Broad Niche Insights (Weekly — Sundays @ 5 AM)

1. The workflow scrapes the top videos for a broad niche (e.g., "artificial intelligence").
2. It analyzes title structure, power words, and thumbnail cues.
3. Weekly insights are written to the "Broad Niche Weekly" sheet.

Purpose: Stay informed about macro-level trends.

🎯 Phase 3 — Niche Insights (Daily @ 6 AM)

1. Scrapes the top videos in your specific micro-niche (e.g., "n8n automations").
2. Runs title + thumbnail analysis.
3. Appends daily results to "Niche Daily"; these results feed directly into Phase 5.

Purpose: Track daily momentum and trending formats.

💬 Phase 4 — Comment Analysis (Channel Feedback)

1. Scrapes your channel's latest 5 videos and extracts up to 30 comments from each.
2. Aggregates the comments.
3. AI identifies what viewers love, what they dislike, and what they are asking for.
4. Stores the patterns in the "Comment Analysis" sheet.

Purpose: Understand real audience demand.

💡 Phase 5 — Content Ideation Using AI

Using insights from all previous phases (top titles, power words, thumbnail patterns, daily niche trends, audience comment analysis, and channel positioning), the Creative Agent produces:

- 3 optimized video titles
- 3 matching thumbnail concepts

These are appended to the "Ideation" sheet, and a Slack notification is sent when ideation is ready.

Purpose: Fully automated content idea generation.

🗂️ Outputs in Google Sheets

The workflow populates these tabs:

- 📌 Niche Outliers (top competitor videos)
- 📌 Broad Niche Weekly (weekly trend analysis)
- 📌 Niche Daily (daily trend analysis)
- 📌 Comment Analysis (audience sentiment)
- 📌 Ideation (final titles + thumbnails)

🔧 What This Workflow Automates

✔ Competitor analysis
✔ Thumbnail + title breakdowns
✔ Daily niche tracking
✔ Weekly niche tracking
✔ Viewer sentiment analysis
✔ Fully AI-generated content ideas
✔ Automatic data logging to Google Sheets
✔ Slack notifications

This is essentially a 24/7 AI YouTube strategist.
⚙️ Setup Requirements

- **Apify API key** (used in 5 scraper nodes)
- **OpenRouter API key** (for GPT-4.1-mini intelligence)
- **OpenAI API key** (for thumbnail image analysis)
- **Google Sheets OAuth2 credential**

Then:

1. Make a copy of the provided sheet template.
2. Fill in the Set nodes: Phase II — broad niche (e.g., "AI"); Phase III — micro niche (e.g., "n8n automations"); Phase IV — your channel URL; Phase V — your channel description.

🧪 Testing Guide

1. Test the Form Trigger with 3 competitor channel URLs.
2. Trigger both Schedule Triggers (weekly + daily) manually.
3. Verify the Sheets are receiving rows.
4. Run the full pipeline end-to-end.
5. Confirm the Slack notification.

Everything should chain together smoothly.

🎉 Final Result

By the end of this workflow, you have a 🧠 data-driven YouTube strategy system that:

- studies your niche
- finds outliers
- understands your audience
- detects trends
- generates smart content ideas every day
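The "filters for top-performing videos" step in Phase 1 could be sketched like this. The template doesn't state its outlier criterion, so the views-above-channel-average rule and the 2x multiplier here are assumptions.

```javascript
// Hypothetical sketch of the Phase 1 outlier filter: keep videos whose
// view count is well above the channel's average. The 2x multiplier
// is an assumption, not taken from the template.
function findOutliers(videos, multiplier = 2) {
  if (videos.length === 0) return [];
  const avg = videos.reduce((sum, v) => sum + v.views, 0) / videos.length;
  return videos.filter(v => v.views >= avg * multiplier);
}
```

A relative threshold like this adapts to channel size, so a 10k-subscriber channel and a 1M-subscriber channel are judged against their own baselines.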
by Rakin Jakaria
Use cases are many: manage your Gmail inbox, schedule calendar events, and handle contact details, all from one central AI-powered assistant. Perfect for freelancers managing clients, agency owners who need streamlined communication, or busy professionals who want a personal AI secretary handling their email and calendar.

Good to know

At the time of writing, each Gemini request is billed per token; see Gemini Pricing for the latest info. The workflow requires Gmail, Calendar, Sheets, and Telegram integrations. Ensure you've set up OAuth2 credentials correctly before running.

How it works

- **Triggers**: The workflow listens for new Gmail messages or Telegram commands.
- **Smart AI Processing**: Incoming emails are summarized, classified (Client, Sponsorship, or Not Business), and labeled automatically.
- **Auto-Replies**: Depending on the classification, the assistant sends pre-written replies (e.g., client acknowledgment, sponsorship rates, or polite rejection).
- **Calendar Management**: Through natural-language requests in Telegram, you can schedule, update, or delete calendar events with conflict-checking in place.
- **Contact Handling**: If you send an email to someone not yet in your database, the agent will prompt you for their email, add it to Google Sheets, and reuse it for future tasks.
- **Memory**: The AI maintains conversation context, so repeated tasks feel seamless and natural.

How to use

Send commands via Telegram like:

- "Schedule a meeting with Sarah on Monday at 3 PM"
- "Send an email to David about the proposal"

Watch as the assistant checks your calendar, sends emails, and keeps your contacts updated, all automatically.

Requirements

- Gmail account (with labels created for Client, Sponsorship Request, and Not Business)
- Google Calendar for scheduling
- Google Sheets for contact management
- Google Gemini API key
- Telegram bot for live interaction

Customising this workflow

You can expand it to:

- Handle Slack or WhatsApp messages in addition to Telegram.
- Add more classification categories (e.g., Invoices, Personal, Leads).
- Extend auto-replies with dynamic templates stored in Google Sheets.
- Log all interactions to Notion or Airtable for a CRM-style history of communications.

👉 Rakin Jakaria
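The classification-to-reply routing could be sketched as a lookup keyed by the three labels the template asks you to create. The reply texts below are purely illustrative placeholders, not the template's actual pre-written replies.

```javascript
// Hypothetical sketch: map the AI's classification to a canned reply.
// The label names match the template's Gmail labels; the reply texts
// are illustrative placeholders.
function pickReply(classification) {
  const replies = {
    'Client': 'Thanks for reaching out. We have received your message and will respond shortly.',
    'Sponsorship': 'Thanks for your interest! Our current sponsorship rates are attached.',
    'Not Business': 'Thanks for your message, but this inbox is reserved for business enquiries.',
  };
  return replies[classification] ?? null; // null → no auto-reply, leave for manual review
}
```

Keeping the mapping in one place makes it easy to add new categories (Invoices, Personal, Leads) later, as suggested above.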
by PTS
Who this is for

Anybody using Firefly III, especially home/self-hosted users, who wants to add some automation to their transaction tracking, either in addition to the data importer or because they can't or don't want to use it.

How it works - posting transactions

1. The user sends a transaction screenshot/image or statement to a Telegram bot.
2. Gemini analyzes it based on the user's requirements (asset account IDs & categories).
3. The transaction information is parsed into a suitable POST for a Firefly instance.
4. The transaction(s) are posted to Firefly via its API, using an OAuth2 credential.

How it works - requesting budget reports

1. The user sends the word "Report" via Telegram.
2. A GET API request is sent to Firefly for all budgets between the beginning of the month and the request date, including the remaining amount for each.
3. The response is converted to a CSV file.
4. The CSV is sent to the user via Telegram.

Prerequisites

- Telegram, and knowledge of how to set up a bot (search for BotFather in Telegram)
- An existing instance of Firefly III with admin access for creating OAuth2 credentials

How to set it up - credentials

1. Open Telegram and search for BotFather.
2. Create a new bot by following the instructions and save the API key provided.
3. In n8n, create a new Telegram credential using the info for the new bot.
4. Create an OAuth client in Firefly, using the redirect URL found in n8n's OAuth2 API credential creator.
5. Fill in the n8n OAuth2 API credential form as Authorization Code, filling in the remaining parameters from the info created in Firefly.
6. Create a Gemini credential following the instructions in n8n.

How to set it up - the workflow

1. Set the credential in each Telegram node.
2. Set the Firefly credential in each HTTP node.
3. Set the correct base URL for the Firefly instance in each HTTP node.
4. Set the desired Gemini credential and model in each AI node.
5. Set the correct bank IDs (as per Firefly) and preferred categories in the AI node's system message.

Customization options

The user can specify all types of asset and expense accounts, as well as a specific list of categories and descriptions for Gemini to use. Gemini can also be swapped out for any other AI/LLM. Additionally, anyone can build on this by reviewing the Firefly API documentation to automate almost any other part of the Firefly software.
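The POST body sent to Firefly's transactions endpoint could be assembled as below. The field names follow the Firefly III API's transaction store format (to the best of my knowledge; verify against your instance's API docs), while the input object's property names are assumptions about what Gemini's parsed output looks like in this template.

```javascript
// Sketch of the payload the HTTP node could POST to Firefly III's
// /api/v1/transactions endpoint. Input property names (parsed.*) are
// illustrative assumptions about the AI node's output.
function buildFireflyTransaction(parsed) {
  return {
    transactions: [{
      type: 'withdrawal',
      date: parsed.date,              // e.g. "2024-05-01"
      amount: String(parsed.amount),  // Firefly expects the amount as a string
      description: parsed.description,
      source_id: String(parsed.assetAccountId), // the asset account configured in the prompt
      category_name: parsed.category,
    }],
  };
}
```

Multiple line items from one statement would simply become multiple entries in the `transactions` array.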
by MAMI YAMANE
Audit Instagram Influencer Safety and Engagement to Slack

Description

Protect your brand reputation and optimize your marketing budget by automatically vetting potential influencer partners. Manually analyzing engagement rates and reading through hundreds of comments to detect brand safety risks is time-consuming and prone to error.

This workflow streamlines the due diligence process. By simply entering an Instagram username into a form, the system scrapes recent data, calculates engagement metrics to detect potential fake followers or bot activity, and uses AI to scan content for offensive language or competitor mentions. The final detailed audit report is delivered instantly to your Slack channel and logged in Google Sheets for record-keeping.

Who is this for

- **Influencer Marketing Managers**: to quickly vet creators before sending collaboration offers.
- **Digital Agencies**: to perform scalable due diligence for client campaigns.
- **Brand Managers**: to ensure potential partners align with brand safety guidelines and do not promote direct competitors.

How it works

1. Input: The workflow starts with an n8n Form where you enter the influencer's Instagram handle and optional competitor names.
2. Data Extraction: It triggers Apify (using the Instagram Scraper) to fetch the profile's statistics and its most recent 30 posts.
3. Engagement Analysis: A Code node calculates the average engagement rate. Its logic flags the account as "Suspicious" if the rate is unnaturally low (indicating fake followers) or suspiciously high (indicating bot farms).
4. AI Safety Check: Recent post captions are aggregated and sent to OpenAI. The AI analyzes the text for risk flags (controversy, profanity) and competitor mentions, and assigns a safety score.
5. Reporting: The workflow saves the raw request and results to Google Sheets and sends a formatted summary report to a specific Slack channel.

Requirements

- **Apify account**: You will need an API token and access to the Instagram Scraper actor.
- **OpenAI account**: An API key to perform the content safety analysis.
- **Google Cloud Platform**: A credential with access to the Google Sheets API.
- **Slack workspace**: A configured Slack app/bot to post messages.

How to set up

1. Configure credentials: Connect your Apify, Google Sheets, OpenAI, and Slack accounts in the respective nodes.
2. Set up the Google Sheet: Create a Google Sheet with two tabs:
   - Audit Requests (columns: username, timestamp)
   - Audit Results (columns: username, followers, engagementRate, status, safetyScore, riskFlags, recommendation)
3. Configure variables: Open the Workflow Configuration node (a Set node) and input your apifyApiToken, engagementThresholdLow (default 1%), and engagementThresholdHigh (default 10%).
4. Update IDs: In the Store Audit Request and Store Audit Results nodes, select your created Google Sheet. In the Send Audit Report to Slack node, select the channel where you want reports to appear.

How to customize the workflow

- **Adjust thresholds**: Change the engagementThresholdLow or engagementThresholdHigh values in the Workflow Configuration node to fit your specific industry benchmarks.
- **Modify AI criteria**: Edit the system prompt in the AI Content Safety Audit node to check for specific brand values, keywords, or tone-of-voice requirements.
- **Change output**: Replace the Slack node with an Email node (Gmail/Outlook) or a Notion node if you prefer to store reports in a project management database.
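The engagement Code node could look like the sketch below: average (likes + comments) / followers across recent posts, then compare against the two thresholds. The per-post field names and the exact formula are assumptions; the 1% / 10% defaults match the configuration described above.

```javascript
// Hypothetical sketch of the engagement-analysis Code node. Rates outside
// [low, high] are flagged "Suspicious" (too low → possible fake followers,
// too high → possible bot farms). Post field names are illustrative.
function auditEngagement(posts, followers, low = 0.01, high = 0.10) {
  const perPost = posts.map(p => (p.likes + p.comments) / followers);
  const rate = perPost.reduce((sum, r) => sum + r, 0) / perPost.length;
  const status = (rate < low || rate > high) ? 'Suspicious' : 'OK';
  return { engagementRate: rate, status };
}
```

The returned `engagementRate` and `status` map directly onto the Audit Results columns listed in the setup steps.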
by Rajeet Nair
Overview

This workflow automates CSV data processing from upload to database insertion. It accepts CSV files via webhook, uses AI to detect the schema and standardize columns, cleans and validates the data, and stores it in Postgres. Errors are logged separately, and notifications are sent for visibility.

How It Works

1. CSV Upload: A webhook receives CSV files for processing.
2. Validation: The workflow checks that the uploaded file is valid CSV; invalid files are rejected with an error report.
3. Data Extraction: The CSV is parsed into structured rows for further processing.
4. Schema Detection: AI analyzes the data to infer column types, normalize column names, and detect inconsistencies.
5. Data Normalization: Values are cleaned and converted into proper formats (numbers, dates, booleans), with optional unit standardization.
6. Data Quality Validation: The workflow checks for type mismatches, missing values, and statistical outliers.
7. Conditional Processing: Clean data is prepared and inserted into Postgres; errors produce a detailed report.
8. Database Insert: Valid data is stored in the configured Postgres table.
9. Error Logging: Errors are logged to Google Sheets for tracking and debugging.
10. Notifications: A Slack message is sent with the processing results.

Setup Instructions

1. Configure the webhook endpoint for CSV uploads.
2. Set your Postgres table name in the configuration node.
3. Add Anthropic/OpenAI credentials for schema detection.
4. Connect Slack for notifications.
5. Connect Google Sheets for error logging.
6. Configure the error threshold settings.
7. Test with sample CSV files.
8. Activate the workflow.

Use Cases

- Cleaning and standardizing messy CSV data
- Automating ETL pipelines
- Preparing data for analytics or dashboards
- Validating incoming data before database storage
- Monitoring data quality with error reporting

Requirements

- n8n instance with webhook access
- Postgres database
- OpenAI or Anthropic API access
- Slack workspace
- Google Sheets account

Notes

- You can customize the schema rules and normalization logic in the Code node.
- Adjust error thresholds based on your data tolerance.
- Extend validation rules for domain-specific requirements.
- Replace Postgres or Sheets with other storage systems if needed.
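The value-coercion part of the normalization step can be sketched as a single function. Real type inference is done by the AI schema-detection step; this only illustrates the deterministic coercion of raw CSV strings into numbers, booleans, and trimmed text, and the exact rules are assumptions.

```javascript
// Hypothetical sketch of CSV value normalization: coerce raw strings
// into typed values. Empty cells become null; everything else that
// doesn't match a number or boolean stays a trimmed string.
function normalizeValue(raw) {
  const v = String(raw).trim();
  if (v === '') return null;                         // missing value
  if (/^(true|false)$/i.test(v)) return v.toLowerCase() === 'true';
  if (/^-?\d+(\.\d+)?$/.test(v)) return Number(v);   // int or decimal
  return v;                                          // keep as text
}
```

Returning `null` for empty cells (rather than an empty string) lets the downstream quality-validation step count missing values accurately.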
by Fahmi Fahreza
Create Airtable records from new ClickUp Doc pages

This workflow automates the process of turning content from ClickUp Docs into structured data in Airtable. When a new task is created in ClickUp with a link to a ClickUp Doc in its name, the workflow triggers, fetches the entire content of that Doc, parses it into individual records, and creates a new record for each item in a specified Airtable base and table.

Who's it for

This template is perfect for content creators, project managers, and operations teams who use ClickUp Docs for drafting or knowledge management and Airtable for tracking and organizing data. It helps bridge the gap between unstructured text and a structured database.

How it works

1. Trigger: The workflow starts when a new task is created in a specific ClickUp Team.
2. Fetch & Parse URL: It gets the new task's details and extracts the ClickUp Doc URL from the task name.
3. Get Doc Content: It uses the URL to fetch the main Doc and all its sub-pages from the ClickUp API.
4. Process Content: A Code node parses the text from each page. It is designed to split content by * * * and to separate notes by looking for the "notes:" keyword.
5. Find Airtable Destination: The workflow finds the correct Airtable Base and Table IDs by matching the names you provide.
6. Create Records: It loops through each parsed content piece and creates a new record in your specified Airtable table.

How to set up

1. Configure the Set node: Open the "Configure Variables" node and set the following values:
   - clickupTeamId: your ClickUp Team ID. Find it in your ClickUp URL (e.g., app.clickup.com/9014329600/...).
   - airtableBaseName: the exact name of your target Airtable Base.
   - airtableTableName: the exact name of your target Airtable Table.
   - airtableVerticalsTableName: the name of the table in your base that holds "Vertical" records, which are linked in the main table.
2. Set up credentials: Add your ClickUp (OAuth2) and Airtable (Personal Access Token) credentials to the respective nodes.
3. Airtable fields: Ensure your Airtable table has fields corresponding to the ones in the Create New Record in Airtable node (e.g., Text, Status, Vertical, Notes). You can customize the mapping in this node.
4. Activate the workflow: Save and activate it.
5. Test: Create a new task in your designated ClickUp team. In the task name, include the full URL of the ClickUp Doc you want to process.

How to customize the workflow

- **Parsing logic**: Change how the content is parsed by modifying the JavaScript in the Parse Content from Doc Pages Code node. For example, you could change the delimiter from * * * to something else.
- **Field mapping**: Adjust the Create New Record in Airtable node to map data to different fields or add more fields from the source data.
- **Trigger events**: Modify the Trigger on New ClickUp Task node to respond to different events, such as taskUpdated or taskCommentPosted.
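The Process Content step (splitting on `* * *` and separating notes at the "notes:" keyword) could be sketched as below. This is a minimal illustration of the described logic; the template's actual Code node may differ in details such as case handling or empty-chunk filtering.

```javascript
// Hypothetical sketch of the Parse Content Code node: split a page's
// text on the "* * *" delimiter, then split each chunk into body and
// notes at the "notes:" keyword (case-insensitive).
function parsePage(text) {
  return text
    .split('* * *')
    .map(chunk => {
      const idx = chunk.toLowerCase().indexOf('notes:');
      if (idx === -1) return { text: chunk.trim(), notes: '' };
      return {
        text: chunk.slice(0, idx).trim(),
        notes: chunk.slice(idx + 'notes:'.length).trim(),
      };
    })
    .filter(item => item.text.length > 0); // drop empty chunks
}
```

Each returned object would become one Airtable record, with `text` and `notes` mapped to the corresponding fields.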
by Cheng Siong Chin
How It Works

This workflow automates intelligent fleet operations management for transport operators, logistics companies, and smart mobility teams. It solves the problem of manually triaging high-volume vehicle telemetry data, a process prone to delays, missed safety thresholds, and inconsistent service prioritisation.

Incoming vehicle telemetry is received via webhook and validated by a Telemetry Validation Agent using an AI model and output parser. Validated data is passed to a Fleet Orchestration Agent that coordinates three specialist sub-agents: a Safety Compliance Sub-Agent (checking thresholds and escalating breaches), a Service Scheduling Sub-Agent (optimising maintenance windows), and a Customer Email Notification Tool. The orchestrator then routes each case by service priority (urgent, high, normal, or low), triggering the appropriate service preparation step and logging traceability data to Google Sheets. Compliance escalations follow a parallel path: checking status, preparing reports, and logging to Sheets. All branches converge into a unified webhook response, ensuring downstream systems receive a consistent, structured reply.

Setup Steps

1. Import the workflow and configure the webhook trigger URL.
2. Add AI model credentials to the Validation Agent, the Orchestration Agent, and both Sub-Agents.
3. Connect Gmail credentials to the Customer Email Notification Tool.
4. Link Google Sheets credentials and set the target sheet IDs for the Safety Traceability and Compliance Escalation logs.
5. Configure the Fleet Management API Tool and the Urgent Service API Call with your fleet service endpoint URLs.
6. Set the safety threshold values in the Safety Threshold Calculator node.

Prerequisites

- OpenAI API key (or a compatible LLM)
- Gmail account with OAuth credentials
- Google Sheets with the log tabs pre-created

Use Cases

- Logistics fleets auto-triaging vehicle fault alerts by severity

Customisation

- Swap OpenAI for any LangChain-compatible model

Benefits

- Eliminates manual telemetry triage, reducing response lag
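The priority routing performed by the orchestrator could be sketched as a simple lookup. The four priority names come from the description above; the branch identifiers and the fallback to the normal branch are assumptions for illustration.

```javascript
// Hypothetical sketch: route a validated telemetry case to one of the
// four priority branches. Branch names are illustrative; unknown
// priorities fall back to the normal path.
function routeByPriority(priority) {
  const branches = {
    urgent: 'urgentServiceApiCall', // e.g. the Urgent Service API Call
    high: 'expeditedServicePrep',
    normal: 'standardServicePrep',
    low: 'deferredServicePrep',
  };
  return branches[priority] ?? branches.normal;
}
```

In the real workflow this mapping would typically live in a Switch node fed by the orchestration agent's structured output.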