by n8n Team
## Who this template is for
This template is for developers, content creators, or application builders who want to integrate an AI-powered text-to-image generation service into their applications or systems via an API endpoint.

## Use case
Creating a secure API endpoint that converts text prompts into AI-generated images, with built-in content moderation to prevent inappropriate content generation. This can be used for creative applications, content creation tools, prototyping interfaces, or any system that needs on-demand image generation.

## How this workflow works
1. Receives text prompt through a webhook endpoint
2. Filters the prompt for inappropriate content using AI moderation
3. Submits valid prompts to the Fal.ai Flux image generation service
4. Polls for completion status and retrieves the generated image when ready
5. Returns the image results in a structured JSON format to the client

## Set up steps
1. Create a Fal.ai account and obtain API credentials
2. Configure the HTTP Header Auth credentials with your Fal.ai API key
3. Set up an OpenAI API key for the content moderation component
4. Deploy the workflow and note the webhook URL for your API endpoint
5. Test the endpoint by sending a POST request with a JSON body containing a "prompt" field (see the sketch below)
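Step 5 mentions testing with a POST request. A minimal test client might look like this sketch; the URL is a placeholder for your own webhook endpoint, and the exact response shape depends on how your final node formats the results:

```javascript
// Minimal test client for the webhook endpoint (sketch).
// Replace the URL with the webhook URL n8n shows for your deployed workflow.
const WEBHOOK_URL = "https://your-n8n-instance.example.com/webhook/text-to-image";

async function generateImage(prompt) {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json(); // structured JSON with the generated image result
}

generateImage("a watercolor fox in a snowy forest").then(console.log);
```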
by Preston Zeller
## How It Works
This workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these steps:
1. When a new lead is received through your CRM webhook, the workflow captures their address information
2. It then makes an API call to BatchData to retrieve comprehensive property details
3. A sophisticated scoring algorithm evaluates the lead based on property characteristics like (a hedged sketch of this scoring logic appears at the end of this description):
   - Property value (higher values earn more points)
   - Square footage (larger properties score higher)
   - Property age (newer constructions score higher)
   - Investment status (non-owner-occupied properties earn bonus points)
   - Lot size (larger lots receive additional score)
4. Leads are automatically classified into categories (high-value, qualified, potential, or unqualified)
5. The workflow updates your CRM with enriched property data and qualification scores
6. High-value leads trigger immediate follow-up tasks for your team
7. Notifications are sent to your preferred channel (Slack in this example)

The entire process happens within seconds of receiving a new lead, ensuring your sales team can prioritize the most valuable opportunities immediately.

## Who It's For
This workflow is perfect for:
- Real estate agents and brokers looking to prioritize high-value property leads
- Mortgage lenders who need to qualify borrowers based on property assets
- Home service providers (renovators, contractors, solar installers) targeting specific property types
- Property investors seeking specific investment opportunities
- Real estate marketers who want to segment audiences by property value
- Home insurance agents qualifying leads based on property characteristics

Any business that bases lead qualification on property details will benefit from this automated qualification system.

## About BatchData
BatchData is a comprehensive property data provider that offers detailed information about residential and commercial properties across the United States. Their API provides:
- Property valuation and estimates
- Ownership information
- Property characteristics (size, age, bedrooms, bathrooms)
- Tax assessment data
- Transaction history
- Occupancy status (owner-occupied vs. investment)
- Lot details and dimensions

By integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling more intelligent lead scoring and routing based on actual property characteristics rather than just contact information. This workflow demonstrates how to leverage BatchData's property API to transform your lead qualification process from manual research into an automated, data-driven system that ensures high-value leads receive immediate attention.
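The template doesn't publish its exact weights, so the following Function-node sketch only illustrates the shape of such a scoring algorithm. The field names (`propertyValue`, `squareFootage`, etc.), point values, and thresholds are all assumptions to adapt to BatchData's actual response schema:

```javascript
// Hypothetical n8n Function node: score a lead from enriched property data.
// All field names, point values, and thresholds are illustrative.
const p = items[0].json; // property details returned by the BatchData call

let score = 0;
if (p.propertyValue > 500000) score += 30;        // higher values earn more points
else if (p.propertyValue > 250000) score += 15;
if (p.squareFootage > 2500) score += 15;          // larger properties score higher
if (new Date().getFullYear() - p.yearBuilt < 15) score += 10; // newer construction
if (p.ownerOccupied === false) score += 20;       // investment properties get a bonus
if (p.lotSizeSqFt > 10000) score += 10;           // larger lots add points

// Classify into the four categories used by the workflow.
let category = "unqualified";
if (score >= 60) category = "high-value";
else if (score >= 40) category = "qualified";
else if (score >= 20) category = "potential";

return [{ json: { ...p, leadScore: score, leadCategory: category } }];
```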
by Marth
## How It Works ⚙️
This workflow acts as a communication bridge for your candidate pipeline:
1. **Webhook Trigger (Status Update):** 🚀 The workflow activates when it receives data indicating a candidate's status has changed. This data could come from an internal form, a custom script, or a webhook from a basic Applicant Tracking System (ATS).
2. **Extract & Prepare Data (Function):** 🧹 This node processes the incoming data. It extracts key information such as the candidate's name, the position they applied for, their previous status (if available), and their new status. It then formats this information into a clear, concise message suitable for a notification.
3. **Send Slack Notification:** 📢 The prepared message is sent to a designated Slack channel (e.g., #recruitment-updates). This provides instant, real-time updates to your team, ensuring everyone is on the same page.
4. **(Alternative: Send Email Notification):** This node can easily be swapped with a Gmail or SendGrid node to send email notifications to a predefined list of recipients instead of Slack.

## How to Set Up 🛠️
Follow these steps carefully to get your "Automated Candidate Status Notifier" workflow up and running:
1. **Import Workflow JSON:**
   - Open your n8n instance.
   - Click on 'Workflows' in the left sidebar.
   - Click the '+' button or 'New' to create a new workflow.
   - Click the '...' (More Options) icon in the top right.
   - Select 'Import from JSON' and paste the entire JSON code for this workflow.
2. **Configure Webhook Trigger (Status Update):**
   - Locate the 'Webhook Trigger (Status Update)' node (1. Webhook Trigger).
   - Activate the workflow. n8n will provide a unique 'Webhook URL'.
   - Crucial step: configure your data-sending system (e.g., a form submission, an ATS's webhook settings, or your custom script) to send candidate status update data (preferably in JSON format via POST request) to this n8n Webhook URL.
3. **Configure Extract & Prepare Data (Function):**
   - Locate the 'Extract & Prepare Data' node (2. Extract & Prepare Data).
   - Adjust field names: review the functionCode inside this node. You MUST adjust the variable assignments (e.g., inputData.candidateName, inputData.position) to accurately match the exact field names your sending system uses for candidate name, position, new status, old status, and notes. Use the 'Test Workflow' feature after sending a test webhook to inspect the incoming items[0].json.body data structure (a sample sketch appears after these setup steps).
   - The node automatically formats messages for Slack and Email.
4. **Configure Send Slack Notification:**
   - Locate the 'Send Slack Notification' node (3. Send Slack Notification).
   - Credentials: select your existing Slack API credential or click 'Create New' to set one up. Replace YOUR_SLACK_CREDENTIAL_ID with the actual ID or name of your credential from your n8n credentials.
   - Channel: replace YOUR_SLACK_CHANNEL_ID_OR_NAME with the exact ID or name of the Slack channel where you want to receive notifications (e.g., #recruitment-updates).
5. **OPTIONAL: Switch to Email Notification (Gmail/SendGrid/etc.):**
   - Delete the 'Send Slack Notification' node.
   - Add a new 'Gmail' or 'SendGrid' (or your preferred email service) node and configure its credentials.
   - Set the 'To Email' field (e.g., your-team-email@example.com).
   - Set the 'Subject' to ={{ $json.emailSubject }} and the 'HTML' body to ={{ $json.emailBody }}.
   - Connect it from the 'Extract & Prepare Data' node.
6. **Review and Activate:**
   - Thoroughly review all node configurations. Ensure all placeholder values (like YOUR_...) are replaced and settings are correct.
   - Click the 'Save' button in the top right corner.
   - Finally, toggle the 'Inactive' switch to 'Active' to enable your workflow. 🟢 Your automated candidate status notifier is now live, keeping your team updated in real-time!
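Since field names depend entirely on your sending system, the following Function-node sketch is only illustrative: `candidateName`, `position`, `newStatus`, `oldStatus`, and `notes` are assumed field names you must map to your own payload.

```javascript
// Hypothetical 'Extract & Prepare Data' Function node body.
// Adjust the right-hand field names to match what your system actually sends.
const inputData = items[0].json.body || items[0].json;

const candidateName = inputData.candidateName || "Unknown candidate";
const position = inputData.position || "Unknown position";
const oldStatus = inputData.oldStatus || "n/a";
const newStatus = inputData.newStatus || "n/a";
const notes = inputData.notes || "";

const line = `Candidate ${candidateName} (${position}): ${oldStatus} → ${newStatus}`;

return [{
  json: {
    slackMessage: `📢 ${line}${notes ? `\nNotes: ${notes}` : ""}`,
    emailSubject: `Candidate status update: ${candidateName}`,
    emailBody: `<p>${line}</p>${notes ? `<p>Notes: ${notes}</p>` : ""}`,
  },
}];
```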
by Usman Liaqat
This workflow listens for incoming WhatsApp messages that contain media (e.g., images) and automatically downloads the media file using WhatsApp's private media URL.

## How it works
1. The trigger node activates when a WhatsApp message with media is received.
2. The media ID is extracted from the message payload.
3. A private media URL is retrieved using the media ID.
4. The media file is downloaded using an authenticated HTTP request (sketched below).

Ideal for:
- Archiving WhatsApp media to external systems.
- Triggering further automations based on received media.
- Integrating with cloud storage like Google Drive, Dropbox, or Amazon S3.

## Set up steps
1. Connect your WhatsApp Business API account.
2. Add HTTP credentials for downloading media via the private URL.
3. Set up the webhook in your WhatsApp Business account.
4. Extend the workflow as needed for your use case (e.g., file storage, alerts).
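If you are on the WhatsApp Cloud API, the two HTTP steps correspond roughly to the following sketch. The Graph API version and token handling here are assumptions; match them to your own setup:

```javascript
// Sketch of the two HTTP requests the workflow performs (WhatsApp Cloud API).
// ACCESS_TOKEN and the API version are placeholders for your own configuration.
const ACCESS_TOKEN = process.env.WHATSAPP_TOKEN;

async function downloadMedia(mediaId) {
  // Step 1: exchange the media ID for a short-lived private URL.
  const meta = await fetch(`https://graph.facebook.com/v19.0/${mediaId}`, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  }).then((r) => r.json());

  // Step 2: download the file itself; the private URL also requires the bearer token.
  const file = await fetch(meta.url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  return Buffer.from(await file.arrayBuffer()); // binary data, e.g. an image
}
```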
by Friedemann Schuetz
Welcome to my Airbnb Telegram Agent Workflow! This workflow creates an intelligent Telegram bot that helps users search and find Airbnb accommodations using natural language queries and voice messages.

DISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-mcp-client Community Node!

## What this workflow does
This workflow processes incoming Telegram messages (text or voice) and provides personalized Airbnb accommodation recommendations. The AI agent understands natural language queries, searches through Airbnb data using MCP tools, and returns mobile-optimized results with clickable links, prices, and key details.

Key Features:
- Voice message support (speech-to-text and text-to-speech)
- Conversation memory for context-aware responses
- Mobile-optimized formatting for Telegram
- Real-time Airbnb data access via MCP integration

This workflow has the following sequence:
1. Telegram Trigger - Receives incoming messages from users
2. Text or Voice Switch - Routes based on message type
3. Voice Processing (if applicable) - Downloads and transcribes voice messages
4. Text Preparation - Formats text input for the AI agent
5. Airbnb AI Agent - Core logic that:
   - Lists available MCP tools for Airbnb data
   - Executes searches with parsed parameters
   - Formats results for mobile display
6. Response Generation - Sends formatted text response
7. Voice Response (optional) - Creates and sends audio summary

## Requirements
- **Telegram Bot API**: Documentation
  - Create a bot via @BotFather on Telegram
  - Get bot token and configure webhook
- **OpenAI API**: Documentation
  - Used for speech transcription (Whisper)
  - Used for chat completion (GPT-4)
  - Used for text-to-speech generation
- **MCP Community Client Node**: Documentation
  - Custom integration for Airbnb data
  - Requires MCP server setup with Airbnb/Airtable connection
  - Provides tools for accommodation search and details

Important: You need to set up an MCP server with Airbnb data access. The workflow uses MCP tools to retrieve real accommodation data, so ensure your MCP server is properly configured with the Airtable/Airbnb integration.

Configuration Notes:
- Update the Telegram chat ID in the trigger for your specific bot
- Modify the system prompt in the Airbnb Agent for different use cases
- The workflow supports individual users and can be extended for group chats

Feel free to contact me via LinkedIn if you have any questions!
by Mohamed Abubakkar
## How it Works
The workflow runs automatically every day and collects analytics data for both today and yesterday. It cleans and standardizes both datasets in the same way so they are easy to compare. After that, it measures how performance has changed from one day to the next and interprets those changes to understand trends and context. Once all calculations are finished, the AI creates a clear, easy-to-read summary of what happened. This summary is then formatted and sent through the required communication channels, while the final data is saved for tracking over time and for creating follow-up tasks if needed.

Key Features:
- Trigger runs once per day (11:57 PM recommended).
- Fetches separate data for today and yesterday in dedicated nodes.
- Compares the two days' data and highlights when traffic drops.
- Adds a lowTraffic trend flag for low-traffic identification.
- Uses GPT-4 Mini for a human-readable summary suited to different communication channels.
- Sends the report to WhatsApp / Email.
- Stores the final structured data in Google Sheets for future analytics and historical records.

## Setup Steps
1. Connect Required Credentials. You must connect the following credentials:
   - Google Analytics API
   - Google Sheets
   - OpenAI API
   - Email SMTP
   - WhatsApp API
   - ClickUp API
2. Replace Default Values. Update the workflow with:
   - Your Google Analytics IDs
   - Your Google Sheet tabs
   - Your SMTP credentials, sender, and recipients
   - Your OpenAI API key
   - Your WhatsApp API credentials
   - Your ClickUp API keys
3. Customize Email Template. Modify the subject, message body, or formatting style based on your reporting standards.
4. Adjust Trigger. You may choose:
   - Manual Trigger
   - Cron Trigger for daily/weekly reports
   - Webhook Trigger integrated with your system

## Detailed Process Flow
**Schedule Trigger** (Trigger Node): Automates the start of the workflow. Runs every day (or every hour if real-time monitoring is needed), eliminating manual data collection and ensuring consistent reporting.

**Analytics Reports** (Google Analytics Node): Fetches website performance data. Metrics include users, sessions, and page views.

**Combine the Data** (Merge Node, Append): Combines today's and yesterday's datasets into a single item for comparison, preparing the data for percentage-change calculations and maintaining the proper structure for further nodes.

**Calculate Percent Changes** (Function Node): Computes day-over-day percentage changes for users, sessions, and page views using the formula ((today - yesterday) / yesterday) * 100. It handles increases and decreases correctly and outputs values used for trend indicators and alerts (see the Function-node sketch at the end of this template).

**Generate AI Summary** (OpenAI Node): Produces human-readable, professional insights about the daily analytics. It summarizes key changes, trends, and recommendations, provides context such as low-traffic warnings, and its text output is used in emails, WhatsApp, and ClickUp tasks.

**Send Email or WhatsApp to Dedicated Person / Marketing Team** (Email Node / WhatsApp): Sends daily alert or report messages, including the formatted metrics and AI summary. The subject and body clearly indicate trends and recommendations.

## Workflow Benefits
- Fully automated daily GA reporting
- AI-generated summaries for clear insights
- Alerts only triggered when necessary
- Historical logging for trends and dashboards
- Actionable tasks automatically created in ClickUp
- Multi-channel delivery via Email and WhatsApp
- Handles low-traffic scenarios gracefully
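The percent-change formula is stated above. This Function-node sketch shows one way to apply it; the field names (todayUsers, etc.) and the -20% low-traffic threshold are assumptions you would tune to your own merged data:

```javascript
// Sketch of the 'Calculate Percent Changes' Function node.
// Field names and the -20% threshold are assumptions, not the template's exact code.
const d = items[0].json;

function pctChange(today, yesterday) {
  if (!yesterday) return null; // avoid division by zero when yesterday is 0
  return ((today - yesterday) / yesterday) * 100;
}

const usersChange = pctChange(d.todayUsers, d.yesterdayUsers);
const sessionsChange = pctChange(d.todaySessions, d.yesterdaySessions);
const pageViewsChange = pctChange(d.todayPageViews, d.yesterdayPageViews);

return [{
  json: {
    ...d,
    usersChange,
    sessionsChange,
    pageViewsChange,
    // Flag the day as low traffic if users dropped by more than 20%.
    trend: usersChange !== null && usersChange < -20 ? "lowTraffic" : "normal",
  },
}];
```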
by Miquel Colomer
# 🎯 Precision Prospecting: Automate LinkedIn Lead Gen with n8n & Bright Data

## 📝 Overview
This workflow turns n8n into an AI-powered prospector, automatically searching Google for LinkedIn profiles, scraping profile data via Bright Data, and summarizing key details. Ideal for sales and recruitment teams seeking targeted lead lists without manual research.

## 🎥 Workflow in Action
Want to see this workflow in action? You have a chat window output below:

## 🔑 Key Features
- **AI Chat Trigger**: Start prospecting via conversational prompts.
- **Contextual Memory**: Retains the last 20 messages for coherent dialogue.
- **Automated Google Search**: Generates site-restricted queries and fetches the top result.
- **Bright Data Scraping**: Synchronously scrapes LinkedIn profile details by URL.
- **Intelligent Filtering**: Extracts only valid LinkedIn profile links.
- **Limit Control**: Returns a single, most relevant profile per request.
- **LLM Summary**: Uses GPT-4o-mini to interpret and present scraped data.

## 🚀 How It Works (Step-by-Step)
Prerequisites:
- n8n ≥ v1.0 with community nodes: install n8n-nodes-brightdata (not a verified community node).
- API credentials: OpenAI, Bright Data (web unlocker zone "web_unlocker1").
- Webhook endpoint for the chat trigger.

Node Configuration:
1. When chat message received (chatTrigger): Fires on user prompt.
2. Simple Memory1 (memoryBufferWindow): Stores the last 20 chat messages.
3. AI Prospector Agent (agent): Orchestrates the search logic.
4. Get 1 Google Result (brightData): Performs a Google search with site:linkedin.com/in.
5. Get Links from Body (html): Extracts all `<a>` hrefs from the search result page.
6. Extract Links (splitOut): Splits out individual link entries.
7. Filter only LinkedIn Profiles (filter): Ensures the URL contains "linkedin.com/" and starts with "https://" (see the filter sketch below).
8. Limit (limit): Restricts output to the first valid profile URL.
9. Search LinkedIn URI (toolWorkflow): Passes the URL to a secondary workflow to fetch the first link.
10. Get LinkedIn Profile Data (brightDataTool): Scrapes the profile JSON.
11. OpenAI Chat Model (lmChatOpenAi): Summarizes and formats the scraped data.

Workflow Logic:
- The user asks for a person by company & name, company & position, or LinkedIn URL.
- The Agent builds a Google query (e.g., site:linkedin.com/in bright data cmo) and calls "Get 1 Google Result."
- Extracted links are filtered and limited to the top valid profile.
- If the user provided a direct LinkedIn URL, the Agent skips search and scrapes immediately.
- Scraped profile JSON is passed to GPT-4o-mini to generate a concise summary.

Testing & Optimization:
- Trigger via Execute Workflow for dry runs.
- Inspect intermediate node outputs in n8n's Execution panel.
- Adjust maxIterations or memory window length for performance.
- Tune Bright Data zone or country settings to optimize scraping speed.

Deployment & Monitoring:
- Activate the workflow and expose its webhook URL.
- Use n8n's built-in Alerts or external monitoring (e.g., Slack notifications) on failures.
- Rotate credentials via n8n's Credential Vault when needed.
- Version-control the workflow via duplicates or Git-backed n8n instances.

## ✅ Pre-requisites
- **OpenAI Account**: API key for GPT-4o-mini.
- **Bright Data Account**: Zone "web_unlocker1" and dataset gd_l1viktl72bvl7bjuj0.
- **n8n Version**: v1.0+ with community nodes installed.
- **Permissions**: Webhook access, Credential Vault read/write.

## 👤 Who Is This For?
- Sales teams automating outbound LinkedIn prospecting.
- Recruiters sourcing candidates without manual scraping.
- Marketing ops looking to enrich CRM with accurate profile data.
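The filter criteria in step 7 are given above; expressed as plain JavaScript, the link filtering and limiting amount to roughly this sketch (the `links` array stands in for the hrefs extracted from the search result page):

```javascript
// Sketch of the link-filtering and limit steps, as plain JavaScript.
const links = [
  "https://www.linkedin.com/in/some-profile",
  "http://example.com/not-linkedin",
  "https://www.google.com/search?q=...",
];

const profiles = links
  .filter((url) => url.startsWith("https://") && url.includes("linkedin.com/"))
  .slice(0, 1); // the Limit node keeps only the first valid profile URL

console.log(profiles); // ["https://www.linkedin.com/in/some-profile"]
```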
## 📈 Benefits & Use Cases
- **Efficiency**: Reduces hours of manual search and data entry to seconds.
- **Accuracy**: Filters out non-LinkedIn links and ensures high-quality results.
- **Scalability**: Handles multiple prospect requests concurrently via chat or API.
- **Integration**: Easily hooks into CRMs or email sequencers downstream.

Workflow created and verified by Miquel Colomer https://www.linkedin.com/in/miquelcolomersalas/ and N8nHackers https://n8nhackers.com
by Hubschrauber
## Overview
This template describes a possible approach to handle a pseudo-callback/trigger from an independent, external process (initiated from a workflow) and combine the received input with the workflow execution that is already in progress. This requires the external system to pass through some context information (resumeUrl), but allows the "primary" workflow execution to continue with BOTH its own (previous-node) context AND the input received in the "secondary" trigger/process.

### Primary Workflow Trigger/Execution
The workflow path from the primary trigger initiates some external, independent process and provides "context" which includes the value of $execution.resumeUrl. This execution then reaches a Wait node configured with Resume - On Webhook Call and stops until a call to resumeUrl is received.

### External, Independent Process
The external, independent process could be anything, like a Telegram conversation or a web service, as long as:
- it results in a single execution of the Secondary Workflow Trigger, and
- it can pass through the value of resumeUrl associated with the Primary Workflow Execution.

### Secondary Workflow Trigger/Execution
The secondary workflow execution can start with any kind of trigger as long as part of the input can include the resumeUrl. To combine/rejoin the primary workflow execution, this execution passes along whatever it receives from its trigger input to the resume-webhook endpoint on the Wait node (a minimal sketch follows the notes below).

### Notes
- **IMPORTANT:** The workflow IDs in the Set nodes marked **Update Me** have embedded references to the workflow IDs in the original system. They will need to be CHANGED to make this demo work.
- Note: The Resume Other Workflow Execution node in the template uses the $env.WEBHOOK_URL configuration to convert to an internal "localhost" call in a Docker environment. This can be done differently.
- **ALERT:** This pattern is NOT suitable for a workflow that handles multiple items, because the first workflow execution will only be waiting for one callback.
- The second workflow (not the second trigger in the first workflow) is just to demonstrate how the Independent, External Process needs to work.
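For illustration, here is roughly what the forwarding step amounts to if written as plain JavaScript. In the template itself this is an HTTP Request node, and the payload shape shown is an assumption:

```javascript
// Sketch: the secondary execution forwards its trigger input to the Wait node's
// resume webhook, waking the primary execution with this data attached.
async function resumePrimary(resumeUrl, triggerPayload) {
  const res = await fetch(resumeUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(triggerPayload),
  });
  return res.ok;
}

// Example: the external process passed resumeUrl through, and we echo back
// whatever the secondary trigger received (payload contents are hypothetical).
resumePrimary(
  "https://n8n.example.com/webhook-waiting/12345", // $execution.resumeUrl from the primary run
  { approved: true, comment: "looks good" }
);
```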
by Baptiste Fort
Still manually copy-pasting your Tally form responses? What if every submission went straight into Airtable — and the user got an automatic email right after? That's exactly what this workflow does. No code, no headache — just a simple and fast automation: Tally → Airtable → Gmail.

## STEP 1 — Capture Tally Form Responses
**Goal:** Trigger the workflow automatically every time someone submits your Tally form.

**What we're setting up:** A webhook that catches form responses and kicks off the rest of the flow.

Steps to follow:
1. Add a Webhook node with these parameters:
   - Method: POST
   - Path: formulaire-tally
   - Authentication: None
   - Respond: Immediately
2. Save the workflow → This will generate a URL like: https://your-workspace.n8n.cloud/webhook-test/formulaire-tally
   💡 Use the Test URL first (found under Parameters > Test URL)
3. Head over to Tally:
   - Go to your form → Form Settings > Integrations > Webhooks
   - Paste the Test URL into the Webhook field
   - Enable the webhook ✅
4. Submit a test entry → Tally won't send anything until a real submission is made. This step is required for n8n to capture the structure.

**Expected output:** n8n receives a JSON object containing:
- General info (IDs, timestamps, etc.)
- A fields[] array with all the form inputs (name, email, etc.)

Each field is nicely structured with a label, key, type, and most importantly, a value (see the payload sketch at the end of this template). Perfect foundation for the next step: data cleanup.

## STEP 2 — Clean and Structure the Form Data (Set node)
**Goal:** Take the raw data sent by Tally and turn it into clean, readable JSON that's easy to use in the rest of the workflow.

Tally sends the responses inside a big array called fields. Can you grab a field directly with something like {{ $json["fields"][0]["value"] }}? Yes. But a good workflow is like a sock drawer — when everything's folded and labeled, life's just easier. So we're going to clean it up using a Set node.

Steps to follow:
1. Add a Set node right after the Webhook.
2. Enable the "Keep only set" option.
3. Define the following fields in the Set node (Field name: Expression):
   - full_name: {{ $json["fields"][0]["value"] }}
   - company_name: {{ $json["fields"][1]["value"] }}
   - job_title: {{ $json["fields"][2]["value"] }}
   - email: {{ $json["fields"][3]["value"] }}
   - phone_number: {{ $json["fields"][4]["value"] ?? "" }}
   - submission_date: {{ $now.toISOString() }}

⚠️ The order of fields[] depends on your Tally form. If you change the question order, make sure to update these indexes accordingly.

**Expected output:** You'll get a clean, structured JSON object with one top-level key per field: full_name, company_name, job_title, email, phone_number, and submission_date. Now your data is clear, labeled, and ready for the rest of your workflow.

## STEP 3 — Save Data in Airtable
**Goal:** Every time someone submits your Tally form, their info is automatically added to an Airtable base. No more copy-pasting — everything lands right where it should.

Steps to follow:
1. Create your Airtable base. Start by creating a base named Leads (or whatever you prefer), with a table called Form Submissions. Add one column per Set-node field, in the same order, so everything maps correctly later.
2. Generate an Airtable token so n8n can send data into your base:
   - Go to 👉 https://airtable.com/create/tokens
   - Click Create token
   - Give it a name (e.g. Tally Automation)
   - Check the following permissions: data.records:read, data.records:write, schema.bases:read
   - Under Base access, either choose your base manually or select "All current and future bases"
   - Click Create token and copy the generated key
3. Add and configure the Airtable node in n8n:
   - Node: Airtable
   - Operation: Create
   - Authentication: Personal Access Token (paste your token)
   - n8n will suggest your base and table (or you can manually grab the IDs from the URL: https://airtable.com/appXXXXXXXX/tblYYYYYYYY/...)
4. Map your fields. Inside the Airtable node, map each Airtable column to the matching field from the Set node.

Every new Tally form submission automatically creates a new row in your Airtable base.

## STEP 4 — Send an Automatic Confirmation Email
**Goal:** Send a professional email as soon as a form is completed.

Steps to follow:
1. Add a Wait node. You don't want the email to go out instantly — it feels cold and robotic. → Add a Wait node right after Airtable.
   - Mode: Wait for a period of time
   - Delay: 5 to 10 minutes
   - Unit: Minutes
2. Add a Gmail > Send Email node.
   - Authentication: OAuth2
   - Connect a Gmail account (business or test)
   - ⚠️ No API keys here — Gmail requires OAuth.
3. Configure the Send Email node:
   - Credential to connect with: Gmail account via OAuth2
   - Resource: Message
   - Operation: Send
   - To: {{ $json.fields["Email"] }}
   - Subject: Thanks for reaching out!
   - Email Type: HTML
   - Message: your email body (do the mapping correctly using the Input so that the lead receives their name correctly)

## End of the Workflow
And that's it — your automation is live! Your lead fills out the Tally form → the info goes to Airtable → they get a clean, professional email without you doing a thing.
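For reference, the incoming Tally payload looks roughly like the following sketch. The labels, keys, and order here are hypothetical; your form's questions determine the real ones, which is why the Set node indexes by position:

```javascript
// Hypothetical shape of a Tally webhook payload as received by n8n.
// Labels, keys, and field order depend entirely on your own form.
const sampleTallyBody = {
  eventId: "evt_123",
  createdAt: "2024-05-01T10:15:00.000Z",
  fields: [
    { label: "Full name", key: "question_abc", type: "INPUT_TEXT", value: "Ada Lovelace" },
    { label: "Company", key: "question_def", type: "INPUT_TEXT", value: "Analytical Engines" },
    { label: "Job title", key: "question_ghi", type: "INPUT_TEXT", value: "CTO" },
    { label: "Email", key: "question_jkl", type: "INPUT_EMAIL", value: "ada@example.com" },
    { label: "Phone", key: "question_mno", type: "INPUT_PHONE_NUMBER", value: null },
  ],
};

// fields[0].value → "Ada Lovelace", which is why the Set node grabs values by index.
console.log(sampleTallyBody.fields[0].value);
```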
by Don Jayamaha Jr
# 🧪 Binance SM 1hour Indicators Tool
A precision trading signal engine that interprets 1-hour candlestick indicators for Binance Spot Market pairs using a GPT-4.1-mini LLM. Ideal for swing traders seeking directional bias and momentum clarity across medium timeframes.

🎥 Watch Tutorial:

## 🎯 Purpose
This tool provides a structured 1-hour market read using:
- **RSI** (Relative Strength Index)
- **MACD** (Moving Average Convergence Divergence)
- **BBANDS** (Bollinger Bands)
- **SMA & EMA** (Simple and Exponential Moving Averages)
- **ADX** (Average Directional Index)

It's invoked as a sub-agent in broader AI workflows, such as the Binance Financial Analyst Tool and the Spot Market Quant AI Agent.

## ⚙️ Key Features
| Feature | Description |
| --- | --- |
| 🔄 Subworkflow Trigger | Runs only when called by parent agent (not standalone) |
| 🧠 GPT-4.1-mini LLM | Translates numeric indicators into natural-language summaries |
| 📊 Real-time Data | Pulls latest 40×1h candles via internal webhook from Binance |
| 📥 Input Format | { "message": "ETHUSDT", "sessionId": "telegram_chat_id" } |
| 📤 Output Format | JSON summary + Telegram-friendly HTML overview |

## 💡 Example Output
📊 1h Technical Overview – ETHUSDT
• RSI: 59 (Neutral)
• MACD: Bullish Crossover
• BBANDS: Price at Upper Band
• EMA > SMA → Positive Slope
• ADX: 28 → Moderate Trend Strength

## 🧩 Use Cases
| Scenario | Result |
| --- | --- |
| Mid-frame market alignment | Verifies momentum between 15m and 4h timeframes |
| Quant AI Agent input | Supplies trend context for entry/exit decisions |
| Standalone medium-term signal snapshot | Validates swing trade setups or filters noise |

## 📦 Installation Instructions
1. Import workflow into your n8n instance
2. Confirm internal webhook /1h-indicators is live and authorized
3. Insert your OpenAI credentials for the GPT-4.1-mini node
4. Use only when triggered via:
   - Binance Financial Analyst Tool
   - Binance Spot Market Quant AI Agent

## 🧾 Licensing & Support
🔗 Don Jayamaha – LinkedIn: linkedin.com/in/donjayamahajr

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and signal logic are proprietary. Redistribution or commercial use requires explicit licensing. No unauthorized cloning permitted.
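As background for the indicators listed under Purpose: the tool's own signal logic is proprietary, but the standard textbook 14-period RSI it reports can be computed from closing prices like this sketch (illustrative only, not the template's code):

```javascript
// Standard 14-period RSI from closing prices (Wilder's smoothing).
function rsi(closes, period = 14) {
  let gains = 0, losses = 0;
  // Seed with the average gain/loss over the first `period` price changes.
  for (let i = 1; i <= period; i++) {
    const change = closes[i] - closes[i - 1];
    if (change >= 0) gains += change; else losses -= change;
  }
  let avgGain = gains / period, avgLoss = losses / period;
  // Smooth over the remaining candles.
  for (let i = period + 1; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    avgGain = (avgGain * (period - 1) + Math.max(change, 0)) / period;
    avgLoss = (avgLoss * (period - 1) + Math.max(-change, 0)) / period;
  }
  if (avgLoss === 0) return 100;
  return 100 - 100 / (1 + avgGain / avgLoss);
}

// The closes would come from the 40 fetched 1-hour candles; these are made up.
console.log(rsi([100, 101, 102, 101, 103, 104, 103, 105, 106, 105, 107, 108, 107, 109, 110, 111]));
```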
by Daniel Shashko
## How it Works
This workflow accepts meeting transcripts via webhook (Zoom, Google Meet, Teams, Otter.ai, or manual notes), immediately processing them through an intelligent pipeline that eliminates post-meeting admin work.

The system parses multiple input formats (JSON, form data, transcription outputs), extracting meeting metadata including title, date, attendees, transcript content, duration, and recording URLs. OpenAI analyzes the transcript to extract eight critical dimensions: executive summary, key decisions with ownership, action items with assigned owners and due dates, discussion topics, open questions, next steps, risks/blockers, and follow-up meeting requirements—all returned as structured JSON.

The intelligence engine enriches each action item with unique IDs, priority scores (weighing urgency + owner assignment + due date), status initialization, and meeting context links, then calculates a completeness score (0-100) that penalizes missing owners and undefined deadlines.

Multi-channel distribution ensures visibility: Slack receives formatted summaries with emoji categorization for decisions (✅), action items (🎯) with priority badges and owner assignments, and completeness scores (📊). Notion gets dual-database updates—meeting notes with formatted decisions and individual task cards in your action item database with full filtering and kanban capabilities. Task owners receive personalized HTML emails with priority color-coding and meeting context, while Google Calendar creates due-date reminders as calendar events.

Every meeting logs to Google Sheets for analytics tracking: attendee count, duration, action items created, priority distribution, decision count, completeness score, and follow-up indicators. The workflow returns a JSON response confirming successful processing with meeting ID, action item count, and executive summary. The entire pipeline executes in 8-12 seconds from submission to full distribution.

## Who is this for?
- Product and engineering teams drowning in scattered action items across tools
- Remote-first companies where verbal commitments vanish after calls
- Executive teams needing auditable decision records without dedicated note-takers
- Startups juggling 10+ meetings daily without time for manual follow-up
- Operations teams tracking cross-functional initiatives requiring accountability

## Setup Steps
- **Setup time:** 25-35 minutes
- **Requirements:** OpenAI API key, Slack workspace, Notion account, Google Workspace (Calendar/Gmail/Sheets), optional transcription service

1. Webhook Trigger: Automatically generates a URL; configure as a POST endpoint accepting JSON with title, date, attendees, transcript, duration, recording_url, organizer
2. Transcription Integration: Connect Otter.ai/Fireflies.ai/Zoom webhooks, or create a manual submission form
3. OpenAI Analysis: Add API credentials, configure GPT-4 or GPT-3.5-turbo, temperature 0.3, max tokens 1500
4. Intelligence Synthesis: JavaScript calculates priority scores (0-40 range) and completeness metrics (0-100); customize thresholds
5. Slack Integration: Create an app with chat:write scope, get the bot token, replace the channel ID placeholder with your #meeting-summaries channel
6. Notion Databases: Create a "Meeting Notes" database (title, date, attendees, summary, action items, completeness, recording URL) and an "Action Items" database (title, assigned to, due date, priority, status, meeting relation), share both with the integration, add the token
7. Email Notifications: Configure Gmail OAuth2 or SMTP, customize the HTML template with company branding
8. Calendar Reminders: Enable the Calendar API; creates events on due dates at 9 AM (adjustable), adds the task owner as an attendee
9. Analytics Tracking: Create a Google Sheet with columns for Meeting_ID, Title, Date, Attendees, Duration, Action_Items, High_Priority, Decisions, Completeness, Unassigned_Tasks, Follow_Up_Needed
10. Test: POST a sample transcript (see the sketch below), verify the Slack message, Notion entries, emails, calendar events, and Sheets logging

## Customization Guidance
- **Meeting Types:** Daily standups (reduce tokens to 500, Slack-only), sprint planning (add Jira integration), client calls (add CRM logging), executive reviews (stricter completeness thresholds)
- **Priority Scoring:** Add an urgency multiplier for <48hr due dates, owner seniority weights, customer impact flags
- **AI Prompt:** Customize to emphasize deadlines, blockers, or technical decisions; add date parsing for phrases like "by end of week"
- **Notification Routing:** Critical priority (score >30) → Slack DM + email, High (20-30) → channel + email, Medium/Low → email only
- **Tool Integrations:** Add Jira/Linear for ticket creation, Asana/Monday for project management, Salesforce/HubSpot for CRM logging, GitHub for issue creation
- **Analytics:** Build dashboards for meeting effectiveness scores, action item velocity, recurring topic clustering, team productivity metrics
- **Cost Optimization:** ~1,200 tokens/meeting × $0.002/1K (GPT-3.5) = $0.0024/meeting; use the batch API for a 50% discount and cache common patterns

Once configured, this workflow becomes your team's institutional memory—capturing every commitment and decision while eliminating hours of weekly admin work, ensuring accountability is automatic and follow-through is guaranteed.

Built by Daniel Shashko
Connect on LinkedIn
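The test submission referenced in setup step 10 could look like this sketch. The field names follow the webhook contract listed in step 1, while the URL and sample values are placeholders for your own setup:

```javascript
// Sketch: POST a sample transcript to the workflow's webhook for testing.
const WEBHOOK_URL = "https://your-n8n-instance.example.com/webhook/meeting-transcripts";

const sampleMeeting = {
  title: "Sprint Planning",
  date: "2024-05-01",
  attendees: ["alice@example.com", "bob@example.com"],
  transcript: "Alice: Let's ship the billing fix by Friday. Bob: I'll own it...",
  duration: 45, // minutes
  recording_url: "https://example.com/recordings/abc123",
  organizer: "alice@example.com",
};

fetch(WEBHOOK_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(sampleMeeting),
})
  .then((r) => r.json())
  // Expected response: meeting ID, action item count, and executive summary.
  .then(console.log);
```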
by jason
This workflow was originally presented at the February 2022 n8n Meetup.

## Requirements
In order to use this workflow, you will need the following in place:
- A configured Baserow account
- A group in Baserow called User Empowerment Demo
- A database in the User Empowerment Demo group called Office Shopping List
- Inside the Office Shopping List database, two tables:
  - Shopping List: Column 1 - Single line text column named Item
  - Shopper: Column 1 - Single line text column named Name; Column 2 - Email column named Email
- An email account for sending out alerts

## Customization
To make this workflow work for you, please customize the following items:
- All Baserow nodes will need to be updated with your own credentials, database, tables and fields
- The Send Shopping List node will need to be configured with your email credentials and email addresses
- The Create Shopper Form Set node will need to have the code in the HTML value modified to reflect your Production URL from the Submit Shopper node (see instructions below)
- The Cron node will need to be modified to reflect the timing that you wish to use

## Modifying the Webform
The webform is the piece that people normally want to customize but is often the most complex because it is raw HTML. Here are some quick tips for making changes to the form.

### Webform Nodes
There are two nodes that control what you see in the form:
- Create Shopper Form - displays the form and submits it to the correct webhook
- Create Response Page - displays the results when the form is submitted

### Editing the Webform
The easiest way that I have found to edit the webform is to:
1. Open up the Set node (Create Shopper Form or Create Response Page) that contains the HTML you wish to edit.
2. Copy the contents of the HTML value to your favourite HTML editor.
3. Make your changes.
4. Paste the updated HTML back into the Set node.

### Changing the Webhook URL the Webform Posts To
In order for the webform to work properly, do the following:
1. Determine the Production URL for the Submit Shopper webhook node.
2. In the Create Shopper Form node, look for the following line in the HTML value: `form action="https://tephlon.app.n8n.cloud/webhook/submit-shopper" method="POST"`
3. Replace https://tephlon.app.n8n.cloud/webhook/submit-shopper with your Production URL.

### Changing the Webform Image
The image that is in the webform is actually embedded in the HTML in each of the Create Shopper Form and Create Response Page Set nodes and can be modified from there using these steps:
1. Open up the appropriate Set node.
2. In the HTML value, find the line that starts with background-image:. It will be followed by a long string that looks like random characters.
3. Using a tool like Image to Base64 Converter, upload your image and generate a new CSS background source (or generate one yourself, as sketched below).
4. Replace the original background-image: line (including all the "random" characters) with the new generated CSS background source.
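If you'd rather generate the data URI yourself instead of using an online converter, a small Node.js script does the same job; the file path and MIME type here are placeholders for your own image:

```javascript
// Sketch: build a CSS background-image data URI from a local image file (Node.js).
const fs = require("fs");

const imagePath = "./my-form-image.png"; // placeholder path to your image
const base64 = fs.readFileSync(imagePath).toString("base64");
const css = `background-image: url("data:image/png;base64,${base64}");`;

// Paste this line into the Set node's HTML, replacing the old background-image line.
console.log(css);
```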