by Atharva
🧾 An intelligent automation system that turns Google Meet recordings into structured meeting notes — integrating Fireflies.ai, OpenAI GPT-4.1-mini, Notion, Slack, Google Drive, and Gmail via n8n.

🎥 Demo: Watch the Loom walkthrough

⚙️ What It Does

The Google Meet Notes Generator & Distributor automates the entire post-meeting workflow.

Workflow Summary:
- Fetches the Google Meet transcript via the Fireflies.ai webhook.
- Aggregates and summarizes the transcript using OpenAI GPT-4.1-mini into: Title, Summary, Decisions, Action Items, Risks, Questions.
- Stores the transcript as a text file on Google Drive and generates a shareable link.
- Creates a Notion page with all the meeting details.
- Posts the summary in the #meeting Slack channel.
- DMs each attendee on Slack with personalized meeting notes.
- Sends an email recap to all attendees via Gmail.

Result: No more manual note-taking or scattered updates — everything centralized, formatted, and instantly shareable.

💡 Use Cases

| Scenario | Description |
| --- | --- |
| Team Meeting Summaries | Automatically summarize and distribute meeting notes for internal teams. |
| Project Management | Keep structured decisions, action items, and risks documented for each project meeting. |
| Remote Teams | Notify distributed teams in Slack and via email without extra effort. |
| Client Updates | Share polished meeting summaries with clients instantly. |
| Knowledge Base / Notion | Archive structured notes in Notion for reference and compliance. |

🔧 Setup

1. Accounts and Tools Needed

| Tool | Purpose |
| --- | --- |
| 🤖 Fireflies.ai API | Fetch Google Meet transcripts via webhook |
| 🧠 OpenAI API | Summarize and structure transcript into notes |
| 📓 Notion API | Create structured meeting pages |
| 💬 Slack OAuth/Bot | Post summaries in channel and DM attendees |
| 🗂 Google Drive OAuth2 | Store transcripts and generate shareable links |
| ✉️ Gmail OAuth2 | Email meeting recaps to attendees |

2. Fireflies.ai Setup
- Get an API key from Fireflies.ai.
- Configure n8n → Credentials → Fireflies API.
- Ensure webhook triggers are set to send the meetingId to n8n.

3. OpenAI Setup
- Generate an API key at OpenAI.
- Add it to n8n → Credentials → OpenAI API.
- Use the GPT-4.1-mini model in the Agent and Generate Slack Message nodes.

4. Notion Setup
- Create a Notion integration and share your workspace with it.
- Add the API token in n8n → Credentials → Notion API.
- Map workspace/page IDs in the Notion Page node.

5. Slack Setup
- Create a Slack bot in your workspace with chat:write and users:read permissions.
- Add the OAuth token in n8n → Credentials → Slack API.
- Connect it to the Send a message #meeting and DMs to Attendees nodes.

6. Google Drive Setup
- Create a folder for transcripts.
- Enable the Google Drive API in the Google Cloud Console.
- Add OAuth2 credentials to n8n → Google Drive.

7. Gmail Setup
- Enable the Gmail API in Google Cloud.
- Add OAuth2 credentials in n8n → Gmail.
- Connect to the Email to Attendees node.

8. n8n Workflow Setup
- Import the provided n8n workflow JSON.
- Configure all credentials: Fireflies, OpenAI, Notion, Slack, Google Drive, Gmail.
- Activate the workflow.
- Test by sending a meeting ID via the webhook (see the sketch below).
- The workflow then runs automatically: Fetch → Summarize → Store → Notion → Slack → DM → Email.

🧠 Result

A fully automated AI pipeline that transforms Google Meet recordings into polished, shareable meeting notes — eliminating manual note-taking and keeping your team informed in real time.
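For step 8 of the setup, a minimal test call might look like the sketch below. The webhook URL and the meetingId field name are assumptions: copy the real URL from your Webhook node and match the field to what your Fireflies webhook actually sends.

```typescript
// Minimal test call for the Fireflies → n8n webhook (sketch, not the exact payload).
// WEBHOOK_URL and the meetingId field name are assumptions — use the URL from the
// Webhook node and check what Fireflies actually posts before relying on this.
const WEBHOOK_URL = "https://your-n8n-instance/webhook/meeting-notes"; // hypothetical path

async function sendTestMeeting(meetingId: string): Promise<void> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ meetingId }),
  });
  console.log(`Webhook responded with ${res.status}`);
}

sendTestMeeting("your-fireflies-meeting-id").catch(console.error);
```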
📞 Support & Contact

If you face any issues during setup or execution, contact:
📧 Email: atharvapj5@gmail.com
🔗 LinkedIn: Atharva Jaiswal
by Cheng Siong Chin
Introduction

Automates travel itinerary creation by searching flights and hotels via APIs, then uses AI to generate personalized recommendations delivered as HTML emails through Gmail.

How It Works

Webhook receives travel requests, searches Skyscanner and Booking.com APIs in parallel, merges results, uses AI to create optimized itineraries, formats as HTML email, sends via Gmail.

Workflow Template

Webhook → Parse & Validate → Parallel Searches (Flights: Skyscanner | Hotels: Booking.com) → Merge Data → AI Generate Itinerary → Format HTML Email → Send Gmail → Webhook Response

Workflow Steps

- Trigger & Validate: Webhook receives the request, extracts destination/dates/budget/preferences, validates the data, converts it to API parameters.
- Parallel Search: Skyscanner fetches flights with price/duration/airline. Booking.com retrieves hotels with ratings/pricing. Merge combines both into a single JSON object.
- AI Generation: AI analyzes the merged data, evaluates by price/duration/rating, creates an itinerary with daily schedule, pairings, costs, and rankings.
- Delivery: Converts the JSON to an HTML email with tables and booking links. Gmail sends the email. Webhook confirms success.

Setup Instructions

- API Configuration: Add Skyscanner and Booking.com API credentials in n8n.
- Gmail Setup: Configure OAuth2 authentication.
- Customization: Copy the webhook URL, adjust validation rules, modify AI prompts and the HTML template.

Prerequisites

- Skyscanner API key
- Booking.com API credentials
- Gmail with OAuth2
- n8n instance

Use Cases

- Personal vacation planning
- Business travel arrangements

Customization

- Add APIs (Kiwi, Expedia)
- Filter by budget
- Modify email design

Benefits

- Saves 2-3 hours per trip
- Real-time pricing comparison
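For testing, a request to the webhook could look roughly like this. The field names (destination, dates, budget, preferences) follow the Trigger & Validate step above, but the exact schema is whatever your Parse & Validate node expects, and the URL is a placeholder.

```typescript
// Hypothetical travel-request payload for the itinerary webhook.
// Field names mirror the "Trigger & Validate" step; adjust to your own validation rules.
interface TravelRequest {
  destination: string;
  dates: { depart: string; return: string }; // ISO dates
  budget: number;                            // total budget in your currency
  preferences: string[];                     // e.g. ["non-stop", "4-star hotel"]
}

const request: TravelRequest = {
  destination: "Tokyo",
  dates: { depart: "2025-03-10", return: "2025-03-17" },
  budget: 2500,
  preferences: ["non-stop", "near city center"],
};

async function requestItinerary(req: TravelRequest): Promise<void> {
  await fetch("https://your-n8n-instance/webhook/travel-itinerary", { // hypothetical URL
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
}

requestItinerary(request).catch(console.error);
```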
by Eric
Why use this?

This workflow turns any event-related text into a new event on your calendar.

- Poster for a concert you want to go to? Snap a photo [with your iPhone] and boom city, it's in your calendar. †
- Parent-Teacher conference you can't forget? Forward that email to the webhook. †
- Appointment card from the doctor? Snap it in, baby! †

How it works (Very, very simple)

1. Data received by webhook.
2. AI Agent prompted to parse the text into structured event data.
3. Create event in NextCloud cal (or Zoho, or GoogleCal).
4. (Optional, intended use case) Set up the iOS Shortcut (linked in workflow) to turn your iPhone into the trigger for this workflow. Say "Siri, Add Event To Calendar," and she opens the camera, OCRs the text in the photo and sends that to the webhook. Boom city.

Expected input structure

```json
[
  {
    "body": {
      "cal": "work", // optional: for deciding among calendars
      "eventInfo": "Join us for Betty-Jean's 98th birthday! (Yes, we celebrate every year now...) It's October 11th at 2:30pm, at Betty-Jean's house. Come after lunch 'cause her kitchen hasn't been used in 20 years. She mellows out pretty early these days so plan for the party to end by 4:00pm."
    }
  }
]
```

Extras

- **Includes multiple calendar nodes.** Whether you're using NextCloud, Zoho or Google Cal, you can swap in the node you need.
- **iOS Shortcut linked in workflow.** I also set up a Shortcut for the iPhone. The first time you use the Shortcut, you'll need to give it some permissions, and paste in your production webhook URL.
- **Expansion option: Accept images.** iPhone has a native OCR feature but this isn't always an option. To make this workflow more versatile, consider building out a second branch to send an image to an Agent to parse the event data from the image directly.
- **Expansion option: Multiple triggers.** You could add more triggers to receive event-related text from other sources, like an IMAP node reading your email (pro tip: set up a designated folder and give the IMAP access only to that folder).

† Workflow begins with a webhook which can receive correctly-formatted data from anywhere on the web --- mailhook, webform, iOS Shortcut, etc. Direct data to this webhook from your source of text to use this workflow.
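If you'd rather test without the iOS Shortcut, a small script can post to the webhook directly. Note that the structure above is how the item looks inside n8n; the raw POST body is just the inner object. The URL below is a placeholder, and the sample event text is made up.

```typescript
// Post event text to the webhook in the expected structure (test sketch).
// Replace the placeholder URL with your production webhook URL from n8n.
const payload = {
  cal: "work", // optional: chooses among calendars
  eventInfo:
    "Parent-teacher conference on October 11th at 2:30pm at Lincoln Elementary, room 204.",
};

async function addEvent(): Promise<void> {
  const res = await fetch("https://your-n8n-instance/webhook/add-event", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  console.log(await res.text());
}

addEvent().catch(console.error);
```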
by Cheng Siong Chin
Introduction

Exams create significant stress for students. This workflow automates syllabus analysis and predicts exam trends using AI, helping educators and students better prepare for GCE 'O' Level Mathematics in Singapore.

How It Works

Trigger → Fetch Syllabus → Extract & Prepare Data → Load History → AI Analyze → Parse → Format → Convert → Publish → Notify

Workflow Template

Manual Trigger → Fetch O-Level Math Syllabus → Extract Syllabus Text → Prepare Analysis Data → Load Historical Context → AI Analysis Agent → Parse AI Output → Format Report → Convert to HTML → Publish to WordPress → Send Slack Summary

Data Collection & AI Processing

HTTP retrieves the O-Level Math syllabus from SEAB and extracts the text. Loads 3-5 years of exam history. OpenRouter compares the syllabus vs trends and predicts topics with confidence scores.

Report Generation & Publishing

Formats AI insights to Markdown (topics, trends, recommendations), converts to HTML. Auto-publishes to WordPress and sends a Slack summary with the report link.

Workflow Steps

1. Fetch & extract syllabus from the SEAB site
2. Load historical exam content
3. AI analyzes syllabus + trends via OpenRouter model
4. Parse and format AI output to Markdown/HTML
5. Auto-publish report to WordPress and Slack

Setup Instructions

1. Connect the HTTP node to the SEAB syllabus URL
2. Configure the OpenRouter AI model with an API key
3. Set WordPress and Slack credentials for publishing

Prerequisites

OpenRouter account, WordPress API access, Slack webhook, SEAB syllabus link.

Use Cases

Predict 2025 GCE Math topics, generate AI insights, publish summaries for educators.

Customization

Adapt for other subjects or boards by changing the syllabus source and analysis prompt.

Benefits

Enables fast, data-driven exam forecasting and automated report publication.
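As one possible reading of the Parse AI Output → Format Report steps, here is a hedged sketch of turning predictions into a Markdown report. The prediction fields (topic, confidence, rationale) are assumptions about the AI output, not taken from the workflow itself.

```typescript
// Sketch of the "Parse AI Output → Format Report" step as it might look in a Code node.
// The output shape (topic, confidence, rationale) is an assumption — align it with
// whatever structure your OpenRouter prompt actually asks for.
interface TopicPrediction {
  topic: string;
  confidence: number; // 0–1
  rationale: string;
}

function formatReport(predictions: TopicPrediction[]): string {
  const rows = predictions
    .sort((a, b) => b.confidence - a.confidence)
    .map((p) => `| ${p.topic} | ${(p.confidence * 100).toFixed(0)}% | ${p.rationale} |`);
  return [
    "## Predicted GCE O-Level Mathematics Topics",
    "",
    "| Topic | Confidence | Rationale |",
    "| --- | --- | --- |",
    ...rows,
  ].join("\n");
}
```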
by Hunyao
Upload a PDF and instantly get a neatly formatted Google Doc with all the readable text—no manual copy-paste, no messy line breaks.

What this workflow does

- Accepts PDF uploads via a public form
- Sends the file to Mistral Cloud for high-accuracy OCR
- Detects and merges page images with their extracted text
- Cleans headers, footers, broken lines, and noise
- Creates a new Google Doc in your chosen Drive folder
- Writes the polished markdown text into the document

What you need

- Mistral Cloud API key with OCR access
- Google Docs & Drive credentials connected in n8n
- Drive folder ID for new documents
- A PDF file to process (up to 100 MB)

Setup

1. Import the workflow into n8n and activate credentials.
2. In Trigger • Form Submission, copy the webhook URL and share it or embed it.
3. In Create • Google Doc, replace the default folder ID with yours.
4. Fill out the Mistral API key under Mistral Cloud API credentials.
5. Save and activate the workflow.
6. Visit the form, upload a PDF, name your future doc, and submit.
7. Open Drive to view your newly generated, clean Google Doc.

Example use cases

- Convert annual reports into editable text for analysis.
- Extract readable content from scan-only invoices for bookkeeping.
- Turn magazine PDFs into draft blog posts.
- Digitize lecture handouts for quick search and annotation.
- Convert image-heavy landing pages / advertorials into editable text for AI to analyze structure and content.
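The cleanup step ("Cleans headers, footers, broken lines, and noise") could be approximated with heuristics like the sketch below. This is illustrative only; the actual workflow does its cleanup inside n8n and its rules may differ.

```typescript
// Rough sketch of the cleanup pass described above (merge broken lines, drop repeated
// headers/footers). The heuristics here are assumptions, not the workflow's own logic.
function cleanOcrMarkdown(raw: string): string {
  const lines = raw.split("\n");
  // Count repeats: lines that appear on many pages are likely running headers/footers.
  const counts = new Map<string, number>();
  for (const l of lines) counts.set(l.trim(), (counts.get(l.trim()) ?? 0) + 1);
  const kept = lines.filter((l) => {
    const t = l.trim();
    // Keep blank lines; drop frequently repeated lines and bare page numbers.
    return t === "" ? true : (counts.get(t) ?? 0) < 4 && !/^\d+$/.test(t);
  });
  return kept
    .join("\n")
    .replace(/([^.!?:\n])\n(?!\n)/g, "$1 ") // join lines broken mid-sentence
    .replace(/\n{3,}/g, "\n\n");            // collapse excess blank lines
}
```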
by Ranjan Dailata
Who this is for

This workflow is designed for:

- Recruiters, Talent Intelligence Teams, and HR tech builders automating resume ingestion.
- Developers and data engineers building ATS (Applicant Tracking Systems) or CRM data pipelines.
- AI and automation enthusiasts looking to extract structured JSON data from unstructured resume sources (PDFs, DOCs, HTML, or LinkedIn-like URLs).

What problem this workflow solves

Resumes often arrive in different formats (PDF, DOCX, web profile, etc.) that are difficult to process automatically. Manually extracting fields like candidate name, contact info, skills, and experience wastes time and is prone to human error.

This workflow:

- Converts any unstructured resume into a structured JSON Resume format.
- Ensures the output aligns with the JSON Resume Schema.
- Saves the structured result to Google Sheets and local disk for easy tracking and integration with other tools.

What this workflow does

The workflow automates the entire resume parsing pipeline:

- **Step 1: Trigger.** Starts manually with an Execute Workflow button.
- **Step 2: Input Setup.** A Set node defines the resume_url (e.g., a hosted resume link).
- **Step 3: Resume Content Extraction.** Sends the URL to the Thordata Universal API, which retrieves the web content, cleans HTML/CSS, and extracts structured text and metadata.
- **Step 4: Convert HTML → Markdown.** Converts the HTML content into Markdown to prepare it for AI model parsing.
- **Step 5: JSON Resume Builder (AI Extraction).** Sends the Markdown to OpenAI GPT-4.1-mini, which extracts basics (name, email, phone, location), work (companies, roles, achievements), education (institutions, degrees, dates), skills, projects, certifications, languages, and more. The output adheres to the JSON Resume Schema.
- **Step 6: Output Handling.** Saves the final structured resume locally to disk and appends it to a Google Sheet for analytics or visualization.

Setup

Prerequisites

- n8n instance (self-hosted or cloud)
- Credentials for:
  - Thordata Universal API (HTTP Bearer Token); first-time users need to sign up for a Thordata account
  - OpenAI API Key
  - Google Sheets OAuth2 integration

Steps

1. Import the provided workflow JSON into n8n.
2. Configure your Thordata Universal API token under Credentials → HTTP Bearer Auth.
3. Connect your OpenAI account under Credentials → OpenAI API.
4. Link your Google Sheets account (used in the Append or update row in sheet node).
5. Replace the resume_url in the Set node with your own resume file or hosted link.
6. Execute the workflow.

How to customize this workflow

- Input Sources: Replace the Manual Trigger with a Webhook Trigger (to accept resumes uploaded from your website) or a Google Drive / Dropbox Trigger (to process uploaded files automatically).
- Output Destinations: Send results to Notion, Airtable, or Supabase via API nodes, or to Slack / Email for recruiter notifications.
- Language Model Options: You can upgrade from gpt-4.1-mini → gpt-4.1 or a custom fine-tuned model for improved accuracy.

Summary

Unstructured Resume Parser with Thordata Universal API + OpenAI GPT-4.1-mini — automates the process of converting messy, unstructured resumes into clean, structured JSON data. It leverages Thordata’s Universal API for document ingestion and preprocessing, then uses OpenAI GPT-4.1-mini to extract key fields such as name, contact details, skills, experience, education, and achievements with high accuracy.
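For orientation, here is a trimmed TypeScript view of the JSON Resume shape that the AI extraction step targets. Only a subset of fields is shown, and the JSON Resume Schema itself remains the authoritative reference.

```typescript
// Trimmed view of the JSON Resume output the GPT-4.1-mini step is asked to produce.
// Only a subset of fields is listed; see the JSON Resume Schema for the full definition.
interface JsonResume {
  basics: {
    name: string;
    email?: string;
    phone?: string;
    location?: { city?: string; countryCode?: string };
  };
  work: Array<{
    name: string;        // company
    position: string;
    startDate?: string;
    endDate?: string;
    highlights?: string[]; // achievements
  }>;
  education: Array<{ institution: string; studyType?: string; startDate?: string; endDate?: string }>;
  skills: Array<{ name: string; keywords?: string[] }>;
  projects?: Array<{ name: string; description?: string }>;
  certifications?: Array<{ name: string; date?: string }>;
  languages?: Array<{ language: string; fluency?: string }>;
}
```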
by Daniel
Transform any website into a custom logo in seconds with AI-powered analysis—no design skills required!

📋 What This Template Does

This workflow receives a website URL via webhook, captures a screenshot and fetches the page content, then leverages OpenAI to craft an optimized prompt based on the site's visuals and text. Finally, Google Gemini generates a professional logo image, which is returned as a binary response for immediate use.

- Automates screenshot capture and content scraping for comprehensive site analysis
- Intelligently generates tailored logo prompts using multimodal AI
- Produces high-quality, context-aware logos with Gemini's image generation
- Delivers the logo directly via webhook response

🔧 Prerequisites

- n8n self-hosted or cloud instance with webhook support
- ScreenshotOne account for website screenshots
- OpenAI account with API access
- Google AI Studio account for Gemini API

🔑 Required Credentials

ScreenshotOne API Setup

1. Sign up at screenshotone.com and navigate to Dashboard → API Keys
2. Generate a new access key with screenshot permissions
3. In the workflow, replace "[Your ScreenshotOne Access Key]" in the "Capture Website Screenshot" node with your key (no n8n credential needed—it's an HTTP query param)

OpenAI API Setup

1. Log in to platform.openai.com → API Keys
2. Create a new secret key with chat completions access
3. Add to n8n as "OpenAI API" credential type and assign to the "OpenAI Prompt Generator" node

Google Gemini API Setup

1. Go to aistudio.google.com/app/apikey
2. Create a new API key (free tier available)
3. Add to n8n as "Google PaLM API" credential type and assign to the "Generate Logo Image" node

⚙️ Configuration Steps

1. Import the workflow JSON into your n8n instance
2. Assign the required credentials to the OpenAI and Google Gemini nodes
3. Replace the placeholder API key in the "Capture Website Screenshot" node's query parameters
4. Activate the workflow to enable the webhook
5. Test by sending a POST request to the webhook URL with JSON body: {"websiteUrl": "https://example.com"}

🎯 Use Cases

- **Marketing teams prototyping brand assets**: Quickly generate logo variations for client websites during pitches, saving hours on manual design
- **Web developers building portfolios**: Auto-create matching logos for new sites to enhance visual consistency in demos
- **Freelance designers iterating ideas**: Analyze competitor sites to inspire custom logos without starting from scratch
- **Educational projects on AI design**: Teach students how multimodal AI combines text and images for creative outputs

⚠️ Troubleshooting

- **Screenshot fails (timeout/error)**: Increase the "timeout" param to 120s or check URL accessibility; verify API key and quotas at screenshotone.com
- **Prompt generation empty**: Ensure the OpenAI credential has sufficient quota; test the node in isolation with a simple query
- **Logo image blank or low-quality**: Refine the prompt in "Generate Logo Prompt" for more specifics (e.g., add style keywords); check Gemini API limits
- **Webhook not triggering**: Confirm POST method and JSON body format; view execution logs for payload details
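Step 5 of the configuration can be scripted as below. The webhook URL is a placeholder, and the response is assumed to be the binary logo image described above; check the content type in your own runs.

```typescript
// Test call for the logo webhook. The URL is a placeholder and the response is assumed
// to be the binary image described above — verify the content type in your own setup.
import { writeFile } from "node:fs/promises";

async function generateLogo(websiteUrl: string): Promise<void> {
  const res = await fetch("https://your-n8n-instance/webhook/generate-logo", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ websiteUrl }),
  });
  const bytes = Buffer.from(await res.arrayBuffer());
  await writeFile("logo.png", bytes);
  console.log(`Saved ${bytes.length} bytes to logo.png`);
}

generateLogo("https://example.com").catch(console.error);
```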
by Angel Menendez
Who it's for

This workflow is ideal for YNAB users who frequently shop on Amazon and want their transaction memos to automatically show itemized purchase details. It's especially helpful for people who import bank transactions into YNAB and want to keep purchase records tidy without manual entry.

How it works

The workflow triggers on a set schedule, via a webhook, or manually. It retrieves all unapproved transactions from your YNAB budget, filters for Amazon purchases with empty memo fields, and processes each transaction individually. Using Gmail, it searches for matching Amazon emails (within ±5 days of the transaction date) and sends the email data to an AI agent powered by OpenAI. The AI extracts product names and prices, generating a concise memo line (up to 499 characters). If no valid purchase info is found, a fallback message is added instead. A 5-second delay prevents API rate limiting.

How to set up

1. Connect your YNAB account with valid API credentials.
2. Connect Gmail with OAuth2 authentication.
3. Add your OpenAI (or other LLM) API credentials.
4. Configure the schedule trigger or use manual/webhook start.
5. Run the workflow and monitor execution logs in n8n.

Requirements

- YNAB API credentials
- Gmail OAuth2 connection
- OpenAI API key (or another compatible AI model)

How to customize

You can change the AI model (e.g., Gemini or Claude) or add HTML-to-Markdown conversion to lower token costs. Adjust the wait node delay to fit your API rate limits or modify the email date range for greater accuracy.

Security note: Never store or share API keys or personal email data directly in the workflow. Use credential nodes to manage sensitive information securely.
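The filtering step (unapproved transactions that are Amazon purchases with empty memo fields) might look like this outside n8n. The field names follow the YNAB API (approved, payee_name, memo), but verify them against what your YNAB node actually returns.

```typescript
// Sketch of the filter: keep unapproved Amazon transactions with empty memos.
// Field names are assumed from the YNAB API; double-check against the YNAB node output.
interface YnabTransaction {
  id: string;
  date: string;
  approved: boolean;
  payee_name: string | null;
  memo: string | null;
}

function needsAmazonMemo(txns: YnabTransaction[]): YnabTransaction[] {
  return txns.filter(
    (t) =>
      !t.approved &&
      (t.payee_name ?? "").toLowerCase().includes("amazon") &&
      !(t.memo ?? "").trim()
  );
}
```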
by Roshan Ramani
Who's it for

Business owners, marketers, and web developers who want to instantly respond to website contact form submissions and maintain organized lead records without manual monitoring.

What it does

This workflow automatically processes contact form submissions from your website, sending immediate WhatsApp notifications with formatted lead details while simultaneously logging all data to Google Sheets for organized lead management and follow-up tracking.

How it works

When someone submits your website contact form, the webhook instantly receives the data, formats it into a professional WhatsApp message with emojis and structure, sends the notification to your phone, and logs all details (name, email, phone, service, message, timestamp) to a Google Sheets database for permanent storage and analysis.

Requirements

- WhatsApp Business API credentials
- Google Sheets API access with a spreadsheet containing these columns:
  - date (for timestamp)
  - name (contact's full name)
  - email (contact's email address)
  - phone (contact's phone number)
  - service (requested service/interest)
  - message (contact's message/inquiry)
- Website contact form that can POST to the webhook URL with fields: name, email, phone, service, message
- n8n instance (self-hosted or cloud)

Google Sheets Setup

Create a new Google Sheet with the following column headers in row 1:

- Column A: date
- Column B: name
- Column C: email
- Column D: phone
- Column E: service
- Column F: message

The workflow will automatically populate these columns with each form submission and use the date column for duplicate checking.

How to set up

1. Credentials Setup:
   - Configure WhatsApp Business API credentials in the WhatsApp node
   - Set up the Google Sheets API connection and grant necessary permissions
2. Configuration:
   - Update the recipient phone number in the WhatsApp node (format: +1234567890)
   - Replace the Google Sheets document ID with your spreadsheet ID
   - Ensure your Google Sheet has the required column structure mentioned above
3. Integration:
   - Copy the webhook URL from the Contact Form Trigger node
   - Configure your website form to POST data to this endpoint with field names: name, email, phone, service, message
4. Testing:
   - Test the workflow by submitting a sample form entry
   - Verify the WhatsApp notification is received and data appears in Google Sheets

How to customize the workflow

- **Message Format:** Modify the WhatsApp message template in the Format Lead Data node
- **Additional Fields:** Add more form fields by updating both the Code node and Google Sheets mapping
- **Email Notifications:** Include email alerts by adding an Email node after the Format Lead Data node
- **Conditional Logic:** Set up different notifications for high-priority services or VIP customers
- **Data Validation:** Add filtering rules in the Code node to handle spam or invalid submissions
- **Multiple Recipients:** Configure the WhatsApp node to send alerts to multiple team members
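On the website side, a submit handler along these lines posts the expected fields to the webhook. The URL is a placeholder; copy the real one from the Contact Form Trigger node.

```typescript
// Minimal front-end submit handler that posts the expected fields to the webhook.
// The webhook URL is a placeholder — use the one copied from the Contact Form Trigger node.
async function submitContactForm(form: HTMLFormElement): Promise<void> {
  // Expected keys: name, email, phone, service, message
  const data = Object.fromEntries(new FormData(form).entries());
  await fetch("https://your-n8n-instance/webhook/contact-form", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(data),
  });
}
```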
by Julian Reich
This n8n template demonstrates how to automatically convert voice messages from Telegram into structured, searchable notes in Google Docs using AI transcription and intelligent tagging.

Use cases are many: Try capturing ideas on-the-go while walking, recording meeting insights hands-free, creating voice journals, or building a personal knowledge base from spoken thoughts!

Good to know

- OpenAI Whisper transcription costs approximately $0.006 per minute of audio
- ChatGPT tagging adds roughly $0.001-0.003 per message depending on length
- The workflow supports both German and English voice recognition
- Text messages are also supported - they bypass transcription and go directly to AI tagging
- Perfect companion: Combine with the **Weekly AI Review** workflow for automated weekly summaries of all your notes!

How it works

1. Telegram receives your voice message or text and triggers the workflow
2. An IF node intelligently detects whether you sent audio or text content
3. For voice messages: Telegram downloads the audio file and OpenAI Whisper transcribes it to text
4. For text messages: Content is passed directly to the next step
5. ChatGPT analyzes the content and generates up to 3 relevant keywords (Work, Ideas, Private, Health, etc.)
6. A function node formats everything with Swiss timestamps, message type indicators, and clean structure
7. The formatted entry gets automatically inserted into your Google Doc with date, keywords, and full content
8. Telegram sends you a confirmation with the transcribed/original text so you can verify accuracy

How to use

- Simply send a voice message or text to your Telegram bot - the workflow handles everything automatically
- The manual execution can be used for testing, but in production this runs on every message
- Voice messages work best with clear speech in quiet environments for optimal transcription

Requirements

- Telegram Bot Token and configured webhook
- OpenAI API account for Whisper transcription and ChatGPT tagging
- Google Docs API access for document writing
- A dedicated Google Doc where all notes will be collected

Customising this workflow

- Adjust the AI prompt to use different tagging categories relevant to your workflow (e.g., project names, priorities, emotions)
- Add multiple Google Docs for different contexts (work vs. private notes)
- Include additional processing like sentiment analysis or automatic task extraction
- Connect to other apps like Notion, Obsidian, or your preferred note-taking system

And don't forget to also implement the complementary workflow Weekly AI Review!
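Step 6 (the formatting function node) might look roughly like the sketch below. The keyword limit and the voice/text markers are assumptions; adapt them to your own document layout.

```typescript
// Sketch of the formatting step (in n8n this lives in a Code/Function node).
// The 🎤/⌨️ markers and the three-keyword limit are assumptions — adjust to your setup.
function formatEntry(text: string, keywords: string[], isVoice: boolean): string {
  const timestamp = new Date().toLocaleString("de-CH", { timeZone: "Europe/Zurich" });
  const typeMarker = isVoice ? "🎤 Voice" : "⌨️ Text";
  return [
    `${timestamp} · ${typeMarker}`,
    `Keywords: ${keywords.slice(0, 3).join(", ")}`,
    "",
    text,
    "",
    "---",
  ].join("\n");
}
```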
by Meak
Form Lead Scoring with AI → Google Sheets + Slack

When a new lead fills out your Typeform or any other form, this workflow classifies the message with AI, stores it in the right Google Sheet tab, and can send your team a Slack alert.

Benefits

- Get new leads in real time from any form
- Classify each lead with AI (hot / neutral / cold)
- Save leads to the correct Google Sheets tab automatically
- Send Slack alerts for hot leads so you act fast
- Keep your pipeline clean and easy to scan

How It Works

1. Webhook receives a new form submission
2. Parse name, email, phone, message, and timestamp
3. AI analyzes the message and returns hot / neutral / cold
4. Route to the matching Google Sheets tab (Hot, Neutral, Cold)
5. (Optional) Post a Slack message with key details

Who Is This For

- Agencies running paid ads and lead forms
- Sales teams that need quick triage
- Coaches, creators, and SaaS teams with waitlists

Setup

1. Connect your form tool (Typeform or other) to a webhook
2. Add Google Gemini (or your AI model) credentials
3. Connect Google Sheets (Spreadsheet ID + Tab names)
4. (Optional) Connect Slack and select a channel
5. Test with a few submissions to check routing

ROI & Monetization

- Respond faster to hot leads → higher close rates
- Save 2–4 hours/week on manual sorting
- Offer as “AI lead scoring” for clients ($500–$2k/month)

Strategy Insights

In the full walkthrough, I show how to:

- Write a short, reliable prompt for clear labels
- Map form fields cleanly into Sheets
- Format Slack alerts for quick reading
- Expand with auto-replies or CRM sync later

Check Out My Channel

For more AI automation systems that get real results, check out my YouTube channel where I share the exact strategies I use to build automation workflows, win clients, and scale to $20k+ monthly revenue.
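The routing step (hot / neutral / cold → Sheets tab, plus the optional Slack alert) can be reduced to a small mapping like this. The tab names mirror the steps above; the fallback behaviour is an assumption.

```typescript
// Sketch of routing the AI label onto a Sheets tab and flagging hot leads for Slack.
// Tab names and the label set mirror the workflow description; fallback is an assumption.
type LeadLabel = "hot" | "neutral" | "cold";

function routeLead(label: string): { sheetTab: string; notifySlack: boolean } {
  const normalized = label.trim().toLowerCase() as LeadLabel;
  const tabs: Record<LeadLabel, string> = { hot: "Hot", neutral: "Neutral", cold: "Cold" };
  return {
    sheetTab: tabs[normalized] ?? "Neutral", // fall back if the model returns something unexpected
    notifySlack: normalized === "hot",
  };
}
```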
by Dietmar
Build a PDF to Vector RAG System: Mistral OCR, Weaviate Database and MCP Server

A comprehensive RAG (Retrieval-Augmented Generation) workflow that transforms PDF documents into searchable vector embeddings using advanced AI technologies.

🚀 Features

- **PDF Document Processing**: Upload and extract text from PDF files using Mistral's OCR capabilities
- **Vector Database Storage**: Store document embeddings in the Weaviate vector database for efficient retrieval
- **AI-Powered Search**: Search through documents using semantic similarity with Cohere embeddings
- **MCP Server Integration**: Expose the knowledge base as an AI tool through MCP (Model Context Protocol)
- **Document Metadata**: Basic document metadata including filename, content, source, and upload timestamp
- **Text Chunking**: Automatic text splitting for optimal vector storage and retrieval

🛠️ Technologies Used

- **Mistral AI**: OCR and text extraction from PDF documents
- **Weaviate**: Vector database for storing and retrieving document embeddings
- **Cohere**: Multilingual embeddings and reranking for improved search accuracy
- **MCP (Model Context Protocol)**: AI tool integration for external AI workflows
- **n8n**: Workflow automation and orchestration

📋 Prerequisites

Before using this template, you'll need to set up the following credentials:

- Mistral Cloud API: For PDF text extraction
- Weaviate API: For vector database operations
- Cohere API: For embeddings and reranking
- HTTP Header Auth: For MCP server authentication

🔧 Setup Instructions

1. Import the template into your n8n instance
2. Configure credentials for all required services
3. Set up a Weaviate collection named "KnowledgeDocuments"
4. Configure webhook paths for the MCP server and form trigger
5. Test the workflow by uploading a PDF document

📊 Workflow Overview

```
PDF Upload → Text Extraction → Document Processing → Vector Storage → AI Search
     ↓              ↓                   ↓                  ↓             ↓
Form Trigger → Mistral OCR  →  Prepare Metadata  →  Weaviate DB  →  MCP Server
```

🎯 Use Cases

- **Knowledge Base Management**: Create searchable repositories of company documents
- **Research Documentation**: Process and search through research papers and reports
- **Legal Document Search**: Index and search through legal documents and contracts
- **Technical Documentation**: Make technical manuals and guides searchable
- **Academic Literature**: Process and search through academic papers and publications

⚠️ Important Notes

- **Model Consistency**: Use the same embedding model for both storage and retrieval
- **Collection Management**: Ensure your Weaviate collection is properly configured
- **API Limits**: Be aware of rate limits for Mistral, Cohere, and Weaviate APIs
- **Document Size**: Consider chunking large documents for optimal processing

🔗 Related Resources

- n8n Documentation
- Weaviate Documentation
- Mistral AI Documentation
- Cohere Documentation
- MCP Protocol Documentation

📝 License

This template is provided as-is for educational and commercial use.
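For the Document Size note above, a basic chunking pass could look like this. The chunk and overlap sizes are illustrative defaults, not values taken from the workflow's text splitter.

```typescript
// Simple fixed-size chunking with overlap, as suggested under "Document Size".
// Chunk and overlap sizes are illustrative defaults; tune them to your embedding model.
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```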