by Samir Saci
**Tags**: Sustainability, Business Travel, Carbon Emissions, Flight Tracking, Carbon Interface API

**Context**

Hi! I’m Samir — a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen Consulting. I help companies monitor and reduce their environmental footprint by combining AI automation, carbon estimation APIs, and workflow automation. This workflow is part of our sustainability reporting initiative, allowing businesses to track the CO₂ emissions of employee flights.

> Automate carbon tracking for your business travel with AI-powered workflows in n8n!

📬 For business inquiries, feel free to connect with me on LinkedIn

**Who is this template for?**

This workflow is designed for travel managers, sustainability teams, or finance teams who need to measure and report on emissions from business travel.

Let’s imagine your company receives a flight confirmation email: the AI Agent reads the email and extracts structured data, such as flight dates, airport codes, and the number of passengers. The Carbon Interface API is then called to estimate CO₂ emissions, which are stored in a Google Sheet for sustainability reporting.

**How does it work?**

This workflow automates the end-to-end process of tracking flight emissions, from email to CO₂ estimation:

- 📨 Gmail Trigger captures booking confirmations
- 🧠 AI Agent extracts structured data (airports, dates, flight numbers)
- ✈️ Each flight leg is processed individually
- 🌍 Carbon Interface API returns distance and carbon emissions
- 📄 A second Google Sheets node appends the emission data for reporting

Steps:

1. 💌 Trigger on a new flight confirmation email
2. 🧠 Extract structured trip data using the AI Agent (flights, airports, dates)
3. 📑 Store flight metadata in Google Sheets
4. 🧭 For each leg, call the Carbon Interface API
5. 📥 Append distance, CO₂ in kg, and timestamp to the flight row

**What do I need to get started?**

You’ll need:

- A Gmail account receiving SAP Concur or travel confirmation emails
- A Google Sheet to record trip metadata and CO₂ emissions
- A free Carbon Interface API key
- Access to OpenAI for parsing the email via the AI Agent
- A few sample flight confirmation emails for testing

**Next Steps**

🗒️ Use the sticky notes on the n8n canvas to:

- Add your Gmail and Carbon Interface credentials
- Send a sample booking email to your inbox
- Verify that emissions and distances are correctly added to your sheet

This template was built using n8n v1.93.0

Submitted: June 7, 2025
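The per-leg API call in step 4 can be sketched as follows. This is a minimal illustration based on the public Carbon Interface "estimates" endpoint; the helper name and the example airports/passenger count are assumptions, not values from the template itself.

```javascript
// Build the request body the workflow sends to Carbon Interface for one
// flight estimate. Field names follow the public Carbon Interface docs;
// passenger count and airport codes come from the AI Agent in the workflow.
function buildFlightEstimate(passengers, legs) {
  return {
    type: "flight",
    passengers,
    // Carbon Interface expects lowercase IATA codes for each leg
    legs: legs.map(({ from, to }) => ({
      departure_airport: from.toLowerCase(),
      destination_airport: to.toLowerCase(),
    })),
  };
}

// Example: one-way Paris CDG -> New York JFK for two travellers
const body = buildFlightEstimate(2, [{ from: "CDG", to: "JFK" }]);

// In the n8n HTTP Request node this becomes:
//   POST https://www.carboninterface.com/api/v1/estimates
//   Authorization: Bearer <CARBON_INTERFACE_API_KEY>
// The response carries data.attributes.carbon_kg and distance_value,
// which the workflow appends to the Google Sheet.
console.log(JSON.stringify(body));
```

The workflow loops this call once per flight leg, so a round trip produces two estimates that are appended as separate rows.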
by Samir Saci
**Tags**: Ghost CMS, SEO Audit, Image Optimisation, Alt Text, Google Sheets, Automation

**Context**

Hi! I’m Samir — a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen Consulting. I help companies and content creators use automation and analytics to improve visibility, enhance performance, and reduce manual work.

> Let’s use n8n to automate SEO audits and increase your traffic!

📬 For business inquiries, feel free to connect on LinkedIn

**Who is this template for?**

This workflow is perfect for bloggers, marketers, or content teams using Ghost CMS who want to:

- Extract and review all images from articles
- Detect missing or short alt texts
- Check image file size and filename SEO compliance
- Push the audit results into a Google Sheet

**How does it work?**

This n8n workflow extracts all blog posts from Ghost CMS, scans the HTML to collect all embedded images, then evaluates each image for:

- ✅ Presence and length of alt text
- 📏 File size in kilobytes
- 🔤 Filename SEO quality (e.g. lowercase, hyphenated, no special characters)

All findings are written to Google Sheets for further analysis or manual cleanup.

🧭 Workflow steps:

1. 🚀 Trigger the workflow manually or on a schedule
2. 📰 Extract blog post content from Ghost CMS
3. 🖼️ Parse all `<img>` tags with their `src` and `alt` attributes
4. 📤 Store image metadata in a Google Sheet (step 1)
5. 🌐 Download each image using an HTTP request
6. 🧮 Extract file size, extension, and a filename SEO flag
7. 📄 Update the audit sheet with size and format insights

**What do I need to get started?**

This workflow requires:

- A Ghost Content API key
- A Google Sheet (to log audit results)
- No AI or external APIs — it works fully with built-in nodes

**Next Steps**

🗒️ Follow the sticky notes inside the workflow to:

- Plug in your Ghost blog credentials
- Select or create a Google Sheet
- Run the audit and start improving your SEO!

This template was built using n8n v1.93.0

Submitted: June 8, 2025
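The per-image checks described above (alt-text presence/length and filename SEO quality) could look like this in an n8n Code node. The alt-length threshold and the exact filename rule are illustrative assumptions, not values taken from the template.

```javascript
// Audit one image for the three SEO signals the workflow logs:
// alt presence, alt length, and filename compliance.
function auditImage({ src, alt }) {
  const filename = src.split("/").pop().split("?")[0];
  const base = filename.replace(/\.[a-z0-9]+$/i, ""); // strip extension
  return {
    filename,
    hasAlt: Boolean(alt && alt.trim()),
    altTooShort: Boolean(alt) && alt.trim().length < 10, // assumed threshold
    // SEO-friendly: all lowercase, words separated by hyphens, no special chars
    seoFriendlyName: /^[a-z0-9]+(-[a-z0-9]+)*$/.test(base),
  };
}

const result = auditImage({
  src: "https://blog.example.com/content/images/Warehouse_Robot.PNG",
  alt: "",
});
// -> hasAlt: false, seoFriendlyName: false (uppercase + underscore)
```

Each result object would then map onto one row of the audit sheet.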
by explorium
Automatically enrich prospect data from HubSpot using Explorium and create leads in Salesforce

This n8n workflow streamlines the process of enriching prospect information by automatically pulling data from HubSpot, processing it through Explorium's AI-powered tools, and creating new leads in Salesforce with the enhanced prospect details.

**Credentials Required**

To use this workflow, set up the following credentials in your n8n environment:

HubSpot
- **Type**: App Token (or OAuth2 for broader compatibility)
- **Used for**: triggering on new contacts, fetching contact data

Explorium API
- **Type**: Generic Header Auth
- **Header**: Authorization
- **Value**: Bearer YOUR_API_KEY
- Get an Explorium API key

Salesforce
- **Type**: OAuth2 or Username/Password
- **Used for**: creating new lead records

Go to Settings → Credentials, create these three credentials, and assign them in the respective nodes before running the workflow.

**Workflow Overview**

**Node 1: HubSpot Trigger**

This node listens for real-time events from the connected HubSpot account. Once triggered, it passes metadata about the event to the next step in the flow.

**Node 2: HubSpot**

This node fetches contact details from HubSpot after the trigger event.

- **Credential**: Connected using a HubSpot App Token
- **Resource**: Contact
- **Operation**: Get Contact
- **Return All**: Disabled

It retrieves the full contact details needed for further processing and enrichment.

**Node 3: Match prospect**

This node sends each contact's data to Explorium's AI-powered prospect matching API in real time.

- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/match
- **Authentication**: Generic Header Auth (using a configured credential)
- **Headers**: Content-Type: application/json

The request body is dynamically built from contact data, typically including full_name, company_name, email, phone_number, and linkedin. These fields are matched against Explorium's intelligence graph to return enriched or validated profiles.
Response output: total_matches, matched_prospects, and a prospect_id. Each response is used downstream to enrich, validate, or create lead information.

**Node 4: Filter**

This node filters the output of the Match prospect step so that only valid, matched results continue through the flow. Only records that contain at least one matched prospect with a non-null prospect_id are passed forward.

Status: currently deactivated (as shown by the "Deactivate" label).

**Node 5: Extract Prospect IDs from Matched Results**

This node extracts all valid prospect_id values from the previously matched prospects and compiles them into a flat array. It loops over all matched items, extracts each prospect_id from the matched_prospects array, and returns a single object with an array of all prospect_ids.

**Node 6: Explorium Enrich Contacts Information**

This node performs bulk enrichment of contacts by querying Explorium with the list of matched prospect_ids.

Node configuration:

- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/contacts_information/bulk_enrich
- **Authentication**: Header Auth (using saved credentials)
- **Headers**: Content-Type: application/json, Accept: application/json

It returns enriched contact information, such as:

- **emails**: professional/personal email addresses
- **phone_numbers**: mobile and work numbers
- **professions_email**, **professional_email_status**, **mobile_phone**

**Node 7: Explorium Enrich Profiles**

This additional enrichment node provides supplementary contact data, running in parallel with the primary enrichment process.

**Node 8: Merge**

This node combines the multiple data streams from the parallel enrichment processes into a single output, letting you consolidate data from different Explorium enrichment endpoints. The "combine" setting means it merges the incoming streams rather than overwriting them.
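The "Extract Prospect IDs from Matched Results" node (Node 5) can be sketched as the following n8n Code-node logic; the item shape mirrors the match-endpoint response described above, and the example prospect_id is a placeholder.

```javascript
// Loop over all matched items and flatten every prospect_id into one array,
// returned as a single n8n item for the bulk-enrichment call downstream.
function extractProspectIds(items) {
  const prospect_ids = [];
  for (const item of items) {
    for (const match of item.json.matched_prospects || []) {
      if (match.prospect_id != null) prospect_ids.push(match.prospect_id);
    }
  }
  // n8n Code nodes return an array of items; here one item holding the array
  return [{ json: { prospect_ids } }];
}

const out = extractProspectIds([
  { json: { total_matches: 1, matched_prospects: [{ prospect_id: "abc123" }] } },
  { json: { total_matches: 0, matched_prospects: [] } },
]);
// out[0].json.prospect_ids -> ["abc123"]
```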
It can be used to:

- Flatten nested data structures
- Format data according to Salesforce field requirements
- Apply business logic or data validation
- Map Explorium fields to Salesforce lead properties
- Handle data type conversions

**Node 10: Salesforce**

This final node creates new leads in Salesforce using the enriched data returned by Explorium.

- **Credential**: Salesforce OAuth2 or Username/Password
- **Resource**: Lead
- **Operation**: Create Lead

The node creates new lead records with enriched information, including contact details, company information, and professional data obtained through the Explorium enrichment process.

**Workflow Flow Summary**

1. Trigger: a HubSpot webhook fires on new/updated contacts
2. Fetch: retrieve contact details from HubSpot
3. Match: find prospect matches using Explorium
4. Filter: keep only successfully matched prospects (currently deactivated)
5. Extract: compile prospect IDs for bulk enrichment
6. Enrich: parallel enrichment of contact information through multiple Explorium endpoints
7. Merge: combine enrichment results
8. Transform: flatten and prepare data for Salesforce (Code node)
9. Create: create new lead records in Salesforce

This workflow ensures comprehensive data enrichment while maintaining data quality, providing a seamless integration between HubSpot prospect data and Salesforce lead creation. The parallel enrichment structure maximizes data collection efficiency before creating high-quality leads in your CRM system.
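A minimal sketch of what the "flatten" Code node (Node 9) might do before the Salesforce step. The field mappings and the input shape are illustrative assumptions about the Explorium enrichment payload, not the template's exact code.

```javascript
// Collapse nested enrichment data into the flat fields a Salesforce
// "Create Lead" node expects. Salesforce requires LastName and Company,
// so fallbacks are provided when enrichment returns nothing.
function flattenForSalesforce(enriched) {
  const email = (enriched.emails || [])[0] || {};
  const phone = (enriched.phone_numbers || [])[0] || {};
  return {
    Email: email.address || null,
    Phone: phone.number || null,
    Company: enriched.company_name || "Unknown",
    LastName: enriched.full_name || "Unknown",
  };
}

const lead = flattenForSalesforce({
  full_name: "Ada Lovelace",
  company_name: "Analytical Engines Ltd",
  emails: [{ address: "ada@example.com" }],
  phone_numbers: [],
});
// -> { Email: "ada@example.com", Phone: null,
//      Company: "Analytical Engines Ltd", LastName: "Ada Lovelace" }
```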
by Miquel Colomer
📝 **Overview**

This workflow turns n8n into a smart real-estate concierge by combining an AI chat interface with Bright Data’s marketplace datasets. Users interact via chat to specify city, price, bedrooms, and bathrooms, and receive a curated list of three homes for sale, complete with images and briefings.

🎥 **Workflow in Action**

Want to see this workflow in action? Play the video.

🔑 **Key Features**

- **AI-Powered Chat Trigger**: Instantly start conversations using LangChain’s Chat Trigger node.
- **Contextual Memory**: Retain up to 30 recent messages for coherent back-and-forth.
- **Bright Data Integration**: Dynamically filter "FOR_SALE" properties by city, price, bedrooms, and bathrooms (limit = 3).
- **Automated Snapshot Retrieval**: Poll for dataset readiness and fetch the full snapshot content.
- **HTML-Formatted Output**: Present results as a `<ul>` of `<li>` items, embedding property images.

🚀 **How It Works (Step-by-Step)**

Prerequisites:

- n8n ≥ v1.0
- Community nodes: install n8n-nodes-brightdata (an unverified community node)
- API credentials: OpenAI, Bright Data
- A webhook endpoint to receive chat messages

Node configuration:

- **Chat Trigger**: Listens for incoming chat messages and shows a welcome screen.
- **Memory Buffer**: Stores the last 30 messages for context.
- **OpenAI Chat Model**: Uses GPT-4o-mini to interpret user intent.
- **Real Estate AI Agent**: Orchestrates filtering logic, calls tools, and formats responses.
- **Bright Data "Filter Dataset" Tool**: Applies user-defined filters plus homeStatus = FOR_SALE.
- **Wait & Recover Snapshot**: Polls until the snapshot is ready, then fetches its content.
- **Get Snapshot Content**: Converts the raw JSON into a structured list.

Workflow logic:

1. The user sends search criteria, and the agent validates the inputs.
2. The agent invokes "Filter Dataset" once all filters are present.
3. Upon dataset readiness, the snapshot is retrieved and parsed.
4. The final output is rendered as a bullet list with property images.

Testing & optimization:

- Use the built-in Execute Workflow trigger for rapid dry runs.
- Inspect node outputs in n8n’s UI; adjust filter defaults or snapshot limits.
- Tune OpenAI model parameters (e.g. maxIterations) for faster responses.

Deployment & monitoring:

- Activate the main workflow and expose its webhook URL.
- Monitor executions in the "Executions" panel; set up alerts for errors.
- Archive or duplicate workflows as needed; update credentials via the credential manager.

✅ **Pre-requisites**

- **Bright Data account**: API key for marketplaceDataset.
- **OpenAI account**: Access to the GPT-4o-mini model.
- **n8n version**: v1.0 or later with community node support.
- **Permissions**: Webhook access, credential vault read/write.

👤 **Who Is This For?**

- Real-estate agencies and brokers seeking to automate client queries.
- PropTech startups building conversational search tools.
- Data analysts who want on-demand property snapshots without manual scraping.

📈 **Benefits & Use Cases**

- **Time savings**: Replace manual MLS searches with an AI-driven chat.
- **Scalability**: Serve multiple clients simultaneously via webchat or an embedded widget.
- **Consistency**: Always report exactly three properties, ensuring concise results.
- **Engagement**: Visual listings with images boost user satisfaction and conversion.

Workflow created and verified by Miquel Colomer (https://www.linkedin.com/in/miquelcolomersalas/) and N8nHackers (https://n8nhackers.com).
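The filter the agent passes to the Bright Data "Filter Dataset" tool could be assembled roughly as below. This is a hypothetical sketch: the field names, operator syntax, and payload shape are assumptions for illustration, not the community node's exact schema.

```javascript
// Merge the user's chat criteria with the fixed homeStatus = FOR_SALE
// constraint and the three-record limit the workflow always applies.
function buildPropertyFilter({ city, maxPrice, bedrooms, bathrooms }) {
  return {
    records_limit: 3, // always report exactly three properties
    filter: {
      operator: "and",
      filters: [
        { name: "homeStatus", operator: "=", value: "FOR_SALE" },
        { name: "city", operator: "=", value: city },
        { name: "price", operator: "<=", value: maxPrice },
        { name: "bedrooms", operator: ">=", value: bedrooms },
        { name: "bathrooms", operator: ">=", value: bathrooms },
      ],
    },
  };
}

const filter = buildPropertyFilter({
  city: "Austin",
  maxPrice: 500000,
  bedrooms: 3,
  bathrooms: 2,
});
```

The agent only calls the tool once every user-supplied field is present, which is why the chat loop keeps asking until all four criteria are known.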
by Gavin
This workflow makes an HTTPS request to ConnectWise Manage through its REST API. It pulls all tickets in the "New" status (or whichever status you prefer) and notifies your dispatch team or personnel via Microsoft Teams whenever a new ticket comes in.

Video explanation: https://youtu.be/yaSVCybSWbM
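The request this workflow issues can be sketched as follows. ConnectWise Manage's REST API uses Basic auth over "company+publicKey:privateKey" plus a clientId header, and tickets are filtered with a `conditions` query parameter; the site host and credential values below are placeholders, not from this template.

```javascript
// Build the URL and headers for pulling tickets in a given status from
// the ConnectWise Manage service board.
function buildTicketRequest({ site, company, publicKey, privateKey, clientId, status }) {
  const conditions = encodeURIComponent(`status/name="${status}"`);
  return {
    url: `https://${site}/v4_6_release/apis/3.0/service/tickets?conditions=${conditions}`,
    headers: {
      Authorization:
        "Basic " +
        Buffer.from(`${company}+${publicKey}:${privateKey}`).toString("base64"),
      clientId,
      Accept: "application/json",
    },
  };
}

const req = buildTicketRequest({
  site: "api-na.myconnectwise.net",          // placeholder region host
  company: "mycompany",
  publicKey: "pub",
  privateKey: "priv",
  clientId: "00000000-0000-0000-0000-000000000000", // placeholder
  status: "New",
});
```

In n8n this maps onto a single HTTP Request node; the returned ticket list is then formatted into a Microsoft Teams message.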
by Matteo
This n8n workflow automates the handling of incoming emails. It detects and filters out spam, searches a knowledge base (FAQ) stored in a Pinecone vector database, and sends a reply using Gmail — all powered by an AI model (GPT-4o mini).

**How It Works**

- **Receiving emails**: The Gmail Trigger node checks a Gmail inbox every hour. When a new email arrives, it starts the workflow.
- **Fetching the full email content**: The get_message node retrieves all details of the message: sender, subject, text, message ID, etc.
- **Spam filtering**: The Spam checker node uses GPT-4o mini to classify the email as either "spam" or "no spam". It detects not only classic spam but also automated messages (e.g. from Google or Microsoft). If the email is marked as "spam", the workflow ends and nothing further is processed.
- **Conditional filter**: The If node checks the spam result. Only "no spam" emails proceed to the AI Agent.
- **AI-based reply**: The AI Agent node generates a response based on the email content, a system prompt defining the assistant’s behavior (polite, professional, under the name "Total AI Solutions"), and information retrieved from the Pinecone Vector Store, which contains the FAQs. The AI is instructed to always check the vector store before replying, and it prepares both the subject and the body of the reply.
- **Sending the reply**: The Gmail node sends the reply to the original sender, using the original email's ID to keep the thread intact.
- **Language model**: The OpenAI Chat Model node provides GPT-4o mini as the language engine for generating responses.
- **Memory support**: The Simple Memory node maintains short-term context, helpful in multi-turn conversations.
- **Knowledge base (FAQ)**: The Pinecone Vector Store node connects to a Pinecone index (faqmattabott) containing vectorized FAQ content. The vectors are created using the Embeddings OpenAI node.
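The spam gate implemented by the Spam checker and If nodes boils down to the branch below. This is a sketch of the decision logic only; the normalisation guards against extra whitespace or casing in the model's reply, and the fail-safe default is an assumption about how one would harden it.

```javascript
// Decide whether an email continues to the AI Agent. The model is prompted
// to answer with exactly "spam" or "no spam"; anything that is not
// explicitly "no spam" is dropped, so malformed replies fail safe.
function shouldProcess(classifierOutput) {
  const label = String(classifierOutput || "").trim().toLowerCase();
  return label === "no spam";
}

shouldProcess("No Spam\n"); // -> true  (continues to the AI Agent)
shouldProcess("spam");      // -> false (workflow ends)
```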
by Jacob
Unlock the full potential of your YouTube channel with our powerful integration connecting Google Sheets and DeepSeek AI — designed to skyrocket your video visibility and engagement without manual hassle.

**What this integration does for you:**

- Automates video data management by pulling your YouTube URLs straight from Google Sheets — no more copy-pasting or manual tracking.
- Extracts your current titles and descriptions directly from YouTube, giving you a clear starting point.
- Generates 3 high-impact, SEO-optimized titles plus 1 compelling, conversion-focused description, crafted by DeepSeek’s AI to grab attention and rank higher.
- Updates your Google Sheet automatically with the new optimized titles and descriptions, keeping all your video info in one place, ready to publish.

**Why it matters:**

In the crowded world of YouTube, the right title and description can make the difference between millions of views and being lost in the noise. This integration takes the guesswork out of optimization, saving you time and boosting your channel’s growth with proven AI-driven content.

**What you need:**

A Google Sheet with the following columns:

- Url
- Keyword
- Status
- Old Title
- New Title
- Old Description
- New Description

My contact: jacobmarketingservice@gmail.com
by Gopal Debnath
💡 **How It Works:**

- ⏰ Triggers daily at 6:00 AM
- 📊 Fetches one random question from your Google Sheet
- 🧠 Formats the question, options, correct answer, and explanation
- 📤 Sends it to:
  - 📧 Email
  - 💬 Telegram (via Bot)
  - 📱 WhatsApp/SMS (via Twilio)

🔧 **What You Need to Configure:**

- YOUR_GOOGLE_SHEET_ID → your sheet with the columns: question, optionA, optionB, optionC, optionD, correctAnswer, explanation
- Email credentials (SMTP)
- Telegram Bot Token & Chat ID
- Twilio phone numbers and credentials
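The "pick one random question and format it" step could look like this in an n8n Code node, using the sheet columns listed above. The message layout is an illustrative assumption; adapt it to your email/Telegram/Twilio formatting.

```javascript
// Pick a random row from the sheet and render it as a multi-line message
// suitable for email, Telegram, or SMS.
function formatQuiz(rows, pick = Math.floor(Math.random() * rows.length)) {
  const q = rows[pick];
  return [
    "🧠 Question of the day:",
    q.question,
    `A) ${q.optionA}`,
    `B) ${q.optionB}`,
    `C) ${q.optionC}`,
    `D) ${q.optionD}`,
    "",
    `✅ Answer: ${q.correctAnswer}`,
    `💡 ${q.explanation}`,
  ].join("\n");
}

const message = formatQuiz([{
  question: "What does HTTP stand for?",
  optionA: "HyperText Transfer Protocol",
  optionB: "High Transfer Text Protocol",
  optionC: "Hyperlink Text Protocol",
  optionD: "Host Transfer Protocol",
  correctAnswer: "A",
  explanation: "HTTP is the protocol used for transmitting web pages.",
}]);
```

The same string can feed all three channels, so the question only needs to be formatted once.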
by Friedemann Schuetz
**What this workflow does**

This workflow retrieves Google Analytics data for the last 7 days and for the same period in the previous year. The data is then prepared by AI as a table, analyzed, and given a short summary. The summary is sent by email to an address of your choice and, shortened and summarized once more, sent to a Telegram account.

The workflow runs in the following sequence:

1. Time trigger (e.g. every Monday at 7 a.m.)
2. Retrieval of Google Analytics data from the last 7 days
3. Mapping and summary of that data
4. Retrieval of Google Analytics data for the same 7 days of the previous year
5. Mapping and summary of that data
6. Preparation in tabular form and a brief analysis by AI
7. Sending the report as an email
8. Preparation of a short version by AI for Telegram (optional)
9. Sending it as a Telegram message

**Requirements**

The workflow requires the following access:

- Google Analytics (via the Google Analytics API): Documentation
- AI API access (e.g. via OpenAI, Anthropic, Google, or Ollama)
- SMTP credentials (for sending the mail)
- Telegram credentials (optional, for sending the Telegram message): Documentation

Feel free to contact me via LinkedIn if you have any questions!
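The two date ranges requested from Google Analytics can be computed as below. This is a minimal sketch assuming "last 7 days" means the 7 most recent full days; dates are formatted as YYYY-MM-DD, the format the Analytics API expects.

```javascript
// Compute the current 7-day window and the same window one year earlier.
function reportRanges(today = new Date()) {
  const fmt = (d) => d.toISOString().slice(0, 10);
  const shift = (d, days) => new Date(d.getTime() + days * 86400000);
  const end = shift(today, -1);   // yesterday = last full day
  const start = shift(end, -6);   // 7 days in total
  const lastYear = (d) => {
    const c = new Date(d);
    c.setFullYear(c.getFullYear() - 1);
    return c;
  };
  return {
    current: { startDate: fmt(start), endDate: fmt(end) },
    previousYear: { startDate: fmt(lastYear(start)), endDate: fmt(lastYear(end)) },
  };
}

reportRanges(new Date("2025-06-09T07:00:00Z"));
// -> current: 2025-06-02 .. 2025-06-08
//    previousYear: 2024-06-02 .. 2024-06-08
```

Comparing the same calendar window year-over-year avoids the weekday skew you would get from comparing against "365 days ago" alone.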
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 **Market Research & Business Case Study Generator**

- Category: AI + Research | GPT + Perplexity | Business Strategy
- Skill level: Intermediate
- Use case: Market Research, Business Planning, Strategic Analysis

📌 **Description:**

This template automates the creation of comprehensive, data-backed business case studies — perfect for entrepreneurs, analysts, consultants, and market researchers.

For more builds with step-by-step video tutorials, check out: https://www.youtube.com/@Automatewithmarc

Just send a simple message like: "Give me a market opportunity analysis of a bicycle rental business in North Africa." The workflow does the rest: it scopes your research topic, performs live web research, and crafts a well-structured 1500-word business case study, all automatically saved to Google Docs.

🔧 **How It Works:**

- 🟢 **Chat Trigger**: Start the workflow by sending a prompt via the built-in chat interface (LangChain Chat Trigger).
- 🧭 **Research Scope Definer (GPT-4o)**: Breaks the user input down into structured components such as industry, geography, trends, and challenges.
- 🌐 **Deep Research (Perplexity Sonar)**: Performs live research to retrieve relevant industry data, consumer trends, competitive insights, and more.
- 📘 **Business Case Writer (Claude Sonnet)**: Synthesizes the findings into a detailed case study with sections including Executive Summary, Market Overview, Opportunity Analysis, Competitive Landscape, Risks & Challenges, Strategic Recommendations, and Conclusion.
- 📄 **Google Docs Integration**: The final output is appended to a connected Google Doc, so all your insights are neatly stored and ready to share.

🧰 **Tools Used:**

- OpenAI GPT-4o
- Perplexity Sonar Deep Research
- Anthropic Claude Sonnet
- Google Docs
- Chat Trigger

✅ **Ideal For:**

- Business consultants & strategy teams
- Market researchers & analysts
- Startup founders & product managers
- Educators & MBA students
by Darsheel
This n8n workflow acts as an AI-powered inbox assistant that automatically summarizes and classifies Gmail emails, prioritizes important messages, and sends a daily digest to Slack. It’s ideal for startup founders and small teams juggling investor intros, customer leads, and support queries in a busy Gmail inbox.

Each email is processed with ChatGPT to generate a concise summary, classify the message (e.g. Support, Investor, Spam), and determine its urgency. High- and medium-priority messages are forwarded to Slack instantly, while lower-priority emails are logged to Google Sheets for review. A daily 7 PM digest summarizes the day’s most important messages.

💡 **Use Cases**

- Preventing missed investor or lead emails
- A lightweight CRM alternative using Google Sheets
- Slack summaries of critical Gmail activity

🔧 **How It Works**

1. The Gmail node fetches new messages
2. ChatGPT summarizes each one and extracts its urgency and type
3. High/medium urgency → sent to Slack and labeled in Gmail
4. Low urgency → logged in Google Sheets
5. A Cron node triggers the daily 7 PM Slack summary

✅ **Requirements**

- OpenAI API key (GPT-4 or GPT-4o recommended)
- Gmail access with read and label permissions
- Slack Bot Token or Webhook URL
- Google Sheets integration (optional)

🛠 **Customization Ideas**

- Replace Slack with Telegram or WhatsApp
- Route investor leads to Airtable or Notion
- Add multi-language support in the ChatGPT prompt
- Create weekly summaries via email
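The routing step after ChatGPT's classification can be sketched as follows. The JSON contract ({ summary, type, urgency }) is an assumed shape the prompt would be asked to return, not taken from the template itself.

```javascript
// Parse the model's classification and decide the destination:
// high/medium urgency -> Slack alert, everything else -> Google Sheets log.
function routeEmail(aiOutput) {
  const { summary, type, urgency } = JSON.parse(aiOutput);
  const normalized = String(urgency).toLowerCase();
  return {
    summary,
    type,
    urgency: normalized,
    destination: ["high", "medium"].includes(normalized) ? "slack" : "sheet",
  };
}

const routed = routeEmail(
  '{"summary":"Intro from a seed investor","type":"Investor","urgency":"High"}'
);
// -> destination: "slack"
```

Asking the model for strict JSON keeps the If/Switch node downstream simple: it only needs to compare the `destination` (or `urgency`) field.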
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 **AI News Update Every 24 Hours (with Perplexity + GPT Formatter)**

**Description:**

This workflow automatically delivers a clean, AI-curated summary of the latest AI news headlines from the past 24 hours directly to your inbox, every morning at 9 AM.

For a step-by-step video tutorial of this build, watch: https://youtu.be/O-DLvaMVLso

🧰 **How It Works:**

- 🕘 **Schedule Trigger**: Runs daily at 9 AM to start the workflow.
- 🔎 **Perplexity AI Agent**: Searches for AI-related headlines published in the last 24 hours, returning for each one the headline, a 1-sentence summary, the source, and the full URL.
- 🧠 **GPT Formatter AI Agent**: Uses an OpenAI language model (GPT-4.1-mini) to reformat the raw news data into a clean, readable email update.
- 🧷 **Memory Buffer (optional)**: Gives the formatter context and continuity if you want to personalize the formatting further over time.
- 📧 **Gmail Node**: Sends the formatted AI news digest to your inbox (or your team’s) daily.

📦 **Tools & APIs Required:**

- ✅ Perplexity AI API
- ✅ OpenAI API
- ✅ Gmail account (OAuth2 credentials)

🔄 **Use Cases:**

- Daily AI trend monitoring for individuals or teams
- Automating internal market intelligence
- Research triggers for blog or content creation
- Email digests for newsletters or Slack updates

🛠️ **Customizable Ideas:**

- Swap Gmail for Slack, Telegram, Discord, etc.
- Modify the topic (e.g. Climate Tech, Crypto News)
- Add sentiment analysis or AI-generated commentary
- Send the summary to Google Docs or Notion instead