by Samir Saci
Tags: Sustainability, Web Scraping, OpenAI, Google Sheets, Newsletter, Marketing

## Context

Hey! I’m Samir, a Supply Chain Engineer and Data Scientist from Paris, and the founder of LogiGreen Consulting. We use AI, automation, and data to support sustainable business practices for small, medium, and large companies. I use this workflow to raise awareness about sustainability and promote my business by delivering automated daily news digests.

> Promote your business with a fully automated newsletter powered by AI!

This n8n workflow scrapes articles from the official EU news website and sends a daily curated digest, highlighting only the most relevant sustainability news.

📬 For business inquiries, feel free to connect with me on LinkedIn

## Who is this template for?

This workflow is useful for:

- **Business owners** who want to promote their services or products with a fully automated newsletter
- **Sustainability professionals** staying informed on EU climate news
- **Consultants and analysts** working on CSRD, Green Deal, or ESG initiatives
- **Corporate communications teams** tracking relevant EU activity
- **Media curators** building newsletters

## What does it do?

This n8n workflow:

- ⏰ Triggers automatically every morning
- 🌍 Scrapes articles from the EU Commission News Portal
- 🧠 Uses OpenAI GPT-4o to classify each article for sustainability relevance
- 📄 Stores the results in a Google Sheet for tracking
- 🧾 Generates a polished HTML digest email, including titles, summaries, and images
- 📬 Sends the digest via Gmail to your mailing list

## How it works

1. Trigger at 08:30 every morning
2. Scrape and extract article blocks from the EU news site
3. Use OpenAI to decide if articles are sustainability-related
4. Store relevant entries in Google Sheets
5. Generate an HTML email with a professional layout and logo
6. Send the digest via Gmail to a configured recipient list

## What do I need to get started?
You’ll need:

- A Google Sheet connected to your n8n instance
- An OpenAI account with GPT-4 or GPT-4o access
- A Gmail OAuth credential set up

## Follow the Guide!

Follow the sticky notes inside the workflow or check out my step-by-step tutorial on how to configure and deploy it.

🎥 Watch My Tutorial

## Notes

- You can customize the system prompt to adjust how the AI classifies “sustainability”
- Works well for tracking updates relevant to climate action, the green transition, and the circular economy
- This workflow was built using n8n version 1.85.4

Submitted: April 24, 2025
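The digest-generation step described above can be sketched as a small function that turns classified articles into an HTML email body. This is a minimal illustration, not the workflow's actual node code, and the field names (`title`, `summary`, `url`, `relevant`) are assumptions:

```python
# Minimal sketch of the HTML digest step: turn a list of AI-classified
# articles into an email body. Field names are illustrative.
def build_digest(articles):
    # Keep only articles the AI flagged as sustainability-related
    relevant = [a for a in articles if a.get("relevant")]
    items = "".join(
        f'<li><a href="{a["url"]}">{a["title"]}</a> — {a["summary"]}</li>'
        for a in relevant
    )
    return f"<h1>EU Sustainability Digest</h1><ul>{items}</ul>"

sample = [
    {"title": "Green Deal update", "summary": "New targets announced.",
     "url": "https://example.eu/1", "relevant": True},
    {"title": "Sports news", "summary": "Not relevant here.",
     "url": "https://example.eu/2", "relevant": False},
]
html = build_digest(sample)
```

In the real workflow this logic lives in the HTML-generation node, with the article list coming from the Google Sheet of classified entries.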
by PollupAI
This n8n workflow automates the import of your Google Keep notes into a structured Google Sheet, using Google Drive, OpenAI for AI-powered processing, and JSON file extraction. It's perfect for users who want to turn exported Keep notes into a searchable, filterable spreadsheet – optionally enhanced by AI summarization or transformation.

## Who is this for?

- Researchers, knowledge workers, and digital minimalists who rely on Google Keep and want to better organize or analyze their notes.
- Anyone who regularly exports Google Keep notes and wants a clean, automated workflow to store them in Google Sheets.
- Users looking to apply AI to process, summarize, or extract insights from raw notes.

## What problem is this workflow solving?

Exporting Google Keep notes via Google Takeout gives you unstructured .json files that are hard to read and manage. This workflow solves that by:

- Filtering relevant .json files
- Extracting note content
- (Optionally) applying AI to analyze or summarize each note
- Writing the result into a structured Google Sheet

## What this workflow does

1. Google Drive Search: Looks for .json files inside a specified "Keep" folder.
2. Loop: Processes files in batches of 10.
3. File Filtering: Filters by .json extension.
4. Download + Extract: Downloads each file and extracts note content from JSON.
5. Optional Filtering: Only keeps non-archived notes or those meeting content criteria.
6. AI Processing (optional): Uses OpenAI to summarize or transform the note content.
7. Prepare for Export: Maps note fields to be written.
8. Google Sheets: Appends or updates the target sheet with the note data.

## Setup

- Export your Google Keep notes using Google Takeout: deselect all, then choose only Google Keep. Choose “Send download link via email”.
- Unzip the downloaded archive and upload the .json files to your Google Drive.
- Connect Google Drive, OpenAI, and Google Sheets in n8n.
- Set the correct folder path for your notes in the “Search in ‘Keep’ folder” node.
- Point the Google Sheets node to your spreadsheet.

## How to customize this workflow to your needs

- Skip AI processing: If you don't need summaries or transformations, remove or disable the OpenAI Chat Model node.
- Filter criteria: Customize the Filter node to extract only recent notes, or those containing specific keywords.
- AI prompts: Edit the Tools Agent or Chat Model node to instruct the AI to summarize, extract tasks, categorize notes, etc.
- Field mapping: Adjust the “Set fields for export” node to control what gets written to the spreadsheet.

Use this template to build a powerful knowledge extraction tool from your Google Keep archive – ideal for backups, audits, or data-driven insights.
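The "Download + Extract" step can be sketched as a small parser that flattens one exported Keep note into the fields written to the sheet. The key names shown (`title`, `textContent`, `isArchived`) follow the Takeout export format at the time of writing — treat them as assumptions and inspect your own files:

```python
import json

# Sketch of the extract step: flatten one exported Keep note into the
# fields written to the sheet, skipping archived notes. Key names
# assume the current Takeout format -- verify against your own export.
def extract_note(raw):
    note = json.loads(raw)
    if note.get("isArchived"):          # optional filter: skip archived notes
        return None
    return {
        "title": note.get("title", ""),
        "text": note.get("textContent", ""),
    }

sample = '{"title": "Groceries", "textContent": "Milk, eggs", "isArchived": false}'
row = extract_note(sample)
```

In n8n this would typically sit in a Code/Function node between the download and the Google Sheets append.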
by Immanuel
# Automated Research Report Generation with OpenAI, Wikipedia, Google Search, Gmail/Telegram and PDF Output

## Description

### What Problem Does This Solve? 🛠️

This workflow automates the generation of professional research reports for researchers, students, and professionals. It eliminates manual research and report formatting by aggregating data, generating content with AI, and delivering the report as a PDF via Gmail or Telegram.

Target audience: Researchers, students, educators, and professionals needing quick, formatted research reports.

### What Does It Do? 🌟

- Aggregates research data from Wikipedia, Google Search, and SerpApi.
- Refines user queries and generates structured content using OpenAI.
- Converts the content into a professional HTML report, then to PDF.
- Sends the PDF report via Gmail or Telegram.

### Key Features 📋

- Real-time data aggregation from multiple sources.
- AI-driven content generation with OpenAI.
- Automated HTML-to-PDF conversion for professional reports.
- Flexible delivery via Gmail or Telegram.
- Error handling for robust execution.

## Setup Instructions

### Prerequisites ⚙️

- **n8n Instance**: Self-hosted or cloud n8n instance.
- **API Credentials**:
  - OpenAI API: API key with GPT model access, stored in n8n credentials.
  - SerpApi (Google Search): API key from SerpApi, stored in n8n credentials (do not hardcode it in nodes).
  - Gmail API: Credentials from Google Cloud Console with the Gmail scope.
  - Telegram API: Bot token from BotFather on Telegram.

### Installation Steps 📦

Import the Workflow:
- Copy the workflow JSON from the "Template Code" section below.
- Import it into n8n via "Import from File" or "Import from URL".

Configure Credentials:
- Add API credentials in n8n’s Credentials section for OpenAI, SerpApi, Gmail, and Telegram.
- Assign credentials to the respective nodes. For example:
  - In the SerpApi Google Search node, use n8n credentials for SerpApi: api_key={{ $credentials.SerpApiKey }}.
  - In the Send Research PDF on Gmail node, use Gmail credentials.
  - In the Send PDF to Telegram node, use Telegram bot credentials.

Set Up Nodes:
- OpenAI Nodes (Research AI Agent, OpenAI Chat Model, OpenAI Chat Middle Memory): Update the model (e.g., gpt-4o) and the prompt as needed.
- Input Validation (Input Validation node): Ensure your input query format matches the expected structure (e.g., topic: "AI ethics").
- Delivery Options (Send Research PDF on Gmail, Send PDF to Telegram): Configure the recipient email or Telegram chat ID.

Test the Workflow:
- Run the workflow by clicking the "Test Workflow" node.
- Verify that the research report PDF is generated and sent via Gmail or Telegram.

## How It Works

### High-Level Steps 🔍

1. **Query Refinement**: Refines the input query for better research.
2. **Aggregate Data**: Fetches data from Wikipedia, Google Search, and SerpApi.
3. **Generate Report**: Uses OpenAI to create a structured report.
4. **Convert to PDF**: Converts the report to HTML, then to PDF.
5. **Deliver Report**: Sends the PDF via Gmail or Telegram.

Detailed descriptions are available in the sticky notes within the workflow screenshot above.

### Node Names and Actions

Research and Report Generation:

- Test Workflow: Triggers the workflow for testing.
- Input Validation: Validates the input query.
- Query Refiner: Refines the query for better results.
- Research AI Agent: Coordinates research using OpenAI.
- OpenAI Chat Model: Generates content for the report.
- Structured Output Parser: Parses OpenAI output into structured data.
- OpenAI Chat Middle Memory: Retains context during research.
- Wikipedia Google Search: Fetches data from Wikipedia.
- SerpApi Google Search: Fetches data via SerpApi.
- Merge Split Items: Merges data from multiple sources.
- Aggregate: Aggregates all research data.
- Generate PDF HTML: Creates an HTML report.
- Convert HTML to PDF: Converts the HTML to PDF.
- Download PDF: Downloads the PDF file.
- Send PDF to Telegram: Sends the PDF via Telegram.
- Send Research PDF on Gmail: Sends the PDF via Gmail.
## Customization Tips

- **Expand Data Sources** 📡: Add more sources (e.g., academic databases) by adding nodes to Merge Split Items.
- **Change Report Style** ✍️: Update the Generate PDF HTML node to modify the HTML template (e.g., adjust styling or sections).
- **Alternative Delivery** 📧: Add nodes to send the PDF via other platforms (e.g., Slack).
- **Adjust AI Model** 🧠: Modify the OpenAI Chat Model node to use a different model (e.g., gpt-3.5-turbo).
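The Generate PDF HTML step above can be illustrated with a small templating function. This is a simplified sketch; the section field names are assumptions, not the template shipped with the workflow:

```python
# Sketch of the "Generate PDF HTML" step: render structured report
# sections into a printable HTML document. Section fields are illustrative.
def render_report(title, sections):
    body = "".join(
        f"<h2>{s['heading']}</h2><p>{s['text']}</p>" for s in sections
    )
    return (
        "<html><head><style>"
        "body { font-family: Georgia, serif; margin: 2em; }"
        "</style></head>"
        f"<body><h1>{title}</h1>{body}</body></html>"
    )

html = render_report(
    "AI Ethics",
    [{"heading": "Overview", "text": "Key debates and definitions."}],
)
```

The resulting HTML string is what the Convert HTML to PDF node would consume; customizing the report style means editing a template like this one.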
by Ranjan Dailata
## Who this is for

The Google Trends Data Extract & Summarization workflow is ideal for trend researchers, digital marketers, content strategists, and AI developers who want to automate the extraction, summarization, and distribution of Google Trends data. This end-to-end solution helps transform trend signals into human-readable insights and delivers them across multiple channels.

It is built for:

- **Market Researchers** - Tracking trends by topic or region
- **Content Strategists** - Identifying content opportunities from trending data
- **SEO Analysts** - Monitoring search volume and shifts in keyword popularity
- **Growth Hackers** - Reacting quickly to real-time search behavior
- **AI & Automation Engineers** - Creating automated trend monitoring systems

## What problem is this workflow solving?

Google Trends data can provide rich insights into user interests, but the raw data is not always structured or easily interpretable at scale. Manually extracting, cleaning, and summarizing trends from multiple regions or categories is time-consuming.

This workflow solves the following problems:

- Automates the conversion of markdown or scraped HTML into clean textual input
- Transforms unstructured data into a structured format ready for processing
- Uses AI summarization to generate easy-to-read insights from Google Trends
- Distributes summaries via email and webhook notifications
- Persists responses to disk for archiving, auditing, or future analytics

## What this workflow does

- Receives input: Sets a URL for the data extraction and analysis.
- Uses Bright Data’s Web Unlocker to extract content from the relevant site.
- Markdown to Textual Data Extractor: Converts markdown content into plaintext using n8n’s Function or Markdown nodes
- Structured Data Extract: Parses the plaintext into structured JSON suitable for AI processing
- Summarize Google Trends: Sends the structured data to Google Gemini with a summarization prompt to extract key takeaways
- Send Summary via Gmail: Composes an email with the AI-generated summary and sends it to a designated recipient
- Persist to Disk: Writes the AI-structured data to disk
- Webhook Notification: Sends the summarized response to an external system (e.g., Slack, Notion, Zapier) using a webhook

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

- **Update Source**: Change the workflow input to read from Google Sheets, Airtable, etc.
- **Gemini Prompt Tuning**: Customize prompts to extract summaries such as:
  - Summarize the most significant trend shifts
  - Generate content ideas from the trending search topics
- **Email Personalization**: Configure the Gmail node to:
  - Use dynamic subject lines like: Weekly Google Trends Summary – {{date}}
  - Send to multiple stakeholders or mailing lists
- **File Storage Customization**:
  - Save with timestamps, e.g., trends_summary_2025-04-29.json
  - Extend to S3 or cloud drive integrations
- **Webhook Use Cases**: Send the summary to:
  - Internal dashboards
  - Slack channels
  - Automation tools like Make, Zapier, etc.
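The Web Unlocker call the workflow makes can be sketched as an HTTP request with the Header Auth credential described above. The endpoint and request fields follow Bright Data's public API documentation at the time of writing — verify them against the current docs before relying on this:

```python
# Sketch of the Web Unlocker call: a POST to Bright Data's request API
# with a Bearer token. Endpoint and field names are assumptions based
# on Bright Data's docs -- verify before use.
def build_unlocker_request(token, zone, target_url):
    return {
        "url": "https://api.brightdata.com/request",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "json": {"zone": zone, "url": target_url, "format": "raw"},
    }

req = build_unlocker_request(
    "XXXXXXXXXXXXXX", "web_unlocker1",
    "https://trends.google.com/trending?geo=US",
)
# e.g. requests.post(req["url"], headers=req["headers"], json=req["json"])
```

In n8n the same shape is configured declaratively on an HTTP Request node using the Header Auth credential.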
by Hichul
This workflow automatically drafts replies to your emails using an OpenAI Assistant, streamlining your inbox management. It's designed for support teams, sales professionals, or anyone looking to accelerate their email response process by leveraging AI to create context-aware draft replies in Gmail.

## How it works

1. The workflow runs on a schedule (every minute) to check for emails with a specific label in your Gmail account.
2. It takes the content of the newest email in a thread and sends it to your designated OpenAI Assistant for processing.
3. A draft reply is generated by the AI assistant.
4. This AI-generated reply is then added as a draft to the original email thread in Gmail.
5. Finally, the initial trigger label is removed from the email thread to prevent it from being processed again.

## Set up steps

1. Connect your accounts: You'll need to connect your Gmail and OpenAI accounts in the respective nodes.
2. Configure the trigger: In the "Get threads with specific labels" Gmail node, specify the label that you want to use to trigger the workflow (e.g., generate-reply). Any email you apply this label to will be processed.
3. Select your OpenAI Assistant: In the "Ask OpenAI Assistant" node, choose the pre-configured Assistant you want to use for generating replies.
4. Configure label removal: In the "Remove AI label from email" Gmail node, ensure the same trigger label is selected to be removed after the draft has been successfully created.
5. Activate the workflow: Save and activate the workflow to begin automating your email replies.
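The trigger logic — pick the newest message from each labeled thread — can be sketched like this. The thread/message shapes here are illustrative, not the Gmail API schema:

```python
# Sketch of the trigger logic: from a list of Gmail threads, pick the
# newest message in each thread that carries the trigger label.
# Thread/message shapes are illustrative, not the Gmail API schema.
TRIGGER_LABEL = "generate-reply"

def messages_to_process(threads):
    picked = []
    for thread in threads:
        if TRIGGER_LABEL not in thread.get("labels", []):
            continue
        newest = max(thread["messages"], key=lambda m: m["timestamp"])
        picked.append(newest)
    return picked

threads = [
    {"labels": ["generate-reply"], "messages": [
        {"id": "m1", "timestamp": 1}, {"id": "m2", "timestamp": 2}]},
    {"labels": ["inbox"], "messages": [{"id": "m3", "timestamp": 3}]},
]
result = messages_to_process(threads)
```

Removing the label after drafting (step 5 above) is what keeps this selection from re-processing the same thread on the next one-minute poll.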
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for

The Automated Resume Job Matching Engine is an intelligent workflow designed for career platforms, HR tech startups, recruiting firms, and AI developers who want to streamline job-resume matching using real-time data from LinkedIn and job boards.

This workflow is tailored for:

- **HR Tech Founders** - Building next-gen recruiting products
- **Recruiters & Talent Sourcers** - Seeking automated candidate-job fit evaluation
- **Job Boards & Portals** - Enriching the user experience with AI-driven job recommendations
- **Career Coaches & Resume Writers** - Offering personalized job fit analysis
- **AI Developers** - Automating large-scale matching tasks using LinkedIn and job data

## What problem is this workflow solving?

Manually matching a resume to a job description is time-consuming, biased, and inefficient. Additionally, accessing live job postings and candidate profiles requires overcoming web scraping limitations.

This workflow solves:

- Automated LinkedIn profile and job post data extraction using Bright Data MCP infrastructure
- Semantic matching between job requirements and a candidate's resume using OpenAI GPT-4o mini
- Pagination handling for high-volume job data
- End-to-end automation, from scraping to delivery via webhook, persisting the matched-job response to disk

## What this workflow does

### Bright Data MCP for Job Data Extraction

- Uses Bright Data MCP clients to extract multiple job listings (supports pagination)
- Pulls job data from LinkedIn with the pre-defined filtering criteria

### OpenAI GPT-4o mini LLM Matching Engine

- Extracts paginated job data from the Bright Data MCP results via the MCP scrape_as_html tool.
- Extracts the textual job description from the scraped job information, again using the scrape_as_html tool.
- The AI Job Matching node compares the job description with the candidate's resume to generate match scores with insights

### Data Delivery

- Sends the final match report to a webhook notification endpoint
- Persists the AI-matched job response to disk

## Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

## Setup

1. Set up n8n locally with MCP servers by following n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server as shown below. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>.
8. Update the Set input fields for the candidate resume, keywords, and other filtering criteria.
9. Update the Webhook HTTP Request node with the webhook endpoint of your choice.
10. Update the file name and path to persist on disk.
## How to customize this workflow to your needs

- **Target Different Job Boards**: Set the input fields with sites like Indeed, ZipRecruiter, or Monster
- **Customize Matching Criteria**:
  - Adjust the prompt inside the AI Job Match node
  - Include scoring metrics like skills match %, experience relevance, or cultural fit
- **Automate Scheduling**:
  - Use a Cron node to periodically check for new jobs matching a profile
  - Set triggers based on webhook or input form submissions
- **Output Customization**:
  - Add Markdown/PDF formatting for report summaries
  - Extend with Google Sheets export for internal analytics
- **Enhance Data Security**: Mask personal info before sending it to external endpoints
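The "scoring metrics" customization above can be sketched as a weighted aggregation of per-criterion scores returned by the LLM. The criteria and weights are illustrative assumptions, not the workflow's built-in scoring:

```python
# Sketch of combining per-criterion LLM scores into one match score.
# Criteria and weights are illustrative -- tune them in the AI Job
# Match prompt and/or a downstream Code node.
WEIGHTS = {"skills": 0.5, "experience": 0.3, "culture": 0.2}

def overall_score(scores):
    # scores: per-criterion values on a 0-100 scale
    return round(sum(WEIGHTS[k] * scores[k] for k in WEIGHTS), 1)

score = overall_score({"skills": 80, "experience": 70, "culture": 90})
```

Asking the model for per-criterion numbers and combining them deterministically like this tends to be more auditable than asking for a single opaque match score.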
by Gaurav
Automate your entire guest communication journey from booking to post-stay with personalized welcome emails, review requests, and daily operational reports. Perfect for hotels, B&Bs, and short-term rental properties looking to enhance the guest experience while reducing manual work and improving operational efficiency.

## How it works

- Pre-arrival welcome emails - Automatically sends personalized welcome emails 1-2 days before guest check-in with reservation details, hotel amenities, and contact information
- Post-stay review requests - Sends automated review request emails 24 hours after checkout with Google Reviews links and return-guest discount codes
- Daily staff reports - Generates comprehensive arrival/departure reports every morning at 6 AM for front desk, housekeeping, and management teams
- Smart tracking - Prevents duplicate emails by automatically updating the tracking status in your Google Sheets database
- Professional templates - Uses responsive HTML email templates that work across all devices and email clients

## Set up steps

1. Connect Google Sheets - Link your hotel reservation spreadsheet (it must include columns for guest details, check-in/out dates, and email tracking)
2. Configure Gmail account - Set up Gmail credentials for sending automated emails
3. Customize hotel information - Update the hotel name, contact details, and branding in the "Edit Fields" nodes
4. Set staff email addresses - Configure recipient addresses for daily operational reports
5. Adjust timing - Modify the schedule triggers if you want different timing for emails and reports (currently set to every 6 hours for guest emails and 6 AM daily for staff reports)

Time investment: ~30 minutes for initial setup, then fully automated operation.
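The pre-arrival and smart-tracking logic above boils down to a per-reservation eligibility check. A minimal sketch, with illustrative column names (`check_in`, `welcome_sent`):

```python
from datetime import date

# Sketch of the duplicate-safe welcome-email check: send only when
# check-in is 1-2 days away and the tracking column is still empty.
# Column names are illustrative placeholders for your sheet's columns.
def needs_welcome_email(reservation, today):
    days_until = (reservation["check_in"] - today).days
    return 1 <= days_until <= 2 and not reservation.get("welcome_sent")

res = {"guest": "Ada", "check_in": date(2025, 5, 10), "welcome_sent": False}
send = needs_welcome_email(res, date(2025, 5, 8))
```

Marking `welcome_sent` in the sheet immediately after a successful send is what makes the every-6-hours schedule safe from duplicates.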
by Eric
## Use case

Instead of this: https://us06web.zoom.us/j/83456429326?pwd=1hVesbyHCsOfstyVU3z4CR6D46A8K.1
share this: mydomain.com/meet-me

Do you ever wish you had one simple URL that you can share with people to hop on a Zoom meeting? 😃

You could waste time: 👎👎

- creating a recurring Zoom meeting 😫
- saving the link somewhere 😵💫
- finding it and copying it each time you need it 😭
- sharing an ugly long link with everyone 🤢

Or... you could create a 🌹 beautiful link using your own domain/website that redirects to your Zoom meeting, and share that beautified URL with everyone. 😌 And it will be easy for you to remember 💡

> NOTE: Zoom now enforces a one-year maximum lifetime on recurring meetings. 😐

So I created this simple workflow to solve a few headaches. ☺️

## What this workflow does

1. Triggers once annually (every 360 days)
2. Creates a new recurring meeting in Zoom
3. Updates a redirect script with the new Zoom URL on a WordPress page
4. Notifies you in a Slack channel

What this workflow lacks in breakthrough innovation, it makes up for with usefulness and peace of mind. Have fun and make it your own!

## Setup

1. Add your credentials in each node. This requires that you have Zoom, WordPress, and Slack accounts, and have API access on those accounts.
2. Create a page in WordPress and get its ID. (Or create a new page in WP.)
3. Configure node parameters according to your needs.
4. TEST!!!! Don't ever skip this step. Ever.
5. Set it and forget it.

> NOTE: You can replace the WordPress node with another website CMS node, or a generic HTTP request for a non-WordPress site. You can also remove or replace the Slack node with other notification functionality (e.g., SMS, WhatsApp, email...).

Template was created in n8n v1.58.2
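The "redirect script" written to the WordPress page can be sketched as a tiny HTML/JS snippet generated around the freshly created Zoom URL. The exact markup is an assumption — adapt it to your page template:

```python
# Sketch of the snippet the workflow could write into the WordPress
# page: a small HTML/JS redirect pointing at the new Zoom URL.
# The markup is illustrative -- adapt to your site.
def redirect_snippet(zoom_url):
    return (
        "<script>"
        f'window.location.replace("{zoom_url}");'
        "</script>"
        f'<p>Redirecting you to <a href="{zoom_url}">the meeting</a>…</p>'
    )

snippet = redirect_snippet("https://us06web.zoom.us/j/83456429326")
```

Each annual run would regenerate this snippet with the new meeting URL and push it to the page via the WordPress node, so mydomain.com/meet-me never goes stale.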
by Dr. Firas
# AI-Powered HR Workflow: CV Analysis and Evaluation from Gmail to Sheets

## Who is this for?

This workflow is designed for HR professionals, recruiters, startup founders, and operations teams who receive candidate resumes by email and want to automate the evaluation process using AI. It's ideal for teams that receive high volumes of applications and want to streamline screening without sacrificing quality.

## What problem is this workflow solving?

Manually reviewing every resume is time-consuming, inconsistent, and often inefficient. This workflow automates the initial screening process by:

- Extracting resume data directly from incoming emails
- Analyzing resumes using GPT-4 to evaluate candidate fit
- Saving scores and notes in Google Sheets for easy filtering

It helps teams qualify candidates faster while staying organized.

## What this workflow does

1. Detects when a new email with a CV is received (Gmail)
2. Filters out non-relevant messages using an AI classifier
3. Extracts the resume text (PDF parsing)
4. Uploads the original file to Google Drive
5. Retrieves job offer details from a connected Google Sheet
6. Uses GPT-4 to evaluate the candidate’s fit for the job
7. Parses the AI output to extract the candidate's score
8. Logs the results into a central Google Sheet
9. Sends a confirmation email to the applicant

## Setup

1. Install n8n (self-hosted)
2. Add your OpenAI API key in the AI nodes
3. Enable the following APIs in your Google Cloud Console:
   - Gmail API
   - Google Drive API
   - Google Sheets API
4. Create OAuth credentials and connect them in n8n
5. Configure your Gmail trigger to watch the inbox receiving CVs
6. Create a Google Sheet with columns like: Candidate, Score, Job, Status, etc.
## How to customize this workflow to your needs

- Adjust the AI scoring prompt to match your company’s hiring criteria
- Add new columns to the Google Sheet for additional metadata
- Include Slack or email notifications for each qualified candidate
- Add multiple job profiles and route candidates accordingly
- Add a Telegram or WhatsApp step to notify HR in real time

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
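The "parse the AI output to extract the candidate's score" step can be sketched as a small extractor. The response format here is an assumption — a structured-output prompt makes this far more reliable in practice:

```python
import re

# Sketch of the score-parsing step: pull a 0-100 score out of the
# model's free-text evaluation. The "Score: NN" format is an
# illustrative assumption about the prompt's output.
def extract_score(ai_output):
    match = re.search(r"score\s*[:=]?\s*(\d{1,3})", ai_output, re.IGNORECASE)
    if not match:
        return None
    return min(int(match.group(1)), 100)   # clamp malformed values

score = extract_score("Strong backend profile. Score: 82/100.")
```

Returning `None` when no score is found lets the workflow route unparseable evaluations to a manual-review branch instead of logging garbage to the sheet.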
by Arlin Perez
# 📨 Categorize and Label Existing Gmail Emails Automatically with GPT-4o mini

## 👥 Who's it for

This workflow is perfect for individuals or teams who want to sort and label existing emails in their Gmail inbox 🗃️ using AI. Ideal for cleaning up unlabeled emails in bulk — no coding required!

For sorting incoming email messages in your Gmail inbox, please use this free workflow: Categorize and Label Incoming Gmail Emails Automatically with GPT-4o mini

## 🤖 What it does

It manually processes a selected number of existing Gmail emails, skips those that already have labels, sends the content to an AI Agent powered by GPT-4o mini 🧠, and applies a relevant Gmail label based on the email content. All labels must already exist in Gmail.

## ⚙️ How it works

1. ▶️ Manual Trigger – The workflow starts manually when you click "Execute Workflow".
2. 📥 Gmail Get Many Messages – Pulls a batch of existing inbox emails (default: 50).
3. 🚫 Filter – Skips emails that already have one or more labels.
4. 🧠 AI Agent (GPT-4o mini) – Analyzes the content and assigns a category.
5. 🧾 Structured Output Parser – Converts the AI output into structured JSON.
6. 🔀 Switch Node – Routes each email to the right label based on the AI result.
7. 🏷️ Gmail Nodes – Apply the correct Gmail label to the email.
## 📋 Requirements

- Gmail account connected to n8n
- Gmail labels must be manually created in your inbox beforehand
- Labels must exactly match the category names defined in the AI prompt
- OpenAI credentials with GPT-4o mini access
- n8n's AI Agent & Structured Output Parser nodes

## 🛠️ How to set up

1. In your Gmail account, create all the labels you want to use for categorizing emails
2. Open the workflow and adjust the email fetch limit in the Gmail node (e.g., 50, 100)
3. Confirm that the Filter skips emails that already have labels
4. Define your categories in the AI Agent prompt — these must match the Gmail labels exactly
5. In the Switch Node, create a condition for each label/category
6. Ensure each Gmail Label Node applies the correct existing label
7. Save the workflow and run it manually whenever you want to organize your inbox ✅

## 🎨 How to customize the workflow

- Add or remove categories in the AI prompt & Switch Node
- Adjust the batch size of emails to process more or fewer per run
- Fine-tune the AI prompt to suit your inbox type (e.g., work, personal, client support)
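The Switch-node routing — and the requirement that labels exactly match the AI's category names — can be sketched as a validation step. The label names here are illustrative:

```python
# Sketch of the Switch-node logic: map the parsed AI category onto an
# existing Gmail label, falling back when the model invents a category.
# Label names are illustrative -- they must match your Gmail labels exactly.
LABELS = {"Work", "Personal", "Finance", "Newsletters"}

def route_label(ai_result):
    category = ai_result.get("category", "").strip()
    return category if category in LABELS else None   # None -> leave unlabeled

label = route_label({"category": "Finance"})
```

Validating against the known label set before applying anything is why the workflow insists the prompt's categories and the Gmail labels match exactly.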
by Adam Janes
## How it works

Whenever a new event is scheduled on your Google Calendar, this workflow generates a Meeting Briefing email, giving an overview of each person on the call and the company they work for. It uses the web search tool on the OpenAI Responses API to perform the lookups.

The workflow triggers when a new event is added to the calendar, loops over each attendee, generates reports on each person and their company, collates the results, and sends the briefing as an email.

## Set up steps

1. Add your credentials for Google Calendar (for viewing events) and Gmail (to send the email).
2. Add your OpenAI credentials as a Header Auth on the Company Search and Person Search nodes:
   - Name: Authorization
   - Value: Bearer {{ YOUR_API_KEY }}
3. Edit the "Edit Fields" node with the email that you want to send the briefing to, and a short bit of context about yourself.
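The per-attendee loop's first step — deriving a person/company lookup pair from each attendee's email address — can be sketched like this. The heuristics (including the free-mail exclusion list) are illustrative assumptions, not the workflow's exact logic:

```python
# Sketch of deriving lookup targets from calendar attendees: use the
# email domain as the company hint, skipping free-mail providers.
# The exclusion list is an illustrative assumption.
FREE_MAIL = {"gmail.com", "outlook.com", "yahoo.com"}

def lookup_targets(attendees):
    targets = []
    for email in attendees:
        local, _, domain = email.partition("@")
        company = None if domain in FREE_MAIL else domain
        targets.append({"person": email, "company": company})
    return targets

targets = lookup_targets(["jane@acme.io", "bob@gmail.com"])
```

Each target would then feed the Person Search and (when a company domain exists) Company Search web-search calls before the results are collated into the briefing.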
by Junichiro Tobe
## Who is this for?

This workflow is perfect for busy professionals, students, or anyone who struggles to keep their Gmail inbox organized and clutter-free.

## What problem is this workflow solving?

It helps you avoid email overload by automating the process of organizing your Gmail inbox. Unnecessary emails are archived, while important emails are categorized into "MustRead" or "NotNeed" for better prioritization.

## What this workflow does

1. Connects to your Gmail inbox.
2. Automatically archives emails that are unnecessary or irrelevant.
3. Sorts the remaining emails into two categories:
   - MustRead: Emails that require immediate attention.
   - NotNeed: Less critical emails for review later.

## Setup

1. Connect your Gmail account to the workflow.
2. Define the criteria for "MustRead" and "NotNeed" emails by updating the filter rules in the nodes.
3. Activate the workflow to start organizing your inbox.

## How to customize this workflow to your needs

- Adjust the filters for archiving emails based on your specific preferences.
- Modify the sorting rules for "MustRead" and "NotNeed" categories to match your workflow.
- Add additional actions, such as sending notifications for "MustRead" emails.
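The filter rules can be sketched as a three-way classifier over the subject line. The keyword lists are illustrative placeholders for whatever criteria you define in the nodes:

```python
# Sketch of the filter rules: classify an email as "archive", "MustRead",
# or "NotNeed" from simple keyword rules. Keyword lists are illustrative
# placeholders for the criteria configured in the workflow's nodes.
MUST_READ = ("invoice", "deadline", "urgent")
ARCHIVE = ("unsubscribe", "promotion", "sale")

def classify(subject):
    s = subject.lower()
    if any(k in s for k in MUST_READ):   # important rules win ties
        return "MustRead"
    if any(k in s for k in ARCHIVE):
        return "archive"
    return "NotNeed"

verdict = classify("Urgent: invoice overdue")
```

Checking the MustRead rules first ensures an important email that also matches an archive keyword is never archived by mistake.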