by Angel Menendez
# Phishing Email Detection and Reporting with n8n

## Who is this for?
This workflow is designed for IT teams, security professionals, and managed service providers (MSPs) looking to automate the process of detecting, analyzing, and reporting phishing emails.

## What problem is this workflow solving?
Phishing emails are a significant cybersecurity threat, and manually detecting and reporting them is time-consuming and prone to errors. This workflow streamlines the process by automating email analysis, generating detailed reports, and logging incidents in a centralized system like Jira.

## What this workflow does
This workflow automates phishing email detection and reporting by integrating Gmail and Microsoft Outlook email triggers, analyzing the content and headers of incoming emails, and generating Jira tickets for flagged phishing emails. Here's what happens:

- **Email Triggers**: Captures incoming emails from Gmail or Microsoft Outlook.
- **Email Analysis**: Extracts email content, headers, and metadata for analysis.
- **HTML Screenshot**: Converts the email's HTML body into a visual screenshot.
- **AI Phishing Detection**: Leverages ChatGPT to analyze the email and detect potential phishing indicators.
- **Jira Integration**: Automatically creates a Jira ticket with detailed analysis and attaches the email screenshot for review by the security team.
- **Customizable Reports**: Includes options to customize ticket descriptions and adapt the workflow to organizational needs.

## Setup
- **Authentication**: Set up Gmail and Microsoft Outlook OAuth credentials in n8n to access your email accounts securely.
- **API Keys**: Add API credentials for the HTML screenshot service (hcti.io) and ChatGPT.
- **Jira Integration**: Configure your Jira project and issue types in the workflow.
- **Workflow Configuration**: Update sticky notes and nodes to include any additional setup or configuration details unique to your system.
## How to customize this workflow to your needs
- **Email Filters**: Modify email triggers to filter specific subjects or sender addresses.
- **Analysis Scope**: Adjust the ChatGPT prompt to refine phishing detection logic.
- **Integration**: Replace Jira with your preferred ticketing system or modify the ticket fields to include additional information.

This workflow provides an end-to-end automated solution for phishing email management, enhancing efficiency and reducing security risks. It's perfect for teams looking to minimize manual effort and improve incident response times.
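The "Email Analysis" step above can be complemented with a cheap pre-screen before the ChatGPT call. The sketch below is illustrative only — the header names follow common email conventions, but the scoring logic and thresholds are assumptions, not part of the template itself — and could live in an n8n Code node:

```javascript
// Hypothetical pre-screening helper: flags common phishing tells from parsed
// email headers before spending an AI call. Header keys are assumed lowercase.
function headerRiskSignals(headers) {
  const signals = [];
  const auth = (headers['authentication-results'] || '').toLowerCase();
  if (auth.includes('spf=fail')) signals.push('SPF failure');
  if (auth.includes('dkim=fail')) signals.push('DKIM failure');

  // A Reply-To domain differing from the From domain is a classic phishing tell.
  const domain = (addr) => (addr.match(/@([\w.-]+)/) || [])[1];
  const from = domain(headers['from'] || '');
  const replyTo = domain(headers['reply-to'] || '');
  if (from && replyTo && from !== replyTo) signals.push('Reply-To domain mismatch');

  return signals;
}
```

Signals found this way could be appended to the Jira ticket description alongside the AI verdict.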
by Joseph LePage
# Confluence Page AI Chatbot Workflow

This n8n workflow template enables users to interact with an AI-powered chatbot designed to retrieve, process, and analyze content from Confluence pages. By leveraging Confluence's REST API and an AI agent, the workflow facilitates seamless communication and contextual insights based on Confluence page data.

## How the Workflow Works
- **Input Chat Message**: The workflow begins when a user sends a chat message containing a query or request for information about a specific Confluence page.
- **Data Retrieval**: The workflow uses the Confluence REST API to fetch page details by ID, including its body in the desired format (e.g., storage, view). The retrieved HTML content is converted into Markdown for easier processing.
- **AI Agent Interaction**: An AI-powered agent processes the Markdown content and provides dynamic responses to user queries. The agent is context-aware, ensuring accurate and relevant answers based on the Confluence page's content.
- **Dynamic Responses**: Users can interact with the chatbot to summarize the page's content, extract specific details or sections, clarify complex information, and analyze key points or insights.

## Use Cases
- **Knowledge Management**: Quickly access and analyze information stored in Confluence without manually searching through pages.
- **Team Collaboration**: Facilitate discussions by summarizing or explaining page content during team chats.
- **Research and Documentation**: Extract critical insights from large documentation repositories for efficient decision-making.
- **Accessibility**: Provide an alternative way to interact with Confluence content for users who prefer conversational interfaces.

## Resources for Getting Started
- **Confluence API Setup**: Generate an API token for authentication via Atlassian's account management portal. Refer to Confluence's REST API documentation for endpoint details and usage instructions.
- **n8n Installation**: Install n8n locally or on a server using the official installation guide.
- **AI Agent Configuration**: Set up OpenAI or other supported language models for natural language processing.
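The "Data Retrieval" step fetches a page by ID with the body expanded. As a sketch of the request URL the HTTP node would build (the base URL and page ID below are placeholders; the endpoint shape follows Confluence Cloud's `GET /wiki/rest/api/content/{id}?expand=body.storage` convention):

```javascript
// Build the Confluence Cloud content URL for a given page ID.
// bodyFormat is typically 'storage' (raw XHTML) or 'view' (rendered HTML).
function confluencePageUrl(baseUrl, pageId, bodyFormat = 'storage') {
  return `${baseUrl.replace(/\/$/, '')}/wiki/rest/api/content/${pageId}?expand=body.${bodyFormat}`;
}
```

In the workflow, the returned `body.storage.value` HTML is what gets converted to Markdown for the AI agent.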
by Grzegorz Hanus
# Summarize YouTube Videos & Chat About Content with GPT-4o-mini via Telegram

## Description
This n8n workflow automates the process of summarizing YouTube video transcripts and enables users to interact with the content through AI-powered question answering via Telegram. It leverages the GPT-4o-mini model to generate summaries and provide insights based on the video's transcript.

## How It Works
- **Input**: The workflow starts by receiving a YouTube video URL, submitted through a Telegram chat message or a webhook (e.g., triggered by a shortcut on Apple devices).
- **Transcript Extraction**: The URL is processed to extract the video transcript using the custom youtubeTranscripter community node (available here). The transcript is concatenated into a single text and stored in a Google Docs document.
- **Summarization**: The GPT-4o-mini AI model analyzes the transcript and generates a structured summary, including a general overview, key moments, and instructions (if applicable). The summary is then sent back to the user via Telegram.
- **Interactive Q&A**: Users can ask questions about the video content via Telegram. The AI retrieves the stored transcript from Google Docs and provides accurate, context-based answers, which are sent back through Telegram.

## Setup Instructions
To configure this workflow, follow these steps:

- **Import the Workflow**: Download the provided JSON template and import it into your n8n instance.
- **Install the Community Node**: Install the youtubeTranscripter community node via npm: `npm install n8n-nodes-youtube-transcription-kasha`. **Important**: This node requires a self-hosted n8n instance due to its external dependencies.
- **Configure Nodes**:
  - **Webhook**: Set up the webhook to receive YouTube URLs. Alternatively, configure the Telegram node if using Telegram as the input method.
  - **Google Docs**: Provide valid credentials to enable writing the transcript to a Google Docs document.
  - **AI Model**: Set up the GPT-4o-mini model for summarization and Q&A functionality.
- **Test the Workflow**: Send a YouTube URL via your chosen input method (Telegram or webhook) and confirm that the summary is generated and delivered correctly.

## Customization
- **Language**: Adjust the AI prompts to generate summaries and answers in any desired language.
- **Output Format**: Modify the summary structure by editing the prompt in the summarization node.
- **Input Methods**: Replace the Telegram node with another messaging or input node to adapt the workflow to different platforms.

## Who Can Benefit?
This template is perfect for:
- **Content Creators**: Quickly summarize video content for repurposing or review.
- **Students and Researchers**: Extract key insights from educational or informational videos efficiently.
- **General Users**: Interact with video content via AI without needing to watch the full video.

## Problem Solved
This workflow simplifies video content consumption by automating the extraction and summarization of key points, and by enabling interactive Q&A to address specific questions without rewatching the video.

## Additional Notes
- **Disclaimer**: The youtubeTranscripter community node is required and only works on self-hosted n8n instances due to its reliance on external services.
- **Apple Users**: Enhance your experience with a custom shortcut to share YouTube videos directly to the workflow. Download the shortcut here.
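Because the URL can arrive in several shapes (watch links, `youtu.be` short links, Shorts), a small normalization step helps before the transcript node. This is a minimal sketch of such a helper — not part of the template itself — that could run in an n8n Code node:

```javascript
// Extract the 11-character YouTube video ID from common URL formats,
// returning null when no ID is found.
function extractVideoId(url) {
  const m = url.match(/(?:youtu\.be\/|v=|\/shorts\/)([\w-]{11})/);
  return m ? m[1] : null;
}
```

Passing only the ID downstream keeps the transcript node independent of which input channel (Telegram or webhook) delivered the link.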
by Joseph LePage
# Generate SEO-Optimized WordPress Content with Perplexity Research

## Who is This For?
This workflow is ideal for content creators, marketers, and businesses looking to streamline the creation of SEO-optimized blog posts for WordPress. It is particularly suited for professionals in the AI consulting and workflow automation industries.

## What Problem Does This Workflow Solve?
Creating high-quality, SEO-friendly blog posts can be time-consuming and challenging, especially when trying to balance research, formatting, and publishing. This workflow automates the process by integrating research capabilities, AI-driven content creation, and seamless WordPress publishing. It reduces manual effort while ensuring professional-grade output.

## What This Workflow Does
- **Research**: Gathers detailed insights from Perplexity AI based on user-provided queries.
- **Content Generation**: Uses OpenAI models to create structured blog posts, including titles, slugs, meta descriptions, and HTML content optimized for WordPress.
- **Image Handling**: Automatically fetches and uploads featured images to WordPress posts.
- **Publishing**: Drafts the blog post directly in WordPress with all necessary formatting and metadata.
- **Notification**: Sends a success message via Telegram upon completion.

## Setup Guide
Prerequisites:
- A WordPress account with API access.
- OpenAI API credentials.
- Perplexity AI API credentials.
- Telegram bot credentials for notifications.

Steps:
- Import the workflow into your n8n instance.
- Configure API credentials for WordPress, OpenAI, Perplexity AI, and Telegram.
- Customize the form trigger to define your research query.
- Test the workflow using sample queries to ensure smooth execution.

## How to Customize This Workflow to Your Needs
- Modify the research query prompt in the "Form Trigger" node to suit your industry or niche.
- Adjust content generation guidelines in the "Copywriter AI Agent" node for specific formatting preferences.
- Replace the image URL in the "Set Image URL" node with your own source or dynamic image selection logic.
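The content-generation step produces a slug alongside the title. If you would rather derive the slug deterministically than ask the model for one, a helper like the following could do it (an illustrative sketch only — the template's own slug comes from the AI output):

```javascript
// Turn a post title into a URL-safe WordPress slug:
// lowercase, accents stripped, punctuation removed, spaces collapsed to hyphens.
function toSlug(title) {
  return title
    .toLowerCase()
    .normalize('NFD').replace(/[\u0300-\u036f]/g, '') // strip diacritics
    .replace(/[^a-z0-9\s-]/g, '')                     // drop punctuation
    .trim()
    .replace(/[\s-]+/g, '-');                         // collapse to single hyphens
}
```

A deterministic slug also makes it easier to detect duplicate drafts before publishing.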
by Yang
## What this workflow does
This workflow captures a full-page screenshot of any website added to a Google Sheet and automatically uploads the screenshot to a designated Google Drive folder. It uses Dumpling AI's screenshot API to generate the image and manages file storage through Google Drive.

## Who is this for
This is ideal for:
- Marketers and outreach teams capturing snapshots of client or lead websites
- Lead generation specialists tracking landing page visuals
- Researchers or analysts who need to archive website visuals from URLs
- Anyone looking to automate website screenshot collection at scale

## Requirements
- A Google Sheet with a column labeled Website where URLs will be added
- **Dumpling AI** API access for screenshot capture
- A connected Google Drive account with an accessible folder to store screenshots

## How to set up
- Replace the Google Sheet and folder IDs in the workflow with your own.
- Connect your Dumpling AI and Google credentials in n8n.
- Make sure your sheet contains a Website column with valid URLs.
- Activate the workflow to begin watching for new entries.

## How it works (Workflow Steps)
- **Watch New Row in Google Sheets**: Triggers when a new row is added to the sheet.
- **Request Screenshot from Dumpling AI**: Sends the website URL to Dumpling AI and gets a screenshot URL.
- **Download Screenshot**: Fetches the image file from the returned URL.
- **Upload Screenshot to Google Drive**: Uploads the file to a selected folder in Google Drive.

## Customization Ideas
- Add timestamped filenames using the current date or domain name
- Append the Google Drive URL back to the same row in the sheet for easy access
- Extend the workflow to send Slack or email notifications when screenshots are saved
- Add filters to validate URLs before sending them to Dumpling AI
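The first customization idea — timestamped filenames built from the domain — can be sketched in a few lines. This is an assumed implementation, not part of the template; the `.png` extension is a placeholder for whatever format the screenshot API returns:

```javascript
// Derive a stable, sortable filename for a screenshot from its source URL,
// e.g. "example.com_2025-04-29.png".
function screenshotFilename(url, date = new Date()) {
  const domain = new URL(url).hostname.replace(/^www\./, '');
  const stamp = date.toISOString().slice(0, 10); // YYYY-MM-DD
  return `${domain}_${stamp}.png`;
}
```

Set this as the file name in the Google Drive upload node so repeated captures of the same site sort chronologically.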
by Samir Saci
Tags: Sustainability, Web Scraping, OpenAI, Google Sheets, Newsletter, Marketing

## Context
Hey! I'm Samir, a Supply Chain Engineer and Data Scientist from Paris, and the founder of LogiGreen Consulting. We use AI, automation, and data to support sustainable business practices for small, medium, and large companies. I use this workflow to bring awareness to sustainability and promote my business by delivering automated daily news digests.

> Promote your business with a fully automated newsletter powered by AI!

This n8n workflow scrapes articles from the official EU news website and sends a daily curated digest, highlighting only the most relevant sustainability news.

For business inquiries, feel free to connect with me on LinkedIn.

## Who is this template for?
This workflow is useful for:
- **Business owners** who want to promote their services or products with a fully automated newsletter
- **Sustainability professionals** staying informed on EU climate news
- **Consultants and analysts** working on CSRD, Green Deal, or ESG initiatives
- **Corporate communications teams** tracking relevant EU activity
- **Media curators** building newsletters

## What does it do?
This n8n workflow:
- Triggers automatically every morning
- Scrapes articles from the EU Commission News Portal
- Uses OpenAI GPT-4o to classify each article for sustainability relevance
- Stores the results in a Google Sheet for tracking
- Generates a beautiful HTML digest email, including titles, summaries, and images
- Sends the digest via Gmail to your mailing list

## How it works
1. Trigger at 08:30 every morning
2. Scrape and extract article blocks from the EU news site
3. Use OpenAI to decide if articles are sustainability-related
4. Store relevant entries in Google Sheets
5. Generate an HTML email with a professional layout and logo
6. Send the digest via Gmail to a configured recipient list

## What do I need to get started?
You'll need:
- A Google Sheet connected to your n8n instance
- An OpenAI account with GPT-4 or GPT-4o access
- A Gmail OAuth credential setup

## Follow the Guide!
Follow the sticky notes inside the workflow or check out my step-by-step video tutorial on how to configure and deploy it.

## Notes
- You can customize the system prompt to adjust how the AI classifies "sustainability"
- Works well for tracking updates relevant to climate action, the green transition, and the circular economy
- This workflow was built using n8n version 1.85.4

Submitted: April 24, 2025
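Step 5 of "How it works" assembles the HTML digest from the classified articles. A minimal sketch of that assembly, assuming each article item carries `title`, `url`, and `summary` fields (the field names and inline styles here are illustrative, not the template's actual layout):

```javascript
// Render a list of article objects into a simple email-safe HTML table.
// Email clients have patchy CSS support, so styles are kept inline.
function buildDigestHtml(articles) {
  const rows = articles.map(a => `
    <tr><td style="padding:12px 0;">
      <a href="${a.url}" style="font-size:16px;font-weight:bold;">${a.title}</a>
      <p style="margin:4px 0 0;color:#555;">${a.summary}</p>
    </td></tr>`).join('');
  return `<table width="600" cellpadding="0" cellspacing="0">
    <tr><td><h2>EU Sustainability Digest</h2></td></tr>${rows}</table>`;
}
```

The resulting string can be passed straight to the Gmail node's HTML body field.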
by PollupAI
This n8n workflow automates the import of your Google Keep notes into a structured Google Sheet, using Google Drive, OpenAI for AI-powered processing, and JSON file extraction. It's perfect for users who want to turn exported Keep notes into a searchable, filterable spreadsheet, optionally enhanced by AI summarization or transformation.

## Who is this for?
- Researchers, knowledge workers, and digital minimalists who rely on Google Keep and want to better organize or analyze their notes.
- Anyone who regularly exports Google Keep notes and wants a clean, automated workflow to store them in Google Sheets.
- Users looking to apply AI to process, summarize, or extract insights from raw notes.

## What problem is this workflow solving?
Exporting Google Keep notes via Google Takeout gives you unstructured .json files that are hard to read and manage. This workflow solves that by:
- Filtering relevant .json files
- Extracting note content
- (Optionally) applying AI to analyze or summarize each note
- Writing the result into a structured Google Sheet

## What this workflow does
- **Google Drive Search**: Looks for .json files inside a specified "Keep" folder.
- **Loop**: Processes files in batches of 10.
- **File Filtering**: Filters by .json extension.
- **Download + Extract**: Downloads each file and extracts note content from JSON.
- **Optional Filtering**: Only keeps non-archived notes or those meeting content criteria.
- **AI Processing (optional)**: Uses OpenAI to summarize or transform the note content.
- **Prepare for Export**: Maps note fields to be written.
- **Google Sheets**: Appends or updates the target sheet with the note data.

## Setup
- Export your Google Keep notes using Google Takeout: deselect all, then choose only Google Keep, and choose "Send download link via email".
- Unzip the downloaded archive and upload the .json files to your Google Drive.
- Connect Google Drive, OpenAI, and Google Sheets in n8n.
- Set the correct folder path for your notes in the "Search in 'Keep' folder" node.
- Point the Google Sheet node to your spreadsheet.

## How to customize this workflow to your needs
- **Skip AI processing**: If you don't need summaries or transformations, remove or disable the OpenAI Chat Model node.
- **Filter criteria**: Customize the Filter node to extract only recent notes, or those containing specific keywords.
- **AI prompts**: Edit the Tools Agent or Chat Model node to instruct the AI to summarize, extract tasks, categorize notes, etc.
- **Field mapping**: Adjust the "Set fields for export" node to control what gets written to the spreadsheet.

Use this template to build a powerful knowledge extraction tool from your Google Keep archive, ideal for backups, audits, or data-driven insights.
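The "Download + Extract" and "Optional Filtering" steps boil down to mapping each Takeout JSON object to spreadsheet fields and skipping archived notes. The sketch below assumes the common Keep Takeout field names (`title`, `textContent`, `isArchived`, `isTrashed`, `userEditedTimestampUsec`); treat it as an illustration to adapt, since your export may differ:

```javascript
// Map one parsed Keep export object to a flat row, or return null to skip it.
function parseKeepNote(note) {
  if (note.isArchived || note.isTrashed) return null; // keep only active notes
  return {
    title: note.title || '(untitled)',
    content: note.textContent || '',
    // Takeout stores edit times in microseconds since the epoch.
    edited: note.userEditedTimestampUsec
      ? new Date(note.userEditedTimestampUsec / 1000).toISOString()
      : null,
  };
}
```

Rows returned from this mapping feed directly into the "Set fields for export" step.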
by Ranjan Dailata
## Who this is for
The Google Trend Data Extract & Summarization workflow is ideal for trend researchers, digital marketers, content strategists, and AI developers who want to automate the extraction, summarization, and distribution of Google Trends data. This end-to-end solution helps transform trend signals into human-readable insights and delivers them across multiple channels.

It is built for:
- **Market Researchers** - Tracking trends by topic or region
- **Content Strategists** - Identifying content opportunities from trending data
- **SEO Analysts** - Monitoring search volume and shifts in keyword popularity
- **Growth Hackers** - Reacting quickly to real-time search behavior
- **AI & Automation Engineers** - Creating automated trend monitoring systems

## What problem is this workflow solving?
Google Trends data can provide rich insights into user interests, but the raw data is not always structured or easily interpretable at scale. Manually extracting, cleaning, and summarizing trends from multiple regions or categories is time-consuming. This workflow solves the following problems:
- Automates the conversion of markdown or scraped HTML into clean textual input
- Transforms unstructured data into a structured format ready for processing
- Uses AI summarization to generate easy-to-read insights from Google Trends
- Distributes summaries via email and webhook notifications
- Persists responses to disk for archiving, auditing, or future analytics

## What this workflow does
- **Receives input**: Sets a URL for the data extraction and analysis.
- Uses Bright Data's Web Unlocker to extract content from the relevant site.
- **Markdown to Textual Data Extractor**: Converts markdown content into plaintext using n8n's Function or Markdown nodes.
- **Structured Data Extract**: Parses the plaintext into structured JSON suitable for AI processing.
- **Summarize Google Trends**: Sends structured data to Google Gemini with a summarization prompt to extract key takeaways.
- **Send Summary via Gmail**: Composes an email with the AI-generated summary and sends it to a designated recipient.
- **Persist to Disk**: Writes the AI-structured data to disk.
- **Webhook Notification**: Sends the summarized response to an external system (e.g., Slack, Notion, Zapier) using a webhook.

## Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs
- **Update Source**: Update the workflow input to read from Google Sheets, Airtable, etc.
- **Gemini Prompt Tuning**: Customize prompts to extract summaries such as "Summarize the most significant trend shifts" or "Generate content ideas from the trending search topics."
- **Email Personalization**: Configure the Gmail node to use dynamic subject lines like "Weekly Google Trends Summary - {{date}}" and send to multiple stakeholders or mailing lists.
- **File Storage Customization**: Save with timestamps, e.g., trends_summary_2025-04-29.json, or extend to S3 or cloud drive integrations.
- **Webhook Use Cases**: Send the summary to internal dashboards, Slack channels, or automation tools like Make, Zapier, etc.
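The "Markdown to Textual Data Extractor" step can be approximated with a few regular expressions when the built-in Markdown node is not enough. The sketch below is a rough, lossy conversion — a starting point for a Function/Code node, not the template's exact implementation:

```javascript
// Strip common markdown syntax, keeping the readable text for the LLM prompt.
function markdownToText(md) {
  return md
    .replace(/`{3}[\s\S]*?`{3}/g, '')          // drop fenced code blocks
    .replace(/!\[[^\]]*\]\([^)]*\)/g, '')      // drop images
    .replace(/\[([^\]]+)\]\([^)]*\)/g, '$1')   // keep link text, drop URL
    .replace(/^#{1,6}\s*/gm, '')               // strip heading markers
    .replace(/[*_`>]/g, '')                    // strip emphasis/quote characters
    .replace(/\n{3,}/g, '\n\n')                // collapse blank runs
    .trim();
}
```

Clean plaintext like this keeps the Gemini summarization prompt focused on content rather than markup.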
by Hichul
This workflow automatically drafts replies to your emails using an OpenAI Assistant, streamlining your inbox management. It's designed for support teams, sales professionals, or anyone looking to accelerate their email response process by leveraging AI to create context-aware draft replies in Gmail.

## How it works
1. The workflow runs on a schedule (every minute) to check for emails with a specific label in your Gmail account.
2. It takes the content of the newest email in a thread and sends it to your designated OpenAI Assistant for processing.
3. A draft reply is generated by the AI assistant.
4. This AI-generated reply is then added as a draft to the original email thread in Gmail.
5. Finally, the initial trigger label is removed from the email thread to prevent it from being processed again.

## Set up steps
1. **Connect your accounts**: You'll need to connect your Gmail and OpenAI accounts in the respective nodes.
2. **Configure the trigger**: In the "Get threads with specific labels" Gmail node, specify the label that you want to use to trigger the workflow (e.g., generate-reply). Any email you apply this label to will be processed.
3. **Select your OpenAI Assistant**: In the "Ask OpenAI Assistant" node, choose the pre-configured Assistant you want to use for generating replies.
4. **Configure label removal**: In the "Remove AI label from email" Gmail node, ensure the same trigger label is selected to be removed after the draft has been successfully created.
5. **Activate the workflow**: Save and activate the workflow to begin automating your email replies.
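When sending "the content of the newest email in a thread" to the Assistant, it usually helps to strip the quoted history below it so the model sees only the new message. This heuristic is an assumption layered on top of the workflow (quote markers vary by mail client), sketched as a Code-node helper:

```javascript
// Return the text above the first quoted-reply marker in an email body.
// Handles the common "On ... wrote:" line and "> "-prefixed quoted lines.
function latestMessageText(body) {
  const lines = body.split('\n');
  const cut = lines.findIndex(
    l => /^On .+ wrote:$/.test(l.trim()) || l.startsWith('>')
  );
  return (cut === -1 ? lines : lines.slice(0, cut)).join('\n').trim();
}
```

Trimming the quoted tail also keeps the Assistant request well under token limits on long threads.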
by Ranjan Dailata
## Notice
Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for
The Automated Resume Job Matching Engine is an intelligent workflow designed for career platforms, HR tech startups, recruiting firms, and AI developers who want to streamline job-resume matching using real-time data from LinkedIn and job boards.

This workflow is tailored for:
- **HR Tech Founders** - Building next-gen recruiting products
- **Recruiters & Talent Sourcers** - Seeking automated candidate-job fit evaluation
- **Job Boards & Portals** - Enriching the user experience with AI-driven job recommendations
- **Career Coaches & Resume Writers** - Offering personalized job fit analysis
- **AI Developers** - Automating large-scale matching tasks using LinkedIn and job data

## What problem is this workflow solving?
Manually matching a resume to a job description is time-consuming, biased, and inefficient. Additionally, accessing live job postings and candidate profiles requires overcoming web scraping limitations. This workflow solves:
- Automated LinkedIn profile and job post data extraction using Bright Data MCP infrastructure
- Semantic matching between job requirements and a candidate's resume using OpenAI 4o mini
- Pagination handling for high-volume job data
- End-to-end automation, from scraping to delivery via webhook, with persistence of the matched-job response to disk

## What this workflow does
**Bright Data MCP for Job Data Extraction**
- Uses Bright Data MCP clients to extract multiple job listings (supports pagination)
- Pulls job data from LinkedIn with the pre-defined filtering criteria

**OpenAI 4o mini LLM Matching Engine**
- Extracts paginated job data and textual job description information from the scraped results by leveraging the Bright Data MCP scrape_as_html tool.
- The AI Job Matching node compares the job description against the candidate's resume to generate match scores with insights.

**Data Delivery**
- Sends the final match report to a webhook notification endpoint
- Persists the AI-matched job response to disk

## Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post: model-context-protocol.
- You need a Bright Data account and must complete the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP server @brightdata/mcp.
- You need to install n8n-nodes-mcp.

## Setup
1. Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
2. Install the Bright Data MCP server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials to connect with the MCP Client (STDIO) account using the Bright Data MCP server as shown below. Make sure to copy the Bright Data API token into the Environments textbox as API_TOKEN=<your-token>.
8. Update the Set input fields for the candidate resume, keywords, and other filtering criteria.
9. Update the Webhook HTTP Request node with the webhook endpoint of your choice.
10. Update the file name and path to persist on disk.
## How to customize this workflow to your needs
- **Target Different Job Boards**: Set input fields with sites like Indeed, ZipRecruiter, or Monster.
- **Customize Matching Criteria**: Adjust the prompt inside the AI Job Match node. Include scoring metrics like skills match %, experience relevance, or cultural fit.
- **Automate Scheduling**: Use a Cron node to periodically check for new jobs matching a profile, or set triggers based on webhook or input form submissions.
- **Output Customization**: Add Markdown/PDF formatting for report summaries. Extend with Google Sheets export for internal analytics.
- **Enhance Data Security**: Mask personal info before sending it to external endpoints.
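The semantic matching itself is done by the LLM, but a deterministic keyword-overlap score can serve as a sanity check or pre-filter alongside it. This is a naive illustration (substring matching only, no stemming or synonyms), not the workflow's actual scoring method:

```javascript
// Score how many of the keywords required by the job description
// also appear in the resume, as a 0-100 percentage.
function keywordMatchScore(resume, jobDescription, keywords) {
  const norm = (s) => s.toLowerCase();
  const r = norm(resume), j = norm(jobDescription);
  const required = keywords.filter(k => j.includes(norm(k))); // keywords the job actually asks for
  if (required.length === 0) return { score: 0, matched: [] };
  const matched = required.filter(k => r.includes(norm(k)));
  return { score: Math.round((matched.length / required.length) * 100), matched };
}
```

Comparing this mechanical score against the LLM's judgment can help flag hallucinated or inflated match ratings.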
by Gaurav
Automate your entire guest communication journey, from booking to post-stay, with personalized welcome emails, review requests, and daily operational reports. Perfect for hotels, B&Bs, and short-term rental properties looking to enhance the guest experience while reducing manual work and improving operational efficiency.

## How it works
- **Pre-arrival welcome emails**: Automatically sends personalized welcome emails 1-2 days before guest check-in with reservation details, hotel amenities, and contact information.
- **Post-stay review requests**: Sends automated review request emails 24 hours after checkout with Google Reviews links and return-guest discount codes.
- **Daily staff reports**: Generates comprehensive arrival/departure reports every morning at 6 AM for front desk, housekeeping, and management teams.
- **Smart tracking**: Prevents duplicate emails by automatically updating tracking status in your Google Sheets database.
- **Professional templates**: Uses responsive HTML email templates that work across all devices and email clients.

## Set up steps
1. **Connect Google Sheets**: Link your hotel reservation spreadsheet (must include columns for guest details, check-in/out dates, and email tracking).
2. **Configure Gmail account**: Set up Gmail credentials for sending automated emails.
3. **Customize hotel information**: Update the hotel name, contact details, and branding in the "Edit Fields" nodes.
4. **Set staff email addresses**: Configure recipient addresses for daily operational reports.
5. **Adjust timing**: Modify the schedule triggers if you want different timing for emails and reports (currently set to every 6 hours for guest emails and 6 AM daily for staff reports).

Time investment: ~30 minutes for initial setup, then fully automated operation.
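The "smart tracking" behavior — send the welcome email only in the 1-2 day pre-arrival window and never twice — reduces to a date check per reservation row. A minimal sketch, assuming the sheet exposes a `checkIn` date and a `welcomeSent` flag (field names are placeholders for your own columns):

```javascript
// Decide whether a reservation is due its pre-arrival welcome email.
function needsWelcomeEmail(reservation, today = new Date()) {
  if (reservation.welcomeSent) return false; // duplicate-send guard
  const checkIn = new Date(reservation.checkIn);
  const daysUntil = Math.round((checkIn - today) / 86400000); // ms per day
  return daysUntil >= 1 && daysUntil <= 2; // send 1-2 days before arrival
}
```

After a send, the workflow writes the tracking flag back to the sheet so the next 6-hourly run skips that row.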
by Eric
## Use case
Instead of this:
https://us06web.zoom.us/j/83456429326?pwd=1hVesbyHCsOfstyVU3z4CR6D46A8K.1

share this:
mydomain.com/meet-me

Do you ever wish you had one simple URL that you can share with people to hop on a Zoom meeting? You could waste time:
- creating a recurring Zoom meeting
- saving the link somewhere
- finding it and copying it each time you need it
- sharing an ugly long link with everyone

Or... you could create a beautiful link using your own domain/website that redirects to your Zoom meeting, and share that beautified URL with everyone. And it will be easy for you to remember.

> NOTE: Zoom now forces a one-year maximum lifetime on recurring meetings. So I created this simple workflow to solve a few headaches.

## What this workflow does
1. Triggers once annually (every 360 days)
2. Creates a new, recurring meeting in Zoom
3. Updates a redirect script with the new Zoom URL on a WordPress page
4. Notifies you in a Slack channel

What this workflow lacks in breakthrough innovation, it makes up for with usefulness and peace of mind. Have fun and make it your own!

## Setup
1. Add your credentials in each node. This requires that you have Zoom, WordPress, and Slack accounts, and have API access on those accounts.
2. Create a page in WordPress and get its ID (or create a new page for this purpose).
3. Configure node parameters according to your needs.
4. TEST! Don't ever skip this step. Ever.
5. Set it and forget it.

> NOTE: You can replace the WordPress node with another website CMS node, or a generic HTTP request for a non-WordPress site. You can also remove or replace the Slack node with other notification functionality (e.g., SMS, WhatsApp, email).

Template was created in n8n v1.58.2
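Before writing the freshly created meeting URL into the WordPress redirect, a quick sanity check avoids redirecting visitors to a malformed link. This validator is an optional addition I'd suggest, not part of the workflow; it assumes Zoom's usual join-link shape (`https://<subdomain>.zoom.us/j/<9-11 digit meeting ID>`):

```javascript
// Loosely validate that a string looks like a Zoom join URL before
// publishing it as the redirect target.
function isZoomMeetingUrl(url) {
  try {
    const u = new URL(url);
    return /(^|\.)zoom\.us$/.test(u.hostname) && /^\/j\/\d{9,11}$/.test(u.pathname);
  } catch {
    return false; // not a parseable URL at all
  }
}
```

An IF node using this check could route failures to the Slack notification instead of updating the page, so a broken redirect never goes live.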