by Dajeel Dulal
Turn any LinkedIn post into a personalized cold email opener that sounds like a human wrote it, in seconds. Whether you're in sales, partnerships, or outreach, this tool reads LinkedIn posts like a human, distills the core message, and gives you a smart, conversational opener to kick off the relationship the right way.

## How It Works
1. Paste the post + author info into a short form.
2. AI reads the post like a B2B sales expert would.
3. Output = personalized opener, company name, prospect's name, and next steps.
4. Copy-paste into your cold email and hit send.

The opener isn't generic fluff — it references real details, sounds natural, and shows you actually paid attention.

## Perfect For
- SDRs and BDRs
- Agency outreach
- Partnership prospecting
- Any cold outreach that starts with a real conversation

## Setup Steps
Setup time: ~2-3 mins
1. Add your OpenAI credentials (or use n8n's built-in credits).
2. Open the form and test it with the sample post.
3. Tweak the AI prompt if you want to target a different niche or tone.
4. (Optional) Connect to Google Sheets, a CRM, or your email tool.

You're live.
by Jimleuk
This n8n template demonstrates an approach to image embeddings for the purpose of building a quick image contextual search. Use-cases include a personal photo library, product recommendations, or searching through video footage.

## How it works
- A photo is imported into the workflow via Google Drive.
- The photo is processed by the Edit Image node to extract colour information. This information forms part of our semantic metadata used to identify the image.
- The photo is also processed by a vision-capable model which analyses the image and returns a short description with semantic keywords.
- Both pieces of information about the image are combined with the metadata of the image to form a document describing the image.
- This document is then inserted into our vector store as a text embedding which is associated with our image.
- From here, the user can query the vector store as they would any document, and the relevant image references and/or links should be returned.

## Requirements
- Google account to download image files from Google Drive.
- OpenAI account for the vision-capable AI and embedding models.

## Customise this workflow
Text summarisation is just one of many techniques to generate image embeddings. If the results are unsatisfactory, there are dedicated image embedding models such as Google's Vertex AI multimodal embeddings.
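The "combine into one document" step above can be sketched as a small function. The field names (dominantColours, description, keywords) are illustrative assumptions, not the template's exact schema:

```javascript
// Minimal sketch: merge colour info, the vision model's output, and Drive
// metadata into one text document that gets embedded. The joined text is
// what is vectorised; the Drive file ID travels along as vector-store
// metadata so search hits can link back to the image.
function buildImageDocument(meta, colourInfo, visionResult) {
  const lines = [
    `Filename: ${meta.name}`,
    `Created: ${meta.createdTime}`,
    `Dominant colours: ${colourInfo.dominantColours.join(', ')}`,
    `Description: ${visionResult.description}`,
    `Keywords: ${visionResult.keywords.join(', ')}`,
  ];
  return {
    pageContent: lines.join('\n'),
    metadata: { fileId: meta.id, source: 'google-drive' },
  };
}
```

Querying the store then matches against this text, while the returned metadata points to the original file.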
by Tarek Mustafa
## Who is this for?
Jira users who want to automate the generation of a Lessons Learned or Retrospective report after an Epic is Done.

## What problem is this workflow solving? / use case
Lessons Learned / Retrospective reports are often omitted in Agile teams because they take time to write. With n8n and AI, this process can be automated.

## What this workflow does
- Triggers automatically upon an Epic reaching the "Done" status in Jira.
- Collects all related tasks and comments associated with the completed Epic.
- Intelligently filters the gathered data to provide the LLM with the most relevant information.
- Utilizes an LLM with a structured system message to generate insightful reports.
- Delivers the finalized report directly to your specified Google Docs document.

## Setup
1. Create a Jira API key and follow the credentials setup in the Jira trigger node.
2. Create credentials for Google Docs and paste your document ID into the node.

## How to customize this workflow to your needs
Change the system message in the AI Agent to fit your needs.
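The filtering step above can be pictured as a small reduction over the Epic's issues. The field selection and comment cap are illustrative assumptions (the node's actual logic may differ); the field paths follow Jira's REST API shape:

```javascript
// Minimal sketch of "filter the gathered data": keep only the fields an
// LLM needs to write a retrospective, and cap comment length to keep the
// prompt small.
function filterForLlm(issues, maxCommentChars = 500) {
  return issues.map((issue) => ({
    key: issue.key,
    summary: issue.fields.summary,
    status: issue.fields.status.name,
    // Trim each comment body; long threads otherwise dominate the prompt.
    comments: (issue.fields.comment?.comments ?? []).map((c) =>
      c.body.slice(0, maxCommentChars)
    ),
  }));
}
```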
by Jimleuk
Note: This template only works for self-hosted n8n.

This n8n template demonstrates how to use the Langchain Code node to track token usage and cost for every LLM call. This is useful if your templates handle multiple clients or customers and you need a cheap and easy way to capture how much of your AI credits they are using.

## How it works
- In our mock AI service, we're offering a data conversion API to convert resume PDFs into JSON documents.
- A Form trigger is used to allow for PDF upload, and the file is parsed using the Extract from File node.
- An Edit Fields node is used to capture additional variables to send to our log.
- Next, we use the Information Extractor node to organise the resume data into the given JSON schema.
- The LLM subnode attached to the Information Extractor is a custom one we've built using the Langchain Code node.
- With our custom LLM subnode, we're able to capture the usage metadata using lifecycle hooks.
- We've also attached a Google Sheets tool to our LLM subnode, allowing us to send our usage metadata to a Google Sheet.
- Finally, we demonstrate how you can aggregate from the Google Sheet to understand how many AI tokens, and how much cost, your clients are liable for.

Check out the example Client Usage Log: https://docs.google.com/spreadsheets/d/1AR5mrxz2S6PjAKVM0edNG-YVEc6zKL7aUxHxVcffnlw/edit?usp=sharing

## How to use
- **Self-hosted n8n only** - the Langchain Code node is only available in the self-hosted version of n8n. It is not available in n8n cloud.
- The LLM subnode can only be attached to non-"AI agent" nodes: Basic LLM node, Information Extractor, Question & Answer Chain, Sentiment Analysis, Summarization Chain and Text Classifier.

## Requirements
- Self-hosted version of n8n
- OpenAI for LLM
- Google Sheets to store usage metadata

## Customising this template
- Bring the custom LLM subnode into your own templates! In many cases, it can be a drop-in replacement for the regular OpenAI subnode.
- Not using Google Sheets? Try other databases or an HTTP call to pipe into your CRM.
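Once the lifecycle hooks hand you the usage metadata, turning it into a log row is simple arithmetic. A minimal sketch, where the per-1K-token prices are placeholder assumptions (check your model's current pricing before trusting the numbers):

```javascript
// Placeholder pricing table; the rates below are assumed, not official.
const PRICES_PER_1K = {
  'gpt-4o-mini': { prompt: 0.00015, completion: 0.0006 },
};

// One row per LLM call; aggregate by clientId in the sheet afterwards.
function usageToLogRow(model, usage, clientId) {
  const p = PRICES_PER_1K[model];
  const cost =
    (usage.promptTokens / 1000) * p.prompt +
    (usage.completionTokens / 1000) * p.completion;
  return {
    clientId,
    model,
    promptTokens: usage.promptTokens,
    completionTokens: usage.completionTokens,
    cost: Number(cost.toFixed(6)),
  };
}
```

Summing `cost` per `clientId` in the sheet gives the per-client liability shown in the example log.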
by Derek Cheung
## Use case
This workflow enables a Telegram bot that can:
- Accept speech input in one of 55 supported languages
- Automatically detect the language spoken and translate the speech to another language
- Respond back with the translated speech output

This allows users to communicate across language barriers by simply speaking to the bot, which will handle the translation seamlessly.

## How does it work?
### Translation
In the translation step, the workflow converts the user's speech input to text and detects the language of the input text. If it's English, it will translate to French. If it's French, it will translate to English. To change the default translation languages, you can update the prompt in the AI node.

### Output
In the output step, we provide the translated text output back to the user, and speech output is generated in the translated language.

## Setup steps
1. Obtain a Telegram API token: start a chat with the BotFather, enter /newbot and reply with your new bot's display name and username, then copy the bot token and use it in the Telegram node credentials in n8n.
2. Update the Settings node to customize the desired languages.
3. Activate the flow.

## Full list of supported languages
All supported languages:
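The English/French routing described in the translation step reduces to a tiny mapping. A sketch, assuming the transcription step yields an ISO language code; editing the `pair` is the equivalent of editing the prompt's default languages:

```javascript
// Minimal sketch of the bidirectional language swap: English in, French
// out, and vice versa. Anything outside the pair falls back to the first
// language (a simplifying assumption, not the template's exact behaviour).
function pickTargetLanguage(detected, pair = ['en', 'fr']) {
  const [a, b] = pair;
  if (detected === a) return b;
  if (detected === b) return a;
  return a;
}
```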
by Radouane Driouich
**Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis**

## Description
The "Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis" template is designed specifically for professionals, business owners, entrepreneurs, and anyone struggling to manage a high volume of daily emails. It solves common inbox problems such as email overload, missed important messages, manual sorting inefficiencies, and unorganized inbox clutter. By using intelligent content analysis powered by GPT-4o-mini, this workflow automatically categorizes incoming Gmail messages with relevant labels, ensuring efficient email management and significantly boosting productivity.

## Workflow Overview
### How It Works
- **Email Detection**: Continuously monitors your Gmail inbox every minute to detect new incoming emails.
- **Content Extraction**: Retrieves key email components including sender details, subject line, and body content for analysis.
- **Intelligent Labeling**: Utilizes GPT-4o-mini AI to contextually analyze each email and assign 1-3 relevant labels based on your existing Gmail label structure.
- **Automatic Application**: Applies the selected labels directly to your emails, equipped with robust error-handling mechanisms to ensure accuracy and reliability.

### Key Benefits
- **Organized Inbox**: Automatically maintains inbox order and clarity.
- **Time-Saving**: Reduces manual email management effort significantly.
- **Customization**: Fully adaptable to specific labeling and organizational requirements.

## Pre-conditions
Before using this template, ensure the following prerequisites are met:
- Active Gmail account with OAuth2 enabled.
- Active OpenAI account with a GPT-4o-mini API key.
- Clearly defined labels set up in your Gmail account (e.g., "Work", "Personal", "Urgent").

## Setup Instructions
Follow these straightforward setup steps to activate the workflow:
1. **Connect Gmail Account**: Authorize your Gmail account using OAuth2 (takes approximately 2-3 minutes).
2. **Configure OpenAI GPT-4o-mini API**: Enter and validate your GPT-4o-mini API key to enable advanced email analysis.
3. **Establish Gmail Labels**: Ensure necessary labels are created within Gmail. Examples include "Work", "Personal", and "Urgent".
4. **Activate and Verify**: Click the "Activate" button in n8n, then send a test email to your Gmail inbox to confirm that labels are applied correctly.

## Customization Tips
You can easily customize this workflow to fit your specific needs:
- **Modify Gmail Labels**: Create and adapt labels to match your business or personal categorization strategy.
- **Adjust GPT-4o-mini Criteria**: Fine-tune the AI prompts to improve accuracy and relevance based on your unique email management needs.
- **Expand the Workflow**: Integrate additional conditions, actions, or external applications to further automate and optimize your email management processes.

Improve your daily workflow efficiency and achieve a clutter-free Gmail inbox by leveraging the power of GPT-4o-mini today.
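The "assign 1-3 relevant labels" guardrail can be sketched as a validation pass over the model's output. The output shape (an array of label names) and the "Uncategorized" fallback label are illustrative assumptions:

```javascript
// Minimal sketch: keep only labels that actually exist in the account,
// cap the count at three, and fall back to a hypothetical catch-all
// label so every email gets at least one label applied.
function sanitizeLabels(modelLabels, existingLabels, max = 3) {
  const allowed = new Set(existingLabels);
  const picked = modelLabels.filter((l) => allowed.has(l)).slice(0, max);
  return picked.length > 0 ? picked : ['Uncategorized'];
}
```

A check like this is what gives the "robust error-handling" step its reliability: the model can hallucinate a label name, but only known labels ever reach Gmail.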
by Jimleuk
This n8n template demonstrates how you can leverage an existing support site's search to power your support chatbots and agents.

Building a support chatbot need not be complicated! If building and indexing vector stores or duplicating data isn't necessarily your thing, an alternative implementation of the RAG approach is to leverage existing knowledge-bases such as support portals. In this way, document management and maintenance of your support agent is significantly reduced.

Disclaimer: This template example uses AcuityScheduling's help center website but is not associated with, supported nor endorsed by the company.

## How it works
- A simple AI agent is connected with a chat trigger to receive user queries.
- The AI agent is instructed to fetch information from the knowledge-base via the attached custom workflow tool (aka the "knowledgebase tool").
- There is no step to replicate the entire support articles database into a vector store. You may choose not to because of the time, cost and maintenance involved. Instead, the tool leverages the existing support portal's search API to retrieve knowledge-base articles.
- Finally, the search results are formatted before sending an aggregated response back to the agent.

## How to use
Customise the subworkflow to work with your own support portal API and format accordingly. Try the following queries:
- How do I connect my iCloud to AcuityScheduling?
- How do I download past invoices for my Acuity account?

## Requirements
- OpenAI for LLM.
- If your organisation's APIs require authorisation, you may need to add custom credentials as necessary.

## Customising this workflow
- Add additional tools to reach other parts of your internal knowledgebase.
- Not using OpenAI? Feel free to swap, but ensure the LLM has tools/function calling support.
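The final formatting step above can be sketched as a small function. The hit shape (title, snippet, url) is an assumed, generic search-API response; your portal's fields will differ:

```javascript
// Minimal sketch of "format the search results": turn the search API's
// hits into one aggregated text block the agent can read, capped so the
// agent's context doesn't balloon.
function formatSearchResults(hits, maxResults = 5) {
  if (hits.length === 0) return 'No matching articles found.';
  return hits
    .slice(0, maxResults)
    .map((h, i) => `${i + 1}. ${h.title}\n${h.snippet}\nSource: ${h.url}`)
    .join('\n\n');
}
```

Including the source URL per hit lets the agent cite the article back to the user.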
by Ranjan Dailata
## Who is this for?
This workflow is designed for HR professionals, employer branding teams, talent acquisition strategists, market researchers, and business intelligence analysts who want to monitor, understand, and act upon employee sentiment and company perception on Glassdoor. It's ideal for organizations that value real-time feedback, are tracking employer brand perception, or need summarized insights for leadership reporting without sifting through thousands of raw reviews.

## What problem is this workflow solving?
Manually reviewing and analyzing Glassdoor reviews is tedious, subjective, and not scalable, especially for larger companies or those with many subsidiaries. This workflow:
- Automates review collection by making a Glassdoor company request via the Bright Data Web Scraper API.
- Uses Google Gemini to summarize the content.
- Sends an actionable summary to HR dashboards, leadership teams, or alert systems via a webhook notification.

## What this workflow does
- Makes an HTTP request to Glassdoor via the Bright Data Web Scraper API.
- Polls Bright Data for completion of the request.
- Downloads the Glassdoor response when a new snapshot is ready.
- Sends the prompt to Google Gemini for summarization.
- Delivers the summarized insights (strengths, weaknesses, sentiment, patterns) to a configured webhook or dashboard endpoint.

## Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.

You will also need:
- A Google Gemini API key (or access through Vertex AI or a proxy).
- A webhook or endpoint to receive the summary (e.g., Slack, Notion, or a custom HR dashboard).
## How to customize this workflow to your needs
- Change the summary focus by updating the summarization methods and prompts in the Summarization of Glassdoor Response node to extract specific insights: cultural feedback, leadership issues, compensation comments, exit motivation.
- Update the HTTP Request to Glassdoor node with the specific Glassdoor company information that you are looking for.
- Format the output to produce a customized summary in Markdown or HTML for rich delivery.
- Integrate with HR systems (BambooHR, Workday, SAP SuccessFactors via API), or with Google Sheets or Airtable.
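The poll-then-download loop in the "What this workflow does" list can be sketched as follows. Here `getStatus` and `getSnapshot` stand in for the two Bright Data HTTP calls; their names and the 'ready' status value are illustrative assumptions, not the actual API:

```javascript
// Minimal sketch of snapshot polling: keep checking the job's status on
// a fixed interval until it is ready, then fetch the result; give up
// after a bounded number of retries.
async function waitForSnapshot(
  getStatus,
  getSnapshot,
  { retries = 10, delayMs = 5000 } = {}
) {
  for (let i = 0; i < retries; i++) {
    const status = await getStatus();
    if (status === 'ready') return getSnapshot();
    await new Promise((r) => setTimeout(r, delayMs));
  }
  throw new Error('Snapshot was not ready in time');
}
```

In the workflow, the Wait node plus an If node play the role of this loop.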
by ikbendion
**Reddit Poster to Discord**

This workflow checks Reddit every 15 minutes for new posts and sends selected posts to a Discord channel via webhook.

## Flow Overview
1. **Schedule Trigger**: Runs every 15 minutes.
2. **Fetch Latest Posts**: Retrieves up to 3 new posts from the subreddit of your choice.
3. **Filter Posts**: Skips moderator or announcement posts based on author ID.
4. **Fetch Full Post Data**: Gets full details for the remaining posts.
5. **Extract Image URL**: Parses the post to extract a direct image link.
6. **Send to Discord**: Sends the post title, image, and link to a Discord webhook.

## Setup Notes
- Create a Reddit app and connect credentials in n8n.
- Add your subreddit name to both Reddit nodes.
- Connect a Discord webhook for posting.
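The final "Send to Discord" step amounts to POSTing a JSON payload to the webhook URL. A sketch of shaping the payload, using Discord's documented webhook embed fields (title, url, image.url); the input post shape is an assumption based on Reddit's API:

```javascript
// Minimal sketch: build a Discord webhook body with one embed carrying
// the post title, a link back to Reddit, and the extracted image (if any).
function buildDiscordPayload(post) {
  const embed = {
    title: post.title,
    url: `https://www.reddit.com${post.permalink}`,
  };
  if (post.imageUrl) embed.image = { url: post.imageUrl };
  return { embeds: [embed] };
}
```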
by Msaid Mohamed el hadi
**Automated YouTube Leads: Turn Comments into Enriched Prospects**

## Workflow Overview
This n8n workflow is a powerful automation tool designed to revolutionize how businesses and marketers identify and qualify leads directly from YouTube video comments. By leveraging specialized Apify Actors and an intelligent AI agent, this workflow seamlessly transforms raw comment data into comprehensive lead profiles, saving valuable time and resources.

This workflow automatically:

**Discovers & Scrapes Comments**
- Monitors a Google Sheet for new YouTube video URLs.
- Automatically extracts all comments from specified YouTube videos using a dedicated Apify Actor.
- Marks videos as "scrapped" to avoid reprocessing.

**Intelligent Lead Enrichment**
- Retrieves unprocessed comments from Google Sheets.
- Activates an advanced AI agent (powered by OpenRouter models) to research comment authors.
- Utilizes Google Search (via the Serper API) and specialized Apify scrapers (for website content and Instagram profiles) to find publicly available information like social media links, bios, and potential contact details.
- Generates concise descriptions for each lead based on gathered data.

**Organized Data Storage**
- Creates new entries in a dedicated Google Sheet for each new lead.
- Updates lead profiles with all discovered enriched data (email, social media, short bio, etc.).
- Marks comments as "processed" once their authors have been researched and enriched.

## Key Benefits
- 🤖 **Full Automation**: Eliminates manual data collection and research, freeing up your team for strategic tasks.
- 💡 **Smart Lead Enrichment**: AI intelligently sifts through information to build rich, actionable lead profiles.
- ⏱️ **Time-Saving**: Instant, scalable lead generation without human intervention.
- 📈 **Enhanced Lead Quality**: Go beyond basic contact info with comprehensive social and professional context.
- 📊 **Centralized Data**: All leads are neatly organized in Google Sheets for easy access and integration.
## Setup Requirements
**n8n Installation**
- Install n8n (cloud or self-hosted).
- Import the workflow configuration.
- Configure API credentials.
- Set up scheduling preferences for continuous operation.

**Google Sheets Credentials**
- A Google Cloud API key with access to Google Sheets.
- Set up OAuth2 authentication in n8n for read/write access to your "youtube leads" spreadsheet (containing "videos", "comments", and "leads" sheets).

**OpenRouter API Access**
- Create an OpenRouter account.
- Generate an API key to access their chat models (e.g., google/gemini-2.5-flash-preview-05-20) for AI agent operations.

**Apify API Access**
- Create an Apify account and generate a personal API token. This token is used to run the following Apify Actors:
  - mohamedgb00714/youtube-video-comments (for comment extraction)
  - mohamedgb00714/fireScraper-AI-Website-Content-Markdown-Scraper (for website content extraction)
  - mohamedgb00714/instagram-full-profile-scraper (for Instagram profile details)

**Serper API Key**
- Sign up for an account on Serper.dev and obtain an API key for performing Google searches to find social media profiles and other information.

## Potential Use Cases
- **Content Creators**: Identify highly engaged audience members for community building or direct outreach.
- **Marketing Teams**: Discover potential customers or influencers interacting with competitor content.
- **Sales Professionals**: Build targeted lead lists based on specific interests expressed in comments.
- **Market Researchers**: Analyze audience demographics and interests by enriching profiles of commenters on relevant videos.
- **Recruiters**: Find potential candidates based on their expertise or engagement in industry-specific discussions.

## Future Enhancement Roadmap
- **CRM Integration**: Directly push enriched leads into popular CRM systems (e.g., HubSpot, Salesforce).
- **Automated Outreach**: Implement automated email or social media messaging for qualified leads.
- **Sentiment Analysis**: Analyze comment sentiment before enrichment to prioritize positive interactions.
- **Multi-Platform Support**: Expand comment extraction and lead enrichment to other platforms (e.g., TikTok, Facebook).
- **Advanced Lead Scoring**: Develop a scoring model based on engagement, profile completeness, and relevance.

## Ethical Considerations
- **Data Privacy**: Ensure all collected data is publicly available and used in compliance with relevant privacy regulations (e.g., GDPR, CCPA).
- **Platform Guidelines**: Adhere strictly to YouTube's Terms of Service and Apify's usage policies.
- **Transparency**: If engaging with leads, be transparent about how their information was obtained (if applicable).
- **No Spam**: This tool is designed for lead identification, not for unsolicited mass messaging.

## Technical Requirements
- n8n v1.0.0 or higher (recommended for latest features and stability)
- Google Sheets API access
- OpenRouter API access
- Apify API access
- Serper API access
- Stable internet connection

## Workflow Architecture
[YouTube Video URLs (Google Sheet)]
⬇️
[Schedule/Manual Trigger]
⬇️
[Extract Comments (Apify YouTube Scraper)]
⬇️
[Save Raw Comments (Google Sheet)]
⬇️
[AI Agent (OpenRouter) for Lead Research]
⬇️
[Google Search (Serper) & Web Scraping (Apify FireScraper/Instagram Scraper)]
⬇️
[Save Enriched Leads (Google Sheet)]
⬇️
[Mark Comments Processed (Google Sheet)]

## Connect With Me
Exploring AI-powered lead generation?
- 📧 Email: mohamedgb00714@gmail.com
- 💼 LinkedIn: Mohamed el Hadi Msaid

Transform your YouTube engagement into a powerful lead generation engine with intelligent, automated insights!
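The "Save Enriched Leads" step in the architecture above can be pictured as assembling one row per lead. Every field name here is an illustrative assumption about the "leads" sheet's columns, not the template's exact schema:

```javascript
// Minimal sketch: merge a raw comment with the agent's research into a
// single lead row, and flip a processed flag so the comment is not
// researched again on the next run.
function buildLeadRow(comment, enrichment) {
  return {
    author: comment.author,
    videoUrl: comment.videoUrl,
    comment: comment.text,
    email: enrichment.email ?? '',
    instagram: enrichment.instagram ?? '',
    website: enrichment.website ?? '',
    shortBio: enrichment.bio ?? '',
    processed: true,
  };
}
```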
by Angel Menendez
## Who is this for?
This workflow is perfect for HR teams, recruiters, and hiring platforms that need to automate the extraction of key candidate details—like name, email, skills, and education—from resume files submitted in various formats.

## What problem does this solve?
Manually reviewing and extracting structured data from resumes is time-consuming and error-prone. This automation eliminates that bottleneck, standardizing candidate data for seamless integration into CRMs, applicant tracking systems, or Google Sheets.

## What this workflow does
This n8n template listens for uploaded resume files, detects their format (PDF, DOC, TXT, CSV, etc.), and automatically extracts the raw text using n8n's built-in file extraction tools. The extracted text is then parsed using an OpenAI-powered agent that returns structured fields such as:
- Full Name
- Email Address
- Skill Keywords
- Education Details

Optionally, you can push the structured output to Google Sheets (node included, currently disabled).

## Setup
1. Clone this workflow into your n8n instance.
2. Enable the When chat message received trigger if using n8n chat.
3. Provide your OpenAI credentials and enable the LangChain Agent node.
4. (Optional) Connect Google Sheets by authenticating with your Google account and filling in your target document and sheet.

Watch the setup and demo video here: 🎥 https://youtu.be/2SUPiNmLWdA

## How to customize
- Modify the OpenAI system message to extract different fields (e.g., phone number, LinkedIn).
- Replace the Google Sheets node with a webhook to push results to your ATS.
- Add filters to limit accepted file types or max file size.

> ⚠️ This template is designed to be secure. It uses credentials stored in the n8n credential manager—no hardcoded secrets required.
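Before pushing the agent's structured output downstream, it is worth validating it. A sketch of such a check; the key names (fullName, email, skills) are illustrative assumptions about the agent's output schema:

```javascript
// Minimal sketch: verify the extracted candidate record has a name, a
// plausibly formed email, and a skills list before it reaches the sheet
// or ATS. Returns a result object rather than throwing, so bad records
// can be routed to a review branch instead of halting the workflow.
function validateCandidate(parsed) {
  const errors = [];
  if (!parsed.fullName) errors.push('fullName missing');
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(parsed.email ?? '')) {
    errors.push('email missing or malformed');
  }
  if (!Array.isArray(parsed.skills)) errors.push('skills must be a list');
  return { ok: errors.length === 0, errors };
}
```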
by go-surfe
This template enables fully automated lead enrichment using Surfe's bulk API. Simply drop a Google Spreadsheet into your Google Drive, and n8n will handle everything — from reading the leads, enriching them in batches, filtering valid data, and pushing results to HubSpot.

## 1. ❓ What Problem Does This Solve?
Manually enriching contact lists is tedious, error-prone, and doesn't scale. Whether you're importing leads from events, marketing forms, or partners, this workflow ensures each record is enriched and synced to your CRM — hands-free.

## 2. 🧰 Prerequisites
To use this template, you'll need:
- A self-hosted or cloud instance of n8n
- A Surfe API key
- A Google Drive and Sheets account (with OAuth or service account)
- A HubSpot account with access to create/update contacts (via OAuth or Private App Token)
- The workflow JSON file (included with this tutorial)

## 3. 📌 Input File Format
To run the automation, you must upload a Google Spreadsheet to a specific folder in your Drive. The spreadsheet must contain the following columns:
- first name (required)
- last name (required)
- Either company name or company domain (at least one is required)
- linkedin url (optional)

🛑 Important: Any row missing first name, last name, or both company name and company domain will be ignored automatically by the workflow.

Each row represents a person to enrich. We recommend including the linkedin url if available, but it's not mandatory.

## 4. ⚙️ Setup Instructions
### 4.1 🔐 Create Your Credentials in n8n
#### 4.1.1 📁 Google Drive
To connect Google Drive and Google Sheets in your workflow, you need to authorize n8n via Google OAuth 2.0 using a Client ID and Client Secret from the Google Cloud Console.

📋 Step 1: Create a Google Cloud Project
1. Visit the Google Cloud Console.
2. Create a new project or select an existing one.
3. Navigate to APIs & Services → OAuth consent screen.

⚙️ Step 2: Configure the OAuth Consent Screen
1. Enter the App name (e.g. n8n Integration) and a User support email.
2. Choose the Audience Type: Internal if you're using a Google Workspace account, or External if using a personal Gmail account.
3. Under Contact information, enter an email address.
4. Click Save and Continue.

🔑 Step 3: Create OAuth Client Credentials
1. Go to APIs & Services → Credentials.
2. Click + Create Credentials → OAuth Client ID.
3. Select Web application as the application type and name it (e.g. n8n Google Drive Access).
4. In Authorized redirect URIs, paste this: https://oauth.n8n.cloud/oauth2/callback (or your self-hosted n8n redirect URI).
5. Click Create, then copy the Client ID and Client Secret.

✅ Step 4: Finish Setup in n8n
1. In n8n, go to Credentials → Create New → Google Drive / Google Sheets.
2. Choose OAuth2 and paste your Client ID, Client Secret, and Redirect URL (it should match the Google Console).
3. Click Sign in with Google, authorize access, and save the credential.

✅ Your Google Drive is now ready to use in workflows.

#### 4.1.2 📊 Google Sheets OAuth2 API
1. Go to n8n → Credentials.
2. Create new credentials of type Google Sheets OAuth2 API.
3. A pop-up will open where you can log in to the Google account from which the Google Sheets will be read.

#### 4.1.3 📧 Gmail OAuth2 API
1. Go to n8n → Credentials.
2. Create new credentials of type Gmail OAuth2 API.
3. A pop-up window will appear where you can log in with the Google account that is linked to Gmail. Make sure you grant email send permissions when prompted.

#### 4.1.4 🚀 Surfe API
1. In your Surfe dashboard → Use Surfe API → copy your API key.
2. Go to n8n → Credentials → Create Credential and choose the credential type Bearer Auth.
3. Name it something like SURFE API Key, paste your API key into the Bearer Token field, and save.

#### 4.1.5 🎯 HubSpot
🔓 Private App Token
1. Go to HubSpot → Settings → Integrations → Private Apps.
2. Create an app with the scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read.
3. Save the app token.
4. Go to n8n → Credentials → Create Credential → HubSpot App Token and paste your app token.

✅ You are now all set for the credentials.

### 4.2 📥 Import and Configure the n8n Workflow
1. Create a new blank workflow, click the … on the top left, and choose Import from File.
2. Import the provided JSON workflow into n8n.

#### 4.2.1 🔗 Link Nodes to Your Credentials
In the workflow, link your newly created credentials to each node in this list:
- Google Drive node → Credentials to connect with → Google Drive account
- Google Sheets node → Credentials to connect with → Google Sheets account
- Gmail node → Credentials to connect with → Gmail account
- Surfe HTTP nodes → Authentication → Generic Credential Type; Generic Auth Type → Bearer Auth; Bearer Auth → select the credentials you created before
- HubSpot node → Credentials to connect with → select your HubSpot credentials in the list

#### 4.2.2 🔧 Additional Setup for the Google Drive Trigger Node

## 5. 🔄 How This n8n Workflow Works
1. A new Google Sheet containing a linkedin_url column is added to a specific folder in Google Drive.
2. n8n detects the new file automatically via the Google Drive Trigger.
3. All rows are read and batched in groups of 500 to comply with Surfe's API limits.
4. Each batch is sent to Surfe's Bulk Enrichment API.
5. n8n polls Surfe until the enrichment job is complete.
6. It extracts the enriched contact data from Surfe's response.
7. Only contacts with both email and phone number are kept.
8. These validated leads are pushed to HubSpot.
9. Finally, a Gmail notification is sent to confirm the job is complete.

## 6. 🧩 Use Cases
- **Post-event contact enrichment** – After a trade show, upload a list of LinkedIn profile URLs from badge scans or lead capture forms
- **Outbound LinkedIn campaign follow-ups** – Gather LinkedIn URLs from manual outreach and enrich them into usable CRM leads
- **CRM data enhancement** – Use LinkedIn URLs to fill in missing contact info for existing or imported contacts
- **List building from LinkedIn exports** – Upload a list of LinkedIn profiles (e.g. from Sales Navigator) and turn them into fully enriched contacts in HubSpot

## 7. 🛠 Customization Ideas
- 🔁 Add retry logic for failed Surfe enrichment jobs
- 📤 Log enriched contacts into a Google Sheet or Airtable
- 🔍 Add pre-check logic to avoid creating duplicates in HubSpot
- 📊 Extend the flow to generate a basic summary report of enriched vs rejected contacts

## 8. ✅ Summary
This workflow turns a basic Google Sheet of LinkedIn URLs into fully enriched, CRM-ready contacts — automatically synced with HubSpot. Just upload your file. Let Surfe do the rest.
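Two of the steps above — dropping rows that lack the required columns, and batching the remainder in groups of 500 for Surfe's bulk API — can be sketched together. The column names match the input-file format section; the function itself is an illustrative sketch, not the workflow's exact node logic:

```javascript
// Minimal sketch: keep only rows with first name, last name, and at
// least one of company name / company domain, then split the valid rows
// into batches of at most 500 (Surfe's bulk API limit, per the docs above).
function prepareBatches(rows, batchSize = 500) {
  const valid = rows.filter(
    (r) =>
      r['first name'] &&
      r['last name'] &&
      (r['company name'] || r['company domain'])
  );
  const batches = [];
  for (let i = 0; i < valid.length; i += batchSize) {
    batches.push(valid.slice(i, i + batchSize));
  }
  return batches;
}
```

Each returned batch maps to one call to the Bulk Enrichment API.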