by Jimleuk
Note: This template only works for self-hosted n8n.

This n8n template demonstrates how to use the Langchain Code node to track token usage and cost for every LLM call. This is useful if your templates handle multiple clients or customers and you need a cheap and easy way to capture how much of your AI credits they are using.

**How it works**
- In our mock AI service, we're offering a data conversion API to convert resume PDFs into JSON documents.
- A Form trigger allows for PDF upload and the file is parsed using the Extract from File node.
- An Edit Fields node captures additional variables to send to our log.
- Next, we use the Information Extractor node to organise the resume data into the given JSON schema.
- The LLM subnode attached to the Information Extractor is a custom one we've built using the Langchain Code node.
- With our custom LLM subnode, we're able to capture the usage metadata using lifecycle hooks (see the sketch below).
- We've also attached a Google Sheets tool to our LLM subnode, allowing us to send our usage metadata to a Google Sheet.
- Finally, we demonstrate how you can aggregate from the Google Sheet to understand how many AI tokens and how much cost each client is liable for.

Check out the example Client Usage Log - https://docs.google.com/spreadsheets/d/1AR5mrxz2S6PjAKVM0edNG-YVEc6zKL7aUxHxVcffnlw/edit?usp=sharing

**How to use**
- **SELF-HOSTED N8N ONLY** - the Langchain Code node is only available in the self-hosted version of n8n. It is not available in n8n cloud.
- The LLM subnode can only be attached to non-"AI Agent" nodes: Basic LLM Chain, Information Extractor, Question & Answer Chain, Sentiment Analysis, Summarization Chain and Text Classifier.

**Requirements**
- Self-hosted version of n8n
- OpenAI for LLM
- Google Sheets to store usage metadata

**Customising this template**
- Bring the custom LLM subnode into your own templates! In many cases, it can be a drop-in replacement for the regular OpenAI subnode.
- Not using Google Sheets? Try other databases or an HTTP call to pipe the data into your CRM.
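To make the lifecycle-hook idea concrete, here is a minimal sketch of what such a Langchain Code node (in "Supply Data" mode) can look like. It assumes OpenAI via LangChain's `@langchain/openai` package; the model name, placeholder prices and the logging stub are illustrative assumptions rather than the template's exact code.

```javascript
// Minimal sketch of a custom LLM subnode (Langchain Code node, "Supply Data" mode).
// Assumptions: OpenAI model, placeholder per-token prices, logging left as a stub.
const { ChatOpenAI } = require('@langchain/openai');

const model = new ChatOpenAI({
  model: 'gpt-4o-mini',
  callbacks: [
    {
      // Lifecycle hook: runs after every LLM call with the usage metadata.
      handleLLMEnd(output) {
        const usage = output.llmOutput?.tokenUsage ?? {};
        const estimatedCost =
          (usage.promptTokens ?? 0) * 0.00000015 +   // placeholder input price per token
          (usage.completionTokens ?? 0) * 0.0000006; // placeholder output price per token
        // Forward { usage, estimatedCost } plus your client/customer fields to a log
        // (e.g. the attached Google Sheets tool or an HTTP call) - stubbed here.
        console.log('LLM usage', usage, 'estimated cost', estimatedCost);
      },
    },
  ],
});

return model;
```

The client/customer variables captured by the Edit Fields node can be written to the same log row, so the Google Sheet can later be aggregated per client.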
by Adam Janes
**How it works**
- The automation loads rows from a Google Sheet of leads that you want to contact.
- It makes a Google search via Apify for LinkedIn links based on the first name / last name / company.
- Another Apify actor fetches the right LinkedIn profile based on the first profile that is returned.
- The same process is done for the company that the lead works for, giving extra context. If the lead has a current company listed on their LinkedIn, we use that URL for the lookup rather than doing a separate Google search.
- A call is made to OpenRouter to get an LLM to generate an email based on a prompt designed for personalized outreach.
- An email is sent via a Gmail node.

**Set up steps**
- Connect your Google Sheets + Gmail accounts to use these APIs.
- Make an account with Apify and enter your credentials.
- Set your details in the "Set My Data" node to customize the workflow around your company + value proposition.
- I would recommend changing the prompt in the "Generate Personalized Email" node to match the tone of voice you want your agent to have. You can change the guidelines to, for example, control whether the agent introduces itself, and give more examples in the style you want to improve the output.
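As a rough illustration of the search step, the snippet below shows how a small Code node could assemble the Google query passed to the Apify search actor. The column names (First Name, Last Name, Company) and the output field are assumptions about your leads sheet, not the template's exact implementation.

```javascript
// Hypothetical "Build Search Query" Code node - column names are assumptions.
const lead = $json;

const query = [
  lead['First Name'],
  lead['Last Name'],
  lead['Company'],
  'site:linkedin.com/in', // bias results toward LinkedIn profile pages
]
  .filter(Boolean)
  .join(' ');

return [{ json: { ...lead, query } }];
```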
by Pavel Zamorev
This n8n template automates the transformation of raw meeting notes into structured tasks and documents using GPT (or another model), syncing them to Notion and TickTick via a Telegram bot.

**Use Cases**
- Automate note-taking and formatting for daily standups, brainstorming sessions, or client calls.
- Reduce cognitive load by eliminating manual tracking of ideas and tedious formatting.
- Convert discussions into actionable tasks instantly with TickTick and structured notes in Notion.

**How It Works**
- Capture Notes: Send raw meeting notes to a Telegram bot.
- AI Processing: The workflow sends the text to AI, which removes duplicates and extracts key points, formats the content into structured Markdown notes for Notion, and identifies tasks with deadlines (e.g., "- Prepare presentation (Responsible: John, Deadline: Friday)").
- Task Parsing: Extracts task titles, removing metadata like "Responsible" and "Deadline" (see the parsing sketch after this section).
- Review & Edit: The bot returns formatted notes and tasks for review in Telegram.
- Sync & Publish: Notes are published to a Notion database; tasks are exported to TickTick via its API.
- Confirmation: A Telegram emoji reaction confirms successful processing.

**Setup Instructions**
- Set Up Telegram Bot: Create a Telegram bot via BotFather and obtain an API token. Add the token to the "Telegram Trigger" and "Send-Edited-Notes" nodes under credentials (telegramApi).
- Configure OpenAI: Obtain an OpenAI API key and add it to the "Edit-Notes" node (openAiApi credentials). Ensure the model is set to gpt-4.1-mini in the node parameters.
- Set Up Notion: Create a Notion database for notes (e.g., "Meetings"). Add the database ID to the "Create a Database Page" node (databaseId). Configure Notion API credentials (notionApi) in the node.
- Set Up TickTick: Obtain a TickTick API key and add it to the "Create a Task" node (tickTickOAuth2Api credentials). Specify your TickTick project ID in the node (projectId).
- Deploy Workflow: Ensure your n8n instance is self-hosted to support community nodes (TickTick, Notion). Activate the workflow in n8n.
- Test: Send a test message to the Telegram bot (e.g., "Discussed project timeline. Tasks: - Prepare slides (Responsible: Alice, Deadline: Friday)"). Verify that notes appear in Notion, tasks in TickTick, and an emoji reaction in Telegram.

**Configuration Examples**

Telegram Trigger:

```json
{
  "parameters": { "updates": ["message"], "additionalFields": {} },
  "credentials": {
    "telegramApi": { "id": "your-telegram-api-id", "name": "meeting notes" }
  }
}
```

OpenAI prompt (in the "Edit-Notes" node): Analyze the quick meeting notes from {{ $json.message.text }}. Generate meeting notes and a task list in the following format:\nMeeting Notes:\n- [Note 1]\n- [Note 2]\n\nTasks:\n- [Task 1]\n- [Task 2]

Notion database page:

```json
{
  "parameters": {
    "resource": "databasePage",
    "databaseId": "your-notion-database-id",
    "title": "MN {{ $now }}",
    "blockUi": {
      "blockValues": [
        { "textContent": "{{ $json.message.text }}" }
      ]
    }
  }
}
```

**Requirements**
- An OpenAI API key (or another model).
- APIs: Pre-configured Notion and TickTick API credentials. The template includes setup guides.
- Setup: Uses community nodes, so a self-hosted n8n instance is required.

**Customizing This Workflow**
- Replace the Telegram bot with a webhook or form for alternative inputs (e.g., mobile apps).
- Modify the OpenAI prompt in the "Edit-Notes" node to customize note and task formats.
- Add filters in the "Split Notes and Tasks" node to prioritize tasks (e.g., ++#urgent++).
- Integrate Google Calendar via an additional HTTP Request node to auto-set deadlines based on text (e.g., "by Friday").
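For reference, here is a hedged sketch of what the task-parsing step could look like as a Code node. The input field (`$json.message.text`) and the metadata pattern are assumptions based on the prompt format above; the template's "Split Notes and Tasks" node may differ.

```javascript
// Sketch of a "Split Notes and Tasks" Code node - field names are assumptions.
const text = $json.message?.text ?? '';

const notes = [];
const tasks = [];
let section = 'notes';

for (const raw of text.split('\n')) {
  const line = raw.trim();
  if (/^meeting notes:/i.test(line)) { section = 'notes'; continue; }
  if (/^tasks:/i.test(line)) { section = 'tasks'; continue; }
  if (!line.startsWith('-')) continue;

  const item = line.replace(/^-\s*/, '');
  if (section === 'tasks') {
    // Strip metadata such as "(Responsible: Alice, Deadline: Friday)".
    tasks.push(item.replace(/\((Responsible|Deadline)[^)]*\)/gi, '').trim());
  } else {
    notes.push(item);
  }
}

return [{ json: { notes, tasks } }];
```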
by Radouane Driouich
Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis

**Description**
The "Automatically Categorize Gmail Emails with GPT-4o-mini Multi-Label Analysis" template is designed for professionals, business owners, entrepreneurs, and anyone struggling to manage a high volume of daily emails. It solves common inbox problems such as email overload, missed important messages, manual sorting inefficiencies, and unorganized inbox clutter. Using intelligent content analysis powered by GPT-4o-mini, this workflow automatically categorizes incoming Gmail messages with relevant labels, ensuring efficient email management and significantly boosting productivity.

**How It Works**
- **Email Detection**: Continuously monitors your Gmail inbox every minute to detect new incoming emails.
- **Content Extraction**: Retrieves key email components including sender details, subject line, and body content for analysis.
- **Intelligent Labeling**: Uses GPT-4o-mini to contextually analyze each email and assign 1-3 relevant labels based on your existing Gmail label structure (see the mapping sketch below).
- **Automatic Application**: Applies the selected labels directly to your emails, with error handling to ensure accuracy and reliability.

**Key Benefits**
- **Organized Inbox**: Automatically maintains inbox order and clarity.
- **Time-Saving**: Significantly reduces manual email management effort.
- **Customization**: Fully adaptable to specific labeling and organizational requirements.

**Pre-conditions**
Before using this template, ensure the following prerequisites are met:
- Active Gmail account with OAuth2 enabled.
- Active OpenAI account with a GPT-4o-mini API key.
- Clearly defined labels set up in your Gmail account (e.g., "Work", "Personal", "Urgent").

**Setup Instructions**
1. Connect Gmail account: Authorize your Gmail account using OAuth2 (takes approximately 2-3 minutes).
2. Configure the OpenAI GPT-4o-mini API: Enter and validate your API key to enable email analysis.
3. Establish Gmail labels: Ensure the necessary labels exist in Gmail, for example "Work", "Personal", and "Urgent".
4. Activate and verify: Click the "Activate" button in n8n, then send a test email to your Gmail inbox to confirm that labels are applied correctly.

**Customization Tips**
- **Modify Gmail Labels**: Create and adapt labels to match your business or personal categorization strategy.
- **Adjust GPT-4o-mini Criteria**: Fine-tune the AI prompts to improve accuracy and relevance based on your email management needs.
- **Expand the Workflow**: Integrate additional conditions, actions, or external applications to further automate and optimize your email management.

Improve your daily workflow efficiency and achieve a clutter-free Gmail inbox by leveraging the power of GPT-4o-mini today.
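To illustrate the labeling step, below is a hedged sketch of a mapping that turns the model's chosen label names into Gmail label IDs before they are applied. The node name "Get Labels" and the model output shape (`labels` array) are assumptions; the template's own nodes may be named and structured differently.

```javascript
// Hypothetical Code node: map model-chosen label names to Gmail label IDs.
// Assumes a prior "Get Labels" node returned { id, name } pairs and the model
// returned something like { labels: ["Work", "Urgent"] }.
const chosen = $json.labels ?? [];
const allLabels = $('Get Labels').all().map(item => item.json);

const labelIds = chosen
  .map(name => allLabels.find(l => l.name.toLowerCase() === String(name).toLowerCase())?.id)
  .filter(Boolean)
  .slice(0, 3); // keep at most three labels, matching the 1-3 label rule above

return [{ json: { labelIds } }];
```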
by Jimleuk
This n8n template demonstrates how you can leverage an existing support site's search to power your support chatbots and agents. Building a support chatbot need not be complicated! If building and indexing vector stores or duplicating data isn't necessarily your thing, an alternative implementation of the RAG approach is to leverage existing knowledge bases such as support portals. In this way, document management and maintenance of your support agent is significantly reduced.

Disclaimer: This template example uses AcuityScheduling's help center website but is not associated with, supported nor endorsed by the company.

**How it works**
- A simple AI agent is connected to a chat trigger to receive user queries.
- The AI agent is instructed to fetch information from the knowledge base via the attached custom workflow tool (aka the "knowledgebase tool").
- There is no step to replicate the entire support articles database into a vector store. You may choose not to because of the time, cost and maintenance involved. Instead, the tool leverages the existing support portal's search API to retrieve knowledge-base articles (see the sketch below).
- Finally, the search results are formatted before sending an aggregated response back to the agent.

**How to use**
- Customise the subworkflow to work with your own support portal API and format accordingly.
- Try the following queries: "How do I connect my iCloud to AcuityScheduling?", "How do I download past invoices for my Acuity account?"

**Requirements**
- OpenAI for LLM.
- If your organisation's APIs require authorisation, you may need to add custom credentials as necessary.

**Customising this workflow**
- Add additional tools to reach other parts of your internal knowledge base.
- Not using OpenAI? Feel free to swap, but ensure the LLM has tools/function-calling support.
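The snippet below sketches the core of such a knowledgebase tool: call the portal's existing search endpoint, then aggregate the top hits into one text response for the agent. The URL, query parameters and response fields are assumptions for illustration; replace them with your own support portal's search API.

```javascript
// Sketch of the knowledgebase tool's search + format step (Code node).
// The endpoint URL and response shape below are assumptions, not Acuity's real API.
const query = $json.query;

const response = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://support.example.com/api/v2/help_center/articles/search', // assumption
  qs: { query },
  json: true,
});

// Aggregate the top results into a single blob the agent can read.
const articles = (response.results ?? []).slice(0, 5).map(a =>
  `Title: ${a.title}\nURL: ${a.html_url}\nSnippet: ${a.snippet ?? ''}`
);

return [{ json: { response: articles.join('\n\n---\n\n') } }];
```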
by GiovanniSegar
Video walkthrough: https://www.youtube.com/watch?v=OwIFK-r-NtQ

**Summary of agent**
This agent can write and rewrite its own rules, allowing you to mold its behavior. It receives rules from a database as system instructions and has tools to create, edit, or delete them. This is a great baseline for new agent builds. You can tell it things like "Next time, use present tense when talking about this subject" and it will use a tool to save this as a rule, then receive that instruction in all future iterations.

**How to start using it**

Option 1: With a Postgres database (e.g., Supabase)
- Supabase schema: Create a table (e.g., agent_rules) with the following columns:
  - id: bigint (Primary Key, auto-incrementing)
  - created_at: timestamp with time zone (Default: now())
  - rule_text: text
  - agent: text
- Workflow updates:
  - Update the Postgres credentials in the "Get rules from database," "Insert rule into database," and "Execute query on rule database" nodes.
  - Update the agent value (currently 'TestAgent') in the "Get rules from database" and "Insert rule into database" nodes if you want a different agent name.
  - Update the Anthropic API credentials.

Option 2: With Google Sheets
- Google Sheet setup: Create a Google Sheet with columns for rule_text and agent.
- Workflow updates: Example Google Sheets nodes are included. You will need to:
  - Connect your Google Sheets credentials.
  - Select your Google Sheet (with rule_text and agent columns) in all relevant Google Sheets nodes.
  - Replace the existing Postgres nodes ("Get rules from database", "Insert rule into database", "Execute query on rule database") with the configured Google Sheets nodes.
  - Update the agent value (currently 'TestAgent') in the Google Sheets nodes if you want a different agent name.
  - Update the Anthropic API credentials.
- Agent instructions: Update the agent's system message and remove the database schema section, as it is no longer relevant.
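As a small illustration of how the fetched rules become system instructions, here is a hedged sketch of a Code node that joins the rows from "Get rules from database" into one system message. Node and field names follow the description above, but the agent node in the template may assemble this differently.

```javascript
// Sketch: turn fetched rule rows into a numbered system message for the agent.
// Assumes "Get rules from database" returns items with a rule_text field.
const rules = $('Get rules from database').all().map(item => item.json.rule_text);

const systemMessage =
  'Follow these standing rules in every reply:\n' +
  rules.map((rule, i) => `${i + 1}. ${rule}`).join('\n');

return [{ json: { systemMessage } }];
```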
by Ranjan Dailata
**Who is this for?**
This workflow is designed for HR professionals, employer branding teams, talent acquisition strategists, market researchers, and business intelligence analysts who want to monitor, understand, and act upon employee sentiment and company perception on Glassdoor. It's ideal for organizations that value real-time feedback, are tracking employer brand perception, or need summarized insights for leadership reporting without sifting through thousands of raw reviews.

**What problem is this workflow solving?**
Manually reviewing and analyzing Glassdoor reviews is tedious, subjective, and not scalable, especially for larger companies or those with many subsidiaries. This workflow:
- Automates review collection by making a Glassdoor company request via the Bright Data Web Scraper API.
- Uses Google Gemini to summarize the content.
- Sends an actionable summary to HR dashboards, leadership teams, or alert systems via a webhook notification.

**What this workflow does**
- Makes an HTTP request to Glassdoor via the Bright Data Web Scraper API.
- Polls Bright Data for completion of the Glassdoor request (see the polling sketch below).
- Downloads the Glassdoor response when a new snapshot is ready.
- Sends the prompt to Google Gemini for summarization.
- Delivers the summarized insights (strengths, weaknesses, sentiment, patterns) to a configured webhook or dashboard endpoint.

**Setup**
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
- A Google Gemini API key (or access through Vertex AI or a proxy).
- A webhook or endpoint to receive the summary (e.g., Slack, Notion, or a custom HR dashboard).

**How to customize this workflow to your needs**
- Change the summary focus by updating the summarization methods and prompts in the "Summarization of Glassdoor Response" node to extract specific insights: cultural feedback, leadership issues, compensation comments, exit motivation.
- Update the "HTTP Request to Glassdoor" node with the specific Glassdoor company you are looking for.
- Format the output to produce a customized summary in Markdown or HTML for rich delivery.
- Integrate with HR systems such as BambooHR, Workday or SAP SuccessFactors via API, or with Google Sheets or Airtable.
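The polling step can be pictured as a small status check that loops through a Wait node until the snapshot is ready. The endpoint path and status field below are assumptions for illustration only; use the snapshot/progress URL that the Bright Data Web Scraper API returns for your request.

```javascript
// Illustrative snapshot-status check (Code node) - endpoint path and status value
// are assumptions; take the real URLs from Bright Data's response/documentation.
const snapshotId = $json.snapshot_id;

const progress = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`, // assumption
  headers: { Authorization: `Bearer ${$json.brightDataToken}` },
  json: true,
});

// A downstream IF node can branch on `ready` and loop back through a Wait node.
return [{ json: { snapshotId, ready: progress.status === 'ready' } }];
```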
by Msaid Mohamed el hadi
Automated YouTube Leads: Turn Comments into Enriched Prospects

**Workflow Overview**
This n8n workflow is a powerful automation tool designed to change how businesses and marketers identify and qualify leads directly from YouTube video comments. By leveraging specialized Apify Actors and an intelligent AI agent, it transforms raw comment data into comprehensive lead profiles, saving valuable time and resources.

This workflow automatically:

**Discovers & Scrapes Comments**
- Monitors a Google Sheet for new YouTube video URLs.
- Automatically extracts all comments from the specified YouTube videos using a dedicated Apify Actor (see the sketch at the end of this section).
- Marks videos as "scraped" to avoid reprocessing.

**Intelligent Lead Enrichment**
- Retrieves unprocessed comments from Google Sheets.
- Activates an AI agent (powered by OpenRouter models) to research comment authors.
- Uses Google Search (via the Serper API) and specialized Apify scrapers (for website content and Instagram profiles) to find publicly available information like social media links, bios, and potential contact details.
- Generates concise descriptions for each lead based on the gathered data.

**Organized Data Storage**
- Creates new entries in a dedicated Google Sheet for each new lead.
- Updates lead profiles with all discovered enriched data (email, social media, short bio, etc.).
- Marks comments as "processed" once their authors have been researched and enriched.

**Key Benefits**
- **Full Automation**: Eliminates manual data collection and research, freeing up your team for strategic tasks.
- **Smart Lead Enrichment**: AI intelligently sifts through information to build rich, actionable lead profiles.
- **Time-Saving**: Instant, scalable lead generation without human intervention.
- **Enhanced Lead Quality**: Go beyond basic contact info with comprehensive social and professional context.
- **Centralized Data**: All leads are neatly organized in Google Sheets for easy access and integration.

**Setup Requirements**

n8n installation:
- Install n8n (cloud or self-hosted).
- Import the workflow configuration.
- Configure API credentials.
- Set up scheduling preferences for continuous operation.

Google Sheets credentials:
- A Google Cloud API key with access to Google Sheets.
- Set up OAuth2 authentication in n8n for read/write access to your "youtube leads" spreadsheet (containing "videos", "comments", and "leads" sheets).

OpenRouter API access:
- Create an OpenRouter account.
- Generate an API key to access their chat models (e.g., google/gemini-2.5-flash-preview-05-20) for AI agent operations.

Apify API access:
- Create an Apify account and generate a personal API token. The token is used to run the following Apify Actors:
  - mohamedgb00714/youtube-video-comments (for comment extraction)
  - mohamedgb00714/fireScraper-AI-Website-Content-Markdown-Scraper (for website content extraction)
  - mohamedgb00714/instagram-full-profile-scraper (for Instagram profile details)

Serper API key:
- Sign up for an account on Serper.dev and obtain an API key for performing the Google searches that find social media profiles and other information.

**Potential Use Cases**
- Content Creators: Identify highly engaged audience members for community building or direct outreach.
- Marketing Teams: Discover potential customers or influencers interacting with competitor content.
- Sales Professionals: Build targeted lead lists based on specific interests expressed in comments.
- Market Researchers: Analyze audience demographics and interests by enriching profiles of commenters on relevant videos.
- Recruiters: Find potential candidates based on their expertise or engagement in industry-specific discussions.

**Future Enhancement Roadmap**
- CRM Integration: Directly push enriched leads into popular CRM systems (e.g., HubSpot, Salesforce).
- Automated Outreach: Implement automated email or social media messaging for qualified leads.
- Sentiment Analysis: Analyze comment sentiment before enrichment to prioritize positive interactions.
- Multi-Platform Support: Expand comment extraction and lead enrichment to other platforms (e.g., TikTok, Facebook).
- Advanced Lead Scoring: Develop a scoring model based on engagement, profile completeness, and relevance.

**Ethical Considerations**
- Data Privacy: Ensure all collected data is publicly available and used in compliance with relevant privacy regulations (e.g., GDPR, CCPA).
- Platform Guidelines: Adhere strictly to YouTube's Terms of Service and Apify's usage policies.
- Transparency: If engaging with leads, be transparent about how their information was obtained (if applicable).
- No Spam: This tool is designed for lead identification, not for unsolicited mass messaging.

**Technical Requirements**
- n8n v1.0.0 or higher (recommended for the latest features and stability)
- Google Sheets API access
- OpenRouter API access
- Apify API access
- Serper API access
- Stable internet connection

**Workflow Architecture**
[YouTube Video URLs (Google Sheet)]
↓ [Schedule/Manual Trigger]
↓ [Extract Comments (Apify YouTube Scraper)]
↓ [Save Raw Comments (Google Sheet)]
↓ [AI Agent (OpenRouter) for Lead Research]
↓ [Google Search (Serper) & Web Scraping (Apify FireScraper/Instagram Scraper)]
↓ [Save Enriched Leads (Google Sheet)]
↓ [Mark Comments Processed (Google Sheet)]

**Connect With Me**
Exploring AI-powered lead generation?
- Email: mohamedgb00714@gmail.com
- LinkedIn: Mohamed el Hadi Msaid

Transform your YouTube engagement into a powerful lead generation engine with intelligent, automated insights!
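To make the comment-extraction step concrete, here is a hedged sketch of calling the comment-scraper actor synchronously from a Code node. The input payload (`videoUrls`) is an assumption; check the actor's input schema on Apify and match the field names the template's HTTP node actually uses.

```javascript
// Sketch: run the YouTube comment scraper actor and return its dataset items.
// The input body shape is an assumption - confirm it against the actor's schema.
const items = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.apify.com/v2/acts/mohamedgb00714~youtube-video-comments/run-sync-get-dataset-items',
  qs: { token: $json.apifyToken },
  body: { videoUrls: [$json.videoUrl] }, // assumption
  json: true,
});

// Each dataset item becomes one n8n item (one comment) for the next node.
return items.map(comment => ({ json: comment }));
```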
by Angel Menendez
**Who is this for?**
This workflow is perfect for HR teams, recruiters, and hiring platforms that need to automate the extraction of key candidate details, like name, email, skills, and education, from resume files submitted in various formats.

**What problem does this solve?**
Manually reviewing and extracting structured data from resumes is time-consuming and error-prone. This automation eliminates that bottleneck, standardizing candidate data for seamless integration into CRMs, applicant tracking systems, or Google Sheets.

**What this workflow does**
This n8n template listens for uploaded resume files, detects their format (PDF, DOC, TXT, CSV, etc.), and automatically extracts the raw text using n8n's built-in file extraction tools. The extracted text is then parsed using an OpenAI-powered agent that returns structured fields such as:
- Full Name
- Email Address
- Skill Keywords
- Education Details

Optionally, you can push the structured output to Google Sheets (node included, currently disabled).

**Setup**
- Clone this workflow into your n8n instance.
- Enable the "When chat message received" trigger if using n8n chat.
- Provide your OpenAI credentials and enable the LangChain Agent node.
- (Optional) Connect Google Sheets by authenticating with your Google account and filling in your target document and sheet.

Watch the setup and demo video here: https://youtu.be/2SUPiNmLWdA

**How to customize**
- Modify the OpenAI system message to extract different fields (e.g., phone number, LinkedIn).
- Replace the Google Sheets node with a webhook to push results to your ATS.
- Add filters to limit accepted file types or maximum file size.

> This template is designed to be secure. It uses credentials stored in the n8n credential manager; no hardcoded secrets required.
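For orientation, the structured output returned by the agent could look roughly like the object below. The field names and values are illustrative assumptions mirroring the list above; adjust the system message and output parser schema to match whatever fields you actually need.

```javascript
// Illustrative shape of one parsed resume item (values are made-up examples).
return [{
  json: {
    fullName: 'Jane Doe',
    email: 'jane.doe@example.com',
    skills: ['Python', 'SQL', 'Project Management'],
    education: [
      { degree: 'BSc Computer Science', institution: 'Example University', year: 2020 },
    ],
  },
}];
```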
by go-surfe
This template enables fully automated lead enrichment using Surfe's bulk API. Simply drop a Google Spreadsheet into your Google Drive, and n8n will handle everything: reading the leads, enriching them in batches, filtering valid data, and pushing the results to HubSpot.

**1. What Problem Does This Solve?**
Manually enriching contact lists is tedious, error-prone, and doesn't scale. Whether you're importing leads from events, marketing forms, or partners, this workflow ensures each record is enriched and synced to your CRM, hands-free.

**2. Prerequisites**
To use this template, you'll need:
- A self-hosted or cloud instance of n8n
- A Surfe API key
- A Google Drive and Sheets account (with OAuth or a service account)
- A HubSpot account with access to create/update contacts (via OAuth or a Private App Token)
- The workflow JSON file (included with this tutorial)

**3. Input File Format**
To run the automation, you must upload a Google Spreadsheet to a specific folder in your Drive. The spreadsheet must contain the following columns:
- first name (required)
- last name (required)
- either company name or company domain (at least one is required)
- linkedin url (optional)

Important: any row missing first name, last name, and both company name and company domain will be ignored automatically by the workflow. Each row represents a person to enrich. We recommend including the linkedin url if available, but it's not mandatory.

**4. Setup Instructions**

4.1 Create Your Credentials in n8n

4.1.1 Google Drive
To connect Google Drive and Google Sheets in your workflow, you need to authorize n8n via Google OAuth 2.0 using a Client ID and Client Secret from the Google Cloud Console.

Step 1: Create a Google Cloud project
- Visit the Google Cloud Console
- Create a new project or select an existing one
- Navigate to APIs & Services → OAuth consent screen

Step 2: Configure the OAuth consent screen
- Enter the app name (e.g. n8n Integration) and a user support email
- Choose the audience type: Internal if you're using a Google Workspace account, External if using a personal Gmail account
- Under Contact information, enter an email address
- Click Save and Continue

Step 3: Create OAuth client credentials
- Go to APIs & Services → Credentials
- Click + Create Credentials → OAuth Client ID
- Select Web application as the application type
- Name it (e.g. n8n Google Drive Access)
- In Authorized redirect URIs, paste this: https://oauth.n8n.cloud/oauth2/callback (or your self-hosted n8n redirect URI)
- Click Create, then copy the Client ID and Client Secret

Step 4: Finish setup in n8n
- In n8n, go to Credentials → Create New → Google Drive / Google Sheets
- Choose OAuth2
- Paste your Client ID, Client Secret and Redirect URL (it should match the Google Console)
- Click Sign in with Google, authorize access and save the credential

Your Google Drive is now ready to use in workflows.
4.1.2 Google Sheets OAuth2 API
- Go to n8n → Credentials
- Create new credentials of type Google Sheets OAuth2 API
- A pop-up will open where you can log in to the Google account from which you will read the Google Sheets
- When it's done, the credential should appear in n8n

4.1.3 Gmail OAuth2 API
- Go to n8n → Credentials
- Create new credentials of type Gmail OAuth2 API
- A pop-up window will appear where you can log in with the Google account that is linked to Gmail
- Make sure you grant email send permissions when prompted

4.1.4 Surfe API
- In your Surfe dashboard → Use Surfe API → copy your API key
- Go to n8n → Credentials → Create Credential
- Choose Credential Type: Bearer Auth
- Name it something like "SURFE API Key"
- Paste your API key into the Bearer Token field and save

4.1.5 HubSpot (Private App Token)
- Go to HubSpot → Settings → Integrations → Private Apps
- Create an app with the scopes: crm.objects.contacts.read, crm.objects.contacts.write, crm.schemas.contacts.read
- Save the app token
- Go to n8n → Credentials → Create Credential → HubSpot App Token
- Paste your app token

You are now all set for the credentials.

4.2 Import and Configure the n8n Workflow
- Import the provided JSON workflow into n8n: create a new blank workflow, click the ... menu on the top left, then Import from File

4.2.1 Link nodes to your credentials
In the workflow, link your newly created credentials to each node in this list:
- Google Drive node → Credentials to connect with → Google Drive account
- Google Sheets node → Credentials to connect with → Google Sheets account
- Gmail node → Credentials to connect with → Gmail account
- Surfe HTTP nodes → Authentication → Generic Credential Type; Generic Auth Type → Bearer Auth; Bearer Auth → select the credentials you created before
- HubSpot node → Credentials to connect with → select your HubSpot credentials in the list

4.2.2 Additional setup for the Google Drive Trigger node

**5. How This n8n Workflow Works**
- A new Google Sheet containing a linkedin_url column is added to a specific folder in Google Drive
- n8n detects the new file automatically via the Google Drive Trigger
- All rows are read and batched in groups of 500 to comply with Surfe's API limits
- Each batch is sent to Surfe's Bulk Enrichment API
- n8n polls Surfe until the enrichment job is complete
- It extracts the enriched contact data from Surfe's response
- Only contacts with both an email and a phone number are kept (see the filter sketch at the end of this section)
- These validated leads are pushed to HubSpot
- Finally, a Gmail notification is sent to confirm the job is complete

**6. Use Cases**
- Post-event contact enrichment: after a trade show, upload a list of LinkedIn profile URLs from badge scans or lead capture forms
- Outbound LinkedIn campaign follow-ups: gather LinkedIn URLs from manual outreach and enrich them into usable CRM leads
- CRM data enhancement: use LinkedIn URLs to fill in missing contact info for existing or imported contacts
- List building from LinkedIn exports: upload a list of LinkedIn profiles (e.g. from Sales Navigator) and turn them into fully enriched contacts in HubSpot

**7. Customization Ideas**
- Add retry logic for failed Surfe enrichment jobs
- Log enriched contacts into a Google Sheet or Airtable
- Add pre-check logic to avoid creating duplicates in HubSpot
- Extend the flow to generate a basic summary report of enriched vs. rejected contacts

**8. Summary**
This workflow turns a basic Google Sheet of LinkedIn URLs into fully enriched, CRM-ready contacts, automatically synced with HubSpot.
Just upload your file. Let Surfe do the rest.
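As a reference for the validation step mentioned in section 5, the sketch below keeps only contacts that came back from Surfe with both an email and a phone number. The field names (`email`, `phone`) are assumptions about the enriched response; adjust them to the actual Surfe fields in your workflow.

```javascript
// Sketch of the "keep only fully enriched contacts" filter (Code node).
// Field names are assumptions about Surfe's enrichment response.
const enriched = $input.all().map(item => item.json);

const valid = enriched.filter(contact => Boolean(contact.email) && Boolean(contact.phone));

return valid.map(contact => ({ json: contact }));
```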
by AppStoneLab Technologies LLP
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

AI Image Generator Telegram Bot

Transform simple text descriptions into stunning AI-generated images through a Telegram bot powered by Google's Gemini 2.0 Flash image generation. This workflow automatically enhances user prompts with professional prompt engineering techniques and delivers high-quality images directly to your Telegram chat.

**What This Workflow Does**
This automation creates an intelligent Telegram bot that:
- **Receives text messages** from users describing the image they want
- **Enhances prompts** using AI-powered prompt engineering to add artistic details, lighting, composition, and style specifications
- **Generates images** using Google's Gemini 2.0 Flash image generation model
- **Delivers results** instantly back to the user's Telegram chat

**Key Features**
- **Smart Prompt Enhancement**: Automatically transforms basic requests like "a cat on a windowsill" into detailed, professional prompts with lighting, composition, and style details
- **Professional Image Quality**: Leverages Google's latest Gemini 2.0 Flash model for high-quality image generation
- **Instant Delivery**: Images are generated and sent back to users within seconds
- **User-Friendly**: Simple text-to-image conversion through the familiar Telegram interface
- **Structured Output**: Uses a JSON schema to ensure consistent prompt formatting

**How It Works**
1. Telegram Trigger: Listens for incoming messages from users
2. AI Prompt Enhancement: Uses Gemini 2.5 Flash Lite to analyze and expand user requests into detailed prompts
3. Structured Processing: Formats the enhanced prompt using a JSON schema for consistency
4. Image Generation: Sends the enhanced prompt to the Gemini 2.0 Flash image generation API (see the sketch at the end of this section)
5. File Conversion: Converts the generated image data to a file format
6. Delivery: Sends the generated image back to the user via Telegram

**Use Cases**
- Creative Content Creation: Generate artwork, illustrations, and visual concepts
- Social Media Content: Create unique images for posts and stories
- Concept Visualization: Turn ideas into visual representations
- Educational Materials: Generate images for presentations and learning materials
- Personal Projects: Create custom artwork for personal use

**Prerequisites**
Before setting up this workflow, you'll need:

Required accounts & APIs:
- **Telegram Bot Token**: Create a bot via @BotFather
- **Google AI Studio Account**: Access to Gemini API services

Required credentials:
- Telegram Bot API credentials
- Google Gemini API key with access to Gemini 2.5 Pro (for prompt enhancement) and Gemini 2.0 Flash Preview (for image generation)

**Setup Instructions**
1. Import the workflow: Copy the workflow JSON into your n8n instance
2. Configure the Telegram bot: Create a new bot using @BotFather and add your bot token to the Telegram credentials in n8n
3. Set up the Google Gemini API: Create an API key from Google AI Studio and add it to the n8n credentials
4. Activate the workflow: Enable the workflow to start receiving messages

**Customization Options**
- **Prompt Templates**: Modify the prompt enhancement instructions for different artistic styles
- **Image Parameters**: Adjust generation settings for different image formats and qualities
- **Response Formats**: Customize how images are delivered (with captions, multiple formats, etc.)

**Example Interaction**

User input: "Generate a beautiful image of a dragon from game of thrones"

Enhanced prompt: "A majestic, formidable dragon, reminiscent of those from Game of Thrones, perched atop a jagged, snow-capped mountain peak. The dragon is depicted with immense, leathery wings unfurled, scales glinting with iridescent blues and deep charcoals under the harsh, cold light. Its eyes glow with an inner, fiery amber. The scene is captured with a dramatic, low-angle wide shot, emphasizing the dragon's sheer scale against a turbulent, stormy sky filled with dark, bruised clouds and streaks of lightning. The atmosphere is oppressive and foreboding, with biting wind and swirling snow creating a sense of raw power and danger. The color palette is dominated by icy blues, stark greys, deep blacks, and the contrasting fiery glow of the dragon's eyes and perhaps a hint of internal fire. Rendered in a hyperrealistic, cinematic digital art style, with exceptional attention to detail on the scales, musculature, and the texture of the rocky environment. Lighting is dramatic and high-contrast, with sharp highlights on the dragon's form and deep, impenetrable shadows. Quality specifications include ultra-high detail, 8K resolution, photorealistic rendering, and an epic, awe-inspiring mood, evoking the grandeur and terror of powerful fantasy creatures."

Result: the generated image delivered back in the Telegram chat.

**Technical Details**
- **AI Models**: Google Gemini 2.5 Pro (prompt enhancement) + Gemini 2.0 Flash Preview (image generation)
- **Messaging**: Telegram Bot API
- **Output Format**: High-quality images in standard formats
- **Processing Time**: Typically 10-15 seconds per image with Gemini 2.5 Flash and 25-30 seconds with Gemini 2.5 Pro
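For readers who want to see the image-generation call itself, here is a hedged sketch of the HTTP request a Code node could make to Google's generateContent endpoint. The model name, `responseModalities` setting and response handling follow Google's public Gemini API, but verify them against the current documentation before relying on this.

```javascript
// Sketch of the Gemini image-generation call (Code node) - verify the model name
// and options against Google's current Gemini API docs before use.
const res = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash-preview-image-generation:generateContent',
  qs: { key: $json.geminiApiKey },
  body: {
    contents: [{ parts: [{ text: $json.enhancedPrompt }] }],
    generationConfig: { responseModalities: ['TEXT', 'IMAGE'] },
  },
  json: true,
});

// Pull the base64 image from the first candidate; a Convert to File node can
// then turn it into a binary attachment for the Telegram "send photo" step.
const imagePart = res.candidates?.[0]?.content?.parts?.find(p => p.inlineData);
return [{ json: { imageBase64: imagePart?.inlineData?.data ?? null } }];
```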
by Agentick AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n template automates candidate outreach, call transcription, and structured feedback capture for HR teams and recruiters. It triggers on a new candidate row added in a Google Sheet, initiates a call using Vapi.ai, processes the transcript using Google Gemini, extracts key information like CTC, experience, and notice period, and then updates the same Google Sheet with the parsed insights. This is ideal for recruiters or HR teams conducting high-volume candidate outreach who want to scale initial data collection using automated voice bots and AI transcription analysis.

**How it works**
- Trigger: Listens for new rows added to a Google Sheet (e.g., a new candidate lead).
- Call Initiation: Uses Vapi.ai to make a phone call to the candidate using an assistant bot (see the sketch below).
- Transcript Retrieval: After the call, fetches the conversation transcript from the Vapi API.
- AI Transcript Analysis: Google Gemini parses the transcript and extracts structured fields such as work experience, current and expected CTC, notice period and negotiability, and work preferences and location.
- Data Mapping: Extracted insights are mapped to structured JSON fields.
- Google Sheet Update: The same row in the source sheet is updated with the collected information.

**Use Cases**
- Pre-screening calls for job applicants
- Collecting missing candidate information asynchronously
- Replacing manual HR data entry with AI-powered automation
- Smart CRM updates from voice interactions

**Requirements**
Before you run this workflow, ensure the following:
- Google account with access to the Google Sheets API
- Vapi.ai account with an assistant ID, phone number ID, and an active API key
- Google Gemini API (via PaLM) enabled
- n8n version 1.40.0 or later with the relevant credentials configured

**How to use**
1. Import the workflow into n8n.
2. Set up your credentials for Google Sheets Trigger, Google Sheets, Vapi.ai (add a Bearer token), and Google Gemini.
3. Replace the placeholder values for the assistant ID, phone number ID, and Google Sheet ID and tab.
4. Start the workflow and add a row to the Google Sheet.
5. Wait for the automated call and let the AI extract and populate the data.

**Customising this workflow**
- Replace Google Gemini with OpenAI or Claude if preferred.
- Add sentiment analysis on the transcript using an LLM.
- Modify the sheet column structure to add additional fields.
- Add a filter node to skip candidates with incomplete phone numbers.
- Use a Webhook trigger instead of Google Sheets to integrate with job portals or an ATS.
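The call-initiation step can be sketched as a single HTTP request to Vapi's create-call endpoint, as below. The body fields mirror the requirements listed above (assistant ID, phone number ID, candidate phone number), but confirm the exact names against Vapi's API documentation.

```javascript
// Sketch of the Vapi call-initiation request (Code node) - confirm field names
// against Vapi's create-call API docs.
const call = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.vapi.ai/call',
  headers: { Authorization: `Bearer ${$json.vapiApiKey}` },
  body: {
    assistantId: $json.assistantId,
    phoneNumberId: $json.phoneNumberId,
    customer: { number: $json.candidatePhone }, // from the new Google Sheet row
  },
  json: true,
});

// The call id is used later to fetch the transcript once the call has ended.
return [{ json: { callId: call.id } }];
```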