by explorium
Automatically enrich prospect data from HubSpot using Explorium and create leads in Salesforce

This n8n workflow streamlines the process of enriching prospect information by automatically pulling data from HubSpot, processing it through Explorium's AI-powered tools, and creating new leads in Salesforce with enhanced prospect details.

Credentials Required

To use this workflow, set up the following credentials in your n8n environment:

HubSpot
- **Type**: App Token (or OAuth2 for broader compatibility)
- **Used for**: triggering on new contacts, fetching contact data

Explorium API
- **Type**: Generic Header Auth
- **Header**: Authorization
- **Value**: Bearer YOUR_API_KEY
- Get your Explorium API key

Salesforce
- **Type**: OAuth2 or Username/Password
- **Used for**: creating new lead records

Go to Settings → Credentials, create these three credentials, and assign them in the respective nodes before running the workflow.

Workflow Overview

Node 1: HubSpot Trigger
This node listens for real-time events from the connected HubSpot account. Once triggered, it passes metadata about the event to the next step in the flow.

Node 2: HubSpot
This node fetches contact details from HubSpot after the trigger event.
- **Credential**: Connected using a HubSpot App Token
- **Resource**: Contact
- **Operation**: Get Contact
- **Return All**: Disabled
It retrieves the full contact details needed for further processing and enrichment.

Node 3: Match prospect
This node sends each contact's data to Explorium's AI-powered prospect matching API in real time.
- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/match
- **Authentication**: Generic Header Auth (using a configured credential)
- **Headers**: Content-Type: application/json
The request body is dynamically built from contact data, typically including: full_name, company_name, email, phone_number, linkedin. These fields are matched against Explorium's intelligence graph to return enriched or validated profiles.
Response output: total_matches, matched_prospects, and a prospect_id. Each response is used downstream to enrich, validate, or create lead information.

Node 4: Filter
This node filters the output of the Match prospect step so that only valid, matched results continue in the flow. Only records that contain at least one matched prospect with a non-null prospect_id are passed forward.
- **Status**: Currently deactivated (as shown by the "Deactivate" label)

Node 5: Extract Prospect IDs from Matched Results
This node extracts all valid prospect_id values from the previously matched prospects and compiles them into a flat array. It loops over all matched items, extracts each prospect_id from the matched_prospects array, and returns a single object with an array of all prospect_ids (a sketch of this logic appears after Node 7 below).

Node 6: Explorium Enrich Contacts Information
This node performs bulk enrichment of contacts by querying Explorium with the list of matched prospect_ids.
- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/contacts_information/bulk_enrich
- **Authentication**: Header Auth (using saved credentials)
- **Headers**: "Content-Type": "application/json", "Accept": "application/json"
It returns enriched contact information, such as:
- **emails**: professional/personal email addresses
- **phone_numbers**: mobile and work numbers
- professions_email, professional_email_status, mobile_phone

Node 7: Explorium Enrich Profiles
This additional enrichment node provides supplementary contact data enhancement, running in parallel with the primary enrichment process.
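For reference, here is a minimal sketch of what the "Extract Prospect IDs from Matched Results" Code node (Node 5) might contain, assuming the match response shape described above (a matched_prospects array whose entries carry a prospect_id). Treat it as an illustration rather than the template's exact code.

```javascript
// n8n Code node (Run Once for All Items): collect prospect_ids from every matched item.
// Assumes each incoming item has the Explorium match response shape described above:
// { total_matches, matched_prospects: [{ prospect_id, ... }] }
const prospectIds = [];

for (const item of $input.all()) {
  const matches = item.json.matched_prospects || [];
  for (const match of matches) {
    if (match.prospect_id) {
      prospectIds.push(match.prospect_id);
    }
  }
}

// Return a single item carrying the flat array, ready for the bulk-enrich request body.
return [{ json: { prospect_ids: prospectIds } }];
```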
Node 8: Merge
This node combines the multiple data streams from the parallel enrichment processes into a single output, allowing you to consolidate data from the different Explorium enrichment endpoints. The "combine" setting means it merges the incoming data streams rather than overwriting them.

Node 9: Code - flatten
This custom code node processes and transforms the merged enrichment data before the Salesforce lead is created (a sketch appears at the end of this section). It can be used to:
- Flatten nested data structures
- Format data according to Salesforce field requirements
- Apply business logic or data validation
- Map Explorium fields to Salesforce lead properties
- Handle data type conversions

Node 10: Salesforce
This final node creates new leads in Salesforce using the enriched data returned by Explorium.
- **Credential**: Salesforce OAuth2 or Username/Password
- **Resource**: Lead
- **Operation**: Create Lead
The node creates new lead records with enriched information, including contact details, company information, and professional data obtained through the Explorium enrichment process.

Workflow Flow Summary
1. Trigger: HubSpot webhook fires on new/updated contacts
2. Fetch: Retrieve contact details from HubSpot
3. Match: Find prospect matches using Explorium
4. Filter: Keep only successfully matched prospects (currently deactivated)
5. Extract: Compile prospect IDs for bulk enrichment
6. Enrich: Parallel enrichment of contact information through multiple Explorium endpoints
7. Merge: Combine enrichment results
8. Transform: Flatten and prepare data for Salesforce (Code node)
9. Create: Create new lead records in Salesforce

This workflow ensures comprehensive data enrichment while maintaining data quality, providing a seamless integration between HubSpot prospect data and Salesforce lead creation. The parallel enrichment structure maximizes data collection efficiency before creating high-quality leads in your CRM.
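As noted in the Node 9 description, here is a minimal sketch of what the "Code - flatten" node could look like. The Salesforce Lead object requires LastName and Company; every field name on the Explorium side is an illustrative assumption and should be adjusted to the actual shape returned by your enrichment nodes.

```javascript
// n8n Code node: flatten merged Explorium enrichment data into Salesforce-friendly lead fields.
// Explorium-side field names (emails, phone_numbers, company_name, ...) are assumptions;
// LinkedIn__c is a hypothetical custom field on the Lead object.
return $input.all().map(item => {
  const data = item.json.data || item.json;

  return {
    json: {
      LastName: data.full_name || 'Unknown',
      Company: data.company_name || 'Unknown',
      Email: Array.isArray(data.emails) ? data.emails[0]?.address : data.professional_email,
      Phone: Array.isArray(data.phone_numbers) ? data.phone_numbers[0] : data.mobile_phone,
      LinkedIn__c: data.linkedin,
    },
  };
});
```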
by Miquel Colomer
🎯 Precision Prospecting: Automate LinkedIn Lead Gen with n8n & Bright Data

📝 Overview
This workflow turns n8n into an AI-powered prospector: it automatically searches Google for LinkedIn profiles, scrapes profile data via Bright Data, and summarizes the key details. Ideal for sales and recruitment teams that need targeted lead lists without manual research.

🎥 Workflow in Action
Want to see this workflow in action? You have a chat window output below:

🔑 Key Features
- **AI Chat Trigger**: Start prospecting via conversational prompts.
- **Contextual Memory**: Retains the last 20 messages for coherent dialogue.
- **Automated Google Search**: Generates site-restricted queries and fetches the top result.
- **Bright Data Scraping**: Synchronously scrapes LinkedIn profile details by URL.
- **Intelligent Filtering**: Extracts only valid LinkedIn profile links.
- **Limit Control**: Returns a single, most relevant profile per request.
- **LLM Summary**: Uses GPT-4o-mini to interpret and present the scraped data.

🚀 How It Works (Step-by-Step)

Prerequisites:
- n8n ≥ v1.0 with community nodes: install n8n-nodes-brightdata (an unverified community node).
- API credentials: OpenAI, Bright Data (web unlocker zone "web_unlocker1").
- Webhook endpoint for the chat trigger.

Node Configuration:
- When chat message received (chatTrigger): Fires on the user prompt.
- Simple Memory1 (memoryBufferWindow): Stores the last 20 chat messages.
- AI Prospector Agent (agent): Orchestrates the search logic.
- Get 1 Google Result (brightData): Performs a Google search restricted with site:linkedin.com/in.
- Get Links from Body (html): Extracts all `<a>` hrefs from the search result page.
- Extract Links (splitOut): Splits out individual link entries.
- Filter only LinkedIn Profiles (filter): Ensures the URL contains "linkedin.com/" and starts with "https://" (a Code-node sketch of this check appears at the end of this section).
- Limit (limit): Restricts output to the first valid profile URL.
- Search LinkedIn URI (toolWorkflow): Passes the URL to a secondary workflow that fetches the first link.
- Get LinkedIn Profile Data (brightDataTool): Scrapes the profile JSON.
- OpenAI Chat Model (lmChatOpenAi): Summarizes and formats the scraped data.

Workflow Logic:
1. The user asks for a person by company & name, company & position, or LinkedIn URL.
2. The agent builds a Google query (e.g., site:linkedin.com/in bright data cmo) and calls "Get 1 Google Result."
3. Extracted links are filtered and limited to the top valid profile.
4. If the user provided a direct LinkedIn URL, the agent skips the search and scrapes immediately.
5. The scraped profile JSON is passed to GPT-4o-mini to generate a concise summary.

Testing & Optimization:
- Trigger via Execute Workflow for dry runs.
- Inspect intermediate node outputs in n8n's Execution panel.
- Adjust maxIterations or the memory window length for performance.
- Tune the Bright Data zone or country settings to optimize scraping speed.

Deployment & Monitoring:
- Activate the workflow and expose its webhook URL.
- Use n8n's built-in alerts or external monitoring (e.g., Slack notifications) on failures.
- Rotate credentials via n8n's Credential Vault when needed.
- Version-control the workflow via duplicates or Git-backed n8n instances.

✅ Pre-requisites
- **OpenAI Account**: API key for GPT-4o-mini.
- **Bright Data Account**: Zone "web_unlocker1" and dataset gd_l1viktl72bvl7bjuj0.
- **n8n Version**: v1.0+ with community nodes installed.
- **Permissions**: Webhook access, Credential Vault read/write.

👤 Who Is This For?
- Sales teams automating outbound LinkedIn prospecting.
- Recruiters sourcing candidates without manual scraping.
- Marketing ops looking to enrich CRM data with accurate profile information.
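The filter step above only needs two string checks. If you prefer to express it in a Code node instead of the Filter node, a sketch could look like this (the `href` field name is an assumption about what the "Extract Links" step outputs):

```javascript
// n8n Code node equivalent of the "Filter only LinkedIn Profiles" step.
// Keeps only links that start with https:// and contain linkedin.com/, as described above.
const isLinkedInProfile = (url) =>
  typeof url === 'string' &&
  url.startsWith('https://') &&
  url.includes('linkedin.com/');

return $input.all().filter(item => isLinkedInProfile(item.json.href));
```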
📈 Benefits & Use Cases
- **Efficiency**: Reduces hours of manual search and data entry to seconds.
- **Accuracy**: Filters out non-LinkedIn links and ensures high-quality results.
- **Scalability**: Handle multiple prospect requests concurrently via chat or API.
- **Integration**: Easily hook into CRMs or email sequencers downstream.

Workflow created and verified by Miquel Colomer (https://www.linkedin.com/in/miquelcolomersalas/) and N8nHackers (https://n8nhackers.com).
by WeblineIndia
This workflow is created by AI developers at WeblineIndia. It streamlines content management by automatically identifying and fetching the most recently added Google Doc from your Google Drive, extracting the document's content, and using an AI model to generate a concise, meaningful summary of the extracted text. The summary is then stored in a designated Google Sheet alongside relevant details such as the document name and the date it was added, providing an organized, easily accessible reference for future use. This automation simplifies document handling, enhances productivity, and ensures seamless data management.

Steps:

1. Fetch the Most Recent Document from Google Drive
   - **Action**: Use the Google Drive node.
   - **Details**: List files, filter by date to fetch the most recently added .doc file, and retrieve its file ID and metadata (a sketch of the "pick the newest file" step follows below).

2. Extract Content from the Document
   - **Action**: Use the Google Docs node.
   - **Details**: Set the operation to "Get Content," pass the file ID, and extract the document's text content.

3. Summarize the Document Using an AI Model
   - **Action**: Use an AI model node (e.g., OpenAI, ChatGPT).
   - **Details**: Provide the extracted text to the AI model, use a prompt to generate a summary, and capture the result.

4. Store the Summarized Content in Google Sheets
   - **Action**: Use the Google Sheets node.
   - **Details**: Append a new row to the target sheet with details such as the original document name, summary, and date added.

About WeblineIndia
WeblineIndia specializes in delivering innovative and custom AI solutions to simplify and automate business processes. If you need any help, please reach out to us.
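A minimal sketch of the "pick the newest file" step referenced in step 1, assuming it is done in a Code node placed after the Google Drive list operation. The createdTime, id, and name fields follow the Google Drive API's file metadata; adjust them if your node returns different keys.

```javascript
// n8n Code node: pick the most recently created file from the Google Drive "list files" output.
// Assumes each incoming item is one file with id, name, and createdTime (RFC 3339 string).
const files = $input.all().map(item => item.json);

files.sort((a, b) => new Date(b.createdTime) - new Date(a.createdTime));

// Pass only the newest file forward to the Google Docs node.
return [{
  json: {
    fileId: files[0].id,
    fileName: files[0].name,
    createdTime: files[0].createdTime,
  },
}];
```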
by Miquel Colomer
📝 Overview
This workflow transforms n8n into a smart real-estate concierge by combining an AI chat interface with Bright Data's marketplace datasets. Users interact via chat to specify city, price, bedrooms, and bathrooms, and receive a curated list of three homes for sale, complete with images and briefings.

🎥 Workflow in Action
Want to see this workflow in action? Play the video.

🔑 Key Features
- **AI-Powered Chat Trigger**: Instantly start conversations using LangChain's Chat Trigger node.
- **Contextual Memory**: Retain up to 30 recent messages for coherent back-and-forth.
- **Bright Data Integration**: Dynamically filter "FOR_SALE" properties by city, price, bedrooms, and bathrooms (limit = 3).
- **Automated Snapshot Retrieval**: Poll for dataset readiness and fetch the full snapshot content.
- **HTML-Formatted Output**: Present results as a `<ul>` of `<li>` items, embedding property images.

🚀 How It Works (Step-by-Step)

Prerequisites:
- n8n ≥ v1.0
- Community nodes: install n8n-nodes-brightdata (an unverified community node)
- API credentials: OpenAI, Bright Data
- Webhook endpoint to receive chat messages

Node Configuration:
- Chat Trigger: Listens for incoming chat messages; shows a welcome screen.
- Memory Buffer: Stores the last 30 messages for context.
- OpenAI Chat Model: Uses GPT-4o-mini to interpret user intent.
- Real Estate AI Agent: Orchestrates the filtering logic, calls tools, and formats responses.
- Bright Data "Filter Dataset" Tool: Applies the user-defined filters plus homeStatus = FOR_SALE.
- Wait & Recover Snapshot: Polls until the snapshot is ready, then fetches its content.
- Get Snapshot Content: Converts the raw JSON into a structured list.

Workflow Logic:
1. The user sends search criteria → the agent validates the inputs.
2. The agent invokes "Filter Dataset" once all filters are present (a sketch of the filter payload appears at the end of this template description).
3. Once the dataset is ready, the snapshot is retrieved and parsed.
4. The final output is rendered as a bullet list with property images.

Testing & Optimization:
- Use the built-in Execute Workflow trigger for rapid dry runs.
- Inspect node outputs in n8n's UI; adjust filter defaults or snapshot limits.
- Tune OpenAI model parameters (e.g., maxIterations) for faster responses.

Deployment & Monitoring:
- Activate the main workflow and expose its webhook URL.
- Monitor executions in the "Executions" panel; set up alerts for errors.
- Archive or duplicate workflows as needed; update credentials via the credential manager.

✅ Pre-requisites
- **Bright Data Account**: API key for marketplaceDataset.
- **OpenAI Account**: Access to the GPT-4o-mini model.
- **n8n Version**: v1.0 or later with community node support.
- **Permissions**: Webhook access, credential vault read/write.

👤 Who Is This For?
- Real-estate agencies and brokers seeking to automate client queries.
- PropTech startups building conversational search tools.
- Data analysts who want on-demand property snapshots without manual scraping.

📈 Benefits & Use Cases
- **Time Savings**: Replace manual MLS searches with an AI-driven chat.
- **Scalability**: Serve multiple clients simultaneously via webchat or an embedded widget.
- **Consistency**: Always report exactly three properties, ensuring concise results.
- **Engagement**: Visual listings with images boost user satisfaction and conversion.

Workflow created and verified by Miquel Colomer (https://www.linkedin.com/in/miquelcolomersalas/) and N8nHackers (https://n8nhackers.com).
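To make the "Filter Dataset" call concrete, here is a hedged sketch of how the filter payload could be assembled in a Code node. The fixed homeStatus = FOR_SALE and limit = 3 come from the description above; the operator syntax (lte/gte) and the incoming field names (maxPrice, minBedrooms, minBathrooms) are assumptions, so align them with your Bright Data dataset schema and the agent's output.

```javascript
// n8n Code node: build the filter object handed to the Bright Data "Filter Dataset" tool.
// homeStatus and the limit of 3 are fixed per the workflow description; everything else
// comes from the user's chat input as interpreted by the agent.
const { city, maxPrice, minBedrooms, minBathrooms } = $input.first().json;

const filter = {
  homeStatus: 'FOR_SALE',
  city,
  price: { lte: maxPrice },       // assumed operator shape
  bedrooms: { gte: minBedrooms }, // assumed operator shape
  bathrooms: { gte: minBathrooms },
};

return [{ json: { filter, limit: 3 } }];
```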
by Rajeet Nair
📖 Description

🔹 How it works
This workflow introduces an AI + Human-in-the-Loop pipeline for employee timesheet management. It combines the power of Google Drive, AI (OCR + LLM), and Gmail with a human review step to ensure accuracy and compliance.

1. AI-Powered File Discovery
   - Scans a Google Drive folder for new or updated timesheet files (PDF, Word, Excel, images).
2. AI Data Extraction
   - Uses OCR and an LLM (Mistral) to intelligently read and extract structured data.
   - Supports multiple formats: PDF, Word (DOC/DOCX), Excel (XLS/XLSX), and image files (JPG, PNG, scanned documents).
   - Creates clean JSON with file details and timesheet logs (date, hours worked, tasks, notes).
3. Smart Data Formatting
   - Converts the AI output into a clear HTML summary table for easy review.
   - Flags potential anomalies (missing hours, duplicate dates, irregular entries); see the sketch after the setup steps below.
4. Human-in-the-Loop Verification
   - Sends an approval email via Gmail containing: the file metadata, the AI-generated HTML summary, and a JSON attachment of the raw extracted data.
   - HR/managers review the summary and approve or reject before any final actions occur.
5. Post-Approval Automation (optional)
   - Approved records can be saved in a separate Google Drive folder.
   - Employees or HR receive confirmation emails.

⚙️ Set up steps
1. Connect Credentials
   - Add Google Drive and Gmail credentials in n8n.
   - Configure Mistral (or any LLM) API credentials.
2. Configure Google Drive
   - In the "Search files and folders" node, replace the folderId with your company's timesheet folder ID.
3. Customize Extraction Schema
   - Sticky notes explain how the JSON output is structured. Adapt it to your organization's needs (e.g., overtime, project codes).
4. Set Up Human Verification Emails
   - Update the Gmail node recipients to your HR or approval team.
   - Customize the email body (AI summary + attached JSON file).
5. Activate & Test
   - Enable the workflow.
   - Upload a sample timesheet to trigger the AI + human verification loop.

⚡ Result: A robust AI + Human-in-the-Loop workflow that reduces repetitive data entry, prevents payroll errors, and gives HR full confidence before final approval.
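The anomaly flags mentioned in "Smart Data Formatting" can be produced with a small Code node before the HTML table is built. This is a sketch only: it assumes the LLM extraction was parsed into rows shaped like { date, hours, task, notes }, which you should adapt to your own extraction schema.

```javascript
// n8n Code node: flag simple anomalies in extracted timesheet rows before human review.
// Assumed row shape: { date, hours, task, notes }.
const rows = $input.all().map(item => item.json);
const seenDates = new Set();

const flagged = rows.map(row => {
  const issues = [];
  const hours = Number(row.hours);

  if (!row.hours || hours <= 0) issues.push('missing or zero hours');
  if (hours > 14) issues.push('unusually long day');
  if (seenDates.has(row.date)) issues.push('duplicate date');
  seenDates.add(row.date);

  return { ...row, issues };
});

return flagged.map(row => ({ json: row }));
```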
by Matteo
This n8n workflow automates the handling of incoming emails. It detects and filters out spam, searches a knowledge base (FAQ) stored in a Pinecone vector database, and sends a reply using Gmail, all powered by an AI model (GPT-4o mini).

How It Works
1. Receiving Emails: The Gmail Trigger node checks a Gmail inbox every hour. When a new email arrives, it starts the workflow.
2. Fetching Full Email Content: The get_message node retrieves all the details of the message: sender, subject, text, message ID, etc.
3. Spam Filtering: The Spam checker node uses GPT-4o mini to classify the email as either "spam" or "no spam". It detects not only classic spam but also automated messages (e.g., from Google or Microsoft). If the email is marked as "spam", the workflow ends and nothing further is processed.
4. Conditional Filter: The If node checks the spam result; only "no spam" emails proceed to the AI Agent (a sketch of this check appears at the end of this description).
5. AI-Based Reply: The AI Agent node generates a response based on the email content, a system prompt defining the assistant's behavior (polite, professional, under the name "Total AI Solutions"), and information retrieved from the Pinecone Vector Store, which contains the FAQs. The AI is instructed to always check the vector store before replying, and it prepares both the subject and the body of the reply.
6. Sending the Reply: The Gmail node sends the reply to the original sender, using the original email's ID to keep the thread intact.
7. Language Model: The OpenAI Chat Model node provides GPT-4o mini as the language engine for generating responses.
8. Memory Support: The Simple Memory node maintains short-term context, helpful in multi-turn conversations.
9. Knowledge Base (FAQ): The Pinecone Vector Store node connects to a Pinecone index (faqmattabott) containing vectorized FAQ content. Vectors are created using the Embeddings OpenAI node.
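The conditional filter in step 4 is just a string comparison on the classifier's answer. If the model's output needs normalizing first (extra whitespace, capitalization), a small Code node like the sketch below can sit between the Spam checker and the If node. The `text` field name is an assumption about how the classifier's answer arrives.

```javascript
// n8n Code node: normalize the spam-checker verdict so the If node can compare plain values.
// Assumes the classifier's answer arrives in item.json.text as "spam" or "no spam".
return $input.all().map(item => {
  const verdict = (item.json.text || '').trim().toLowerCase();
  return { json: { ...item.json, isSpam: verdict === 'spam' } };
});
```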
by Jacob
Unlock the full potential of your YouTube channel with this integration connecting Google Sheets and DeepSeek AI, designed to boost your video visibility and engagement without manual hassle.

What this integration does for you:
- Automates video data management by pulling your YouTube URLs straight from Google Sheets: no more copy-pasting or manual tracking (a sketch of the URL-to-video-ID step follows below).
- Extracts your current titles and descriptions directly from YouTube, giving you a clear starting point.
- Generates 3 high-impact, SEO-optimized titles plus 1 compelling, conversion-focused description, crafted by DeepSeek's AI to grab attention and rank higher.
- Updates your Google Sheet automatically with the new optimized titles and descriptions, keeping all your video info in one place and ready to publish.

Why it matters:
In the crowded world of YouTube, the right title and description can make the difference between millions of views and being lost in the noise. This integration takes the guesswork out of optimization, saving you time and boosting your channel's growth with proven AI-driven content.

You need a Google Sheet with these columns:
- Url
- Keyword
- Status
- Old Title
- New Title
- Old Description
- New Description

My contact: jacobmarketingservice@gmail.com
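As referenced in the first bullet, the YouTube nodes need a video ID rather than the full URL stored in the sheet. A sketch of that extraction step in a Code node, assuming the sheet's Url column described above, might look like this:

```javascript
// n8n Code node: extract the YouTube video ID from the sheet's Url column
// so it can be passed to the YouTube node. Handles watch, youtu.be, and shorts URLs.
const extractVideoId = (url) => {
  const match = url.match(/(?:youtube\.com\/watch\?v=|youtu\.be\/|youtube\.com\/shorts\/)([\w-]{11})/);
  return match ? match[1] : null;
};

return $input.all().map(item => ({
  json: { ...item.json, videoId: extractVideoId(item.json.Url || '') },
}));
```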
by Gopal Debnath
💡 How It Works:
- ⏰ Triggers daily at 6:00 AM
- 📊 Fetches one random question from your Google Sheet
- 🧠 Formats the question, options, correct answer, and explanation (a sketch of this step is shown below)
- 📤 Sends it to:
  - 📧 Email
  - 💬 Telegram (via Bot)
  - 📱 WhatsApp/SMS (via Twilio)

🔧 What You Need to Configure:
- YOUR_GOOGLE_SHEET_ID → your sheet with columns: question, optionA, optionB, optionC, optionD, correctAnswer, explanation
- Email credentials (SMTP)
- Telegram Bot Token & Chat ID
- Twilio phone numbers and credentials
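As noted above, the "fetch one random question and format it" step can be done in a single Code node. The sketch below assumes the Google Sheets node returns one item per row with the column names listed in the configuration section.

```javascript
// n8n Code node: pick one random row from the sheet output and format the quiz message.
// Assumed columns: question, optionA, optionB, optionC, optionD, correctAnswer, explanation.
const rows = $input.all().map(item => item.json);
const q = rows[Math.floor(Math.random() * rows.length)];

const message = [
  '🧠 Question of the day:',
  q.question,
  `A) ${q.optionA}`,
  `B) ${q.optionB}`,
  `C) ${q.optionC}`,
  `D) ${q.optionD}`,
  '',
  `✅ Answer: ${q.correctAnswer}`,
  `💡 ${q.explanation}`,
].join('\n');

return [{ json: { ...q, message } }];
```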
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 Market Research & Business Case Study Generator
Category: AI + Research | GPT + Perplexity | Business Strategy
Skill Level: Intermediate
Use Case: Market Research, Business Planning, Strategic Analysis

📌 Description:
This template automates the creation of comprehensive, data-backed business case studies, perfect for entrepreneurs, analysts, consultants, and market researchers. For more builds with step-by-step video tutorials, check out: https://www.youtube.com/@Automatewithmarc

Just send a simple message like: "Give me a market opportunity analysis of a bicycle rental business in North Africa." The workflow does the rest: it scopes your research topic, performs live web research, and crafts a well-structured 1500-word business case study, all automatically saved to Google Docs.

🔧 How It Works:
- 🟢 Chat Trigger: Start the workflow by sending a prompt via the built-in chat interface (LangChain Chat Trigger).
- 🧭 Research Scope Definer (GPT-4o): Breaks down the user input into structured components such as industry, geography, trends, and challenges.
- 🌐 Deep Research (Perplexity Sonar): Performs live research to retrieve relevant industry data, consumer trends, competitive insights, and more.
- 📘 Business Case Writer (Claude Sonnet): Synthesizes the findings into a detailed case study with sections including: Executive Summary, Market Overview, Opportunity Analysis, Competitive Landscape, Risks & Challenges, Strategic Recommendations, and Conclusion.
- 📄 Google Docs Integration: The final output is appended to a connected Google Doc, so all your insights are neatly stored and ready to share.

🧰 Tools Used:
- OpenAI GPT-4o
- Perplexity Sonar Deep Research
- Anthropic Claude Sonnet
- Google Docs
- Chat Trigger

✅ Ideal For:
- Business consultants & strategy teams
- Market researchers & analysts
- Startup founders & product managers
- Educators & MBA students
by Darsheel
This n8n workflow acts as an AI-powered inbox assistant that automatically summarizes and classifies Gmail emails, prioritizes important messages, and sends a daily digest to Slack. It's ideal for startup founders and small teams juggling investor intros, customer leads, and support queries from a busy Gmail inbox.

Each email is processed with ChatGPT to generate a concise summary, classify the message (e.g., Support, Investor, Spam), and determine its urgency. High- and medium-priority messages are forwarded to Slack instantly; lower-priority emails are logged to Google Sheets for review. A daily 7 PM digest summarizes the day's most important messages.

💡 Use Cases
- Preventing missed investor or lead emails
- Lightweight CRM alternative using Google Sheets
- Slack summaries of critical Gmail activity

🔧 How It Works
1. The Gmail node fetches new messages.
2. ChatGPT summarizes each one and extracts its urgency and type.
3. High/medium urgency → sent to Slack and labeled in Gmail (a sketch of this routing step appears at the end of this description).
4. Low urgency → logged in Google Sheets.
5. A Cron node triggers the daily 7 PM Slack summary.

✅ Requirements
- OpenAI API key (GPT-4 or GPT-4o recommended)
- Gmail access with read and label permission
- Slack Bot Token or Webhook URL
- Google Sheets integration (optional)

🛠 Customization Ideas
- Replace Slack with Telegram or WhatsApp
- Route investor leads to Airtable or Notion
- Add multi-language support in the ChatGPT prompt
- Create weekly summaries via email
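Here is the urgency-routing sketch referenced in the How It Works list. It assumes the ChatGPT step's answer has already been parsed into summary, type, and urgency fields; rename them to match your prompt's actual output.

```javascript
// n8n Code node: derive routing flags from the urgency returned by the ChatGPT step.
// Assumed input shape per item: { summary, type, urgency: "high" | "medium" | "low" }.
return $input.all().map(item => {
  const urgency = (item.json.urgency || 'low').toLowerCase();
  return {
    json: {
      ...item.json,
      sendToSlack: urgency === 'high' || urgency === 'medium',
      logToSheet: urgency === 'low',
    },
  };
});
```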
by Vlad Temian
Description
This workflow automates a video content pipeline that generates creative Instagram Reel videos using AI. It combines OpenAI's GPT-4o-mini for idea generation with Sisif.ai's text-to-video technology to produce engaging short-form content automatically.

Perfect for: content creators, social media managers, marketing teams, and anyone who wants to maintain a consistent flow of AI-generated video content without manual intervention.

Prerequisites
- **Sisif.ai Account**: Sign up at sisif.ai and get your API token from https://sisif.ai/users/api-keys/
- **OpenAI Account**: Get your API key from the OpenAI platform
- **n8n Instance**: Self-hosted or cloud instance

Step-by-step setup
1. Import the workflow in n8n.
2. Create OpenAI API credentials here.
3. Create Sisif.ai API credentials here.
4. Add the OpenAI API & Sisif.ai API credentials in n8n.
5. Open the blue sticky note → edit topic, style, duration, resolution.
6. Enable the Cron trigger (defaults to every 6 h).
7. Run once to test.
8. Activate when ready.

How it Works
The workflow operates on a scheduled cycle, generating fresh video content every 6 hours:
- 🤖 AI Idea Generation: OpenAI's GPT-4o-mini acts as a creative video strategist, generating unique, trend-aware video concepts optimized for Instagram and social media.
- 🎬 Video Creation: Sisif.ai transforms each creative prompt into a high-quality 5-second video at 360x640 resolution.
- ⏱️ Smart Monitoring: The workflow intelligently monitors video generation progress, waiting for completion before proceeding (see the sketch at the end of this description).
- 📊 Data Processing: The final video data is structured and prepared for further use or storage.

Key Features

⚡ Fully Automated
- Runs every 6 hours without manual intervention
- Generates 4 unique videos daily (28 videos per week)
- Self-monitoring with automatic retry logic

🎯 Optimized for Social Media
- Instagram-friendly 360x640 resolution
- 5-second duration for maximum engagement
- Trend-aware content generation
- Action-packed, visual storytelling

🔧 Smart Architecture
- Simple HTTP requests for reliable operation
- Bearer token authentication for secure API access
- Automatic status checking and waiting logic
- Error handling and retry mechanisms
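The "Smart Monitoring" step referenced above typically pairs an HTTP status check with a Wait node that loops until the video is ready. The sketch below shows only the completion check in a Code node; the status values and the video_url field are illustrative placeholders, so use the field names the Sisif.ai API actually returns.

```javascript
// n8n Code node: decide whether the Sisif.ai generation job is finished, so the workflow
// can either continue or loop back through a Wait node and poll again.
// The status values and video_url field below are placeholders, not the documented API.
const job = $input.first().json;
const isReady = ['completed', 'ready', 'finished'].includes((job.status || '').toLowerCase());

return [{
  json: {
    ...job,
    isReady,
    videoUrl: isReady ? job.video_url : null,
  },
}];
```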
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🧠 AI News Update Every 24 Hours (with Perplexity + GPT Formatter)

Description:
This workflow automatically delivers a clean, AI-curated summary of the latest AI news headlines from the past 24 hours directly to your inbox, every morning at 9 AM. For a step-by-step video tutorial of this build, watch: https://youtu.be/O-DLvaMVLso

🧰 How It Works:
- 🕘 Schedule Trigger: Runs daily at 9 AM to start the workflow.
- 🔎 Perplexity AI Agent: Searches for AI-related headlines published in the last 24 hours, returning for each story the headline, a 1-sentence summary, the source, and the full URL.
- 🧠 GPT Formatter AI Agent: Uses an OpenAI language model (GPT-4.1-mini) to reformat the raw news data into a clean, readable email update (a Code-node alternative to this step is sketched below).
- 🧷 Memory Buffer (optional): Gives the formatter context and continuity if you want to personalize formatting further over time.
- 📧 Gmail Node: Sends the formatted AI news digest to your inbox (or your team's) daily.

📦 Tools & APIs Required:
- ✅ Perplexity AI API
- ✅ OpenAI API
- ✅ Gmail account (OAuth2 credentials)

🔄 Use Cases:
- Daily AI trend monitoring for individuals or teams
- Automating internal market intelligence
- Research triggers for blog or content creation
- Email digests for newsletters or Slack updates

🛠️ Customizable Ideas:
- Swap Gmail for Slack, Telegram, Discord, etc.
- Modify the topic (e.g., Climate Tech, Crypto News)
- Add sentiment analysis or AI-generated commentary
- Send the summary to Google Docs or Notion instead
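For reference, here is the Code-node alternative to the GPT formatting step mentioned above. It assumes the Perplexity agent's output has been parsed into items with headline, summary, source, and url fields (the fields listed in How It Works); in the actual template this formatting is handled by the GPT Formatter agent.

```javascript
// n8n Code node: turn structured headlines into a simple HTML body for the Gmail node.
// Assumed input shape per item: { headline, summary, source, url }.
const stories = $input.all().map(item => item.json);

const html = [
  '<h2>🧠 AI News: last 24 hours</h2>',
  ...stories.map(s => `
    <p>
      <strong>${s.headline}</strong><br/>
      ${s.summary}<br/>
      <a href="${s.url}">${s.source}</a>
    </p>`),
].join('\n');

return [{ json: { subject: 'Daily AI News Digest', htmlBody: html } }];
```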