by Dr. Firas
# Generate AI Viral Videos with VEO3 and Auto-Publish to TikTok

## Who is this for?
This workflow is for content creators, marketers, and social media managers who want to consistently produce viral-style short videos and publish them automatically to TikTok — without manual editing or uploading.

## What problem is this workflow solving? / Use case
Creating short-form video content that stands out takes time: ideation, scriptwriting, video generation, and publishing. This workflow automates the entire pipeline — from idea generation to TikTok upload — enabling you to scale your content strategy and focus on creativity rather than repetitive tasks.

## What this workflow does
- **Generates viral video ideas** daily using GPT-5
- **Creates structured prompts** for before/after transformation videos
- **Renders cinematic vertical videos** with VEO3 (9:16 format)
- **Saves ideas and metadata** into Google Sheets for tracking
- **Uploads videos automatically to TikTok** via Blotato integration
- **Updates status in Google Sheets** once the video is live

The result: a fully automated daily viral video publishing system.

## Setup
1. **Google Sheets**: Connect your Google Sheets account. Create a sheet with columns for idea, caption, environment, sound, production, and final_output.
2. **OpenAI**: Add your OpenAI API credentials (for GPT-5 mini / GPT-4.1 mini).
3. **VEO3 (Kie API)**: Set up your API key in the HTTP Request node (Generate Video with VEO3). An illustrative request sketch appears at the end of this description.
4. **Blotato**: Connect your Blotato account for TikTok publishing.
5. **Schedule Trigger**: Adjust the Start Daily Content Generation node to fit your preferred posting frequency.

## How to customize this workflow to your needs
- **Platforms**: Extend publishing to YouTube Shorts or Instagram Reels by duplicating the TikTok step.
- **Frequency**: Change the Schedule Trigger to post multiple times per day or only a few times per week.
- **Creative style**: Modify the system prompts to align with your brand's style (cinematic, minimalist, neon, etc.).
- **Tracking**: Enhance the Google Sheets logging with engagement metrics by pulling TikTok analytics via Blotato.

This workflow helps you build a hands-free, AI-powered content engine, turning raw ideas into published viral videos every day.

🎥 Watch This Tutorial: Step by Step
📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
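For reference, here is a minimal sketch of what the Generate Video with VEO3 HTTP Request node might send. The endpoint path, field names, and response shape are assumptions for illustration only; confirm them against Kie's current VEO3 API documentation before use.

```javascript
// Illustrative sketch of the VEO3 generation request. The endpoint path and
// payload field names are assumptions, not Kie's confirmed API contract.
const KIE_API_KEY = 'YOUR_KIE_API_KEY'; // store in n8n credentials, not inline

const response = await fetch('https://api.kie.ai/api/v1/veo/generate', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${KIE_API_KEY}`,
  },
  body: JSON.stringify({
    prompt: $json.production, // structured before/after prompt from GPT-5
    aspectRatio: '9:16',      // vertical format for TikTok
  }),
});

const result = await response.json();
// Most video-generation APIs return a job/task id to poll rather than the
// finished file, which is why the workflow waits before fetching the output.
return [{ json: { taskId: result.taskId ?? result.id, raw: result } }];
```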
by Avkash Kakdiya
## How it works
This workflow enriches and personalizes your lead profiles by integrating HubSpot contact data, scraping social media information, and using AI to generate tailored outreach emails. It streamlines the process from contact capture to sending a personalized email — all automatically.

The system fetches new or updated HubSpot contacts, verifies and enriches their Twitter/LinkedIn data via Phantombuster, merges the profile and engagement insights, and finally generates a customized email ready for outreach.

## Step-by-step

### 1. Trigger & Input
- **HubSpot Contact Webhook**: Fires when a contact is created or updated in HubSpot.
- **Fetch Contact**: Pulls the full contact details (email, name, company, and social profiles).
- **Update Google Sheet**: Logs Twitter/LinkedIn usernames and marks their tracking status.

### 2. Validation
- **Validate Twitter/LinkedIn Exists**: Checks if the contact has a valid social profile before proceeding to scraping.

### 3. Social Media Scraping (via Phantombuster)
- **Launch Profile Scraper & 🎯 Launch Tweet Scraper**: Triggers Phantombuster agents to fetch profile details and recent tweets.
- **Wait Nodes**: Ensure scraping completes (30–60 seconds).
- **Fetch Profile/Tweet Results**: Retrieves output files from Phantombuster.
- **Extract URL**: Parses the job output to extract the downloadable .json or .csv data file link (a sketch of this step appears at the end of this description).

### 4. Data Download & Parsing
- **Download Profile/Tweet Data**: Downloads scraped JSON files.
- **Parse JSON**: Converts the raw file into structured data for processing.

### 5. Data Structuring & Merging
- **Format Profile Fields**: Maps stats like bio, followers, verified status, likes, etc.
- **Format Tweet Fields**: Captures tweet data and associates it with the lead's email.
- **Merge Data Streams**: Combines tweet and profile datasets.
- **Combine All Data**: Produces a single, clean object containing all relevant lead details.

### 6. AI Email Generation & Delivery
- **Generate Personalized Email**: Feeds the merged data into OpenAI GPT (via LangChain) to craft a custom HTML email using your brand details.
- **Parse Email Content**: Cleans AI output into structured subject and body fields.
- **Send Email**: Automatically delivers the personalized email to the lead via Gmail.

## Benefits
- **Automated Lead Enrichment** — Combines CRM and real-time social media data with zero manual research.
- **Personalized Outreach at Scale** — AI crafts unique, relevant emails for each contact.
- **Improved Engagement Rates** — Targeted messages based on actual social activity and profile details.
- **Seamless Integration** — Works directly with HubSpot, Google Sheets, Gmail, and Phantombuster.
- **Time & Effort Savings** — Replaces hours of manual lookup and email drafting with an end-to-end automated flow.
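As a rough illustration of the Extract URL step, the Code node below scans a Phantombuster fetch-output response for the result file link. The exact response shape varies by agent, so the field names here (output, resultObject) are placeholders to adapt.

```javascript
// Sketch of the Extract URL step. Field names (output, resultObject) are
// assumptions; inspect your Phantombuster agent's actual fetch-output payload.
const payload = $json;

// Some agents embed the file link inside a log/output string...
const text = payload.output ?? '';
const match = text.match(/https:\/\/\S+\.(json|csv)/);

// ...others return it directly in a result object.
const directUrl = payload.resultObject?.fileUrl;

const fileUrl = directUrl ?? (match ? match[0] : null);
if (!fileUrl) {
  throw new Error('No downloadable .json/.csv link found in agent output');
}

return [{ json: { fileUrl } }];
```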
by explorium
# Inbound Agent - AI-Powered Lead Qualification with Product Usage Intelligence

This n8n workflow automatically qualifies and scores inbound leads by combining their product usage patterns with deep company intelligence. The workflow pulls new leads from your CRM, analyzes which API endpoints they've been testing, enriches them with firmographic data, and generates comprehensive qualification reports with personalized talking points—giving your sales team everything they need to prioritize and convert high-quality leads.

DEMO: Template Demo

## Credentials Required
To use this workflow, set up the following credentials in your n8n environment:

### Salesforce
- **Type:** OAuth2 or Username/Password
- **Used for:** Pulling lead reports and creating follow-up tasks
- Alternative CRM options: HubSpot, Zoho, Pipedrive
- Get credentials at Salesforce Setup

### Databricks (or Analytics Platform)
- **Type:** HTTP Request with Bearer Token
- **Header:** Authorization
- **Value:** Bearer YOUR_DATABRICKS_TOKEN
- **Used for:** Querying product usage and API endpoint data
- Alternative options: Datadog, Mixpanel, Amplitude, custom data warehouse

### Explorium API
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Business matching and firmographic enrichment
- Get your API key at Explorium Dashboard

### Explorium MCP
- **Type:** HTTP Header Auth
- **Used for:** Real-time company intelligence and supplemental research
- Connect to: https://mcp.explorium.ai/mcp

### Anthropic API
- **Type:** API Key
- **Used for:** AI-powered lead qualification and analysis
- Get your API key at Anthropic Console

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

## Workflow Overview

### Node 1: When clicking 'Execute workflow'
Manual trigger that initiates the lead qualification process.
- **Type:** Manual Trigger
- **Purpose:** On-demand execution for testing or manual runs

Alternative trigger options:
- **Schedule Trigger:** Run automatically (hourly, daily, weekly)
- **Webhook:** Trigger on CRM updates or new lead events
- **CRM Trigger:** Real-time activation when leads are created

### Node 2: GET SF Report
Pulls lead data from a pre-configured Salesforce report.
- **Method:** GET
- **Endpoint:** Salesforce Analytics Reports API
- **Authentication:** Salesforce OAuth2

Returns raw Salesforce report data including lead contact information, company names, lead source and status, created dates, and custom fields.

CRM alternatives: this node can be replaced with HubSpot, Zoho, or any CRM's reporting API.

### Node 3: Extract Records
Parses the Salesforce report structure and extracts individual lead records.

Extraction logic:
- Navigates the report's factMap['T!T'].rows structure
- Maps data cells to named fields

### Node 4: Extract Tenant Names
Prepares tenant identifiers for usage data queries (see the sketch after Node 5 below).
- **Purpose:** Formats tenant names as SQL-compatible strings for the Databricks query
- **Output:** Comma-separated, quoted list: 'tenant1', 'tenant2', 'tenant3'

### Node 5: Query Databricks
Queries your analytics platform to retrieve API usage data for each lead.
- **Method:** POST
- **Endpoint:** /api/2.0/sql/statements
- **Authentication:** Bearer token in headers
- **Warehouse ID:** Your Databricks cluster ID

Platform alternatives:
- **Datadog:** Query logs via Logs API
- **Mixpanel:** Event segmentation API
- **Amplitude:** Behavioral cohorts API
- **Custom warehouse:** PostgreSQL, Snowflake, BigQuery queries
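As a rough illustration of Nodes 3 and 4, here is a Code-node sketch that walks the Salesforce report structure described above and builds the quoted tenant list for the Databricks query. The dataCells index is an assumption; match it to your report's column order.

```javascript
// Sketch of Extract Records + Extract Tenant Names. Assumes the Salesforce
// Analytics report layout (factMap['T!T'].rows) and that the tenant name is
// the first data cell; adjust the index to match your report's columns.
const report = $json;
const rows = report.factMap?.['T!T']?.rows ?? [];

const tenants = rows
  .map((row) => row.dataCells?.[0]?.label) // tenant identifier cell
  .filter((name) => typeof name === 'string' && name.length > 0);

// Build the SQL-safe, comma-separated quoted list: 'tenant1', 'tenant2', ...
const tenantList = tenants
  .map((name) => `'${name.replace(/'/g, "''")}'`) // escape single quotes
  .join(', ');

return [{ json: { tenantList } }];
```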
### Node 6: Split Out
Splits the Databricks result array into individual items for processing.
- **Field:** result.data_array
- **Purpose:** Transform a single response with multiple rows into separate items

### Node 7: Rename Keys
Normalizes column names from the database query to readable field names.

Mapping:
- 0 → TenantNames
- 1 → endpoints
- 2 → endpointsNum

### Node 8: Extract Business Names
Prepares company names for Explorium enrichment.

### Node 9: Loop Over Items
Iterates through each company for individual enrichment.

### Node 10: Explorium API: Match Businesses
Matches company names to Explorium's business entity database.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Authentication:** Header Auth (Bearer token)

Returns:
- business_id: unique Explorium identifier
- matched_businesses: array of potential matches
- Match confidence scores

### Node 11: Explorium API: Firmographics
Enriches matched businesses with comprehensive company data.
- **Method:** POST
- **Endpoint:** /v1/businesses/firmographics/bulk_enrich
- **Authentication:** Header Auth (Bearer token)

Returns:
- Company name, website, description
- Industry categories (NAICS, SIC, LinkedIn)
- Size: employee count range, revenue range
- Location: headquarters address, city, region, country
- Company age and founding information
- Social profiles: LinkedIn, Twitter
- Logo and branding assets

### Node 12: Merge
Combines API usage data with firmographic enrichment data.

### Node 13: Organize Data as Items
Structures merged data into clean, standardized lead objects.

Data organization:
- Maps API usage by tenant name
- Maps enrichment data by company name
- Combines with original lead information
- Creates a complete lead profile for analysis

### Node 14: Loop Over Items1
Iterates through each qualified lead for AI analysis.
- **Batch size:** 1 (analyzes leads individually)
- **Purpose:** Generate personalized qualification reports

### Node 15: Get many accounts1
Fetches the associated Salesforce account for context.
- **Resource:** Account
- **Operation:** Get All
- **Filter:** Match by company name
- **Limit:** 1 record
- **Purpose:** Link lead qualification back to the Salesforce account for task creation

### Node 16: AI Agent
Analyzes each lead to generate comprehensive qualification reports.

Input data:
- Lead contact information
- API usage patterns (which endpoints were tested)
- Firmographic data (company profile)
- Lead source and status

Analysis process:
- Evaluates lead quality based on usage, company fit, and signals
- Identifies which Explorium APIs the lead explored
- Assesses company size, industry, and potential value
- Detects quality signals (legitimate company email, active usage) and red flags
- Determines optimal sales approach and timing
- Connected to Explorium MCP for supplemental company research if needed

Output: a structured qualification report with:
- **Lead Score:** High Priority, Medium Priority, Low Priority, or Nurture
- **Quick Summary:** Executive overview of lead potential
- **API Usage Analysis:** Endpoints used, usage insights, potential use case
- **Company Profile:** Overview, fit assessment, potential value
- **Quality Signals:** Positive indicators and concerns
- **Recommended Actions:** Next steps, timing, and approach
- **Talking Points:** Personalized conversation starters based on actual API usage

### Node 18: Clean Outputs
Formats the AI qualification report for Salesforce task creation.
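One plausible shape for the Clean Outputs step, assuming the agent returns its report as text in an output field; the score-extraction regex and Task field mapping are illustrative, not the template's exact code.

```javascript
// Illustrative Clean Outputs step: map the AI agent's report text onto the
// fields the Salesforce Task node expects. The 'output' field name and the
// score-extraction regex are assumptions; align them with your agent's output.
const report = $json.output ?? '';

// Pull the lead score line out of the structured report, defaulting to Nurture.
const scoreMatch = report.match(
  /Lead Score:\s*(High Priority|Medium Priority|Low Priority|Nurture)/i);
const leadScore = scoreMatch ? scoreMatch[1] : 'Nurture';

return [{
  json: {
    subject: `Inbound lead qualification: ${leadScore}`,
    description: report, // full report goes into the task body
    priority: leadScore === 'High Priority' ? 'High' : 'Normal',
  },
}];
```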
### Node 19: Update Salesforce Records
Creates follow-up tasks in Salesforce with qualification intelligence.
- **Resource:** Task
- **Operation:** Create
- **Authentication:** Salesforce OAuth2

Alternative output options:
- **HubSpot:** Create tasks or update deal stages
- **Outreach/SalesLoft:** Add to sequences with custom messaging
- **Slack:** Send qualification reports to sales channels
- **Email:** Send reports to account owners
- **Google Sheets:** Log qualified leads for tracking

## Workflow Flow Summary
1. Trigger: Manual execution or scheduled run
2. Pull Leads: Fetch new/updated leads from the Salesforce report
3. Extract: Parse lead records and tenant identifiers
4. Query Usage: Retrieve API endpoint usage data from the analytics platform
5. Prepare: Format data for enrichment
6. Match: Identify companies in the Explorium database
7. Enrich: Pull comprehensive firmographic data
8. Merge: Combine usage patterns with company intelligence
9. Organize: Structure complete lead profiles
10. Analyze: AI evaluates each lead with quality scoring
11. Format: Structure qualification reports for the CRM
12. Create Tasks: Automatically populate Salesforce with actionable intelligence

This workflow eliminates manual lead research and qualification, automatically analyzing product engagement patterns alongside company fit to help sales teams prioritize and personalize their outreach to the highest-value inbound leads.

## Customization Options

### Flexible Triggers
Replace the manual trigger with:
- **Schedule:** Run hourly/daily to continuously qualify new leads
- **Webhook:** Real-time qualification when leads are created
- **CRM Trigger:** Activate on specific lead status changes

### Analytics Platform Integration
The Databricks query can be adapted for:
- **Datadog:** Query application logs and events
- **Mixpanel:** Analyze user behavior and feature adoption
- **Amplitude:** Track product engagement metrics
- **Custom databases:** PostgreSQL, MySQL, Snowflake, BigQuery

### CRM Flexibility
Works with multiple CRMs:
- **Salesforce:** Full integration (pull reports, create tasks)
- **HubSpot:** Contact properties and deal updates
- **Zoho:** Lead enrichment and task creation
- **Pipedrive:** Deal qualification and activity creation

### Enrichment Depth
Add more Explorium endpoints:
- **Technographics:** Tech stack and product usage
- **News & Events:** Recent company announcements
- **Funding Data:** Investment rounds and financial events
- **Hiring Signals:** Job postings and growth indicators

### Output Destinations
Route qualification reports to:
- **CRM updates:** Salesforce, HubSpot (update lead scores/fields)
- **Task creation:** Any CRM task/activity system
- **Team notifications:** Slack, Microsoft Teams, Email
- **Sales tools:** Outreach, SalesLoft sequences
- **Reporting:** Google Sheets, Data Studio dashboards

### AI Model Options
Swap AI providers:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

## Setup Notes
1. **Salesforce report configuration:** Create a report with the required fields (name, email, company, tenant ID) and use its API endpoint
2. **Tenant identification:** Ensure your product usage data includes identifiers that link to CRM leads
3. **Usage data query:** Customize the SQL query to match your database schema and table structure
4. **MCP configuration:** Explorium MCP requires Header Auth—configure credentials properly
5. **Lead scoring logic:** Adjust the AI system prompts to match your ideal customer profile and qualification criteria
6. **Task assignment:** Configure Salesforce task assignment rules or add logic to route to specific sales reps

This workflow acts as an intelligent lead qualification system that combines behavioral signals (what they're testing) with firmographic fit (who they are) to give sales teams actionable intelligence for every inbound lead.
by Pawan
This template sets up a scheduled automation that scrapes the latest news from The Hindu website, uses a Google Gemini AI Agent to filter and analyze the content for relevance to competitive exams like the UPSC Civil Services Examination (CSE) syllabus, and compiles a structured daily digest directly into a Google Sheet. It saves hours of manual reading and note-taking by providing concise summaries, subject categorization, and explicit UPSC importance notes.

## Who's it for
This workflow is essential for:
- **UPSC/CSE aspirants** who require a curated, focused, and systematic daily current affairs digest.
- **Coaching institutes** aiming to instantly generate structured, high-quality study material for their students.
- **Educators and content creators** focused on Governance, Economy, International Relations, and Science & Technology.

## How it works / What it does
This workflow runs automatically every morning (scheduled for 7 AM by default) to generate a ready-to-study current affairs document.

1. **Scraping:** The Schedule Trigger fires an HTTP Request to fetch the latest news links from The Hindu's front page.
2. **Data curation:** The HTML and Code in JavaScript nodes work together to extract and pair every article URL with its title (a sketch of this pairing step appears after the setup section).
3. **Content retrieval:** For each identified link, a second HTTP Request node fetches the entire article body.
4. **AI analysis and filtering:** The AI Agent uses a detailed prompt and the Google Gemini Chat Model to perform two critical tasks:
   - **Filter:** It filters out all irrelevant articles (e.g., sports results, local crime) to keep only the 5–6 most important UPSC-relevant pieces (Polity, Economy, IR, etc.).
   - **Analyze:** For the selected articles, it generates a Brief Summary, identifies the Main Subject, and clearly articulates Why it is Important for the UPSC Exam.
5. **Storage:** The AI Agent calls the integrated Google Sheets Tool to automatically append the structured, analyzed data into your designated Google Sheet, creating your daily ready-made notes.

## Requirements
To deploy this workflow, you need:
- **n8n account** (Cloud or self-hosted).
- **Google Gemini API key** for connecting the Google Gemini Chat Model and powering the AI Agent.
- **Google Sheets credentials** for reading/writing the final compiled digest.
- **Target Google Sheet:** A spreadsheet with the following columns: Date, URL, Subject, Brief Summary, and What is Important.

## How to set up
1. **Credentials setup:** Connect your Google Gemini and Google Sheets accounts via the n8n Credentials Manager.
2. **Google Sheet linking:** In the **Append row in sheet** and **Append row in sheet in Google Sheets1** nodes, replace the placeholder IDs and GIDs with the actual ID and sheet name of your dedicated UPSC notes spreadsheet.
3. **Scheduling:** Adjust the time in the **Schedule Trigger: Daily at 7 AM** node if you want the daily analysis to run at a different hour.
4. **AI customization (optional):** You can refine the System Message in the **AI Agent: Filter & Analyze UPSC News** node to focus the analysis on specific exam phases (e.g., Prelims only) or adjust the priority of subjects.
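To illustrate the data curation step, here is a minimal Code-in-JavaScript sketch that pairs the link URLs and titles extracted by the HTML node. The input field names (links, titles) and the URL prefix check are assumptions; match them to your HTML node's output.

```javascript
// Illustrative pairing step: combine the URL and title arrays produced by the
// HTML extraction node into one item per article. Field names are assumptions.
const { links = [], titles = [] } = $json;

const articles = links
  .map((url, i) => ({ url, title: (titles[i] ?? '').trim() }))
  // keep only absolute article links and drop items with empty titles
  .filter((a) => a.url?.startsWith('https://www.thehindu.com/') && a.title);

// Emit one n8n item per article so the downstream HTTP Request runs per link.
return articles.map((article) => ({ json: article }));
```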
by Emir Belkahia
This workflow helps Customer Success Managers and customer success professionals quickly gather intelligence on clients or prospects by analyzing their recent LinkedIn activity via a simple Slack command.

## Who's it for
CSMs, Account Managers, and Sales professionals who need fast, structured insights about a person's LinkedIn presence before a call, meeting, or outreach.

## What it does (and doesn't do)
✅ It DOES:
- Fetch recent LinkedIn posts from any profile
- Analyze posting frequency and cadence patterns
- Identify top themes and focus areas
- Extract recent highlights with context
- Generate a clean HTML report sent via email

❌ It DOESN'T:
- Access private/non-public LinkedIn content
- Provide real-time updates (it's a snapshot)
- Replace in-depth research when that is needed

Think of it as your personal LinkedIn research assistant that turns a name into actionable intelligence in under a minute.

## How it works
1. **Slack command** - Type /check-linkedin [Full Name] in Slack
2. **Name validation** - AI verifies you provided a full name, not just "John" (a validation sketch appears at the end of this description)
3. **Profile discovery** - Finds the correct LinkedIn profile via Apify
4. **Content scraping** - Pulls their recent posts (last 20)
5. **AI analysis** - GPT-4.1 analyzes posting patterns, topics, and highlights
6. **Report generation** - Creates a formatted HTML email report
7. **Email delivery** - Sends the intelligence brief to your inbox

## Set up steps
Setup time: ~15 minutes

1. Create or use your existing Slack app and add a Slash Command (this can be done at https://api.slack.com/apps)
2. Configure the webhook URL in your Slack app
3. Connect credentials: Slack OAuth, Apify API, OpenAI API, Gmail OAuth
4. Update the email recipient in the "Send report via Email" node
5. Test with a known LinkedIn profile

## Requirements
- Slack workspace (with app installation permissions)
- Apify account with credits
- OpenAI API key (GPT-4.1 access)
- Gmail account
- Apify actors: LinkedIn Profile Finder, LinkedIn Post Scraper

## Cost estimation
~$0.05-0.09 per profile check. You could research 11-20 people for $1.

⚠️ Cost disclaimer: The costs displayed above are indicative only and may vary significantly depending on which Apify actors you select. Some actors incur monthly charges—for example, one of the two actors used in this workflow costs $35/month. So I recommend using this actor only when there's a clear business need for it. For cost optimization, consider switching to alternative actors that deliver similar or simpler functionality at a lower cost. If you plan to use this workflow extensively, I strongly suggest performing a budget assessment and evaluating other actor options to maximize cost efficiency.

The workflow uses GPT-4.1-mini for lightweight classification and GPT-4.1 for the heavy analysis to balance quality and cost.

## Known limitations
Common names have limited accuracy: very common names (e.g., "John Smith") often fail to identify the correct person. An improved version could accept a company name in the slash command as an additional input to narrow down results and improve first-try matching accuracy.

## 💡 Pro tips
- **Check before important meetings:** Run this 15-30 minutes before a call. The email report gives you conversation starters and context about what they care about.
- **Batch your research:** If you have multiple clients or prospects, queue them up. Just remember each lookup costs ~$0.05-0.09.
- **Watch your Apify credits:** The LinkedIn scrapers are the main cost driver. Monitor your Apify usage if you're doing high volume.
- **Don't spam the same profile:** LinkedIn may rate-limit. Space out repeat checks on the same person by at least a few hours.
- **Review the "Quick Scan" section first:** The email report starts with key stats and top focus areas. Perfect for a 30-second pre-call prep.

## What to do after the workflow runs
1. **Check your email** - The report arrives in 30-90 seconds
2. **Review the report** - Latest post date, cadence, and top themes
3. **Read the Recent Activity Summary** - High-level overview of their content
4. **Dive into the Detailed Analysis** - Two main topics with keywords and rationale
5. **Use it strategically:**
   - Reference their recent posts in your outreach
   - Ask about topics they're clearly passionate about
   - Tailor your pitch to their demonstrated interests
   - Avoid generic "saw you on LinkedIn" messages

## Questions or feedback?
📧 emir.belkahia@gmail.com
💼 linkedin.com/in/emirbelkahia
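For reference, a rough sketch of the name-validation pre-check on the Slack slash-command payload. Slack delivers the command argument in the payload's text field; the two-word heuristic here is an assumption, since the template itself uses an AI classifier for this step.

```javascript
// Quick pre-check on the Slack slash-command input before the AI validation.
// Slack sends the command argument in the 'text' field of the payload.
const input = ($json.body?.text ?? '').trim();

// Heuristic only: require at least two name parts ("Jane Doe", not "Jane").
// The template's AI classifier handles edge cases this simple check misses.
const parts = input.split(/\s+/).filter(Boolean);
const looksLikeFullName = parts.length >= 2;

return [{
  json: {
    fullName: input,
    looksLikeFullName,
    validationMessage: looksLikeFullName
      ? null
      : 'Please provide a full name, e.g. /check-linkedin Jane Doe',
  },
}];
```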
by TOMOMITSU ASANO
# Intelligent Invoice Processing with AI Classification and XML Export

## Summary
Automated invoice processing pipeline that extracts data from PDF invoices, uses an AI Agent for intelligent expense categorization, generates XML for accounting systems, and routes high-value invoices for approval.

## Detailed Description
A comprehensive accounts payable automation workflow that monitors for new PDF invoices, extracts text content, uses AI to classify expenses and detect anomalies, converts to XML format for accounting system integration, and implements approval workflows for high-value or unusual invoices.

## Key Features
- **PDF text extraction**: Extract from File node parses invoice PDFs automatically
- **AI-powered classification**: AI Agent categorizes expenses, suggests GL codes, detects anomalies
- **XML export**: Convert structured data to accounting-compatible XML format
- **Approval workflow**: Route invoices over $5,000 or with low confidence for human review
- **Multi-trigger support**: Google Drive monitoring or manual webhook upload
- **Comprehensive logging**: Archive all processed invoices to Google Sheets

## Use Cases
- Accounts payable automation
- Expense report processing
- Vendor invoice management
- Financial document digitization
- Audit trail generation

## Required Credentials
- Google Drive OAuth (for PDF source folder)
- OpenAI API key
- Slack Bot Token
- Gmail OAuth
- Google Sheets OAuth

Node count: 24 (19 functional + 5 sticky notes)

## Unique Aspects
- Uses the Extract from File node for PDF text extraction (rarely used)
- Uses the XML node for JSON-to-XML conversion (very rare)
- Uses the AI Agent node for intelligent classification
- Uses the Google Drive Trigger for file monitoring
- Implements an approval workflow with conditional routing
- **Webhook response** mode for API integration

## Workflow Architecture
```text
[Google Drive Trigger]   [Manual Webhook]
          |                     |
          +----------+----------+
                     |
                     v
            [Filter PDF Files]
                     |
                     v
          [Download Invoice PDF]
                     |
                     v
            [Extract PDF Text]
                     |
                     v
        [Parse Invoice Data] (Code)
                     |
                     v
   [AI Invoice Classifier] <-- [OpenAI Chat Model]
                     |
                     v
        [Parse AI Classification]
                     |
                     v
             [Convert to XML]
                     |
                     v
           [Format XML Output]
                     |
                     v
          [Needs Approval?] (If)
                /        \
     Yes (>$5000)        No (Auto)
            |                 |
   [Email Approval]    [Slack Notify]
            |                 |
            +--------+--------+
                     |
                     v
       [Archive to Google Sheets]
                     |
                     v
         [Respond to Webhook]
```

## Configuration Guide
1. **Google Drive:** Set the folder ID to monitor in the Drive Trigger node
2. **Approval threshold:** Default $5,000; adjust in the "Needs Approval?" node
3. **Email recipients:** Configure finance-approvers@example.com
4. **Slack channel:** Set #finance-notifications for updates
5. **GL codes:** AI suggests codes; customize in the AI prompt if needed
6. **Google Sheets:** Configure the document for the invoice archive
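A minimal sketch of what the Parse Invoice Data Code node might do with the extracted PDF text, including the approval condition from the diagram above. The regexes and field names are illustrative assumptions; real invoices vary widely, which is why the AI classifier runs afterwards.

```javascript
// Illustrative Parse Invoice Data step: pull a few obvious fields out of the
// raw PDF text with regexes before handing off to the AI classifier.
// Patterns and field names are assumptions; adapt to your invoice formats.
const text = $json.text ?? '';

const totalMatch = text.match(/(?:total|amount due)[:\s]*\$?([\d,]+\.\d{2})/i);
const invoiceNoMatch = text.match(/invoice\s*(?:no\.?|#)[:\s]*(\S+)/i);

const total = totalMatch ? parseFloat(totalMatch[1].replace(/,/g, '')) : null;

return [{
  json: {
    invoiceNumber: invoiceNoMatch ? invoiceNoMatch[1] : null,
    total,
    // Mirrors the "Needs Approval?" If node: over $5,000 goes to human review,
    // as does any invoice whose total could not be parsed.
    needsApproval: total === null || total > 5000,
    rawText: text,
  },
}];
```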
by Yusuke
## 🧠 Overview
Discover and analyze the most valuable community-built n8n workflows on GitHub. This automation searches public repositories, analyzes JSON workflows using AI, and saves a ranked report to Google Sheets — including summaries, use cases, difficulty, stars, node count, and repository links.

## ⚙️ How It Works
1. **Search GitHub Code API** — queries for extension:json n8n and splits results
2. **Fetch & Parse** — downloads each candidate file's raw JSON and safely parses it (a sketch of this step appears at the end of this description)
3. **Extract Metadata** — detects AI-powered flows and collects key node information
4. **AI Analysis** — evaluates the top N workflows (description, use case, difficulty)
5. **Merge Insights** — combines AI analysis with GitHub data
6. **Save to Google Sheets** — appends or updates by workflow name

## 🧩 Setup Instructions (5–10 min)
1. Open the Config node and set:
   - search_query — e.g., "openai" extension:json n8n
   - max_results — number of results to fetch (1–100)
   - ai_analysis_top — number of workflows analyzed with AI
   - SPREADSHEET_ID, SHEET_NAME — Google Sheets target
2. Add a GitHub PAT via HTTP Header Credential: Authorization: Bearer <YOUR_TOKEN>
3. Connect an OpenAI credential to the OpenAI Chat Model
4. Connect Google Sheets (OAuth2) to Save to Google Sheets
5. (Optional) Enable the Schedule Trigger to run weekly for automatic updates

## 📚 Use Cases
### 1) Trend Tracking for AI Automations
- **Goal:** Identify the fastest-growing AI-powered n8n workflows on GitHub.
- **Output:** Sorted list by stars and AI detection, updated weekly.

### 2) Internal Workflow Benchmarking
- **Goal:** Compare your organization's workflows against top public examples.
- **Output:** Difficulty, node count, and AI usage metrics in Google Sheets.

### 3) Market Research for Automation Agencies
- **Goal:** Discover trending integrations and tool combinations (e.g., OpenAI + Slack).
- **Output:** Data-driven insights for client projects and content planning.

## 🧪 Notes & Best Practices
- 🔐 No hardcoded secrets — use n8n Credentials
- 🧱 Works with self-hosted or cloud n8n
- 🧪 Start small (max_results = 10) before scaling
- 🧭 Use the "AI Powered" and "Stars" columns in Sheets to identify top templates
- 🧩 Uses only Markdown sticky notes — no HTML formatting required

## 🔗 Resources
- **GitHub (template JSON):** github-workflow-finder-ai.json
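For the fetch-and-parse step, here is a sketch of how a Code node might safely parse each candidate file and detect n8n workflow JSON. The format check (top-level nodes and connections keys) follows the standard n8n export shape, and the AI-node heuristic is an assumption to tune.

```javascript
// Illustrative safe-parse step for each downloaded candidate file. The export
// format check (nodes + connections keys) and the AI-node type heuristic are
// assumptions; tune them against what your search actually returns.
const raw = $json.fileContent ?? '';

let workflow;
try {
  workflow = JSON.parse(raw);
} catch (err) {
  // Not valid JSON: drop this candidate instead of failing the whole run.
  return [];
}

// A real n8n workflow export has both of these top-level keys.
if (!Array.isArray(workflow.nodes) || typeof workflow.connections !== 'object') {
  return [];
}

const aiNodes = workflow.nodes.filter((n) =>
  (n.type ?? '').startsWith('@n8n/n8n-nodes-langchain'));

return [{
  json: {
    name: workflow.name ?? 'unnamed',
    nodeCount: workflow.nodes.length,
    aiPowered: aiNodes.length > 0,
  },
}];
```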
by Igor Chernyaev
## Template name
Smart AI Support Assistant for Telegram

## Short description
Smart AI Support Assistant for Telegram automatically answers repeated questions in your group using a Q&A knowledge base in Pinecone and forwards new or unclear questions to a human expert.

## Long description

### How it works
1. **Question detection** listens to messages in a Telegram group and checks whether each new message is a real question or an expert reply.
2. **Knowledge base search** looks for an existing answer in the Pinecone vector store for valid questions from the group.
3. **Auto-reply from cache** sends the saved answer straight back to the group when a good match is found, without involving the expert (a match-threshold sketch appears after the setup steps).
4. **Escalation to expert** creates a ticket and forwards unanswered questions to the expert in a private chat with the same bot.
5. **Expert learning loop** saves the expert's reply to Pinecone so that similar questions are answered automatically in the future.

### Setup steps
1. Connect the Telegram Trigger to a single Telegram bot that is added as an admin to the group/supergroup and receives all user messages.
2. Use the same bot for the expert: the expert's private chat with this bot is where tickets and questions are delivered.
3. Set up Pinecone: create an index, note the environment and index name, and add your Pinecone API key to n8n credentials.
4. Add your AI model API key (for example, OpenAI) and select the model used for embeddings and answer rewriting.
5. Configure any environment variables or n8n credentials for project IDs and spaces/namespaces used in Pinecone.
6. Test the full flow: send a question in the group, confirm that a ticket reaches the expert in a private chat, reply once, and check that the next similar question is answered automatically from the cache.
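A minimal sketch of the "good match" decision, assuming the Pinecone vector-store query returns matches with a similarity score; the 0.85 threshold and field names are illustrative and should be tuned against your own Q&A data.

```javascript
// Illustrative match-threshold check after the Pinecone similarity search.
// The score field and the 0.85 cutoff are assumptions; tune on real data.
const SIMILARITY_THRESHOLD = 0.85;

const matches = $json.matches ?? [];
const best = matches[0]; // Pinecone returns matches sorted by score

if (best && best.score >= SIMILARITY_THRESHOLD) {
  // Confident match: reply to the group from the cached answer.
  return [{ json: { route: 'auto_reply', answer: best.metadata?.answer } }];
}

// No confident match: escalate as a ticket to the expert's private chat.
return [{ json: { route: 'escalate' } }];
```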
by Oneclick AI Squad
Automate your post-event networking with this intelligent n8n workflow. Triggered instantly after an event, it collects attendee and interaction data, enriches profiles with LinkedIn insights, and uses GPT-4 to analyze engagement and generate tailored follow-up messages. High-value leads are prioritized, messages are sent via email, LinkedIn, or Slack, and all activity is logged in your CRM and database. Save hours of manual follow-up while boosting relationship-building and ROI. 🤝✨

## Advanced Features
- **Webhook automation** – Starts instantly on event completion
- **Multi-source enrichment** – Combines event data, interactions, and LinkedIn profiles
- **AI-powered insights** – GPT-4 analyzes behavior and suggests personalized talking points
- **Smart priority filtering** – Routes leads into High, Medium, and Low priority paths
- **Personalized content generation** – AI crafts custom emails and LinkedIn messages
- **Multi-channel outreach** – Sends via Email, LinkedIn DM, and Slack
- **CRM integration** – Automatically updates HubSpot with contact notes and engagement
- **PostgreSQL logging** – Stores full interaction history and analytics
- **ROI dashboard** – Tracks response rates, meetings booked, and pipeline impact

## What It Does
1. Collects attendee data from your event platform
2. Enriches with LinkedIn profiles & real-time interaction logs
3. Scores networking potential using engagement algorithms
4. Uses AI to analyze conversations, roles, and mutual interests
5. Generates hyper-personalized follow-up emails and LinkedIn messages
6. Sends messages through preferred channels (email, LinkedIn, Slack)
7. Updates HubSpot CRM with follow-up status and next steps
8. Logs all actions and tracks analytics for performance reporting

## Workflow Process
1. The **Webhook Trigger** initiates the workflow via a POST request with event and attendee data.
2. **Get Attendees** fetches the participant list from the event platform.
3. **Get Interactions** pulls Q&A, chat, poll, and networking activity logs.
4. **Enrich LinkedIn Data** retrieves professional profiles, job titles, and company details via the LinkedIn API.
5. **Merge & Enrich Data** combines all sources into a unified lead profile.
6. **AI Analyze Profile** uses GPT-4 to evaluate interaction depth, role relevance, and conversation context.
7. **Filter High Priority** routes top-tier leads (e.g., decision-makers with strong engagement); a routing sketch using the payload thresholds appears at the end of this description.
8. **Filter Medium Priority** handles warm prospects for lighter follow-up.
9. **AI Agent1** generates personalized email content using a chat model and memory.
10. **Generate Email** creates a professional, context-aware follow-up email.
11. **Send Email** delivers the message to the lead's inbox.
12. **AI Agent2** crafts a concise, friendly LinkedIn connection message.
13. **Generate LinkedIn Msg** produces a tailored outreach note.
14. **Send LinkedIn** posts the message via the LinkedIn API.
15. **Slack Notification** alerts your team in real time about high-priority outreach.
16. **Update CRM (HubSpot)** adds the contact, tags, and follow-up tasks automatically.
17. **Save to Database (Insert)** logs the full lead journey and message content in PostgreSQL.
18. **Generate Analytics** compiles engagement metrics and success rates.
19. **Send Response** confirms completion back to the event system.

## Setup Instructions
1. Import the workflow JSON into n8n
2. Configure credentials:
   - Event Platform API (for attendees & interactions)
   - LinkedIn API (OAuth2)
   - OpenAI (GPT-4)
   - SMTP (for email) or an email service (SendGrid, etc.)
   - HubSpot API key
   - PostgreSQL database
   - Slack webhook URL
3. Trigger with a webhook POST containing the event ID and settings
4. Watch personalized outreach happen automatically!
## Prerequisites
- Event platform with webhook + attendee/interaction API
- LinkedIn Developer App with API access
- OpenAI API key with GPT-4 access
- HubSpot account with API enabled
- PostgreSQL database (table for leads & logs)
- Slack workspace (optional, for team alerts)

## Example Webhook Payload
```json
{
  "eventId": "evt_spring2025",
  "eventName": "Annual Growth Summit",
  "triggerFollowUp": true,
  "priorityThreshold": { "high": 75, "medium": 50 }
}
```

## Modification Options
- Adjust scoring logic in AI Analyze Profile (e.g., weight Q&A participation higher)
- Add custom email templates in Generate Email with your brand voice
- Include meeting booking links (Calendly) in high-priority messages
- Route VIP leads to Send SMS via Twilio
- Export analytics to Google Sheets or BI tools (Looker, Tableau)
- Add an approval step before sending LinkedIn messages

Ready to 10x your event ROI? Get in touch with us for custom n8n automation!
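To make the priority routing concrete, here is a sketch of how a Code or If node could apply the payload's priorityThreshold values to the AI engagement score. The engagementScore field name is an assumption about what the analysis step produces.

```javascript
// Illustrative priority routing using the thresholds from the webhook payload.
// Assumes the AI analysis step attached a numeric engagement score per lead;
// the engagementScore field name is an assumption.
const { high = 75, medium = 50 } = $json.priorityThreshold ?? {};
const score = $json.engagementScore ?? 0;

let priority;
if (score >= high) {
  priority = 'high';    // full outreach: email + LinkedIn + Slack alert
} else if (score >= medium) {
  priority = 'medium';  // lighter follow-up path
} else {
  priority = 'low';     // log only, no outreach
}

return [{ json: { ...$json, priority } }];
```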
by explorium
# Research Agent - Automated Sales Meeting Intelligence

This n8n workflow automatically prepares comprehensive sales research briefs every morning for your upcoming meetings by analyzing both the companies you're meeting with and the individual attendees. The workflow connects to your calendar, identifies external meetings, enriches companies and contacts with deep intelligence from Explorium, and delivers personalized research reports—giving your sales team everything they need for informed, confident conversations.

DEMO: Template Demo

## Credentials Required
To use this workflow, set up the following credentials in your n8n environment:

### Google Calendar (or Outlook)
- **Type:** OAuth2
- **Used for:** Reading daily meeting schedules and identifying external attendees
- Alternative: Microsoft Outlook Calendar
- Get credentials at Google Cloud Console

### Explorium API
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Business/prospect matching, firmographic enrichment, professional profiles, LinkedIn posts, website changes, competitive intelligence
- Get your API key at Explorium Dashboard

### Explorium MCP
- **Type:** HTTP Header Auth
- **Used for:** Real-time company intelligence and supplemental research for AI agents
- Connect to: https://mcp.explorium.ai/mcp

### Anthropic API
- **Type:** API Key
- **Used for:** AI-powered company and attendee research analysis
- Get your API key at Anthropic Console

### Slack (or preferred output)
- **Type:** OAuth2
- **Used for:** Delivering research briefs
- Alternative options: Google Docs, Email, Microsoft Teams, CRM updates

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

## Workflow Overview

### Node 1: Schedule Trigger
Automatically runs the workflow on a recurring schedule.
- **Type:** Schedule Trigger
- **Default:** Every morning before business hours
- **Customizable:** Set to any interval (hourly, daily, weekly) or specific times

Alternative trigger options:
- **Manual Trigger:** On-demand execution
- **Webhook:** Triggered by calendar events or CRM updates

### Node 2: Get many events
Retrieves meetings from your connected calendar.
- **Calendar source:** Google Calendar (or Outlook)
- **Authentication:** OAuth2
- **Time range:** Current day + 18 hours (configurable via timeMax)
- **Returns:** All calendar events with attendee information, meeting titles, times, and descriptions

### Node 3: Filter for External Meetings
Identifies meetings with external participants and filters out internal-only meetings.

Filtering logic:
- Extracts attendee email domains
- Excludes your company domain (e.g., 'explorium.ai')
- Excludes calendar system addresses (e.g., 'resource.calendar.google.com')
- Only passes events with at least one external attendee

Important setup note: replace 'explorium.ai' in the code node with your company domain to properly filter internal meetings. (A sketch of this filtering logic appears after Node 5 below.)

Output:
- Events with external participants only
- external_attendees: array of external contact emails
- company_domains: unique list of external company domains per meeting
- external_attendee_count: number of external participants

## Company Research Pipeline

### Node 4: Loop Over Items
Iterates through each meeting with external attendees for company research.

### Node 5: Extract External Company Domains
Creates a deduplicated list of all external company domains from the current meeting.
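As promised above, a rough sketch of the Filter for External Meetings code node. It assumes the Google Calendar event shape (an attendees array of objects with an email field); replace the internal domain with your own.

```javascript
// Illustrative external-meeting filter. Assumes Google Calendar's event shape
// (an attendees array of { email } objects). Replace INTERNAL_DOMAIN with
// your company domain, as the setup note instructs.
const INTERNAL_DOMAIN = 'explorium.ai';

const results = [];
for (const item of $input.all()) {
  const attendees = item.json.attendees ?? [];
  const externalEmails = attendees
    .map((a) => a.email ?? '')
    .filter((email) =>
      email.includes('@') &&
      !email.endsWith(`@${INTERNAL_DOMAIN}`) &&
      !email.endsWith('resource.calendar.google.com'));

  if (externalEmails.length > 0) {
    const domains = [...new Set(externalEmails.map((e) => e.split('@')[1]))];
    results.push({
      json: {
        ...item.json,
        external_attendees: externalEmails,
        company_domains: domains,
        external_attendee_count: externalEmails.length,
      },
    });
  }
}
return results;
```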
### Node 6: Explorium API: Match Business
Matches company domains to Explorium's business entity database.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Authentication:** Header Auth (Bearer token)

Returns:
- business_id: unique Explorium identifier
- matched_businesses: array of matches with confidence scores
- Company name and basic info

### Node 7: If
Validates that a business match was found before proceeding to enrichment.
- **Condition:** business_id is not empty
- **If true:** Proceed to parallel enrichment nodes
- **If false:** Skip to the next company in the loop

### Nodes 8-9: Parallel Company Enrichment

#### Node 8: Explorium API: Business Enrich
- **Endpoints:** /v1/businesses/firmographics/enrich, /v1/businesses/technographics/enrich
- **Enrichment types:** firmographics, technographics
- **Returns:** Company name, description, website, industry, employees, revenue, headquarters location, ticker symbol, LinkedIn profile, logo, full tech stack, nested tech stack by category, BI & analytics tools, sales tools, marketing tools

#### Node 9: Explorium API: Fetch Business Events
- **Endpoint:** /v1/businesses/events/fetch
- **Event types:** New funding rounds, new investments, mergers & acquisitions, new products, new partnerships
- **Date range:** September 1, 2025 - November 4, 2025
- **Returns:** Recent business milestones and financial events

### Node 10: Merge
Combines enrichment responses and events data into a single data object.

### Node 11: Cleans Merge Data Output
Transforms merged enrichment data into a structured format for AI analysis.

### Node 12: Company Research Agent
AI agent (Claude Sonnet 4) that analyzes company data to generate actionable sales intelligence.

Input: structured company profile with all enrichment data

Analysis focus:
- Company overview and business context
- Recent website changes and strategic shifts
- Tech stack and product focus areas
- Potential pain points and challenges
- How Explorium's capabilities align with their needs
- Timely conversation starters based on recent activity

Connected to Explorium MCP: can pull additional real-time intelligence if needed to create more detailed analysis.

### Node 13: Create Company Research Output
Formats the AI analysis into a readable, shareable research brief.

## Attendee Research Pipeline

### Node 14: Create List of All External Attendees
Compiles all unique external attendee emails across all meetings.

### Node 15: Loop Over Items2
Iterates through each external attendee for individual enrichment.

### Node 16: Extract External Company Domains1
Extracts the company domain from each attendee's email.

### Node 17: Explorium API: Match Business1
Matches the attendee's company domain to get the business_id for prospect matching.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Purpose:** Link the attendee to their company

### Node 18: Explorium API: Match Prospect
Matches the attendee email to Explorium's professional profile database.
- **Method:** POST
- **Endpoint:** /v1/prospects/match
- **Authentication:** Header Auth (Bearer token)

Returns:
- prospect_id: unique professional profile identifier

### Node 19: If1
Validates that a prospect match was found.
- **Condition:** prospect_id is not empty
- **If true:** Proceed to prospect enrichment
- **If false:** Skip to the next attendee
### Node 20: Explorium API: Prospect Enrich
Enriches the matched prospect using multiple Explorium endpoints.
- **Enrichment types:** contacts, profiles, linkedin_posts
- **Endpoints:** /v1/prospects/contacts/enrich, /v1/prospects/profiles/enrich, /v1/prospects/linkedin_posts/enrich

Returns:
- **Contacts:** Professional email, email status, all emails, mobile phone, all phone numbers
- **Profiles:** Full professional history, current role, skills, education, company information, experience timeline, job titles and seniority
- **LinkedIn posts:** Recent LinkedIn activity, post content, engagement metrics, professional interests and thought leadership

### Node 21: Cleans Enrichment Outputs
Structures prospect data for AI analysis.

### Node 22: Attendee Research Agent
AI agent (Claude Sonnet 4) that analyzes prospect data to generate personalized conversation intelligence.

Input: structured professional profile with activity data

Analysis focus:
- Career background and progression
- Current role and responsibilities
- Recent LinkedIn activity themes and interests
- Potential pain points in their role
- Relevant Explorium capabilities for their needs
- Personal connection points (education, interests, previous companies)
- Opening conversation starters

Connected to Explorium MCP: can gather additional company or market context if needed.

### Node 23: Create Attendee Research Output
Formats the attendee analysis into a readable brief with clear sections.

### Node 24: Merge2
Combines the company research output with attendee information for final assembly.

### Node 25: Loop Over Items1
Manages the final loop that combines company and attendee research for output.

### Node 26: Send a message (Slack)
Delivers combined research briefs to the specified Slack channel or user.

Alternative output options:
- **Google Docs:** Create a formatted document per meeting
- **Email:** Send to the meeting organizer or sales rep
- **Microsoft Teams:** Post to channels or DMs
- **CRM:** Update opportunity/account records with research
- **PDF:** Generate downloadable research reports

## Workflow Flow Summary
1. Schedule: Workflow runs automatically every morning
2. Fetch Calendar: Pull today's meetings from Google Calendar/Outlook
3. Filter: Identify meetings with external attendees only
4. Extract Companies: Get unique company domains from external attendees
5. Extract Attendees: Compile a list of all external contacts

Company research path:
1. Match Companies: Identify businesses in the Explorium database
2. Enrich (parallel): Pull firmographics, website changes, competitive landscape, events, and challenges
3. Merge & Clean: Combine and structure company data
4. AI Analysis: Generate a company research brief with insights and talking points
5. Format: Create readable company research output

Attendee research path:
1. Match Prospects: Link attendees to professional profiles
2. Enrich (parallel): Pull profiles, job changes, and LinkedIn activity
3. Merge & Clean: Combine and structure prospect data
4. AI Analysis: Generate attendee research with background and approach
5. Format: Create readable attendee research output

Delivery:
1. Combine: Merge company and attendee research for each meeting
2. Send: Deliver complete research briefs to Slack/preferred platform

This workflow eliminates manual pre-meeting research by automatically preparing comprehensive intelligence on both companies and individuals—giving sales teams the context and confidence they need for every conversation.
## Customization Options

### Calendar Integration
Works with multiple calendar platforms:
- **Google Calendar:** Full OAuth2 integration
- **Microsoft Outlook:** Calendar API support
- **CalDAV:** Generic calendar protocol support

### Trigger Flexibility
Adjust when research runs:
- **Morning routine:** Default daily at 7 AM
- **On-demand:** Manual trigger for specific meetings
- **Continuous:** Hourly checks for new meetings

### Enrichment Depth
Add or remove enrichment endpoints:
- **Company:** Technographics, funding history, news mentions, hiring signals
- **Prospects:** Contact information, social profiles, company changes
- **Customizable:** Select only needed data to optimize speed and costs

### Research Scope
Configure what gets researched:
- **All external meetings:** Default behavior
- **Filtered by keywords:** Only meetings with specific titles
- **By attendee count:** Only meetings with X+ external attendees
- **By calendar:** Specific calendars only

### Output Destinations
Deliver research to your preferred platform:
- **Messaging:** Slack, Microsoft Teams, Discord
- **Documents:** Google Docs, Notion, Confluence
- **Email:** Gmail, Outlook, custom SMTP
- **CRM:** Salesforce, HubSpot (update account notes)
- **Project management:** Asana, Monday.com, ClickUp

### AI Model Options
Swap AI providers based on needs:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

## Setup Notes
1. **Domain configuration:** Replace 'explorium.ai' in the Filter for External Meetings code node with your company domain
2. **Calendar connection:** Ensure OAuth2 credentials have calendar read permissions
3. **Explorium credentials:** Both API key and MCP credentials must be configured
4. **Output timing:** The schedule trigger should run with enough lead time before the first meetings
5. **Rate limits:** Adjust loop batch sizes if hitting API rate limits during enrichment
6. **Slack configuration:** Select the destination channel or user for research delivery
7. **Data privacy:** Research is based on publicly available professional information and company data

This workflow acts as your automated sales researcher, preparing detailed intelligence reports every morning so your team walks into every meeting informed, prepared, and ready to have meaningful conversations that drive business forward.
by Rahul Joshi
## Description
Turn incoming Gmail messages into structured Zendesk tickets, enriched by Azure OpenAI, and log key details to Google Sheets for tracking. Ideal for IT Support teams needing fast, consistent intake and documentation. ⚡

## What This Template Does
- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes Gmail data and formats it for downstream steps (a sketch of this step appears at the end of this description).
- Enriches and structures content with the Azure OpenAI Chat Model and Output Parsers.
- Creates Zendesk tickets from the processed data. 🎫
- Appends or updates logs in Google Sheets for auditing and reporting. 📊

## Key Benefits
- Saves time by automating ticket creation and logging. ⏱️
- Improves ticket quality with AI-driven normalization and structure.
- Ensures consistent records in Google Sheets for easy reporting.
- Reduces manual errors in IT Support intake. ✅

## Features
- Gmail-triggered intake flow for new messages.
- AI enrichment using the Azure OpenAI Chat Model with parsing and memory tooling.
- Zendesk ticket creation (create: ticket) with structured fields.
- Google Sheets logging (appendOrUpdate: sheet).
- Modular design with Execute Workflow nodes for reuse and scaling.

## Requirements
- n8n instance (Cloud or self-hosted).
- Gmail credentials configured in n8n for the Gmail Trigger.
- Zendesk credentials with permission to create tickets.
- Google Sheets credentials with access to the target spreadsheet (append/update enabled).
- Azure OpenAI credentials configured for the Azure OpenAI Chat Model and associated parsing.

## Target Audience
- IT Support and Helpdesk teams handling email-based requests. 🛠️
- Operations teams standardizing inbound email workflows.
- Agencies and MSPs offering managed support intake.
- Internal automation teams centralizing ticket capture and logging.

## Step-by-Step Setup Instructions
1. Connect Gmail credentials in n8n and select the inbox/label for the Gmail Trigger.
2. Add Zendesk credentials and confirm ticket creation permissions.
3. Configure Google Sheets credentials and select the target sheet for logs.
4. Add Azure OpenAI credentials to the Azure OpenAI Chat Model node and verify the parsing steps.
5. Import the workflow, assign credentials to each node, update any placeholders, and run a test.
6. Rename the final email/logging nodes descriptively (e.g., "Log to Support Sheet") and schedule if needed.
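A sketch of the Gmail-normalization step, assuming the Gmail Trigger's typical simplified output fields (From, Subject, snippet); the ticket field mapping is illustrative, not the template's exact code.

```javascript
// Illustrative normalization of Gmail Trigger output into the fields the
// Zendesk and Sheets nodes consume. Header field names (From, Subject,
// snippet) are assumptions; verify against your trigger's actual output.
const msg = $json;

const from = msg.From ?? '';
// "Jane Doe <jane@example.com>" -> separate name and address
const emailMatch = from.match(/<([^>]+)>/);
const requesterEmail = emailMatch ? emailMatch[1] : from.trim();
const requesterName = from.replace(/<[^>]+>/, '').trim() || requesterEmail;

return [{
  json: {
    requesterName,
    requesterEmail,
    subject: msg.Subject ?? '(no subject)',
    body: msg.snippet ?? '', // short preview; the full body may need decoding
    receivedAt: new Date().toISOString(),
  },
}];
```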
by Rahul Joshi
## Description
Process new resumes from Google Drive, extract structured candidate data with AI, save to Google Sheets, and auto-create a ClickUp hiring task. Gain a centralized, searchable candidate database and instant task kickoff—no manual data entry. 🚀

## What This Template Does
- Watches a Google Drive folder for new resume PDFs and triggers the workflow. 📂
- Downloads the file and converts the PDF to clean, readable text. 📄
- Analyzes resume text with an AI Resume Analyzer to extract structured candidate info (name, email, phone, experience, skills, education). 🤖
- Cleans and validates the AI JSON output for reliability (a sketch of this step appears at the end of this description). 🧹
- Appends or updates a candidate row in Google Sheets and creates a ClickUp hiring task. ✅

## Key Benefits
- Save hours with end-to-end, hands-off resume processing. ⏱️
- Never miss a candidate—every upload triggers automatically. 🔔
- Keep a single source of truth in Sheets, always up-to-date. 📊
- Kickstart hiring instantly with auto-created ClickUp tasks. 🗂
- Works with varied resume formats using AI extraction. 🧠

## Features
- Google Drive "Watch for New Resumes" trigger (every minute). ⏲
- PDF-to-text extraction optimized for text-based PDFs. 📘
- AI-powered resume parsing into standardized JSON fields. 🧩
- JSON cleanup and validation for safe storage. 🧰
- Google Sheets append-or-update for a central candidate database. 📑
- ClickUp task creation with candidate-specific titles and assignment. 🎯

## Requirements
- n8n instance (cloud or self-hosted); recommended n8n version 1.106.3 or higher. 🔧
- Google Drive access to a dedicated resumes folder (PDF resumes recommended). 📂
- Google Sheets credential with edit access to the candidate database sheet. 📈
- ClickUp workspace/project access to create tasks for hiring. 📌
- AI service credentials for the Resume Analyzer step (add in n8n Credentials). 🤖

## Target Audience
- HR and Talent Acquisition teams needing faster screening. 👥
- Recruiters and staffing agencies handling high volumes. 🏢
- Startups and ops teams standardizing candidate intake. 🚀
- No-code/low-code builders automating hiring workflows. 🧩

## Step-by-Step Setup Instructions
1. Connect Google Drive, Google Sheets, ClickUp, and your AI service in n8n Credentials. 🔐
2. Set the Google Drive "watched" folder (e.g., Resume_store). 📁
3. Import the workflow, assign credentials to all nodes, and map your Sheets columns. 🗂️
4. Adjust the ClickUp task details (title pattern, assignee, list). 📝
5. Run once with a sample PDF to test, then enable scheduling (every 1 minute). ▶️
6. Optionally rename the task nodes for clarity (e.g., "Create Hiring Task in ClickUp"). ✍️
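For the cleanup-and-validation step, a minimal sketch of stripping markdown code fences from the AI output and checking required fields before storage. The field list mirrors the extraction described above, but the exact schema and the 'output' field name are assumptions.

```javascript
// Illustrative cleanup of the AI Resume Analyzer output. Models often wrap
// JSON in markdown code fences; strip them before parsing. The required-field
// list mirrors the template description; adjust to your actual schema.
let raw = ($json.output ?? '').trim();
raw = raw.replace(/^```(?:json)?\s*/i, '').replace(/```\s*$/, '');

let candidate;
try {
  candidate = JSON.parse(raw);
} catch (err) {
  throw new Error(`AI output was not valid JSON: ${err.message}`);
}

// Validate the fields the Sheets row and ClickUp task depend on.
for (const field of ['name', 'email']) {
  if (!candidate[field]) {
    throw new Error(`Missing required field: ${field}`);
  }
}

candidate.skills = Array.isArray(candidate.skills) ? candidate.skills : [];
return [{ json: candidate }];
```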