by vinci-king-01
Employee Directory Sync – Microsoft Teams & Coda

⚠️ **COMMUNITY TEMPLATE DISCLAIMER:** This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure the ScrapeGraphAI community node is installed in your n8n instance before using this template.

This workflow keeps your employee directory perfectly synchronized across your HRIS (or any REST-compatible HR database), Microsoft Teams, Coda docs, and Slack channels. It automatically polls the HR system on a schedule, detects additions or updates, and propagates those changes to downstream tools so everyone always has the latest employee information.

Prerequisites
- An active n8n instance (self-hosted or n8n Cloud)
- ScrapeGraphAI community node installed
- A reachable HRIS API (BambooHR, Workday, Personio, or any custom REST endpoint)
- An existing Microsoft Teams workspace and a team/channel for announcements
- A Coda account with an employee directory table
- A Slack workspace and channel where directory updates will be posted

Required Credentials
- **Microsoft Teams OAuth2** – to post adaptive cards or messages
- **Coda API Token** – to insert/update rows in your Coda doc
- **Slack OAuth2** – to push notifications into a Slack channel
- **HTTP Basic / Bearer Token** – for your HRIS REST endpoint
- **ScrapeGraphAI API Key** – only required if you scrape public profile data

HRIS Field Mapping

| HRIS Field | Coda Column | Teams/Slack Field |
|------------|-------------|-------------------|
| firstName | First Name | First Name |
| lastName | Last Name | Last Name |
| email | Email | Email |
| title | Job Title | Job Title |
| department | Department | Department |

(Adjust the mapping in the Set and Code nodes as needed.)

How it works

Key Steps:
1. **Schedule Trigger** – fires daily (or at your chosen interval) to start the sync routine.
2. **HTTP Request** – fetches the full list of employees from your HRIS API.
3. **Code (Delta Detector)** – compares fetched data with a cached snapshot to identify new hires, departures, or updates.
4. **IF Node** – branches based on whether changes were detected.
5. **Split In Batches** – processes employees in manageable sets to respect API rate limits.
6. **Set Node** – maps HRIS fields to Coda columns and Teams/Slack message fields.
7. **Coda Node** – upserts rows in the employee directory table.
8. **Microsoft Teams Node** – posts an adaptive card summarizing changes to a selected channel.
9. **Slack Node** – sends a formatted message with the same update.
10. **Sticky Note** – provides inline documentation within the workflow for maintainers.

Set up steps (Setup Time: 10–15 minutes)
1. Import the workflow into your n8n instance.
2. Open the Credentials tab and create: a Microsoft Teams OAuth2 credential, a Coda API credential, a Slack OAuth2 credential, and an HRIS HTTP credential (Basic or Bearer).
3. Configure the HRIS HTTP Request node: replace the placeholder URL with your HRIS endpoint (e.g., https://api.yourhr.com/v1/employees) and add query parameters or headers as required by your HRIS.
4. Map the Coda doc & table IDs in the Coda node.
5. Select Teams & Slack channels in their respective nodes.
6. Adjust the Schedule Trigger to your desired frequency.
7. Optional: edit the Code node to tweak field mapping or add custom delta-comparison logic.
8. Execute the workflow manually once to verify proper end-to-end operation.
9. Activate the workflow.

Node Descriptions

Core Workflow Nodes:
- **Schedule Trigger** – initiates the sync routine at set intervals.
- **HTTP Request (Get Employees)** – pulls the latest employee list from the HRIS.
- **Code (Delta Detector)** – stores the previous run's data in workflow static data and identifies changes.
- **IF (Has Changes?)** – skips downstream steps when no changes were detected, saving resources.
- **Split In Batches** – iterates through employees in chunks (default 50) to avoid API throttling.
- **Set (Field Mapper)** – renames and restructures data for Coda, Teams, and Slack.
- **Coda (Upsert Rows)** – inserts new rows or updates existing ones based on an email match.
- **Microsoft Teams (Post Message)** – sends a rich adaptive card with the update summary.
- **Slack (Post Message)** – delivers a concise change log to a Slack channel.
- **Sticky Note** – embedded documentation for quick reference.

Data Flow:
- Schedule Trigger → HTTP Request → Code (Delta Detector) → IF (Has Changes?)
- If No → End
- If Yes → Split In Batches → Set → Coda → Teams → Slack

Customization Examples

Change Sync Frequency:

```js
// Inside the Schedule Trigger
{ "mode": "everyDay", "hour": 6, "minute": 0 }
```

Extend Field Mapping:

```js
// Inside the Set (or a Code) node
items[0].json.phone = items[0].json.phoneNumber ?? '';
items[0].json.location = items[0].json.officeLocation ?? '';
return items;
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "employee": {
    "id": "123",
    "firstName": "Jane",
    "lastName": "Doe",
    "email": "jane.doe@example.com",
    "title": "Senior Engineer",
    "department": "R&D",
    "status": "New Hire",
    "syncedAt": "2024-05-08T10:15:23.000Z"
  },
  "destination": {
    "codaRowId": "row_abc123",
    "teamsMessageId": "msg_987654",
    "slackTs": "1715158523.000200"
  }
}
```

Troubleshooting

Common Issues:
- **HTTP 401 from HRIS API** – verify token validity and that the credential is attached to the HTTP Request node.
- **Coda duplicates rows** – ensure the key column in Coda is set to "Email" and the Upsert option is enabled.

Performance Tips:
- Cache HRIS responses in static data to minimize API calls.
- Increase the Split In Batches size only if your API rate limits allow.

Pro Tips:
- Use n8n's built-in version control to track mapping changes over time.
- Add a second IF node to differentiate between "new hires" and "updates" for tailored announcements.
- Enable Slack's threaded replies to keep your #hr-updates channel tidy.
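The Delta Detector described above can be sketched as a pure comparison function. This is a minimal illustration, not the template's actual code: it assumes employees are keyed by email, and inside the real Code node the previous snapshot would be read from and written back to `$getWorkflowStaticData('global')` rather than passed as an argument.

```javascript
// Minimal delta-detection sketch (hypothetical; the shipped Code node may differ).
// `previous` is the cached snapshot, `current` is the fresh HRIS response.
function detectDeltas(previous, current) {
  const prevByEmail = new Map(previous.map(e => [e.email, e]));
  const currEmails = new Set(current.map(e => e.email));

  const newHires = current.filter(e => !prevByEmail.has(e.email));
  const departures = previous.filter(e => !currEmails.has(e.email));
  const updates = current.filter(e => {
    const old = prevByEmail.get(e.email);
    // Naive deep-compare via JSON; fine for small, stable field sets.
    return old !== undefined && JSON.stringify(old) !== JSON.stringify(e);
  });

  const hasChanges = newHires.length + departures.length + updates.length > 0;
  return { newHires, departures, updates, hasChanges };
}
```

In the workflow, the IF (Has Changes?) node would then branch on `hasChanges`.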
by Tsubasa Shukuwa
How it works

This workflow automatically generates a new haiku poem every morning using AI, formats it in 5-7-5 structure, saves it to Google Docs, and sends it to your email inbox.

Workflow steps:
1. **Schedule Trigger** – runs daily at 7:00 AM.
2. **AI Agent** – asks the AI to output four words (kigo, noun, verb1, verb2) in JSON format.
3. **Code in JavaScript** – builds a 5-7-5 haiku using the AI-generated words and sets today's title.
4. **Edit Fields** – prepares document fields (title and body) for Google Docs.
5. **Create a document** – creates a new Google Document for the haiku.
6. **Prepare Append** – collects the document ID and haiku text for appending.
7. **Update a document** – inserts the haiku into the existing Google Doc.
8. **Send a message** – sends the haiku of the day to your Gmail inbox.
9. **OpenRouter Chat Model** – connects the OpenRouter model used by the AI Agent.

Setup steps
1. Connect your OpenRouter API key as a credential (used in the AI Agent node).
2. Update your Google Docs folder ID and Gmail account credentials.
3. Change the email recipient address in the "Send a message" node.
4. Adjust the Schedule Trigger time as you like.
5. Run the workflow once to test and verify document creation and email delivery.

Ideal for
- Writers and poets who want daily creative inspiration.
- Individuals seeking a fun morning ritual.
- Educators demonstrating AI text generation in a practical example.

⚙️ Note: Each node includes an English sticky note above it for clarity and documentation.
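The "Code in JavaScript" step can be sketched roughly as below. This is a hypothetical illustration of the plumbing only: the real node's line template and title format may differ, and no syllable counting is performed here, since the AI is asked to supply words that already fit the 5-7-5 shape.

```javascript
// Hypothetical assembly of the daily haiku from the AI's four JSON words.
// The actual node's arrangement of kigo/noun/verbs may differ.
function buildHaiku({ kigo, noun, verb1, verb2 }, date = new Date()) {
  const body = [
    kigo,                 // line 1: the season word
    `${noun} ${verb1}`,   // line 2
    verb2,                // line 3
  ].join('\n');
  const title = `Haiku of the Day – ${date.toISOString().slice(0, 10)}`;
  return { title, body };
}
```

The resulting `title` and `body` fields are what the Edit Fields node would pass on to Google Docs.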
by vinci-king-01
Lead Scoring Pipeline with Telegram and Box

This workflow ingests incoming lead data from a form-submission webhook, enriches each lead with external data sources, applies a custom scoring algorithm, and automatically stores the enriched record in Box while notifying your sales team in Telegram. It is designed to give you a real-time, end-to-end lead-qualification pipeline without writing any glue code.

Prerequisites
- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed (not directly used in this template but required by marketplace listing rules)
- Telegram bot created via BotFather
- Box account (Developer App or User OAuth2)
- Publicly accessible URL (for the Webhook trigger)
- Optional: an enrichment API account (e.g., Clearbit, PDL) for richer scoring data

Required Credentials

| Credential | Scope | Purpose |
|------------|-------|---------|
| Telegram Bot Token | Bot | Send scored-lead alerts |
| Box OAuth2 Credentials | App Level | Upload enriched lead JSON/CSV |
| (Optional) Enrichment API Key | REST | Append firmographic & technographic data |

Environment Variables (Recommended)

| Variable | Example | Description |
|----------|---------|-------------|
| LEAD_SCORE_THRESHOLD | 75 | Minimum score that triggers a Telegram alert |
| BOX_FOLDER_ID | 123456789 | Destination folder for lead files |

How it works

This workflow listens for new form submissions, enriches each contact with external data, calculates a lead score based on configurable criteria, and routes the lead through one of two branches: high-value leads trigger an instant Telegram alert and are archived to Box, while low-value leads are archived only. Errors are captured by an Error Trigger for post-mortem analysis.

Key Steps:
1. **Webhook Trigger** – receives raw form data (name, email, company, etc.).
2. **Set node – Normalization** – renames fields and initializes default values.
3. **HTTP Request – Enrichment** – calls an external enrichment API to augment data.
4. **Merge node** – combines original and enriched data into a single object.
5. **Code node – Scoring** – runs JavaScript to calculate a numeric lead score.
6. **If node – Qualification Gate** – checks whether score ≥ LEAD_SCORE_THRESHOLD.
7. **Telegram node** – sends an alert message to your sales channel for high-scoring leads.
8. **Box node** – uploads the enriched JSON (or CSV) file into a specified folder.
9. **Error Trigger** – captures any unhandled errors and notifies ops (optional).
10. **Sticky Notes** – explain scoring logic and credential placement (documentation aids).

Set up steps (Setup Time: 15–25 minutes)
1. Create the Telegram bot & get its token: talk to BotFather → /newbot → copy the provided token.
2. Create a Box developer app: enable OAuth2 → add https://api.n8n.cloud/oauth2-credential/callback (or your own) as the redirect URI.
3. Install required community nodes: from the n8n editor → "Install" → search "ScrapeGraphAI" → install.
4. Import the workflow JSON: click "Import" → paste the workflow file → save.
5. Configure the webhook URL in your form tool: copy the production URL generated by the Webhook node → add it as the form action.
6. Set environment variables: in n8n (Settings → Environment) add LEAD_SCORE_THRESHOLD and BOX_FOLDER_ID.
7. Fill in all credentials: Telegram (paste bot token), Box (complete OAuth2 flow), enrichment API (paste key in the HTTP Request node headers).
8. Activate the workflow: toggle "Activate", then submit a test form to verify Telegram/Box outputs.

Node Descriptions

Core Workflow Nodes:
- **Webhook** – entry point; captures the incoming JSON payload from the form.
- **Set (Normalize Fields)** – maps raw keys to standardized ones (firstName, email, etc.).
- **HTTP Request (Enrichment)** – queries an external service for firmographic data.
- **Merge (Combine Data)** – merges the two JSON objects (form + enrichment).
- **Code (Scoring)** – calculates the lead score using weighted attributes.
- **If (Score Check)** – branches the flow based on the score threshold.
- **Telegram** – sends high-score alerts to a specified chat ID.
- **Box** – saves a JSON/CSV file of the enriched lead to cloud storage.
- **Error Trigger** – executes if any preceding node fails.
- **Sticky Notes** – inline documentation for quick reference.

Data Flow:
- Webhook → Set → HTTP Request → Merge → Code → If
- If (true) → Telegram
- If (always) → Box
- Error (from any node) → Error Trigger

Customization Examples

Change Scoring Logic:

```js
// Inside the Code node
const { jobTitle, companySize, technologies } = items[0].json;
let score = 0;
if (jobTitle.match(/(CTO|CEO|Founder)/i)) score += 50;
if (companySize > 500) score += 20;
if (technologies.includes('AWS')) score += 10;
// Bonus: subtract points for free email domains
if (items[0].json.email.endsWith('@gmail.com')) score -= 30;
return [{ json: { ...items[0].json, score } }];
```

Use a Different Storage Provider (e.g., Google Drive):

```js
// Replace the Box node with a Google Drive node
{
  "node": "Google Drive",
  "operation": "upload",
  "fileName": "lead_{{$json.email}}.json",
  "folderId": "1A2B3C..."
}
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "firstName": "Ada",
  "lastName": "Lovelace",
  "email": "ada@example.com",
  "company": "Analytical Engines Inc.",
  "companySize": 250,
  "jobTitle": "CTO",
  "technologies": ["AWS", "Docker", "Node.js"],
  "score": 82,
  "qualified": true,
  "timestamp": "2024-04-07T12:34:56.000Z"
}
```

Troubleshooting

Common Issues:
- **Telegram messages not received** – ensure the bot is added to the group and the chat_id/token are correct.
- **Box upload fails with 403** – check folder permissions; verify OAuth2 tokens have not expired.
- **Webhook shows 404** – the workflow is not activated, or the URL was copied in "Test" mode instead of "Production".

Performance Tips:
- Batch multiple form submissions using the Split In Batches node to reduce API-call overhead.
- Cache enrichment responses (Redis, n8n memory) to avoid repeated lookups for the same domain.

Pro Tips:
- Add an n8n Wait node between enrichment calls to respect rate limits.
- Use static data to store domain-level enrichment results for even faster runs.
- Tag Telegram alerts with emojis based on score (🔥 Hot Lead for >90).

This is a community-contributed n8n workflow template provided "as-is." Always test thoroughly in a non-production environment before deploying to live systems.
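The Qualification Gate is a single comparison; a minimal sketch, assuming the threshold comes from the LEAD_SCORE_THRESHOLD environment variable with 75 (the example value above) as the fallback. In the actual If node this would be an expression comparing `$json.score` against the threshold rather than a function.

```javascript
// Sketch of the score gate. The default of 75 mirrors the example value above.
function isQualified(score, env = process.env) {
  const threshold = Number(env.LEAD_SCORE_THRESHOLD ?? 75);
  return score >= threshold;
}
```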
by Anatoly
Automated Solana News Tracker with AI-Powered Weekly Summaries

Never miss important Solana ecosystem updates again. This production-ready workflow automatically scrapes crypto news daily, intelligently filters duplicates, stores everything in Google Sheets, and generates AI-powered weekly summaries every Monday—completely hands-free.

🎯 What It Does:

This intelligent automation runs on autopilot to keep you informed about Solana developments without manual monitoring. Every day at 8 AM PT, it fetches the latest Solana news from CryptoPanic, checks for duplicates against your existing database, and stores only new articles in Google Sheets. On Mondays, it takes an extra step: reading all accumulated articles from the past week and using GPT-4.1-mini to generate a concise, factual summary of key developments and investor takeaways.

- **Daily News Collection**: automatically fetches the latest Solana articles from the CryptoPanic API
- **Smart Duplicate Detection**: compares incoming articles against the existing database to prevent redundancy
- **Data Validation**: filters out incomplete articles to ensure data quality
- **Organized Storage**: maintains a clean Google Sheets database with timestamps and descriptions
- **Weekly AI Summaries**: analyzes accumulated news every Monday and generates 2-3 sentence insights
- **Historical Archive**: builds a searchable database of both raw articles and weekly summaries

💼 Perfect For:

Crypto traders tracking market-moving news • SOL investors monitoring ecosystem growth • Blockchain researchers building historical datasets • Content creators sourcing newsletter material • Portfolio managers needing daily briefings • Anyone wanting Solana updates without information overload

🔧 How It Works:

The workflow operates in two distinct modes based on the day of the week. During the daily collection phase (Tuesday-Sunday), it runs at 8 AM PT, fetches the latest Solana news from CryptoPanic, formats the data to extract titles, descriptions, and timestamps, checks each article against your Google Sheets database to identify duplicates, filters out any articles that already exist or have missing data, and appends only valid new articles to your "Raw Data" sheet.

On Mondays, the workflow performs all daily tasks plus an additional summarization step. After storing new articles, it retrieves all accumulated news from the "Raw Data" sheet, aggregates all article descriptions into a single text block, sends this consolidated information to GPT-4.1-mini with instructions to create a factual, spartan-toned summary highlighting key investor takeaways, and saves the AI-generated summary with a timestamp to the "Weekly Summary" sheet for historical reference.

✨ Key Features:

- **Schedule-based execution**: runs automatically at 8 AM PT every day without manual intervention
- **Intelligent deduplication**: title-based matching prevents storing the same article multiple times
- **Data quality control**: validates required fields before storage to maintain a clean dataset
- **Dual-sheet architecture**: separate sheets for raw articles and weekly summaries for easy access
- **Cost-effective AI**: uses GPT-4.1-mini (~$0.001 per summary) for extremely low operating costs
- **Scalable storage**: Google Sheets handles thousands of articles on the free tier
- **Customizable cryptocurrency**: easily adapt to track Bitcoin, Ethereum, or any supported coin
- **Flexible scheduling**: modify trigger time and summary frequency to match your needs

📋 Requirements:

- A CryptoPanic account with a free API key (register at cryptopanic.com)
- A Google Spreadsheet with two sheets: "Raw Data" (columns: date, title, descripton, summary) and "Weekly Summary" (columns: Date, Summary)
- An OpenAI API key for GPT-4.1-mini access (~$0.05/month cost)
- n8n Cloud or a self-hosted instance with the schedule trigger enabled

⚡ Quick Setup:

1. Register for a free CryptoPanic API key and replace [your token] in the "Get Solana News" HTTP Request node URL.
2. Create a new Google Spreadsheet with two sheets: one named "Raw Data" with columns for date, title, descripton (note the typo in the template), and summary; another named "Weekly Summary" with columns for Date and Summary.
3. Connect your Google Sheets OAuth2 credential to all Google Sheets nodes in the workflow.
4. Add your OpenAI API credential to the "Summarize News" node.
5. Test the workflow manually to ensure it fetches news and stores it correctly.
6. Activate the workflow to enable daily automatic execution.

🚨 Please note that you can't get news in real time with the free CryptoPanic API; the news you receive will be up to date as of yesterday. Consider their pro plan or another platform for real-time news scraping.

🎁 What You Get:

Complete end-to-end automation with concise sticky-note documentation at each workflow stage, pre-configured duplicate-detection logic, AI summarization with investor-focused prompts optimized for factual analysis without hype, a dual-sheet Google Sheets structure for raw data and summaries, a flexible schedule trigger you can adjust to any timezone, example data in pinned format showing expected API responses, customization guides for different cryptocurrencies and summary frequencies, and a troubleshooting checklist for common setup issues.

💰 Expected Costs & Performance:

The CryptoPanic API is free with reasonable rate limits for personal use. OpenAI GPT-4.1-mini costs approximately $0.001 per summary, totaling about $0.05 per month for weekly summaries. The workflow typically processes 20-50 articles daily and generates one summary weekly from 140-350 accumulated articles. Daily executions complete in 5-10 seconds, while Monday runs with AI summarization take 15-20 seconds. Google Sheets provides free storage for up to 5 million cells, easily handling years of news data.

🔄 Customization Ideas:

- Track different cryptocurrencies by changing the currencies parameter (btc, eth, ada, doge, etc.).
- Adjust the schedule trigger to run at different times matching your timezone.
- Modify the Monday check condition to generate summaries on different days or multiple times per week.
- Connect Slack, Discord, or Email nodes to receive instant notifications when summaries are generated.
- Edit the AI prompt to change tone, detail level, or focus on specific aspects like price action, development updates, or partnerships.
- Add conditional logic to send alerts only when certain keywords appear in news (like "hack," "partnership," or "upgrade").
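The title-based deduplication and data-quality filtering described above can be sketched as one pass over the incoming articles. A hypothetical illustration, assuming each article object carries `title` and `description` fields matching the "Raw Data" columns:

```javascript
// Keep only articles that have the required fields and whose (normalized)
// title is not already present among the existing sheet rows.
function filterNewArticles(existing, incoming) {
  const seen = new Set(existing.map(a => a.title.trim().toLowerCase()));
  return incoming.filter(a =>
    Boolean(a.title && a.description) &&        // data-quality check
    !seen.has(a.title.trim().toLowerCase())     // title-based dedup
  );
}
```

Only the articles that survive this filter would be appended to the "Raw Data" sheet.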
by Mira Melhem
👔 Recruitment Office WhatsApp Automation

Automate WhatsApp communication for recruitment agencies with an interactive, structured customer experience. This workflow handles pricing inquiries, request submissions, tracking, complaints, and human escalation while maintaining full session tracking and media support.

Good to know
- Uses WhatsApp Interactive List Messages for user selection and navigation.
- Includes session-state logic and memory across messages.
- Includes a 5-minute cooldown to avoid spam and repeated triggers.
- Supports logging for all interaction types, including media files.
- Includes both a global bot shutdown switch and a per-user override.

How it works
1. A customer sends a message to the official WhatsApp number.
2. The workflow replies with an interactive menu containing 8 service options:
   - 💰 Pricing by nationality (8 supported countries)
   - 📝 New recruitment request submission
   - 🔍 Tracking existing applications via Google Sheets lookup
   - 🔁 Worker transfer link distribution
   - 🌍 Translation service information
   - 📄 Required documents and instructions
   - ⚠️ Complaint submission and routing
   - 👤 Request a human agent
3. The workflow retrieves or stores data based on the selection using Google Sheets and Data Tables.
4. If the customer requests human help or the logic detects uncertainty, the workflow:
   - Pauses automation for that user
   - Notifies a designated staff member
5. All interactions are logged, including files, text, timestamps, and selections.

Features
- 📋 Structured WhatsApp service menu
- 📄 CRM-style recruitment request logging
- ✨ Pricing logic with nationality mapping
- 🔍 Lookup-based status tracking
- 📎 Support for media uploads (PDF, images, audio, documents)
- 🧠 Session tracking with persistent user state
- 🤝 Human escalation workflow with internal notifications
- 🛑 Anti-spam and cooldown control
- 🎚 Bot master switch (global + per-user)

Technology stack

| Component | Usage |
|-----------|-------|
| n8n | Automation engine |
| WhatsApp Business API | Messaging and interactive UX |
| Google Sheets | CRM and logs |
| Data Tables | State management |
| JavaScript | Custom logic and routing |

Requirements
- WhatsApp Business API account with active credentials
- n8n Cloud or a self-hosted instance
- Google Sheets for CRM storage
- Data Tables enabled for persistent session tracking

How to use
1. The workflow uses a Webhook trigger compatible with common WhatsApp API providers.
2. Modify menu content, pricing, optional steps, and escalation flows as needed.
3. Link your Google Sheets and replace test sheet IDs with production values.
4. Configure human escalation to notify team members or departments.

Customising this workflow
- Replace Google Sheets with Airtable, HubSpot, or SQL storage.
- Add expiration and reminder messages for missing documents.
- Add AI-powered response logic for common questions.
- Enable multi-country support (Saudi/UAE/Jordan/Qatar/Kuwait/etc.).
- Connect to dashboards for reporting and staff performance analytics.
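The 5-minute cooldown mentioned under "Good to know" boils down to a timestamp comparison; a minimal sketch, assuming each user's last-interaction time is persisted (e.g., in a Data Table) as epoch milliseconds:

```javascript
const COOLDOWN_MS = 5 * 60 * 1000; // the 5-minute window from the description

// Reply only when the user has no recorded interaction yet, or the
// last one is older than the cooldown window.
function shouldReply(lastSeenMs, nowMs) {
  return lastSeenMs === undefined || nowMs - lastSeenMs >= COOLDOWN_MS;
}
```

After a reply is sent, the user's timestamp would be written back to the Data Table so repeated triggers inside the window are ignored.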
by Muhammad Ali
🚀 How It Works

Turn your WhatsApp chats into an AI-powered meeting scheduler with Google Gemini, Google Calendar, and Google Sheets. This workflow understands natural language like "Book a meeting with Ali at 3 PM tomorrow", checks your contacts, avoids overlaps, and updates your calendar automatically, all from WhatsApp. It's a complete AI scheduling system built for founders, teams, and service providers who manage clients over chat.

🔁 Workflow Overview
- **WhatsApp Trigger** → captures incoming messages in real time
- **Intent Agent (Gemini)** → detects scheduling intent (create / edit / cancel)
- **Google Sheets** → finds contact names, emails, and tags
- **Get Events** → checks existing meetings to prevent conflicts
- **Correction Agent + Intent Check** → confirms details with AI
- **Calendar Agent (Gemini)** → executes the calendar action intelligently
- **Create / Update / Delete Event** → syncs instantly to Google Calendar
- **Response Node** → sends WhatsApp and email confirmations

⚙️ Quick Setup (⏱ ~15 min)
1. Connect the WhatsApp Cloud API – link your WhatsApp Business account.
2. Authenticate Google Calendar & Sheets – use Sheets for contacts (Name | Email | Type).
3. Add a Google Gemini API key – used by the Intent, Correction, and Calendar agents.
4. Customize prompts – adjust tone and language in the Gemini nodes.
5. Test your flow – e.g., message "Schedule meeting with Ali at 10 AM Friday" to verify the calendar entry and confirmation replies.

💡 All setup details are also documented inside the workflow sticky notes.

🧩 Integrations
- WhatsApp Cloud API
- Google Calendar API
- Google Sheets API
- Google Gemini (LLM)

💡 Benefits
- ✅ Automates scheduling directly from WhatsApp
- ✅ Understands natural language requests
- ✅ Prevents double-bookings automatically
- ✅ Sends instant confirmations
- ✅ Saves hours of manual coordination

👥 Ideal For
- Entrepreneurs & consultants managing clients on WhatsApp
- Sales or support teams booking demos and meetings
- Virtual assistants and AI service providers
- Anyone who wants a 24/7 AI calendar manager
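The overlap check behind "prevents double-bookings" can be sketched as below; a hypothetical illustration, assuming the Get Events step reduces each existing calendar entry to start/end epoch milliseconds:

```javascript
// Two intervals conflict when each one starts before the other ends.
function hasConflict(events, startMs, endMs) {
  return events.some(e => startMs < e.endMs && e.startMs < endMs);
}
```

With this rule, back-to-back meetings (one ending exactly when the next starts) are not flagged as conflicts.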
by Oneclick AI Squad
This n8n workflow automatically captures LinkedIn leads from multiple sources (new connections, post engagements), enriches the data with AI-powered scoring, eliminates duplicates, syncs to Google Sheets and your CRM (HubSpot or Salesforce), and sends instant notifications to your sales team for high-quality leads. No more manual copy-pasting or lost opportunities—every LinkedIn interaction becomes a tracked, qualified lead ready for follow-up.

Benefits
- **Zero Manual Data Entry:** eliminates tedious copy-paste work—saves 10+ hours per week for active LinkedIn users
- **Smart Lead Scoring:** AI calculates quality scores (0-100) and assigns temperature labels (Hot/Warm/Cold) based on engagement patterns
- **Multi-Source Capture:** monitors new connections, post comments, likes, and shares to catch leads from all touchpoints
- **Duplicate Prevention:** tracks leads for 90 days to avoid database clutter and redundant outreach
- **Instant Sales Alerts:** notifies the team via Slack/Email within seconds for Hot & Warm leads, enabling rapid follow-up
- **CRM Integration:** syncs directly to HubSpot or Salesforce with custom fields for lead temperature and quality metrics
- **Built-in Analytics:** tracks cumulative statistics (total leads, temperature distribution) for performance monitoring
- **Always-On Automation:** runs every 15 minutes, 24/7, capturing leads even when you're sleeping or in meetings

Useful for Which Industry
- **B2B SaaS & Tech:** sales teams prospecting decision-makers and capturing demo requests from LinkedIn content
- **Consulting & Professional Services:** agencies tracking client inquiries and partnership opportunities from thought-leadership posts
- **Recruiting & HR:** talent acquisition teams building candidate pipelines from LinkedIn engagement
- **Real Estate & Financial Services:** agents capturing high-intent leads from market updates and educational content
- **Marketing Agencies:** business development teams converting social engagement into qualified opportunities
- **E-Learning & Coaching:** course creators and consultants tracking interested prospects from free content
- **Manufacturing & B2B Sales:** enterprise sales teams monitoring buying-committee members engaging with product content

How it works
1. Monitor LinkedIn – checks every 15 minutes for new connections and post engagements
2. Extract data – pulls profile information, company, and position from multiple sources
3. Score leads – calculates a quality score (0-100) and assigns a temperature (Hot/Warm/Cold)
4. Prevent duplicates – tracks leads for 90 days to avoid duplicate entries
5. Sync to platforms – saves to Google Sheets and your CRM (HubSpot or Salesforce)
6. Smart notifications – alerts the sales team via Slack/Email for Hot and Warm leads only

Setup steps
1. Connect LinkedIn – add OAuth2 credentials to both LinkedIn nodes
2. Set up Google Sheets – create a spreadsheet with these columns: leadId, fullName, firstName, lastName, email, phone, company, position, headline, location, profileUrl, leadSource, leadTemperature, qualityScore, engagementType, commentText, dateAdded, status, assignedTo, notes, lastUpdated
3. Choose a CRM – enable either the HubSpot OR Salesforce nodes (disable the other)
4. Configure alerts – add a Slack webhook URL and/or email settings
5. Test the workflow – run manually first to verify all connections
6. Activate – turn on for automatic lead capture

Key features
- **Smart scoring**: quality score based on engagement level and profile completeness
- **Temperature labels**: Hot (70+), Warm (50-69), Cold (<50) for prioritization
- **No duplicates**: 90-day tracking prevents duplicate entries
- **Multi-CRM support**: works with HubSpot or Salesforce
- **Analytics built-in**: tracks total leads and temperature distribution
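The temperature bands listed above (Hot 70+, Warm 50-69, Cold <50) map directly onto a small labeling helper; a sketch of how the scoring step might assign them:

```javascript
// Label a 0-100 quality score with the bands from the key features list.
function temperatureLabel(score) {
  if (score >= 70) return 'Hot';
  if (score >= 50) return 'Warm';
  return 'Cold';
}
```

Only leads labeled Hot or Warm would then pass the Slack/Email notification filter.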
by VEED
Generate social videos with AI avatars using VEED and Claude

Overview

This n8n workflow automatically generates TikTok/Reels-ready talking-head videos from scratch. You provide a topic and intention, and the workflow handles everything: scriptwriting, avatar generation, voiceover creation, and video rendering.

Output: vertical (9:16) AI-generated videos with lip-synced avatars, ready for social media posting.

What It Does

Topic + Intention → Claude writes script → OpenAI generates avatar → OpenAI creates voiceover → VEED renders video → Saved to Google Drive + logged to Sheets

Pipeline Breakdown

| Step | Tool | What Happens |
|------|------|--------------|
| 1. Script Generation | Claude Sonnet 4 | Creates hook, script (30-45 sec), caption, and image prompt based on your topic and intention |
| 2. Avatar Generation | OpenAI gpt-image-1 | Generates photorealistic portrait image (1024×1536) |
| 3. Voiceover | OpenAI TTS-1-HD | Converts script to natural speech (Nova voice) |
| 4. Video Rendering | VEED Fabric 1.0 | Lip-syncs avatar to audio, creates final video |
| 5. Storage | Google Drive | Uploads final MP4 |
| 6. Logging | Google Sheets | Records all metadata (script, caption, URLs, timestamps) |

Required Connections

API Keys (entered in the Configuration node)

| Service | Key Type | Where to Get |
|---------|----------|--------------|
| Anthropic | API Key | https://console.anthropic.com/settings/keys |
| OpenAI | API Key | https://platform.openai.com/api-keys |

> ⚠️ OpenAI note: gpt-image-1 requires organization verification. Go to https://platform.openai.com/settings/organization/general to verify.

n8n Credentials (connect in n8n)

| Node | Credential Type | Purpose |
|------|-----------------|---------|
| Generate Video (VEED) | FAL.ai API | VEED video rendering |
| Upload to Drive | Google Drive OAuth2 | Store final videos |
| Log to Sheets | Google Sheets OAuth2 | Track all generated content |

Configuration Options

Edit the ⚙️ Workflow Configuration node to customize. The configuration uses a JSON format:

```json
{
  "topic": "AI video creation tools",
  "intention": "informative",
  "brand_name": "YOUR_BRAND_NAME",
  "target_audience": "content creators and marketers",
  "trending_hashtags": "#AIvideo #ContentCreation #VideoMarketing #AItools #TikTokTips",
  "num_videos": 1,
  "anthropic_api_key": "YOUR_ANTHROPIC_API_KEY",
  "openai_api_key": "YOUR_OPENAI_API_KEY",
  "video_resolution": "720p",
  "video_aspect_ratio": "9:16",
  "custom_avatar_description": "",
  "custom_script": ""
}
```

Configuration Fields Explained

| Field | Required | Description |
|-------|----------|-------------|
| topic | ✅ | The subject of your video (e.g., "AI productivity tools") |
| intention | ✅ | Content style: informative, lead_generation, or disruption |
| brand_name | ✅ | Your brand/product name to mention |
| target_audience | ✅ | Who you're creating content for |
| trending_hashtags | ✅ | Hashtags to include in the caption |
| num_videos | ✅ | How many videos to generate (1-5 recommended) |
| anthropic_api_key | ✅ | Your Anthropic API key |
| openai_api_key | ✅ | Your OpenAI API key |
| video_resolution | ✅ | Video quality: 720p or 1080p |
| video_aspect_ratio | ✅ | Aspect ratio: 9:16 (vertical) or 16:9 (horizontal) |
| custom_avatar_description | ❌ | Optional: describe your avatar (leave empty for AI-generated) |
| custom_script | ❌ | Optional: your own script (leave empty for AI-generated) |

Intention Types

| Intention | Content Style | Best For |
|-----------|---------------|----------|
| informative | Educational, value-driven, builds trust | Thought leadership, tutorials |
| lead_generation | Creates curiosity, soft CTA | Product awareness, funnels |
| disruption | Bold, provocative, scroll-stopping | Viral potential, brand awareness |

Custom Avatar & Script Options

The workflow supports flexible content generation: you can let Claude generate everything, or provide your own inputs.

Custom Avatar Description

Leave custom_avatar_description empty to let Claude decide, or provide your own:

```json
"custom_avatar_description": "a female influencer in her 30s, with a coworking space in the background, attractive but charismatic"
```

Examples:
- "a woman in her 20s with gym clothes"
- "a bearded man in his 30s wearing a hoodie"
- "a professional woman with glasses in business casual"

Custom Script

Leave custom_script empty to let Claude write it, or provide your own:

```json
"custom_script": "This is my custom script. VEED is a great platform for creating videos like this. You can try it too!"
```

Guidelines for custom scripts:
- Keep it 30-45 seconds when read aloud
- Maximum ~450 characters
- Avoid special characters for TTS compatibility
- Write naturally, as if speaking

Behavior Matrix

| custom_avatar_description | custom_script | What Claude Generates |
|---------------------------|---------------|----------------------|
| Empty | Empty | Avatar + Script + Caption |
| Provided | Empty | Script + Caption |
| Empty | Provided | Avatar + Caption |
| Provided | Provided | Caption only |

Content Angles (auto-rotated)

When generating multiple videos, the workflow automatically varies the approach:

| # | Angle | Hook Style |
|---|-------|------------|
| 1 | Problem-solution | Opens with a question |
| 2 | Myth-busting | Opens with a controversial statement |
| 3 | Quick-tip | Opens with a number/statistic |
| 4 | Before-after | Opens with a transformation |
| 5 | Trend-commentary | Opens with a news/timely angle |

Output Per Video Generated

| Asset | Format | Location |
|-------|--------|----------|
| Final Video | MP4 (720p, 9:16) | Google Drive folder |
| Avatar Image | PNG (1024×1536) | tmpfiles.org
(temporary) | | Voiceover | MP3 | tmpfiles.org (temporary) | | Metadata | Row entry | Google Sheets | Google Sheets Columns | Column | Description | |--------|-------------| | topic | Video topic | | intention | Content intention used | | brand_name | Brand mentioned | | content_theme | 2-3 word theme summary | | script_audio | Full voiceover script | | script_image | Image generation prompt | | caption | Ready-to-post TikTok caption with hashtags | | image_url | Temporary avatar image URL | | audio_url | Temporary audio URL | | video_url | Google Drive link to final video | | status | done/error | | created_at | Timestamp | Estimated Costs Per Video | Service | Usage | Approximate Cost | |---------|-------|------------------| | Claude Sonnet 4 | 2K tokens | $0.01 | | OpenAI gpt-image-1 | 1 image (1024×1536) | ~$0.04-0.08 | | OpenAI TTS-1-HD | 450 characters | $0.01 | | VEED/FAL.ai | 1 video render | ~$0.10-0.20 | | Total | | ~$0.15-0.30 per video | > Costs vary based on script length and current API pricing. 
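The behavior matrix above reduces to a few lines of logic. This is an illustrative sketch only (the function name `assetsToGenerate` is hypothetical, not a node in the workflow); a field counts as "provided" when it is a non-empty string:

```javascript
// Hypothetical helper mirroring the behavior matrix: given the workflow
// configuration, decide which assets Claude still needs to generate.
function assetsToGenerate(config) {
  const assets = [];
  if (!config.custom_avatar_description) assets.push('avatar');
  if (!config.custom_script) assets.push('script');
  assets.push('caption'); // the caption is always generated
  return assets;
}
```

With both fields empty it yields `['avatar', 'script', 'caption']`; with both provided, only `['caption']`.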
### Setup Checklist

**Step 1: Import Workflow**
- [ ] Import generate-social-videos-with-ai-avatars-using-veed-and-claude.json into n8n

**Step 2: Configure API Keys**
- [ ] Open the ⚙️ Workflow Configuration node
- [ ] Replace YOUR_ANTHROPIC_API_KEY with your actual Anthropic API key
- [ ] Replace YOUR_OPENAI_API_KEY with your actual OpenAI API key
- [ ] Verify your OpenAI organization at https://platform.openai.com/settings/organization/general (required for gpt-image-1)

**Step 3: Connect n8n Credentials**
- [ ] Click on the Generate Video (VEED) node → Add FAL.ai credential
- [ ] Click on the Upload to Drive node → Add Google Drive OAuth2 credential
- [ ] Click on the Log to Sheets node → Add Google Sheets OAuth2 credential

**Step 4: Configure Storage**
- [ ] Update the Upload to Drive node with your Google Drive folder URL
- [ ] Update the Log to Sheets node with your Google Sheets URL
- [ ] Create column headers in your Google Sheet: topic, intention, brand_name, content_theme, script_audio, script_image, caption, image_url, audio_url, video_url, status, created_at

**Step 5: Customize Content**
- [ ] Update topic, brand_name, target_audience, and trending_hashtags
- [ ] Optionally add custom_avatar_description and/or custom_script

**Step 6: Test**
- [ ] Set num_videos: 1 for initial testing
- [ ] Execute the workflow
- [ ] Check Google Drive for the output video
- [ ] Verify metadata was logged to Google Sheets

### MCP Integration (Optional)

This workflow can also be exposed to Claude Desktop via n8n's Model Context Protocol (MCP) integration, allowing you to generate videos through natural language prompts.
To enable MCP:

1. Add a Webhook Trigger node to the workflow (in addition to the Manual Trigger)
2. Connect it to the same ⚙️ Workflow Configuration node
3. Go to Settings → Instance-level MCP → Enable the workflow
4. Configure Claude Desktop with your n8n MCP server URL

Claude Desktop Configuration (Windows):

```json
{
  "mcpServers": {
    "n8n-mcp": {
      "command": "supergateway",
      "args": [
        "--streamableHttp",
        "https://YOUR_N8N_INSTANCE.app.n8n.cloud/mcp-server/http",
        "--header",
        "authorization:Bearer YOUR_MCP_ACCESS_TOKEN"
      ]
    }
  }
}
```

> Note: Install supergateway globally first: npm install -g supergateway

### Limitations & Notes

**Technical Limitations**
- **tmpfiles.org**: Temporary file URLs expire after ~1 hour. Final videos are safe in Google Drive.
- **VEED processing**: Takes 2-5 minutes per video depending on length.
- **n8n Cloud network**: Some external domains are blocked; the workflow uses base64 for images to avoid this.

**Content Considerations**
- Scripts are optimized for 30-45 seconds (TTS-friendly)
- Avatar images are AI-generated (not real people)
- Captions include hashtags automatically
- Each video in a batch gets a different content angle

**Best Practices**
- Start small: test with 1 video before scaling to 5
- Review scripts: Claude generates good content, but review before posting
- Monitor costs: check API usage dashboards weekly
- Back up sheets: the Google Sheet serves as your content database

### Troubleshooting

| Issue | Solution |
|-------|----------|
| "Organization must be verified" | Verify at platform.openai.com/settings/organization/general |
| VEED authentication error | Re-add the FAL.ai credential to the VEED node |
| Google Drive "no binary field" | Ensure Download Video outputs to a field named data |
| JSON parse error from Claude | The workflow has fallback content; check the Claude node output |
| Image URL blocked | The workflow uses base64 to avoid this; ensure the gpt-image-1 model |
| MCP "Server disconnected" (Windows) | Install supergateway globally: npm install -g supergateway |
| MCP path error on Windows | Use supergateway directly instead of npx |

### Version History

| Version | Date | Changes |
|---------|------|---------|
| 2.0 | Jan 2026 | Added custom avatar/script options, MCP integration support, improved configuration |
| 1.0 | Jan 2026 | Initial release with portrait mode, gpt-image-1, native VEED node |

### Credits

Built with:
- **n8n** – Workflow automation
- **Anthropic Claude** – Script generation
- **OpenAI** – Image & audio generation
- **VEED Fabric** – AI video rendering
- **Google Workspace** – Storage & logging
by Dinakar Selvakumar
How it works

This workflow automatically publishes Instagram and Facebook posts using Google Sheets as a content calendar. Users add post details to a sheet, and the workflow handles scheduling, image processing, posting, and status updates without manual intervention.

Step-by-step

1. **Scheduled Trigger** – The workflow runs automatically at a fixed interval (for example, every 15 minutes) to check for posts that are ready to be published.
2. **Configuration & Credentials** – A configuration step stores reusable values such as spreadsheet ID, sheet name, and platform settings, keeping the workflow easy to customize and secure.
3. **Data Retrieval & Filtering** – Posts are read from Google Sheets and filtered to include only rows marked as "Pending" and scheduled for the current time or earlier.
4. **Image Handling** – If an image link is provided, the workflow downloads the image from Google Drive. If no image is present, the post continues as text-only.
5. **Platform Routing** – Based on the selected platform (Instagram, Facebook, or both), the workflow routes the post to the appropriate publishing path.
6. **Social Media Publishing** – The post is published to Instagram and/or Facebook using the connected business account credentials.
7. **Status Update** – After publishing, the workflow updates the original Google Sheet with the post status (Success or Failed), the published timestamp, and an error message if applicable.
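The "Pending and due" filter described above could be implemented in an n8n Code node roughly like this. It is a sketch: the column names `status` and `scheduled_at` are assumptions about your sheet layout, so adjust them to match your own headers:

```javascript
// Illustrative filter: keep only rows whose status is "Pending" and whose
// scheduled time is now or in the past. Assumes each row has a `status`
// column and an ISO-8601 `scheduled_at` column.
function duePosts(rows, now = new Date()) {
  return rows.filter(
    (row) => row.status === 'Pending' && new Date(row.scheduled_at) <= now
  );
}
```

Rows scheduled in the future, or already marked Success/Failed, fall through and are picked up (or skipped) on a later run.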
by SuperAgent
Who is this template for?

This template is ideal for small businesses, agencies, and solo professionals who want to automate appointment scheduling and caller follow-up through a voice-based AI receptionist. If you're using tools like Google Calendar, Airtable, and Vapi (Twilio), this setup is for you.

What problem does this workflow solve?

Manual call handling, appointment booking, and email coordination are time-consuming and error-prone. This workflow automates the receptionist role: answering calls, checking calendar availability, managing appointments, and storing call summaries, all without human intervention.

What this workflow does

This Agent Receptionist manages inbound voice calls and scheduling tasks using Vapi and Google Calendar. It checks availability, books or updates calendar events, sends email confirmations, and logs call details into Airtable. The workflow includes built-in logic for slot management, email triggers, and storing call transcripts.

Setup Instructions

1. Duplicate Airtable Base: use this Airtable base template (BASE LINK).
2. Import Workflow: load the provided JSON into your n8n instance.
3. Credentials: connect your Google Calendar and Airtable credentials in n8n.
4. Activate Workflow: enable the workflow to get live webhook URLs.
5. Vapi Configuration: paste the provided system prompt into your Vapi Assistant, then link the appropriate webhook URLs from n8n (GetSlots, BookSlots, UpdateSlots, CancelSlots, and end-of-call report).

Disclaimer

Optimized for cloud-hosted n8n instances. Self-hosted users should verify webhook and credential setups.
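The slot management behind GetSlots/BookSlots boils down to checking a requested time against existing calendar events. A minimal sketch, assuming busy events arrive as `{start, end}` ISO timestamps (the function name `slotIsFree` is hypothetical, not part of the template):

```javascript
// Returns true if the requested interval does not overlap any busy event.
// Two intervals overlap when each one starts before the other ends.
function slotIsFree(busyEvents, reqStart, reqEnd) {
  const start = new Date(reqStart);
  const end = new Date(reqEnd);
  return busyEvents.every(
    (ev) => end <= new Date(ev.start) || start >= new Date(ev.end)
  );
}
```

A back-to-back booking (ending exactly when a busy event starts) counts as free here; tighten the comparisons if you want buffer time between appointments.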
by Lucas
🎶 Add liked songs to a monthly playlist

> This workflow is a port of "Add saved songs to a monthly playlist" from IFTTT.

When you like a song, the workflow saves it to a monthly playlist. For example: it's June 2024 and I like a song, so the workflow saves that song to a playlist called June '24. If this playlist does not exist, the workflow creates it for me.

⚙ How it works

Every 5 minutes, the workflow starts automatically and does 3 things:

1. Gets the last 10 songs you saved to your "Liked Songs" playlist (by clicking the heart in the app) and saves them to a NocoDB table (the workflow avoids creating duplicates).
2. Checks whether the monthly playlist already exists; if not, it creates the playlist. The created playlist is also saved in NocoDB to avoid any problems.
3. Checks whether the monthly playlist contains all the songs liked this month by fetching them from NocoDB. Any songs that are missing are added to the playlist one by one.

You may wonder why NocoDB is needed. Over the last few weeks/months, I've had duplication problems in my playlists, and some playlists were created twice because Spotify wasn't returning all the information, only partial information. Having the database means I don't have to rely on Spotify's data but on my own, which is accurate and represents reality.

📝 Prerequisites

You need to have:
- Spotify API keys, which you can obtain by creating a Spotify application here: https://developer.spotify.com/dashboard
- A NocoDB API token

📚 Instructions

1. Create your Spotify API credential
2. Create your NocoDB credential
3. Populate all Spotify and NocoDB nodes with your credentials
4. Enjoy!

If you need help, feel free to ping me on the n8n Discord server or send me a DM at "LucasAlt".

Show your support

- Share your workflow on X and mention @LucasCtrlAlt
- Consider buying me a coffee 😉
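The monthly playlist name can be derived from the current date in a few lines. This is illustrative only (the function name is hypothetical); the `"June '24"` format matches the example above:

```javascript
// Builds the playlist name for a given date, e.g. "June '24" for June 2024.
const MONTHS = [
  'January', 'February', 'March', 'April', 'May', 'June',
  'July', 'August', 'September', 'October', 'November', 'December',
];

function monthlyPlaylistName(date) {
  const month = MONTHS[date.getUTCMonth()];
  const year = String(date.getUTCFullYear()).slice(-2);
  return `${month} '${year}`;
}
```

Comparing this computed name against the playlists stored in NocoDB is what lets the workflow decide whether to create a new playlist or reuse the existing one.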
by Alex Kim
Weather via Slack 🌦️💬

Overview

This workflow provides real-time weather updates via Slack using a custom Slack command:

/weather [cityname]

Users can type this command in Slack (e.g., /weather New York), and the workflow will fetch and post the latest forecast, including temperature, wind conditions, and a short weather summary. While this workflow is designed for Slack, you can modify it to send weather updates via email, Discord, Microsoft Teams, or any other communication platform.

How It Works

1. **Webhook Trigger** – The workflow is triggered when a user runs /weather [cityname] in Slack.
2. **Geocoding with OpenStreetMap** – The city name is converted into latitude and longitude coordinates.
3. **Weather Data from NOAA** – The coordinates are used to retrieve detailed weather data from the National Weather Service (NWS) API.
4. **Formatted Weather Report** – The workflow extracts relevant weather details, such as:
   - Temperature (°F/°C)
   - Wind speed and direction
   - Short forecast summary
5. **Slack Notification** – The weather forecast is posted back to the Slack channel in a structured format.

Requirements

- A custom Slack app with:
  - The ability to create a Slash Command (/weather)
  - OAuth permissions to post messages in Slack
- An n8n instance to host and execute the workflow

Customization

- Replace Slack messaging with email, Discord, Microsoft Teams, Telegram, or another service.
- Modify the weather data format for different output preferences.
- Set up scheduled weather updates for specific locations.

Use Cases

- Instantly check the weather for any location directly in Slack.
- Automate weather reports for team members or projects.
- Useful for remote teams, outdoor event planning, or general weather tracking.

Setup Instructions

1. Create a custom Slack app:
   - Go to api.slack.com/apps and create a new app.
   - Add a Slash Command (/weather) with the webhook URL from n8n.
   - Enable OAuth scopes for sending messages.
2. Deploy the webhook – Ensure it can receive and process Slack commands.
3. Run the workflow – Type /weather [cityname] in Slack and receive instant weather updates.
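The formatted weather report described above might be assembled like this. A sketch only: the `period` shape loosely follows an NWS forecast period, but treat the field names as assumptions to verify against the actual API response:

```javascript
// Builds a Slack-friendly weather summary from a forecast period object.
// Assumes fields like those in an NWS forecast period: temperature,
// temperatureUnit, windSpeed, windDirection, shortForecast.
function formatWeatherMessage(city, period) {
  return [
    `*Weather for ${city}*`,
    `Temperature: ${period.temperature}°${period.temperatureUnit}`,
    `Wind: ${period.windSpeed} ${period.windDirection}`,
    `Forecast: ${period.shortForecast}`,
  ].join('\n');
}
```

The resulting string uses Slack's mrkdwn bold syntax (`*…*`), so it can be posted as-is in the Slack node's message field.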