by Javier Hita
## Who is this for?

This workflow is perfect for sales teams, business development professionals, recruitment agencies, and fractional CFO service providers who need to identify and qualify companies actively hiring. Whether you're prospecting for new clients, building a database of potential customers, or researching market opportunities, this automated solution saves hours of manual research while delivering high-quality, AI-analyzed leads.

## What problem is this workflow solving?

Finding qualified prospects in the finance sector is time-consuming and often inefficient. Traditional methods involve:

- Manually browsing LinkedIn job postings for hours
- Difficulty distinguishing between genuine opportunities and recruitment spam
- Inconsistent lead categorization and qualification
- Risk of contacting the same companies multiple times
- Lack of structured data for sales team follow-up

This workflow automates the entire lead generation process, from data collection to AI-powered qualification, ensuring you focus only on the most promising opportunities.

## What this workflow does

This comprehensive lead generation system performs six key functions:

1. **Automated LinkedIn Job Scraping**: Uses Apify's reliable LinkedIn Jobs Scraper to extract detailed job postings for finance positions, including company information, job descriptions, and contact details.
2. **Smart Data Processing**: Removes duplicates, filters companies by size, and structures data for consistent analysis across all leads (see the sketch after this list).
3. **Intelligent Lead Categorization**: Compares new leads against your existing database to optimize processing and avoid duplicate work.
4. **AI-Powered Qualification**: Leverages OpenAI's GPT-4 Mini to analyze each lead and determine:
   - **Company Category**: Consumer companies, Fractional CFO services, Recruiting agencies, or Other
   - **Finance Role Validation**: Confirms the position is genuinely finance-related
   - **Seniority Level**: Entry, Mid, Senior, Director, or C-Level classification
   - **Job Summary**: Concise description for quick sales team review
5. **Automated Database Management**: Stores qualified leads in Airtable with comprehensive profiles, preventing duplicates while maintaining data integrity.
6. **Lead Scoring & Routing**: Prioritizes leads based on processing status and qualification results for efficient sales team follow-up.
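A minimal sketch of the deduplication and company-size filtering step, written as an n8n Code node; the field names (`companyName`, `companyEmployees`) are assumptions that must match whatever the Apify scraper actually returns:

```javascript
// Illustrative dedup + size filter for scraped job postings.
// Field names are assumptions; align them with the Apify output.
const maxEmployees = 200; // mirror the "Edit Variables" setting

const seen = new Set();
const results = [];

for (const item of $input.all()) {
  const company = (item.json.companyName || '').trim().toLowerCase();
  const employees = Number(item.json.companyEmployees) || 0;

  // Skip repeat companies and companies above the size cap
  if (!company || seen.has(company)) continue;
  if (employees > maxEmployees) continue;

  seen.add(company);
  results.push(item);
}

return results;
```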
## Setup

### Prerequisites

You'll need accounts for three services:

- **Airtable** (free tier supported) - for lead storage and management
- **Apify** (14-day free trial available) - for LinkedIn job scraping
- **OpenAI** (pay-per-use) - for AI-powered lead analysis

### Step 1: Create Required Credentials

**Apify API Credential**
1. Sign up for an Apify account at apify.com
2. Navigate to Settings → Integrations → API tokens
3. Create a new API token
4. In n8n, create a new Apify API credential with your token

**OpenAI API Credential**
1. Create an account at platform.openai.com
2. Generate an API key in the API section
3. In n8n, create a new OpenAI credential with your key

**Airtable Personal Access Token**
1. Go to airtable.com/create/tokens
2. Create a personal access token with the following scopes: `data.records:read`, `data.records:write`, `schema.bases:read`
3. In n8n, create a new Airtable Personal Access Token credential

### Step 2: Set Up Airtable Base

Create a new Airtable base with the following structure:

**Table Name:** Qualified Leads

**Required Fields:**
- Company Name (Single line text)
- Job Title (Single line text)
- Is Finance Job (Checkbox)
- Seniority Level (Single select: Entry, Mid, Senior, Director, C-Level)
- Company Category (Single select: Consumer, Recruiting, Fractional CFO, Other)
- Job Summary (Long text)
- Company LinkedIn (URL)
- Job Link (URL)
- Posted Date (Date)
- Location (Single line text)
- Industry (Single line text)
- Company Employees (Number)

### Step 3: Configure the Workflow

1. **Import the Workflow**: Copy the JSON and import it into your n8n instance
2. **Update Credentials**: Replace placeholder credential IDs with your actual credential IDs in:
   - "Scrape LinkedIn Jobs" node (Apify credential)
   - "OpenAI GPT-4 Mini" node (OpenAI credential)
   - "Save to Airtable" and "Get Existing Leads" nodes (Airtable credential)
3. **Configure Airtable Connection**: Update the base ID and table ID in both Airtable nodes
4. **Set Search Parameters**: In the "Edit Variables" node (a configuration sketch appears below), configure:
   - `linkedinUrls`: Your target LinkedIn job search URLs
   - `maxEmployees`: Maximum company size filter (default: 200)
   - `batchSize`: Processing batch size for API efficiency (default: 5)

### Step 4: Test the Workflow

1. Start with a small test by setting `count: 50` in the HTTP Request node
2. Use a specific LinkedIn job search URL (e.g., "CFO jobs in New York")
3. Execute the workflow manually and verify results in your Airtable base
4. Review the AI categorization accuracy and adjust prompts if needed

## How to customize this workflow to your needs

### Targeting Different Roles

Modify the LinkedIn search URLs in the "Edit Variables" node to target different positions:

- `https://www.linkedin.com/jobs/search/?keywords=Controller`
- `https://www.linkedin.com/jobs/search/?keywords=Finance%20Director`
- `https://www.linkedin.com/jobs/search/?keywords=VP%20Finance`

### Adjusting Company Size Filters

Change the `maxEmployees` parameter to focus on different company segments:

- Startups: 1-50 employees
- SMBs: 51-500 employees
- Enterprise: 500+ employees

### Customizing AI Analysis

Enhance the GPT-4 prompt in the "AI Lead Analyzer" node to include:

- Industry-specific criteria
- Geographic preferences
- Technology stack requirements
- Company growth stage indicators

### Integration Options

Extend the workflow by adding:

- **Slack notifications** for new qualified leads
- **Email alerts** for high-priority prospects
- **CRM integration** (Salesforce, HubSpot, Pipedrive)
- **Lead enrichment** with additional data sources

### Scheduling Automation

Set up the workflow to run automatically:

- **Daily**: for active prospecting campaigns
- **Weekly**: for ongoing market research
- **Monthly**: for periodic database updates
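As a reference for the "Set Search Parameters" step above, here is a minimal sketch of what the "Edit Variables" node might produce, written as an n8n Code node; the template may use a Set node instead, and the example URL is illustrative:

```javascript
// Illustrative "Edit Variables" step. Key names follow the parameters
// described above; the search URL is an example, not the template's.
return [
  {
    json: {
      linkedinUrls: [
        'https://www.linkedin.com/jobs/search/?keywords=CFO&location=New%20York',
      ],
      maxEmployees: 200, // skip companies larger than this
      batchSize: 5,      // leads per OpenAI batch
    },
  },
];
```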
## Performance & Cost Optimization

- **API Efficiency**: The workflow processes leads in batches to optimize API usage
- **Smart Deduplication**: Avoids re-processing existing leads to reduce costs
- **Configurable Limits**: Adjust batch sizes and employee count filters based on your needs
- **Expected Costs**: Approximately $0.05-$0.20 per 100 analyzed leads (OpenAI costs)

## Troubleshooting

**Common Issues:**

- **Rate Limiting**: Increase delays between API calls if you encounter rate limits
- **Data Quality**: Review LinkedIn search URLs for relevance to your target market
- **AI Accuracy**: Adjust prompts if categorization doesn't match your criteria
- **Airtable Errors**: Verify field names match exactly between workflow and base structure

**Support Resources:**

- Apify LinkedIn Scraper Documentation
- OpenAI API Documentation
- Airtable API Reference

Transform your lead generation process with this powerful, AI-driven workflow that delivers qualified prospects ready for immediate outreach.
by Matthieu
# Search LinkedIn companies, score with AI and add them to a Google Sheets CRM

Setup video: https://youtube.com/watch?v=m904RNxtF0w&t

## Who is this for?

This template is ideal for sales teams, business development professionals, and marketers looking to build a targeted prospect database with automatic qualification. Perfect for agencies, consultants, and B2B companies wanting to identify and prioritize the most promising potential clients.

## What problem does this workflow solve?

Manually researching companies on LinkedIn, evaluating their fit for your services, and tracking them in your CRM is time-consuming and subjective. This automation streamlines lead generation by automatically finding, scoring, and importing qualified prospects into your database.

## What this workflow does

This workflow automatically searches for companies on LinkedIn based on your criteria, retrieves detailed information about each company, filters them based on quality indicators, uses AI to score how well they match your ideal customer profile, and adds them to your Google Sheets CRM while preventing duplicates.

## Setup

1. Create a Ghost Genius API account and get your API key
2. Configure the HTTP Request nodes with Header Auth credentials
3. Create a copy of the provided Google Sheet template
4. Set up your Google Sheets and OpenAI credentials following the n8n documentation
5. Customize the "Set Variables" node to match your target audience and scoring criteria

## How to customize this workflow

- Modify search parameters to target different industries, locations, or company sizes
- Adjust the follower count threshold based on your qualification criteria (see the sketch below)
- Customize the AI scoring system to align with your specific product or service offering
- Add notification nodes to alert you when high-scoring companies are identified
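For the quality-indicator filter, a minimal sketch written as an n8n Code node; the `followers` field name and the threshold are assumptions to adapt to the actual Ghost Genius response and your own criteria:

```javascript
// Hypothetical quality filter: keep only companies with enough
// LinkedIn followers to suggest an active presence.
const MIN_FOLLOWERS = 200; // adjust to your qualification criteria

return $input.all().filter((item) => {
  const followers = Number(item.json.followers) || 0; // assumed field name
  return followers >= MIN_FOLLOWERS;
});
```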
by Risper
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How It Works

This n8n workflow automatically discovers high-quality business leads from Reddit by analysing posts across targeted subreddits. It:

1. Loads your business profile from a connected Google Sheet.
2. Uses AI to identify relevant subreddits where your potential customers engage.
3. Generates intent-based Reddit search queries based on your services, keywords, and client pain points.
4. Searches Reddit in real time using the generated queries.
5. Classifies posts based on whether they show lead potential.
6. Analyses high-potential posts for service fit, urgency, and estimated value.
7. Filters and scores leads to prioritize high-conversion opportunities (sketched below).
8. Saves the most promising leads to a dedicated Google Sheet.
9. Sends Slack alerts to notify your sales team for immediate follow-up.

## Requirements

Before using this workflow, ensure the following services are connected and configured:

- **Google Sheets (OAuth2)**: Reads your business profile and writes qualified leads
- **Reddit (OAuth2)**: Performs Reddit post searches based on generated queries
- **Google Gemini API**: Analyses posts, generates queries, and extracts insights
- **Slack API**: Notifies your team with qualified lead summaries

## Google Sheets Setup

You will need two Google Sheets:

### Business Profile Sheet (Input)

This sheet contains a single row describing your service business. The workflow reads it to generate relevant subreddit selections and search queries.

Required fields (as headers in row 1): `profession`, `industry`, `primary_services`, `service_keywords`, `target_client_profile`, `pain_points`, `intent_signals`, `urgency_indicators`, `price_range`

### Reddit Leads Sheet (Output)

This sheet stores high-quality Reddit posts identified as potential leads. The workflow appends or updates rows keyed on `post_id` to avoid duplication.

Expected columns: `post_id`, `post_url`, `post_title`, `post_post`, `post_subreddit`, `post_date`
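A minimal sketch of the filter-and-score step referenced above, assuming an n8n Code node; `lead_score` and `urgency` are assumed names for fields produced by the Gemini analysis, so adjust them to the actual output schema:

```javascript
// Illustrative lead filter: keep only posts the AI analysis scored
// highly or flagged as urgent. Field names are assumptions.
const SCORE_THRESHOLD = 70;

return $input.all().filter((item) => {
  const { lead_score = 0, urgency = 'low' } = item.json;
  return lead_score >= SCORE_THRESHOLD || urgency === 'high';
});
```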
by vinci-king-01
# Sales Pipeline Automation Dashboard with AI Lead Intelligence

## Target Audience

- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

## Problem Statement

Manual lead qualification and sales pipeline management is inefficient and often leads to missed opportunities or poor lead prioritization. This template solves the challenge of automatically scoring, qualifying, and routing leads using AI-powered intelligence to maximize conversion rates and sales team productivity.

## How it Works

This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

### Key Components

1. **Dual Trigger System** - Scheduled monitoring and webhook triggers for real-time lead processing
2. **AI-Powered Lead Intelligence** - Advanced scoring algorithm based on 7 key factors
3. **Multi-Source Data Enrichment** - LinkedIn and Crunchbase integration for comprehensive lead profiles
4. **Automated Sales Actions** - Intelligent routing, task creation, and follow-up sequences
5. **Multi-Platform Integration** - HubSpot CRM, Slack notifications, and Google Sheets dashboard

## Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |

## Setup Instructions

Estimated setup time: 25-30 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

### Step-by-Step Configuration

**1. Install Community Nodes**

```bash
# Install required community nodes
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

**2. Configure ScrapeGraphAI Credentials**

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

**3. Set up HubSpot CRM Integration**

- Add HubSpot API credentials
- Grant necessary permissions for contacts, deals, and tasks
- Configure custom properties for lead scoring and qualification
- Test the connection to ensure it's working
**4. Set up Google Sheets Connection**

- Add Google Sheets OAuth2 credentials
- Grant necessary permissions for spreadsheet access
- Create a new spreadsheet for sales pipeline data
- Configure the sheet name (default: "Sales Pipeline")

**5. Configure Lead Scoring Parameters**

- Update the lead scoring weights in the Code node
- Customize ideal customer profile criteria
- Set automation trigger thresholds
- Adjust sales rep assignment logic

**6. Set up Notification Channels**

- Configure Slack webhook or API credentials
- Set up email service credentials for welcome emails
- Define notification preferences for different lead grades
- Test notification delivery

**7. Configure Triggers**

- Set up the webhook endpoint for real-time lead capture
- Configure the scheduled trigger for periodic monitoring
- Choose appropriate time zones for your business hours
- Test both trigger mechanisms

**8. Test and Validate**

- Run the workflow manually with sample lead data
- Check HubSpot for proper contact and deal creation
- Verify Google Sheets data formatting
- Test all notification channels

## Workflow Customization Options

### Modify Lead Scoring Algorithm

- Adjust scoring weights for different factors (see the sketch below)
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

### Extend Data Enrichment

- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent signal detection

### Customize Sales Automation

- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

### Output Customization

- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion rate tracking and analysis

## Use Cases

- **Lead Qualification**: Automatically score and qualify incoming leads
- **Sales Pipeline Management**: Streamline the entire sales process
- **Lead Routing**: Intelligently assign leads to appropriate sales reps
- **Follow-up Automation**: Ensure consistent and timely follow-up
- **Sales Intelligence**: Provide comprehensive lead insights
- **Performance Tracking**: Monitor sales team and pipeline performance

## Important Notes

- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

## Troubleshooting

**Common Issues:**

- **ScrapeGraphAI connection errors**: Verify API key and account status
- **HubSpot API errors**: Check API key and permissions
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Lead scoring errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust request frequency and implement delays

**Support Resources:**

- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
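To make the scoring configuration concrete, here is a minimal sketch of weighted scoring for the Code node; the seven factor names, weights, grade cut-offs, and field values are all assumptions to tune to your ideal customer profile:

```javascript
// Illustrative weighted scoring. All seven factors and their weights
// are assumptions; replace them with your own ICP criteria.
const WEIGHTS = {
  jobTitle: 25,     // seniority / buying authority
  companySize: 20,  // fit with target segment
  industry: 15,
  fundingStage: 15,
  leadSource: 10,
  emailDomain: 10,  // corporate vs. free mailbox
  engagement: 5,
};

function scoreLead(lead) {
  let score = 0;
  if (/director|vp|chief|head/i.test(lead.job_title || '')) score += WEIGHTS.jobTitle;
  if ((lead.company_size || '').includes('201-500')) score += WEIGHTS.companySize;
  if ((lead.industry || '') === 'Technology') score += WEIGHTS.industry;
  if (/series [ab]/i.test(lead.funding_stage || '')) score += WEIGHTS.fundingStage;
  if ((lead.lead_source || '') === 'Website Form') score += WEIGHTS.leadSource;
  if (!/@(gmail|yahoo|outlook)\./i.test(lead.email || '')) score += WEIGHTS.emailDomain;
  if (lead.engaged) score += WEIGHTS.engagement; // assumed engagement flag
  return score;
}

return $input.all().map((item) => {
  const lead_score = scoreLead(item.json);
  const grade =
    lead_score >= 85 ? 'A+' :
    lead_score >= 70 ? 'A'  :
    lead_score >= 55 ? 'B+' :
    lead_score >= 40 ? 'B'  : 'C+';
  return { json: { ...item.json, lead_score, grade } };
});
```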
by Dvir Sharon
# Extract Google My Business Leads by Service & Location with Bright Data to Google Sheets

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts Google My Business listings by service type and geographic location using Bright Data's Google Maps dataset, with intelligent city expansion and automatic duplicate removal.

## Who is this for?

- Lead generation professionals
- Sales teams
- Marketing agencies
- Business development representatives
- Entrepreneurs conducting outreach or market research

## What problem is this solving?

Manual lead generation from Google Maps is time-consuming and inefficient. This workflow automates the process of finding businesses by service type and location, expanding searches across cities, removing duplicates, and organizing results in a structured format.

## What this workflow does

### Input Processing

- Accepts service type, state, and country via web form
- Uses Claude AI to generate city lists
- Auto-categorizes services
- Creates search queries per city

### Data Collection

- Uses Bright Data's Google Maps dataset
- Processes in batches with rate limits
- Monitors scraping with retry logic
- Formats and handles API responses

### Quality Control

- Removes duplicates by name and phone
- Maintains clean data in Google Sheets
- Ensures structured, usable datasets

## Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Business Name | Company or business name | TechFix Computer Repair |
| Category | Business category type | Electronics |
| Country | Country location | US |
| City | Specific city searched | Austin |
| Phone Number | Contact phone number | +1 (555) 123-4567 |
| Website URL | Business website | https://techfix.com |
| Google Maps URL | Direct Maps link | https://maps.google.com/... |
| Address | Full business address | 123 Main St, Austin, TX |
| Operating Hours | Business hours | Mon-Fri 9AM-6PM |
| Google Rating | Star rating | 4.5 |
| Total Reviews | Number of reviews | 127 |
| Reviews URL | Link to reviews | https://maps.google.com/reviews... |

## Setup Instructions

### Prerequisites

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- Anthropic API key for Claude AI

### Step-by-Step

1. Import the workflow JSON into n8n
2. Configure Bright Data credentials and dataset access
3. Set up Google Sheets and OAuth2 credentials
4. Configure Claude AI with your API key
5. Replace all placeholder credential IDs and tokens. For improved security, use credentials instead of hardcoding the API token placeholder in the HTTP Request node.
6. Test with sample data (e.g., "Coffee Shop" in California, US)
7. Activate the workflow and use the form for submissions

## How to Customize

### Modify Geographic Scope

- Add countries to the form dropdown
- Customize Claude prompts for city generation
- Adjust search logic for international markets

### Enhance Data Collection

- Add more fields from Bright Data
- Include revenue, employee count, social profiles

### Improve Duplicate Detection

- Use fuzzy matching for similar names (the exact-match baseline is sketched below)
- Include address-based checks

### Customize Output Format

- Transform data for CRM compatibility
- Export to CSV, database, or multiple destinations

### Implement Advanced Features

- Integrate email finder services
- Include lead scoring logic
- Discover social media profiles

### Batch Processing Optimization

- Adjust batch sizes per Bright Data limits
- Use parallel processing and retry logic

### Integration Options

- Connect to CRMs like HubSpot or Salesforce
- Trigger email automation
- Integrate with marketing platforms
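A minimal sketch of the name-and-phone duplicate removal, assuming an n8n Code node; field names mirror the output columns above, and fuzzy name matching would replace the exact key comparison:

```javascript
// Illustrative duplicate removal keyed on business name + phone.
// Normalize both parts so "TechFix " and "techfix" collapse to one key.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const name = (item.json['Business Name'] || '').trim().toLowerCase();
  const phone = (item.json['Phone Number'] || '').replace(/\D/g, '');
  const key = `${name}|${phone}`;

  if (seen.has(key)) continue; // duplicate listing
  seen.add(key);
  unique.push(item);
}

return unique;
```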
by Stefan
# Track n8n Node Definitions from GitHub and Export to Google Sheets

## Overview

This workflow automatically retrieves and processes metadata from the official n8n GitHub repository, filters all available `.node.json` files, parses their structure, and appends structured information to a Google Sheet. Perfect for developers, community managers, and technical writers who need to maintain up-to-date information about n8n's evolving node ecosystem.

## Setup Instructions

### Prerequisites

Before setting up this workflow, ensure you have:

- A GitHub account with API access
- A Google account with Google Sheets access
- An active n8n instance (cloud or self-hosted)

### Step 1: GitHub API Configuration

1. Navigate to GitHub Settings → Developer Settings → Personal Access Tokens
2. Generate a new token with `public_repo` permissions
3. Copy the generated token and store it securely
4. In n8n, create a new "GitHub API" credential
5. Paste your token in the credential configuration and save

### Step 2: Google Sheets Setup

1. Create a new Google Sheets document
2. Set up the following column headers in the first row:
   - node (Column A) - Node identifier/name
   - nodeVersion (Column B) - Version of the node
   - codexVersion (Column C) - Codex version number
   - categories (Column D) - Node categories
   - credentialDocumentation (Column E) - Credential documentation URL
   - primaryDocumentation (Column F) - Primary documentation URL
3. Note down the Google Sheets document ID from the URL
4. Configure Google Sheets OAuth2 credentials in n8n

### Step 3: Workflow Configuration

1. Import the workflow into your n8n instance
2. Update the following placeholder values:
   - Replace YOUR_GOOGLE_SHEETS_DOCUMENT_ID with your actual document ID
   - Replace YOUR_WEBHOOK_ID if using webhook functionality
3. Configure the GitHub API credentials in the HTTP Request nodes
4. Set up Google Sheets credentials in the Google Sheets nodes
5. Share your Google Sheets document with the email address associated with your Google OAuth2 credentials
6. Grant "Editor" permissions to allow the workflow to write data

## Google Sheets Template Details

The workflow creates a structured dataset with these columns:

- **node**: Node identifier (e.g., n8n-nodes-base.slack)
- **nodeVersion**: Version of the node (e.g., 1.0.0)
- **codexVersion**: Codex version number (e.g., 1.0.0)
- **categories**: Node categories (e.g., Communication, Productivity)
- **credentialDocumentation**: URL to credential documentation
- **primaryDocumentation**: URL to primary node documentation

## Customization Options

### Modifying Data Extraction

You can customize the "Format Data" node to extract additional fields:

- Add new assignments in the Set node
- Modify the column mapping in the Google Sheets node
- Update your spreadsheet headers accordingly

### Changing Update Frequency

To run this workflow on a schedule:

- Replace the Manual Trigger with a Cron node
- Set your desired schedule (e.g., daily, weekly)
- Configure appropriate timing to avoid API rate limits

### Adding Filters

Customize the "Filter Node Files" code node (a sketch of this filter appears below) to:

- Filter specific node types
- Include/exclude certain categories
- Process only recently updated nodes

## Features

- Fetches all node definitions from the n8n-io/n8n repository
- Filters for `.node.json` files only
- Downloads and parses metadata automatically
- Extracts key fields like node names, versions, categories, and documentation URLs
- Appends structured data to Google Sheets with batch processing
- Includes error handling and retry mechanisms
- Clears existing data before appending new information for fresh results

## Use Cases

This workflow is ideal for:

- Tracking changes in official n8n node definitions over time
- Auditing node categories and documentation links for completeness
- Building custom dashboards from node metadata
- Community management and documentation maintenance
- Integration planning and compatibility analysis
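For reference, a minimal sketch of the "Filter Node Files" step, assuming the preceding HTTP Request fetched the repository tree from GitHub's git/trees API (items carrying `path` and `type` fields); the raw-download URL pattern is an assumption:

```javascript
// Illustrative "Filter Node Files" Code node: keep only .node.json
// entries from the repository tree returned by the previous request.
const tree = $input.first().json.tree || [];

return tree
  .filter((entry) => entry.type === 'blob' && entry.path.endsWith('.node.json'))
  .map((entry) => ({
    json: {
      path: entry.path,
      // Raw URL pattern is an assumption; adjust branch/repo as needed.
      downloadUrl: `https://raw.githubusercontent.com/n8n-io/n8n/master/${entry.path}`,
    },
  }));
```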
by Andrew
## Who is this for?

This workflow is ideal for n8n self-hosted users, DevOps engineers, and automation developers who want to automatically back up their n8n workflows to GitHub on a regular basis.

## What problem is this workflow solving?

Manually backing up n8n workflows is time-consuming and prone to human error. This workflow automates the backup process, ensuring that all workflows are safely stored in a version-controlled GitHub repository every 24 hours.

## What this workflow does

This automation runs daily and backs up all workflows from your n8n instance to a specified GitHub repository. Each workflow is saved as a .json file named after its unique ID, organized into a folder path defined by repo_path. To keep memory usage low, the workflow recursively calls itself to process workflows in chunks. Once the backup is complete, it optionally sends a Slack notification to confirm success.

## Setup

1. Configure the Config node in the subworkflow to set:
   - GitHub Repo Owner
   - GitHub Repo Name
   - Main folder path (repo_path)
2. Connect your GitHub and (optionally) Slack credentials.
3. Set the workflow to run on a daily cron schedule.
4. Test the workflow manually to confirm the GitHub integration works.

Sign up for a free consultation and find out how n8n can help you.
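A minimal sketch of how each backup file's GitHub path might be derived, assuming a Code node reading the Config values described above; the actual template wiring may differ:

```javascript
// Illustrative path construction for the GitHub create/update-file step.
// Config keys mirror the Config node above; names are assumptions.
const { repoOwner, repoName, repo_path } = $('Config').first().json;

return $input.all().map((item) => {
  const wf = item.json; // one workflow object from the n8n API
  return {
    json: {
      owner: repoOwner,
      repository: repoName,
      filePath: `${repo_path}/${wf.id}.json`, // file named after workflow ID
      content: JSON.stringify(wf, null, 2),
    },
  };
});
```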
by Giannis Kotsakiachidis
# GoCardless → Maybe Finance: Automatic Multi-Bank Sync & Weekly Overview

## Who's it for

Freelancers, founders, households, and side-hustlers who work with several bank accounts but want one, always-up-to-date budget inside Maybe Finance, with no more CSV exports or copy-paste.

## How it works / What it does

1. Schedule Trigger (cron) fires every Monday (switch to Manual Trigger while testing)
2. Get access token - obtains a fresh 24 h GoCardless token
3. Fetch transactions for each account:
   - Revolut Pro
   - Revolut Personal
   - ABN AMRO
   (add extra HTTP Request nodes for any other GoCardless-supported banks)
4. Extract booked - keeps only settled items
5. Set transactions … - maps every record to Maybe Finance's schema
6. Merge - combines all arrays into one payload
7. Create transactions to Maybe - POSTs each item via the API
8. Resend Email - sends you a "Weekly transactions overview"

All done in a single run: your Maybe dashboard is refreshed and you get an inbox alert.

## How to set up

1. Import the template into n8n (cloud or self-hosted).
2. Create credentials:
   - GoCardless secret_id & secret_key
   - Maybe Finance API key
   - (Optional) Resend API key for email notifications
3. One-time GoCardless config (run the blocks on the left):
   - /token/new/ - obtain token
   - /institutions - find institution IDs
   - /agreements/enduser/ - create agreements
   - /requisitions/ - get the consent URL & finish bank login
   - /requisitions/{id} - copy the GoCardless account_ids
4. Create the same accounts in Maybe Finance, run the HTTP GET request in the purple frame, and copy their account_ids.
5. Open each "Set transactions …" node and paste the correct Maybe account_id (mapping sketched below).
6. Adjust the Schedule Trigger (e.g. daily, monthly).
7. Save & activate.

## Requirements

- n8n 1.33+
- GoCardless app (secret ID & key, live or sandbox)
- Maybe Finance account & API key
- (Optional) Resend account for email

## How to customize

- **Include pending transactions**: change the Item Lists filter.
- **Add more banks**: duplicate the "Get … transactions" → "Extract booked" → "Set transactions" path and plug its output into the Merge node.
- **Different interval**: edit the cron rule in Schedule Trigger.
- **Disable emails**: just remove or deactivate the Resend node.
- **Send alerts to Slack / Teams**: branch after the Merge node and add a chat node.

Happy budgeting!
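For step 5, a minimal sketch of the "Set transactions …" mapping written as an n8n Code node; the GoCardless field names follow its booked-transaction schema, while the Maybe Finance payload keys are assumptions to check against the Maybe API:

```javascript
// Illustrative mapping from a GoCardless booked transaction to a
// Maybe Finance payload. The Maybe keys are assumptions.
const MAYBE_ACCOUNT_ID = 'PASTE_MAYBE_ACCOUNT_ID_HERE'; // from the purple frame

return $input.all().map((item) => {
  const tx = item.json;
  return {
    json: {
      account_id: MAYBE_ACCOUNT_ID,
      date: tx.bookingDate,
      name: tx.remittanceInformationUnstructured || 'Bank transaction',
      amount: tx.transactionAmount?.amount,
      currency: tx.transactionAmount?.currency,
    },
  };
});
```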
by Airtop
# Monitoring Job Changes on LinkedIn

## Use Case

This automation tracks job changes among your LinkedIn connections and extracts relevant details. It's ideal for triggering timely outreach, updating CRM records, or feeding lead scoring workflows based on new roles.

## What This Automation Does

It scrapes your LinkedIn "Job Changes" feed and returns:

- Name of the person
- Their new position
- LinkedIn profile URL
- Functional category (e.g., marketing, sales, HR, executive)

Each run processes 5 job changes at a time.

## How It Works

1. **Manual Trigger**: Starts the workflow when the user clicks "Test workflow."
2. **Airtop Enrichment**: Navigates to the LinkedIn job changes page and extracts:
   - name
   - new_position
   - linkedin_profile_url
   - position_function (classification such as marketing, sales, HR, etc.)
3. **Formatting**: Output is structured into clean JSON for use in further workflows.

## Setup Requirements

- Airtop Profile connected to LinkedIn
- Airtop API key configured in n8n
- A LinkedIn account with a populated "Job Changes" feed

## Next Steps

- **Automate Alerts**: Add Slack, email, or CRM integrations to notify your team.
- **Enrich and Score Leads**: Chain this with your ICP scoring workflow to evaluate new roles.
- **Customize Scope**: Expand extraction to more than 5 job changes or add filters based on job titles or functions.

Read more about Monitoring Job Changes on LinkedIn.
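A minimal sketch of the formatting step, assuming the Airtop extraction arrives as a `jobChanges` array (an assumed shape); it trims to five entries and emits the fields listed above:

```javascript
// Illustrative formatting step: shape the Airtop output into the clean
// JSON structure described above. The raw input shape is an assumption.
const extracted = $input.first().json.jobChanges || [];

return extracted.slice(0, 5).map((change) => ({
  json: {
    name: change.name,
    new_position: change.new_position,
    linkedin_profile_url: change.linkedin_profile_url,
    position_function: change.position_function, // e.g. "marketing", "sales"
  },
}));
```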
by Avkash Kakdiya
## How it works

This workflow automatically generates personalized follow-up messages for leads or customers after key interactions (e.g., demos, sales calls). It enriches contact details from HubSpot (or optionally Monday.com), uses AI to draft a professional follow-up email, and distributes it across multiple communication channels (Slack, Telegram, Teams) as reminders for the sales team.

## Step-by-step

### 1. Trigger & Input

- **Schedule Trigger** - runs automatically at a defined interval (e.g., daily).
- **Set Sample Data** - captures the contact's name, email, and context from the last interaction (e.g., "had a product demo yesterday and showed strong interest").

### 2. Contact Enrichment

- **HubSpot Contact Lookup** - searches HubSpot CRM by email to confirm or enrich contact details.
- **Monday.com Contact Fetch (optional)** - can pull additional CRM details if enabled.

### 3. AI Message Generation

- **AI Language Model (OpenAI)** - provides the underlying engine for message creation.
- **Generate Follow-Up Message** - drafts a short, professional, and friendly follow-up email that:
  - references the previous interaction context,
  - suggests clear next steps (call, resources, etc.),
  - ends with a standardized signature block for consistency.

### 4. Multi-Channel Communication

- **Slack Reminder** - posts the generated message as a reminder in the sales team's Slack channel.
- **Telegram Reminder** - sends the follow-up draft to a Telegram chat.
- **Teams Reminder** - shares the same message in a Microsoft Teams channel.

## Benefits

- **Personalized Outreach at Scale** - AI ensures each follow-up feels tailored and professional.
- **Context-Aware Messaging** - pulls in CRM details and past interactions for relevance.
- **Cross-Platform Delivery** - distributes reminders via Slack, Teams, and Telegram so no follow-up is missed.
- **Time-Saving for Sales Teams** - eliminates manual drafting of repetitive follow-up emails.
- **Consistent Branding** - ensures every message includes a unified signature block.
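A minimal sketch of how the follow-up prompt might be assembled before the OpenAI call, assuming a Code node and the Set Sample Data fields described above; the template's actual prompt may differ:

```javascript
// Illustrative prompt assembly for the "Generate Follow-Up Message" step.
// Field names (name, context) follow the Set Sample Data node above;
// the signature block is a placeholder to replace with your own.
const { name, context } = $input.first().json;

const prompt = [
  `Write a short, professional, friendly follow-up email to ${name}.`,
  `Context of the last interaction: ${context}.`,
  'Reference that context, suggest a clear next step (a call or resources),',
  'and end with this exact signature block:',
  'Best regards,\nThe Sales Team',
].join('\n');

return [{ json: { prompt } }];
```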
by Jonathan
This workflow takes Dialpad call information after a call is disconnected and pushes it into Syncro as a ticket timer update, matching the start and end times provided by Dialpad, with a note containing the contact or customer name and number.

> This workflow is part of an MSP collection; the original can be found here: https://github.com/bionemesis/n8nsyncro
by JYLN
This updated workflow automatically archives your Spotify Discover Weekly tracks to another, manually created playlist, without the nuisance of duplicate tracks. It uses the latest versions of n8n's Schedule Trigger, Spotify, Switch, Merge, and IF nodes.

Special thanks to trey for their original version of the workflow, as well as ihortom for their help with navigating the Switch node's outputs.

To use this workflow, you'll need to:

1. Create a playlist for use as the archive playlist within your Spotify account
2. Create and select your Spotify credentials within each Spotify node in the workflow

See the workflow README for additional information and optional setup steps.
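For reference, here is how the duplicate check could be expressed in a single n8n Code node; the template implements it with Merge, Switch, and IF nodes instead, and the node names and `track.uri` field path here are assumptions:

```javascript
// Illustrative duplicate check: keep only Discover Weekly tracks whose
// URIs are not already in the archive playlist. Node names and field
// paths are assumptions; match them to your own workflow.
const archived = new Set(
  $('Get Archive Playlist Tracks').all().map((i) => i.json.track.uri),
);

return $('Get Discover Weekly Tracks')
  .all()
  .filter((i) => !archived.has(i.json.track.uri));
```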