by Belen
This n8n template automatically transcribes GoHighLevel (GHL) call recordings and creates an AI-generated summary that is added as a note directly to the related contact in your GHL CRM. It’s designed for real estate investors, agencies, and sales teams that handle a large volume of client calls and want to keep detailed, searchable notes without spending hours on manual transcription.

## Who’s it for

- Sales and acquisitions teams that want instant call notes in their CRM
- Real estate wholesalers or agencies using GoHighLevel for deal flow
- Support and QA teams that need summarized transcripts for review
- Any business owner who wants to automatically document client conversations

## How it works

1. A HighLevel automation workflow triggers when a call is marked “Completed” and automatically sends a webhook to n8n.
2. The n8n workflow receives this webhook and waits briefly to ensure the call recording is ready.
3. It retrieves the conversation and message IDs from the webhook payload.
4. The call recording is fetched from GHL’s API.
5. An AI transcription node converts the audio to text.
6. A summarization node condenses the transcript into bullet points or a concise paragraph.
7. A Code node formats the AI output into proper JSON for GHL’s “Create Note” endpoint.
8. Finally, an HTTP Request node posts the summary to the contact’s record in GHL.

## How to set up

1. Add your GoHighLevel OAuth credential and connect your agency account.
2. Add your AI credential (e.g., OpenAI, Anthropic, or Gemini).
3. Replace the sample webhook URL with your n8n endpoint.
4. Test with a recent call and confirm the summary appears in the contact timeline.

## Requirements

- GoHighLevel account with API and OAuth access
- AI service for transcription and summarization (e.g., OpenAI Whisper + GPT)

## Customizing this workflow

You can tailor this automation for your specific team or workflow:

- Add sentiment analysis or keyword extraction to the summary.
- Change the AI prompt to focus on “action items,” “objections,” or “next steps.”
- Send summaries to Slack, Notion, or Google Sheets for reporting.
- Trigger follow-up tasks automatically in your CRM based on keywords.

## Good to know

- AI transcription and summarization costs vary by provider — check your LLM’s pricing.
- GoHighLevel’s recording availability may take up to 1 minute after the call ends; adjust the delay accordingly.
- For OAuth setup help, refer to GHL’s OAuth documentation.

Happy automating! ⚙️
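As a rough sketch, the Code node that shapes the AI output for GHL’s “Create Note” endpoint could look like the following. The field names (`contactId`, `summary`) and the note `body` property are assumptions; verify them against your webhook payload and GHL’s API reference.

```javascript
// Hypothetical sketch of the "format note" Code node logic. The exact field
// names depend on your webhook payload and GHL's Create Note API schema.
function buildNotePayload(contactId, summary) {
  // GHL notes are plain text; prefix a label and cap the length defensively.
  const noteBody = `AI Call Summary\n\n${summary}`.trim().slice(0, 5000);
  return { contactId, note: { body: noteBody } };
}

// Inside an n8n Code node you would return items, e.g.:
// return [{ json: buildNotePayload($json.contactId, $json.summary) }];
```

The HTTP Request node that follows would POST `note` as the JSON body to the Create Note endpoint for `contactId`.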
by Stephan Koning
# Real-Time ClickUp Time Tracking to HubSpot Project Sync

This workflow automates the synchronization of time tracked on ClickUp tasks directly to a custom project object in HubSpot, ensuring your project metrics are always accurate and up-to-date.

## Use Case & Problem

This workflow is designed for teams that use a custom object in HubSpot for high-level project overviews (tracking scoped vs. actual hours per sprint) but manage daily tasks and time logging in ClickUp. The primary challenge is the constant, manual effort required to transfer tracked hours from ClickUp to HubSpot, a process that is both time-consuming and prone to errors. This automation eliminates that manual work entirely.

## How It Works

1. **Triggers on Time Entry:** The workflow instantly starts whenever a user updates the time tracked on any task in a specified ClickUp space. ⏱️
2. **Fetches Task & Time Details:** It immediately retrieves all relevant data about the task (like its name and custom fields) and the specific time entry that was just updated.
3. **Identifies the Project & Sprint:** The workflow processes the task data to determine which HubSpot project it belongs to and categorizes the work into the correct sprint (e.g., Sprint 1, Sprint 2, Additional Requests).
4. **Updates HubSpot in Real Time:** It finds the corresponding project record in HubSpot and updates the master `actual_hours_tracked` property. It then updates the specific field for the corresponding sprint (e.g., `actual_sprint_1_hours`), ensuring your reporting remains granular and accurate.

## Requirements

✅ **ClickUp account** with the following custom fields on your tasks:

- A Dropdown custom field named `Sprint` to categorize tasks.
- A Short Text custom field named `HubSpot Deal ID` (or similar) to link to the HubSpot record.

✅ **HubSpot account** with:

- A custom object used for project tracking.
- Custom properties on that object to store total and sprint-specific hours (e.g., `actual_hours_tracked`, `actual_sprint_1_hours`, `total_time_remaining`).

> Note: Since this workflow interacts with a custom HubSpot object, it uses flexible HTTP Request nodes instead of the standard n8n HubSpot nodes.

## Setup Instructions

1. **Configure Credentials:** Add your ClickUp (OAuth2) and HubSpot (Header Auth with a Private App Token) credentials to the respective nodes in the workflow.
2. **Set ClickUp Trigger:** In the Time Tracked Update Trigger node, select your ClickUp team and the specific space you want to monitor for time updates.
3. **Update HubSpot Object ID:** Find the ID of your custom project object in HubSpot. In the HubSpot HTTP Request nodes (e.g., OnProjectFolder), replace the placeholder `objectTypeId` in the URL with your own.

## How to Customize

- Adjust the **Code: Extract Sprint & Task Data** node to change how sprint names are mapped or how time is calculated.
- Update the URLs in the HubSpot HTTP Request nodes if your custom object or property names differ.
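As a minimal sketch of what the **Code: Extract Sprint & Task Data** node might do, the snippet below maps a ClickUp `Sprint` dropdown value to a HubSpot property name and converts ClickUp’s millisecond durations to hours. The property names are assumptions; align them with your own custom object.

```javascript
// Map the ClickUp Sprint dropdown value to the matching HubSpot property.
// These property names are illustrative assumptions, not fixed by the template.
function sprintToProperty(sprintName) {
  const map = {
    "Sprint 1": "actual_sprint_1_hours",
    "Sprint 2": "actual_sprint_2_hours",
    "Additional Requests": "actual_additional_hours",
  };
  // Fall back to the master total when the sprint is unrecognized.
  return map[sprintName] ?? "actual_hours_tracked";
}

// ClickUp time entries report duration in milliseconds; convert to hours,
// rounded to two decimals for clean HubSpot reporting.
function msToHours(durationMs) {
  return Math.round((durationMs / 3600000) * 100) / 100;
}
```

A 90-minute time entry (5,400,000 ms) would update the mapped sprint property with `1.5` hours.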
by Dhinesh Ravikumar
## Who it's for

Project managers, AI builders, and teams who want structured, automated meeting summaries with zero manual work.

## What it does

This workflow monitors a Google Drive folder for new meeting notes (PDF/TXT), extracts the text, summarizes it via OpenAI GPT-4o, groups tasks by sentiment, builds a styled HTML summary, and sends it via Gmail.

## How to set it up

1. Connect Google Drive, OpenAI, and Gmail credentials.
2. Point the Drive Trigger to your meeting notes folder.
3. Paste the system prompt into the AI node.
4. Set the Gmail Email Type to HTML and the Message to `{{$json.email_html}}`.
5. Drop a test file and execute once.

## Requirements

- n8n account
- Google Drive, OpenAI, and Gmail credentials
- Non-scanned PDFs or plain text files

## Customization ideas

- Add Slack or Notion logging
- Support additional file types
- Translate summaries automatically

Tags: #ai #automation #productivity #gmail #drive #meeting-summary #openai
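The sentiment-grouping and HTML-building step could be sketched roughly as below. The item field names (`task`, `sentiment`) are assumptions about the AI node’s JSON output; adapt them to whatever your system prompt actually returns.

```javascript
// Hypothetical sketch: bucket extracted tasks by sentiment, then render a
// simple <h3> + <ul> section per bucket for the Gmail HTML body.
function buildEmailHtml(tasks) {
  const groups = {};
  for (const t of tasks) {
    const key = t.sentiment ?? "neutral"; // default bucket for unlabeled tasks
    (groups[key] ??= []).push(t.task);
  }
  const sections = Object.entries(groups).map(
    ([sentiment, items]) =>
      `<h3>${sentiment}</h3><ul>${items.map((i) => `<li>${i}</li>`).join("")}</ul>`
  );
  return `<div>${sections.join("")}</div>`;
}

// In the n8n Code node, the result would be exposed as email_html, e.g.:
// return [{ json: { email_html: buildEmailHtml($json.tasks) } }];
```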
by Fahmi Fahreza
Sign up for Decodo HERE for a discount.

Automatically scrape, structure, and log forum or news content using Decodo and Google Gemini AI. This workflow extracts key details like titles, URLs, authors, and engagement stats, then appends them to a Google Sheet for tracking and analysis.

## Who’s it for?

Ideal for data journalists, market researchers, or AI enthusiasts who want to monitor trending topics across specific domains.

## How it works

1. **Trigger:** The workflow runs on a schedule.
2. **Data Setup:** Defines forum URLs and geolocation.
3. **Scraping:** Extracts raw text data using the Decodo API.
4. **AI Extraction:** Gemini parses and structures the scraped text into clean JSON.
5. **Data Storage:** Each news item is appended or updated in Google Sheets.
6. **Logging:** Records scraping results for monitoring and debugging.

## How to set up

1. Add your Decodo, Google Gemini, and Google Sheets credentials in n8n.
2. Adjust the forum URLs, geolocation, and Google Sheet ID in the Workflow Config node.
3. Set your preferred trigger interval in the Schedule Trigger.
4. Activate and monitor from the n8n dashboard.
by AFK Crypto
Try It Out!

# 🚀 Reddit Crypto Intelligence & Market Spike Detector

## 🧠 Workflow Description

Reddit Crypto Intelligence & Market Spike Detector is an automated market sentiment and price-monitoring workflow that connects social chatter with real-time crypto price analytics. It continuously scans new posts from r/CryptoCurrency, extracts recently mentioned coins, checks live price movements via CoinGecko, and alerts you on Discord when a significant spike or drop occurs. This automation empowers traders, analysts, and communities to spot early market trends before they become mainstream — all using free APIs and open data.

## ⚙️ How It Works

1. **Monitor Reddit Activity**
   - Automatically fetches the latest posts from r/CryptoCurrency using Reddit’s free RSS feed.
   - Captures trending titles, post timestamps, and mentions of coins or tokens (e.g., $BTC, $ETH, $SOL, $PEPE).
2. **Extract Coin Mentions**
   - A Code node parses the feed using the regex `\$[A-Za-z0-9]{2,10}` to identify any symbols or tickers discussed.
   - Removes duplicates and normalizes all results for accurate data mapping.
3. **Fetch Market Data**
   - Each detected coin symbol is matched against CoinGecko’s public API to fetch live market data, including current price, market rank, and 24-hour price change.
   - No API key required — a completely free and reliable source.
4. **Detect Market Movement**
   - A second Code node filters the fetched data to identify price movements greater than ±5% within the last 24 hours.
   - This isolates meaningful market action from routine fluctuations.
5. **Generate and Send Alerts**
   - When a spike or dip is detected, the workflow composes a rich alert message including:
     - 💎 Coin name and symbol
     - 💰 Current price
     - 📈 24h percentage change
     - 🕒 Timestamp of detection
   - The message is sent automatically to your Discord channel using a preconfigured webhook.

## 💬 Example Output

🚨 Crypto Reddit Mention & Price Spike Alert! 🚨

💎 ETHEREUM (ETH)
💰 $3,945.23
📈 Change: +6.12%

💎 SOLANA (SOL)
💰 $145.88
📈 Change: +8.47%

🕒 Checked at: 2025-10-31T15:00:00Z

If no coins cross the ±5% threshold: “No price spikes detected in the latest Reddit check.” 🔔 #MarketIntel #CryptoSentiment #PriceAlert

## 🪄 Key Features

- 🧠 **Social + Market Intelligence** – Combines Reddit sentiment with live market data to detect potential early signals.
- 🔎 **Automated Coin Detection** – Dynamically identifies newly discussed tokens from live posts.
- 📊 **Smart Spike Filtering** – Highlights only meaningful movements above configurable thresholds.
- 💬 **Discord Alerts** – Delivers clear, structured, timestamped alerts to your community automatically.
- ⚙️ **Fully No-Cost Stack** – Uses free Reddit and CoinGecko APIs with no authentication required.

## 🧩 Use Cases

- **Crypto Traders:** Detect early hype or momentum shifts driven by social chatter.
- **Analysts:** Automate social sentiment tracking tied directly to live market metrics.
- **Community Managers:** Keep members informed about trending coins automatically.
- **Bots & AI Assistants:** Integrate this logic to enhance automated trading signals or alpha alerts.

## 🧰 Required Setup

- **Discord webhook URL** – for automatic alert posting.
- **CoinGecko API endpoint** (optional) – no API key required.
- **n8n instance** – self-hosted or Cloud; the free tier is sufficient.
- **Workflow schedule** – hourly recommended (Cron node interval = 1 hour).

AFK Crypto – Website: afkcrypto.com
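The two Code-node steps described above (ticker extraction and spike filtering) can be sketched as follows. The regex comes straight from the description; `price_change_percentage_24h` is the field name used by CoinGecko's markets endpoint, and the 5% threshold is the configurable default.

```javascript
// Extract $TICKER mentions from Reddit post text, strip the "$", uppercase,
// and de-duplicate while preserving first-seen order.
function extractTickers(text) {
  const matches = text.match(/\$[A-Za-z0-9]{2,10}/g) ?? [];
  return [...new Set(matches.map((m) => m.slice(1).toUpperCase()))];
}

// Keep only coins whose 24h change exceeds the configured threshold (±5%).
function filterSpikes(coins, thresholdPct = 5) {
  return coins.filter(
    (c) => Math.abs(c.price_change_percentage_24h) >= thresholdPct
  );
}
```

For example, a post titled “Loving $BTC and $btc, also $SOL!” yields `["BTC", "SOL"]`, and only coins moving more than ±5% in 24 hours survive the filter.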
by Edson Encinas
## 🧩 Template Description

File Hash Reputation Checker is a security automation workflow that validates file hashes (MD5, SHA1, SHA256) and checks their reputation using the VirusTotal API. It is designed for SOC teams, security engineers, and automation pipelines that need fast, consistent malware verdicts from a single hash input.

The workflow supports two input methods:

- An HTTP webhook for API-based integrations
- A Slack slash command (`/hash-check`) for quick analyst-driven checks directly from Slack

Once a hash is submitted, the workflow normalizes and validates the input, queries VirusTotal for detection statistics, and determines whether the file is Malicious, Suspicious, Clean, or Unknown. Results are returned as a structured JSON response and also posted to Slack with severity-based formatting.

## ⚙️ How It Works

1. A file hash is submitted via HTTP POST or via Slack using `/hash-check FILE_HASH`.
2. The hash is normalized (lowercased and trimmed).
3. The workflow validates the hash format (MD5, SHA1, or SHA256).
4. VirusTotal is queried for hash reputation data.
5. Detection statistics are analyzed to calculate a verdict: Malicious, Suspicious, Clean, or Unknown.
6. A Slack message is sent for all verdicts, with alert-style formatting for malicious results.
7. A structured JSON response is returned to the requester.

## 🛠️ Setup Steps

1. **VirusTotal API**
   - Create or use an existing VirusTotal account.
   - Add your API key to n8n as VirusTotal API credentials.
2. **Slack Configuration**
   - Create a Slack App.
   - Enable Slash Commands and create `/hash-check`.
   - Set the Request URL to the n8n webhook endpoint.
   - Connect your Slack account in n8n credentials.
3. **Activate the Workflow**
   - Activate the workflow in n8n.
   - Test via HTTP POST: `{ "text": "file_hash" }`
   - Or via Slack: `/hash-check FILE_HASH`

## 🎛️ Customization Ideas

- Route Slack messages to different channels based on severity.
- Add additional outputs (email, SIEM, ticketing systems).
- Extend the workflow to support multiple hashes per request.
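The normalize-and-validate step (steps 2–3 above) comes down to a length and hex-character check, since MD5, SHA1, and SHA256 digests are 32, 40, and 64 hex characters respectively. A minimal sketch:

```javascript
// Normalize a submitted hash and classify it by digest length.
// Returns "md5", "sha1", "sha256", or null for anything invalid.
function classifyHash(input) {
  const hash = input.trim().toLowerCase();
  if (!/^[0-9a-f]+$/.test(hash)) return null; // non-hex characters: reject
  const types = { 32: "md5", 40: "sha1", 64: "sha256" };
  return types[hash.length] ?? null;
}
```

Invalid inputs (wrong length or non-hex characters) can short-circuit to an error response before any VirusTotal query is made, saving API quota.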
by browseract
## How it works

This workflow uses BrowserAct to run an AI-powered browser automation that collects structured product data, including image URLs and related metadata. The workflow then:

1. Parses the BrowserAct output into individual product items
2. Iterates through each product entry
3. Downloads the product image and converts it into Base64 format
4. Sends the image together with a predefined prompt to an AI video generation API
5. Polls the generation status until the video is ready
6. Downloads the generated short video file
7. Uploads both the original product image and the generated video to Google Drive

Each product is processed independently, making the workflow suitable for batch-based and scalable automation scenarios.

## Set up steps

1. Connect your BrowserAct account to enable the browser-based data extraction workflow.
2. Connect a Google Drive account where source images and generated videos will be stored.
3. Review the input parameters provided by the BrowserAct node, such as target URL, search keyword, or data limit.
4. Adjust the product processing limit or batch size if you want to control execution time.
5. Run the workflow manually once to verify the output before using it in regular automation.

Additional explanations and configuration details are provided as sticky notes directly inside the workflow.

## Workflow Guidance and Showcase

https://www.youtube.com/watch?v=XS5vyh-bdz0
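The Base64 conversion step (step 3 above) is, at its core, a buffer-to-string encoding. A minimal sketch, assuming the video API accepts a data URI (check your provider’s docs for whether it wants a raw Base64 string instead):

```javascript
// Encode downloaded image bytes as a Base64 data URI for the video API.
function toBase64DataUri(bytes, mimeType = "image/jpeg") {
  const b64 = Buffer.from(bytes).toString("base64");
  return `data:${mimeType};base64,${b64}`;
}

// Inside an n8n Code node, the downloaded image's bytes would typically be
// read from a binary property (the property name "data" is an assumption
// depending on the upstream HTTP Request node), e.g.:
// const buffer = await this.helpers.getBinaryDataBuffer(0, "data");
// const dataUri = toBase64DataUri(buffer, "image/jpeg");
```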
by Rajeet Nair
# Autonomous PostgreSQL Data Quality Monitoring & Remediation

## Overview

This workflow automatically monitors PostgreSQL database data quality and detects structural or statistical anomalies before they impact analytics, pipelines, or applications. Running every 6 hours, it scans database metadata, table statistics, and historical baselines to identify:

- Schema drift
- Null value explosions
- Abnormal data distributions

Detected issues are evaluated using a confidence scoring system that considers severity, frequency, and affected data volume. When issues exceed the defined threshold, the workflow generates SQL remediation suggestions, logs the issue to an audit table, and sends alerts to Slack. This automation enables teams to proactively maintain database reliability, detect unexpected schema changes, and respond quickly to data quality problems.

## How It Works

### 1. Scheduled Monitoring

A Schedule Trigger starts the workflow every 6 hours to run automated database quality checks.

### 2. Metadata & Statistics Collection

The workflow retrieves important metadata from PostgreSQL:

- **Schema metadata** from `information_schema.columns`
- **Table statistics** from `pg_stat_user_tables`
- **Historical baselines** from a baseline tracking table

These datasets allow the workflow to compare current database conditions against historical norms.

### 3. Data Quality Detection Engine

Three parallel detection checks analyze the database:

**Schema Drift Detection**
- Identifies new tables or columns
- Detects removed columns or tables
- Detects datatype or nullability changes

**Null Explosion Detection**
- Calculates the null percentage per column
- Flags columns exceeding configured null thresholds

**Outlier Distribution Detection**
- Compares current column statistics against historical baselines
- Uses statistical deviation (z-score) to detect abnormal distributions

### 4. Issue Aggregation & Confidence Scoring

All detected issues are aggregated and evaluated using a confidence scoring system based on:

- Severity of the issue
- Data volume affected
- Historical frequency
- Consistency of detection

Only issues above the configured confidence threshold proceed to remediation.

### 5. SQL Remediation Suggestions

For high-confidence issues, the workflow automatically generates SQL investigation or remediation queries, such as:

- `ALTER TABLE` fixes
- NULL cleanup queries
- Outlier review queries

### 6. Logging & Alerting

Confirmed issues are:

- Stored in a PostgreSQL audit table
- Sent as alerts to Slack

### 7. Baseline Updates

Finally, the workflow updates the data quality baseline table, improving anomaly detection accuracy in future runs.

## Setup Instructions

1. Configure a PostgreSQL credential in n8n.
2. Replace `<target schema name>` in the SQL queries with your database schema.
3. Create the following tables in PostgreSQL:
   - **Audit table** (`data_quality_audit`): stores detected data quality issues and remediation suggestions.
   - **Baseline table** (`data_quality_baselines`): stores historical statistics used for anomaly detection.
4. Configure your Slack credential.
5. Replace the placeholder Slack channel ID in the Send Alert to Team node.
6. Optional configuration parameters can be modified in the Workflow Configuration node: `confidenceThreshold`, `maxNullPercentage`, `outlierStdDevThreshold`, `auditTableName`, `baselineTableName`.

## Use Cases

- **Database Reliability Monitoring:** Detect unexpected schema changes or structural modifications in production databases.
- **Data Pipeline Validation:** Identify anomalies in datasets used by ETL pipelines before they propagate errors downstream.
- **Analytics Data Quality Monitoring:** Prevent reporting inaccuracies caused by missing data or abnormal values.
- **Production Database Observability:** Provide automated alerts when critical database quality issues occur.
- **Data Governance & Compliance:** Maintain a historical audit log of database quality issues and remediation actions.

## Requirements

This workflow requires the following services:

- PostgreSQL database
- Slack workspace
- n8n

Nodes used: Schedule Trigger, Set, Postgres, Code (Python), Aggregate, IF, Slack

## Key Features

- Automated database health monitoring
- Schema drift detection
- Null explosion detection
- Statistical anomaly detection
- Confidence-based issue filtering
- Automated SQL remediation suggestions
- Slack alerting
- Historical baseline learning system

## Summary

This workflow provides an automated data quality monitoring system for PostgreSQL. It continuously analyzes schema structure, column statistics, and historical baselines to detect anomalies, generate remediation suggestions, and notify teams in real time. By automating database quality checks, teams can identify issues early, reduce debugging time, and maintain reliable data pipelines.
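The z-score check behind the outlier detection step is simple enough to sketch (the template’s Code node uses Python; the logic is shown here in JavaScript for illustration and is language-agnostic). A metric is flagged when it deviates from its historical baseline by more than `outlierStdDevThreshold` standard deviations:

```javascript
// Flag a column metric as an outlier when it deviates from its historical
// baseline by more than `stdDevThreshold` standard deviations.
function isOutlier(current, baselineMean, baselineStdDev, stdDevThreshold = 3) {
  // A zero std dev means the metric was historically constant:
  // any change at all is then anomalous.
  if (baselineStdDev === 0) return current !== baselineMean;
  const z = Math.abs(current - baselineMean) / baselineStdDev;
  return z > stdDevThreshold;
}
```

For example, with a baseline mean of 50 and std dev of 10, a current value of 100 (z = 5) is flagged, while 55 (z = 0.5) is not.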
by Gilbert Onyebuchi
This workflow automates the full invoicing and payment process using n8n and Xero. It allows businesses to generate invoices, track payments, send WhatsApp notifications, and keep records synced automatically, without manual follow-ups or repetitive admin work. It’s designed to plug into your existing tools and scale as your operations grow.

## How It Works

1. A webhook receives invoice or payment data from your app, form, or system.
2. Xero automatically creates or updates the invoice.
3. Payments are tracked and verified in real time.
4. Clients receive WhatsApp notifications for invoices, reminders, or payments.
5. All records are logged in a database and synced to Google Calendar and Google Sheets.
6. Automated responses confirm successful actions or errors.

Everything runs in the background once connected.

## Setup

1. Connect your Xero account to n8n.
2. Set up a database (PostgreSQL via Supabase) for logging invoices and payments.
3. Connect Google Calendar for scheduling and tracking.
4. Connect Twilio WhatsApp for client notifications.
5. Point your system or payment source to the provided webhook URL.

No complex coding required. I guide you through the setup and ensure everything is tested.

## Need Help or Customization?

If you’d like this workflow customized for your business or want help setting it up properly, feel free to reach out.

🔗 Connect with me on LinkedIn: 👉 Click here to connect

I’m happy to walk you through it or adapt it to your specific use case.
by Fahmi Fahreza
# TikTok Trend Analyzer with Apify + Gemini + Airtable

Automatically scrape trending TikTok videos, analyze their virality using Gemini AI, and store insights directly in Airtable for creative research or content planning.

## Who’s it for?

Marketing analysts, creators, and creative agencies looking to understand why videos go viral and how to replicate successful hooks and formats.

## How it works

1. A scheduled trigger runs the Apify TikTok Trends Scraper weekly.
2. The scraper collects trending video metadata.
3. Data is stored in Airtable (views, likes, captions, sounds, etc.).
4. When a specific video is submitted via webhook, the workflow fetches it from Airtable.
5. Gemini AI analyzes the video and extracts structured insights: summary, visual hook, audio, and subtitle analysis.
6. The workflow updates the Airtable record with these AI insights.

## How to set up

Connect Apify and Airtable credentials, link Gemini or OpenAI keys, and adjust the schedule frequency. Add your Airtable base and table IDs. You can trigger analysis manually via the webhook endpoint.
by Rahul Joshi
## Description

Turn incoming Gmail messages into Zendesk tickets and keep a synchronized log in Google Sheets. The workflow uses Gmail as the trigger, creates Zendesk tickets, and appends or updates a central sheet for tracking, giving you a clean, auditable pipeline from inbox to support queue.

## ✨ What This Template Does

- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes the Gmail payload for consistent fields. 🧹
- Creates a Zendesk ticket from the email content. 🎫
- Formats data for Sheets and appends or updates a row. 📊
- Executes helper sub-workflows and writes logs for traceability. 🔁🧾

## Key Benefits

- Converts emails to actionable support tickets automatically. ⚡
- Maintains a single source of truth in Google Sheets. 📒
- Reduces manual triage and data entry. 🕒
- Improves accountability with structured logs. ✅

## Features

- Gmail Trigger for real-time intake. ⏱️
- Normalize Gmail Data step for consistent fields. 🧩
- Create Zendesk Ticket (create: ticket). 🎟️
- Format Sheet Data for clean columns. 🧱
- Log to Google Sheets with appendOrUpdate. 🔄
- Execute Workflow (sub-workflow) steps for modularity. 🧩

## Requirements

- n8n instance (cloud or self-hosted). 🛠️
- Gmail credentials configured in n8n (with read access to the monitored inbox). ✉️
- Zendesk credentials (API token or OAuth) with permission to create tickets. 🔐
- Google Sheets credentials with access to the target spreadsheet for append/update. 📊
- Access to any sub-workflows referenced by the Execute Workflow nodes. 🔁

## Target Audience

- IT support and helpdesk teams managing email-based requests. 🖥️
- Ops teams needing auditable intake logs. 🧾
- Agencies and service providers converting client emails to tickets. 🤝
- Small teams standardizing email-to-ticket flows. 🧑‍💼

## Step-by-Step Setup Instructions

1. Connect Gmail, Zendesk, and Google Sheets in n8n Credentials. 🔑
2. Set the Gmail Trigger to watch the desired label/inbox. 📨
3. Map Zendesk fields (description) from the normalized Gmail data. 🧭
4. Point the Google Sheets node to your spreadsheet and confirm appendOrUpdate mode. 📄
5. Assign credentials to all nodes, including any Execute Workflow steps. 🔁
6. Run once to test end-to-end, then activate the workflow. ✅
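The “Normalize Gmail Data” step described above could be sketched as below. The property paths (`id`, `from`, `subject`, `text`, `snippet`, `date`) are assumptions; the actual shape of a Gmail Trigger item varies with the node’s Simplify setting, so map them against a real execution.

```javascript
// Hypothetical sketch: pull a consistent set of fields from a Gmail Trigger
// item so downstream Zendesk and Sheets nodes can rely on stable names.
function normalizeEmail(msg) {
  return {
    messageId: msg.id ?? "",
    from: msg.from ?? "",
    subject: (msg.subject ?? "(no subject)").trim(),
    // Prefer the full text body, fall back to Gmail's snippet preview.
    body: (msg.text ?? msg.snippet ?? "").trim(),
    receivedAt: msg.date ?? new Date().toISOString(),
  };
}

// In an n8n Code node:
// return items.map((item) => ({ json: normalizeEmail(item.json) }));
```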
by M Ayoub
## Who is this for?

Crypto traders, investors, and enthusiasts who want automated daily market analysis delivered to Discord without manually checking multiple data sources.

## What it does

Fetches real-time cryptocurrency data from six free APIs, analyzes market sentiment and indicators using Google Gemini AI, and sends beautifully formatted investment recommendations to your Discord channel.

✅ Uses free APIs only: CoinGecko, Yahoo Finance, Alternative.me, and OKX public endpoints require no paid subscriptions or API keys.

## How it works

1. Triggers daily at the scheduled time (default: 5 PM).
2. Fetches BTC/ETH/USDT prices and global market metrics from CoinGecko.
3. Gets the DXY (US Dollar Index) from Yahoo Finance.
4. Retrieves the Fear & Greed Index from Alternative.me.
5. Pulls BTC and ETH funding rates from OKX.
6. Combines all data and builds a comprehensive analysis prompt.
7. Gemini AI analyzes correlations and sentiment and provides an investment stance.
8. Formats a rich Discord embed with market overview, dominance metrics, indicators, and AI insights.
9. Sends the alert to your Discord webhook.

## Set up steps

1. Get a free Google Gemini API key from Google AI Studio.
2. Create a Discord webhook in your server (Server Settings → Integrations → Webhooks).
3. Connect your Gemini API credentials to the Gemini AI Analysis node.
4. Update the webhook URL in the Send to Discord Webhook node with your Discord webhook URL.
5. Optionally adjust the trigger time in the Daily Schedule Trigger (5PM) node.

Setup time: ~5 minutes
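The embed-formatting step can be sketched as below. Discord webhooks accept a JSON body with an `embeds` array of title/description/color/fields/timestamp objects; the analysis field names (`summary`, `stance`, `btcPrice`, `fearGreed`) and the colors are illustrative assumptions about what the Gemini node outputs.

```javascript
// Hypothetical sketch: shape the AI analysis into a Discord webhook payload.
function buildDiscordPayload(analysis) {
  return {
    embeds: [
      {
        title: "📊 Daily Crypto Market Analysis",
        description: analysis.summary,
        // Green for a bullish stance, red otherwise (illustrative choice).
        color: analysis.stance === "bullish" ? 0x2ecc71 : 0xe74c3c,
        fields: [
          { name: "BTC Price", value: `$${analysis.btcPrice}`, inline: true },
          { name: "Fear & Greed", value: String(analysis.fearGreed), inline: true },
        ],
        timestamp: new Date().toISOString(),
      },
    ],
  };
}

// The HTTP Request node then POSTs this object as the JSON body to your
// Discord webhook URL.
```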