by Daniel
Transform any website into a structured knowledge repository with this intelligent crawler. It extracts hyperlinks from the homepage, separates image assets from content pages, and aggregates full Markdown-formatted content: perfect for fueling AI agents or building comprehensive company dossiers without manual effort.

## 📋 What This Template Does

This workflow acts as a lightweight web crawler: it scrapes the homepage to discover all internal links (mimicking a sitemap extraction), deduplicates and validates them, separates image assets from textual pages, then fetches the non-image pages and converts their content to clean Markdown. Results are appended to Google Sheets for easy analysis, export, or integration into vector databases.

- Automatically discovers and processes subpage links from the homepage
- Filters out duplicates and non-HTTP links for efficient crawling
- Converts scraped content to Markdown for AI-ready formatting
- Categorizes and stores images, links, and full content in a single sheet row per site

## 🔧 Prerequisites

- Google account with Sheets access for data storage
- n8n instance (cloud or self-hosted)
- Basic understanding of URLs and web links

## 🔑 Required Credentials

### Google Sheets OAuth2 API Setup

1. Go to console.cloud.google.com → APIs & Services → Credentials
2. Click "Create Credentials" → Select "OAuth client ID" → Choose "Web application"
3. Add the authorized redirect URI: `https://your-n8n-instance.com/rest/oauth2-credential/callback` (replace with your n8n URL)
4. Download the client ID and secret, then add them to n8n as a "Google Sheets OAuth2 API" credential
5. During setup, grant access to the Google Sheets scopes (e.g., `spreadsheets`) and test the connection by listing a sheet

## ⚙️ Configuration Steps

1. Import the workflow JSON into your n8n instance
2. In the "Set Website" node, update the `website_url` value to your target site (e.g., https://example.com)
3. Assign your Google Sheets credential to the three "Add ... to Sheet" nodes
4. Update the `documentId` and `sheetName` in those nodes to your target spreadsheet ID and sheet name/ID
5. Ensure your sheet has the columns: "Website", "Links", "Scraped Content", "Images"
6. Activate the workflow and trigger it manually to test scraping

## 🎯 Use Cases

- **Knowledge base creation**: Crawl a company's site to aggregate all content into Sheets, then export to Notion or a vector DB for internal wikis
- **AI agent training**: Extract structured Markdown from industry sites to fine-tune LLMs on domain-specific data like legal docs or tech blogs
- **Competitor intelligence**: Build dossiers by crawling rival websites, separating assets and text for SEO audits or market analysis
- **Content archiving**: Preserve dynamic sites (e.g., news portals) as static knowledge dumps for compliance or historical research

## ⚠️ Troubleshooting

- **No links extracted**: Verify the homepage has `<a>` tags; test with a simple site like example.com and check the HTTP response in executions
- **Sheet update fails**: Confirm column names match exactly (case-sensitive) and the credential has edit permissions; try a new blank sheet
- **Content truncated**: Google Sheets limits cells to ~50k characters. Adjust the `.slice(0, 50000)` in "Add Scraped Content to Sheet" or split content into multiple rows
- **Rate limiting errors**: Add a "Wait" node after "Scrape Links" with a 1–2 s delay if the site blocks rapid requests
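The dedup/filter step and the `.slice(0, 50000)` cell cap mentioned above can be sketched in plain JavaScript. This is a minimal illustration only; the function names and exact filtering rules are assumptions, not the template's actual node code:

```javascript
// Sketch of the link filtering a Code node might perform (illustrative).
const IMAGE_EXT = /\.(png|jpe?g|gif|svg|webp|ico)(\?.*)?$/i;

function classifyLinks(links, baseUrl) {
  const seen = new Set();
  const images = [];
  const pages = [];
  for (const raw of links) {
    let url;
    try {
      url = new URL(raw, baseUrl).href;   // resolve relative links
    } catch {
      continue;                            // skip malformed URLs
    }
    if (!url.startsWith("http")) continue; // drop mailto:, tel:, javascript:
    if (seen.has(url)) continue;           // deduplicate
    seen.add(url);
    (IMAGE_EXT.test(url) ? images : pages).push(url);
  }
  return { images, pages };
}

// Google Sheets caps a cell at roughly 50k characters, hence the slice.
const truncateForSheet = (markdown) => markdown.slice(0, 50000);
```

If you need more than 50k characters per page, splitting the Markdown across several rows keyed by URL is the usual workaround.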
by vinci-king-01
# Social Media Sentiment Analysis Dashboard with AI and Real-time Monitoring

## 🎯 Target Audience

- Social media managers and community managers
- Marketing teams monitoring brand reputation
- PR professionals tracking public sentiment
- Customer service teams identifying trending issues
- Business analysts measuring social media ROI
- Brand managers protecting brand reputation
- Product managers gathering user feedback

## 🚀 Problem Statement

Manual social media monitoring is overwhelming and often misses critical sentiment shifts or trending topics. This template solves the challenge of automatically collecting, analyzing, and visualizing social media sentiment data across multiple platforms to provide actionable insights for brand management and customer engagement.

## 🔧 How it Works

This workflow automatically monitors social media platforms using AI-powered sentiment analysis, processes mentions and conversations, and provides real-time insights through a comprehensive dashboard.

### Key Components

- **Scheduled Trigger**: Runs the workflow at specified intervals to maintain real-time monitoring
- **AI-Powered Sentiment Analysis**: Uses advanced NLP to analyze sentiment, emotions, and topics
- **Multi-Platform Integration**: Monitors Twitter, Reddit, and other social platforms
- **Real-time Alerting**: Sends notifications for critical sentiment changes or viral content
- **Dashboard Integration**: Stores all data in Google Sheets for comprehensive analysis and reporting

## 📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheet:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the mention was recorded | "2024-01-15T10:30:00Z" |
| platform | String | Social media platform | "Twitter" |
| username | String | User who posted the content | "@john_doe" |
| content | String | Full text of the post/comment | "Love the new product features!" |
| sentiment_score | Number | Sentiment score (-1 to 1) | 0.85 |
| sentiment_label | String | Sentiment classification | "Positive" |
| emotion | String | Primary emotion detected | "Joy" |
| topics | Array | Key topics identified | ["product", "features"] |
| engagement | Number | Likes, shares, comments | 1250 |
| reach_estimate | Number | Estimated reach | 50000 |
| influence_score | Number | User influence metric | 0.75 |
| alert_priority | String | Alert priority level | "High" |

## 🛠️ Setup Instructions

Estimated setup time: 20–25 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Social media API access (Twitter, Reddit, etc.)

### Step-by-Step Configuration

#### 1. Install Community Nodes

```bash
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

#### 2. Configure ScrapeGraphAI Credentials

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up Google Sheets Connection

- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for sentiment analysis data
- Configure the sheet name (default: "Sentiment Analysis")

#### 4. Configure Social Media Monitoring

- Update the `websiteUrl` parameters in the ScrapeGraphAI nodes
- Add URLs for the social media platforms you want to monitor
- Customize the user prompt to extract specific sentiment data
- Set up keywords, hashtags, and brand mentions to track

#### 5. Set up Notification Channels

- Configure Slack webhook or API credentials
- Set up email service credentials for alerts
- Define sentiment thresholds for different alert levels
- Test notification delivery

#### 6. Configure Schedule Trigger

- Set the monitoring frequency (every 15 minutes, hourly, etc.)
- Choose appropriate time zones for your business hours
- Consider social media platform rate limits

#### 7. Test and Validate

- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test sentiment analysis with sample content

## 🔄 Workflow Customization Options

### Modify Monitoring Targets

- Add or remove social media platforms
- Change keywords, hashtags, or brand mentions
- Adjust monitoring frequency based on platform activity

### Extend Sentiment Analysis

- Add more sophisticated emotion detection
- Implement topic clustering and trend analysis
- Include influencer identification and scoring

### Customize Alert System

- Set different thresholds for different sentiment levels
- Create tiered alert systems (info, warning, critical)
- Add sentiment trend analysis and predictions

### Output Customization

- Add data visualization and reporting features
- Implement sentiment trend charts and graphs
- Create executive dashboards with key metrics
- Add competitor sentiment comparison

## 📈 Use Cases

- **Brand Reputation Management**: Monitor and respond to brand mentions
- **Crisis Management**: Detect and respond to negative sentiment quickly
- **Customer Feedback Analysis**: Understand customer satisfaction and pain points
- **Product Launch Monitoring**: Track sentiment around new product releases
- **Competitor Analysis**: Monitor competitor sentiment and engagement
- **Influencer Identification**: Find and engage with influential users

## 🚨 Important Notes

- Respect social media platforms' terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring keywords and parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider privacy implications and data protection regulations

## 🔧 Troubleshooting

Common issues:

- **ScrapeGraphAI connection errors**: Verify the API key and account status
- **Google Sheets permission errors**: Check the OAuth2 scope and permissions
- **Sentiment analysis errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust the monitoring frequency and implement delays
- **Alert delivery failures**: Check the notification service credentials

Support resources:

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Social media platform API documentation
- Sentiment analysis best practices and guidelines
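As a rough illustration of how a Code node could derive the `sentiment_label` and `alert_priority` columns from a raw `sentiment_score` in the -1 to 1 range, consider the sketch below. The thresholds and field names are assumptions for illustration, not values taken from the template itself:

```javascript
// Illustrative mapping from a sentiment score (-1..1) to the sheet's
// sentiment_label and alert_priority columns. Thresholds are assumptions.
function labelSentiment(score) {
  if (score > 0.2) return "Positive";
  if (score < -0.2) return "Negative";
  return "Neutral";
}

function alertPriority(score, reachEstimate) {
  // Strongly negative posts with wide reach deserve immediate attention.
  if (score < -0.6 && reachEstimate > 10000) return "High";
  if (score < -0.2) return "Medium";
  return "Low";
}

function toSheetRow(mention) {
  return {
    timestamp: mention.timestamp,
    platform: mention.platform,
    username: mention.username,
    content: mention.content,
    sentiment_score: mention.score,
    sentiment_label: labelSentiment(mention.score),
    alert_priority: alertPriority(mention.score, mention.reach),
  };
}
```

Tuning these thresholds per platform (Twitter tends to be noisier than Reddit, for example) is a common refinement.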
by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates the data, updates databases, and sends welcome communications to students and guardians.

## Main Components

- **Trigger at Every Day 7 am**: Scheduled trigger that runs the workflow daily
- **Read Student Data**: Reads pending applications from Excel/database
- **Validate Application Data**: Checks data completeness and format
- **Process Application Data**: Processes validated applications
- **Update Student Database**: Updates records in the student database
- **Prepare Welcome Email**: Creates personalized welcome messages
- **Send Email**: Sends welcome emails to students/guardians
- **Success Response**: Confirms successful processing
- **Error Response**: Handles any processing errors

## Essential Prerequisites

- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

## Required Excel File Structure (student_applications.xlsx)

Application ID | First Name | Last Name | Email | Phone | Program Interest | Grade Level | School | Guardian Name | Guardian Phone | Application Date | Status | Notes

## Expected Input Data Format

```json
{
  "firstName": "John",
  "lastName": "Doe",
  "email": "john.doe@example.com",
  "phone": "+1234567890",
  "program": "Computer Science",
  "gradeLevel": "10th Grade",
  "school": "City High School",
  "guardianName": "Jane Doe",
  "guardianPhone": "+1234567891"
}
```

## Key Features

- ⏰ **Scheduled Processing**: Runs daily at 7 AM automatically
- 📊 **Data Validation**: Ensures application completeness
- 💾 **Database Updates**: Maintains student records
- 📧 **Auto Emails**: Sends welcome messages
- ❌ **Error Handling**: Manages processing failures

## Quick Setup

1. Import the workflow JSON into n8n
2. Configure the schedule trigger (default: 7 AM daily)
3. Set the Excel file path in the "Read Student Data" node
4. Configure the database connection in the "Update Student Database" node
5. Add SMTP settings in the "Send Email" node
6. Test with sample data
7. Activate the workflow

## Parameters to Configure

- `excel_file_path`: Path to the student applications file
- `database_connection`: Student database credentials
- `smtp_host`: Email server address
- `smtp_user`: Email username
- `smtp_password`: Email password
- `admin_email`: Administrator notification email
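The "Validate Application Data" step could look something like the sketch below in an n8n Code node. This is an illustration only; the required-field list follows the expected input format shown above, and the exact rules in the template may differ:

```javascript
// Sketch of application validation (illustrative, not the template's exact code).
const REQUIRED = ["firstName", "lastName", "email", "phone",
                  "program", "gradeLevel", "guardianName"];
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function validateApplication(app) {
  const errors = [];
  for (const field of REQUIRED) {
    if (!app[field] || String(app[field]).trim() === "") {
      errors.push(`Missing required field: ${field}`);
    }
  }
  if (app.email && !EMAIL_RE.test(app.email)) {
    errors.push("Invalid email format");
  }
  return { valid: errors.length === 0, errors };
}
```

Invalid applications can then be routed to the Error Response branch with the `errors` array attached, so the administrator email explains exactly what was missing.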
by Eugen
## 👥 Who the Automation is for

This automation is perfect for bloggers, solopreneurs, business owners, and marketing teams who want to scale SEO content creation. Instead of spending hours on research and drafting, you can go from a single keyword idea to a ready-to-edit WordPress draft in minutes.

## ⚙️ How the Automation Works

1. Collect keywords in a Google Sheet and mark the ones you want as "prioritized."
2. Click "Prepare Content": your keyword(s) are sent to n8n.
3. n8n pulls the top 10 Google SERP results.
4. AI analyzes competitors (tone, content type, gaps) and creates a content brief.
5. Another AI generates a blog draft based on the brief.
6. The draft is automatically uploaded to WordPress and your sheet updates.

👉 In short: Keyword → SERP → Brief → Draft → WordPress.

## 🛠 How to Set Up

Full Setup Guide

1. Copy the Google Sheets Template.
2. Import the workflow into n8n.
3. Add your API keys: Google Custom Search, Claude AI, and WordPress credentials.
4. Test the webhook connection from Google Sheets.

🎉 Done: you now have a one-click pipeline from keyword idea to WordPress draft.
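The "n8n pulls the top 10 Google SERP results" step maps to a Google Custom Search JSON API call. A hedged sketch of how that request could be built and issued (the `apiKey` and `cx` values are placeholders you supply from your own Google credentials; the template's actual HTTP Request node may differ):

```javascript
// Build the Custom Search request URL; num=10 is the API's per-request maximum.
function buildSearchUrl(keyword, apiKey, cx) {
  const url = new URL("https://www.googleapis.com/customsearch/v1");
  url.searchParams.set("key", apiKey);
  url.searchParams.set("cx", cx);
  url.searchParams.set("q", keyword);
  url.searchParams.set("num", "10");
  return url.href;
}

// Fetch and keep only the fields the content-brief step needs.
async function fetchTopResults(keyword, apiKey, cx) {
  const res = await fetch(buildSearchUrl(keyword, apiKey, cx));
  if (!res.ok) throw new Error(`Search failed: ${res.status}`);
  const data = await res.json();
  return (data.items ?? []).map(({ title, link, snippet }) => ({ title, link, snippet }));
}
```

Trimming the response down to title/link/snippet before handing it to the AI keeps the competitor-analysis prompt short and cheap.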
by Rakin Jakaria
## Who this is for

This workflow is for digital marketing agencies or sales teams who want to automatically find business leads based on industry and location, gather their contact details, and send personalized cold emails, all from one form submission.

## What this workflow does

This workflow starts every time someone submits the Lead Machine Form. It then:

1. **Scrapes business data** (company name, website, phone, address, category) using **Apify**, based on business type and location.
2. **Extracts the best email address** from each business website using **Google Gemini AI**.
3. **Stores valid leads** in **Google Sheets**.
4. **Generates cold email content** (subject + body) with AI based on your preferred tone (Friendly, Professional, Simple).
5. **Sends the cold email** via Gmail.
6. **Updates the sheet** with send status and timestamp.

## Setup

To set this workflow up:

1. **Form Trigger**: Customize the "Lead Machine" form fields if needed (Business Type, Location, Lead Number, Email Style).
2. **Apify API**: Add your Apify Actor endpoint URL in the HTTP Request node.
3. **Google Gemini**: Add credentials for extracting email addresses.
4. **Google Sheets**: Connect your sheet for storing leads and email status.
5. **OpenAI**: Add your credentials for cold email generation.
6. **Gmail**: Connect your Gmail account for sending cold emails.

## How to customize this workflow to your needs

- Change the AI email prompt to reflect your brand's voice and offer.
- Add filters to only target leads that meet specific criteria (e.g., website must exist, email must be verified).
- Modify the Google Sheets structure to track extra info like "Follow-up Date" or "Lead Source".
- Switch Gmail to another email provider if preferred.
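The template delegates email extraction to Gemini, but a regex pre-filter over the scraped page text is a common complement (and a cheap fallback when the model finds nothing). A minimal sketch, with illustrative filtering rules:

```javascript
// Extract candidate contact emails from scraped page text (illustrative).
const EMAIL_RE = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;

function extractEmails(pageText) {
  const matches = pageText.match(EMAIL_RE) ?? [];
  // Deduplicate and drop obvious non-contact addresses.
  return [...new Set(matches)].filter(
    (e) => !/\.(png|jpg|gif)$/i.test(e) && !e.toLowerCase().startsWith("noreply")
  );
}
```

Passing only these candidates (instead of the full page) to Gemini to pick the "best" address also trims token costs.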
by moosa
This workflow monitors product prices from BooksToScrape and sends alerts to a Discord channel via webhook when a competitor's price is lower than ours.

## 🧩 Nodes Used

- **Schedule** (for a daily or custom schedule)
- **If** nodes (to check whether checked or unchecked data exists)
- **HTTP Request** (for fetching the product page)
- **Extract HTML** (for extracting the product price)
- **Code** (to clean and extract just the price number)
- **Discord Webhook** (to send Discord alerts)
- **Google Sheets** (to extract and update data)

## 🚀 How to Use

1. Replace the Discord webhook URL with your own.
2. Customize the scraping URL if you're monitoring a different site (see the sheet I used).
3. Run the workflow manually or on a schedule.

## ⚠️ Important

- Do not use this for commercial scraping without permission.
- Ensure the site allows scraping (this example is for learning only).
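The Code node that "cleans and extracts just the price number" could be as simple as the sketch below. BooksToScrape renders prices like "£51.77", so stripping the currency symbol and parsing a float is enough; the comparison helper is illustrative:

```javascript
// Clean a scraped price string like "£51.77" into a number (illustrative).
function parsePrice(raw) {
  const match = String(raw).match(/[\d.]+/);
  return match ? parseFloat(match[0]) : null;
}

// Alert only when the competitor undercuts our price.
function shouldAlert(competitorPriceRaw, ourPrice) {
  const competitor = parsePrice(competitorPriceRaw);
  return competitor !== null && competitor < ourPrice;
}
```

The `shouldAlert` result maps onto the If node: true routes to the Discord webhook, false updates the sheet and stops.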
by Renan Miller
## How it works

This workflow automatically extracts specific data from received emails and saves it into a Google Sheets document for easy tracking and analysis. It connects to a Gmail account, searches for emails received within a defined date range from a specific sender, opens links inside those emails, extracts data from the linked pages (such as case ID, patient name, birth date, complaint, and location), processes and cleans the information using custom JavaScript logic, and finally saves the structured results into a Google Sheet.

## Setup steps

1. Connect Gmail using OAuth2 credentials.
2. Adjust the date filters and sender email in the "Search Emails" node.
3. Customize the CSS selectors in the HTML extraction nodes to match the desired elements from your email or linked page.
4. Open the Code node and modify the logic if you need to calculate or transform additional fields.
5. Link your Google Sheets account and specify the spreadsheet and sheet name where the results will be appended.
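The "processes and cleans the information using custom JavaScript logic" step could resemble the sketch below. The field names and the DD/MM/YYYY input date format are assumptions for illustration; adapt them to whatever your HTML extraction nodes actually return:

```javascript
// Illustrative cleanup logic for the Code node (field names are assumptions).
function cleanRecord(raw) {
  return {
    caseId: raw.caseId?.trim().toUpperCase() ?? "",
    patientName: raw.patientName?.trim().replace(/\s+/g, " ") ?? "",
    birthDate: normalizeDate(raw.birthDate),
    complaint: raw.complaint?.trim() ?? "",
    location: raw.location?.trim() ?? "",
  };
}

// Accepts "DD/MM/YYYY" and returns ISO "YYYY-MM-DD" (assumed input format).
function normalizeDate(s) {
  const m = String(s ?? "").match(/^(\d{2})\/(\d{2})\/(\d{4})$/);
  return m ? `${m[3]}-${m[2]}-${m[1]}` : "";
}
```

Normalizing dates to ISO format before appending keeps Google Sheets sorting and filtering predictable.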
by Daniel Rosehill
# Voice Note Context Extraction Pipeline with AI Agent & Vector Storage

This n8n template demonstrates how to automatically extract and store contextual information from voice notes using AI agents and vector databases for future retrieval.

## How it works

- **Webhook trigger** receives voice note data, including title, transcript, and timestamp, from external services (example here: voicenotes.com)
- **Field extraction** isolates the key data fields (title, transcript, timestamp) for processing
- **AI Context Agent** processes the transcript to extract meaningful context while:
  - Correcting speech-to-text errors
  - Converting first-person references to third-person facts
  - Filtering out casual conversation and focusing on significant information
- **Output formatting** structures the extracted context with timestamps for embedding
- **File conversion** prepares the context data for vector storage
- **Vector embedding** uses OpenAI embeddings to create searchable representations
- **Milvus storage** stores the embedded context for future retrieval in RAG applications

## How to use

1. Configure the webhook endpoint to receive data from your voice note service
2. Set up credentials for OpenRouter (LLM), OpenAI (embeddings), and Milvus (vector storage)
3. Customize the AI agent's system prompt to match your context extraction needs
4. The workflow automatically processes incoming voice notes and stores the extracted context

## Requirements

- OpenRouter account for LLM access
- OpenAI API key for embeddings
- Milvus vector database (cloud or self-hosted)
- Voice note service with webhook capabilities (e.g., Voicenotes.com)

## Customizing this workflow

- **Modify the context extraction prompt** to focus on specific types of information (preferences, facts, relationships)
- **Add filtering logic** to process only voice notes with specific tags or keywords
- **Integrate with other storage systems** like Pinecone, Weaviate, or local vector databases
- **Connect to RAG systems** to use the stored context for enhanced AI conversations
- **Add notification nodes** to confirm successful context extraction and storage

## Use cases

- **Personal AI assistant** that remembers your preferences and context from voice notes
- **Knowledge management** system for capturing insights from recorded thoughts
- **Content creation** pipeline that extracts key themes from voice recordings
- **Research assistant** that builds context from interview transcripts or meeting notes
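The field-extraction and output-formatting steps can be sketched as below. Function names and the exact payload shape are illustrative; the webhook body from your voice note service may nest these fields differently:

```javascript
// Pull only the fields the pipeline needs from the webhook payload.
function extractFields(webhookBody) {
  const { title, transcript, timestamp } = webhookBody;
  return { title, transcript, timestamp };
}

// Prefix the timestamp so time-scoped questions ("what did I say last
// Tuesday?") can be answered from the embedded text later.
function formatForEmbedding(extractedContext, timestamp) {
  return `[${timestamp}] ${extractedContext}`;
}
```

Embedding the timestamp inside the text itself (rather than only as metadata) is a simple way to make it visible to the retriever and the downstream LLM alike.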
by Nick Canfield
## Try It Out!

This n8n template uses AI to automatically respond to your Gmail inbox by drafting responses for your approval via email.

## How it works

- **Gmail Trigger** monitors your inbox for new emails
- **AI Analysis** determines if a response is needed based on your criteria
- **Draft Generation** creates contextually appropriate replies using your business information
- **Human Approval** sends you the draft for review before sending
- **Auto-Send** replies automatically once approved

## Setup

1. Connect your Gmail account to the Gmail Trigger node
2. Update the "Your Information" node with:
   - Entity name and description
   - Approval email address
   - Resource guide (FAQs, policies, key info)
   - Response guidelines (tone, style, formatting preferences)
3. Configure your LLM provider (OpenAI, Claude, Gemini, etc.) with API credentials
4. Test with a sample email

## Requirements

- n8n instance (self-hosted or cloud)
- Gmail account with API access
- LLM provider API key

## Need Help?

Email Nick at nick@tropicflare.com
by Cheng Siong Chin
## Introduction

Automates gold market tracking with AI forecasting by collecting live prices, financial news, and macro indicators (inflation, interest rates, employment) to produce real-time insights and trend predictions for analysts and investors.

## How It Works

Every 6 hours, the system fetches market data and news → runs AI sentiment and trend analysis → generates a concise forecast report → publishes it to WordPress → and alerts users via Slack or email.

## Workflow Template

Trigger → Fetch → Format → Merge → AI Analyze → Report → Publish → Notify

## Workflow Steps

1. **Schedule**: Executes automatically every 6 hours using a Cron trigger.
2. **Fetch**: Retrieves live gold prices (MetalPriceAPI), financial headlines (NewsAPI), and macroeconomic indicators (FRED).
3. **Format & Merge**: Cleans, normalizes, and merges all data into a single structured dataset for AI analysis.
4. **AI Analyze (OpenAI)**: Performs sentiment, trend, and correlation analysis to forecast short-term gold price movements.
5. **Report Generation**: Creates a concise summary report with forecasts, insights, and confidence metrics.
6. **Publish & Notify**: Automatically posts to WordPress and sends alerts via Slack and email to keep analysts updated.

## Setup

1. Add API keys: MetalPrice, NewsAPI, FRED, OpenAI, WordPress, Slack, Gmail.
2. Configure the scheduling interval, API endpoints, and authentication in n8n.
3. Predefine the WordPress post format and Slack message templates for smooth automation.

## Prerequisites

n8n v1.0+, API keys, OAuth credentials, and internet access.

## Use Cases

Investment forecasting, financial newsletter automation, or market monitoring dashboards.

## Customization

Add cryptocurrency or stock tracking, modify the AI prompts, or route summaries to Telegram, Notion, or Google Sheets.

## Benefits

Saves analyst time, ensures consistent insights, enhances accuracy, and delivers timely, AI-driven financial intelligence.
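The Format & Merge step, combining the three API responses into one structured dataset for the AI prompt, can be sketched as below. The response shapes (`rates.XAU`, `articles[].title`, and the flattened FRED fields) are simplified assumptions, not the exact payloads the APIs return:

```javascript
// Merge price, news, and macro data into one object for AI analysis
// (field names and response shapes are illustrative assumptions).
function mergeMarketData(priceResp, newsResp, fredResp) {
  return {
    fetchedAt: new Date().toISOString(),
    goldPriceUsd: priceResp.rates?.XAU ?? null,
    headlines: (newsResp.articles ?? [])
      .slice(0, 10)                // keep the prompt small
      .map((a) => a.title),
    macro: {
      inflation: fredResp.inflation ?? null,
      interestRate: fredResp.interestRate ?? null,
      employment: fredResp.employment ?? null,
    },
  };
}
```

Normalizing missing values to `null` (rather than omitting keys) keeps the AI prompt's structure stable even when one upstream API fails.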
by Ruthwik
# 🚀 AI-Powered WhatsApp Customer Support for Shopify Brands

This n8n template builds a WhatsApp support copilot that answers **order status** and **product availability** questions from Shopify using LLM "agents," then replies to the customer in WhatsApp or routes the conversation to human support.

## Use cases

- "Where is my order?" → live status + tracking link
- "What are your best-selling T-shirts?" → in-stock sizes & variants
- Greetings / small talk → welcome message
- Anything unclear → handoff to a support channel

## Good to know

- WhatsApp Business conversations are billed by Meta/Twilio/Exotel; plan accordingly.
- The Shopify Admin API has rate limits (leaky bucket), so stagger requests.
- LLM usage incurs token costs; cap max tokens and enable caching where possible.
- Avoid sending PII to the model; only pass minimal order/product fields.

## How it works

1. **WhatsApp Trigger**: Receives an incoming message (e.g., "Where is my order?").
2. **Get Customer from Shopify → Customer Details → Normalize Input**: Looks up the customer by phone and formats the query (lower-casing, emoji and punctuation normalization).
3. **Switch (intent router)**: Classifies the message into welcome, orderStatusQuery, productQuery, or supportQuery.
4. **Welcome path**: Welcome message → polite greeting → (noop placeholder).
5. **Order status path (Orders Agent)**:
   - Orders Agent (LLM + Memory) interprets the user request and extracts the needed fields.
   - Get Customer Orders (HTTP to Shopify) fetches the user's latest order(s).
   - Structured Output Parser cleans the agent's output into a strict schema.
   - Send Order Status (WhatsApp message) returns the status, ETA, and tracking link.
6. **Products path (Products Agent)**:
   - Products Agent (LLM + Memory) turns the question into a product query.
   - Get Products from Shopify (HTTP) pulls best sellers / inventory & sizes.
   - Structured Output Parser formats name, price, sizes, and stock.
   - Send Products message (WhatsApp) sends a tidy, human-readable reply.
7. **Support path**: Send a message to support posts the transcript/context to your agent/helpdesk channel and informs the user that a human will respond.

## How to use

1. Replace the manual/WhatsApp trigger with your live WhatsApp number/webhook.
2. Set env vars/credentials: Shopify domain + Admin API token, WhatsApp provider keys, LLM key (OpenAI/OpenRouter), and (optionally) your support channel webhook.
3. Edit the message templates for tone, add your brand name, and localize if needed.
4. Test with samples: "Where is my order?", "Show best sellers", "Hi".

## Requirements

- WhatsApp Business API (Meta/Twilio/Exotel)
- Shopify store + Admin API access
- LLM provider (OpenAI/OpenRouter, etc.)
- Slack webhook for human handoff

## Prerequisites

- Active WhatsApp Business account connected via an API provider (Meta, Twilio, or Exotel)
- **Shopify Admin API credentials** (API key, secret, store domain)
- **Slack OAuth app** or webhook for human support escalation
- API key for your LLM provider (OpenAI, OpenRouter, etc.)

## Customising this workflow

- Add intents: returns/exchanges, COD confirmation, address changes.
- Enrich product replies with images, price ranges, and "Buy" deep links.
- Add multilingual support by detecting the locale and templating responses.
- Log all interactions to a DB/Sheet for analytics and quality review.
- Guardrails: confidence thresholds → fallback to support; redact PII; retry on API errors.
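The Normalize Input step (lower-casing, emoji and punctuation normalization) followed by the Switch node's intent routing can be sketched as below. The keyword lists are illustrative assumptions; in the template the routing is backed by the LLM agents, so treat this as a cheap pre-filter at best:

```javascript
// Normalize an incoming WhatsApp message before intent routing (illustrative).
function normalizeMessage(text) {
  return text
    .toLowerCase()
    .replace(/\p{Extended_Pictographic}/gu, "") // strip emoji
    .replace(/[^\p{L}\p{N}\s]/gu, "")           // strip punctuation
    .replace(/\s+/g, " ")
    .trim();
}

// Simple keyword router mirroring the Switch node's four intents.
function routeIntent(normalized) {
  if (/\b(order|tracking|shipped|delivery)\b/.test(normalized)) return "orderStatusQuery";
  if (/\b(product|stock|size|best sell\w*|price)\b/.test(normalized)) return "productQuery";
  if (/\b(hi|hello|hey)\b/.test(normalized)) return "welcome";
  return "supportQuery";
}
```

Anything the keyword rules cannot place falls through to `supportQuery`, matching the template's "anything unclear → handoff" behavior.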
by Cheng Siong Chin
## Introduction

This workflow connects to OpenAI, Anthropic, and Groq, processing requests in parallel with automatic performance metrics. Ideal for testing speed, cost, and quality across models.

## How It Works

Webhooks trigger parameter extraction and routing. Three AI agents run simultaneously with memory and parsing. Responses merge with detailed metrics.

## Workflow Template

```
Webhook → Extract Parameters → Router
  ├→ OpenAI Agent
  ├→ Anthropic Agent
  └→ Groq Agent
→ Merge → Metrics → Respond
```

## Workflow Steps

1. The Webhook receives a POST with the prompt and settings.
2. Parameters are extracted and validated.
3. The Router directs by cost, latency, or type.
4. The AI agents run in parallel.
5. Results are merged with metadata.
6. Metrics compute time, cost, and quality.
7. The response returns the outputs and a recommendation.

## Setup Instructions

1. Activate the Webhook with authentication.
2. Add API keys for all providers.
3. Define models, tokens, and temperature.
4. Adjust the Router logic for selection.
5. Tune the Metrics scoring formulas.

## Prerequisites

- n8n v1.0+ instance
- API keys: OpenAI, Anthropic, Groq
- HTTP client for testing

## Customization

Add providers like Gemini or Azure OpenAI. Enable routing by cost or performance.

## Benefits

Auto-select efficient providers and compare model performance in real time.
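The Metrics step, scoring each provider's response and returning a recommendation, can be sketched as below. The pricing numbers and scoring weights are placeholders, not real provider rates; tune them in the "Tune the Metrics scoring formulas" setup step:

```javascript
// Score each provider's response on latency and cost, then recommend one.
// costPer1kTokens values and weights are illustrative, not real rates.
function scoreResponses(responses, costPer1kTokens) {
  const scored = responses.map((r) => {
    const cost = (r.totalTokens / 1000) * (costPer1kTokens[r.provider] ?? 0);
    // Lower latency and lower cost both raise the score (equal weights here).
    const score =
      (1 / (1 + r.latencyMs / 1000)) * 0.5 +
      (1 / (1 + cost * 100)) * 0.5;
    return { ...r, cost, score };
  });
  scored.sort((a, b) => b.score - a.score);
  return { results: scored, recommended: scored[0].provider };
}
```

A quality dimension (e.g., an LLM-as-judge grade on each output) can be added as a third weighted term without changing the shape of the result.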