by Le Nguyen
This template implements a recursive web crawler inside n8n. Starting from a given URL, it crawls linked pages up to a maximum depth (default: 3), extracts text and links, and returns the collected content via webhook.

🚀 How It Works

1) Webhook Trigger
Accepts a JSON body with a url field. Example payload: { "url": "https://example.com" }

2) Initialization
Sets crawl parameters: url, domain, maxDepth = 3, and depth = 0. Initializes global static data (pending, visited, queued, pages).

3) Recursive Crawling
Fetches each page (HTTP Request), extracts body text and links (HTML node), then cleans and deduplicates links. Filters out:
- External domains (only same-site is followed)
- Anchors (#), mailto/tel/javascript links
- Non-HTML files (.pdf, .docx, .xlsx, .pptx)

4) Depth Control & Queue
- Tracks visited URLs
- Stops at maxDepth to prevent infinite loops
- Uses SplitInBatches to loop the queue

5) Data Collection
Saves each crawled page (url, depth, content) into pages[]. When pending = 0, combines the results.

6) Output
Responds via the Webhook node with:
- combinedContent (all pages concatenated)
- pages[] (array of individual results)
Large results are chunked when they exceed ~12,000 characters.

🛠️ Setup Instructions

1) Import Template
Load from n8n Community Templates.

2) Configure Webhook
- Open the Webhook node
- Copy the Test URL (development) or Production URL (after deploy)
- You'll POST crawl requests to this endpoint

3) Run a Test
Send a POST with JSON:

```bash
curl -X POST https://<your-n8n>/webhook/<id> \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com"}'
```

4) View Response
The crawler returns a JSON object containing combinedContent and pages[].

⚙️ Configuration
- **maxDepth**: Default 3. Adjust in the Init Crawl Params (Set) node.
- **Timeouts**: The HTTP Request node timeout is 5 seconds per request; increase it if needed.
- **Filtering Rules**:
  - Only same-domain links are followed (apex and www are treated as the same site)
  - Skips anchors, mailto:, tel:, javascript:
  - Skips document links (.pdf, .docx, .xlsx, .pptx)
  - You can tweak the regex and logic in the Queue & Dedup Links (Code) node (a sketch appears at the end of this description)

📌 Limitations
- No JavaScript rendering (static HTML only)
- No authentication/cookies/session handling
- Large sites can be slow or hit timeouts; chunking mitigates response size

✅ Example Use Cases
- Extract text across your site for AI ingestion / embeddings
- SEO/content audit and internal link checks
- Build a lightweight page corpus for downstream processing in n8n

⏱️ Estimated Setup Time
~10 minutes (import → set webhook → test request)
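For orientation, the filtering and dedup logic in the Queue & Dedup Links (Code) node looks roughly like the sketch below. This is illustrative rather than the exact template code; field names such as links and depth are assumptions based on the description above.

```javascript
// Minimal sketch of a Queue & Dedup Links Code node (illustrative, not the exact template code).
const staticData = $getWorkflowStaticData('global');
staticData.visited = staticData.visited || {};
staticData.queued = staticData.queued || {};

const maxDepth = $json.maxDepth ?? 3;
const baseDomain = $json.domain.replace(/^www\./, ''); // apex and www are treated as same-site
const skipExtensions = /\.(pdf|docx|xlsx|pptx)$/i;     // non-HTML documents are skipped

const out = [];
for (const raw of $json.links || []) {
  // Skip anchors and non-HTTP schemes
  if (/^(#|mailto:|tel:|javascript:)/i.test(raw)) continue;
  let url;
  try {
    url = new URL(raw, $json.url); // resolve relative links against the current page
  } catch (e) {
    continue; // malformed URL
  }
  url.hash = ''; // drop fragments so /page and /page#top dedupe together
  const href = url.toString();
  const sameSite = url.hostname.replace(/^www\./, '') === baseDomain;
  if (!sameSite || skipExtensions.test(url.pathname)) continue;
  if (staticData.visited[href] || staticData.queued[href]) continue; // dedup
  if ($json.depth + 1 > maxDepth) continue; // depth control
  staticData.queued[href] = true;
  out.push({ json: { url: href, depth: $json.depth + 1 } });
}
return out;
```

Note that workflow static data is kept as plain objects rather than Sets, since n8n persists it as JSON between runs.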
by David Ashby
🛠️ NASA Tool MCP Server

Complete MCP server exposing all NASA Tool operations to AI agents. Zero configuration needed: all 15 operations are pre-built.

⚡ Quick Setup

Need help? Want access to more workflows, and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every NASA Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the example at the end of this description)
• Native Integration: Uses the official n8n NASA tool with full error handling

📋 Available Operations (15 total)

Every possible NASA Tool operation is included, one operation each:

• Asteroid Neo Browse: Get many asteroid neos
• Asteroid Neo Feed: Get an asteroid neo feed
• Asteroid Neo Lookup: Get an asteroid neo lookup
• Astronomy Picture of the Day: Get the astronomy picture of the day
• DONKI Coronal Mass Ejection: Get a DONKI coronal mass ejection
• DONKI High Speed Stream: Get a DONKI high speed stream
• DONKI Interplanetary Shock: Get a DONKI interplanetary shock
• DONKI Magnetopause Crossing: Get a DONKI magnetopause crossing
• DONKI Notifications: Get DONKI notifications
• DONKI Radiation Belt Enhancement: Get a DONKI radiation belt enhancement
• DONKI Solar Energetic Particle: Get a DONKI solar energetic particle
• DONKI Solar Flare: Get a DONKI solar flare
• DONKI WSA-Enlil Simulation: Get a DONKI WSA-Enlil simulation
• Earth Assets: Get Earth assets
• Earth Imagery: Get Earth imagery

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native NASA Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every NASA Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
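To make the $fromAI() mechanism concrete: a parameter on a tool node can be set to an expression like the one below, and the connected AI agent supplies the value at call time. The key and description here are illustrative, not the template's exact parameter names.

```javascript
// n8n expression on a tool node parameter (illustrative key and description).
// The agent reads the key, description, and type, then fills in the value.
{{ $fromAI('asteroidId', 'The NASA NEO reference ID to look up', 'string') }}
```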
by David Ashby
Complete MCP server exposing 14 Domains-Index API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows, and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Domains-Index API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the Domains-Index API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to /v1
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (14 total)

🔧 Domains (9 endpoints)
• GET /domains/search: Domains Database Search
• GET /domains/tld/{zone_id}: Get TLD records
• GET /domains/tld/{zone_id}/download: Download Whole Dataset for TLD
• GET /domains/tld/{zone_id}/search: Domains Search for TLD
• GET /domains/updates/added: Get added domains, latest if date not specified
• GET /domains/updates/added/download: Download added domains, latest if date not specified
• GET /domains/updates/deleted: Get deleted domains, latest if date not specified
• GET /domains/updates/deleted/download: Download deleted domains, latest if date not specified
• GET /domains/updates/list: List of updates

🔧 Info (5 endpoints)
• GET /info/api: API info
• GET /info/stat/: Returns overall statistics
• GET /info/stat/{zone}: Returns statistics for a specific zone
• GET /info/tld/: Returns overall TLD info
• GET /info/tld/{zone}: Returns TLD info for a specific zone

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Domains-Index API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration (see the sample config at the end of this description)
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
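As an example of the Claude Desktop step, stdio-based clients can bridge to the server's SSE endpoint with a helper such as the mcp-remote npm package. This is a sketch under assumptions: the exact webhook URL comes from your MCP trigger node (the path shown is a placeholder), and the bridge tooling and config file location depend on your client version.

```json
{
  "mcpServers": {
    "domains-index": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.com/mcp/<webhook-id>/sse"]
    }
  }
}
```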
by Rahi
Workflow: Track Email Campaign Engagement Analytics with Smartlead and Google Sheets

Automatically fetch lead-level email engagement analytics (opens, clicks, replies, unsubscribes, bounces) from Smartlead and update them in Google Sheets. Use this to keep a single, always-fresh source of truth for campaign performance and sequence effectiveness.

Summary

Pull Smartlead campaign analytics on a schedule and write them to a Google Sheet (append or update). Works with pagination, avoids duplicates via a stable key, and is ready for dashboards, pivots, or BI tools.

What This Workflow Does

- Collects campaign stats from Smartlead (per-lead, per-sequence).
- Handles pagination safely (offset/limit).
- Writes to Google Sheets using appendOrUpdate with a matching column to prevent duplicates.
- Can run on a schedule for near real-time analytics.

Node Structure Overview

| Step | Node | Purpose |
|---|---|---|
| 1️⃣ | Schedule Trigger | Starts the workflow on a cadence (e.g., hourly) |
| 2️⃣ | Code (Pagination Generator) | Emits {offset, limit} pairs (e.g., 0..9900, step 100) |
| 3️⃣ | Split in Batches | Sends each pagination pair to the API sequentially |
| 4️⃣ | HTTP Request (Smartlead) | GET /campaigns/{campaign_id}/statistics with offset/limit |
| 5️⃣ | Split Out | Turns the API data[] array into one item per lead record |
| 6️⃣ | Google Sheets (appendOrUpdate) | Upserts rows by stats_id into EngagedLeads tab |
| 7️⃣ | Loop Back | Continues until all batches have been processed |

Step-by-Step Setup

Prerequisites
- Smartlead account + API key with access to campaign statistics.
- Google account + Google Sheets OAuth connected in n8n.

1) Create the Google Sheet
- Spreadsheet name: Email Analytics (can be anything).
- Tab name: EngagedLeads.
- Add these exact headers (first row): lead_name, lead_email, lead_category, sequence_number, stats_id, email_subject, sent_time, open_time, click_time, reply_time, open_count, click_count, is_unsubscribed, is_bounced

2) Configure the Schedule Trigger
- Choose a frequency (e.g., every 2 hours). If you're testing, set a single run or a short cadence.

3) Configure the Code Node (Pagination)
Emit N items like:
{ "offset": 0, "limit": 100 }
{ "offset": 100, "limit": 100 }
...
100 is a good default limit. For up to 10,000 records, generate 100 offsets. A sketch of this node appears after these setup steps.

4) Configure the Smartlead API Node
- Method: GET
- URL: https://server.smartlead.ai/api/v1/campaigns/{campaign_id}/statistics
- Query parameters: api_key = <YOUR_SMARTLEAD_API_KEY>, offset = {{ $json.offset }}, limit = {{ $json.limit }}
- Map the response to JSON.

5) Split Out the Response
Use a Split Out (or similar) node to iterate over data[] so each lead record becomes one item.

6) Google Sheets Node (Append or Update)
- Operation: appendOrUpdate.
- Document: Your Email Analytics sheet.
- Sheet/Tab: EngagedLeads.
- Matching Column: stats_id.
- Map fields from the Smartlead response to sheet columns:
  - lead_name ← lead name (or composed from first/last if provided)
  - lead_email ← email
  - lead_category ← category/type if available
  - sequence_number ← sequence step number
  - stats_id ← stable identifier (e.g., Smartlead stats_id or message id)
  - email_subject ← subject
  - sent_time, open_time, click_time, reply_time ← timestamps
  - open_count, click_count ← integers
  - is_unsubscribed, is_bounced ← booleans
If the same stats_id arrives again, the row is updated, not appended.

7) Test and Activate
- Run once manually to verify the API and sheet mapping.
- Check the sheet for new/updated rows.
- Activate the workflow to run automatically.
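A minimal sketch of the pagination Code node described in step 3, assuming a limit of 100 and up to 10,000 records (adjust both to your campaign size):

```javascript
// Emits { offset, limit } pairs: 0, 100, 200, ..., 9900 (100 items total).
const limit = 100;        // records per API call
const maxRecords = 10000; // upper bound on records to fetch
const items = [];
for (let offset = 0; offset < maxRecords; offset += limit) {
  items.push({ json: { offset, limit } });
}
return items;
```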
Smartlead API Reference (Used by This Workflow)

**Endpoint**
GET https://server.smartlead.ai/api/v1/campaigns/{campaign_id}/statistics

**Required query parameters**
- api_key (string)
- offset (number)
- limit (number)

**Typical response (trimmed example)**

```json
{
  "data": [
    {
      "lead_name": "Jane Doe",
      "lead_email": "jane@example.com",
      "sequence_number": 2,
      "stats_id": "15b6ff3a-...-b2b9f343c2e1",
      "email_subject": "Quick intro",
      "sent_time": "2025-10-08T10:18:55.496Z",
      "open_time": "2025-10-08T10:20:10.000Z",
      "click_time": null,
      "reply_time": null,
      "open_count": 1,
      "click_count": 0,
      "is_unsubscribed": false,
      "is_bounced": false
    }
  ],
  "total": 1234
}
```

Google Sheets Structure (Recommended)

- Spreadsheet: Email Analytics
- Tab: EngagedLeads
- Columns: lead_name, lead_email, lead_category, sequence_number, stats_id, email_subject, sent_time, open_time, click_time, reply_time, open_count, click_count, is_unsubscribed, is_bounced
- Matching Column: stats_id (prevents duplicates and allows updates)

Customization Tips

- **Multiple Campaigns**: Duplicate the workflow and set a different {campaign_id} and/or write results to a separate tab in your Google Sheet.
- **Batch Size**: Increase or decrease the limit value (e.g., 200) in your Code node if you want fewer or more API calls.
- **Filtering**: Add a Code or IF node to skip rows where is_bounced = true or is_unsubscribed = true (see the sketch at the end of this description).
- **Dashboards**: Create a new tab named Dashboard in Google Sheets and visualize your data using built-in charts, or connect it to Looker Studio for advanced visualization.
- **Enrichment**: Join this dataset with your CRM data (e.g., HubSpot or Salesforce) using lead_email as a key to gain deeper customer insights.

Security and Publishing Notes

- **Do not hardcode** your Smartlead API key in the workflow export. Use n8n credentials or environment variables instead.
- When sharing the template publicly, replace sensitive values with placeholders like <YOUR_SMARTLEAD_API_KEY> and <YOUR_GOOGLE_SHEET_ID>.
- Keep your Google Sheet private unless you intentionally want to share it publicly.

Troubleshooting

- **No rows in Sheets**: Verify that the API response includes data[], confirm that the Split Out node is configured correctly, and check the field mappings.
- **Duplicates**: Ensure the Google Sheets node has its matching column set to stats_id.
- **Rate Limits**: Increase the schedule interval, add a short Wait node between batches, or reduce the limit size.
- **Mapping Errors**: Ensure that column names in Sheets exactly match your field mappings; they are case-sensitive.
- **Timezone Differences**: Smartlead timestamps are in UTC. Convert them downstream if your local timezone is different.

Example Use Case

Run this workflow hourly to maintain a live, company-wide Email Engagement Sheet.
- **Sales teams** can monitor replies and active leads.
- **Marketing teams** can track open and click rates by sequence.
- **Operations** can export monthly summaries, with no Smartlead login required.

Tags

Smartlead, EmailMarketing, Automation, GoogleSheets, Analytics, CRM, MarketingOps
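For the filtering tip above, a small Code node (in "Run Once for All Items" mode) placed after Split Out could drop bounced or unsubscribed records. A minimal sketch, assuming the Smartlead field names shown in the response example:

```javascript
// Keep only engaged leads: drop bounced and unsubscribed records.
return $input.all().filter(item =>
  !item.json.is_bounced && !item.json.is_unsubscribed
);
```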
by David Ashby
Complete MCP server exposing 9 Api2Pdf - PDF Generation, Powered by AWS Lambda API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows, and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Api2Pdf - PDF Generation, Powered by AWS Lambda credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the Api2Pdf - PDF Generation, Powered by AWS Lambda API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://v2018.api2pdf.com
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (9 total)

🔧 Chrome (3 endpoints)
• POST /chrome/html: Convert raw HTML to PDF
• GET /chrome/url: Convert URL to PDF
• POST /chrome/url: Convert URL to PDF

🔧 LibreOffice (1 endpoint)
• POST /libreoffice/convert: Convert an office document or image to PDF

🔧 Merge (1 endpoint)
• POST /merge: Merge multiple PDFs together

🔧 Wkhtmltopdf (3 endpoints)
• POST /wkhtmltopdf/html: Convert raw HTML to PDF
• GET /wkhtmltopdf/url: Convert URL to PDF
• POST /wkhtmltopdf/url: Convert URL to PDF

🔧 Zebra (1 endpoint)
• GET /zebra: Generate bar codes and QR codes with ZXing

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Api2Pdf API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing 15 BulkSMS JSON REST API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows, and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add BulkSMS JSON REST API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the BulkSMS JSON REST API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.bulksms.com/v1
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (15 total)

🔧 Blocked-Numbers (2 endpoints)
• GET /blocked-numbers: List blocked numbers
• POST /blocked-numbers: Create a blocked number

🔧 Credit (1 endpoint)
• POST /credit/transfer: Transfer Account Credits

🔧 Messages (5 endpoints)
• GET /messages: List Messages
• POST /messages: Send Messages
• GET /messages/send: Send a message by simple GET or POST
• GET /messages/{id}: Show Message
• GET /messages/{id}/relatedReceivedMessages: List Related Messages

🔧 Profile (1 endpoint)
• GET /profile: Retrieve User Profile

🔧 Rmm (1 endpoint)
• POST /rmm/pre-sign-attachment: Generate Attachment Upload URL

🔧 Webhooks (5 endpoints)
• GET /webhooks: List webhooks
• POST /webhooks: Create a webhook
• DELETE /webhooks/{id}: Delete a webhook
• GET /webhooks/{id}: Read a webhook
• POST /webhooks/{id}: Update a webhook

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native BulkSMS JSON REST API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing 9 NPR Listening Service API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows, and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add NPR Listening Service credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works

This workflow converts the NPR Listening Service API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://listening.api.npr.org
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (9 total)

🔧 V2 (9 endpoints)
• GET /v2/aggregation/{aggId}/recommendations: Get a set of recommendations for an aggregation independent of the user's lis...
• GET /v2/channels: List Available Channels
• GET /v2/history: Get User Ratings History
• GET /v2/organizations/{orgId}/categories/{category}/recommendations: Get a list of recommendations from a category of content from an organization
• GET /v2/organizations/{orgId}/recommendations: Get a variety of details about an organization including various lists of rec...
• GET /v2/promo/recommendations: Get Recent Promo Audio
• POST /v2/ratings: Submit Media Ratings
• GET /v2/recommendations: Get User Recommendations
• GET /v2/search/recommendations: Get Search Recommendations

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native NPR Listening Service API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Bao Duy Nguyen
Who is this for?
Anyone who hates SPAM emails or doesn't want to clean them up by hand every week.

What problem is this workflow solving?
It automatically deletes SPAM emails for you, so you don't have to. It saves a bit of your time.

What this workflow does
You can specify when to execute the workflow: daily, weekly, or monthly.

Setup
- Select the preferred execution time
- Configure credentials for Gmail OAuth2

How to customize this workflow to your needs
There's no need. It just works!
by Matthew
Automated Cold Email Personalization

This workflow automates the creation of highly personalized cold outreach emails by extracting lead data, scraping company websites, and leveraging AI to craft unique email components. It is ideal for sales teams, marketers, and business development professionals looking to scale their outreach efforts while maintaining a high degree of personalization.

How It Works

1. Generate Batches: The workflow starts by generating a sequence of numbers, defining how many leads to process in batches.
2. Scrape Lead Data: It uses an external API (Apify) to pull comprehensive lead information, including contact details, company data, and social media links.
3. Fetch Client Data: The workflow then retrieves relevant client details from your Google Sheet based on the scraped data.
4. Scrape Company Website: The lead's company website is automatically scraped to gather content for personalization.
5. Summarize Prospect Data: An OpenAI model analyzes both the scraped website content and the individual's profile data to create concise summaries and identify unique angles for outreach.
6. Craft Personalized Email: A more advanced OpenAI model uses these summaries and specific instructions to generate the "icebreaker", "intro", and "value proposition" components of a personalized cold email.
7. Update Google Sheet: Finally, these generated email components are saved back into your Google Sheet, enriching your lead records for future outreach.

Google Sheet Structure

Your Google Sheet must have the following exact column headers to ensure proper data flow:

- **Email** (unique identifier for each lead)
- **Full Name**
- **Headline**
- **LinkdIn**
- **cityName**
- **stateName**
- **company/cityName**
- **Country**
- **Company Name**
- **Website**
- **company/businessIndustry**
- **Keywords**
- **icebreaker** (will be populated by the workflow)
- **intro** (will be populated by the workflow)
- **value_prop** (will be populated by the workflow)

Setup Instructions

1. Add Credentials: In n8n, add your OpenAI API key via the Credentials menu, and connect your Google account via the Credentials menu for Google Sheets access. You will also need an Apify API key for the Scraper node.
2. Configure Google Sheets Nodes: Select the Client data and Add email data to sheet nodes. For each, choose your Google Sheets credential, select your spreadsheet, and the specific sheet name. Ensure all column mappings are correct according to the "Google Sheet Structure" section above.
3. Configure Apify Scraper Node: Select the Scraper node. Update the Authorization header with your Apify API token (Bearer KEY). In the JSON Body, set the searchUrl to your Apollo link (or equivalent source URL for lead data).
4. Configure OpenAI Nodes: Select both the Summarising prospect data and Creating detailed email nodes, and choose your OpenAI credential from the dropdown. In the Creating detailed email node's prompt, replace PUT YOUR COMPANY INFO HERE with your company's context and verify the target sector for the email generation.
5. Verify Update Node: On the final Add email data to sheet node, ensure the Operation is set to Append Or Update and the Matching Columns field is set to Email.

Customization Options

- 💡 **Trigger**: Change the When clicking 'Execute workflow' node to an automatic trigger, such as a Cron node for daily runs, or a Google Sheets trigger when new rows are added.
- **Lead Generation**: Modify the Code node to change the number of leads processed per run (currently set to 50); a sketch of such a node appears after this list.
- **Scraping Logic**: Adjust the Scraper node's parameters (e.g., count) or replace the Apify integration with another data source if needed.
- **AI Prompting**: Experiment with the prompts in the Summarising prospect data and Creating detailed email OpenAI nodes to refine the tone, style, length, or content focus of the generated summaries and emails.
- **AI Models**: Test different OpenAI models (e.g., gpt-3.5-turbo, gpt-4o) in the OpenAI nodes to find the optimal balance between cost, speed, and output quality.
- **Data Source/CRM**: Replace the Google Sheets nodes with integrations for your preferred CRM (e.g., HubSpot, Salesforce) or a database (e.g., PostgreSQL, Airtable) to manage your leads.
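The batch-generating Code node mentioned above might look like the following. This is a sketch, not the template's exact code; the leadNumber field name is an assumption about what the downstream nodes consume.

```javascript
// Generates one item per lead to process in this run (default: 50).
const leadsPerRun = 50; // adjust this to change the batch size
const items = [];
for (let i = 1; i <= leadsPerRun; i++) {
  items.push({ json: { leadNumber: i } });
}
return items;
```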
by Incrementors
Financial Insight Automation: Market Cap to Telegram via Bright Data

📊 Description

An automated n8n workflow that scrapes financial data from Yahoo Finance using Bright Data, processes market cap information, generates visual charts, and sends comprehensive financial insights directly to Telegram for instant notifications.

🚀 How It Works

This workflow operates through a simple three-zone process:

1. Data Input & Trigger: The user submits a keyword (e.g., "AI", "Crypto", "MSFT") through a form trigger that initiates the financial data collection process.
2. Data Scraping & Processing: The Bright Data API discovers and scrapes comprehensive financial data from Yahoo Finance, including market cap, stock prices, company profiles, and financial metrics.
3. Visualization & Delivery: The system generates interactive market cap charts, saves data to Google Sheets for record-keeping, and sends visual insights to Telegram as PNG images.

⚡ Setup Steps

> ⏱️ Estimated Setup Time: 15-20 minutes

Prerequisites
- Active n8n instance (self-hosted or cloud)
- Bright Data account with Yahoo Finance dataset access
- Google account for Sheets integration
- Telegram bot token and chat ID

Step 1: Import the Workflow
- Copy the provided JSON workflow code
- In n8n: Go to Workflows → + Add workflow → Import from JSON
- Paste the JSON content and click Import

Step 2: Configure Bright Data Integration
- In n8n: Navigate to Credentials → + Add credential → HTTP Header Auth
- Add an Authorization header with the value: Bearer BRIGHT_DATA_API_KEY
- Replace BRIGHT_DATA_API_KEY with your actual API key
- Test the connection to ensure it works properly

> Note: The workflow uses dataset ID gd_lmrpz3vxmz972ghd7 for Yahoo Finance data. Ensure you have access to this dataset in your Bright Data dashboard. A sample request appears at the end of this description.

Step 3: Set up Google Sheets Integration
- Create a Google Sheet: go to Google Sheets, create a new spreadsheet, and name it "Financial Data Tracker" or similar. Copy the Sheet ID from the URL.
- Configure Google Sheets credentials: In n8n: Credentials → + Add credential → Google Sheets OAuth2 API. Complete the OAuth setup and test the connection.
- Update the workflow: Open the "📊 Filtered Output & Save to Sheet" node, replace YOUR_SHEET_ID with your actual Sheet ID, and select your Google Sheets credential.

Step 4: Configure Telegram Bot
- Create a Telegram bot using @BotFather and get your bot token and chat ID
- In n8n: Credentials → + Add credential → Telegram API, then enter your bot token
- Update the "📤 Send Chart on Telegram" node with your chat ID (replace YOUR_TELEGRAM_CHAT_ID with your actual chat ID)

Step 5: Test and Activate
- Test the workflow: use the form trigger with a test keyword (e.g., "AAPL"), monitor the execution in n8n, verify data appears in Google Sheets, and check for chart delivery on Telegram
- Activate the workflow using the toggle switch; the form trigger will be accessible via the provided webhook URL

📋 Key Features

- 🔍 Keyword-Based Discovery: Search companies by keyword, ticker, or industry
- 💰 Comprehensive Financial Data: Market cap, stock prices, earnings, and company profiles
- 📊 Visual Charts: Automatic generation of market cap comparison charts
- 📱 Telegram Integration: Instant delivery of insights to your mobile device
- 💾 Data Storage: Automatic backup to Google Sheets for historical tracking
- ⚡ Real-time Processing: Fast data retrieval and processing with Bright Data

📊 Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Company Name | Full company name | "Apple Inc." |
| Stock Ticker | Trading symbol | "AAPL" |
| Market Cap | Total market capitalization | "$2.89T" |
| Current Price | Latest stock price | "$189.25" |
| Exchange | Stock exchange | "NASDAQ" |
| Sector | Business sector | "Technology" |
| PE Ratio | Price to earnings ratio | "28.45" |
| 52 Week Range | Annual high and low prices | "$164.08 - $199.62" |

🔧 Troubleshooting

Common Issues

Bright Data Connection Failed:
- Verify your API key is correct and active
- Check dataset permissions in the Bright Data dashboard
- Ensure you have sufficient credits

Google Sheets Permission Denied:
- Re-authenticate Google Sheets OAuth
- Verify sheet sharing settings
- Check that the Sheet ID is correct

Telegram Not Receiving Messages:
- Verify the bot token and chat ID
- Check that the bot is added to the chat
- Test Telegram credentials manually

Performance Tips
- Use specific keywords for better data accuracy
- Monitor Bright Data usage to control costs
- Set up error handling for failed requests
- Consider rate limiting for high-volume usage

🎯 Use Cases

- **Investment Research**: Quick financial analysis of companies and sectors
- **Market Monitoring**: Track market cap changes and stock performance
- **Competitive Analysis**: Compare financial metrics across companies
- **Portfolio Management**: Monitor holdings and potential investments
- **Financial Reporting**: Generate automated financial insights for teams

🔗 Additional Resources

- n8n Documentation
- Bright Data Datasets
- Google Sheets API
- Telegram Bot API

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
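For reference, the Bright Data call behind Step 2 looks roughly like the request below. This is a sketch based on Bright Data's dataset trigger API; the input body shown ({"keyword": ...}) is an assumption, so check the input schema for dataset gd_lmrpz3vxmz972ghd7 in your dashboard.

```bash
# Trigger a scrape of the Yahoo Finance dataset (input body is illustrative).
curl -X POST "https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_lmrpz3vxmz972ghd7&format=json" \
  -H "Authorization: Bearer BRIGHT_DATA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[{"keyword": "AAPL"}]'
```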
by Lucas Peyrin
How it works

This workflow is an interactive, hands-on tutorial designed to teach you the absolute basics of JSON (JavaScript Object Notation) and, more importantly, how to use it within n8n. It's perfect for beginners who are new to automation and data structures.

The tutorial is structured as a series of simple steps. Each node introduces a new, fundamental concept of JSON:

1. Key/Value Pairs: The basic building block of all JSON.
2. Data Types: It then walks you through the most common data types one by one:
   - String (text)
   - Number (integers and decimals)
   - Boolean (true or false)
   - Null (representing "nothing")
   - Array (an ordered list of items)
   - Object (a collection of key/value pairs)
3. Using JSON with Expressions: The most important step! It shows you how to dynamically pull data from a previous node into a new one using n8n's expressions ({{ }}).
4. Final Exam: A final node puts everything together, building a complete JSON object by referencing data from all the previous steps (a representative example of such an object appears after the setup steps).

Each node has a detailed sticky note explaining the concept in simple terms.

Set up steps

Setup time: 0 minutes! This is a tutorial workflow, so there is no setup required.

1. Simply click the "Execute Workflow" button to run it.
2. Follow the instructions in the main sticky note: click on each node in order, from top to bottom.
3. For each node, observe the output in the right-hand panel and read the sticky note next to it to understand what you're seeing.

By the end, you'll have a solid understanding of what JSON is and how to work with it in your own n8n workflows.
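To preview what the tutorial covers, here is a small JSON object that uses every concept listed above in one place (a representative example, not the workflow's exact data):

```json
{
  "name": "n8n",
  "version": 1.5,
  "isAwesome": true,
  "retiredDate": null,
  "tags": ["automation", "tutorial", "json"],
  "author": {
    "firstName": "Lucas"
  }
}
```

In a later node, an expression such as {{ $json.author.firstName }} would pull out "Lucas", which is exactly the pattern the expressions step and the final exam exercise.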
by Daniel Shashko
How it Works

This workflow automates competitive price intelligence using Bright Data's enterprise web scraping API. On a scheduled basis (default: daily at 9 AM), the system loops through configured competitor product URLs, triggers Bright Data's web scraper to extract real-time pricing data from each site, and intelligently compares competitor prices against your current pricing.

The workflow handles the full scraping lifecycle: it sends scraping requests to Bright Data, waits for completion, fetches the scraped product data, and parses prices from various formats and website structures (see the sketch at the end of this description). All pricing data is automatically logged to Google Sheets for historical tracking and trend analysis. When a competitor's price drops below yours by more than the configured threshold (e.g., 10% cheaper), the system immediately sends detailed alerts via Slack and email to your pricing team with actionable intelligence.

At the end of each monitoring run, the workflow generates a comprehensive daily summary report that aggregates all competitor data, calculates average price differences, identifies the lowest and highest competitors, and provides a complete competitive landscape view. This eliminates hours of manual competitor research and enables data-driven pricing decisions in real time.

Who is this for?

- E-commerce businesses and online retailers needing automated competitive price monitoring
- Product managers and pricing strategists requiring real-time competitive intelligence
- Revenue operations teams managing dynamic pricing strategies across multiple products
- Marketplaces competing in price-sensitive categories where margins matter
- Any business that needs to track competitor pricing without manual daily checks

Setup Steps

Setup time: Approx. 30-40 minutes (Bright Data configuration, credential setup, competitor URL configuration)

Requirements:
- Bright Data account with Web Scraper API access
- Bright Data API token (from the dashboard)
- Google account with a spreadsheet for price tracking
- Slack workspace with pricing channels
- SMTP email provider for alerts

1. Sign up for Bright Data and create a web scraping dataset (use the e-commerce template for product data)
2. Obtain your Bright Data API token and dataset ID from the dashboard
3. Configure these nodes:
   - Schedule Daily Check: Set the monitoring frequency using a cron expression (default: 9 AM daily)
   - Load Competitor URLs: Add the competitor product URLs array, configure your current price, and set the alert threshold percentage
   - Loop Through Competitors: Automatically handles multiple URLs (no configuration needed)
   - Scrape with Bright Data: Add your Bright Data API token and dataset ID
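Parsing prices from varied site formats is the fragile part of any scraper; one possible Code node approach is sketched below. It assumes the scraped price arrives as a string field named price, which is an illustrative name, not necessarily the template's.

```javascript
// Normalizes price strings like "$1,299.99", "1.299,99 €", or "USD 49" to a number.
function parsePrice(raw) {
  if (typeof raw === 'number') return raw;
  const cleaned = String(raw).replace(/[^\d.,]/g, ''); // strip currency symbols and letters
  const lastComma = cleaned.lastIndexOf(',');
  const lastDot = cleaned.lastIndexOf('.');
  let normalized;
  if (lastComma > lastDot) {
    // European style: dots are thousands separators, comma is the decimal point
    normalized = cleaned.replace(/\./g, '').replace(',', '.');
  } else {
    // US style: commas are thousands separators
    normalized = cleaned.replace(/,/g, '');
  }
  const value = parseFloat(normalized);
  return Number.isFinite(value) ? value : null; // null signals an unparseable price
}

const price = parsePrice($json.price);
return [{ json: { ...$json, parsedPrice: price } }];
```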