by Amit Mehta
This workflow performs structured data extraction and data mining from a web page by combining the capabilities of Bright Data and Google Gemini.

**How it Works**

This workflow focuses on extracting structured data from a web page using Bright Data's Web Unlocker product. It then uses n8n's AI capabilities, specifically Google Gemini Flash Exp, for information extraction and custom sentiment analysis. The results are sent to webhooks and saved as local files.

**Use Cases**

- **Data Mining**: Automating the process of extracting and analyzing data from websites.
- **Web Scraping**: Gathering structured data for market research, competitive analysis, or content aggregation.
- **Sentiment Analysis**: Performing custom sentiment analysis on unstructured text.

**Setup Instructions**

1. **Bright Data credentials**: You need an account and a Web Unlocker zone with Bright Data. Update the Header Auth account credentials in the Perform Bright Data Web Request node.
2. **Google Gemini credentials**: Provide your Google Gemini (PaLM) API credentials for the AI-related nodes.
3. **Configure URL and zone**: In the Set URL and Bright Data Zone node, set the web URL you want to scrape and your Bright Data zone.
4. **Update webhook**: Update the Webhook Notification URL in the relevant HTTP Request nodes.

**Workflow Logic**

1. **Trigger**: The workflow is triggered manually.
2. **Set parameters**: It sets the target URL and the Bright Data zone.
3. **Web request**: The workflow performs a web request to the specified URL using Bright Data's Web Unlocker. The output is formatted as markdown. (See the request sketch after the customization tips.)
4. **Data extraction & analysis**: The markdown content is then processed by multiple AI nodes to extract textual data from the markdown, perform topic analysis with a structured response, and analyze trends by location and category with a structured response.
5. **Output**: The extracted data and analysis are sent to webhooks and saved as JSON files on disk.

**Node Descriptions**

| Node Name | Description |
|-----------|-------------|
| When clicking 'Test workflow' | A manual trigger node to start the workflow. |
| Set URL and Bright Data Zone | A Set node to define the URL to be scraped and the Bright Data zone to be used. |
| Perform Bright Data Web Request | An httpRequest node that calls Bright Data's API to retrieve the content. |
| Markdown to Textual Data Extractor | An AI node that uses Google Gemini to convert markdown content into plain text. |
| Google Gemini Chat Model | A node representing the Google Gemini model used for the data extraction. |
| Topic Extractor with the structured response | An AI node that performs topic analysis and outputs the results in a structured JSON format. |
| Trends by location and category with the structured response | An AI node that analyzes and clusters emerging trends by location and category, outputting structured JSON. |
| Initiate a Webhook Notification... | These nodes send the output of the AI analysis to a webhook. |
| Create a binary file... | Function nodes that convert the JSON output into binary format for writing to a file. |
| Write the topics/trends file to disk | readWriteFile nodes that save the binary data to a local file (d:\topics.json and d:\trends.json). |

**Customization Tips**

- Change the web URL in the Set URL and Bright Data Zone node to scrape different websites.
- Modify the AI prompts in the AI nodes to customize the analysis (e.g., change the sentiment analysis criteria).
- Adjust the output path in the readWriteFile nodes to save the files to a different location.
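For orientation, here is a minimal sketch of the kind of request the Perform Bright Data Web Request node makes. The endpoint, zone name, and `data_format` flag are assumptions based on Bright Data's Web Unlocker request API; verify them against your Bright Data dashboard.

```python
import requests

BRIGHT_DATA_API_KEY = "your-api-key"  # placeholder

# Request the target page through the Web Unlocker zone, asking for markdown
# output, which the downstream Gemini extraction nodes expect.
response = requests.post(
    "https://api.brightdata.com/request",  # assumed endpoint
    headers={"Authorization": f"Bearer {BRIGHT_DATA_API_KEY}"},
    json={
        "zone": "your_web_unlocker_zone",   # the zone set in "Set URL and Bright Data Zone"
        "url": "https://example.com/page",  # the target URL
        "format": "raw",
        "data_format": "markdown",          # assumed flag for markdown output
    },
    timeout=120,
)
response.raise_for_status()
markdown_content = response.text  # fed to the Markdown to Textual Data Extractor
print(markdown_content[:500])
```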
**Suggested Sticky Notes for Workflow**

- **Note**: "This workflow deals with the structured data extraction by utilizing Bright Data Web Unlocker Product... Please make sure to set the web URL of your interest within the 'Set URL and Bright Data Zone' node and update the Webhook Notification URL".
- **LLM Usage**: "Google Gemini Flash Exp model is being used... Information Extraction is being used for handling the custom sentiment analysis with the structured response".

**Required Files**

- 1GOrjyc9mtZCMvCr_Structured_Data_Extract,Data_Mining_with_Bright_Data&_Google_Gemini.json: The main n8n workflow export for this automation.

**Testing Tips**

- Run the workflow and check the webhook to verify that the extracted data is being sent correctly.
- Confirm that the d:\topics.json and d:\trends.json files are created on your disk with the expected structured data.

**Suggested Tags & Categories**

- Engineering
- AI
by Daniel
Harness OpenAI's Sora 2 for instant video creation from text or images using fal.ai's API, powered by GPT-5 for refined prompts that ensure cinematic quality. This template processes form submissions, intelligently routes to text-to-video (with mandatory prompt enhancement) or image-to-video modes, and polls for completion before redirecting to your generated clip.

**What This Template Does**

Users submit prompts, aspect ratios (9:16 or 16:9), models (sora-2 or pro), durations (4s, 8s, or 12s), and optional images via a web form. For text-to-video, GPT-5 automatically refines the prompt for optimal Sora 2 results; image mode uses the raw input. It calls one of four fal.ai endpoints (text-to-video, text-to-video/pro, image-to-video, image-to-video/pro), then loops every 60s to check status until the video is ready (a submit-and-poll sketch appears at the end of this entry).

- Handles dual modes: text (with GPT-5 enhancement) or image-seeded generation
- Supports pro upgrades for higher fidelity and longer clips
- Auto-uploads images to a temp host and polls asynchronously for hands-free results
- Redirects directly to the final video URL on completion

**Prerequisites**

- n8n instance with HTTP Request and LangChain nodes enabled
- fal.ai account for Sora 2 API access
- OpenAI account for GPT-5 prompt refinement

**Required Credentials**

fal.ai API setup:
1. Sign up at fal.ai and navigate to Dashboard → API Keys
2. Generate a new key with "sora-2" permissions (full access recommended)
3. In n8n, create a "Header Auth" credential: name it "fal.ai", set Header Name to "Authorization" and Value to "Key [Your API Key]"

OpenAI API setup:
1. Log in at platform.openai.com → API Keys (top-right profile menu)
2. Click "Create new secret key" and copy it (store securely)
3. In n8n, add an "OpenAI API" credential: paste the key and select the GPT-5 model in the LLM node

**Configuration Steps**

1. Import the workflow JSON into your n8n instance via Settings → Import from File
2. Assign fal.ai and OpenAI credentials to the relevant HTTP Request and LLM nodes
3. Activate the workflow; the form URL auto-generates in the trigger node
4. Test by submitting a sample prompt (e.g., "A cat chasing a laser"); monitor executions for video output
5. Adjust the polling wait (60s node) for longer generations if needed

**Use Cases**

- **Social Media Teams**: Generate 9:16 vertical Reels from text ideas, like quick product animations enhanced by GPT-5 for professional polish
- **Content Marketers**: Animate uploaded images into 8s promo clips, e.g., turning a static ad graphic into a dynamic story for email campaigns
- **Educators and Trainers**: Create 4s explainer videos from outlines, such as historical reenactments, using pro mode for detailed visuals
- **App Developers**: Embed as a backend service to process user prompts into Sora 2 videos on-demand for creative tools

**Troubleshooting**

- **API quota exceeded**: Check the fal.ai dashboard for usage limits; upgrade to the pro tier or extend polling waits
- **Prompt refinement fails**: Ensure the GPT-5 credential is set and the output matches the JSON schema; test the LLM node independently
- **Image upload errors**: Confirm the file is JPG/PNG under 10MB; verify the tmpfiles.org endpoint with a manual curl test
- **Endless polling loop**: Add an IF node after 10 checks to time out; increase the wait to 120s for 12s pro generations
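A minimal sketch of the submit-then-poll pattern this workflow implements, assuming fal.ai's queue API conventions (a queue.fal.run host, a request_id in the submit response, and a /status sub-resource). The paths, payload fields, and result field names are assumptions to verify against fal.ai's Sora 2 docs.

```python
import time
import requests

FAL_KEY = "your-fal-api-key"  # placeholder
HEADERS = {"Authorization": f"Key {FAL_KEY}"}
MODEL = "fal-ai/sora-2/text-to-video"  # one of the four endpoints the workflow routes to

# Submit the (GPT-5-refined) prompt to the queue.
submit = requests.post(
    f"https://queue.fal.run/{MODEL}",
    headers=HEADERS,
    json={"prompt": "A cat chasing a laser", "aspect_ratio": "16:9", "duration": "8s"},
).json()
request_id = submit["request_id"]

# The workflow waits 60s between status checks; same idea here.
while True:
    status = requests.get(
        f"https://queue.fal.run/{MODEL}/requests/{request_id}/status",
        headers=HEADERS,
    ).json()
    if status.get("status") == "COMPLETED":
        break
    time.sleep(60)

# Fetch the finished result and redirect the user to the clip.
result = requests.get(
    f"https://queue.fal.run/{MODEL}/requests/{request_id}",
    headers=HEADERS,
).json()
print(result["video"]["url"])  # field name is an assumption; inspect the real response
```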
by furuidoreandoro
Automated TikTok Repurposing & Video Generation Workflow

**Who's it for**

This workflow is designed for content creators, social media managers, and marketers, specifically those in the career, recruitment, or "job change" (転職/就職) niches. It is ideal for anyone looking to automate the process of finding trending short-form content concepts and converting them into fresh AI-generated videos.

**How it works / What it does**

This workflow automates the pipeline from content research to video creation:

1. **Scrape data**: It triggers an Apify actor (clockworks/tiktok-scraper) to search and scrape TikTok videos related to "Job Change" (転職) and "Employment" (就職).
2. **Store raw data**: It saves the scraped TikTok metadata (text, stats, author info) into a Google Sheet.
3. **AI analysis & prompting**: An AI Agent (via OpenRouter) analyzes the scraped video content and creates a detailed prompt for a new video (concept, visual cues, aspect ratio).
4. **Log prompts**: The generated prompt is saved to a separate tab in the Google Sheet.
5. **Video generation**: The prompt is sent to Fal AI (Veo3 model) to generate a new 8-second, vertical (9:16) video with audio.
6. **Wait & retrieve**: The workflow waits for the generation to complete, then retrieves the video file.
7. **Cloud storage**: Finally, it uploads the generated video file to a specific Google Drive folder.

**How to set up**

Credentials: configure the following in n8n:
- Apify API (currently passed via URL query params in the workflow; switching to Header Auth is recommended)
- Google Sheets OAuth2: connect your Google account
- OpenRouter API: for the AI Agent
- Fal AI (Header Auth): for the video generation API
- Google Drive OAuth2: for uploading the final video

Google Sheets: Create a spreadsheet. Note the documentId and update the Google Sheets nodes. Ensure you have the necessary sheet names (e.g., "シート1" for raw data, "生成済み" for prompts) and columns mapped.

Google Drive: Create a destination folder. Update the Upload file node with the correct folderId.

Apify: Update the token in the HTTP Request and HTTP Request1 URLs with your own Apify API token. (A hedged sketch of this scraper call appears at the end of this entry.)

**Requirements**

- **n8n version**: 1.x or higher (workflow uses version 4.3 nodes).
- **Apify account**: with access to clockworks/tiktok-scraper and sufficient credits.
- **Fal.ai account**: with credits for the fal-ai/veo3 model.
- **OpenRouter account**: with credits for the selected LLM.
- **Google Workspace**: access to Drive and Sheets.

**How to customize the workflow**

- **Change the niche**: Update the searchQueries JSON body in the first HTTP Request node (e.g., change "転職" to "Cooking" or "Fitness").
- **Adjust AI logic**: Modify the AI Agent system prompt to change the style, tone, or structure of the video prompts it generates.
- **Video settings**: In the Fal Submit node, adjust bodyParameters to change the duration (e.g., 5s), aspect ratio (e.g., 16:9), or disable audio.
- **Scale**: Increase the amount in the Limit node to process more than one video per execution.
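A rough sketch of the Apify call behind the first HTTP Request node, assuming the run-sync-get-dataset-items endpoint and an input schema with searchQueries and resultsPerPage fields. Check clockworks/tiktok-scraper's input documentation, since actor input fields vary.

```python
import requests

APIFY_TOKEN = "your-apify-token"  # placeholder; the workflow passes this as a URL query param

# Run the TikTok scraper actor synchronously and get its dataset items back.
resp = requests.post(
    "https://api.apify.com/v2/acts/clockworks~tiktok-scraper/run-sync-get-dataset-items",
    params={"token": APIFY_TOKEN},
    json={
        "searchQueries": ["転職", "就職"],  # the niche keywords; swap for "Cooking", "Fitness", etc.
        "resultsPerPage": 10,               # assumed field name
    },
    timeout=300,
)
resp.raise_for_status()
for video in resp.json():
    # text, stats, and author info are what the workflow logs to Google Sheets
    print(video.get("text"), video.get("authorMeta", {}).get("name"))
```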
by Dahiana
AI Content Summarizer Suite

This n8n template collection demonstrates how to build a comprehensive AI-powered content summarization system that handles multiple input types: URLs, raw text, and PDF files. Built as 4 separate workflows for maximum flexibility.

Use cases: research workflows, content curation, document processing, meeting prep, social media content creation, or integrating smart summarization into any app or platform.

**How it works**

- **Multi-input handling**: Separate workflows for URLs (web scraping), direct text input, and PDF file processing
- **Smart PDF processing**: Attempts text extraction first, falls back to OCR.Space for image-based PDFs
- **AI summarization**: Uses OpenAI's GPT-4.1-mini with customizable length (brief/standard/detailed) and focus areas (key points/numbers/conclusions/action items)
- **Language support**: Multi-language summaries with automatic language detection
- **Flexible output**: Returns clean markdown-formatted summaries via webhook responses
- **Unified option**: The all-in-one workflow automatically detects the input type and routes accordingly

**How to use**

- Replace webhook triggers with your preferred method (manual, form, API endpoint)
- Each workflow accepts different parameters: URL, text content, or file upload
- Customize summary length and focus in the AI prompt nodes
- Authentication is optional; switch to "none" if running internally
- Perfect for integration with Bubble, Zapier, or any platform that can make HTTP requests (an example call appears at the end of this entry)

**Requirements**

- OpenAI API key or OpenRouter keys
- OCR.Space API key (for PDF fallback processing)
- n8n instance (cloud or self-hosted)
- Any platform that can make HTTP requests

**Setup Steps**

1. Replace "Dummy OpenAI" with your OpenAI credentials
2. Add your OCR.Space API key in the OCR nodes (optional; only needed for the image-based PDF fallback)
3. Update webhook authentication as needed
4. Test each workflow path individually
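An illustrative call to one of the summarizer webhooks. The path and the parameter names (url, summary_length, focus, language) are hypothetical; match them to whatever your webhook trigger and AI prompt nodes actually expect.

```python
import requests

N8N_WEBHOOK = "https://your-n8n-instance.com/webhook/summarize"  # placeholder path

resp = requests.post(
    N8N_WEBHOOK,
    json={
        "url": "https://example.com/article",  # or send "text" for the raw-text workflow
        "summary_length": "brief",             # brief / standard / detailed
        "focus": "key points",                 # key points / numbers / conclusions / action items
        "language": "en",
    },
    timeout=120,
)
print(resp.json())  # markdown-formatted summary returned by the webhook response node
```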
by Fahmi Fahreza
Sign up for Decodo HERE for a discount.

Automatically scrape, structure, and log forum or news content using Decodo and Google Gemini AI. This workflow extracts key details like titles, URLs, authors, and engagement stats, then appends them to a Google Sheet for tracking and analysis.

**Who's it for?**

Ideal for data journalists, market researchers, or AI enthusiasts who want to monitor trending topics across specific domains.

**How it works**

1. **Trigger**: The workflow runs on a schedule.
2. **Data setup**: Defines the forum URLs and geolocation.
3. **Scraping**: Extracts raw text data using the Decodo API.
4. **AI extraction**: Gemini parses and structures the scraped text into clean JSON.
5. **Data storage**: Each news item is appended or updated in Google Sheets.
6. **Logging**: Records scraping results for monitoring and debugging.

**How to set up**

1. Add your Decodo, Google Gemini, and Google Sheets credentials in n8n.
2. Adjust the forum URLs, geolocation, and Google Sheet ID in the Workflow Config node.
3. Set your preferred trigger interval in the Schedule Trigger.
4. Activate and monitor from the n8n dashboard.
by Fahmi Fahreza
TikTok Trend Analyzer with Apify + Gemini + Airtable

Automatically scrape trending TikTok videos, analyze their virality using Gemini AI, and store insights directly into Airtable for creative research or content planning.

**Who's it for?**

Marketing analysts, creators, and creative agencies looking to understand why videos go viral and how to replicate successful hooks and formats.

**How it works**

1. A scheduled trigger runs the Apify TikTok Trends Scraper weekly.
2. The scraper collects trending video metadata.
3. Data is stored in Airtable (views, likes, captions, sounds, etc.).
4. When a specific video is submitted via webhook, the workflow fetches it from Airtable.
5. Gemini AI analyzes the video and extracts structured insights: summary, visual hook, audio, and subtitle analysis.
6. The workflow updates the Airtable record with these AI insights (a sketch of this update call appears at the end of this entry).

**How to set up**

Connect Apify and Airtable credentials, link Gemini or OpenAI keys, and adjust the schedule frequency. Add your Airtable base and table IDs. You can trigger analysis manually via the webhook endpoint.
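A minimal sketch of the final "update the Airtable record" step, using Airtable's standard REST API. The base/table/record IDs and the field names (Summary, Visual_Hook, and so on) are placeholders; align them with your Airtable schema and the structure Gemini returns.

```python
import requests

AIRTABLE_TOKEN = "your-airtable-token"  # placeholder
BASE_ID, TABLE_ID, RECORD_ID = "appXXXX", "tblXXXX", "recXXXX"  # placeholders

# Write the AI insights back onto the record that triggered the analysis.
resp = requests.patch(
    f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_ID}/{RECORD_ID}",
    headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
    json={
        "fields": {  # hypothetical field names
            "Summary": "Fast-cut product demo with a question hook",
            "Visual_Hook": "Close-up reveal in the first second",
            "Audio_Analysis": "Trending sound, beat-synced cuts",
            "Subtitle_Analysis": "Short, high-contrast captions",
        }
    },
)
resp.raise_for_status()
```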
by Atik
Automate multi-document handling with AI-powered extraction that adapts to any format and organizes it instantly.

**What this workflow does**

- Monitors Google Drive for new uploads (receipts, resumes, claims, physician orders, blueprints, or any doc type)
- Automatically downloads and prepares files for analysis
- Identifies the document type using Google Gemini
- Parses structured data via the trusted VLM Run node with OCR + layout parsing
- Stores records in Google Sheets; an AI Agent maps values to the correct sheet dynamically

**Setup**

Prerequisites: Google Drive & Google Sheets accounts, VLM Run API credentials, n8n instance.

Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can integrate it directly for high-accuracy data extraction.

Quick setup:

1. Configure Google Drive OAuth2 and select a folder for uploads
2. Add VLM Run API credentials
3. Create a Master Reference Google Sheet with the following structure:

| Document_Name | Spreadsheet_ID |
| ---------------------- | ----------------------------- |
| Receipt | your-receipt-sheet-id |
| Resume | your-resume-sheet-id |
| Physician Order | your-physician-order-sheet-id |
| Claims Processing | your-claims-sheet-id |
| Construction Blueprint | your-blueprint-sheet-id |

The first column holds the document type, and the second column holds the target sheet ID where extracted data should be appended.

4. In the AI Agent node, edit the agent prompt to (a plain-Python sketch of this routing logic appears at the end of this entry):
   - Analyze the JSON payload from VLM Run
   - Look up the document type in the Master Reference Sheet
   - If a matching sheet exists, fetch its headers, then append the data accordingly
   - If headers don't exist, create them from the JSON keys, then insert the values
   - If no sheet exists, add the new type to the Master Reference with an empty Spreadsheet ID
5. Test with a sample upload and activate the workflow

**How to customize this workflow to your needs**

Extend functionality by:

- Adjusting the AI Agent prompt to support any new document schema (just update the field mappings)
- Adding support for multi-language OCR or complex layouts in VLM Run
- Linking Sheets data to BI dashboards or reporting tools
- Triggering notifications when new entries are stored

This workflow leverages the VLM Run node for flexible, precision extraction and the AI Agent for intelligent mapping, creating a powerful system that adapts to any document type with minimal setup changes.
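A plain-Python sketch of the decision procedure the AI Agent prompt describes, using in-memory dicts in place of the real Google Sheets calls the agent makes through its tools. The shapes of `master` and `sheets` are illustrative only.

```python
def route_document(doc_type: str, payload: dict, master: dict, sheets: dict) -> None:
    sheet_id = master.get(doc_type)
    if not sheet_id:
        master[doc_type] = ""  # register the new type with an empty Spreadsheet_ID
        return
    sheet = sheets.setdefault(sheet_id, {"headers": [], "rows": []})
    if not sheet["headers"]:
        sheet["headers"] = list(payload.keys())  # create headers from the JSON keys
    # Append one row, aligned to the existing headers.
    sheet["rows"].append([payload.get(h, "") for h in sheet["headers"]])

master = {"Receipt": "receipt-sheet-id"}
sheets: dict = {}
route_document("Receipt", {"vendor": "Acme", "total": 42.5}, master, sheets)
route_document("Invoice", {"number": "INV-1"}, master, sheets)  # unknown type gets registered
print(master)
print(sheets)
```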
by Arlin Perez
**Description**

Effortlessly delete unused or inactive workflows from your n8n instance while automatically backing them up as .json files into your Google Drive. Keep your instance clean, fast, and organized, with no more clutter slowing you down.

This workflow is ideal for users managing large self-hosted n8n setups, or anyone who wants to maintain optimal performance while preserving full workflow backups.

**What it does**

- Accepts a full n8n workflow URL via a form
- Retrieves workflow info automatically
- Converts the workflow's full JSON definition into a file
- Uploads that file to Google Drive
- Deletes the workflow safely using the official n8n API
- Sends a Telegram notification confirming backup and deletion

**How it works**

1. **Form**: Collects the full workflow URL from the user
2. **n8n node (Get Workflow)**: Uses the URL to fetch workflow details
3. **Code node ("JSON to File")**: Converts the workflow JSON into a properly formatted .json file with UTF-8 encoding, ready to be uploaded to Google Drive
4. **Google Drive Upload**: Uploads the .json backup file to your selected Drive folder
5. **n8n node (Delete Workflow)**: Deletes the workflow from your instance using its ID
6. **Telegram notification**: Notifies you that the workflow was backed up and deleted, showing title, ID, and date

**Requirements**

- Google Drive connected to your n8n account
- Telegram Bot connected to n8n
- An n8n instance with API access (self-hosted or Cloud)
- Your n8n API key (create one in the settings)

**How to Set Up**

1. Add your Google Drive credentials
2. Add your Telegram Bot credentials
3. In the "JSON to File" Code node, no additional setup is required; it automatically converts the workflow JSON into a downloadable .json file using the correct encoding and filename format
4. In the Google Drive node, set Binary Property to data and Folder ID to your target folder in Google Drive
5. Create a new credential for the n8n node using your personal n8n API key, with your full n8n instance API path as the Base URL (e.g. https://your-n8n-instance.com/api/v1)
6. Use this credential in both the Get Workflow and Delete Workflow n8n nodes (a sketch of these two API calls appears at the end of this entry)
7. In the Telegram node, use this message template: Workflow "{{ $json.name }}" (ID: {{ $json.id }}) was backed up to Google Drive and deleted from n8n. {{ $now }}

**Important**: This workflow backs up the entire workflow data to Google Drive. Please be careful with the permissions of your Google Drive folder and avoid sharing it publicly, as the backups may contain sensitive information. Ensuring proper security and access control is essential to protect your data.

Activate the workflow and you're ready to safely back up and remove workflows from your n8n instance.
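A hedged sketch of the two n8n public-API calls behind the Get Workflow and Delete Workflow nodes, following n8n's public API conventions (an X-N8N-API-KEY header and /workflows/{id} paths). Confirm against your instance's API docs.

```python
import json
import requests

BASE_URL = "https://your-n8n-instance.com/api/v1"  # the credential's Base URL
HEADERS = {"X-N8N-API-KEY": "your-n8n-api-key"}    # placeholder
workflow_id = "123"  # parsed from the submitted workflow URL

# Fetch the full workflow definition; this JSON becomes the .json backup file.
workflow = requests.get(f"{BASE_URL}/workflows/{workflow_id}", headers=HEADERS).json()
backup_bytes = json.dumps(workflow, ensure_ascii=False, indent=2).encode("utf-8")

# ...upload backup_bytes to Google Drive here, then delete the workflow.
requests.delete(f"{BASE_URL}/workflows/{workflow_id}", headers=HEADERS).raise_for_status()
```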
by Rahul Joshi
**Description**

Turn incoming Gmail messages into Zendesk tickets and keep a synchronized log in Google Sheets. Uses Gmail as the trigger, creates Zendesk tickets, and appends or updates a central sheet for tracking. Gain a clean, auditable pipeline from inbox to support queue.

**What This Template Does**

- Fetches new emails via the Gmail Trigger
- Normalizes the Gmail payload for consistent fields
- Creates a Zendesk ticket from the email content
- Formats data for Sheets and appends or updates a row
- Executes helper sub-workflows and writes logs for traceability

**Key Benefits**

- Converts emails to actionable support tickets automatically
- Maintains a single source of truth in Google Sheets
- Reduces manual triage and data entry
- Improves accountability with structured logs

**Features**

- Gmail Trigger for real-time intake
- Normalize Gmail Data for consistent fields
- Create Zendesk Ticket (create: ticket)
- Format Sheet Data for clean columns
- Log to Google Sheets with appendOrUpdate
- Execute workflow (sub-workflow) steps for modularity

**Requirements**

- n8n instance (cloud or self-hosted)
- Gmail credentials configured in n8n (with read access to the monitored inbox)
- Zendesk credentials (API token or OAuth) with permission to create tickets
- Google Sheets credentials with access to the target spreadsheet for append/update
- Access to any sub-workflows referenced by the Execute workflow nodes

**Target Audience**

- IT support and helpdesk teams managing email-based requests
- Ops teams needing auditable intake logs
- Agencies and service providers converting client emails to tickets
- Small teams standardizing email-to-ticket flows

**Step-by-Step Setup Instructions**

1. Connect Gmail, Zendesk, and Google Sheets in n8n Credentials
2. Set the Gmail Trigger to watch the desired label/inbox
3. Map Zendesk fields (description) from the normalized Gmail data (see the API sketch at the end of this entry)
4. Point the Google Sheets node to your spreadsheet and confirm appendOrUpdate mode
5. Assign credentials to all nodes, including any Execute workflow steps
6. Run once to test end-to-end; then activate the workflow
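For reference, this is the underlying call the Create Zendesk Ticket node wraps, per Zendesk's standard Tickets API. The subdomain, email, and token are placeholders; the n8n node handles all of this once credentials are assigned.

```python
import requests

SUBDOMAIN = "yourcompany"  # placeholder
AUTH = ("agent@yourcompany.com/token", "your-zendesk-api-token")  # email/token auth

payload = {
    "ticket": {
        "subject": "Printer offline",            # normalized Gmail subject
        "comment": {"body": "Email body text"},  # normalized Gmail body
        "requester": {"email": "user@example.com", "name": "Jane Doe"},
    }
}
resp = requests.post(
    f"https://{SUBDOMAIN}.zendesk.com/api/v2/tickets.json",
    auth=AUTH,
    json=payload,
)
resp.raise_for_status()
print(resp.json()["ticket"]["id"])  # the ticket ID to log in Google Sheets
```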
by Meak
Firecrawl Web Search Agent → Google Sheets Logger with OpenRouter + n8n

Most teams craft search operators by hand and copy results into spreadsheets. This workflow automates query generation, multi-operator searches, scraping, and logging, all from a single webhook call.

**Benefits**

- Auto-generate Firecrawl queries from natural language (OpenRouter Agent)
- Use pro operators: site:, inurl:, intitle:, exclusions, related
- Run parallel searches (site match, in-URL, exclusions, YouTube/intitle)
- Append titles/URLs/results to Google Sheets automatically
- Return results to the caller via the webhook response
- Optional scraping of markdown + full-page screenshots

**How It Works**

1. Webhook receives a natural-language search request
2. The OpenRouter-powered Agent converts it to a Firecrawl query (+ limit)
3. Firecrawl Search runs with scrapeOptions (markdown, screenshot)
4. Parallel queries: site:, inurl:, negative filters, YouTube intitle:automation
5. Collect results (title, url, data fields) from each call
6. Append rows to Google Sheets (one per result)
7. Respond to the webhook with the aggregated payload
8. Ready to chain into alerts, enrichment, or CRM sync

**Who Is This For**

- Researchers and content teams building source lists
- Growth/SEO teams needing precise operator queries
- Agencies automating discovery, monitoring, and logging

**Setup**

1. Connect OpenRouter (select your LLM, e.g., GPT-4.1-mini)
2. Add your Firecrawl API key and endpoint (/v1/search); a request sketch appears at the end of this entry
3. Connect Google Sheets (Document ID + Sheet/Tab)
4. Set the webhook path and allow POST from your app
5. Define the default limit (fallback = 5) and scrapeOptions

**ROI & Monetization**

- Save 3-6 hours/week on manual searching and copy/paste
- Offer as a $500-$2k/month research automation for clients
- Upsell alerts (cron/webhook) and data enrichment for premium retainers

**Strategy Insights**

In the full walkthrough, I show how to:

- Prompt the Agent to produce flawless site:, inurl:, intitle:, and -exclusion queries
- Map Firecrawl data fields cleanly into Sheets
- Handle rate limits, empty results, and retries
- Extend with dedupe, domain filtering, and Slack/Telegram alerts

**Check Out My Channel**

For more advanced AI automation systems that generate real business results, check out my YouTube channel where I share the exact strategies I use to build automation agencies, sell high-value services, and scale to $20k+ monthly revenue.
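A minimal sketch of the Firecrawl /v1/search call this workflow issues, based on the endpoint and scrapeOptions the template mentions. Treat the exact payload fields and response shape as assumptions to verify against Firecrawl's docs.

```python
import requests

FIRECRAWL_KEY = "your-firecrawl-api-key"  # placeholder

# One of the operator queries the Agent generates, with scraping enabled.
resp = requests.post(
    "https://api.firecrawl.dev/v1/search",
    headers={"Authorization": f"Bearer {FIRECRAWL_KEY}"},
    json={
        "query": 'site:github.com inurl:n8n -inurl:issues "workflow template"',
        "limit": 5,  # the workflow's fallback default
        "scrapeOptions": {"formats": ["markdown", "screenshot"]},
    },
)
resp.raise_for_status()
for item in resp.json().get("data", []):  # response shape is an assumption
    print(item.get("title"), item.get("url"))  # the fields appended to Google Sheets
```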
by Rapiwa
Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

**Who is this for?**

This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It's perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don't use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

**What this Workflow Does**

This n8n workflow connects with a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers, checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

**Key Features**

- **Runs every 5 minutes**: Automatically triggers the workflow
- **Google Sheets integration**: Reads and writes data from a specific sheet
- **Phone number validation**: Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message sending**: Sends a message using Rapiwa's /send-message endpoint
- **Status update**: The sheet is updated with a success or failure status
- **Safe API usage**: Delays are added between requests to prevent rate limits
- **Batch limit**: Processes a maximum of 60 rows per cycle
- **Conditional checks**: Skips rows without a "check" value

**Requirements**

- A Google Sheet with the necessary columns
- Rapiwa account with an active subscription (the free tier includes 200 messages)
- Your WhatsApp number connected to Rapiwa
- A valid Bearer Token
- n8n instance (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

**How to Use: Step-by-Step Setup**

1. **Webhook**: Receives the Shopify webhook (discount creation) via an HTTP POST request. This is triggered when a discount is created in your Shopify store.
2. **Configure Google Sheets in n8n**: Use the Google Sheets node with OAuth2 access.
3. **Get a Rapiwa API token**: Create an account on Rapiwa, connect your WhatsApp number, and copy your Bearer Token from the Rapiwa dashboard.
4. **Set up the HTTP Request nodes**: Validate numbers via https://app.rapiwa.com/api/verify-whatsapp, send messages via https://app.rapiwa.com/api/send-message, and add your bearer token to the headers. (A hedged sketch of these two calls appears at the end of this entry.)

**Google Sheet Column Structure**

A Google Sheet formatted like this (sample):

| discount_code | created_at | shop_domain | name | number | verify | status |
| ------------- | ------------------------- | ---------------- | ------------ | ------------- | ---------- | -------- |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified | sent |

**Support & Help**

- Rapiwa website: https://rapiwa.com
- WhatsApp: Chat on WhatsApp
- Discord: SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer portfolio: Codecanyon SpaGreen
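A sketch of the verify-then-send pair of Rapiwa calls the HTTP Request nodes make. The endpoints come from the template itself; the JSON field names ("number", "message") and the response shape are assumptions to check against Rapiwa's API docs.

```python
import time
import requests

HEADERS = {"Authorization": "Bearer your-rapiwa-token"}  # placeholder

number = "8801322827798"

# Step 1: check whether the number is an active WhatsApp account.
verify = requests.post(
    "https://app.rapiwa.com/api/verify-whatsapp",
    headers=HEADERS,
    json={"number": number},  # assumed field name
).json()

# Step 2: only message verified numbers, mirroring the sheet's verify column.
if verify.get("status") == "verified":  # assumed response shape
    requests.post(
        "https://app.rapiwa.com/api/send-message",
        headers=HEADERS,
        json={"number": number, "message": "Use code V8ZGVRDFP5TB for a discount!"},
    ).raise_for_status()

time.sleep(2)  # the workflow adds delays between requests to avoid rate limits
```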
by Evoort Solutions
**Website Traffic Monitoring with SEMrush API and Google Sheets Integration**

Leverage the powerful SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

**Node-by-Node Workflow Breakdown**

1. **On Form Submission**
   - Trigger: The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
   - Use case: When you want to track multiple websites and monitor their performance over time.

2. **Website Traffic Checker**
   - API request: The workflow makes a POST request to the SEMrush Website Traffic Checker API via RapidAPI using the website URL that was submitted.
   - API data: The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

3. **Reformat**
   - Parsing: The raw API response is parsed to extract the relevant data under trafficSummary.
   - Data structure: The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

4. **Google Sheets**
   - Logging data: The traffic data is appended as a new row in your Google Sheet.
   - Google Sheet setup: The data is organized and updated in a structured format, allowing you to track website performance over time.

**Use Cases**

- **SEO & digital marketing agencies**: Automate client website audits by pulling live traffic data into reports.
- **Website owners & bloggers**: Monitor traffic growth and analyze content performance automatically.
- **Data analysts & reporting teams**: Feed traffic data into dashboards and integrate with other KPIs for deeper analysis.
- **Competitor tracking**: Regularly log competitor site metrics for comparative benchmarking.

**Key Benefits**

- Automated traffic monitoring: Run reports automatically on demand or on a scheduled basis.
- Real-time Google Sheets logging: Easily centralize and structure traffic data for sharing and visualization.
- Zero code required: Powered by n8n's visual builder; set up workflows quickly without writing a single line of code.
- Scalable & flexible: Extend the workflow to include alerts, additional API integrations, or other automated tasks.

**How to Get Your SEMrush API Key via RapidAPI**

1. Visit the API listing: SEMrush Website Traffic Checker API
2. Sign in or create an account: Log in to RapidAPI or sign up for a free account.
3. Subscribe to the API: Choose the appropriate pricing plan and click Subscribe.
4. Access your API key: Go to the Endpoints tab; your API key is located under the X-RapidAPI-Key header.
5. Secure and use the key: Add your API key to the request headers in your workflow. Never expose the key publicly.

**Step-by-Step Setup Instructions**

1. **Create the form to capture the URL**
   - In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
   - Configure the webhook to accept URL submissions from your form.
   - Add a form to your website or app that triggers the webhook when a URL is submitted.

2. **Configure the SEMrush API Request node**
   - Add an HTTP Request node after the webhook.
   - Set the method to POST and the URL to the SEMrush API endpoint.
   - Add the necessary headers: X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com and X-RapidAPI-Key: [Your API Key].
   - Pass the captured website URL from the webhook as a parameter in the request body. (A hedged request sketch appears at the end of this entry.)

3. **Reformat the API response**
   - Add a Set node to parse and structure the API response.
   - Extract only the necessary data, such as trafficSummary.visits, trafficSummary.bounceRate, trafficSummary.pageViews, and trafficSummary.sessions.
   - Format the response so it is clean and suitable for Google Sheets.

4. **Store data in Google Sheets**
   - Add the Google Sheets node to your workflow.
   - Authenticate with your Google account.
   - Select the spreadsheet and worksheet where you want to store the traffic data.
   - Configure the node to append new rows with the extracted traffic data.
   - Google Sheets column setup: A: Website URL, B: Visits, C: Bounce Rate, D: Page Views, E: Sessions, F: Date/Time (optional timestamp).

5. **Test and deploy**
   - Run a test submission through your form to ensure the workflow works as expected.
   - Check the Google Sheets document to verify that the data is being logged correctly.
   - Set up scheduling or additional workflows as needed (e.g., periodic updates).

**Customizing the Template**

You can modify the workflow to suit your specific needs:

- **Add more data points**: Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords, etc.).
- **Create separate sheets**: If you're tracking multiple websites, create a different sheet for each website or group websites by category.
- **Add alerts**: Set up email or Slack notifications if specific traffic conditions (like sudden drops) are met.
- **Visualize data**: Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

**Start Automating in Minutes**

Build your automated website traffic dashboard with n8n today, no coding required. Start with n8n for free. Save time, improve accuracy, and supercharge your traffic insights workflow!
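A hedged sketch of the HTTP Request node's call in step 2. The RapidAPI host and headers come from the template; the exact path ("/traffic") and body field name ("url") are assumptions, so copy the real values from the API's Endpoints tab on RapidAPI.

```python
import requests

RAPIDAPI_KEY = "your-rapidapi-key"  # placeholder

resp = requests.post(
    "https://semrush-website-traffic-checker.p.rapidapi.com/traffic",  # path is an assumption
    headers={
        "X-RapidAPI-Host": "semrush-website-traffic-checker.p.rapidapi.com",
        "X-RapidAPI-Key": RAPIDAPI_KEY,
    },
    json={"url": "https://example.com"},  # the submitted website URL; field name assumed
)
resp.raise_for_status()

# Pull the fields the Set node extracts and shape them like the sheet row.
summary = resp.json().get("trafficSummary", {})
row = [  # maps to Google Sheets columns A-E
    "https://example.com",
    summary.get("visits"),
    summary.get("bounceRate"),
    summary.get("pageViews"),
    summary.get("sessions"),
]
print(row)
```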