by Yang
## Who is this for?

This workflow is for social media agencies, influencer marketers, and brand managers who need to automatically qualify TikTok creators based on their follower metrics. It's especially useful for teams managing influencer outreach campaigns or building talent databases.

## What problem is this workflow solving?

Manually tracking TikTok user stats is time-consuming and inconsistent. This automation instantly pulls TikTok profile data and only saves creators who meet a defined follower threshold. It removes manual vetting, reduces spreadsheet work, and makes influencer qualification scalable.

## What this workflow does

This workflow uses Airtable as the trigger, Dumpling AI to scrape TikTok profile information, and a logic condition to check whether the profile has at least 100k followers. Qualified profiles are updated with full metrics and stored back in Airtable.

## Setup

### Airtable Setup
- Create a table with a field named `Tik tok username`
- Connect your Airtable account to n8n using a Personal Access Token
- Set up a trigger to run when a new TikTok username is added

### Dumpling AI
- Sign up at Dumpling AI
- Create a Dumpling AI credential in n8n using your API key
- The HTTP node sends the TikTok handle to Dumpling's `/get-tiktok-profile` endpoint

### Configure Filter
- The IF node checks if `followerCount` is greater than or equal to 100000 (see the sketch at the end of this section)

### Airtable Update
If qualified, the record is updated with:
- ID (TikTok ID)
- followerCount
- followingCount
- heartCount
- videoCount

## How to customize this workflow to your needs
- Change the follower count threshold to fit your campaign (e.g. 10K, 500K, 1M)
- Add fields like engagement rate, niche tags, or scraped bio
- Chain additional steps like sending approved creators to your CRM or triggering outreach messages
- Add another filter to exclude private or inactive accounts
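For reference, here is a minimal sketch of the qualification step as it might look in an n8n Code node instead of the IF node. The field names (`followerCount`, `followingCount`, `heartCount`, `videoCount`) assume the shape of Dumpling AI's `/get-tiktok-profile` response, so verify them against your actual payload.

```typescript
// Hypothetical qualification check for an n8n Code node.
const THRESHOLD = 100_000; // adjust to 10K, 500K, 1M, etc.

const qualified = $input.all().filter(
  (item) => Number(item.json.followerCount ?? 0) >= THRESHOLD,
);

// Keep only the metrics the Airtable update step writes back.
return qualified.map((item) => ({
  json: {
    id: item.json.id, // TikTok ID
    followerCount: item.json.followerCount,
    followingCount: item.json.followingCount,
    heartCount: item.json.heartCount,
    videoCount: item.json.videoCount,
  },
}));
```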
by Miquel Colomer
This n8n workflow template automates the process of collecting and delivering the "Top Deals of the Day" from MediaMarkt, tailored to user preferences. By combining user-submitted forms, Bright Data web scraping, GPT-4o-mini deal generation, and email delivery, this workflow sends personalized product recommendations straight to a user's inbox.

> ⚠️ Note: This workflow uses community nodes (Bright Data and Document Generator) which only work on *self-hosted n8n instances*.

## 🚀 What It Does
- Collects user preferences via a form (categories + email)
- Scrapes MediaMarkt's deals page using Bright Data
- Uses GPT-4o-mini (OpenAI) to recommend top deals
- Generates a structured HTML email using a template
- Sends the personalized deals directly via email

## 🧩 Community Node Integration
We created and used the following community nodes:
- **Bright Data** – To scrape MediaMarkt deals using proxy-based scraping
- **Document Generator** – To generate a templated HTML document from deal data

These nodes are not available in n8n Cloud and require self-hosted n8n.

## 🛠️ Step-by-Step Setup
1. **Install Community Nodes**: Make sure you're on a self-hosted n8n instance, then install `n8n-nodes-brightdata` and `n8n-nodes-document-generator`.
2. **Configure Credentials**: Bright Data API Key (Proxy + Scraping setup), OpenAI API Key (GPT-4o-mini access), and SMTP credentials for sending emails.
3. **Customize the Form**: Adapt the form node to collect desired categories and email addresses. Typical categories include appliances, phones, laptops, etc.
4. **Design Your HTML Template**: In the Document Generator node, you can tweak the HTML/CSS to change how deals appear in the final email.
5. **Test the Workflow**: Submit the form with test data and check that the entire flow—from scraping to email—executes as expected.

## 🧠 How It Works: Workflow Overview
1. **User Interaction via Form**: Users select product categories and enter their email. This triggers the workflow.
2. **Data Extraction via Bright Data**: Bright Data scrapes the MediaMarkt offers page and returns HTML content.
3. **HTML Parsing**: Key elements like product names, prices, and links are extracted for processing (see the sketch at the end of this section).
4. **GPT-4o-mini Recommendation Generation**: The extracted data is sent to OpenAI (GPT-4o-mini), which filters, ranks, and enhances deals based on the user's preferences.
5. **Data Structuring & Split**: The result is split into individual deal items to be formatted.
6. **HTML Document Creation**: Document Generator populates a clean HTML template with the top recommended deals.
7. **Email Delivery**: The final document is emailed via SMTP to the user with a friendly message.

## 📨 Final Output
Users receive a custom HTML email featuring a curated list of top MediaMarkt deals based on their selected categories.

## 🔐 Credentials Used
- **Bright Data API** – Web scraping with proxy support
- **OpenAI API** – Generating personalized recommendations
- **SMTP** – Sending personalized deal emails

## ✨ Customization Tips
- **Change the Data Source**: You can adapt this to scrape other e-commerce sites.
- **Update the Email Template**: Make it match your branding or include images.
- **Extend the Form**: Add preferences like price range or specific brands.
- **Add Scheduling**: Use Cron to run the workflow daily or weekly.

## ❓ Questions?
Template and node created by Miquel Colomer and n8nhackers.com. Need help customizing or deploying? Contact us for consulting and support.
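The HTML-parsing step can be done with n8n's HTML Extract node or a Code node. Below is a hypothetical Code-node sketch; the `product-card`, `name`, and `price` class names are placeholders, since MediaMarkt's real markup will differ. Inspect the HTML that Bright Data returns and adjust the pattern accordingly.

```typescript
// Hypothetical extraction of deal data from scraped HTML (n8n Code node).
const html = $input.first().json.data as string; // raw HTML from Bright Data

// Placeholder pattern: one product card per match, capturing link, name, price.
const cardPattern =
  /<a[^>]+href="(?<link>[^"]+)"[^>]*class="[^"]*product-card[^"]*"[\s\S]*?class="[^"]*name[^"]*"[^>]*>(?<name>[^<]+)[\s\S]*?class="[^"]*price[^"]*"[^>]*>(?<price>[^<]+)/g;

return [...html.matchAll(cardPattern)].map((m) => ({
  json: {
    name: m.groups?.name?.trim(),
    price: m.groups?.price?.trim(),
    link: m.groups?.link,
  },
}));
```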
by bangank36
## Overview
This workflow retrieves all blog and event collection items from a Squarespace site and saves them into a Google Sheets spreadsheet. It uses pagination to fetch 20 items per request, ensuring all content is collected efficiently.

## How It Works
1. The workflow queries your Squarespace blog and event collections.
2. It fetches data in paginated batches (20 items per page; see the sketch at the end of this section).
3. The retrieved data is formatted and inserted into Google Sheets.
4. The workflow runs on demand or on a schedule, ensuring your data stays up to date.

## Requirements
### Credentials
To use this template, you need:
- Your Squarespace collection URL
- Google Sheets API credentials

### Google Sheets Setup
Use this sample Google Sheets template to get started quickly.

## Who Is This For?
This template is designed for:
- Bloggers looking to manage and analyze content externally.
- Businesses and marketers tracking content performance.
- Anyone who needs an automated way to extract Squarespace blog and event data.

## Explore More Templates
Check out my other n8n templates: 👉 n8n.io/creators/bangank36
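Here is a sketch of the pagination loop in TypeScript. It assumes Squarespace's public JSON view (appending `?format=json` to a collection URL), which typically returns an `items` array plus a `pagination` object; confirm the exact field names against your own site's response before relying on them.

```typescript
// Hypothetical pagination sketch: fetch every page of a Squarespace collection.
const baseUrl = "https://example.squarespace.com/blog"; // your collection URL

const allItems: unknown[] = [];
let url: string | undefined = `${baseUrl}?format=json`;

while (url) {
  const page = await fetch(url).then((r) => r.json());
  allItems.push(...(page.items ?? [])); // roughly 20 items per page

  // Follow the offset cursor until Squarespace reports no further pages.
  url = page.pagination?.nextPage
    ? `${baseUrl}?format=json&offset=${page.pagination.nextPageOffset}`
    : undefined;
}

// Each item in allItems then becomes a row for the Google Sheets node.
console.log(`Fetched ${allItems.length} items`);
```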
by n8n Team
This workflow syncs Discord scheduled events to Google Calendar. On a specified schedule, a request to Discord's API is made to get the scheduled events on a particular server. Only the events that have not been created or have recently been updated will be sent to Google Calendar.

## Prerequisites
- Discord account and Discord credentials.
- Google account and Google credentials.

## How it works
1. Triggers off on the On schedule node.
2. Gets the scheduled events from Discord (see the sketch at the end of this section).
3. The IDs of the Discord scheduled events are used to get the events from Google Calendar, since the IDs are the same on creation of the Google Calendar event. We can now determine which events are new or have been updated.
4. The new or updated events are created or updated in Google Calendar.
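A minimal sketch of the Discord call and the ID-based matching idea, assuming a bot token with access to the guild (`GET /guilds/{guild.id}/scheduled-events` is Discord's documented endpoint):

```typescript
// Fetch a guild's scheduled events from Discord's API.
const GUILD_ID = "YOUR_GUILD_ID"; // placeholder
const BOT_TOKEN = process.env.DISCORD_BOT_TOKEN!;

const events: Array<{ id: string; name: string; scheduled_start_time: string }> =
  await fetch(
    `https://discord.com/api/v10/guilds/${GUILD_ID}/scheduled-events`,
    { headers: { Authorization: `Bot ${BOT_TOKEN}` } },
  ).then((r) => r.json());

// Each Discord event ID doubles as the Google Calendar event ID, so looking
// up that ID in the calendar tells us whether to create or update the event.
for (const event of events) {
  console.log(event.id, event.name, event.scheduled_start_time);
}
```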
by hani safaei
This template helps anyone track how often their website appears in Google's AI Overview, a growing part of search results that can't currently be tracked using traditional SEO tools.

With this workflow, users can:
- Input a list of keywords (from Google Search Console or manual research).
- Use SerpApi to pull Google search results.
- Extract AI Overview content and its list of sources.
- Map that information into a structured Google Sheet, including whether your site is listed in those sources.

Setup is straightforward and fully automated, but you'll need:
- A SerpApi key
- A connected Google Sheets account

## Who is this for?
This workflow is designed for SEO professionals, digital marketers, and site owners who want to track their website's visibility in Google AI Overviews.

## What problem does it solve?
AI Overviews are rapidly becoming more common in Google search results. However, there's no tool (yet) that tells you if your website is appearing in those answers, which is a blind spot for SEO. This workflow lets you check your site's presence in AI Overviews at scale.

## What does the workflow do?
The workflow:
1. Takes a list of target keywords (exported from GSC or elsewhere)
2. Uses SerpApi to get search results from Google
3. Extracts the AI Overview block and its sources
4. Checks if your domain is among them (see the sketch at the end of this section)
5. Saves all results into a Google Sheet

The final Google Sheet will contain the following columns: `Keyword | AI Overview Exists | List of Sources | Is my domain listed`

## Setup
You'll need:
- A SerpApi API key
- A Google Sheet with your list of keywords
- A connected Google Sheets account in n8n

## How to customize this workflow
- Change the list of keywords (pull from GSC or edit the sheet manually)
- Replace the placeholder domain with your own
- Adjust the Google Sheet column mapping as needed
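The domain check can be expressed as a small Code-node step. The `ai_overview` and `references` field names below follow SerpApi's response shape for Google results, but that shape may evolve, so verify against a live response from your account.

```typescript
// Hypothetical check: does my domain appear among the AI Overview sources?
const MY_DOMAIN = "example.com"; // replace the placeholder domain with your own

return $input.all().map((item) => {
  const overview = item.json.ai_overview as
    | { references?: Array<{ link: string }> }
    | undefined;
  const sources = (overview?.references ?? []).map((ref) => ref.link);

  return {
    json: {
      keyword: (item.json.search_parameters as { q?: string } | undefined)?.q,
      aiOverviewExists: Boolean(overview),
      sources: sources.join(", "),
      isMyDomainListed: sources.some((link) => link.includes(MY_DOMAIN)),
    },
  };
});
```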
by Ahmed Saadawi
# 📝 Sync MySQL Rows to Google Sheet

**Description:** This n8n template automates the process of syncing new records from a MySQL database table into a Google Sheet, ideal for reporting, backup, or lightweight dashboards. It is designed for teams or individuals who need to periodically export new data rows from a custom database (e.g., CRM, registrations, surveys) into a structured Google Sheet for further analysis, sharing, or archiving—without duplicates.

## 🛠️ What This Workflow Does
- **Runs every 15 minutes** via a schedule trigger.
- **Selects unsynced rows** (`sync = 0`) from a MySQL table (`fifa25_customers`); see the SQL sketch at the end of this section.
- **Checks if records exist** to prevent unnecessary writes.
- **Appends records to a Google Sheet**, mapping fields like name, email, phone, gender, and more.
- **Updates the MySQL table** to mark those rows as synced (`sync = 1`) to avoid reprocessing.
- Fully annotated using sticky notes for easier understanding and onboarding.

## 📋 Setup Instructions
1. Create or select a Google Sheet and make sure the columns match the following: `id, name, phone, birthdate, email, region, gender, datatime`
2. Ensure your MySQL table (`fifa25_customers`) has a `sync` column (default = 0 for new rows).
3. Connect your MySQL and Google Sheets credentials inside n8n.
4. (Optional) Add custom filtering or column transformations as needed.

## 👤 Who Is It For?
- Marketers syncing leads to a spreadsheet
- Ops teams pulling user data from internal tools
- Analysts logging form submissions or customer data
- Anyone needing lightweight scheduled ETL from MySQL to Sheets

## 🔐 Credentials Required
- **MySQL**
- **Google Sheets OAuth2**

## ✅ Best Practices Followed
- Uses IF node to prevent unnecessary processing
- Updates source database to avoid duplicates
- Includes sticky notes for clarity
- All columns are explicitly mapped
- Works out-of-the-box on any n8n instance with proper creds
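The select-then-mark pattern at the heart of this template looks roughly like the sketch below, shown with a generic `mysql2` client (inside n8n, the same two statements live in the MySQL nodes). Table and column names come from the template; adjust them to your schema.

```typescript
// Sketch of the unsynced-rows export loop.
import mysql from "mysql2/promise";

const db = await mysql.createConnection({ /* your MySQL credentials */ });

// 1. Pull only rows that have not been synced yet.
const [rows] = await db.execute(
  "SELECT id, name, phone, birthdate, email, region, gender, datatime " +
    "FROM fifa25_customers WHERE sync = 0",
);

// 2. ...append `rows` to the Google Sheet here...

// 3. Mark exactly those rows as synced so the next run skips them.
const ids = (rows as Array<{ id: number }>).map((r) => r.id);
if (ids.length > 0) {
  await db.query("UPDATE fifa25_customers SET sync = 1 WHERE id IN (?)", [ids]);
}
```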
by Rizky Febriyan
## How It Works
This workflow automates the analysis of security alerts from Sophos Central, turning raw events into actionable intelligence. It uses the official Sophos SIEM integration tool to fetch data, enriches it with VirusTotal, and leverages Google Gemini to provide a real-time threat summary and mitigation plan via Telegram.

**Prerequisite (Important):** This workflow is triggered by a webhook that receives data from an external Python script. You must first set up the Sophos-Central-SIEM-Integration script from the official Sophos GitHub. This script will fetch data and forward it to your n8n webhook URL.

Tool source code: `Sophos/Sophos-Central-SIEM-Integration`

## The n8n Workflow Steps
1. **Webhook**: Receives enriched event and alert data from the external Python script.
2. **IF (Filter)**: Immediately filters the incoming data to ensure only events with a high or critical severity are processed, reducing noise from low-priority alerts.
3. **Code (Prepare Indicator)**: Intelligently inspects the Sophos event data to extract the primary threat indicator. It prioritizes indicators in the following order: File Hash (SHA256), URL/Domain, or Source IP (see the sketch at the end of this section).
4. **HTTP Request (VirusTotal)**: The extracted indicator is sent to the VirusTotal API to get a detailed reputation report, including how many security vendors flagged it as malicious.
5. **Code (Prompt for Gemini)**: The raw JSON output from VirusTotal is processed into a clean, human-readable summary and a detailed list of flagging vendors.
6. **AI Agent (Google Gemini)**: All collected data—the original Sophos log, the full alert details, and the formatted VirusTotal reputation—is compiled into a detailed prompt for Gemini. The AI acts as a virtual SOC analyst to:
   - Create a concise incident summary.
   - Determine the risk level.
   - Provide a list of concrete, actionable mitigation steps.
7. **Telegram**: The complete analysis and mitigation plan from Gemini is formatted into a clean, easy-to-read message and sent to your specified Telegram chat.

## Setup Instructions
1. Configure the external Python script to forward events to this workflow's Production URL.
2. In n8n, create credentials for Google Gemini, VirusTotal, and Telegram.
3. Assign the newly created credentials to the corresponding nodes in the workflow.
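For illustration, the "Prepare Indicator" step might look like this in the Code node. The Sophos field names (`appSha256`, `url`, `source_info.ip`) are assumptions about the event payload, so map them to whatever your SIEM script actually forwards.

```typescript
// Hypothetical indicator extraction with SHA256 > URL > IP priority.
const event = $input.first().json as Record<string, any>;

let indicator: string | undefined;
let indicatorType: "sha256" | "url" | "ip" | undefined;

if (event.appSha256) {
  // Priority 1: file hash, the strongest indicator for VirusTotal lookups.
  indicator = event.appSha256;
  indicatorType = "sha256";
} else if (event.url) {
  // Priority 2: URL or domain involved in the event.
  indicator = event.url;
  indicatorType = "url";
} else if (event.source_info?.ip) {
  // Priority 3: fall back to the source IP.
  indicator = event.source_info.ip;
  indicatorType = "ip";
}

return [{ json: { ...event, indicator, indicatorType } }];
```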
by InfraNodus
## Using knowledge graphs instead of RAG vector stores
This workflow creates an AI chatbot agent that has access to several knowledge bases at the same time (used as "experts"). These knowledge bases are provided by InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows.

The advantages of using GraphRAG instead of standard vector stores for knowledge are:
- Easy and quick to set up (no complex data import workflows needed)
- A knowledge graph has a holistic view of your knowledge base
- Better retrieval of relations between the document chunks = higher quality responses

## How it works
This template uses the n8n AI Agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt. Here's how it works, step by step:
1. The user submits a question using the AI chatbot (the n8n chat interface in this case, which can be accessed via a URL or embedded into any website).
2. The AI Agent node checks the list of tools it has access to. Each tool has a description of its knowledge, auto-generated by InfraNodus.
3. The AI agent decides which tool should be used to generate a response. It may reformulate the user's query to be more suitable for the expert.
4. The query is then sent to the InfraNodus HTTP node endpoint, which queries the graph that corresponds to that expert (see the sketch at the end of this section).
5. Each InfraNodus GraphRAG expert (graph) provides a rich response that takes the whole context into account, along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.
6. The n8n AI Agent node integrates the responses received from the experts to produce the final answer.
7. The final answer is sent back to the user's chat (or a webhook endpoint).

## How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body `name` field. Keep other settings intact or learn more about them at the InfraNodus access points page.
5. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.

## Requirements
- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key

## Customizing this workflow
You can use this same workflow with a Telegram bot, so you can interact with it using Telegram. There are many more customizations available. Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20174217658396-Using-InfraNodus-Knowledge-Graphs-as-Experts-for-AI-Chatbot-Agents-in-n8n

Also check out the video tutorial with a demo.
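For orientation, the expert tool call made by each InfraNodus HTTP node looks roughly like the sketch below. Only the Bearer authorization and the body `name` field are confirmed by the setup steps above; the endpoint URL should be copied from the template's HTTP node, and the `prompt` field is an illustrative assumption.

```typescript
// Hypothetical sketch of one expert (knowledge graph) query.
const INFRANODUS_API_KEY = process.env.INFRANODUS_API_KEY!;
const INFRANODUS_ENDPOINT = process.env.INFRANODUS_ENDPOINT!; // copy from the template's HTTP node

const response = await fetch(INFRANODUS_ENDPOINT, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${INFRANODUS_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    name: "my_expert_graph", // the name of the knowledge graph for this expert
    prompt: "the user's question, as reformulated by the AI Agent node", // assumed field
  }),
});

console.log(await response.json()); // rich GraphRAG response + relevant statements
```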
by damo
## Overview
This workflow leverages the KIE.AI Veo3 model to generate AI videos from simple text descriptions. Users interact via a form interface, inputting a prompt (e.g., a scene description), and the system automatically submits the request to the KIE.AI API, monitors the generation status in real time, and retrieves the final video output. It's ideal for content creators, marketers, or developers exploring text-to-video AI creation, supporting intelligent video generation with minimal setup.

## Prerequisites
- A KIE.AI account and API key: Sign up at KIE.AI to obtain your free or paid API key.
- An active n8n instance (cloud or self-hosted) with HTTP Request and form submission capabilities.
- Basic knowledge of AI prompts for video generation to achieve optimal results.

## Setup Instructions
1. **Obtain API Key**: Register at KIE.AI and generate your API key. Store it securely—do not share it publicly.
2. **Configure the Form**: In the "On Form Submission" node, ensure fields like "prompt" (for the video description) and "api_key" are set up. Example prompt: "A serene mountain landscape at sunset with birds flying."
3. **Test the Workflow**: Click "Execute Workflow" in n8n, access the generated form URL, and submit your prompt and API key. The workflow will poll the API every 10 seconds until the video is ready (see the polling sketch at the end of this section), then display the results.
4. **Handle Outputs**: The final node formats and displays the video file URL for download or embedding.

## Customization Tips
- **Enhance Prompts**: Include specifics like duration, style (e.g., realistic, animated), actions, and visual elements to improve AI video quality.
- **Keywords for SEO**: This template focuses on AI video generation, text-to-video models, Veo3 API integration, and automated workflows.
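The submit-then-poll pattern looks roughly like this. The endpoint paths and response fields (`taskId`, `status`, `videoUrl`) are placeholders rather than KIE.AI's documented API, so use the URLs and field names from the template's HTTP Request nodes.

```typescript
// Hypothetical sketch of the 10-second polling loop.
const API_KEY = process.env.KIE_AI_API_KEY!;
const headers = {
  Authorization: `Bearer ${API_KEY}`,
  "Content-Type": "application/json",
};

// 1. Submit the generation request.
const { taskId } = await fetch("https://api.example.com/veo3/generate", {
  method: "POST",
  headers,
  body: JSON.stringify({
    prompt: "A serene mountain landscape at sunset with birds flying.",
  }),
}).then((r) => r.json());

// 2. Poll every 10 seconds until the task reports completion.
let videoUrl: string | undefined;
while (!videoUrl) {
  await new Promise((resolve) => setTimeout(resolve, 10_000));
  const status = await fetch(
    `https://api.example.com/veo3/status/${taskId}`,
    { headers },
  ).then((r) => r.json());
  if (status.status === "completed") videoUrl = status.videoUrl;
}

console.log("Generated video:", videoUrl);
```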
by Ranjan Dailata
## Who this is for
Extract Amazon Best Seller Electronic Info is an automated workflow that extracts best seller data from Amazon's Electronics section using Bright Data Web Unlocker, transforms it into structured JSON using Google Gemini's LLM, and forwards a fully structured JSON response to a specified webhook for downstream use.

This workflow is tailored for:
- **eCommerce Analysts** who need to monitor Amazon best-seller trends in the Electronics category and track changes in real time or on a schedule.
- **Product Intelligence Teams** who want structured insights on competitor offerings, including rankings, prices, ratings, and promotions.
- **AI-powered Chatbot Developers** who are building assistants capable of answering product-related queries with fresh, structured data from Amazon.
- **Growth Hackers & Marketers** looking to automate competitive research and surface trending product data to inform pricing strategies.
- **Data Aggregators and Price Trackers** who need reliable and smart scraping of Amazon data enriched with AI-driven parsing.

## What problem is this workflow solving?
Keeping up with Amazon's best sellers in Electronics is a time-consuming, error-prone task when done manually. This workflow automates the process by:
- Automating data extraction from Amazon Best Sellers using Bright Data, ensuring reliable access to real-time, structured data.
- Enhancing raw data with Google Gemini, turning product lists into structured JSON using the Google Gemini LLM.
- Sending results to a webhook, enabling seamless integration into dashboards, databases, or chatbots.

## What this workflow does
The workflow performs the following steps:
1. Extracts Amazon Best Seller Electronics page info using Bright Data's Web Unlocker API (see the sketch at the end of this section).
2. Processes the unstructured content using Google Gemini's Flash Exp model to extract structured product data.
3. Sends the structured information to a webhook endpoint.

## Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Set the Amazon URL and the Bright Data zone in the "Amazon URL with the Bright Data Zone" node.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs
This workflow is built to be flexible, whether you're a market researcher, e-commerce entrepreneur, or data analyst. Here's how you can adapt it to fit your specific use case:
- **Change the Amazon Category**: Update the Amazon URL with the topic of your interest, such as Computers & Accessories, Home Audio, etc.
- **Customize the Gemini Prompt**: Update the Gemini prompt to get different styles of output — comparison tables, summaries, feature highlights, etc.
- **Send Output to Other Destinations**: Replace the Webhook URL to forward output to Google Sheets, Airtable, Slack or Discord, or custom API endpoints.
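The Web Unlocker request made in step 1 looks roughly like the sketch below. The `api.brightdata.com/request` shape follows Bright Data's Web Unlocker documentation, but treat the exact body fields as assumptions to verify in your own account.

```typescript
// Hypothetical Web Unlocker call for the Amazon Best Sellers page.
const BRIGHT_DATA_TOKEN = process.env.BRIGHT_DATA_TOKEN!; // Web Unlocker token

const html = await fetch("https://api.brightdata.com/request", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${BRIGHT_DATA_TOKEN}`, // matches the Header Auth credential
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    zone: "your_web_unlocker_zone", // the zone created during setup
    url: "https://www.amazon.com/gp/bestsellers/electronics", // swap in other categories here
    format: "raw", // return the page content as-is
  }),
}).then((r) => r.text());

// `html` then goes to the Gemini node to be structured into JSON.
console.log(html.length);
```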
by Anurag
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Description
This workflow automates document processing and structured table extraction using the Nanonets API. You can submit a PDF file via an n8n form trigger or webhook—the workflow then forwards the document to Nanonets, waits for asynchronous parsing to finish, retrieves the results (including header fields and line items/tables), and returns the output as an Excel file. Ideal for automating invoice, receipt, or order data extraction with downstream business use.

## How It Works
1. A document is uploaded (via n8n form or webhook).
2. The PDF is sent to the Nanonets Workflow API for parsing.
3. The workflow waits until processing is complete.
4. Parsed results are fetched.
5. Both top-level fields and any table rows/line items are extracted and restructured (see the sketch at the end of this section).
6. Data is exported to Excel format and delivered to the requester.

## Setup Steps
1. **Nanonets Account**: Register for a Nanonets account and set up a workflow for your specific document type (e.g., invoice, receipt).
2. **Credentials in n8n**: Add HTTP Basic Auth credentials in n8n for the Nanonets API (never store credentials directly in node parameters).
3. **Webhook/Form Configuration**:
   - Option 1: Configure and enable the included n8n Form Trigger node for document uploads.
   - Option 2: Use the included Webhook node to accept external POSTs with a PDF file.
4. **Adjust Workflow**: Update any HTTP nodes to use your credential profile. Insert your Nanonets Workflow ID in all relevant nodes.
5. **Test the Workflow**: Enable the workflow and try it with a sample document.

## Features
- Accepts documents via n8n Form Trigger or direct webhook POST.
- Securely sends files to Nanonets for document parsing (credentials stored in the n8n credentials manager).
- Automatically waits for async processing, checking Nanonets until results are ready.
- Extracts both header data and all table/line items into a tabular format.
- Exports results as an Excel file download.
- Modular nodes allow easy customization or extension.

## Prerequisites
- **Nanonets account** with a workflow configured for your document type.
- **n8n instance** with HTTP Request, Webhook/Form, Code, and Excel/Spreadsheet nodes enabled.
- **Valid HTTP Basic Auth credentials** saved in n8n for API access.

## Example Use Cases

| Scenario | Benefit |
|-----------------------|--------------------------------------------------|
| Invoice Processing | Automated extraction of line items and totals |
| Receipt Digitization | Parse amounts and charges for expense reports |
| Purchase Orders | Convert scanned POs into structured Excel sheets |

## Notes
- You must set up credentials in the n8n credentials manager—do not store API keys directly in nodes.
- All configuration and endpoints are clearly explained with inline sticky notes in the workflow editor.
- Easily adaptable for other document types or similar APIs—just modify endpoints and result mapping.
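The restructuring in step 5 might look like the Code-node sketch below. The `headerFields` and `lineItems` names are placeholders for wherever your Nanonets workflow puts top-level fields and table rows, so adjust the paths to match the actual parsed response.

```typescript
// Hypothetical flattening of parsed results into spreadsheet rows.
const result = $input.first().json as {
  headerFields?: Array<{ label: string; value: string }>;
  lineItems?: Array<Record<string, string>>;
};

// Collect top-level header fields (e.g. invoice number, total) into one object.
const header: Record<string, string> = {};
for (const field of result.headerFields ?? []) {
  header[field.label] = field.value;
}

// Expand each line item into its own row, repeating the header values so
// every row in the Excel export is self-contained.
const rows = (result.lineItems ?? []).map((line) => ({
  json: { ...header, ...line },
}));

return rows.length > 0 ? rows : [{ json: header }];
```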
by Lucas Peyrin
## How it works
This workflow demonstrates a fundamental pattern for securing a webhook by requiring an API key. It acts as a gatekeeper, checking for a valid key in the request header before allowing the request to proceed.

1. **Incoming Request**: The Secured Webhook node receives an incoming POST request. It expects an API key to be sent in the `x-api-key` header.
2. **API Key Verification**: The Check API Key node takes the key from the incoming request's header. It then makes an internal HTTP request to a second webhook (Get API Key) which acts as a mock database. This second webhook retrieves a list of registered API keys (from the Registered API Keys node) and filters it to find a match for the key that was provided (see the sketch at the end of this section).
3. **Conditional Response**:
   - If a match is found, the API Key Identified node routes the execution to the "success" path, returning a 200 OK response with the identified user's ID.
   - If no match is found, it routes to the "unauthorized" path, returning a 401 Unauthorized error.

This pattern separates the public-facing endpoint from the data source, which is a good security practice.

## Set up steps
Setup time: ~2 minutes

This workflow is designed to be a self-contained example.

1. **Set up Credentials**: This workflow uses "Header Auth" for its internal communication. Go to Credentials and create a new Header Auth credential. You can use any name and value (e.g., Name: `X-N8N-Auth`, Value: `my-secret-password`). Select this credential in all four webhook/HTTP Request nodes.
2. **Add Your API Keys**: Open the Registered API Keys node. This is your mock database. Edit the array to include the `user_id` and `api_key` pairs you want to authorize.
3. **Activate the workflow.**
4. **Test it**: Use the Test Secure Webhook node to send a request. Try it with a valid key from your list to see the success response. Change the `x-api-key` header to an invalid key to see the 401 Unauthorized error.
5. **For Production**: Replace the mock database part of this workflow (the Get API Key webhook and Registered API Keys node) with a real database node like Supabase, Postgres, or Baserow to look up keys.
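A minimal sketch of the verification logic, assuming a registered-keys array shaped like the Registered API Keys node described above (`user_id` / `api_key` pairs):

```typescript
// Hypothetical key lookup mirroring the mock-database branch.
interface RegisteredKey {
  user_id: string;
  api_key: string;
}

const registeredKeys: RegisteredKey[] = [
  { user_id: "user_1", api_key: "key-abc-123" },
  { user_id: "user_2", api_key: "key-def-456" },
];

// The key arrives on the incoming request's x-api-key header.
const providedKey = ($input.first().json.headers as Record<string, string>)?.[
  "x-api-key"
];

const match = registeredKeys.find((k) => k.api_key === providedKey);

// Mirror the workflow's two response paths: 200 with the user ID on success,
// 401 Unauthorized otherwise.
return [
  {
    json: match
      ? { statusCode: 200, user_id: match.user_id }
      : { statusCode: 401, error: "Unauthorized" },
  },
];
```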