by ist00dent
This n8n template allows you to instantly generate QR codes from any text or URL by simply sending a webhook request. It's a versatile tool for creating dynamic QR codes for various purposes, from marketing campaigns to event registrations, directly integrated into your automated workflows.

## 🔧 How it works

- **Receive Data Webhook**: This node acts as the entry point for the workflow. It listens for incoming POST requests and expects a JSON body with a `data` property containing the text or URL you want to encode into the QR code.
- **Generate QR Code**: This node makes an HTTP GET request to the QR Server API (api.qrserver.com) to generate the QR code image. The content from your webhook is passed as the `data` parameter to the API.
- **Respond with QR Code**: This node sends the response from the QR Server API back to the service that initiated the webhook. The QR Server API directly returns the image data, so your webhook response will be the QR code image itself.

## 👤 Who is it for?

This workflow is ideal for:

- **Marketers**: Generate QR codes for product links, event registrations, or promotional materials on the fly.
- **Developers**: Integrate QR code generation into applications, websites, or internal tools.
- **Event Organizers**: Create dynamic QR codes for ticketing, information access, or check-ins.
- **Businesses**: Streamline processes requiring physical-to-digital transitions, like menu access or contact sharing.
- **Automation Enthusiasts**: Add QR code generation capabilities to any workflow.

## 📑 Data Structure

When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{ "data": "https://www.yourwebsite.com/your-specific-page-or-text-to-encode" }
```

The workflow will return the QR code image directly in the response.

## ⚙️ Setup Instructions

1. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path**: Double-click the Receive Data Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /generate-qr).
3. **Customize QR Code (Optional)**: Double-click the Generate QR Code node. You can adjust the size parameter in the URL (e.g., size=200x200 for a larger QR code) or add other parameters supported by the QR Server API (e.g., bgcolor, color, qzone).
4. **Activate Workflow**: Save and activate the workflow.

## 📝 Tips

- **Handling the Image Output**: Since the QR Server API directly returns the image, the webhook response will be the image data (see the client-side sketch below). Depending on your use case, you might want to:
  - **Save to File/Cloud**: Insert a node (e.g., Write Binary File, Amazon S3, Google Drive) after Generate QR Code to save the image to a file system or cloud storage.
  - **Embed in HTML/Email**: If you're building an HTML response or sending an email, you might need to convert the image data to a Base64 string or provide a URL to a saved image.
- **Error Handling**: Enhance workflow robustness by adding an Error Trigger node. This allows you to catch any issues during QR code generation and set up notifications or logging.
- **Dynamic Size/Color**: You can extend the Receive Data Webhook to accept parameters for size, color, or bgcolor in the incoming JSON, then dynamically pass these to the URL of the Generate QR Code node to create highly customizable QR codes.
- **Input Validation**: For more advanced use cases, you could add a Function node after the webhook to validate the incoming data and ensure it's in a valid format (e.g., a URL).
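For a quick end-to-end test, here is a minimal client-side sketch that calls the webhook and writes the returned image bytes to disk. The instance URL is a placeholder for your own n8n deployment; the /generate-qr path is the example path from the setup steps above.

```python
import requests

# Placeholder base URL; the /generate-qr path matches the setup example above.
N8N_WEBHOOK_URL = "https://your-n8n-instance.com/webhook/generate-qr"

resp = requests.post(
    N8N_WEBHOOK_URL,
    json={"data": "https://www.yourwebsite.com/landing-page"},
    timeout=30,
)
resp.raise_for_status()

# The QR Server API returns raw image bytes, so the webhook response body
# is the QR code image itself.
with open("qr-code.png", "wb") as f:
    f.write(resp.content)
print(f"Saved qr-code.png ({len(resp.content)} bytes)")
```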
by Vadym Nahornyi
This workflow automatically transcribes audio files, translates the content between languages, and generates natural-sounding speech from the translated text - all in one seamless process.

## Who's it for

Content creators, educators, and businesses needing to make their audio content accessible across language barriers. Perfect for translating podcasts, voice messages, lectures, or any audio content while preserving the spoken format.

## How it works

The workflow receives an audio file through a webhook, transcribes it using OpenAI's Whisper, translates and structures the text with GPT-4, generates new audio in the target language, and stores it in S3 for easy access. The entire process takes seconds and returns both the transcribed/translated text and a URL to the translated audio file.

## How to set up

1. **Configure OpenAI credentials** - Add your OpenAI API key for Whisper transcription and GPT-4 translation
2. **Set up AWS S3** - Create a bucket with public read permissions for audio storage
3. **Update configuration** - Replace 'YOUR-BUCKET-NAME' with your actual S3 bucket name
4. **Activate webhook** - Deploy and copy your webhook URL for receiving audio files

Send a POST request with (see the sketch at the end of this section):

- Binary audio file (as 'audiofile')
- Languages parameter (e.g., "English, Spanish")

## Requirements

- OpenAI API account with access to Whisper and GPT-4
- AWS account with S3 bucket configured
- Basic understanding of webhooks and API requests

## How to customize

- **Add language detection** - Automatically detect source language if not specified
- **Customize voice settings** - Adjust speech speed, pitch, or select different voices
- **Add file validation** - Implement size limits and format checks
- **Enhance security** - Add webhook authentication and rate limiting
- **Extend functionality** - Add subtitle generation or multiple output formats
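A minimal sketch of the POST request described above. The webhook URL and the exact response shape are assumptions; the 'audiofile' field name and the languages parameter come from the template description.

```python
import requests

# Placeholder URL -- use the webhook URL you copied during setup.
N8N_WEBHOOK_URL = "https://your-n8n-instance.com/webhook/translate-audio"

with open("message.mp3", "rb") as audio:
    resp = requests.post(
        N8N_WEBHOOK_URL,
        files={"audiofile": ("message.mp3", audio, "audio/mpeg")},
        data={"languages": "English, Spanish"},  # source and target languages
        timeout=300,
    )
resp.raise_for_status()
# Expected per the description: transcribed/translated text plus a URL
# to the translated audio file in S3.
print(resp.json())
```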
by Roni Bandini
This workflow receives plain-English instructions from a retro console via a webhook. Using an AI agent, it can combine multiple tools to read general RSS news headlines, stock market updates, emails, and calendar events, search X, send Telegram messages, and run Linux commands. The idea is to avoid using smartphones or regular laptops in the morning, and instead use a retro console installed on an old notebook or netbook. You will need to copy a Python script onto the notebook (a minimal sketch follows below), configure the webhook URL, and set up all the required credentials.

Steps:

1. Set up the Gemini API key and the Google Gmail and Calendar credentials from console.cloud.google.com
2. Set up X credentials, the RSS URL, etc.
3. Obtain the webhook URL and paste it into the Python code to be executed on the Linux machine
4. Run the Python script with `python3 console.py`

Note: if you ask for a Linux command, the command will not only be returned but also executed.
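A minimal sketch of what `console.py` could look like, assuming the workflow's webhook accepts JSON like `{"message": "..."}` and returns the agent's reply as text. The webhook URL and payload/response shapes are assumptions; adapt them to your workflow.

```python
# console.py -- minimal retro console client sketch.
import requests

WEBHOOK_URL = "https://your-n8n-instance.com/webhook/retro-console"  # paste your Production URL

print("Retro console ready. Type an instruction (Ctrl+C to quit).")
while True:
    try:
        instruction = input("> ").strip()
        if not instruction:
            continue
        # Assumed payload shape; match it to your Webhook node configuration.
        resp = requests.post(WEBHOOK_URL, json={"message": instruction}, timeout=60)
        resp.raise_for_status()
        print(resp.text)
    except KeyboardInterrupt:
        print("\nBye.")
        break
```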
by Pavel Zamorev
This n8n template automates the transformation of raw meeting notes into structured tasks and documents using GPT (or another model), syncing them to Notion and TickTick via a Telegram bot.

## Use Cases

- Automate note-taking and formatting for daily standups, brainstorming sessions, or client calls.
- Reduce cognitive load by eliminating manual tracking of ideas and tedious formatting.
- Convert discussions into actionable tasks instantly with TickTick and structured notes in Notion.

## How It Works

1. **Capture Notes**: Send raw meeting notes to a Telegram bot.
2. **AI Processing**: The workflow sends the text to AI, which:
   - Removes duplicates and extracts key points.
   - Formats content into structured Markdown notes for Notion.
   - Identifies tasks with deadlines (e.g., "- Prepare presentation (Responsible: John, Deadline: Friday)").
3. **Task Parsing**: Extracts task titles, removing metadata like "Responsible" and "Deadline" (see the sketch at the end of this section).
4. **Review & Edit**: The bot returns formatted notes and tasks for review in Telegram.
5. **Sync & Publish**: Notes are published to a Notion database; tasks are exported to TickTick via API.
6. **Confirmation**: A Telegram reaction (e.g., 👌 emoji) confirms successful processing.

## Setup Instructions

1. **Set Up Telegram Bot**:
   - Create a Telegram bot via BotFather and obtain an API token.
   - Add the token to the "Telegram Trigger" and "Send-Edited-Notes" nodes under credentials (telegramApi).
2. **Configure OpenAI**:
   - Obtain an OpenAI API key and add it to the "Edit-Notes" node (openAiApi credentials).
   - Ensure the model is set to gpt-4.1-mini in the node parameters.
3. **Set Up Notion**:
   - Create a Notion database for notes (e.g., "Meetings").
   - Add the database ID to the "Create a Database Page" node (databaseId).
   - Configure Notion API credentials (notionApi) in the node.
4. **Set Up TickTick**:
   - Obtain a TickTick API key and add it to the "Create a Task" node (tickTickOAuth2Api credentials).
   - Specify your TickTick project ID in the node (projectId).
5. **Deploy Workflow**:
   - Ensure your n8n instance is self-hosted to support community nodes (TickTick, Notion).
   - Activate the workflow in n8n.
6. **Test**:
   - Send a test message to the Telegram bot (e.g., "Discussed project timeline. Tasks: - Prepare slides (Responsible: Alice, Deadline: Friday)").
   - Verify that notes appear in Notion, tasks in TickTick, and a 👌 reaction in Telegram.

## Configuration Examples

Telegram Trigger:

```json
{
  "parameters": {
    "updates": ["message"],
    "additionalFields": {}
  },
  "credentials": {
    "telegramApi": {
      "id": "your-telegram-api-id",
      "name": "meeting notes"
    }
  }
}
```

OpenAI prompt (in the "Edit-Notes" node):

```text
Analyze the quick meeting notes from {{ $json.message.text }}
Generate meeting notes and a task list in the following format:
Meeting Notes:
- [Note 1]
- [Note 2]

Tasks:
- [Task 1]
- [Task 2]
```

Notion database page:

```json
{
  "parameters": {
    "resource": "databasePage",
    "databaseId": "your-notion-database-id",
    "title": "MN {{ $now }}",
    "blockUi": {
      "blockValues": [
        {
          "textContent": "{{ $json.message.text }}"
        }
      ]
    }
  }
}
```

## Requirements

- An OpenAI API key (or another model).
- APIs: Pre-configured Notion and TickTick API credentials are required. The template includes setup guides.
- Setup: Uses community nodes, requiring a self-hosted n8n instance.

## Customizing This Workflow

- Replace the Telegram bot with a webhook or form for alternative inputs (e.g., mobile apps).
- Modify the OpenAI prompt in the "Edit-Notes" node to customize note and task formats.
- Add filters in the "Split Notes and Tasks" node to prioritize tasks (e.g., ++#urgent++).
- Integrate Google Calendar via an additional HTTP Request node to auto-set deadlines based on text (e.g., "by Friday").
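The task-parsing step mentioned above strips the "Responsible"/"Deadline" metadata from each task line. A minimal sketch of that logic, assuming the "- Task (Responsible: X, Deadline: Y)" format shown in the examples:

```python
import re

# Sample AI output in the task format described above.
ai_output = """Tasks:
- Prepare presentation (Responsible: John, Deadline: Friday)
- Prepare slides (Responsible: Alice, Deadline: Friday)"""

tasks = []
for line in ai_output.splitlines():
    # Match "- <title>" with an optional "(Responsible: ..., Deadline: ...)" suffix.
    m = re.match(r"-\s*(.+?)\s*(?:\((?:Responsible|Deadline).*\))?\s*$", line.strip())
    if m:
        tasks.append(m.group(1))

print(tasks)  # ['Prepare presentation', 'Prepare slides']
```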
by Don Jayamaha Jr
A medium-term trend analyzer for the Binance Spot Market that leverages core technical indicators across 4-hour candle data to provide human-readable swing-trade signals via AI.

🎥 Watch Tutorial:

## 🎯 What It Does

- Accepts a Binance trading pair (e.g., AVAXUSDT)
- Sends the symbol to an internal webhook for technical indicator calculation
- Computes 4h RSI, MACD, Bollinger Bands, SMA, EMA, ADX
- Returns structured, GPT-analyzed signals ready for Telegram delivery

## 🧠 AI Agent Details

- **Model:** GPT-4.1-mini (OpenAI Chat)
- **Agent Role:** Translates raw indicator values into sentiment-labeled signals
- **Memory:** Tracks session + symbol context for cleaner multi-turn logic

## 🔗 Required Backend Workflow

To calculate indicators, this tool depends on:

POST https://treasurium.app.n8n.cloud/webhook/4h-indicators

```json
{ "symbol": "AVAXUSDT" }
```

It returns a JSON object with the latest 40×4h candle-based calculations (a sketch of calling it directly follows this section).

## 📥 Input Format

```json
{ "message": "AVAXUSDT", "sessionId": "telegram_chat_id" }
```

## 📊 Sample Output

🕓 4h Technical Signals – AVAXUSDT

• RSI: 64 → Slightly Bullish
• MACD: Bullish Cross above baseline
• BB: Upper band touch – volatility expanding
• EMA > SMA → Confirmed Upside Momentum
• ADX: 31 → Strengthening Trend

## 📚 Use Case Scenarios

| Use Case | Result |
| ----------------------------- | ---------------------------------------------------- |
| Swing trend confirmation | Uses 4h indicators to validate or reject setups |
| Breakout signal confluence | Helps assess if momentum is real or noise |
| Inputs to Quant AI or Analyst | Supports higher-frame trade recommendation synthesis |

## 🛠️ Setup Instructions

1. Import the JSON template into your n8n workspace.
2. Set your OpenAI API credentials for the GPT node.
3. Ensure the /webhook/4h-indicators backend tool is live and accessible.
4. Connect this to your Binance Financial Analyst Tool or master Quant AI orchestrator.

## 🤖 Parent Workflows That Use This Tool

- Binance SM Financial Analyst Tool
- Binance Spot Market Quant AI Agent

## 📎 Sticky Notes & Annotations

This workflow includes internal sticky notes describing:

- Node roles (GPT, webhook, memory)
- System behavior (reasoning agent logic)
- Telegram formatting guidance

## 🔐 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. All architecture, prompt logic, and signal formatting are proprietary. Redistribution or rebranding is prohibited.

🔗 Connect with the creator: Don Jayamaha – LinkedIn
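The backend dependency can be exercised on its own. A minimal sketch using the endpoint and payload documented above; the response shape is whatever the backend workflow returns:

```python
import requests

resp = requests.post(
    "https://treasurium.app.n8n.cloud/webhook/4h-indicators",
    json={"symbol": "AVAXUSDT"},
    timeout=60,
)
resp.raise_for_status()
# Expected: a JSON object with the latest 40 x 4h candle-based calculations
# (RSI, MACD, Bollinger Bands, SMA, EMA, ADX).
print(resp.json())
```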
by Dvir Sharon
📰 **Publish Latest News on X and Other Social Media Platforms Using Keyword**

A comprehensive n8n automation that fetches the latest news based on keywords, generates AI-powered social media content, and automatically publishes to X (Twitter) with complete tracking and notification systems.

## 📋 Overview

This workflow provides a professional news publishing solution that automatically discovers breaking news, creates engaging social media content using AI, and publishes to X (Twitter) with comprehensive tracking. Perfect for news organizations, content creators, social media managers, and businesses wanting to stay current with automated news sharing. The system uses BrightData's Google News dataset, OpenAI's GPT-4o for content generation, and multi-platform integration for complete automation.

## ⭐ Key Features

- **📝 Form-Based Input**: Clean web form for keyword and country submission
- **📰 Real-Time News Fetching**: BrightData Google News integration for latest articles
- **🤖 AI Content Generation**: GPT-4o powered tweet creation with hashtags
- **📱 Auto X Publishing**: Direct posting to X (Twitter) with URL tracking
- **📊 Complete Tracking**: Google Sheets logging of all published content
- **🔔 Email Notifications**: Success alerts with tweet links
- **🌍 Multi-Country Support**: Localized news for US, India, UK, Australia
- **⚡ Status Monitoring**: Real-time progress tracking with retry logic
- **🛡 Error Handling**: Robust error management and validation
- **🔄 Loop Management**: Intelligent waiting for news processing completion

## 🎯 What This Workflow Does

### Input

- **News Name**: Keyword or topic for news search (required)
- **Country**: Target country for localized news (dropdown: US/IN/GB/AU)

### Processing

1. **Form Submission**: Captures news keyword and target country
2. **News Triggering**: Initiates BrightData Google News scraping job
3. **Status Monitoring**: Checks scraping progress with intelligent retry loop
4. **Data Retrieval**: Fetches latest news articles when ready
5. **AI Content Creation**: Generates engaging tweet content using GPT-4o
6. **Social Publishing**: Posts content to X (Twitter) automatically
7. **URL Generation**: Creates direct tweet links for tracking
8. **Data Logging**: Saves content and URLs to Google Sheets
9. **Email Notification**: Sends success confirmation with tweet link
10. **Completion**: Workflow ends with full audit trail

## 📋 Output Data Points

| Field | Description | Example |
| :------------ | :---------------------------------- | :------- |
| TweetMessage | AI-generated social media content | "Breaking: AI revolution transforming healthcare with 40% efficiency gains. New study shows promising results in patient care automation. #AI #Healthcare #Innovation #TechNews #US" |
| TweetURL | Direct link to published tweet | https://twitter.com/i/web/status/1234567890123456789 |

## 🛠️ Setup Instructions

### Prerequisites

- n8n instance (self-hosted or cloud)
- X (Twitter) account with API v2 access
- OpenAI account with GPT-4o access
- Gmail account for notifications
- Google account with Sheets access
- BrightData account with Google News dataset access
- Basic understanding of social media automation

### Step 1: Import the Workflow

1. Copy the JSON workflow code from the provided file.
2. In n8n, click "+ Add workflow".
3. Select "Import from JSON".
4. Paste the workflow code and click "Import".
5. The workflow will appear with all nodes properly connected.

### Step 2: Configure API Credentials

X (Twitter) API setup:

1. Create an X Developer Account at developer.twitter.com.
2. Create a new app and generate API keys.
3. In n8n: Credentials → + Add credential → Twitter OAuth2 API.
4. Add your Twitter API credentials: API Key, API Secret Key, Bearer Token, Access Token, Access Token Secret.
5. Test the connection with a sample tweet.

OpenAI API configuration:

1. Get an API key from platform.openai.com.
2. Ensure GPT-4o model access is available.
3. In n8n: Credentials → + Add credential → OpenAI API.
4. Add your OpenAI API key.
5. Verify model access in the "OpenAI Chat Model" node.

Gmail integration:

1. Create a "Gmail OAuth2" credential.
2. Follow the OAuth setup process.
3. Grant email sending permissions.
4. Test with a sample email.

BrightData News API:

1. The workflow uses the pre-configured token: 5662edde-6735-4c5d-a6c6-693043a5a9a5.
2. Dataset ID: gd_lnsxoxzi1omrwnka5r (Google News).
3. Verify access to the Google News dataset.
4. Test the API connection.

Google Sheets integration:

1. Create a "Google Sheets OAuth2 API" credential.
2. Complete OAuth authentication.
3. Grant read/write permissions.
4. Test the connection.

### Step 3: Configure Google Sheets Integration

Create the Google Sheets structure:

- Sheet name: "Publish Latest News on Social Media Platforms Using Keyword"
- Tab: "Data" (default)
- Columns:
  - Tweet Message: AI-generated content posted to X
  - Tweet URL: Direct link to published tweet

Sheet configuration:

1. Create a new Google Sheet or use an existing one.
2. Add the required column headers.
3. Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit.
4. Current configured Sheet ID: 1koxNrwdeuaSBdREuKc7JQh3d9blEk0sQDJ8VgVLjPOo.

Update workflow settings:

1. Open the "Google Sheets" node.
2. Replace the Document ID with your Sheet ID.
3. Select your Google Sheets credential.
4. Choose the "Data" sheet/tab.
5. Verify the column mapping is correct.

### Step 4: Configure Form Interface

1. Open the "On form submission" node.
2. Form configuration:
   - Title: "News Publisher"
   - Description: "publish latest news to direct social media"
   - Fields: News Name (text, required); Country (dropdown: US, IN, GB, AU, required)
3. Copy the webhook URL from the form trigger node. Current webhook ID: 8d320705-688c-4150-a393-cf899d2bbb52.
4. Test form accessibility and submission.

### Step 5: Configure Email Notifications

1. Open the "Gmail" node.
2. Update the recipient email: raushan@iwantonlinemarketing.com.
3. The email template includes: success confirmation, direct tweet link, and professional formatting.
4. Test email delivery.

### Step 6: Test the Workflow

Sample test data:

| News Name | Country | Expected Results |
| :-------------------- | :------ | :------------------------------------------------- |
| artificial intelligence | US | Latest AI news with US-specific hashtags |
| cricket world cup | IN | Sports news with India-focused content |
| brexit update | GB | UK political news with British hashtags |
| bushfire news | AU | Australian environmental news |

Testing process:

1. Activate the workflow (toggle switch).
2. Navigate to the webhook form URL.
3. Submit test data.
4. Monitor execution progress:
   - News fetching (30-60 seconds)
   - AI content generation (10-15 seconds)
   - X publishing (5-10 seconds)
   - Sheet update and email (5 seconds)
5. Verify results in all platforms.

## 📖 Usage Guide

### Using the Form Interface

1. Navigate to the webhook URL provided by the form trigger (or submit it programmatically, as in the sketch below).
2. Enter a news keyword or topic (e.g., "climate change", "stock market", "technology").
3. Select the target country from the dropdown.
4. Click submit and wait for processing.
5. Check email for the success notification with the tweet link.
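A hedged sketch of a programmatic form submission. It assumes n8n serves form triggers at /form/<webhook-id> and accepts a form-encoded POST whose keys match the field labels; verify both against your own instance before relying on this.

```python
import requests

# Assumed form endpoint; the webhook ID is the one documented in Step 4 above.
FORM_URL = "https://your-n8n-instance.com/form/8d320705-688c-4150-a393-cf899d2bbb52"

resp = requests.post(
    FORM_URL,
    data={"News Name": "artificial intelligence", "Country": "US"},  # assumed field keys
    timeout=30,
)
print(resp.status_code, resp.text[:200])
```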
### Example Inputs to Test

| News Name | Country | Expected |
| :-------------------------------- | :------ | :----------------------------------------------------- |
| "artificial intelligence breakthrough" | "US" | Latest AI developments with tech hashtags |
| "football premier league" | "GB" | UK football news with sports hashtags |
| "stock market updates" | "IN" | Indian market news with finance hashtags |
| "hollywood movies" | "AU" | Entertainment news with Australian perspective |

### Country-Specific Considerations

- **United States (US)**: Focus on national news and global impact. Hashtags: #USA, #American, #Breaking, #News. Time zone considerations for optimal posting.
- **India (IN)**: Emphasis on regional relevance. Hashtags: #India, #Indian, #News, #Breaking. Cultural context in content generation.
- **United Kingdom (GB)**: British perspective and terminology. Hashtags: #UK, #British, #News, #Breaking. Focus on European context.
- **Australia (AU)**: Australian angle and regional focus. Hashtags: #Australia, #Australian, #News, #Breaking. Pacific region context.

## 📊 Reading the Results

### Google Sheets Data

The output sheet contains:

- Complete tweet content with hashtags and formatting.
- Direct tweet URLs for easy access and sharing.
- Chronological record of all published content.
- Audit trail for content management.

### Email Notifications

Success emails include:

- Confirmation that content was published.
- Direct link to view the tweet.
- Professional formatting for easy reference.

### X (Twitter) Posts

Published content features:

- AI-optimized messaging within the 260-character limit.
- Relevant hashtags based on topic and country.
- Engaging format designed for social media.
- Professional tone suitable for news sharing.

## 🔧 Customization Options

Expanding social media platforms — add more platforms to the publishing workflow:

```
// Add LinkedIn publishing
{
  "node": "LinkedIn",
  "type": "n8n-nodes-base.linkedin",
  "parameters": {
    "text": "={{ $json.output }}",
    "additionalFields": {}
  }
}

// Add Facebook posting
{
  "node": "Facebook",
  "type": "n8n-nodes-base.facebook",
  "parameters": {
    "pageId": "YOUR_PAGE_ID",
    "message": "={{ $json.output }}"
  }
}
```
by keisha kalra
## Try It Out!

This n8n template creates a fully automated Instagram content schedule using AI and Google Sheets. It is perfect for content creators, marketing teams, or local businesses looking to organize and scale their social media posting.

## How it works

The workflow starts by reading two sets of inputs from a Google Sheet:

1. Your content strategy inputs (Pillar, Objective, Frequency, Format, Structure, Examples).
2. A list of scraped blog posts with title, URL, and description (fetched from your website).

Blog posts are scraped using Apify and parsed to extract key fields, which are stored in a tab labeled "Input (blog month)". You can assign a preferred posting month for each blog (e.g., fall blog posts get tagged for September). The workflow then merges both inputs and extracts the relevant information to be enriched by ChatGPT.

## AI Scheduling & Personalization

Once merged, the workflow loops through each content item and:

- Identifies if the scheduled post falls on or near a holiday (like Mother's Day) and adjusts the content accordingly.
- Uses an attached reference tool to guide structure and tone, based on a library of post examples.
- Sends the content to an AI Agent (using GPT-4, but customizable) that generates:
  - A compelling Instagram caption
  - A visual description
  - Hashtags
  - Suggested post date, day, content pillar, and format (carousel, reel, image, etc.)

## Output

All generated content including captions, structure, dates, hashtags, and pillar is exported into a tab titled "Output" in your Google Sheet. The final schedule is ready for manual review, editing, or publishing to social media.

## How to use

- The workflow uses a manual trigger to start, but you can replace it with a Webhook, cron job, or form submission.
- Add/edit your content strategy in Google Sheets.

## How to Set Up

### Initial Input Tab – Define your content pillars and structure

Create a tab named "Input" or "Strategy" and include these columns:

- Pillar: e.g., Family images
- Objective: e.g., Showcase images
- Frequency: e.g., Bi-weekly
- Content Form: e.g., Images, Reels
- Structure: brief description of expected layout (e.g., carousel Q&A, singular photo)
- Examples: prompts or questions to guide AI (e.g., Why do you think families should do a session?)

### Input (blog month) Tab – Store scraped blog content

Include these columns:

- URL: direct link to blog post
- Title: blog post title
- Description: short summary of the post
- Preferred Month: month you want it posted (e.g., August, September)

This sheet is partially auto-filled by the workflow (except for Preferred Month).

### Output Tab – Final scheduled content

Include these columns:

- Date: scheduled posting date (YYYY-MM-DD)
- Day: day of the week
- Pillar: content category assigned
- Format: e.g., Images, Reels, Carousel
- Description: visual summary
- Caption: Instagram-ready caption
- Hashtags: complete hashtag block

### To use the Apify HTTP Request node

1. Drag an HTTP Request node into your n8n workflow.
2. Set the Method and URL based on how you're using Apify:
   - Use POST if you want to run an actor live with dynamic input (e.g., scrape blog posts in real time).
   - Use GET if you want to retrieve results from a completed or static dataset run (faster and cheaper if you're reusing previous data).
3. Configure query or body parameters:
   - Include your Apify API token for authentication (e.g., token=YOUR_API_KEY).
   - For POST: include an input object with any required actor settings (e.g., the blog URL to scrape).
   - For GET: specify the dataset ID in the URL.
4. Test the node to ensure you're retrieving the blog titles, descriptions, and URLs as expected (a sketch of the GET variant follows below).
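A minimal sketch of the GET variant: fetching items from a finished Apify run's dataset. The dataset-items endpoint is Apify's standard API; the item field names depend on the actor you ran, so the title/description/url keys below are assumptions matching the columns in the "Input (blog month)" tab.

```python
import requests

APIFY_TOKEN = "YOUR_API_KEY"    # from your Apify account
DATASET_ID = "YOUR_DATASET_ID"  # the dataset of a finished actor run

resp = requests.get(
    f"https://api.apify.com/v2/datasets/{DATASET_ID}/items",
    params={"token": APIFY_TOKEN, "format": "json"},
    timeout=60,
)
resp.raise_for_status()

# Assumed field names; adjust to match your actor's output schema.
for item in resp.json():
    print(item.get("title"), "|", item.get("url"))
```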
## Requirements

- Apify account for scraping blog posts
- OpenAI key (e.g., GPT-4) or another model of your choice
- Google Sheets credentials

## Example Use Cases

- A photographer repurposing blogs into Instagram carousels
- A nonprofit automatically generating seasonal posts
- A small team managing multi-pillar content across weeks or months

## Need Help?

Join the n8n Discord or ask in the n8n Forum! Happy Content Making! 📅✨
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for?

This workflow template enables intelligent data extraction from ProductHunt using Bright Data's Model Context Protocol (MCP) and processes search results with Google Gemini. This workflow is designed for individuals and teams who need automated, intelligent discovery and analysis of new tech products. It's especially valuable for:

- Startup Analysts & VC Researchers
- Growth Hackers & Marketers
- Recruiters & Tech Scouts
- Product Managers & Innovation Teams
- AI & Automation Enthusiasts

## What problem is this workflow solving?

Traditional product discovery on ProductHunt is constrained by limited descriptions and requires repeated manual validation through web searches. Manually extracting and enriching this data is slow, repetitive, and error-prone. This workflow solves the problem by:

- Extracting real-time ProductHunt data using Bright Data's MCP infrastructure to mimic real-user behavior and avoid blocks.
- Performing contextual searches on Google for a specific product on ProductHunt to gather use cases, reviews, and related information.
- Structuring results using the Google Gemini LLM to provide human-readable insights and reduce noise.
- Delivering results seamlessly by saving output to disk, updating Google Sheets, and sending Webhook alerts.

## What this workflow does

**Input Field Node**: Define the ProductHunt category with the search term(s) you want to target. This is used to drive extraction and search operations.

**Agent Operation Node**: The agent performs two major tasks:

1. Extract from ProductHunt: retrieves trending products from ProductHunt using Bright Data MCP.
2. Contextual Google Search: for each product, the agent searches Google for deeper context, including reviews, competitor mentions, and real-world usage examples.

**LLM Node (Google Gemini)**:

- Analyzes and summarizes extracted web content
- Removes noise (ads, menus, etc.)
- Structures content into bullet points, insights, or JSON objects

## Pre-conditions

- Knowledge of Model Context Protocol (MCP) is highly essential. Please read this blog post - model-context-protocol
- You need to have a Bright Data account and do the necessary setup as mentioned in the Setup section below.
- You need to have a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install the n8n-nodes-mcp community node

## Setup

1. Please make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
2. Please make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Create a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
6. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>

## How to customize this workflow to your needs

This workflow is flexible and modular, allowing you to adapt it for various research, product discovery, or trend analysis use cases. Below are the key customization points and how to modify them.
- **Define Your Target Products or Topics**: Change the input parameter to a specific ProductHunt category, tag, or keyword (e.g., "AI tools", "SaaS", "DevOps").
- **Change Output Destinations**:
  - **Save to Disk**: Change the file format (.json, .csv, .md) or directory path (see the sketch below).
  - **Google Sheet**: Modify the sheet name and structure (columns like Product, Summary, Link).
  - **Webhook Notification**: Point to a Slack/Discord/CRM/Webhook URL with payload mapping.
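A minimal sketch for the "Save to Disk" customization, writing the structured results in the .json and .csv formats mentioned above. The Product/Summary/Link columns match the Google Sheet structure; the sample record itself is illustrative.

```python
import csv
import json

# Illustrative structured results; the actual shape comes from your Gemini output mapping.
results = [
    {"Product": "Example AI Tool",
     "Summary": "One-line insight extracted by Gemini",
     "Link": "https://www.producthunt.com/posts/example"},
]

with open("producthunt_insights.json", "w", encoding="utf-8") as f:
    json.dump(results, f, indent=2)

with open("producthunt_insights.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Product", "Summary", "Link"])
    writer.writeheader()
    writer.writerows(results)
```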
by Vijayeta Sinel
Automate document translation and ensure translation accuracy using Straker Verify, Google Drive, and Slack.

## How it works

A workflow step is set up to "watch" a Google Drive folder. When your team members place new files in this folder, they are downloaded. Straker Verify then translates them and provides a quality score. Once Straker Verify has completed this, the job info is fetched, the translation is saved to an output folder, and you are notified via Slack.

## What problem does this solve?

When using AI to translate documents, you have no idea about the quality and accuracy of the output. This template answers the question "How good is my translation?" so you have a high level of confidence before you publish.

## Who is this for?

This workflow template is designed for businesses needing translation and localization of documents such as text docs, presentations, web pages, transcripts, video subtitles, and others. Use it to build workflows that localize your content at scale while maintaining translation quality, accuracy, and compliance.

## Set up instructions

### Connect to Straker Verify: Obtain Your API Key

1. **Sign up/log in**: Visit https://verify.straker.ai/ to create an account or log in.
2. **Navigate to API keys**: Go to Verify → Settings → API Keys.
3. **Copy your key**: Find and copy your API key.
4. **Add the key to n8n**: In n8n, go to Settings → Credentials → Straker Verify.

### Set Up Credentials for the Straker Verify Node

1. Open n8n and go to "Credentials" from the left sidebar.
2. Search for "Straker Verify" and select "Straker Verify API" from the dropdown.
3. Paste the copied access token from the Verify app and save it.

### Assign Credentials to the Node

1. Go to your workflow and open the Straker Verify node.
2. Select the "Straker Verify API" credentials you just created from the dropdown (Credentials to connect with), and save.

✅ You are now ready to use the workflow.

## Workflow Steps

### Step 1: Initiate Workflow – Upload Files to Google Drive

Upload one or more files to the designated Google Drive folder. This triggers the "New File in Google Drive" node in n8n.

### Step 2: Verify Token Balance

The workflow checks your token balance via:

- Get Current User Balance
- User Has Enough Tokens

Not enough tokens? You will receive a Slack message ("Not enough tokens"). Please top up and re-upload your files.

### Step 3: Select Workflow

The system fetches available workflows via Fetch All Users Workflows. No matching workflow found? You'll be notified via Slack: "No workflow found".

### Step 4: Create Project in Straker Verify

The following steps are handled automatically:

1. Fetch language/project options.
2. Download files from Google Drive.
3. Create a new project using Create Straker Project.

✅ You'll receive a Slack confirmation: "Project created – ID: xxxxxx"

### Step 5: Process Completion & File Return

Once files are processed by Straker, the Incoming Translation Result webhook is triggered. The workflow then:

1. Downloads the processed files: Get File Content from Straker
2. Uploads them to Google Drive: Upload File to Google Drive

### Step 6: Confirmation – Workflow Complete

You'll receive a final Slack message: "Workflow complete – Files are now available in Google Drive"
by Ranjan Dailata
## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Who this is for?

The Search Engine Intelligence Extractor is a powerful n8n automation that leverages Bright Data's MCP-based AI agents to simulate human-like searches across Google, Bing, and Yandex, and then distills clean, structured insights using Google Gemini. This workflow is tailored for:

- SEO analysts researching competitors or market trends
- Market researchers needing real-time search visibility
- Journalists & content writers gathering contextual insights
- AI developers creating intelligent assistants
- Digital marketers tracking brand mentions or news

## What problem is this workflow solving?

Traditional scraping of search engines is often blocked, cluttered, or filled with irrelevant information. Manually analyzing and cleaning this data for insight is time-consuming. This workflow solves the problem by:

- Simulating real user search behavior via a Bright Data MCP-based AI agent
- Performing multi-platform search (Google, Bing, Yandex) in one unified flow
- Extracting clean, human-readable results (stripping ads, navigation, etc.)
- Structuring the content using the Google Gemini LLM
- Automating delivery via Webhook or saving to disk

## What this workflow does

**Input Fields Node**:

- Accepts the search query
- Accepts an action, for example "Perform a google search". Replace the action with bing, yandex, etc. for other search providers
- Accepts a Webhook notification URL

**Bright Data MCP Agent Execution**:

- Triggers Bright Data's intelligent search agent
- Handles search navigation, result loading, and pagination

**Human Readable Data Extractor**:

- Cleanses HTML; removes ads, footers, and irrelevant links
- Produces a readable narrative of results

**Final Output Handling**:

- Saves the processed response to disk
- Sends the structured data to a Webhook for real-time use (a sketch of a minimal receiver follows this section)

## Pre-conditions

- Knowledge of Model Context Protocol (MCP) is highly essential. Please read this blog post - model-context-protocol
- You need to have a Bright Data account and do the necessary setup as mentioned in the Setup section below.
- You need to have a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install the n8n-nodes-mcp community node

## Setup

1. Please make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
2. Please make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Create a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
6. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server. Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>

## How to customize this workflow to your needs

- **Add scheduled execution**: Add a Cron trigger to run this workflow on a set schedule (e.g., daily/weekly keyword tracking).
- **Push results to custom destinations**: Connect the output to Google Sheets (for analytics or dashboards), PostgreSQL or MySQL DBs (for structured storage), Notion or Airtable (for content pipelines), or Slack or Email (for alerting teams).
- **Customize Webhook notifications**: Update the Webhook URL in the notification node to push processed results to external APIs, CRMs, or real-time dashboards.
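For local testing of the Webhook delivery step, here is a minimal sketch of a receiver, assuming the workflow POSTs the structured results as JSON (the host/port are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class Receiver(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and pretty-print whatever JSON payload the workflow sends.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print("Structured search results received:")
        print(json.dumps(payload, indent=2)[:1000])
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

# Point the workflow's Webhook notification URL at http://<this-host>:8000/
HTTPServer(("0.0.0.0", 8000), Receiver).serve_forever()
```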
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

## How It Works

**Gravity Forms Customer Feedback Form Integration**: This workflow streamlines the process of handling customer feedback submitted via Gravity Forms. It ensures the data is correctly formatted and seamlessly integrates with KlickTipp.

**Data Transformation**: Input data is validated and transformed to meet KlickTipp's API requirements, including formatting phone numbers and converting dates (see the sketch at the end of this section).

## Key Features

**Gravity Forms Trigger**: Captures new form submissions from Gravity Forms via a webhook and initiates the workflow.

**Data Processing and Transformation**: Formats and validates essential data:

- Converts phone numbers to numeric-only format with international prefixes.
- Transforms dates (e.g., birthdays) to UNIX timestamps.
- Calculates and scales numeric responses (e.g., webinar ratings).
- Parses webinar selections into timestamps for structured scheduling.

**Subscriber Management in KlickTipp**:

- Adds or updates contacts in a KlickTipp subscriber list.
- Includes custom field mappings such as:
  - Personal details (name, email, birthday, phone number).
  - Feedback and preferences (e.g., webinar ratings, chosen sessions).
  - Structured answers from form responses.
- Tags contacts for segmentation: adds fixed and dynamic tags to contacts.

**Error Handling**: Ensures invalid or empty data is handled gracefully, preventing workflow interruptions.

## Setup Instructions

1. **Install and configure nodes**: Set up the Webhook, Set, and KlickTipp nodes in your n8n instance. Authenticate your Gravity Forms and KlickTipp accounts.
2. **Prepare custom fields in KlickTipp**: Create fields in KlickTipp to align with the form submission data, such as:

| Name | Data Type |
|-----------------------------------|----------------|
| Gravityforms_URL_Linkedin | URL |
| Gravityforms_kurs/webinar_zeitpunkt | Date & Time |
| Gravityforms_kurs/webinar_bewertung | Decimal Number |
| Gravityforms_feedback | Paragraph |
| Gravityforms_kontaktaufnahme | Line |

After creating fields, allow 10-15 minutes for them to sync. If fields don't appear, reconnect your KlickTipp credentials.

3. **Field mapping and adjustments**: Verify and customize field assignments in the workflow to align with your specific form and subscriber list setup.

## Workflow Logic

1. **Trigger via Gravity Forms submission**: The workflow begins when a new form submission is received through the webhook.
2. **Transform data for KlickTipp**: Formats and validates raw form data for compatibility with KlickTipp's API.
3. **Add to KlickTipp subscriber list**: Adds processed data as a new subscriber or updates an existing one.
4. **Get all tags from KlickTipp and create a list**: Fetches all existing tags and turns them into an array.
5. **Define tags to dynamically set for contacts**: Defines the variables received from the form submission that should be converted into tags.
6. **Merge tags of both lists**: Checks whether the list of existing tags in KlickTipp contains the tags that should be dynamically set based on the form submission.
7. **Tag creation and tagging contacts**: Creates a new tag if it did not previously exist and then tags the contact.
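A minimal sketch of the transformation step described above (in the workflow this happens in the Set node; the Python below just illustrates the logic). The example inputs and the default country prefix are assumptions:

```python
from datetime import datetime, timezone
import re

def normalize_phone(raw: str, country_prefix: str = "49") -> str:
    """Strip everything but digits and force an international prefix."""
    digits = re.sub(r"\D", "", raw)       # numeric-only, as KlickTipp expects
    digits = digits.lstrip("0")           # drop the national leading zero
    return digits if digits.startswith(country_prefix) else country_prefix + digits

def date_to_unix(date_str: str) -> int:
    """Convert an ISO date (e.g., a birthday field) to a UNIX timestamp."""
    dt = datetime.strptime(date_str, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

print(normalize_phone("0151 234-5678"))  # 491512345678
print(date_to_unix("1990-05-14"))        # 642643200
```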
## Benefits

- **Efficient lead generation**: Contacts from forms are automatically imported into KlickTipp and can be used immediately, saving time and increasing the conversion rate.
- **Automated processes**: Experts can start workflows directly, such as welcome emails or course admissions, reducing administrative effort.
- **Error-free data management**: The template ensures precise data mapping, avoids manual corrections, and reinforces a professional appearance.

## Testing and Deployment

Test the workflow by filling out the form in Gravity Forms and verifying the data updates in KlickTipp.

## Notes

**Customization**: Update field mappings within the KlickTipp nodes to align with your account setup. This ensures accurate data syncing.

**Resources**:

- Gravity Forms
- KlickTipp Knowledge Base help article
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
by Jay Hartley
## What this workflow does

This workflow allows you to monitor multiple GitHub repos simultaneously without polling, thanks to webhooks. It programmatically allows for adding and deleting repos from your watchlist to make management convenient.

## Description

- Can monitor multiple repos simultaneously.
- Programmatically register or unregister repos from a list. No need for manual work.
- Webhook notification means no constant polling necessary.

## Setup

### 1. Creating Credentials on GitHub

Generate a personal access token on GitHub by following these steps:

Right-hand side of page → Settings → scroll to bottom → Developer Settings → Personal Access Tokens → Tokens (classic) → Generate New Token

Give scopes:

- admin:repo_hook
- repo (if you want to use it for your own private repo)

If you need more help, see here: https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens

### 2. Setting Credentials in n8n

In Register Github Webhook:

- Authentication → Generic Credential Type
- Generic Auth Type → Header Auth
- Header Auth → Create New Credential with Name set to 'Authorization' and Value set to 'Bearer <your-token>'. (You can reuse this for Delete Github Webhook and Get Existing Webhooks.)

Now in Register Github Webhook, scroll down to Send Body → JSON and inside the JSON, change the value of "url" to the webhook address given as the Production URL in the Webhook Trigger node (a sketch of the equivalent API call follows at the end of this section).

### 3. Notification settings

In the third row, link up the Webhook Trigger to any API of your choice. Slack and Telegram are given as examples. You can also format the notification message as you wish.

Setup time: roughly 10 minutes.

Instructions video: https://vimeo.com/1013473758

## Test

### 1. Register Webhooks

In Repos to Monitor, add any repo you want to monitor changes for. Disable Webhook Trigger, click Test Workflow, and if your GitHub credentials were set correctly, it will automatically register the webhooks. You can test this by running the single node Get Existing Webhooks and confirming it outputs the repo addresses.

### 2. Handle GitHub Events

Now that you have registered the webhooks, re-enable Webhook Trigger and activate the workflow. Make a commit to any of the registered repos. Confirm that the notification went through. That's it!
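For reference, here is a minimal sketch of the API call the "Register Github Webhook" node makes under the hood, using GitHub's standard create-a-webhook endpoint. The token, repo, and n8n URL are placeholders; the event list is an assumption (adjust to the events you care about):

```python
import requests

GITHUB_TOKEN = "ghp_your_token_here"   # personal access token with admin:repo_hook
REPO = "your-user/your-repo"
N8N_WEBHOOK_URL = "https://your-n8n-instance.com/webhook/github-events"  # Production URL

resp = requests.post(
    f"https://api.github.com/repos/{REPO}/hooks",
    headers={
        "Authorization": f"Bearer {GITHUB_TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "name": "web",
        "active": True,
        "events": ["push"],  # assumed; add e.g. "pull_request" as needed
        "config": {"url": N8N_WEBHOOK_URL, "content_type": "json"},
    },
    timeout=30,
)
resp.raise_for_status()
print("Hook created:", resp.json()["id"])
```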