by Ranjan Dailata
Notice: Community nodes can only be installed on self-hosted instances of n8n.

**Who this is for**

This workflow template enables intelligent data extraction from ProductHunt using Bright Data's Model Context Protocol (MCP) and processes search results with Google Gemini. It is designed for individuals and teams who need automated, intelligent discovery and analysis of new tech products. It's especially valuable for:

- Startup Analysts & VC Researchers
- Growth Hackers & Marketers
- Recruiters & Tech Scouts
- Product Managers & Innovation Teams
- AI & Automation Enthusiasts

**What problem is this workflow solving?**

Traditional product discovery on ProductHunt is constrained by limited descriptions and requires repeated manual validation through web searches. Manually extracting and enriching this data is slow, repetitive, and error-prone. This workflow solves the problem by:

- Extracting real-time ProductHunt data using Bright Data's MCP infrastructure to mimic real-user behavior and avoid blocks.
- Performing contextual Google searches for a specific ProductHunt product to gather use cases, reviews, and related information.
- Structuring results with the Google Gemini LLM to provide human-readable insights and reduce noise.
- Delivering results seamlessly by saving output to disk, updating Google Sheets, and sending webhook alerts.

**What this workflow does**

- Input Field Node: defines the ProductHunt category with the search term(s) you want to target. This drives the extraction and search operations.
- Agent Operation Node: the agent performs two major tasks:
  - Extract from ProductHunt: retrieves trending products from ProductHunt using Bright Data MCP.
  - Contextual Google Search: for each product, the agent searches Google for deeper context, including reviews, competitor mentions, and real-world usage examples.
- LLM Node (Google Gemini): analyzes and summarizes extracted web content, removes noise (ads, menus, etc.), and structures content into bullet points, insights, or JSON objects.

**Pre-conditions**

- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- A Bright Data account, set up as described in the Setup section below.
- A Google Gemini API key. Visit Google AI Studio.
- The Bright Data MCP Server @brightdata/mcp installed.
- The n8n-nodes-mcp community node installed.

**Setup**

- Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access it through Vertex AI or a proxy).
- In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server, and copy your Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token> (see the credential sketch at the end of this listing).

**How to customize this workflow to your needs**

This workflow is flexible and modular, allowing you to adapt it for various research, product discovery, or trend analysis use cases. Below are the key customization points and how to modify them.
- Define Your Target Products or Topics: change the input parameter to a specific ProductHunt category, tag, or keyword (e.g., "AI tools", "SaaS", "DevOps").
- Change Output Destinations:
  - **Save to Disk**: change the file format (.json, .csv, .md) or directory path.
  - **Google Sheet**: modify the sheet name and structure (columns like Product, Summary, Link).
  - **Webhook Notification**: point to a Slack/Discord/CRM/webhook URL with payload mapping.
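For reference, the MCP Client (STDIO) credential typically looks like the sketch below. The command and arguments follow the @brightdata/mcp README, and the field names are n8n's STDIO credential fields; treat the exact values as an assumption to verify against your own setup:

```
Command:       npx
Arguments:     @brightdata/mcp
Environments:  API_TOKEN=<your-bright-data-api-token>
```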
by Vijayeta Sinel
Automate document translation and ensure translation accuracy using Straker Verify, Google Drive and Slack.

**How it works**

A workflow step is set up to "watch" a Google Drive folder. When your team members place new files in this folder, they are downloaded. Straker Verify then translates them and provides a quality score. Once Straker Verify has completed this, the job info is fetched, the translation is saved to an output folder, and you are notified via Slack.

**What problem does this solve?**

When using AI to translate documents, you have no idea about the quality and accuracy of the output. This template answers the question "How good is my translation?" so you have a high level of confidence before you publish.

**Who is this for?**

This workflow template is designed for businesses needing translation and localization of documents such as text docs, presentations, web pages, transcripts, video subtitles and others. Use it to build workflows that localize your content at scale while maintaining translation quality, accuracy and compliance.

**Set up instructions**

Connect to Straker Verify and obtain your API key:
1. Sign up / log in: visit https://verify.straker.ai/ to create an account or log in.
2. Navigate to API keys: go to Verify → Settings → API Keys.
3. Copy your key: find and copy your API key.
4. Add the key to n8n: in n8n, go to Settings → Credentials → Straker Verify.

Set up credentials for the Straker Verify node:
1. Open n8n and go to "Credentials" (from the left sidebar, click on "Credentials").
2. Search for "Straker Verify" and select "Straker Verify API" from the dropdown.
3. Paste the copied access token from the Verify app and save it.

Assign credentials to the node:
1. Go to your workflow and open the Straker Verify node.
2. Select the "Straker Verify API" credentials you just created from the dropdown (Credentials to connect with), and save.

✅ You are now ready to use the workflow.

**Workflow steps**

Step 1: Initiate workflow – upload files to Google Drive
Upload one or more files to the designated Google Drive folder. This triggers the "New File in Google Drive" node in n8n.

Step 2: Verify token balance
The workflow checks your token balance via Get Current User Balance → User Has Enough Tokens. Not enough tokens? You will receive a Slack message ("Not enough tokens") — please top up and re-upload your files.

Step 3: Select workflow
The system fetches available workflows via Fetch All Users Workflows. No matching workflow found? You'll be notified via Slack: "No workflow found".

Step 4: Create project in Straker Verify
The following steps are handled automatically: fetch language/project options, download files from Google Drive, and create a new project using Create Straker Project. ✅ You'll receive a Slack confirmation: "Project created – ID: xxxxxx"

Step 5: Process completion & file return
Once files are processed by Straker, the Incoming Translation Result webhook is triggered. The workflow downloads the processed files (Get File Content from Straker) and uploads them to Google Drive (Upload File to Google Drive).

Step 6: Confirmation – workflow complete
You'll receive a final Slack message: "Workflow complete – Files are now available in Google Drive"
by Ibrahim Malick
⚠️ This template uses only official n8n nodes. No community nodes required.

🧑💼 **Who is this for?**

This workflow is designed for:
- Legal tech founders
- Marketing freelancers or consultants
- Agencies supporting lawyers and small law firms
- Anyone doing outbound outreach in the legal niche

❓ **What problem is this solving?**

LinkedIn is a goldmine for targeting legal professionals — but scraping and personalizing outreach is tedious and expensive. Most tools either:
- Require paid LinkedIn Sales Navigator
- Can't personalize at scale
- Violate LinkedIn's TOS

This workflow solves that by using free Google Search, OpenRouter AI, and GPT-4o to find, enrich, and message up to 1,000 solo lawyers per day — without using browser automation or scrapers.

⚙️ **What this workflow does**

- Uses Google Programmable Search to find solo lawyers and small firm founders on LinkedIn
- Parses each profile's name, title, profile URL, and snippet
- Saves raw lead data to Google Sheets
- Uses OpenRouter Sonar Pro to enrich each profile with external content
- Generates a personalized, one-line message using GPT-4o
- Appends the final message to Google Sheets for outreach

🛠️ **Setup**

Estimated time: 15–20 minutes

✅ Google Programmable Search
- Enable the Custom Search API on Google Cloud
- Create a programmable search engine set to search the full web
- Copy your API key and CX ID (a request sketch appears at the end of this listing)

✅ Google Sheets
- Create a sheet with columns: Name, Title, Profile URL, Outreach Message
- Share the sheet with your OAuth-connected Google account

✅ OpenRouter
- Sign up at openrouter.ai
- Fund with at least $5 and generate your API key
- Use the model perplexity/sonar-pro for real-time research

✅ GPT-4o (optional)
- You can use your OpenAI key or route GPT-4o via OpenRouter

All setup-specific values are marked clearly in sticky notes and placeholders.

🛠️ **How to customize this workflow to your needs**

- Change the Google search query to match your industry (e.g., "founder" AND "therapist" site:linkedin.com/in)
- Modify the AI prompt to match your tone (formal, casual, humorous)
- Connect the final output to your CRM (like HubSpot, Airtable, etc.)
- Add a second outreach message variant to A/B test performance

📌 **Sticky Notes & Annotations**

- All nodes are clearly renamed for understandability (e.g., Find Lawyer Profiles, Parse LinkedIn Search Results)
- Color-coded sticky notes explain setup instructions, required credentials, and the use case

🗂 **Category**: AI, Sales, Marketing
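For reference, here is a minimal sketch of the Custom Search API call behind the Find Lawyer Profiles step, assuming the standard customsearch/v1 endpoint. The key, CX ID, and the title-splitting heuristic are placeholders to adapt to your own engine and parsing logic:

```typescript
// Minimal sketch of the "Find Lawyer Profiles" search step (Google Custom Search JSON API).
// GOOGLE_API_KEY and GOOGLE_CX are assumptions — use your own Custom Search key and engine ID.
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY!;
const GOOGLE_CX = process.env.GOOGLE_CX!;

async function searchLinkedInProfiles(query: string, start = 1) {
  const url = new URL("https://www.googleapis.com/customsearch/v1");
  url.searchParams.set("key", GOOGLE_API_KEY);
  url.searchParams.set("cx", GOOGLE_CX);
  url.searchParams.set("q", query);             // e.g. '"attorney" AND "solo practice" site:linkedin.com/in'
  url.searchParams.set("start", String(start)); // 1, 11, 21, ... for pagination

  const res = await fetch(url);
  if (!res.ok) throw new Error(`Custom Search error: ${res.status}`);
  const data = await res.json();

  // Each item carries the fields the workflow parses into Google Sheets.
  return (data.items ?? []).map((item: any) => ({
    name: item.title.split(" - ")[0], // heuristic: LinkedIn titles often read "Name - Title - Company"
    title: item.title,
    profileUrl: item.link,
    snippet: item.snippet,
  }));
}
```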
by Sam Robertson
**Generate Summaries from Uploaded Files using OpenAI Assistants API**

📑 **Overview**

Upload a document (PDF, DOCX, PPTX, TXT, CSV, JSON, or Markdown) and receive an AI-generated summary containing:
- **title** – 5–10 words
- **summary** – 1–2 sentences
- **bullets** – 3–5 key points
- **tags** – 3–6 short keywords

The workflow:
1. Stores the file in OpenAI.
2. Runs an Assistant with File Search and Code Interpreter enabled.
3. Polls until the run finishes.
4. Retrieves the summary JSON.

✅ **Prerequisites**

OpenAI Assistant
- Create one at <https://platform.openai.com/assistants>
- Enable File Search and Code Interpreter
- Note: the assistant ID starts with asst_

OpenAI API credential setup in n8n
- Go to Credentials → New → HTTP Header Auth
- Header name: Authorization
- Value: Bearer YOUR-OPENAI-API-KEY (replace YOUR-OPENAI-API-KEY with the OpenAI API secret key for your assistant; it starts with sk-)
- Name it: openAIApiHeader

🔧 **Setup**

1. Import the workflow JSON.
2. When n8n prompts for a credential, choose openAIApiHeader for every HTTP Request node.
3. Open Run Assistant → Body and replace "assistant_id": "REPLACE_WITH_YOUR_ASSISTANT_ID" with your real ID (starts with asst_…).
4. Save.

🚀 **How it works**

| # | Node | Purpose |
|---|------|---------|
| 1 | On form submission | User uploads a file (File). |
| 2 | Upload File | POST /v1/files (multipart) → returns file_id. |
| 3 | Create Thread | Creates a thread and attaches the uploaded file. |
| 4 | Run Assistant | Starts the run using your assistant_id. |
| 5 | Poll Run Status → Wait 2 s → IF | Loops until status = completed. |
| 6 | Fetch Summary | GET /v1/threads/{thread_id}/messages → summary JSON. |

🖌️ **Customisation ideas**

- Edit the user prompt in Create Thread to change summary length, tone, or language.
- Add an HTTP Response node after Fetch Summary to return plaintext to the uploader.
- Replace the polling loop with OpenAI's forthcoming wait-for-run endpoint when available.

No community nodes required. Works on any n8n Cloud plan (Starter, Pro, Enterprise) or self-hosted Community Edition.
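For reference, a minimal sketch of the polling loop and summary fetch (nodes 5–6), assuming the Assistants v2 REST API; the thread and run IDs come from the Create Thread and Run Assistant steps, and the message-parsing path is an assumption about the response shape:

```typescript
// Minimal sketch of nodes 5–6: poll the run, then fetch the assistant's summary message.
// OPENAI_API_KEY is your sk-... key; threadId/runId come from the "Run Assistant" step.
const headers = {
  Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  "OpenAI-Beta": "assistants=v2",
};

async function waitForSummary(threadId: string, runId: string) {
  // Poll the run every 2 seconds until it leaves the queued/in_progress states.
  let status = "in_progress";
  while (status === "in_progress" || status === "queued") {
    await new Promise((r) => setTimeout(r, 2000));
    const run = await fetch(
      `https://api.openai.com/v1/threads/${threadId}/runs/${runId}`,
      { headers }
    ).then((r) => r.json());
    status = run.status;
  }
  if (status !== "completed") throw new Error(`Run ended with status: ${status}`);

  // Fetch the latest message — the summary JSON produced by your prompt.
  const messages = await fetch(
    `https://api.openai.com/v1/threads/${threadId}/messages`,
    { headers }
  ).then((r) => r.json());
  return messages.data[0]?.content?.[0]?.text?.value;
}
```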
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

**How It Works**

- Typeform Quiz Integration: this workflow streamlines the process of handling quiz answers submitted via Typeform. It ensures the data is correctly formatted and seamlessly integrates with KlickTipp.
- Data Transformation: input data is validated and transformed to meet KlickTipp's API requirements, including formatting phone numbers and converting dates.

**Key Features**

- Typeform Trigger: captures new quiz submissions from Typeform, including user details and quiz responses.
- Data Processing and Transformation (a sketch of this step appears at the end of this listing):
  - Formats phone numbers to numeric-only format with international prefixes.
  - Converts dates (e.g., birthdays) to UNIX timestamps.
  - Maps multiple-choice quiz answers to string values for API compatibility.
  - Scales numeric quiz responses for tailored use cases.
- Subscriber Management in KlickTipp: adds participants as subscribers to a designated KlickTipp list, with custom field mappings for:
  - Personal details (e.g., name, email, phone number, birthday).
  - Quiz responses (e.g., intended usage of KlickTipp, company location, and team size).
- Tags contacts for segmentation: adds fixed and dynamic tags to contacts.
- Error Handling: handles empty or malformed data gracefully, ensuring clean submissions to KlickTipp.

**Setup Instructions**

Install and configure nodes:
- Set up the Typeform and KlickTipp nodes in your n8n instance.
- Authenticate your Typeform and KlickTipp accounts.

Prepare custom fields in KlickTipp to store quiz answers and personal details, such as:

| Name | Data type |
|----------------------------------|-----------------|
| Typeform_URL_Linkedin | URL |
| Typeform_Frage1_klicktipp_nutzen | Line |
| Typeform_Frage2_klicktipp_sitz | Line |
| Typeform_Frage3_mitglieder_CHT | Decimal number |

After creating fields, allow 10–15 minutes for them to sync. If fields don't appear, reconnect your KlickTipp credentials.

Field mapping and adjustments: verify and customize field assignments in the workflow to align with your specific form and subscriber list setup.

**Workflow Logic**

1. Trigger via Typeform submission: the workflow initiates upon receiving a new quiz submission.
2. Transform data for KlickTipp: converts and validates data from Typeform to match KlickTipp's API requirements.
3. Add to KlickTipp subscriber list: submits the cleaned data to KlickTipp, including all relevant quiz answers.
4. Get all tags from KlickTipp and create a list: fetches all existing tags and turns them into an array.
5. Define tags to dynamically set for contacts: defines the variables received from the form submission that should be converted into tags.
6. Merge tags of both lists: checks whether the list of existing tags in KlickTipp contains the tags that should be dynamically set based on the form submission.
7. Tag creation and tagging contacts: creates new tags if they did not previously exist and then tags the contact.

**Benefits**

- Efficient lead generation: contacts from forms are automatically imported into KlickTipp and can be used immediately, saving time and increasing the conversion rate.
- Automated processes: experts can start workflows directly, such as welcome emails or course admissions, reducing administrative effort.
- Error-free data management: the template ensures precise data mapping, avoids manual corrections and reinforces a professional appearance.

**Testing and Deployment**

Test the workflow by filling out the form on Typeform and verifying the data updates in KlickTipp.
**Notes**

- Customization: update field mappings within the KlickTipp nodes to align with your account setup. This ensures accurate data syncing.
- Resources:
  - Typeform
  - KlickTipp Knowledge Base help article
  - Use KlickTipp Community Node in n8n
  - Automate Workflows: KlickTipp Integration in n8n
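For reference, a minimal sketch of the phone-number and birthday transformations described under Data Processing and Transformation, as they might be written in an n8n Code node. The default country code, date format, and field names are assumptions to adapt to your own Typeform answers:

```typescript
// Minimal sketch of the transformation step, assuming German numbers and DD.MM.YYYY birthdays.
// Field names and the default country code are illustrative — map them to your Typeform keys.
function toKlicktippPhone(raw: string, defaultCountryCode = "49"): string {
  let digits = raw.replace(/\D/g, "");                         // keep numeric characters only
  if (digits.startsWith("00")) digits = digits.slice(2);       // 0049171... -> 49171...
  else if (digits.startsWith("0")) digits = defaultCountryCode + digits.slice(1); // 0171... -> 49171...
  return digits;
}

function toUnixTimestamp(dateString: string): number {
  // Accepts "31.12.1990" or ISO "1990-12-31"; returns seconds, not milliseconds.
  const m = dateString.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  const iso = m ? `${m[3]}-${m[2]}-${m[1]}` : dateString;
  return Math.floor(new Date(iso).getTime() / 1000);
}
```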
by Joachim Hummel
This n8n workflow automates posting Amazon affiliate products to Mastodon — complete with image upload, description, and a shortened tracking URL using Shlink.

🔧 **How it works**

1. Input source: the workflow starts by reading from a connected Google Sheet that contains:
   - SHlink (shortlink)
   - Amazon Link
   - Description (optional)
   - PicURL
   - Send (NO/YES — a flag used to check whether the row was already posted)
2. Image upload: it fetches the product image via HTTP and uploads it directly to a Mastodon instance via the /media API endpoint.
3. URL shortening (Shlink): the original Amazon URL is shortened using your self-hosted or cloud-hosted Shlink instance to enable click tracking and better presentation.
4. Text generation: a two-line promotional text is automatically generated using a Language Model (LLM), based on the product description.
5. Posting to Mastodon: the post is then published on Mastodon with the image, the generated text, and the shortened Shlink URL.
6. Row update: once published, the Send column in the Google Sheet is updated to "YES" to prevent duplicates.

**Requirements**

- ✅ Shlink – required for shortening and tracking Amazon URLs
- ✅ Google Sheet – used as the product queue and post log
- ✅ Google Sheet example: https://link.unixweb.home64.de/w7VqY
- ✅ Mastodon account – OAuth2 credentials with write scope
- ✅ Product image URL – must be valid and accessible
- ✅ n8n credentials – set up for Google Sheets, Mastodon, and optionally OpenRouter or other LLM providers

This workflow is ideal for content creators, affiliate marketers, and automation fans who want to save time and optimize reach across the Fediverse.

#affiliate #amazon #mastodon #advertisement
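For reference, a minimal sketch of the Shlink shortening and Mastodon posting calls this workflow's HTTP nodes correspond to, assuming Shlink's /rest/v3/short-urls endpoint and Mastodon's /api/v2/media and /api/v1/statuses endpoints. Hosts, tokens, and the filename are placeholders:

```typescript
// Minimal sketch of the "URL shortening" and "Posting to Mastodon" steps.
const SHLINK_HOST = "https://your-shlink-instance";
const MASTODON_HOST = "https://your-mastodon-instance";

async function shortenAmazonLink(longUrl: string): Promise<string> {
  const res = await fetch(`${SHLINK_HOST}/rest/v3/short-urls`, {
    method: "POST",
    headers: { "X-Api-Key": process.env.SHLINK_API_KEY!, "Content-Type": "application/json" },
    body: JSON.stringify({ longUrl }),
  });
  const data = await res.json();
  return data.shortUrl; // tracked short link used in the toot
}

async function postToMastodon(text: string, image: Blob) {
  const auth = { Authorization: `Bearer ${process.env.MASTODON_TOKEN!}` };

  // 1) Upload the product image, 2) publish the status referencing the returned media id.
  const form = new FormData();
  form.append("file", image, "product.jpg");
  const media = await fetch(`${MASTODON_HOST}/api/v2/media`, {
    method: "POST", headers: auth, body: form,
  }).then((r) => r.json());

  await fetch(`${MASTODON_HOST}/api/v1/statuses`, {
    method: "POST",
    headers: { ...auth, "Content-Type": "application/json" },
    body: JSON.stringify({ status: text, media_ids: [media.id] }),
  });
}
```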
by DigiMetaLab
A smart personal assistant that can reason, search, calculate, and remember — powered by Google Gemini and ready in one click. Most AI agents only respond — this one thinks before replying, pulls in real-time facts, does the math, and even remembers the last 5 things you said.

🔧 **How it works**

This template builds a conversational agent using the Google Gemini API. It uses multiple tools:
- 🧠 Think – to reason step-by-step
- 🔍 SerpAPI – to search live data on Google
- ➗ Calculator – to solve math problems
- 💾 Memory – to remember short-term chat history

You can embed this agent into a chatbot or web app, or automate any customer support, research, or productivity workflow.

🧠 **Your agent will:**

- Understand what you're asking
- Think logically using the Think tool
- Search facts in real-time using SerpAPI
- Calculate numbers using a math engine
- Recall past context using a memory buffer
- And respond clearly — just like a real assistant

🧑💼 **Who is this template for?**

This template is ideal for:
- Creators & developers building AI agents
- Teams needing a Gemini-powered assistant
- Beginners exploring LangChain + n8n
- Anyone curious about combining LLMs + tools + memory

🚀 **How to set up**

1. Plug in your Google Gemini API key
2. Add your SerpAPI key
3. Run the workflow and start chatting!

Everything is pre-wired for you — just import and go.

📬 **Use cases**

You can connect this agent to:
- Telegram bots 🤖
- WhatsApp via Twilio 📱
- Slack, Discord, or Gmail 💬
- Or just trigger it inside n8n manually 🔁

👉 Check out my other templates: https://n8n.io/creators/digimetalab
by Alexander K.
Transform your creative sparks into professional Instagram Reel scripts instantly! This AI-powered workflow takes your raw ideas (text or voice messages) via Telegram and generates complete, viral-ready Reel scenarios with hooks, scripts, captions, and visual concepts.

**Who is this template for?**

This template is perfect for:
- Content creators looking to streamline their Reel production process
- Social media managers who need to generate multiple Reel concepts quickly
- Marketing professionals seeking data-driven, psychology-based content strategies
- Influencers and entrepreneurs who want to maintain consistent, engaging content
- Small business owners looking to create viral marketing content without hiring expensive copywriters
- Anyone who struggles with writer's block or wants to improve their Instagram engagement

**What this template does**

This comprehensive workflow provides a complete Reel creation assistant that:

🎯 Accepts multiple input types:
- Text messages with your Reel ideas
- Voice notes that get automatically transcribed
- Processes ideas in real-time through Telegram

🧠 AI-powered content generation:
- Creates 3 attention-grabbing hook variants designed to stop the scroll
- Generates a complete 30–60 second script with Hook, Subtitle, Body, and Call-to-Action
- Writes engaging captions that complement (not repeat) your video content
- Provides specific visual concepts with cinematic direction for filming

📊 Smart features:
- Memory system that remembers your conversation context for personalized suggestions
- Optional Google Sheets integration to automatically log and organize all your Reel ideas
- Error handling for a seamless user experience
- Instant delivery of results back to your Telegram chat

🎨 Professional quality output:
- Scripts based on proven marketing psychology from industry legends
- Hooks designed using viral content strategies
- Visual concepts that are specific and actionable (not generic "film yourself" advice)
- Captions optimized for engagement and shareability

**Sample Results**

Input (idea): How I Saved 10 Hours a Week with Blog Automation?

Output:

💡 Hook (variants):
- "Blogging doesn't have to be a time-suck. Here's my secret…"
- "Is blogging eating up your spare time? Let's fix that!"
- "Unlock 10 hours a week AND keep your blog thriving!"

🎬 Script:
- Hook: "Blogging doesn't have to be a time-suck. Here's my secret…"
- Subtitle: "Maximize your time with blog automation hacks!"
- Body: "Picture this: writing, editing, posting, and promoting your blog without breaking a sweat. I was buried under endless tasks until I discovered blog automation. Scheduling posts, auto-publishing, automating social shares—it's a game-changer. 10 hours a week, back in my pocket! More time for creativity or even a break. Imagine what automation could do for your content game."
- CTA: "Which blog task do you wish was automated? Drop a comment!"

📝 Reel caption: Blog automation isn't just convenience—it's freedom. What will you create with your extra time?

📸 Visual idea: Open with a whirlwind of papers and sticky notes symbolizing chaos. Transition to a person seamlessly typing on a laptop, where blog posts are auto-scheduled. Quick cuts show blog shares and responses happening automatically. Conclude with a serene scene: the person outdoors, notebook in hand, jotting ideas peacefully on a sunny day.
**Setup Instructions**

Prerequisites:
- Telegram account
- OpenAI API account with GPT-4 access
- Google account (optional, for logging ideas)

Step 1: Create your Telegram bot
1. Open Telegram and search for @BotFather
2. Send /newbot and follow the instructions to create your bot
3. Save the Bot Token you receive – you'll need this for n8n
4. Send /setprivacy to @BotFather, select your bot, and choose "Disable" to allow the bot to read all messages

Step 2: Get your OpenAI API key
1. Visit OpenAI's API platform
2. Create an account or log in
3. Navigate to the API Keys section
4. Create a new API key and save it securely
5. Ensure you have access to GPT-4 models (required for optimal results)

Step 3: Configure the workflow
1. Import this template into your n8n instance
2. Set up Telegram credentials: add your Bot Token to all Telegram nodes and test the connection
3. Configure OpenAI credentials: add your API key to the "OpenAI Chat Model" and "Transcribes audio" nodes, and verify GPT-4o model access
4. Optional – Google Sheets setup: create a new Google Sheet with columns Status, Date, Description, Script; connect your Google account to the "Google Sheets" node; select your spreadsheet and sheet

Step 4: Activate and test
1. Click "Activate" in the top-right corner of your workflow
2. Open Telegram and find your bot
3. Send a test message like "Create a Reel about morning routines"
4. Verify you receive a complete Reel scenario response

Step 5: Start creating!
- Send text ideas directly to your bot
- Record voice notes with your concepts
- Receive professional Reel scenarios within seconds
- Use the optional Google Sheets integration to build your content library

**Pro Tips**

- Be specific with your ideas for better results
- The AI remembers your conversation context, so you can refine ideas iteratively
- Voice messages work great for capturing spontaneous ideas on the go
- Review the generated visual concepts – they're designed to be immediately actionable

**Troubleshooting**

- Ensure your OpenAI account has sufficient credits
- Verify your Telegram bot privacy settings allow message reading
- Check that all credentials are properly configured and tested
- For Google Sheets issues, confirm the sheet structure matches the expected columns

This template transforms the tedious process of content creation into an instant, AI-powered system that delivers professional-quality Reel scenarios whenever inspiration strikes!
by Samuel Kimutai
**How it works**

- Automatically generates trending LinkedIn content topics using AI
- Researches current industry angles and hooks
- Writes posts in your authentic voice using OpenAI
- Creates professional images with DALL-E
- Posts everything on schedule without manual intervention

**Set up steps**

1. Connect the OpenAI API for content generation and image creation
2. Link the LinkedIn API for automated posting
3. Configure scheduling triggers (daily/weekly posting)
4. Customize prompts to match your writing style and industry
5. Set up content approval workflows (optional)

**Results you can expect**

- 400% increase in profile views within 3 weeks
- Generate 120+ posts per month vs. 12 posts manually
- Free up 15+ hours weekly for revenue-generating activities
- Consistent posting schedule that builds audience engagement
- Professional content that converts followers to clients

Time to set up: 30–45 minutes
Technical level: Beginner to intermediate
APIs required: OpenAI, LinkedIn API
Cost: OpenAI usage fees only (approximately $5–15/month)

This workflow transforms LinkedIn content creation from a time-consuming daily task into a fully automated system that works while you sleep. Perfect for entrepreneurs, marketers, and content creators who want consistent LinkedIn presence without the manual effort.
by Niranjan G
**Automated GitHub Scanner for Exposed AWS IAM Keys**

**Overview**

This n8n workflow automatically scans GitHub for exposed AWS IAM access keys associated with your AWS account, helping security teams quickly identify and respond to potential security breaches. When compromised keys are found, the workflow generates detailed security reports and sends Slack notifications with actionable remediation steps.

🔑 **Key Features**

- Automated AWS IAM key scanning: regularly checks for exposed AWS access keys on GitHub
- Real-time security alerts: sends immediate Slack notifications when compromised keys are detected
- Comprehensive security reports: generates detailed reports with exposure information and risk assessment
- Actionable remediation steps: provides clear instructions for securing compromised credentials
- Continuous monitoring: maintains ongoing surveillance of your AWS environment

📋 **Workflow Steps**

1. List AWS Users: retrieves all users from your AWS account
2. Split Users for Processing: processes each user individually
3. Get User Access Keys: retrieves access keys for each user
4. Filter Active Keys Only: focuses only on currently active access keys
5. Search GitHub for Exposed Keys: scans GitHub repositories for exposed access keys
6. Aggregate Search Results: consolidates and deduplicates search findings
7. Check For Compromised Keys: determines if any keys have been exposed
8. Generate Security Report: creates detailed security reports for compromised keys
9. Extract AWS Usernames: extracts usernames from the AWS response for notification
10. Format Slack Alert: prepares comprehensive Slack notifications
11. Send Slack Notification: delivers alerts with actionable information
12. Continue Scanning: maintains the continuous monitoring cycle

🛠️ **Setup Requirements**

Prerequisites:
- Active n8n instance
- AWS account with IAM permissions
- GitHub account/token for searching repositories
- Slack workspace for notifications

Required credentials:
- AWS: IAM user with permissions to list users and access keys; Access Key ID and Secret Access Key
- GitHub: Personal Access Token with search permissions
- Slack: webhook URL for your notification channel

⚙️ **Configuration**

- AWS: configure the "List AWS Users" node with your AWS credentials and ensure proper IAM permissions for listing users and access keys
- GitHub: set up the "Search GitHub for Exposed Keys" node with your GitHub token and adjust search parameters if needed
- Slack: configure the Slack node with your webhook URL and customize the notification format if desired

🚀 **Usage**

Running the workflow:
- Manual execution: click "Execute Workflow" to run an immediate scan
- Scheduled execution: set up a schedule to run periodic scans (recommended daily or weekly)

Repository compatibility: this workflow works with both public and private GitHub repositories to which you have access. It will scan all repositories you have permission to view based on your GitHub credentials.

Handling alerts — when a compromised key is detected:
1. Review the Slack notification for details about the exposure
2. Follow the recommended remediation steps:
   - Deactivate the compromised key immediately
   - Create a new key if needed
   - Investigate the exposure source
   - Update any services using the compromised key

⚠️ **Disclaimer**

This workflow template is provided for reference purposes only to demonstrate how to automate AWS IAM key exposure scanning.
Please note:
- The scanning process may produce false positives, as it only matches potential AWS access key patterns
- Always verify any reported exposures manually before taking action
- Disabling or deleting access keys without proper verification could have significant negative impacts on your environment
- Understand which systems and applications rely on identified access keys before deactivating them
- This template should be customized to fit your specific environment and security policies

IMPORTANT: Use this workflow with caution and only after thoroughly understanding your AWS environment. The authors of this template are not responsible for any disruptions or damages resulting from its use.

🔒 **Security Considerations**

- This workflow requires access to sensitive AWS credentials
- Store all credentials securely within n8n
- Review and rotate access keys regularly

📝 **Customization Options**

- Adjust GitHub search parameters for more targeted scanning
- Customize Slack notification format and content
- Modify security report generation for your specific needs
- Integrate with additional notification channels (email, MS Teams, etc.)

**Optional: Enabling Interactive Slack Buttons**

The Slack Block Kit notification format supports interactive buttons that can be implemented if you want to perform actions directly from Slack:
- Disable Key: can be configured to automatically disable the compromised AWS IAM access key
- View Details: can be set up to show additional information about the exposure
- Acknowledge: can be used to mark the alert as acknowledged

To make these buttons functional:

1. Set up a Slack Socket Mode app:
   - Create a Slack app in the Slack API Console
   - Enable Socket Mode and Interactive Components
   - Subscribe to the block_actions event to capture button clicks
2. Create an n8n webhook endpoint:
   - Add a new webhook node to receive Slack button click events
   - Create separate workflows for each button action
3. Implement AWS key disabling:
   - For the "Disable Key" button, create a workflow that uses the n8n HTTP Request node to call the AWS IAM UpdateAccessKey API
   - Example HTTP request that can be implemented in n8n:
     - Method: POST
     - URL: https://iam.amazonaws.com/
     - Query parameters: Action: UpdateAccessKey, AccessKeyId: AKIAIOSFODNN7EXAMPLE, Status: Inactive, UserName: {{$json.username}}, Version: 2010-05-08
4. Update the Slack message format:
   - Modify the Format Slack Alert node to include your webhook URL in the button action values
   - Add callback_id and action_id values to identify which button was clicked

This implementation allows for immediate response to security incidents directly from the Slack interface, reducing response time and improving security posture.
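As an alternative to the raw query call above, the same deactivation can be done with the AWS SDK. A minimal sketch, assuming the @aws-sdk/client-iam package and that the Slack button payload supplies the username and access key ID:

```typescript
// Minimal sketch of the "Disable Key" action using the SDK equivalent of UpdateAccessKey.
import { IAMClient, UpdateAccessKeyCommand } from "@aws-sdk/client-iam";

const iam = new IAMClient({ region: "us-east-1" }); // IAM is a global service; any region works

export async function disableCompromisedKey(userName: string, accessKeyId: string) {
  await iam.send(
    new UpdateAccessKeyCommand({
      UserName: userName,
      AccessKeyId: accessKeyId,
      Status: "Inactive", // deactivate rather than delete, so it can be re-enabled after investigation
    })
  );
  return { disabled: accessKeyId, user: userName };
}
```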
by Malik Hashir
**Overview**

The n8n Telegram Gmail Assistant is an intelligent workflow that lets you search and retrieve specific Gmail emails simply by messaging a Telegram bot. Powered by advanced language models, it turns plain-language requests into precise Gmail searches, delivering results directly to your Telegram chat. This no-code automation is perfect for users who want instant, conversational access to their inbox—no Gmail tab required.

**Key Features**

- Conversational email search: just message the Telegram bot with requests like "Get me all emails from Amazon" or "Show unread emails after 6 June 2025." The assistant understands sender names, keywords, and date filters—even if you only provide part of the information.
- AI-powered query parsing: uses a language model (LLM) to intelligently extract the sender, keywords, and date range from your message, then builds an accurate Gmail search query.
- Flexible filtering: supports sender, keywords, 'after' and 'before' dates, or any combination. Handles both specific and broad queries.
- Instant Telegram delivery: each matching email is formatted with date, sender, subject, and a snippet, and sent as a separate Telegram message for easy reading.
- Customizable & extendable: swap the AI model (Google Gemini or OpenAI), adjust output formatting, and set email limits or read status as needed.

**How It Works**

1. User sends a Telegram message, for example: "Get unread emails from Amazon about invoices after 1 June 2025."
2. AI interprets the request: the workflow's LLM agent extracts the sender, keywords, and date filters, converting them into a Gmail search query using Gmail's syntax (e.g., from:amazon AND (invoice OR invoices) AND after:2025/06/01).
3. Gmail search: the workflow fetches all matching emails from your connected Gmail account.
4. Message formatting: each email is summarized into a concise, emoji-rich Telegram message (date, sender, subject, snippet).
5. Telegram delivery: results are sent to your Telegram chat, one message per email.

**Setup Instructions**

1. Create a Telegram bot: use @BotFather on Telegram to create a bot and obtain the API token.
2. Connect Telegram to n8n: add your bot's API token as a credential in n8n.
3. Connect your Gmail account: authorize your Gmail account in n8n, set email limits, and choose read/unread status preferences.
4. Configure the AI model: use your own Google Gemini or OpenAI API key, or select a preferred LLM node in the workflow.
5. Deploy the workflow: activate the workflow and start messaging your Telegram bot to retrieve emails instantly.

**Value Proposition**

- Save time: no need to open Gmail or remember search operators—just ask in plain language.
- Stay organized: instantly filter and retrieve important emails, even on the go.
- User-friendly: no coding required, with clear setup steps and customizable options.
- Cost-effective: available with an n8n subscription alone—no extra costs or hidden fees.

Enjoy the workflow — free forever within your n8n plan.
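For reference, a minimal sketch of how the extracted filters can be assembled into a Gmail search string — the workflow's LLM agent produces an equivalent query, and the field names here are illustrative rather than the agent's actual output schema:

```typescript
// Minimal sketch of the query-building step behind "AI-powered query parsing".
interface EmailRequest {
  sender?: string;     // "amazon"
  keywords?: string[]; // ["invoice", "invoices"]
  after?: string;      // "2025/06/01"
  before?: string;     // "2025/07/01"
  unreadOnly?: boolean;
}

function buildGmailQuery(req: EmailRequest): string {
  const parts: string[] = [];
  if (req.sender) parts.push(`from:${req.sender}`);
  if (req.keywords?.length) parts.push(`(${req.keywords.join(" OR ")})`);
  if (req.after) parts.push(`after:${req.after}`);
  if (req.before) parts.push(`before:${req.before}`);
  if (req.unreadOnly) parts.push("is:unread");
  return parts.join(" "); // Gmail treats a space between terms as an implicit AND
}

// buildGmailQuery({ sender: "amazon", keywords: ["invoice", "invoices"], after: "2025/06/01", unreadOnly: true })
// -> 'from:amazon (invoice OR invoices) after:2025/06/01 is:unread'
```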
by Shiv Gupta
🎵 **TikTok Post Scraper via Keywords | Bright Data + Sheets Integration**

📝 **Workflow Description**

Automatically scrapes TikTok posts based on keyword search using the Bright Data API and stores comprehensive data in Google Sheets for analysis and monitoring.

🔄 **How It Works**

This workflow operates through a simple, automated process:
1. Keyword input: the user submits search keywords through a web form
2. Data scraping: the Bright Data API searches TikTok for posts matching the keywords
3. Processing loop: monitors scraping progress and waits for completion
4. Data storage: automatically saves all extracted data to Google Sheets
5. Result delivery: provides comprehensive post data including metrics, user info, and media URLs

⏱️ **Setup Information**

Estimated setup time: 10–15 minutes. This includes importing the workflow, configuring credentials, and testing the integration. Most of the process is automated once properly configured.

✨ **Key Features**

- 📝 Keyword-based search: search TikTok posts using specific keywords
- 📊 Comprehensive data extraction: captures post metrics, user profiles, and media URLs
- 📋 Google Sheets integration: automatically organizes data in spreadsheets
- 🔄 Automated processing: handles scraping progress monitoring
- 🛡️ Reliable scraping: uses Bright Data's professional infrastructure
- ⚡ Real-time updates: live status monitoring and data processing

📊 **Data Extracted**

| Field | Description | Example |
|-------|-------------|---------|
| url | TikTok post URL | https://www.tiktok.com/@user/video/123456 |
| post_id | Unique post identifier | 7234567890123456789 |
| description | Post caption/description | Check out this amazing content! #viral |
| digg_count | Number of likes | 15400 |
| share_count | Number of shares | 892 |
| comment_count | Number of comments | 1250 |
| play_count | Number of views | 125000 |
| profile_username | Creator's username | @creativity_master |
| profile_followers | Creator's follower count | 50000 |
| hashtags | Post hashtags | #viral #trending #fyp |
| create_time | Post creation timestamp | 2025-01-15T10:30:00Z |
| video_url | Direct video URL | https://video.tiktok.com/tos/... |
🚀 **Setup Instructions**

Step 1: Prerequisites
- n8n instance (self-hosted or cloud)
- Bright Data account with TikTok scraping dataset access
- Google account with Sheets access
- Basic understanding of n8n workflows

Step 2: Import the workflow
1. Copy the provided JSON workflow code
2. In n8n: go to Workflows → + Add workflow → Import from JSON
3. Paste the JSON code and click Import
4. The workflow will appear in your n8n interface

Step 3: Configure Bright Data
1. In n8n: navigate to Credentials → + Add credential → Bright Data API
2. Enter your Bright Data API credentials
3. Test the connection to ensure it's working
4. Update the workflow nodes with your dataset ID: gd_lu702nij2f790tmv9h
5. Replace BRIGHT_DATA_API_KEY with your actual API key

Step 4: Configure Google Sheets
1. Create a new Google Sheet or use an existing one
2. Copy the Sheet ID from the URL
3. In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
4. Complete the OAuth setup and test the connection
5. Update the Google Sheets node with your Sheet ID
6. Ensure the sheet has a tab named "Tiktok by keyword"

Step 5: Test the workflow
1. Activate the workflow using the toggle switch
2. Access the form trigger URL to submit a test keyword
3. Monitor the workflow execution in n8n
4. Verify data appears in your Google Sheet
5. Check that all fields are populated correctly

⚙️ **Configuration Details**

Bright Data API settings:
- Dataset ID: gd_lu702nij2f790tmv9h
- Discovery type: discover_new
- Search method: keyword
- Results per input: 2 posts per keyword
- Include errors: true

Workflow parameters:
- Wait time: 1 minute between status checks
- Status check: monitors until scraping is complete
- Data format: JSON response from Bright Data
- Error handling: automatic retry on incomplete scraping

📋 **Usage Guide**

Running the workflow:
1. Access the form trigger URL provided by n8n
2. Enter your desired keyword (e.g., "viral dance", "cooking tips")
3. Submit the form to start the scraping process
4. Wait for the workflow to complete (typically 2–5 minutes)
5. Check your Google Sheet for the extracted data

Best practices:
- Use specific, relevant keywords for better results
- Monitor your Bright Data usage to stay within limits
- Regularly back up your Google Sheets data
- Test with simple keywords before complex searches
- Review extracted data for accuracy and completeness

🔧 **Troubleshooting**

🚨 Scraping not starting
- Verify the Bright Data API credentials are correct
- Check that the dataset ID matches your account
- Ensure sufficient credits in your Bright Data account

🚨 No data in Google Sheets
- Confirm the Google Sheets credentials are authenticated
- Verify the sheet ID is correct
- Check that the "Tiktok by keyword" tab exists

🚨 Workflow timeout
- Increase the wait time if scraping takes longer
- Check the Bright Data dashboard for scraping status
- Verify the keyword produces available results

📈 **Use Cases**

- Content research: research trending content and hashtags in your niche to inform your content strategy.
- Competitor analysis: monitor competitor posts and engagement metrics to understand market trends.
- Influencer discovery: find influencers and creators in specific topics or industries.
- Market intelligence: gather data on trending topics, hashtags, and user engagement patterns.

🔒 **Security Notes**

- Keep your Bright Data API credentials secure
- Use appropriate Google Sheets sharing permissions
- Monitor API usage to prevent unexpected charges
- Regularly rotate API keys for better security
- Comply with TikTok's terms of service and data usage policies
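For reference, a minimal sketch of the three Bright Data dataset API calls that this workflow's HTTP Request nodes perform: trigger, progress polling, and snapshot download. The endpoints follow Bright Data's datasets v3 API, and the input field names for the TikTok keyword dataset (search_keyword, num_of_posts) are assumptions to verify against your dataset's schema:

```typescript
// Minimal sketch of the Bright Data calls behind this workflow: trigger -> poll progress -> download snapshot.
const BD_BASE = "https://api.brightdata.com/datasets/v3";
const headers = { Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}` };
const DATASET_ID = "gd_lu702nij2f790tmv9h";

async function scrapeTikTokByKeyword(keyword: string) {
  // 1) Trigger a keyword-based discovery run
  const trigger = await fetch(
    `${BD_BASE}/trigger?dataset_id=${DATASET_ID}&type=discover_new&discover_by=keyword&include_errors=true`,
    {
      method: "POST",
      headers: { ...headers, "Content-Type": "application/json" },
      body: JSON.stringify([{ search_keyword: keyword, num_of_posts: 2 }]),
    }
  ).then((r) => r.json());

  // 2) Poll until the snapshot is ready (the workflow waits 1 minute between checks)
  let status = "running";
  while (status !== "ready") {
    await new Promise((r) => setTimeout(r, 60_000));
    const progress = await fetch(`${BD_BASE}/progress/${trigger.snapshot_id}`, { headers })
      .then((r) => r.json());
    status = progress.status;
  }

  // 3) Download the results and hand them to the Google Sheets node
  return fetch(`${BD_BASE}/snapshot/${trigger.snapshot_id}?format=json`, { headers })
    .then((r) => r.json());
}
```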
🎉 **Ready to Use!**

Your TikTok scraper is now configured and ready to extract valuable data. Start with simple keywords and gradually expand your research as you become familiar with the workflow.

Need help? Visit the n8n community forum or check the Bright Data documentation for additional support and advanced configuration options. For any questions or support, please contact: Email or fill out this form