by Custom Workflows AI
Introduction

This workflow offers a streamlined solution for uploading multiple files to a GitHub repository simultaneously using GitHub's REST API. It addresses a significant limitation of n8n's native GitHub node, which only supports single-file uploads. By leveraging GitHub's Git Data API, this workflow creates a new Git tree containing multiple files, commits this tree, and updates the target branch—all in a single automated process.

The workflow is particularly valuable for automation scenarios that require batch file operations, such as deploying website updates, publishing documentation, or maintaining configuration files across repositories. It eliminates the need for multiple separate API calls when working with multiple files, making your automation more efficient and less prone to partial update issues. By abstracting the complexities of GitHub's Git Data API into a reusable workflow, it provides a practical solution for developers, content managers, and DevOps professionals who need to programmatically manage repository content at scale.

Who is this for?

This workflow is designed for:

- Developers and DevOps engineers who need to automate file updates in GitHub repositories
- Content managers who regularly publish multiple files to GitHub-hosted websites or documentation
- Automation specialists looking to integrate GitHub operations into larger workflows
- Teams using n8n for CI/CD processes who need to push code or configuration changes

Users should have basic familiarity with GitHub concepts (repositories, branches, commits) and should be comfortable obtaining and using GitHub Personal Access Tokens. While the workflow handles the API complexity, users should understand the fundamentals of version control to effectively utilize and customize it.

What problem is this workflow solving?

This workflow addresses several key challenges:

- Limited batch operations: n8n's native GitHub node only supports uploading one file at a time, making multi-file operations cumbersome and inefficient.
- API complexity: GitHub's Git Data API requires multiple sequential calls with interdependent data to create commits with multiple files, which is complex to implement manually.
- Automation bottlenecks: Without this workflow, automating multi-file updates would require either multiple separate API calls (risking partial updates) or custom scripting outside of n8n.
- Consistency issues: When files need to be updated together (e.g., code and corresponding documentation), this workflow ensures they're committed in a single atomic operation.

By solving these issues, the workflow enables reliable, atomic updates of multiple files, maintaining repository consistency and simplifying automation processes.

What this workflow does

Overview

This workflow uses GitHub's REST API to push multiple files to a repository in a single operation. It follows Git's internal model by:

1. Retrieving the current state of the repository
2. Creating a new tree with the files to be added or updated
3. Creating a new commit with this tree
4. Updating the branch reference to point to the new commit

Process

1. Initialization: The workflow starts with a manual trigger and sets up GitHub credentials and repository information.
2. File Content Definition: Two "Set" nodes define the content for the files to be uploaded.
3. Repository State Retrieval: The workflow fetches the latest commit SHA for the specified branch, then retrieves the base tree SHA from this commit.
4. Tree Creation: A new Git tree is created that includes both files (file1.txt and file2.txt), specifying their paths and content.
5. Commit Creation: A new commit is created with the specified commit message, referencing the new tree and the parent commit.
6. Branch Update: Finally, the branch reference is updated to point to the new commit, making the changes visible in the repository.

Setup

To use this workflow:

1. Import the workflow: Download the workflow JSON and import it into your n8n instance.
2. Create a GitHub Personal Access Token:
   - Go to GitHub Settings → Developer Settings → Personal Access Tokens → Fine-grained tokens
   - Create a new token with "Contents" permission (Read and write) for your target repository
3. Configure the workflow: Update the "Set Github Info" node with:
   - Your GitHub Personal Access Token
   - Your GitHub username
   - Your repository name
   - The target branch (default is "main")
   - A commit message
4. Define file content: Modify the "File 1" and "File 2" nodes with the content you want to upload.
5. Adjust file paths if needed: In the "Create new tree" node, update the file paths if you want to change where the files are stored in the repository.
6. Save and run the workflow: Click "Test workflow" to execute the process.

How to customize this workflow to your needs

This workflow can be adapted in several ways:

- Add more files: Create additional "Set" nodes for more file content, then add more tree entries in the "Create new tree" node following the same pattern (path, mode, type, content).
- Change file locations: Modify the "path" parameters in the "Create new tree" node to place files in different directories.
- Dynamic file content: Replace the static content in the "File" nodes with data from other sources; use previous nodes or HTTP requests to generate file content dynamically.
- Conditional file updates: Add IF nodes to determine which files should be updated based on certain conditions, and create separate branches in your workflow for different update scenarios.
- Scheduled updates: Replace the manual trigger with a Schedule node to run the workflow at specific intervals, or combine with other triggers like Webhook or database events to push files when certain events occur.
- Error handling: Add Error Trigger nodes to handle potential API failures and implement notification nodes to alert you of successful pushes or failures.
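For readers who want to see the underlying API sequence outside n8n, here is a minimal sketch of the same five Git Data API calls in Node.js 18+ (built-in fetch). The owner, repo, branch, and token values are placeholders for your own settings:

```javascript
// Minimal sketch of the Git Data API sequence the workflow performs.
const OWNER = "you", REPO = "demo", BRANCH = "main", TOKEN = process.env.GH_TOKEN;
const gh = (path, init = {}) =>
  fetch(`https://api.github.com/repos/${OWNER}/${REPO}${path}`, {
    ...init,
    headers: {
      Authorization: `Bearer ${TOKEN}`,
      Accept: "application/vnd.github+json",
      "Content-Type": "application/json",
    },
  }).then(r => r.json());

async function pushFiles(files, message) {
  // 1. Latest commit SHA on the branch
  const ref = await gh(`/git/ref/heads/${BRANCH}`);
  const parentSha = ref.object.sha;
  // 2. Base tree SHA from that commit
  const parent = await gh(`/git/commits/${parentSha}`);
  // 3. New tree containing every file (mode 100644 = regular file)
  const tree = await gh(`/git/trees`, {
    method: "POST",
    body: JSON.stringify({
      base_tree: parent.tree.sha,
      tree: files.map(f => ({ path: f.path, mode: "100644", type: "blob", content: f.content })),
    }),
  });
  // 4. New commit pointing at the tree
  const commit = await gh(`/git/commits`, {
    method: "POST",
    body: JSON.stringify({ message, tree: tree.sha, parents: [parentSha] }),
  });
  // 5. Move the branch ref: the single atomic step that publishes all files
  return gh(`/git/refs/heads/${BRANCH}`, {
    method: "PATCH",
    body: JSON.stringify({ sha: commit.sha }),
  });
}

pushFiles(
  [{ path: "file1.txt", content: "hello" }, { path: "file2.txt", content: "world" }],
  "Add two files in one commit"
).then(res => console.log("branch now at", res.object.sha));
```

Step 5 is what makes the update atomic: until the ref moves, none of the new files are visible on the branch.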
by Danielle Gomes
Automatically classify incoming leads based on the sentiment of their message using Google Gemini, store them in Supabase by category, and send tailored WhatsApp messages via the official WhatsApp Cloud API.

✅ Use Case:

This workflow is ideal for sales, onboarding, and customer support teams who want to:

- Understand the tone and urgency of each lead
- Prioritize hot leads instantly
- Send smart, automatic WhatsApp replies based on user sentiment

🧠 How it works:

1. Capture lead via a Typeform webhook
2. Clean and structure the data (name, email, message, etc.)
3. Run sentiment analysis using Google Gemini to classify the message as:
   - Positive → Hot Lead
   - Neutral → Warm Lead
   - Negative → Cold Lead
4. Store lead data in Supabase under the corresponding category
5. Merge data to unify flow paths
6. Send WhatsApp message using the official WhatsApp Cloud API, with a custom reply for each sentiment result

🔧 Tools used:

- Typeform (incoming data)
- Google Gemini (AI-based sentiment classification)
- Supabase (database)
- WhatsApp Cloud API (response automation)

🏷 Tags: AI, Sentiment Analysis, Lead Qualification, Supabase, WhatsApp, Gemini, Typeform, CRM, Automation, Customer Engagement
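To make step 3's routing concrete, here is a hedged sketch of an n8n Code node that maps Gemini's sentiment label to a lead category and a canned WhatsApp reply. The field name `sentiment` and the reply texts are assumptions, not the template's exact values:

```javascript
// Hypothetical Code node: route Gemini's sentiment label to a lead category.
const sentiment = ($json.sentiment || "neutral").toLowerCase();

const routing = {
  positive: { category: "hot",  reply: "Thanks for the enthusiasm! A specialist will call you within the hour." },
  neutral:  { category: "warm", reply: "Thanks for reaching out. Here is a quick overview while we review your request." },
  negative: { category: "cold", reply: "Sorry to hear that. A support agent will contact you shortly to help." },
};

// Fall back to the warm path if the model returns an unexpected label.
const { category, reply } = routing[sentiment] ?? routing.neutral;
return [{ json: { ...$json, category, reply } }];
```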
by Oneclick AI Squad
This n8n workflow automates the process of scraping LinkedIn profiles using the Apify platform and organizing the extracted data into Google Sheets for easy analysis and follow-up.

Use Cases

- **Lead Generation**: Extract contact information and professional details from LinkedIn profiles
- **Recruitment**: Gather candidate information for talent acquisition
- **Market Research**: Analyze professional networks and industry connections
- **Sales Prospecting**: Build targeted prospect lists with detailed professional information

How It Works

1. Workflow Initialization & Input
   - **Webhook Start Scraper**: Triggers the entire scraping workflow
   - **Read LinkedIn URLs**: Retrieves LinkedIn profile URLs from Google Sheets
   - **Schedule Scraper Trigger**: Sets up automated scheduling for regular scraping
2. Data Processing & Extraction
   - **Data Formatting**: Prepares and structures the LinkedIn URLs for processing
   - **Fetch Profile Data**: Makes HTTP requests to the Apify API with profile URLs
   - **Run Scraper Actor**: Executes the Apify LinkedIn scraper actor
   - **Get Scraped Results**: Retrieves the extracted profile data from Apify
3. Data Storage & Completion
   - **Save to Google Sheets**: Stores the scraped profile data in organized spreadsheet format
   - **Update Progress Tracker**: Updates workflow status and progress tracking
   - **Process Complete Wait**: Ensures all operations finish before final steps
   - **Send Success Notification**: Alerts users when scraping is successfully completed

Requirements

Apify Account
- Active Apify account with sufficient credits
- API token for authentication
- Access to the LinkedIn Profile Scraper actor

Google Sheets
- Google account with Sheets access
- Properly formatted input sheet with LinkedIn URLs
- Credentials configured in n8n

n8n Setup
- HTTP Request node credentials for Apify
- Google Sheets node credentials
- Webhook endpoint configured

How to Use

Step 1: Prepare Your Data
1. Create a Google Sheet with LinkedIn profile URLs
2. Ensure the sheet has a column named 'linkedin_url'
3. Add any additional columns for metadata (name, company, etc.)

Step 2: Configure Credentials
1. Set up Apify API credentials in n8n
2. Configure Google Sheets authentication
3. Update the webhook endpoint URL

Step 3: Customize Settings
1. Adjust scraping parameters in the Apify node
2. Modify the data fields to extract based on your needs
3. Set up notification preferences

Step 4: Execute Workflow
1. Trigger via webhook or manual execution
2. Monitor progress through the workflow
3. Check Google Sheets for scraped data
4. Review completion notifications

Good to Know

- **Rate Limits**: LinkedIn scraping is subject to rate limits. The workflow includes delays to respect these limits.
- **Data Quality**: Results depend on profile visibility and LinkedIn's anti-scraping measures.
- **Costs**: Apify charges based on compute units used. Monitor your usage to control costs.
- **Compliance**: Ensure your scraping activities comply with LinkedIn's Terms of Service and applicable laws.
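As a rough illustration of the "Run Scraper Actor" and "Get Scraped Results" steps, here is a sketch against Apify's v2 REST API (Node.js 18+). The actor ID and the `profileUrls` input field are placeholders; each actor defines its own input schema, so check the actor's documentation:

```javascript
// Sketch: start an Apify actor run and wait for its dataset items in one call.
const APIFY_TOKEN = process.env.APIFY_TOKEN;
const ACTOR_ID = "someuser~linkedin-profile-scraper"; // placeholder actor ID

async function scrapeProfiles(urls) {
  const res = await fetch(
    `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ profileUrls: urls }), // input field name is an assumption
    }
  );
  if (!res.ok) throw new Error(`Apify run failed: ${res.status}`);
  return res.json(); // array of scraped profile records
}

scrapeProfiles(["https://www.linkedin.com/in/example/"]).then(console.log);
```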
Customizing This Workflow

Enhanced Data Processing
- Add data enrichment steps to append additional information
- Implement duplicate detection and merge logic (a sketch appears at the end of this section)
- Create data validation rules for quality control

Advanced Notifications
- Set up Slack or email alerts for different scenarios
- Create detailed reports with scraping statistics
- Implement error recovery mechanisms

Integration Options
- Connect to CRM systems for automatic lead creation
- Integrate with marketing automation platforms
- Export data to analytics tools for further analysis

Troubleshooting

Common Issues
- **Apify Actor Failures**: Check API limits and actor status
- **Google Sheets Errors**: Verify permissions and sheet structure
- **Rate Limiting**: Implement longer delays between requests
- **Data Quality Issues**: Review scraping parameters and target profiles

Best Practices
- Test with small batches before scaling up
- Monitor Apify credit usage regularly
- Keep backup copies of your data
- Regularly validate the accuracy of scraped information
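A minimal sketch of the duplicate-detection idea mentioned above, written as an n8n Code node body. It keeps the first item per `linkedin_url`, matching the input-sheet column from Step 1:

```javascript
// Deduplicate incoming items by their linkedin_url field.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const key = (item.json.linkedin_url || "").trim().toLowerCase();
  if (!key || seen.has(key)) continue; // skip blanks and repeats
  seen.add(key);
  unique.push(item);
}

return unique;
```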
by Alex Hi no code
Automate Instagram DMs with OpenAI GPT and ManyChat

How It Works

Once connected, GPT will automatically initiate conversations with messages from new recipients in Instagram.

Who Is This For?

This workflow is ideal for marketers, business owners, and content creators who want to automatically respond to Instagram direct messages using OpenAI GPT. By integrating ManyChat, you can manage conversations, nurture leads, and provide instant replies at scale.

What This Workflow Does

- **Captures** incoming Instagram DMs through ManyChat's integration.
- **Processes** messages with GPT to generate a relevant response.
- **Delivers** instant replies back to Instagram users, creating efficient, AI-driven communication.

Setup

1. Import the Template: Copy the n8n workflow into your workspace.
2. OpenAI Credentials: Add your OpenAI API key in n8n so GPT can generate responses.
3. ManyChat Account: Create (or log in to) your ManyChat account.
4. Connect Instagram: Link your Instagram profile as a channel in ManyChat.
5. ManyChat Custom Field: Create a custom field for storing user input or conversation context.
6. Configure Default Reply: In ManyChat, set up the default Instagram reply flow to point to your n8n webhook.
7. Add External Request: Create an external request step in ManyChat to send messages to n8n.
8. Test the Flow: Send yourself a DM on Instagram to confirm the workflow triggers and GPT responds correctly.

Instructions and links:

- Notion instruction
- Register in ManyChat
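As a rough sketch of step 7's receiving side, here is what the n8n webhook handler might look like as a Code node. The payload fields (`subscriber_id`, `last_input_text`) are assumptions: they depend entirely on how you map fields in the ManyChat external request step.

```javascript
// Hypothetical Code node after the webhook that receives ManyChat's request.
const body = $json.body ?? $json;

return [{
  json: {
    subscriberId: body.subscriber_id,                          // used later to reply via ManyChat
    userMessage: (body.last_input_text || "").slice(0, 2000),  // cap prompt length for GPT
  },
}];
```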
by Rodrigue Gbadou
How it works

This comprehensive recruitment automation workflow transforms your hiring process from manual screening to intelligent candidate management. The system begins by automatically collecting CVs from multiple job boards and career platforms, immediately parsing each submission using advanced AI technology to extract key information including skills, experience levels, educational background, and career progression patterns.

Once parsed, the workflow employs predictive scoring algorithms that evaluate each candidate against your specific job requirements and company culture criteria. This multi-dimensional analysis considers technical skills alignment, experience relevance, cultural fit indicators, and career trajectory patterns to generate compatibility scores with remarkable accuracy.

The system then seamlessly transitions qualified candidates into automated interview scheduling, coordinating availability across hiring managers, team members, and candidates while optimizing for timezone considerations and calendar conflicts. Finally, successful candidates enter a personalized onboarding workflow that adapts to their role, department, and experience level, ensuring smooth integration into your organization.

Target audience and problem solved

This workflow is designed for HR departments, talent acquisition teams, and growing companies struggling with time-intensive recruitment processes. It specifically addresses the challenges of manual CV screening, subjective candidate evaluation, scheduling conflicts, and inconsistent onboarding experiences. Organizations processing high volumes of applications or seeking to eliminate recruitment bias while maintaining quality standards will benefit most from this automation.

Set up steps

1. Prerequisites: Ensure you have API access to your chosen AI parsing service (OpenAI, Affinda, or equivalent), active accounts on target job boards, and administrative access to your calendar and ATS systems.
2. Configure job board integrations: Connect your LinkedIn Recruiter, Indeed, and Glassdoor accounts using their respective APIs. Set up webhook endpoints to automatically capture new CV submissions and configure filtering criteria based on job titles, locations, and basic qualifications.
3. Establish AI parsing service: Choose and configure your CV analysis provider (OpenAI for natural language processing, Affinda for specialized CV parsing, or alternative services). Set up API credentials and define extraction templates for skills, experience, education, and custom fields relevant to your industry.
4. Integrate calendar systems: Connect Google Calendar, Outlook, or your preferred scheduling platform. Configure availability windows for all hiring team members, set interview duration templates, and establish buffer times between meetings.
5. Synchronize ATS platform: Link your Applicant Tracking System (Workday, BambooHR, Greenhouse, etc.) to ensure seamless candidate data flow. Map workflow fields to your ATS schema and establish status update triggers.
6. Connect interview tools: Integrate video conferencing platforms (Zoom, Microsoft Teams, Google Meet) for automatic meeting room creation and invitation distribution. Configure recording settings and waiting room preferences.
7. Link HRMS for onboarding: Connect your Human Resource Management System to trigger personalized onboarding sequences based on role type, department, and seniority level.
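For step 3, if you choose OpenAI as the parsing provider, the extraction step could look roughly like this (Node.js 18+). The extraction schema in the system prompt is an assumption; adapt the fields to your own template:

```javascript
// Sketch of a CV-parsing call via OpenAI's Chat Completions API.
async function parseCv(cvText) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o",
      response_format: { type: "json_object" }, // force machine-readable output
      messages: [
        {
          role: "system",
          content: "Extract from the CV a JSON object with keys: skills (string[]), years_experience (number), education (string[]), last_three_roles (string[]).",
        },
        { role: "user", content: cvText },
      ],
    }),
  });
  const data = await res.json();
  return JSON.parse(data.choices[0].message.content);
}
```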
Key Features

- 🧠 **Advanced CV analysis**: Leverages machine learning to automatically extract and categorize skills, experience, education, certifications, and career progression patterns with 95% accuracy
- 📊 **Multi-criteria scoring**: Implements customizable evaluation matrices considering technical skills, soft skills, experience relevance, cultural fit indicators, and growth potential
- 📅 **Intelligent scheduling**: Automatically coordinates complex interview schedules across multiple stakeholders, considering time zones, availability preferences, and interview type requirements
- 🎯 **Precise candidate matching**: Generates compatibility percentages based on job requirements, team dynamics, and long-term career alignment factors
- ⚡ **Accelerated recruitment cycle**: Reduces time-to-hire by up to 60% through automated screening, intelligent prioritization, and streamlined communication workflows
- 👥 **Collaborative evaluation**: Enables structured feedback collection from multiple interviewers with standardized scoring rubrics and consensus-building tools
- 📱 **Enhanced candidate experience**: Provides mobile-optimized interfaces for application tracking, interview scheduling, and communication throughout the recruitment journey
- 🔄 **Continuous optimization**: Automatically tracks and analyzes recruitment metrics to continuously improve scoring algorithms and process efficiency

Customization options

The workflow offers extensive customization capabilities including adjustable scoring weights for different criteria, industry-specific skill taxonomies, custom interview formats, and role-based onboarding paths. Organizations can configure approval workflows, set up custom notification templates, and establish specific integration parameters to match their unique recruitment processes and company culture.

This automation solution transforms recruitment from a time-intensive manual process into a strategic, data-driven system that improves both hiring quality and candidate experience while significantly reducing administrative overhead.
by ist00dent
This n8n template allows you to instantly generate QR codes from any text or URL by simply sending a webhook request. It's a versatile tool for creating dynamic QR codes for various purposes, from marketing campaigns to event registrations, directly integrated into your automated workflows.

🔧 How it works

1. Receive Data Webhook: This node acts as the entry point for the workflow. It listens for incoming POST requests and expects a JSON body with a data property containing the text or URL you want to encode into the QR code.
2. Generate QR Code: This node makes an HTTP GET request to the QR Server API (api.qrserver.com) to generate the QR code image. The content from your webhook is passed as the data parameter to the API.
3. Respond with QR Code: This node sends the response from the QR Server API back to the service that initiated the webhook. The QR Server API directly returns the image data, so your webhook response will be the QR code image itself.

👤 Who is it for?

This workflow is ideal for:

- Marketers: Generate QR codes for product links, event registrations, or promotional materials on the fly.
- Developers: Integrate QR code generation into applications, websites, or internal tools.
- Event Organizers: Create dynamic QR codes for ticketing, information access, or check-ins.
- Businesses: Streamline processes requiring physical-to-digital transitions, like menu access or contact sharing.
- Automation Enthusiasts: Add QR code generation capabilities to any workflow.

📑 Data Structure

When you trigger the webhook, send a POST request with a JSON body structured as follows:

{ "data": "https://www.yourwebsite.com/your-specific-page-or-text-to-encode" }

The workflow will return the QR code image directly in the response.

⚙️ Setup Instructions

1. Import Workflow: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. Configure Webhook Path: Double-click the Receive Data Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /generate-qr).
3. Customize QR Code (Optional): Double-click the Generate QR Code node. You can adjust the size parameter in the URL (e.g., size=200x200 for a larger QR code) or add other parameters supported by the QR Server API (e.g., bgcolor, color, qzone).
4. Activate Workflow: Save and activate the workflow.

📝 Tips

- Handling the Image Output: Since the QR Server API directly returns the image, the webhook response will be the image data. Depending on your use case, you might want to:
  - Save to File/Cloud: Insert a node (e.g., Write Binary File, Amazon S3, Google Drive) after Generate QR Code to save the image to a file system or cloud storage.
  - Embed in HTML/Email: If you're building an HTML response or sending an email, you might need to convert the image data to a Base64 string or provide a URL to a saved image.
- Error Handling: Enhance workflow robustness by adding an Error Trigger node. This allows you to catch any issues during QR code generation and set up notifications or logging.
- Dynamic Size/Color: You can extend the Receive Data Webhook to accept parameters for size, color, or bgcolor in the incoming JSON, then dynamically pass these to the url of the Generate QR Code node to create highly customizable QR codes.
- Input Validation: For more advanced use cases, you could add a Function node after the webhook to validate the incoming data and ensure it's in a valid format (e.g., a URL).
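For reference, here is a small sketch of the same "Generate QR Code" request in Node.js 18+. The QR Server API returns the PNG bytes directly, and size is one of its documented optional parameters:

```javascript
// Sketch: fetch a QR code PNG from the QR Server API and write it to disk.
const fs = require("node:fs");

async function generateQr(data, size = "200x200") {
  const url = new URL("https://api.qrserver.com/v1/create-qr-code/");
  url.searchParams.set("data", data); // text or URL to encode
  url.searchParams.set("size", size); // widthxheight in pixels
  const res = await fetch(url);
  if (!res.ok) throw new Error(`QR generation failed: ${res.status}`);
  return Buffer.from(await res.arrayBuffer()); // raw PNG image data
}

generateQr("https://example.com").then(png => fs.writeFileSync("qr.png", png));
```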
by LukaszB
Crypto Price Alert – n8n Workflow

A simple and effective crypto alert system for anyone who wants to stay up to date with coin price changes — without refreshing charts all day. This workflow checks the current price of your chosen cryptocurrency (via CoinGecko) and sends you an alert on Discord if it goes above or below your target range. It's lightweight, easy to set up, and runs on autopilot.

What the Workflow Does

- Checks the live price of a selected coin using the CoinGecko API.
- Compares it to the max/min prices you define manually.
- Decides if the price is too high or too low.
- Sends an alert message to Discord depending on the result.

How It Works

1. The flow is triggered manually or on a schedule (your choice).
2. It pulls the current price of the coin you set.
3. Compares that price with your min and max values.
4. Sends a "high" or "low" message to your Discord webhook.

Setup Steps

1. Enter your coin ID and price thresholds in the "Set Low and High" node.
2. Paste your Discord webhook URLs in the "Message High" and "Message Low" nodes.
3. Optional: Adjust the schedule trigger to run every X minutes/hours.
4. Run once manually to test — takes under a minute.

Full instructions and config tips are in sticky notes inside the workflow.
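The whole check fits in a few lines outside n8n too. Here is a sketch using CoinGecko's public simple-price endpoint and a Discord webhook (Node.js 18+); the coin ID, thresholds, and webhook URL are placeholders you set yourself:

```javascript
// Sketch: compare the live price against a range and post a Discord alert.
const COIN = "bitcoin", MIN = 55000, MAX = 75000;
const DISCORD_WEBHOOK = process.env.DISCORD_WEBHOOK_URL;

async function checkPrice() {
  // CoinGecko simple-price endpoint: no API key needed for light usage.
  const res = await fetch(
    `https://api.coingecko.com/api/v3/simple/price?ids=${COIN}&vs_currencies=usd`
  );
  const price = (await res.json())[COIN].usd;

  if (price > MAX || price < MIN) {
    const direction = price > MAX ? "above" : "below";
    await fetch(DISCORD_WEBHOOK, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ content: `⚠️ ${COIN} is ${direction} your range: $${price}` }),
    });
  }
}

checkPrice();
```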
by Niranjan G
Automated GitHub Scanner for Exposed AWS IAM Keys

Overview

This n8n workflow automatically scans GitHub for exposed AWS IAM access keys associated with your AWS account, helping security teams quickly identify and respond to potential security breaches. When compromised keys are found, the workflow generates detailed security reports and sends Slack notifications with actionable remediation steps.

🔑 Key Features

- **Automated AWS IAM Key Scanning**: Regularly checks for exposed AWS access keys on GitHub
- **Real-time Security Alerts**: Sends immediate Slack notifications when compromised keys are detected
- **Comprehensive Security Reports**: Generates detailed reports with exposure information and risk assessment
- **Actionable Remediation Steps**: Provides clear instructions for securing compromised credentials
- **Continuous Monitoring**: Maintains ongoing surveillance of your AWS environment

📋 Workflow Steps

1. List AWS Users: Retrieves all users from your AWS account
2. Split Users for Processing: Processes each user individually
3. Get User Access Keys: Retrieves access keys for each user
4. Filter Active Keys Only: Focuses only on currently active access keys
5. Search GitHub for Exposed Keys: Scans GitHub repositories for exposed access keys
6. Aggregate Search Results: Consolidates and deduplicates search findings
7. Check For Compromised Keys: Determines if any keys have been exposed
8. Generate Security Report: Creates detailed security reports for compromised keys
9. Extract AWS Usernames: Extracts usernames from the AWS response for notification
10. Format Slack Alert: Prepares comprehensive Slack notifications
11. Send Slack Notification: Delivers alerts with actionable information
12. Continue Scanning: Maintains the continuous monitoring cycle

🛠️ Setup Requirements

Prerequisites
- Active n8n instance
- AWS account with IAM permissions
- GitHub account/token for searching repositories
- Slack workspace for notifications

Required Credentials
- AWS Credentials: IAM user with permissions to list users and access keys; Access Key ID and Secret Access Key
- GitHub Credentials: Personal Access Token with search permissions
- Slack Credentials: Webhook URL for your notification channel

⚙️ Configuration

1. AWS Configuration: Configure the "List AWS Users" node with your AWS credentials and ensure proper IAM permissions for listing users and access keys.
2. GitHub Configuration: Set up the "Search GitHub for Exposed Keys" node with your GitHub token and adjust search parameters if needed.
3. Slack Configuration: Configure the Slack node with your webhook URL and customize the notification format if desired.

🚀 Usage

Running the Workflow
- Manual Execution: Click "Execute Workflow" to run an immediate scan
- Scheduled Execution: Set up a schedule to run periodic scans (recommended daily or weekly)

Repository Compatibility

This workflow is compatible with both public and private GitHub repositories to which you have access. It will scan all repositories you have permission to view based on your GitHub credentials.

Handling Alerts

When a compromised key is detected:

1. Review the Slack notification for details about the exposure
2. Follow the recommended remediation steps:
   - Deactivate the compromised key immediately
   - Create a new key if needed
   - Investigate the exposure source
   - Update any services using the compromised key

⚠️ Disclaimer

This workflow template is provided for reference purposes only to demonstrate how to automate AWS IAM key exposure scanning.
Please note:

- The scanning process may produce false positives, as it only matches potential AWS access key patterns
- Always verify any reported exposures manually before taking action
- Disabling or deleting access keys without proper verification could have significant negative impacts on your environment
- Understand which systems and applications rely on identified access keys before deactivating them
- This template should be customized to fit your specific environment and security policies

IMPORTANT: Use this workflow with caution and only after thoroughly understanding your AWS environment. The authors of this template are not responsible for any disruptions or damages resulting from its use.

🔒 Security Considerations

- This workflow requires access to sensitive AWS credentials
- Store all credentials securely within n8n
- Review and rotate access keys regularly

📝 Customization Options

- Adjust GitHub search parameters for more targeted scanning
- Customize Slack notification format and content
- Modify security report generation for your specific needs
- Integrate with additional notification channels (email, MS Teams, etc.)

Optional: Enabling Interactive Slack Buttons

The Slack Block Kit notification format supports interactive buttons that can be implemented if you want to perform actions directly from Slack:

- Disable Key: This button can be configured to automatically disable the compromised AWS IAM access key
- View Details: This button can be set up to show additional information about the exposure
- Acknowledge: This button can be used to mark the alert as acknowledged

To make these buttons functional:

1. Set up a Slack Socket Mode App:
   - Create a Slack app in the Slack API Console
   - Enable Socket Mode and Interactive Components
   - Subscribe to the block_actions event to capture button clicks
2. Create an n8n Webhook Endpoint:
   - Add a new webhook node to receive Slack button click events
   - Create separate workflows for each button action
3. Implement AWS Key Disabling: For the "Disable Key" button, create a workflow that uses the n8n HTTP Request node to call the AWS IAM UpdateAccessKey API. Example HTTP request that can be implemented in n8n:
   - Method: POST
   - URL: https://iam.amazonaws.com/
   - Query Parameters:
     - Action: UpdateAccessKey
     - AccessKeyId: AKIAIOSFODNN7EXAMPLE
     - Status: Inactive
     - UserName: {{$json.username}}
     - Version: 2010-05-08
4. Update the Slack Message Format:
   - Modify the Format Slack Alert node to include your webhook URL in the button action values
   - Add callback_id and action_id values to identify which button was clicked

This implementation allows for immediate response to security incidents directly from the Slack interface, reducing response time and improving security posture.
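Note that raw calls to iam.amazonaws.com require SigV4 request signing, which the n8n HTTP Request node handles via its AWS credential type. Outside n8n, a sketch of the equivalent call using the AWS SDK for JavaScript v3 (which signs for you) looks like this; the key ID and user name are placeholders taken from the example parameters above:

```javascript
// Sketch: deactivate an exposed access key with @aws-sdk/client-iam.
import { IAMClient, UpdateAccessKeyCommand } from "@aws-sdk/client-iam";

const iam = new IAMClient({ region: "us-east-1" }); // IAM is a global service

async function disableKey(userName, accessKeyId) {
  await iam.send(new UpdateAccessKeyCommand({
    UserName: userName,
    AccessKeyId: accessKeyId,
    Status: "Inactive", // deactivates, but does not delete, the key
  }));
  console.log(`Disabled ${accessKeyId} for ${userName}`);
}

disableKey("exposed-user", "AKIAIOSFODNN7EXAMPLE");
```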
by Dr. Firas
Auto-Publish Social Videos to 9 Platforms via Google Sheets and Blotato

Who is this workflow for?

This workflow is ideal for marketers, content creators, virtual assistants, and automation specialists managing multi-platform video content. It's especially useful for teams who want to centralize publishing via a spreadsheet and automate social distribution in one shot.

What problem does this workflow solve?

Manually posting videos to multiple social platforms is tedious and time-consuming. This workflow allows you to streamline video distribution using Blotato's API — no more switching between platforms or re-uploading the same video multiple times.

What this workflow does

This automation reads video metadata (URL, caption, title) from a Google Sheet, uploads the video to Blotato, and automatically publishes it to Instagram, YouTube, TikTok, Facebook, LinkedIn, Threads, Twitter (X), Pinterest, and Bluesky. It also updates the sheet to reflect the publishing status (STATUS = DONE), ensuring that your data remains clean and trackable.

Setup

1. Set up your Google Sheet with the required columns: PROMPT, DESCRIPTION, URL VIDEO, Titre, row_number, and STATUS.
2. Add your Blotato API key in the headers of the Upload Video and Post to X nodes.
3. Replace the platform-specific IDs in the Assign Social Media IDs node (Instagram ID, Facebook Page ID, etc.).
4. Set the schedule in the Schedule Trigger node to define when the publishing happens.

> ⚠️ Disclaimer: This workflow uses Community Nodes. These are only available on self-hosted n8n instances.

How to customize this workflow

- Add logic to skip rows already marked as DONE.
- Expand to more platforms supported by Blotato.
- Use a webhook or Telegram trigger instead of the scheduler for more interactivity.
- Modify content per platform if needed (caption formatting, hashtags, etc.).

📄 Documentation: Notion Guide

🎥 Demo Video: Watch the full tutorial here: YouTube Demo
by Dvir Sharon
📰 Publish Latest News on X and Other Social Media Platforms Using Keyword

A comprehensive n8n automation that fetches the latest news based on keywords, generates AI-powered social media content, and automatically publishes to X (Twitter) with complete tracking and notification systems.

📋 Overview

This workflow provides a professional news publishing solution that automatically discovers breaking news, creates engaging social media content using AI, and publishes to X (Twitter) with comprehensive tracking. Perfect for news organizations, content creators, social media managers, and businesses wanting to stay current with automated news sharing. The system uses BrightData's Google News dataset, OpenAI's GPT-4o for content generation, and multi-platform integration for complete automation.

⭐ Key Features

- 📝 **Form-Based Input**: Clean web form for keyword and country submission
- 📰 **Real-Time News Fetching**: BrightData Google News integration for latest articles
- 🤖 **AI Content Generation**: GPT-4o powered tweet creation with hashtags
- 📱 **Auto X Publishing**: Direct posting to X (Twitter) with URL tracking
- 📊 **Complete Tracking**: Google Sheets logging of all published content
- 🔔 **Email Notifications**: Success alerts with tweet links
- 🌍 **Multi-Country Support**: Localized news for US, India, UK, Australia
- ⚡ **Status Monitoring**: Real-time progress tracking with retry logic
- 🛡 **Error Handling**: Robust error management and validation
- 🔄 **Loop Management**: Intelligent waiting for news processing completion

🎯 What This Workflow Does

Input:
- **News Name**: Keyword or topic for news search (required)
- **Country**: Target country for localized news (dropdown: US/IN/GB/AU)

Processing:
1. Form Submission: Captures news keyword and target country
2. News Triggering: Initiates BrightData Google News scraping job
3. Status Monitoring: Checks scraping progress with intelligent retry loop
4. Data Retrieval: Fetches latest news articles when ready
5. AI Content Creation: Generates engaging tweet content using GPT-4o
6. Social Publishing: Posts content to X (Twitter) automatically
7. URL Generation: Creates direct tweet links for tracking
8. Data Logging: Saves content and URLs to Google Sheets
9. Email Notification: Sends success confirmation with tweet link
10. Completion: Workflow ends with full audit trail

📋 Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| TweetMessage | AI-generated social media content | "Breaking: AI revolution transforming healthcare with 40% efficiency gains. New study shows promising results in patient care automation. #AI #Healthcare #Innovation #TechNews #US" |
| TweetURL | Direct link to published tweet | https://twitter.com/i/web/status/1234567890123456789 |

🛠️ Setup Instructions

Prerequisites:
- n8n instance (self-hosted or cloud)
- X (Twitter) account with API v2 access
- OpenAI account with GPT-4o access
- Gmail account for notifications
- Google account with Sheets access
- BrightData account with Google News dataset access
- Basic understanding of social media automation

Step 1: Import the Workflow
1. Copy the JSON workflow code from the provided file.
2. In n8n, click "+ Add workflow".
3. Select "Import from JSON".
4. Paste the workflow code and click "Import".
5. The workflow will appear with all nodes properly connected.

Step 2: Configure API Credentials

X (Twitter) API Setup:
1. Create an X Developer Account at developer.twitter.com.
2. Create a new app and generate API keys.
3. In n8n: Credentials → + Add credential → Twitter OAuth2 API.
4. Add your Twitter API credentials: API Key, API Secret Key, Bearer Token, Access Token, Access Token Secret.
5. Test the connection with a sample tweet.

OpenAI API Configuration:
1. Get an API key from platform.openai.com.
2. Ensure GPT-4o model access is available.
3. In n8n: Credentials → + Add credential → OpenAI API.
4. Add your OpenAI API key.
5. Verify model access in the "OpenAI Chat Model" node.

Gmail Integration:
1. Create a "Gmail OAuth2" credential.
2. Follow the OAuth setup process.
3. Grant email sending permissions.
4. Test with a sample email.

BrightData News API:
1. The workflow uses the pre-configured token: 5662edde-6735-4c5d-a6c6-693043a5a9a5.
2. Dataset ID: gd_lnsxoxzi1omrwnka5r (Google News).
3. Verify access to the Google News dataset.
4. Test the API connection.

Google Sheets Integration:
1. Create a "Google Sheets OAuth2 API" credential.
2. Complete OAuth authentication.
3. Grant read/write permissions.
4. Test the connection.

Step 3: Configure Google Sheets Integration

Create Google Sheets Structure:
- Sheet Name: "Publish Latest News on Social Media Platforms Using Keyword"
- Tab: "Data" (default)
- Columns:
  - Tweet Message: AI-generated content posted to X
  - Tweet URL: Direct link to published tweet

Sheet Configuration:
1. Create a new Google Sheet or use an existing one.
2. Add the required column headers.
3. Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit.
4. Current configured Sheet ID: 1koxNrwdeuaSBdREuKc7JQh3d9blEk0sQDJ8VgVLjPOo.

Update Workflow Settings:
1. Open the "Google Sheets" node.
2. Replace the Document ID with your Sheet ID.
3. Select your Google Sheets credential.
4. Choose the "Data" sheet/tab.
5. Verify the column mapping is correct.

Step 4: Configure Form Interface

Form Settings:
1. Open the "On form submission" node.
2. Form configuration:
   - Title: "News Publisher"
   - Description: "publish latest news to direct social media"
   - Fields: News Name (text, required); Country (dropdown: US, IN, GB, AU, required)
3. Webhook URL: Copy the webhook URL from the form trigger node. Current webhook ID: 8d320705-688c-4150-a393-cf899d2bbb52.
4. Test form accessibility and submission.

Step 5: Configure Email Notifications

Gmail Setup:
1. Open the "Gmail" node.
2. Update the recipient email: raushan@iwantonlinemarketing.com.
3. The email template includes a success confirmation, a direct tweet link, and professional formatting.
4. Test email delivery.

Step 6: Test the Workflow

Sample Test Data (use these examples for testing):

| News Name | Country | Expected Results |
| :--- | :--- | :--- |
| artificial intelligence | US | Latest AI news with US-specific hashtags |
| cricket world cup | IN | Sports news with India-focused content |
| brexit update | GB | UK political news with British hashtags |
| bushfire news | AU | Australian environmental news |

Testing Process:
1. Activate the workflow (toggle switch).
2. Navigate to the webhook form URL.
3. Submit test data.
4. Monitor execution progress: news fetching (30-60 seconds), AI content generation (10-15 seconds), X publishing (5-10 seconds), sheet update and email (5 seconds).
5. Verify results in all platforms.

📖 Usage Guide

Using the Form Interface
1. Navigate to the webhook URL provided by the form trigger.
2. Enter a news keyword or topic (e.g., "climate change", "stock market", "technology").
3. Select the target country from the dropdown.
4. Click submit and wait for processing.
5. Check email for the success notification with the tweet link.
Example Inputs to Test

| News Name | Country | Expected |
| :--- | :--- | :--- |
| "artificial intelligence breakthrough" | "US" | Latest AI developments with tech hashtags |
| "football premier league" | "GB" | UK football news with sports hashtags |
| "stock market updates" | "IN" | Indian market news with finance hashtags |
| "hollywood movies" | "AU" | Entertainment news with Australian perspective |

Country-Specific Considerations

- **United States (US)**: Focus on national news and global impact. Hashtags: #USA, #American, #Breaking, #News. Time zone considerations for optimal posting.
- **India (IN)**: Emphasis on regional relevance. Hashtags: #India, #Indian, #News, #Breaking. Cultural context in content generation.
- **United Kingdom (GB)**: British perspective and terminology. Hashtags: #UK, #British, #News, #Breaking. Focus on European context.
- **Australia (AU)**: Australian angle and regional focus. Hashtags: #Australia, #Australian, #News, #Breaking. Pacific region context.

📊 Reading the Results

Google Sheets Data: the output sheet contains
- Complete tweet content with hashtags and formatting
- Direct tweet URLs for easy access and sharing
- A chronological record of all published content
- An audit trail for content management

Email Notifications: success emails include
- Confirmation that content was published
- A direct link to view the tweet
- Professional formatting for easy reference

X (Twitter) Posts: published content features
- AI-optimized messaging within the 260 character limit
- Relevant hashtags based on topic and country
- An engaging format designed for social media
- A professional tone suitable for news sharing

🔧 Customization Options

Expanding Social Media Platforms: add more platforms to the publishing workflow.

    // Add LinkedIn publishing
    {
      "node": "LinkedIn",
      "type": "n8n-nodes-base.linkedin",
      "parameters": {
        "text": "={{ $json.output }}",
        "additionalFields": {}
      }
    }

    // Add Facebook posting
    {
      "node": "Facebook",
      "type": "n8n-nodes-base.facebook",
      "parameters": {
        "pageId": "YOUR_PAGE_ID",
        "message": "={{ $json.output }}"
      }
    }
by explorium
Explorium Prospects Search Chatbot Template

Download the following json file and import it to a new n8n workflow: mcp_to_prospects_to_csv.json

Overview

This n8n workflow creates a chatbot that understands natural language requests for finding business prospects and automatically:

1. Interprets your query using AI (Claude Sonnet 3.7)
2. Converts it to proper Explorium API filters
3. Validates the API request structure
4. Fetches prospect data from Explorium
5. Exports results as a downloadable CSV file

Perfect for sales teams, recruiters, and business development professionals who need to quickly find and export targeted prospect lists without learning complex API syntax.

Key Features

- **Natural Language Interface**: Simply describe who you're looking for in plain English
- **Smart Query Translation**: AI converts your request to valid API parameters
- **Built-in Validation**: Ensures API calls meet Explorium's requirements
- **Error Recovery**: Automatically retries with corrections if validation fails
- **Pagination Support**: Handles large result sets automatically
- **CSV Export**: Clean, formatted output ready for CRM import
- **Conversation Memory**: Maintains context for follow-up queries

Example Queries

The chatbot understands queries like:

- "Find marketing directors at SaaS companies in New York with 50-200 employees"
- "Get me CTOs from fintech startups in California"
- "Show me sales managers at healthcare companies with revenue over $10M"
- "Find engineers at Microsoft with 3-5 years experience"
- "Get customer service leads from e-commerce companies in Europe"

Prerequisites

Before setting up this workflow, ensure you have:

- n8n instance with chat interface enabled
- Anthropic API key for Claude
- Explorium API credentials (Bearer token) - Get explorium api key
- Basic understanding of n8n chat workflows

Supported Filters

The chatbot can search using these criteria:

Company Filters
- **Size**: 1-10, 11-50, 51-200, 201-500, 501-1000, 1001-5000, 5001-10000, 10001+ employees
- **Revenue**: Ranges from $0-500K up to $10T+
- **Age**: 0-3, 3-6, 6-10, 10-20, 20+ years
- **Location**: Countries, regions, cities
- **Industry**: Google categories, NAICS codes, LinkedIn categories
- **Name**: Specific company names

Prospect Filters
- **Job Level**: CXO, VP, Director, Manager, Senior, Entry, etc.
- **Department**: Sales, Marketing, Engineering, Finance, HR, etc.
- **Experience**: Total months and current role duration
- **Location**: Country and region codes
- **Contact Info**: Filter by email/phone availability

Installation & Setup

Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

Step 2: Configure Anthropic Credentials
1. Click on the Anthropic Chat Model1 node
2. Under Credentials, click Create New
3. Add your Anthropic API key
4. Name: "Anthropic API"
5. Save credentials

Step 3: Configure Explorium Credentials

You'll need to set up Explorium credentials in two places:

For MCP Client:
1. Click on the MCP Client node
2. Under Credentials, create new Header Auth
3. Add your authentication header (usually Authorization: Bearer YOUR_TOKEN)
4. Save credentials

For API Calls:
1. Click on the Prospects API Call node
2. Use the same Header Auth credentials created above
3. Verify the API endpoint is correct

Step 4: Activate the Workflow
1. Save the workflow
2. Click the Active toggle to enable it
3. The chat interface will now be available

Step 5: Access the Chat Interface
1. Click on the When chat message received node
2. Copy the webhook URL
3. Access this URL in your browser to start chatting

How It Works

Workflow Architecture
1. Chat Trigger: Receives natural language queries from users
2. Memory Buffer: Maintains conversation context
3. AI Agent: Interprets queries and generates API parameters
4. Validation: Checks API structure against Explorium requirements
5. API Call: Fetches prospect data with pagination
6. Data Processing: Formats results for CSV export
7. File Conversion: Creates downloadable CSV file

Processing Flow

    User Query → AI Interpretation → Validation → API Call → CSV Export
                        ↑                 ↓
                        └── Error Correction Loop ←──┘

Validation Rules

The workflow validates that:
- Filter keys are allowed by the Explorium API
- Values match expected formats (e.g., valid country codes)
- Range filters have proper gte/lte values
- There are no duplicate values in arrays
- The required structure is maintained

Usage Guide

Basic Conversation Flow
1. Start with your query: "Find me VPs of Sales at software companies in the US"
2. Bot processes and responds: generates API filters, validates the structure, fetches data, and returns a CSV download link
3. Refine if needed: "Can you also include directors and filter for companies with 100+ employees?"
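For illustration, the refined query above might translate into a filter payload along these lines. The field names here are assumptions based on the supported-filter list, not Explorium's authoritative schema; consult the API docs for the exact keys:

```javascript
// Hypothetical filter payload generated by the AI Agent for the example query.
const filters = {
  company_size: { values: ["101-200", "201-500"] }, // "companies with 100+ employees"
  company_location: { values: ["us"] },
  job_level: { values: ["vp", "director"] },        // follow-up adds directors
  job_department: { values: ["sales"] },
  has_email: { value: true },                       // only actionable leads
};
console.log(JSON.stringify({ filters, size: 1000 }, null, 2));
```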
Query Tips
- **Be specific**: Include job titles, departments, company details
- **Use standard terms**: "CTO" instead of "Chief Technology Officer"
- **Specify locations**: Use country names or standard codes
- **Include size/revenue**: Helps narrow results effectively

Advanced Queries

Combine multiple criteria: "Find engineering managers and senior engineers at B2B SaaS companies in New York and California with 50-500 employees and revenue over $5M who have been in their role for at least 1 year"

Output Format

The CSV file includes:
- Prospect ID
- Name (first, last, full)
- Location (country, region, city)
- LinkedIn profile
- Experience summary
- Skills and interests
- Company details
- Job information
- Business ID

Troubleshooting

Common Issues

"Validation failed" errors
- Check that your query uses supported filter values
- Ensure location names are spelled correctly
- Verify company sizes/revenues match allowed ranges

No results returned
- Broaden your search criteria
- Check if the company exists in Explorium's database
- Verify filter combinations aren't too restrictive

Chat not responding
- Ensure the workflow is activated
- Check all credentials are properly configured
- Verify the webhook URL is accessible

Large result sets timing out
- Try adding more specific filters
- Limit results by location or company size
- Use the size parameter (max 10,000)

Error Messages

The bot provides clear feedback:
- **Invalid filters**: Shows which filters aren't supported
- **Value errors**: Lists correct options for each field
- **API failures**: Explains connection or authentication issues

Performance Optimization

Best Practices
1. Start broad, then narrow: Begin with basic criteria and add filters
2. Use business IDs: When targeting specific companies
3. Limit by contact info: Add has_email: true for actionable leads
4. Batch by location: Process regions separately for large searches

API Limits
- Maximum 10,000 results per search
- Pagination handles up to 100 records per page
- Rate limits apply based on your Explorium subscription

Customization Options

Modify AI Behavior: edit the AI Agent system message to
- Change response format
- Add custom filters
- Adjust interpretation logic
- Include additional instructions

Extend Functionality: add nodes to
- Send results via email
- Import directly to CRM
- Schedule recurring searches
- Create custom reports

Integration Ideas
- Connect to Slack for team queries
- Add to CRM workflows
- Create lead scoring systems
- Build automated outreach campaigns

Security Considerations
- API credentials are stored securely in n8n
- Chat sessions are isolated
- No prospect data is stored permanently
- CSV files are generated on-demand

Support Resources

For issues with:
- **n8n platform**: Check n8n documentation
- **Explorium API**: Contact Explorium support
- **Anthropic/Claude**: Refer to Anthropic docs
- **Workflow logic**: Review node configurations
by Calistus Christian
What this workflow does

Automatically triages risky AWS misconfigurations and alerts your team.

Pipeline: Security Hub or AWS Config -> EventBridge rules -> SNS (HTTP) -> n8n Webhook -> Normalize -> AI Prioritizer -> Airtable (log) -> Gmail (email)

- Normalizes incoming findings (S3 / Security Groups / IAM / RDS) into a consistent JSON.
- Uses an LLM to assign a priority (P0–P3) with rationale and remediation steps.
- Upserts the finding into Airtable (avoids duplicates).
- Emails a compact incident summary to your inbox. This can be swapped for Microsoft Teams or Slack, etc.

Category: Security / Cloud / Alerting
Time to set up: ~10–15 minutes
Difficulty: Beginner–Intermediate
Cost: Mostly free (n8n CE + AWS SNS/EventBridge; OpenAI + Airtable/Gmail as used)

What you'll need

- An n8n instance reachable over HTTP.
- AWS account (one region) with permissions to create SNS topics and EventBridge rules.
- **Security Hub** enabled (or AWS Config rules that emit compliance events).
- n8n credentials: OpenAI, Airtable, Gmail.

Nodes used

- **Webhook** (POST /aws-misconfig)
- **Code:** SNS Handler (token check, confirm/unwrap)
- **IF:** route mode === "confirm" vs notification
- **HTTP Request:** SNS SubscriptionConfirmation (GET)
- **Code:** Normalize Finding
- **Message a model:** AI Prioritizer (JSON out)
- **Airtable:** Create/Upsert
- **Gmail:** Send message
- **Edit Fields:** final JSON response

Setup steps

1. Import and activate the workflow in n8n.
2. Webhook Respond: When Last Node Finishes -> First Entry JSON.
3. Append a shared secret to the URL, e.g. ?token=MY_SUPER_TOKEN, and keep the check in the SNS Handler code node.
4. Create an SNS topic (e.g., misconfig-events) in the same region as your EventBridge rules.
5. Create EventBridge rules targeting the SNS topic:
   - Rule A (Security Hub): source = aws.securityhub, detail-type = Security Hub Findings - Imported
   - Rule B (AWS Config): source = aws.config, detail-type = Config Rules Compliance Change
6. Create an SNS subscription with Protocol = HTTP and Endpoint = your production webhook URL: http://YOUR_HOST:5678/webhook/aws-misconfig?token=MY_SUPER_TOKEN (the workflow auto-confirms the subscription on first POST).
7. Configure Airtable (Upsert on Finding ID) and Gmail recipients.
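A minimal sketch of what the SNS Handler Code node does, assuming the shared-token scheme from step 3. SNS's Type, Message, and SubscribeURL fields are standard for HTTP(S) subscriptions; the variable names here are illustrative:

```javascript
// Sketch: shared-token check, then confirm-or-unwrap the SNS payload.
const token = $json.query?.token;
if (token !== "MY_SUPER_TOKEN") {
  throw new Error("Bad or missing token"); // reject anything not from our SNS setup
}

// SNS posts the envelope as text, so the body may arrive as a string.
const sns = typeof $json.body === "string" ? JSON.parse($json.body) : $json.body;

if (sns.Type === "SubscriptionConfirmation") {
  // The IF node routes this to the HTTP Request that GETs SubscribeURL.
  return [{ json: { mode: "confirm", subscribeUrl: sns.SubscribeURL } }];
}

// Notification: the EventBridge event is a JSON string inside sns.Message.
return [{ json: { mode: "notification", event: JSON.parse(sns.Message) } }];
```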