by PUQcloud
Setting up the n8n workflow

Overview

The Docker n8n WHMCS module uses a specially designed n8n workflow to automate deployment processes. The workflow provides an API interface for the module, receives specific commands, and connects via SSH to a server with Docker installed to perform predefined actions.

Prerequisites

You must have your own n8n server. Alternatively, you can use the official n8n cloud installations available at: n8n Official Site

Installation Steps

Install the Required Workflow on n8n

You have two options:

- Option 1: Use the Latest Version from the n8n Marketplace. The latest workflow templates for our modules are available on the official n8n marketplace. Visit our profile to access all available templates: PUQcloud on n8n
- Option 2: Manual Installation. Each module version ships with a workflow template file. Import this template manually into your n8n server.

n8n Workflow API Backend Setup for WHMCS/WISECP

Configure API Webhook and SSH Access

1. Create a Basic Auth credential for the Webhook API block in n8n.
2. Create an SSH credential for accessing a server with Docker installed.

Modify Template Parameters

In the Parameters block of the template, update the following settings:

- server_domain – must match the domain of the WHMCS/WISECP Docker server.
- clients_dir – directory where user data related to Docker and disks will be stored.
- mount_dir – default mount point for the container disk (recommended not to change).

Do not modify the following technical parameters:

- screen_left
- screen_right

Deploy-docker-compose

In the Deploy-docker-compose element, you can modify the Docker Compose configuration, which is generated in the following scenarios:

- When the service is created
- When the service is unlocked
- When the service is updated

nginx

In the nginx element, you can modify the configuration parameters of the web interface proxy server.

- The main section lets you add custom parameters to the server block of the proxy server configuration file.
- The main_location section contains settings added to the location / block of the proxy server configuration. Here you can define custom headers and other parameters specific to the root location.

Bash Scripts

Docker containers and all related procedures on the server are managed by executing Bash scripts generated in n8n. These scripts return either a JSON response or a string. All scripts live in elements connected directly to the SSH element. You have full control over every script and can modify or execute it as needed.
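For orientation, this is the kind of JSON a provisioning script might hand back on success. The field names here are purely illustrative assumptions, not the module's actual schema:

```json
{
  "status": "success",
  "message": "Container deployed",
  "data": {
    "container": "user123_app",
    "mount_point": "/mnt/user123/disk"
  }
}
```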
by Jez
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Uncover new business leads with this AI-Powered Prospect Discovery Agent! This n8n workflow acts as a specialized intelligent assistant that, given a business type and location, uses multiple search strategies to identify a list of potential prospect companies and their websites.

Stop manually trawling through search results! This agent automates the initial phase of lead generation by:

- Understanding your target business profile (type, location, keywords).
- Strategically using web search tools (Brave Search, Google Gemini Search) to find relevant businesses.
- Performing quick validations to confirm relevance.
- Returning a clean, structured JSON list of prospect names and their website URLs.

How it Works:

The workflow is built around an AI agent powered by Google Gemini. The agent is equipped with tools like:

- **Brave Web Search:** For broad initial sourcing of potential business candidates.
- **Google Gemini Search:** For advanced, context-aware discovery and finding businesses mentioned in various online sources.
- **Brave Local Search (Selective):** For quick verification of local presence or finding website URLs for identified names.
- **Jina AI Web Page Scraper (Very Selective):** For extremely rapid relevance checks on uncertain websites by scanning page content for keywords.

The agent's system prompt guides it to use these tools efficiently to build a list of prospects without getting bogged down in deep research on any single one at this discovery stage.

Use Cases:

- **Lead Generation:** Automatically generate lists of potential clients based on industry and location.
- **Market Research:** Identify key players or types of businesses in a specific geographical area.
- **Sales Development:** Provide SDRs with initial lists of companies to research further.
- **Called as a Sub-Workflow:** Designed to be easily integrated as a "tool" into more complex orchestrating AI agents (e.g., a BNI Pitch Planner that first needs to identify who to target).

Setup:

1. Import the workflow.
2. Configure credentials. You'll need n8n credentials for:
   - Google Gemini (for the chat model and the Gemini Search/Vertex AI Search tool).
   - Brave Search (e.g., via Smithery MCP, or adapt if you have direct API access).
   - Jina AI (for the web scraper).
   Assign these to the respective nodes.
3. Review the system prompt. The prospect_discovery_agent node contains a detailed system prompt. You can fine-tune this to adjust its search strategies or the strictness of its matching.

Inputs:

This workflow is triggered by an "Execute Workflow Trigger" node (prospect_discovery_workflow). It expects the following inputs:

- business_type (string): e.g., "artisan bakery"
- location_query (string): e.g., "Portland, Oregon"
- desired_num_prospects (number): e.g., 5
- additional_keywords (string, optional): e.g., "organic, gluten-free"

To Use (as a Sub-Workflow/Tool):

This workflow is typically called by another n8n workflow (e.g., using a "Tool Workflow" node from the Langchain nodes). The calling workflow provides the inputs listed above. The Prospect Discovery workflow then executes, and its final node (the prospect_discovery_agent) outputs a JSON array of found prospects, like:

```json
[
  { "business_name": "Rose Petal Bakery", "website_url": "https://rosepetalbakerypdx.com" },
  { "business_name": "The Daily Bread Artisans", "website_url": "https://dailybreadpdx.com" }
]
```

If no prospects are found, it returns an empty array [].
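For reference, a calling workflow might pass inputs shaped like this (the keys follow the Inputs list above; the values are illustrative):

```json
{
  "business_type": "artisan bakery",
  "location_query": "Portland, Oregon",
  "desired_num_prospects": 5,
  "additional_keywords": "organic, gluten-free"
}
```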
This template provides a powerful and focused tool for automating the initial stages of prospect identification.
by Incrementors
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

📦 Multi-Platform Price Finder: Scraping Prices with Bright Data & Telegram

An intelligent n8n automation that fetches real-time product prices from marketplaces like Amazon, Wayfair, Lowe's, and more using Bright Data's datasets, and sends promotional messages via Telegram using AI, perfect for price tracking, deal alerts, and affiliate monetization.

📋 Overview

This automation tracks product prices across top e-commerce platforms using Bright Data and sends alerts via Telegram based on the best available deals. The workflow is designed for affiliate marketers, resellers, and deal-hunting platforms that want real-time competitive pricing.

✨ Key Features

- 🔎 Multi-Platform Scraping: Supports Amazon, Wayfair, Lowe's, and more
- ⚡ Bright Data Integration: Access to structured product snapshots
- 📢 AI-Powered Alerts: Generates Telegram-ready promo messages using AI
- 🧠 Lowest Price Logic: Filters and compares products across sources
- 📈 Data Merge & Processing: Combines multiple sources into a single stream
- 🔄 Keyword-Driven Search: Searches using dynamic keywords from form input
- 📦 Scalable Design: Built to process multiple platforms simultaneously
- 🧼 Clean Output: Strips unnecessary formatting before publishing

🎯 What This Workflow Does

Input

- **Search Keywords:** User-defined keyword(s) from a form trigger
- **Platform Sources:** Wayfair, Lowe's, Amazon, etc.
- **Bright Data API Key:** Needed for authenticated scraping

Processing Steps

1. User input via the n8n form trigger (keyword-based)
2. Bright Data API trigger for each marketplace
3. Status polling: wait until the scraping snapshot is ready
4. Data retrieval: fetch JSON results from the Bright Data snapshot
5. Data cleaning & normalization: price, title, and URL are extracted
6. Merge products from all platforms
7. Find the lowest-priced product using custom JS logic (see the sketch below)
8. AI prompt generation via Claude/Anthropic
9. Telegram formatting and alert message creation

Output

- 🛍️ Product title
- 💰 Final price
- 🔗 Product URL
- ✉️ Promotional message (for Telegram/notifications)

🚀 Setup Instructions

Step 1: Import Workflow
- Open n8n > Workflows > + Add Workflow
- Import the provided JSON file

Step 2: Configure Bright Data
- Add credentials under Credentials → Bright Data API
- Set the appropriate dataset_id for each platform
- Ensure the dataset includes title, price, and url fields

Step 3: Enable Keyword Trigger
- Use the built-in Form Trigger node
- Input: a single keyword field (SearchHere)

Step 4: Telegram or AI Integration
- Modify the prompt node for your language or tone
- Add a Telegram webhook or integration where needed

📖 Usage Guide

Adding Keywords
- Trigger the form with a product keyword like iPhone 15
- Wait for the workflow to fetch the best deals and generate a Telegram message

Understanding AI-Powered Output
AI creates a short, engaging message like:
> "🔥 Deal Alert: Get the iPhone 15 for just ₹74,999! Limited stock—Check it out: [link]"

Debugging Output
- The output node shows cleaned JSON with title, price, url, and message
- If there are no valid results, a debug message is returned with sample structure info

🔧 Customization Options

Add More Marketplaces
- Clone any HTTP Request node (e.g., for Wayfair)
- Update dataset_id and the required output fields

Modify Price Logic
- Update the Code1 node to change the comparison (e.g., highest price instead of lowest)

Change Message Format
- Edit the AI Agent prompt to customize tone/language
- Add emoji, CTAs, or markdown formatting as needed

🧪 Test & Activation

- Add a few sample keywords via the form trigger
- Run manually or set up a webhook for external app input
- Check the final AI-generated message in the output node

🚨 Troubleshooting

| Issue | Solution |
|-------|----------|
| No data returned | Ensure the keyword matches real products |
| Status not "Ready" | Bright Data delay; add Wait nodes |
| Invalid API key | Check Bright Data credentials |
| AI errors | Adjust the prompt or validate input fields |

📊 Use Cases

- 💰 Affiliate Campaigns: Show the best deals across platforms
- 🛒 Deal Pages: Post live offers with product links
- 🧠 Competitor Analysis: Track cross-platform pricing
- 🔔 Alert Bots: Send real-time alerts to Telegram or Slack

✅ Quick Setup Checklist

- [x] Bright Data API credentials configured
- [x] n8n form trigger enabled
- [x] Claude or another AI model connected
- [x] All HTTP requests working
- [x] AI message formatting verified

🌐 Example Output

```json
{
  "title": "Apple iPhone 15 Pro Max",
  "price": 1199,
  "url": "https://amazon.com/iphone-15",
  "message": "🔥 Grab the Apple iPhone 15 Pro Max for just $1199! Limited deal—Check it out: https://amazon.com/iphone-15"
}
```

📬 For any questions or support, please contact: 📧 <info@incrementors.com> or fill out this form: https://www.incrementors.com/contact-us/
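As a rough sketch of the lowest-price comparison in the Code1 node (the field names title, price, and url follow the example output above; the node's actual code may differ):

```javascript
// n8n Code node sketch: pick the cheapest product across the merged platform items.
// Assumes each incoming item carries { title, price, url } as in the example output.
const products = $input.all()
  .map((item) => item.json)
  .filter((p) => typeof p.price === 'number' && p.price > 0); // drop items without a usable price

if (products.length === 0) {
  // No valid results: return a debug message with the expected structure.
  return [{ json: { error: 'No valid products found', expected: { title: '', price: 0, url: '' } } }];
}

// Reduce the list to the single lowest-priced product.
const cheapest = products.reduce((best, p) => (p.price < best.price ? p : best));
return [{ json: cheapest }];
```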
by Davide
This workflow is a highly advanced multimodal AI assistant designed to operate through WhatsApp. It can understand and respond to text, images, voice messages, and PDF documents by combining OpenAI models with smart logic that adapts to the content received.

🎯 Core Features

📥 1. Automatic Message Type Detection
Using the Input type node, the bot detects whether the user has sent:
- Text
- Voice messages
- Images
- Files (PDF)
- Other unsupported content

💬 2. Smart Text Message Handling
Text messages are processed by an OpenAI GPT-4o-mini agent with a customized system prompt. Replies are concise, accurate, and formatted for mobile readability.

🖼️ 3. Image Analysis & Description
Images are downloaded, converted to base64, and analyzed by an image-aware AI model. The output is a rich, structured description, designed for visually impaired users or visual content interpretation.

🎙️ 4. Voice Message Transcription & Reply
Audio messages are downloaded and transcribed using OpenAI Whisper. The transcribed text is analyzed and answered by the AI. Optionally, the AI reply can be converted back to voice using OpenAI's text-to-speech and sent as an audio message.

📄 5. PDF Document Extraction & Summary
Only PDFs are allowed (filtered via MIME type). The document's content is extracted and combined with the user's message. The AI then provides a relevant summary or answer.

🧠 6. Contextual Memory
Each user has a personalized session ID with a memory window of 10 interactions. This ensures a more natural and contextual conversation flow.

How It Works

This workflow handles incoming WhatsApp messages and processes different types of inputs (text, audio, images, and PDF documents) using AI-powered analysis. Here's how it functions:

1. **Trigger:** The workflow starts with the **WhatsApp Trigger** node, which listens for incoming messages (text, audio, images, or documents).
2. **Input Routing:** The **Input type** (Switch) node checks the message type and routes it to the appropriate processing branch (sketched after this section):
   - Text: Forwards the message directly to the AI agent for response generation.
   - Audio: Downloads the audio file, transcribes it using OpenAI, and sends the transcription to the AI agent.
   - Image: Downloads the image, analyzes it with OpenAI's GPT-4 model, and generates a detailed description.
   - PDF Document: Downloads the file, extracts the text, and processes it with the AI agent.
   - Unsupported formats: Sends an error message if the input is not supported.
3. **AI Processing:** The **AI Agent1** node, powered by OpenAI, processes the input (text, transcribed audio, image description, or PDF content) and generates a response.
4. **Response Handling:**
   - For audio inputs, the AI's response is converted back into speech (using OpenAI's TTS) and sent as a voice message.
   - For other inputs, the response is sent as a text message via WhatsApp.
5. **Memory:** The **Simple Memory** node maintains conversation context for follow-up interactions.

Setup Steps

To deploy this workflow in n8n, follow these steps:

1. **Configure WhatsApp API credentials:** Set up WhatsApp Business API credentials (Meta Developer Account) and add them in the WhatsApp Trigger, Get Image/Audio/File URL, and Send Message nodes.
2. **Set up the OpenAI integration:** Provide an OpenAI API key in the Analyze Image, Transcribe Audio, Generate Audio Response, and AI Agent1 nodes.
3. **Adjust input handling (optional):** Modify the Switch node ("Input type") to handle additional message types if needed. Update the "Only PDF File" IF node to support other document formats.
4. **Test & deploy:** Activate the workflow and test with different message types (text, audio, image, PDF). Ensure responses are correctly generated and sent back via WhatsApp.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
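For intuition, the routing decision made by the Input type Switch node can be pictured as code. This is only a sketch: it assumes the standard WhatsApp Cloud API payload, where each incoming message carries a type field, and the exact field paths in the trigger output may differ:

```javascript
// Sketch of the "Input type" decision (the actual workflow uses a Switch node, not code).
const message = $json.messages?.[0] ?? {};

switch (message.type) {
  case 'text':
    return [{ json: { route: 'text', body: message.text?.body } }];
  case 'audio':
    return [{ json: { route: 'audio', mediaId: message.audio?.id } }];
  case 'image':
    return [{ json: { route: 'image', mediaId: message.image?.id } }];
  case 'document':
    // Only PDFs pass the "Only PDF File" check (filtered via MIME type).
    if (message.document?.mime_type === 'application/pdf') {
      return [{ json: { route: 'pdf', mediaId: message.document.id } }];
    }
    return [{ json: { route: 'unsupported' } }];
  default:
    return [{ json: { route: 'unsupported' } }];
}
```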
by Gede Suparsa
This template demonstrates how to provide an interactive chatbot for your work history based on your CV. Unanswered questions and follow-up email contacts are sent to you via Telegram.

Use case: link to it from your profile to not only show off your AI workflow skills but also to provide an interactive chatbot about your work history for prospective employers.

Good to Know

It requires access to an OpenAI API key (free for low usage) and setting up a bot in Telegram (free).

How it Works

- The n8n built-in chat node, hosted on n8n services, provides the chat interface.
- You upload your CV (either exported from LinkedIn or prepared yourself) to Microsoft OneDrive, along with a simple text file with some other information about you.
- On each chat interaction, the PDF and text file are used as tools to give the chatbot context for its responses.
- If a question cannot be answered reliably, a sub-workflow is called to capture that question and send it to you as a Telegram message.
- If the person chatting supplies their email address, it is sent to you via a Telegram message along with any other information the user provides.

How to use

1. After importing the template, create the sub-workflows so that they can be used as tools by the AI node. Don't forget to add the Execute Sub-workflow trigger.
2. Set up credentials for OpenAI, OneDrive, and Telegram.
3. Upload your CV and text-file summary to OneDrive and replace the document IDs in the get_documents sub-workflow.
4. Activate the workflow so that the publicly available chat is generated on n8n.
by Budi SJ
Automated Financial Reporting Using Google Vision OCR, Telegram & Google Sheets

This workflow automates the process of recording financial transactions from photos of receipts or shopping receipts. Users simply send an image of the receipt via Telegram. The image is processed using the Google Vision API to detect text, which is then extracted and structured by an LLM via OpenRouter. The final result is saved to Google Sheets and also shown to the user via the Telegram bot.

🧾 Google Sheets Template

Create a Google Sheet using this template: Financial Reporting

🛠️ Key Features

- The workflow starts when a user sends a photo of a receipt to the Telegram bot.
- The image is converted to text using the Google Vision API's OCR.
- Data processing with an LLM (OpenRouter) identifies and structures transaction elements such as: date, vendor name & address, receipt/invoice number, item list (product name, quantity, unit price, total), and transaction category.
- Cleaned and structured data is automatically recorded to Google Sheets, one row per item.
- The system also sends a summary of the recorded results in an easy-to-read text format.
- Users can also send text messages to the bot to query stored transaction data, which are answered by a Google Sheets-based AI agent.

🔧 Requirements

- Active Telegram bot + API token
- Google Vision API key
- OpenRouter account + API key
- Google Sheets connected to n8n

🧩 Setup Instructions

1. Replace all API keys and tokens with your own in the relevant nodes:
   - Google Vision API key: set in the 'Set Vision API' node.
   - Telegram bot token: set in the 'Set Telegram Token' node and all Telegram nodes.
   - OpenRouter API key: set in all OpenRouter nodes.
   - Google Sheets: connect your own Google Sheets credential.
2. Use the provided Google Sheets template or your own.
3. Activate the workflow after configuration.
4. (Optional) Review the sticky notes for step-by-step explanations.
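The OCR step boils down to a single Vision API text-detection call per photo. A minimal sketch of that call, assuming the image has already been fetched from Telegram and base64-encoded (base64Image and visionApiKey are placeholders for values set earlier in the workflow):

```javascript
// Sketch of the Google Vision OCR request sent for each receipt photo.
const body = {
  requests: [
    {
      image: { content: base64Image }, // base64-encoded receipt photo from Telegram
      features: [{ type: 'TEXT_DETECTION' }],
    },
  ],
};

const res = await fetch(
  `https://vision.googleapis.com/v1/images:annotate?key=${visionApiKey}`,
  { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify(body) },
);
const data = await res.json();

// TEXT_DETECTION returns the full recognized text here; this string is what the LLM structures.
const ocrText = data.responses?.[0]?.fullTextAnnotation?.text ?? '';
```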
by Anna Bui
🎯 LinkedIn ICP Lead Qualification Automation

Automatically identify and qualify ideal customer prospects from LinkedIn post reactions using AI-powered profile analysis and intelligent data enrichment. Perfect for sales teams and marketing professionals who want to convert LinkedIn engagement into qualified leads without manual research. This workflow transforms post reactions into actionable prospect data with AI-driven ICP classification.

Good to know

- **LinkedIn Safety:** Only use cookie-free Apify actors to avoid account detection and suspension risks.
- **Daily Processing Limits:** Scrape a maximum of 1 page of reactions per day (50-100 profiles) to stay under LinkedIn's radar.
- Apify actors cost approximately $0.01-0.05 per profile scraped; budget accordingly for daily processing.
- Includes intelligent rate limiting to prevent API restrictions and maintain LinkedIn account safety.
- AI classification requires a clear definition of your Ideal Customer Profile criteria.
- Processing too many profiles or running too frequently will trigger LinkedIn's anti-scraping measures.
- Always monitor your LinkedIn account health and Apify usage patterns for warning signs.

How it works

1. Scrapes LinkedIn post reactions using Apify's specialized actor to identify engaged users.
2. Extracts and cleans profile data including names, job titles, and LinkedIn URLs.
3. Checks against existing Airtable records to prevent duplicate processing and save costs (see the sketch below).
4. Creates new prospect records with basic information for tracking purposes.
5. Enriches profiles with comprehensive LinkedIn data including company details and experience.
6. Aggregates and formats profile data for AI analysis and classification.
7. Uses AI to analyze prospects against your ICP criteria with detailed reasoning.
8. Updates records with ICP classification results and extracted email addresses.
9. Implements smart batching and delays to respect API rate limits throughout the process.

How to use

- **IMPORTANT:** Select cookie-free Apify actors only to avoid LinkedIn account suspension.
- Set up Apify API credentials in both HTTP Request nodes for safe LinkedIn scraping.
- Configure Airtable OAuth2 authentication and select your prospect tracking base.
- Replace the LinkedIn post URL with your target post in the initial scraper node.
- **Daily Usage:** Process only 1 page of reactions per day (typically 50-100 profiles) maximum.
- Customize the AI classification prompt with your specific ICP criteria and job titles.
- Test with a small batch first to verify the setup and monitor both API costs and LinkedIn account health.
- Schedule the workflow to run daily rather than processing large batches, to maintain account safety.

Requirements

- Apify account with API access and sufficient credits for profile scraping
- Airtable account with OAuth2 authentication configured
- OpenAI or compatible AI model credentials for prospect classification
- LinkedIn post URL with reactions to analyze (minimum 10+ reactions recommended)
- Clear definition of your Ideal Customer Profile criteria for accurate AI classification

Customising this workflow

- **Safety First:** Always verify Apify actors are cookie-free before configuring, to protect your LinkedIn account.
- Modify the ICP classification criteria in the AI prompt to match your specific target customer profile.
- Set up daily scheduling (not hourly/frequent) to respect LinkedIn's usage patterns and avoid detection.
- Adjust rate-limiting delays based on your comfort level with LinkedIn scraping frequency.
- Add additional data fields to the Airtable schema for storing custom prospect information.
- Integrate with CRM systems like HubSpot or Salesforce for automatic lead import.
- Set up Slack notifications for new qualified prospects or daily summary reports.
- Create email marketing sequences in tools like Mailchimp for nurturing qualified leads.
- Add lead scoring based on company size, industry, or engagement level for prioritization.
- Consider rotating between different LinkedIn posts to diversify your prospect sources while maintaining daily limits.
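The duplicate check before enrichment amounts to a set-membership test. A sketch in n8n Code-node style, assuming each profile carries a linkedin_url field and an earlier node returns the existing Airtable records (the node and field names are illustrative, not the workflow's actual ones):

```javascript
// Drop profiles already present in Airtable so they are not re-enriched (saves Apify credits).
const existing = new Set(
  $('Search Airtable Records').all().map((r) => r.json.linkedin_url),
);

const newProspects = $input.all().filter(
  (item) => !existing.has(item.json.linkedin_url),
);
return newProspects; // only unseen profiles continue to enrichment and AI classification
```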
by Muhammad Nouman
How it works

This workflow turns a Google Drive folder into a fully automated YouTube publishing pipeline. Whenever a new video file is added to the folder, the workflow generates all YouTube metadata using AI, uploads the video to your YouTube channel, deletes the original file from Drive, sends a Telegram confirmation, and can optionally post to Instagram and Facebook using permanent system tokens.

High-level flow:

1. Detects new video uploads in a specific Google Drive folder.
2. Downloads the file and uses AI to generate:
   - a polished first-person YouTube description
   - an SEO-optimized YouTube title
   - high-ranking YouTube tags
3. Uploads the video to YouTube with the generated metadata.
4. Deletes the original Drive file after upload.
5. Sends a Telegram notification with video details.
6. (Optional) Posts to Instagram & Facebook using permanent system user tokens.

Set up steps

Setup usually takes a few minutes.

1. Add Google Drive OAuth2 credentials for the trigger and download/delete nodes.
2. Add your OpenAI (or Gemini) API credentials for title/description/tag generation.
3. Add YouTube OAuth2 credentials in the YouTube Upload node.
4. Add Facebook/Instagram Graph API credentials if enabling cross-posting.
5. Replace placeholder IDs (Drive folder ID, Page ID, IG media endpoint).
6. Review the sticky notes in the workflow; they contain setup guidance and token info.
7. Activate the Google Drive trigger to start automated uploads.
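The AI step's output can be pictured as a small metadata object handed to the YouTube Upload node. An illustrative shape only, not the workflow's exact schema:

```json
{
  "title": "How I Automated My Entire YouTube Pipeline with n8n",
  "description": "In this video I walk through how I turned a simple Google Drive folder into a hands-off publishing pipeline...",
  "tags": ["n8n", "automation", "youtube upload", "google drive", "ai workflow"]
}
```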
by Airtop
Extract Facebook Group Posts with Airtop

Use Case

Extracting content from Facebook Groups allows community managers, marketers, and researchers to gather insights, monitor discussions, and collect engagement metrics efficiently. This automation streamlines the process of retrieving non-sponsored post data from group feeds.

What This Automation Does

This automation extracts key post details from a Facebook Group feed using the following input parameters:

- **Facebook Group URL:** The URL of the Facebook Group feed you want to scrape.
- **Airtop Profile:** The name of your Airtop Profile authenticated to Facebook.

It returns up to 5 non-sponsored posts with the following attributes for each:

- Post text
- Post URL
- Page/profile URL
- Timestamp
- Number of likes
- Number of shares
- Number of comments
- Page or profile details
- Post thumbnail

How It Works

1. Form Trigger: Collects the Facebook Group URL and Airtop Profile via a form.
2. Browser Automation: Initiates a new browser session using Airtop, navigates to the provided Facebook Group feed, and uses an AI prompt to extract post data, including interaction metrics and profile information.
3. Structured Output: The results are returned in a defined JSON schema, ready for downstream use.

Setup Requirements

- Airtop API Key (free to generate).
- An Airtop Profile logged into Facebook.

Next Steps

- **Integrate With Analytics Tools:** Feed the output into dashboards or analytics platforms to monitor community engagement.
- **Automate Alerts:** Trigger notifications for posts matching certain criteria (e.g., high engagement, keywords).
- **Combine With Comment Automation:** Extend this to reply to posts or engage with users using other Airtop automations.

Read more about how to extract posts from Facebook groups.
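Given the attributes listed above, each extracted post comes back in a shape along these lines (the field names and values are illustrative; the actual schema is the one defined in the workflow):

```json
{
  "post_text": "Looking for recommendations on...",
  "post_url": "https://www.facebook.com/groups/123456789/posts/987654321",
  "profile_url": "https://www.facebook.com/some.profile",
  "timestamp": "2024-05-01T14:32:00Z",
  "likes": 42,
  "shares": 3,
  "comments": 17,
  "profile_details": "Group member, local business owner",
  "thumbnail_url": "https://scontent.example/thumb.jpg"
}
```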
by Manuel
Who is this template for?

This workflow template is ideal for anyone using Notion for project management and Clockify for time tracking. The workflow automatically adds all new clients from Notion to Clockify.

How it works

- Scans your Notion client table every minute for new clients.
- Adds all new clients to your Clockify workspace.

Set up Steps

1. Set up the Notion trigger node by adding your Notion API credentials as described in the n8n Notion docs.
2. Go to your Notion clients page/table and give your integration permission to access the data on this page.
3. Go back to n8n and select your Notion client page in the Notion trigger node.
4. Set up the Clockify node by adding your Clockify API credentials as described in the n8n Clockify docs, select your Clockify workspace, and map your client name column from Notion to the Clockify "Client Name" field.
by Jay Emp0
Overview

Fetches multiple Google Analytics GA4 metrics daily, posts them to Discord, and updates previous days' entries as GA data finalizes over seven days.

Benefits

- Automates daily traffic reporting.
- Maintains a single message per day, avoiding channel clutter.
- Provides near-real-time updates by editing prior messages.

Use Case

Teams tracking website performance via Discord (or any chat tool) without manual copy-paste: marketing managers, community moderators, growth hackers. If your manager asks you for a daily marketing report every morning, you can now automate it.

Notes

- The Google Analytics node in n8n does not provide real-time data; GA keeps updating previous values for the next 7 days, so the workflow revises already-posted reports.
- The Discord node in n8n cannot update an existing message by message ID, so the Discord API is called directly for edits.
- Most businesses use multiple Google Analytics properties across their digital platforms.

Core Logic

1. A Schedule Trigger fires once a day.
2. The Google Analytics node retrieves metrics for the date range (past 7 days).
3. An Aggregate node collates all records.
4. The Discord node fetches the last 10 messages in the broadcast channel.
5. A Code node maps the existing Discord messages to the Google Analytics data using the date fields (see the sketch below).
6. For each GA record:
   - If no message exists → send a new POST to the Discord channel.
   - If a message exists and the metrics changed → send a PATCH to update the existing Discord message.
7. Batch loops + Wait nodes prevent rate limiting.

Setup Instructions

1. Import the workflow JSON into n8n.
2. Follow the n8n guide to create a Google Analytics OAuth2 credential with access to all required GA accounts.
3. Follow the n8n guide to create a Discord OAuth2 credential for "Get Messages" operations.
4. Follow the Discord guide to create an HTTP Header Auth credential named "Discord-Bot" with header key Authorization and value Bot <your-bot-token>.
5. In the two Set nodes at the beginning of the flow, assign discord_channel_id and google_analytics_id.
   - Get your Discord channel ID by sending a message in your Discord channel and copying its message link. The link has the form https://discord.com/channels/server_id/channel_id/message_id; you want the channel_id, the number in the middle.
   - Find your Google Analytics ID by going to the Google Analytics dashboard, opening the properties list in the top right, and copying that number into the flow.
6. Adjust the Schedule Trigger times to your preferred report hour.
7. Activate the workflow.

Customization

Replace the Discord HTTP Request nodes with Slack, ClickUp, WhatsApp, or Telegram integrations by swapping the POST/PATCH endpoints and authentication.
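The create-vs-edit decision in the Code node can be sketched as follows. This assumes each GA record carries a date field and each posted report embeds its date in the message text; the node names and date format are illustrative assumptions, while the Discord routes named in the comments are the standard create/edit message endpoints:

```javascript
// Index existing Discord messages by the report date embedded in their content,
// then decide, per GA record, whether to create a new message or edit an existing one.
const byDate = new Map();
for (const msg of $('Get Messages').all().map((m) => m.json)) {
  const match = msg.content?.match(/\d{4}-\d{2}-\d{2}/); // date printed in the report text
  if (match) byDate.set(match[0], msg);
}

return $input.all().map((item) => {
  const record = item.json; // one aggregated GA row, keyed by `date`
  const existing = byDate.get(record.date);
  if (!existing) {
    // No report for this date yet -> POST /channels/{channel_id}/messages
    return { json: { action: 'create', record } };
  }
  // Report exists -> PATCH /channels/{channel_id}/messages/{message_id},
  // issued only when the metrics actually changed (checked downstream).
  return { json: { action: 'update', message_id: existing.id, record } };
});
```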
by Marcial Ambriz
Remixed from Solomon's "Backup your workflows to GitHub" template. Check out his other templates.

How it works

This workflow backs up your typebots to GitHub. It uses the Typebot API to export all typebots, then loops over the data and checks GitHub to see whether a file exists that uses the credential's ID. Once checked, it will:

- update the file on GitHub if it exists;
- create a new file if it doesn't exist;
- ignore it if it's unchanged.

In addition, it checks whether any flows have been deleted from the Typebot workspace. If a flow no longer exists in the workspace, the corresponding file is removed from the repository to keep everything in sync.

Who is this for?

People wanting to back up their typebots (flows) outside the server, for safety purposes or to migrate to another server.
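The sync logic reduces to a three-way diff between the exported typebots and the files already in the repository. A simplified sketch under the assumption that both sides can be reduced to plain { id, content } lists (this is not the actual node code):

```javascript
// Decide, for each typebot, whether its backup file needs creating, updating, or skipping,
// and flag repository files whose typebot no longer exists in the workspace.
function planSync(typebots, repoFiles) {
  // typebots: [{ id, content }] from the Typebot API export
  // repoFiles: [{ id, content }] from the GitHub repository
  const repoById = new Map(repoFiles.map((f) => [f.id, f]));
  const liveIds = new Set(typebots.map((t) => t.id));

  const plan = typebots.map((t) => {
    const existing = repoById.get(t.id);
    if (!existing) return { id: t.id, action: 'create' };
    if (existing.content !== t.content) return { id: t.id, action: 'update' };
    return { id: t.id, action: 'ignore' }; // unchanged
  });

  // Files left in the repo for flows deleted from the workspace.
  for (const f of repoFiles) {
    if (!liveIds.has(f.id)) plan.push({ id: f.id, action: 'delete' });
  }
  return plan;
}
```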