by PUQcloud
## Overview

The Docker InfluxDB WHMCS module uses a specially designed workflow for n8n to automate deployment processes. The workflow provides an API interface for the module, receives specific commands, and connects via SSH to a server with Docker installed to perform predefined actions.

## Prerequisites

You must have your own n8n server. Alternatively, you can use the official n8n cloud installations available at: n8n Official Site

## Installation Steps

### Install the Required Workflow on n8n

You have two options:

**Option 1: Use the Latest Version from the n8n Marketplace**
The latest workflow templates for our modules are available on the official n8n marketplace. Visit our profile: PUQcloud on n8n

**Option 2: Manual Installation**
Each module version comes with a workflow template file. You need to manually import this template into your n8n server.

## n8n Workflow API Backend Setup for WHMCS/WISECP

### 1. Configure API Webhook and SSH Access

- Create a Basic Auth Credential for the Webhook API block in n8n.
- Create an SSH Credential for accessing a server with Docker installed.

### 2. Modify Template Parameters

In the Parameters block of the template, update the following settings:

- **server_domain** – must match the domain of the WHMCS/WISECP Docker server.
- **clients_dir** – directory where user data related to Docker and disks will be stored.
- **mount_dir** – default mount point for the container disk (recommended not to change).

Do not modify the following technical parameters: screen_left, screen_right.

### Deploy-docker-compose

In the Deploy-docker-compose element, you can modify the Docker Compose configuration, which is generated in the following scenarios:

- When the service is created
- When the service is unlocked
- When the service is updated

### nginx

In the nginx element, you can modify configuration parameters of the web interface proxy server:

- The **main** section allows you to add custom parameters to the server block in the proxy server configuration file.
- The **main_location** section contains settings that will be added to the location / block of the configuration. Here you can define custom headers and parameters.

### Bash Scripts

Management of Docker containers and related procedures is done by executing Bash scripts generated in n8n. These scripts return either JSON or plain strings. All scripts are located in elements directly connected to the SSH element. You have full control over any script and can modify or execute it as needed.
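As an illustration of the output format these scripts use, here is a minimal sketch of a Bash script that reports a container's state as JSON; the container name and JSON fields are placeholders for illustration, not taken from the module:

```bash
#!/bin/bash
# Minimal sketch: report a Docker container's state as JSON.
# "my_container" and the field names are illustrative placeholders.
CONTAINER="my_container"

STATE=$(docker inspect --format '{{.State.Status}}' "$CONTAINER" 2>/dev/null)

if [ -n "$STATE" ]; then
  echo "{\"status\": \"success\", \"container\": \"$CONTAINER\", \"state\": \"$STATE\"}"
else
  echo "{\"status\": \"error\", \"message\": \"container not found\"}"
fi
```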
by PUQcloud
## Overview

The Docker NextCloud WHMCS module leverages a sophisticated workflow for n8n, designed to automate the deployment, configuration, and management of NextCloud and NextCloud Office services. Through its API interface, the workflow securely receives commands and orchestrates predefined tasks via SSH on your Docker-hosted server, ensuring streamlined operations and efficient management.

## Prerequisites

- You must deploy your own dedicated n8n server to manage workflows effectively. Alternatively, you may opt for the official n8n cloud-based solutions accessible via: n8n Official Site
- Your Docker server must be accessible via SSH with the necessary permissions.

## Installation Steps

### Install the Required Workflow on n8n

You can select from two installation options:

**Option 1: Use the Latest Version from the n8n Marketplace**
The latest workflow templates are continuously updated and available on the n8n marketplace. Explore all templates provided by PUQcloud here: PUQcloud on n8n

**Option 2: Manual Installation**
Each module version includes a bundled workflow template file. Import this workflow file directly into your n8n server manually.

## n8n Workflow API Backend Setup for WHMCS

### Configure API Webhook and SSH Access

- Create a secure Basic Auth Credential for Webhook API interactions within n8n.
- Create an SSH Credential within n8n to securely communicate with the Docker host.

### Modify Template Parameters

Adjust the following critical parameters to match your deployment:

- **server_domain** – set this to the domain of your WHMCS Docker server.
- **clients_dir** – the directory where user data and related resources will be stored.
- **mount_dir** – the standard mount point for container storage (recommended to remain unchanged).

Do not alter the following technical parameters, to avoid workflow disruption: screen_left, screen_right.

### Deploy-docker-compose Configuration

Fine-tune the Docker Compose configuration generated for these operational scenarios:

- Initial service provisioning and setup
- Service suspension and subsequent unlocking
- Service configuration updates
- Routine service maintenance tasks

### nginx Configuration Management

Customize the proxy server configuration using the dedicated nginx workflow element:

- **main**: define custom parameters within the server configuration block.
- **main_location**: set custom headers, caching policies, and routing rules for the root location.

### Bash Script Automation

Docker container management and related server tasks are automated through dynamically generated Bash scripts within n8n. Scripts execute securely via SSH and return responses in JSON or plain-text formats for easy parsing and logging. Scripts are linked directly to the SSH action elements, and you retain complete flexibility to adapt or extend them to meet your operational requirements.
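To picture where the main and main_location sections land, here is a minimal nginx sketch; the specific directives and values are illustrative assumptions, not the module's defaults:

```nginx
# Illustrative only: "main" entries land in the server block,
# "main_location" entries inside "location /".
server {
    server_name cloud.example.com;      # main: custom server-level parameter
    client_max_body_size 512M;          # main: e.g. allow large NextCloud uploads

    location / {
        proxy_set_header X-Real-IP $remote_addr;  # main_location: custom header
        proxy_read_timeout 300s;                  # main_location: custom parameter
    }
}
```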
by Juan Carlos Cavero Gracia
## Description

This n8n automation template provides an end-to-end solution for generating a series of themed images for Instagram and TikTok carousels using OpenAI's GPT Image (via the image generation API) and automatically publishing them to both platforms. It uses a sequence of prompts to create a narrative or themed carousel, generating each image based on the previous one, and then posts them with an AI-generated caption.

## Who Is This For?

- **Social Media Managers:** Quickly create and schedule engaging image carousels for Instagram and TikTok.
- **Content Creators:** Automate the visual content creation process for thematic posts or storytelling carousels.
- **Digital Marketers:** Efficiently produce visual assets for campaigns that require sequential imagery.
- **Small Businesses:** Generate unique promotional content for social media without needing advanced design skills.

## What Problem Does This Workflow Solve?

Manually creating a series of related images for a carousel and then publishing them across multiple platforms can be repetitive and time-consuming. This workflow addresses these issues by:

- **Automating Image Generation:** Uses OpenAI to generate a sequence of 5 images, where each new image is an evolution of the previous one guided by a new prompt.
- **Automating Caption Generation:** Leverages OpenAI (GPT) to create a suitable description/caption for the carousel based on the image prompts.
- **Streamlining Multi-Platform Publishing:** Automatically uploads the generated image carousel and caption to both Instagram and TikTok.
- **Reducing Manual Effort:** Significantly cuts down the time spent designing individual images and uploading them manually.
- **Ensuring Visual Cohesion:** The sequential image generation method (editing the previous image) helps maintain a consistent style or narrative across the carousel.

## How It Works

1. **Trigger:** The workflow is initiated manually (it can be adapted to a schedule or webhook).
2. **Define Prompts:** Five distinct prompts are pre-set within the workflow to guide the generation of each image in the carousel.
3. **AI Caption Generation:** OpenAI (GPT-4.1) generates a concise (≤ 90 characters for TikTok) description for the social media posts based on all five image prompts.
4. **Sequential AI Image Generation:**
   - Image 1: OpenAI's image generation API (specified as gpt-image-1) creates the first image based on prompt1.
   - Images 2–5: For each subsequent image, the workflow uses the OpenAI image edits API, taking the previously generated image and a new prompt (prompt2 for image 2, prompt3 for image 3, and so on) to create the next image in the sequence.
   - Images are converted from base64 JSON to binary format.
5. **Content Aggregation:** The five generated binary image files (named photo1 through photo5) are merged.
6. **Multi-Platform Distribution:**
   - The merged images and the AI-generated description are sent to api.upload-post.com for publishing as a carousel to Instagram.
   - The same content is sent to api.upload-post.com for publishing as a carousel to TikTok, with an option to automatically add music. The TikTok description is truncated if it exceeds 90 characters.

## Setup

1. **Accounts & API Keys:** You will need:
   - An n8n instance
   - An OpenAI API key
   - An API key for upload-post.com
2. **Configure Credentials:**
   - Add your OpenAI API key to the "OpenAI" credentials in n8n. This is used by the "Generate Description for Tiktok and Instagram" node and the HTTP Request nodes calling the OpenAI image generation/edit APIs.
In the "POST TO INSTAGRAM" and "POST TO TIKTOK" nodes, replace "Apikey add_api_key_here" with your actual upload-post.com API key. Update the user field in the "POST TO INSTAGRAM" and "POST TO TIKTOK" nodes if "upload_post" is not your user identifier for that service. Customize Prompts: Modify the five prompts (prompt1 to prompt5) in the "Set All Prompts" node to define the story or theme of your image carousel. Review Image Generation Parameters: In the "Set API Variables" node, you can adjust: size_of_image (e.g., "1024x1536" for vertical carousels). openai_image_model (ensure this matches a valid OpenAI model identifier for image generation/edits, like dall-e-2 or dall-e-3 if gpt-image-1 is a placeholder). response_format_image (should generally remain b64_json for this workflow). (Optional) TikTok Auto Music: The "POST TO TIKTOK" node has an auto_add_music parameter set to true. Change this to false if you prefer to add music manually or not at all. Requirements Accounts:** n8n, OpenAI, upload-post.com. API Keys & Credentials:** API Keys for OpenAI and https://upload-post.com. (Potentially) Paid Plans:** OpenAI and upload-post.com usage may incur costs depending on your volume and their respective pricing models. This template empowers you to automate the creation and distribution of visually consistent image carousels, saving time and enhancing your social media presence.
by Akram Kadri
## Who is this for?

This workflow template is ideal for marketers, designers, content creators, and developers who need to generate custom text-based images dynamically. Whether you want to create social media graphics, placeholder images, or text-based LinkedIn carousels, this workflow provides a simple, no-code solution using an API that requires no authentication.

## What problem does this workflow solve?

Creating text-based images often requires design software or complex integrations with graphic tools. This workflow eliminates that hassle by allowing you to generate images with custom text, font styles, colors, and background colors using a simple HTTP request. It's perfect for automating image generation without relying on external tools or manual effort.

## What this workflow does

This workflow leverages an HTTP request to a free API that generates text-based images dynamically. It enables you to:

- Define custom image text
- Set image dimensions (width x height)
- Choose a background color and text color using hex codes
- Select a font family and font size
- Specify the image format (PNG, JPG, or WebP)

The generated image can be used immediately, making it ideal for automating content creation workflows.

## Setup

1. Open the workflow in n8n.
2. Modify the Set node to define your preferred image properties:
   - text: the message displayed on the image
   - size: image dimensions (e.g., 500x300 pixels)
   - backgroundColor: hex color code for the background
   - textColor: hex color code for the text
   - fontFamily: select from available font options (e.g., Pacifico, Ubuntu)
   - fontSize: define the text size
   - type: choose the image format (PNG, JPG, or WebP)
3. Execute the workflow to generate an image.
4. The HTTP request returns the generated image, ready for use.

## How to customize this workflow

1. Adjust the Set node values to match your desired design.
2. Use dynamic data for text, allowing personalized images based on user input.
3. Automate image delivery by adding email or social media posting nodes.
4. Integrate this workflow into larger automation sequences, such as content marketing pipelines.
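As a sketch of how such a request might look, assuming a query-string style endpoint (the actual API URL is the one configured in the workflow's HTTP Request node; the host below is a placeholder):

```bash
# Illustrative request; <image-api-host> stands in for the endpoint
# configured in the workflow's HTTP Request node.
curl -o banner.png "https://<image-api-host>/image?text=Hello%20World&size=500x300&backgroundColor=1e90ff&textColor=ffffff&fontFamily=Ubuntu&fontSize=32&type=png"
```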
by Cyril Nicko Gaspar
# 🔍 Email Lookup with Google Search from Postgres Database

This n8n workflow enriches seller data stored in a Postgres database by performing automated Google search lookups. It uses Bright Data's Web Unlocker to bypass search result restrictions and the HTML Extract node to parse and extract relevant information from webpages. The main purpose of this workflow is to discover missing contact details, company domains, and secondary emails for businesses or sellers based on existing database entries.

## 🎯 Problem This Workflow Solves

Manually searching for missing seller or business details—like secondary emails, websites, or domain names—can be time-consuming and inefficient, especially for large datasets. This workflow automates the search and data enrichment process, significantly reducing manual effort while improving the quality and completeness of your seller database.

## ✅ Prerequisites

Before using this template, make sure the following requirements are met:

- ✔️ A Bright Data account with access to the Web Unlocker or Amazon Scraper API
- ✔️ A valid Bright Data API key
- ✔️ An active PostgreSQL database with seller data
- ✔️ An n8n self-hosted instance (recommended for using community nodes like n8n-nodes-brightdata)
- ✔️ The n8n-nodes-brightdata package installed (custom node for Bright Data integration)

## ⚙️ Setup Instructions

### Step 1: Prepare Your Postgres Table

Create a table in Postgres with the following structure (you can adjust field names if needed):

```sql
CREATE TABLE sellers (
  seller_id SERIAL PRIMARY KEY,
  seller_name TEXT,
  primary_email TEXT,
  company_info TEXT,
  trade_name TEXT,
  business_address TEXT,
  coc_number TEXT,
  vat_number TEXT,
  commercial_register TEXT,
  secondary_email TEXT,
  domain TEXT,
  seller_slug TEXT,
  source TEXT
);
```

### Step 2: Set Up Web Unlocker on Bright Data

1. Go to your Bright Data dashboard.
2. Navigate to Proxies & Scraping → Web Unlocker.
3. Create a new zone, selecting Web Unlocker API under Scraping Solutions.
4. Whitelist your server IP if required.

### Step 3: Generate an API Key

1. In the Bright Data dashboard, go to the API section.
2. Generate a new API key.
3. In n8n, create HTTP Request Credentials using Bearer Authentication with the API key.

### Step 4: Install the Bright Data Node in n8n

1. In your n8n self-hosted instance, go to Settings → Community Nodes.
2. Search for and install n8n-nodes-brightdata.

## 🔄 Workflow Functionality

- 🔁 **Trigger:** Can be set to run on a schedule (e.g., daily) or manually.
- 📥 **Read:** Fetches seller records from the Postgres table.
- 🌐 **Search:** Uses Bright Data to perform a Google search based on seller_name, company_info, or trade_name.
- 🧾 **Extract:** Parses the HTML content using the HTML Extract node to identify potential websites and email addresses.
- 📝 **Update:** Writes enriched data (like domain or secondary_email) back to the Postgres table.

## 💡 Use Cases

- Lead enrichment for e-commerce sellers
- Domain and contact info discovery for B2B databases
- Email and web domain verification for CRM systems
- Market research automation

## 🛠️ Customization Tips

- Enhance the parsing logic in the HTML Extract node to look for phone numbers, LinkedIn profiles, or social media links.
- Modify the search query logic to include additional parameters like location or industry for more refined results.
- Integrate additional APIs (e.g., Hunter.io, Clearbit) for email validation or social profile enrichment.
- Add filtering to skip entries that already have a domain or secondary_email.
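A minimal sketch of the write-back step, assuming the sellers table defined above (the exact UPDATE the workflow issues may differ):

```sql
-- Illustrative write-back of enriched fields to the sellers table
-- defined above; values and the exact query the workflow runs may differ.
UPDATE sellers
SET domain = 'example-seller.com',
    secondary_email = 'contact@example-seller.com',
    source = 'google_search'
WHERE seller_id = 42;
```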
by David Roberts
This workflow allows you to send multi-step email campaigns using n8n, Gmail and Google Sheets. You define a sequence of emails and a list of email addresses to send them to. The first email is sent to everyone, but the later emails in the sequence are only sent if no-one has replied to the thread yet. This means you only need to worry about replying to people who respond to your email, not chasing people who don't.

## Requirements

- A list of emails in a Google Sheet. You can define extra info that will be available to your email templates (e.g. name, company, etc.)
- A sequence of emails to send, plus how long to wait before sending each one, e.g.
  - **On day 0:** "Hi, {name} — nice to meet you at the conference. Was wondering whether {company} would be interested in a quick call about X?"
  - **On day 3:** "Hi, {name}, just wanted to check in on this. Let me know if there's any interest!"
  - **On day 7:** "{name}, just wanted to give this one last try"
- A Gmail account (although you could also swap out that part for any other email service)

## How it works

When sending the emails, n8n inserts a hidden attribute in each one that tags it as being part of the email campaign. It then regularly looks through the email threads with that tag and checks whether:

- No-one has replied yet
- It's time to send the next message

The workflow doesn't send emails on weekends.

## Notes

- This workflow is not intended for spam! Please use responsibly
- You can use this workflow for multiple different campaigns by making copies of the workflow and changing the sequence / Google Sheet that it uses
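The hidden tag mentioned above could look something like the following; this is only an illustration of the idea, not the exact marker the template inserts:

```html
<!-- Illustrative only: an invisible marker appended to the email body
     so later runs can recognize threads belonging to this campaign.
     The campaign identifier shown is a made-up example. -->
<div style="display:none">n8n-campaign:conference-outreach</div>
```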
by Don Jayamaha Jr
📡 This workflow serves as the central Alpha Vantage API fetcher for Tesla trading indicators, delivering cleaned 20-point JSON outputs for three timeframes: 15min, 1hour, and 1day.

It is required by the following agents:

- Tesla 15min, 1h, 1d Indicators Tools
- Tesla Financial Market Data Analyst Tool

✅ Requires an Alpha Vantage Premium API Key
🚀 Used as a sub-agent via webhook endpoints triggered by other workflows

## 📈 What It Does

For each timeframe (15min, 1h, 1d), this tool:

1. Triggers 6 technical indicators via Alpha Vantage: RSI, MACD, BBANDS, SMA, EMA, ADX
2. Trims the raw response to the latest 20 data points
3. Reformats the result into a clean JSON structure:

```json
{
  "indicator": "MACD",
  "timeframe": "1hour",
  "data": {
    "timestamp": "...",
    "macd": 0.32,
    "signal": 0.29
  }
}
```

4. Returns results via Webhook Respond for the calling agent

## 📂 Required Credentials

🔑 Alpha Vantage Premium API Key

- Set up under Credentials > HTTP Query Auth
- Name: Alpha Vantage Premium
- Query Param: apikey
- Get yours here: https://www.alphavantage.co/premium/

## 🛠️ Setup Steps

1. Import the workflow into n8n and name it: Tesla_Quant_Technical_Indicators_Webhooks_Tool
2. Add an HTTP Query Auth credential
   - Name: Alpha Vantage Premium
   - Param key: apikey
   - Value: your Alpha Vantage key
3. Publish and use the webhooks. This workflow exposes 3 endpoints:
   - /15minData → used by the 15m Indicator Tool
   - /1hourData → used by the 1h Indicator Tool
   - /1dayData → used by the 1d Indicator Tool
4. Connect via Execute Workflow or HTTP Request, ensuring the caller sends the webhook trigger to the correct path

## 🧱 Architecture Summary

Each timeframe section includes:

| Component | Details |
| ------------------ | --------------------------------------------- |
| 📡 Webhook Trigger | Entry node (/15minData, /1hourData, etc.) |
| 🔄 API Calls | 6 nodes fetching indicators via Alpha Vantage |
| 🧹 Formatters | JS Code nodes to clean and trim responses |
| 🧩 Merge Node | Consolidates cleaned JSONs |
| 🚀 Webhook Respond | Returns structured data to calling workflow |

## 🧾 Sticky Notes Overview

- ✅ Webhook Entry: instructions per timeframe
- ✅ API Call Summary: Alpha Vantage endpoint for each indicator
- ✅ Format Nodes: explain JSON parsing and cleaning
- ✅ Merge Logic: final output format
- ✅ Webhook Response: what gets returned to the caller

All stickies follow n8n standard color-coding:

- Blue = Webhook flow
- Yellow = API request group
- Purple = Formatters
- Green = Merge step
- Gray = Workflow overview and usage

## 🔐 Licensing & Support

© 2025 Treasurium Capital Limited Company. This agent is part of the Tesla Quant AI Trading System and protected under U.S. copyright.

For support:
🔗 Don Jayamaha – LinkedIn
🔗 n8n Creator Profile

🚀 Use this API tool to feed Tesla technical indicators into any AI or trading agent across 15m, 1h, and 1d timeframes. Required for all Tesla Quant Agent indicator tools.
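For a quick manual test, one of the three endpoints can be called directly once the workflow is active; the host is a placeholder for your n8n instance, and depending on how the calling agent triggers it, a request body may also be expected:

```bash
# Replace <your-n8n-host> with your instance hostname. The webhook paths
# (/15minData, /1hourData, /1dayData) are the ones this workflow exposes.
curl -X POST "https://<your-n8n-host>/webhook/1hourData"
```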
by ist00dent
This n8n template lets you automatically pull market data for the top cryptocurrencies from CoinGecko every hour, calculate custom volatility and market-health metrics, classify each coin's price action into buy/sell/hold/neutral signals with risk ratings, and expose both individual analyses and a portfolio summary via a webhook. It's perfect for crypto analysts, DeFi builders, or portfolio managers who want on-demand insights without writing a single line of backend code.

## 🔧 How it works

1. **Schedule Trigger** fires every hour (or whatever interval you choose).
2. **HTTP Request (CoinGecko)** fetches the top 10 coins by market cap, including 24 h, 7 d, and 30 d price change percentages.
3. **Split In Batches** ensures each coin is processed sequentially.
4. **Function (Calculate Market Metrics)** computes:
   - A weighted volatility score
   - Market-cap-to-volume ratio
   - Price-to-ATH ratio
   - Composite market score
5. **IF & Switch** nodes categorize each coin's 24 h price action (up >5%, down >5%, high volatility, or stable) and append:
   - signal (BUY/SELL/HOLD/NEUTRAL)
   - riskRating (High/Medium/Low/Unknown)
   - recommendation & investmentStrategy guidance
6. **NoOp & Merge** nodes consolidate each branch back into a single data stream.
7. **Function (Generate Portfolio Summary)** aggregates all analyses into:
   - A Markdown portfolioSummary
   - Counts of buy/sell/hold/neutral signals
   - Risk distribution
8. **Webhook Response** returns the full JSON payload with individual analyses and the summary for downstream consumers.

## 👤 Who is it for?

This workflow is ideal for:

- Crypto researchers and analysts who need scheduled market insights
- DeFi and trading bot developers looking to automate signal generation
- Portfolio managers seeking a no-code overview of top assets
- Automation engineers exploring API integration and data enrichment

## 📑 Data Structure

When you trigger the webhook, you'll receive a JSON object containing:

- individualAnalyses: array of { coin, symbol, currentPrice, priceChanges, marketMetrics, signal, riskRating, recommendation }
- portfolioSummary: Markdown report summarizing signals, risk distribution, and top opportunity
- marketSignals: counts of each signal type
- riskDistribution: counts of each risk rating
- timestamp: ISO string of the analysis time

## ⚙️ Setup Instructions

1. **Import:** In the n8n Editor, click "Import from JSON" and paste this workflow JSON.
2. **Configure Schedule:** Double-click the Schedule Trigger and set your desired interval (default: every hour).
3. **Webhook Path:** Open the Webhook node, choose a unique path (e.g., /crypto-analysis) and the "POST" method.
4. **Activate:** Save and activate the workflow.
5. **Test:** Open the webhook URL in another tab or use cURL:

```bash
curl -X POST https://<your-n8n-host>/webhook/<path>
```

You'll get back a JSON payload with both portfolioSummary and individualAnalyses.

## 📝 Tips

- **Rate-Limit Handling:** If CoinGecko returns 429, insert a Delay node (e.g., 500 ms) after the HTTP Request.
- **Batch Size:** Default is 1 coin at a time; you can bump it up to parallelize.
- **Customization:** Tweak volatility weightings or add new metrics directly in the "Calculate Market Metrics" Function node.
- **Extension:** Swap CoinGecko for another API by updating the HTTP Request URL and field mappings.
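Putting the data structure above together, a trimmed example of the response payload might look like this; the field names follow the description, while all values and nested sub-fields are illustrative:

```json
{
  "individualAnalyses": [
    {
      "coin": "Bitcoin",
      "symbol": "btc",
      "currentPrice": 67250.12,
      "priceChanges": { "24h": 2.1, "7d": -1.4, "30d": 8.9 },
      "marketMetrics": { "volatilityScore": 0.42, "marketCapToVolume": 18.7 },
      "signal": "HOLD",
      "riskRating": "Medium",
      "recommendation": "Hold and monitor volatility"
    }
  ],
  "portfolioSummary": "## Portfolio Summary ...",
  "marketSignals": { "buy": 2, "sell": 1, "hold": 5, "neutral": 2 },
  "riskDistribution": { "High": 3, "Medium": 4, "Low": 3 },
  "timestamp": "2025-01-01T12:00:00.000Z"
}
```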
by Andrey
⚠️ **DISCLAIMER:** This workflow uses the HDW LinkedIn community node, which is only available on self-hosted n8n instances. It will not work on n8n.cloud.

## Overview

This n8n workflow automates the enrichment of CRM contact data with professional insights from LinkedIn profiles. The workflow integrates with both Pipedrive and HubSpot CRMs, finding LinkedIn profiles that match your contacts and updating your CRM with valuable information about their professional background and recent activities.

## Key Features

- **Multi-CRM Support:** Works with both Pipedrive and HubSpot
- **AI-Powered Data Enrichment:** Uses an advanced AI agent to analyze and summarize professional information
- **Automated Triggers:** Activates when new contacts are added or when enrichment is requested
- **Comprehensive Profile Analysis:** Captures LinkedIn profile summaries and post activity

## How It Works

### Triggers

The workflow activates in the following scenarios:

- When a new contact is created in the CRM
- When a contact is updated in the CRM with an enrichment flag

### LinkedIn Data Collection Process

1. **Email Lookup:** First tries to find the LinkedIn profile using the contact's email
2. **Advanced Search:** If email lookup fails, uses name and company details to find potential matches
3. **Profile Analysis:** Collects comprehensive profile information
4. **Post Analysis:** Gathers and analyzes the contact's recent LinkedIn activity

### CRM Updates

The workflow updates your CRM with:

- LinkedIn profile URL
- Professional summary (skills, experience, background)
- Analysis of recent LinkedIn posts and activity

## Setup Instructions

### Requirements

- Self-hosted n8n instance with the HDW LinkedIn community node installed
- API access to OpenAI (for GPT-4o)
- Pipedrive and/or HubSpot account
- HDW API key: https://app.horizondatawave.ai

### Installation Steps

1. **Install the HDW LinkedIn Node:**

```bash
npm install n8n-nodes-hdw
```

   Follow the detailed instructions at: https://www.npmjs.com/package/n8n-nodes-hdw

2. **Configure Credentials:**
   - OpenAI: add your OpenAI API key
   - Pipedrive: connect your Pipedrive account (if using)
   - HubSpot: connect your HubSpot account (if using)
   - HDW LinkedIn: add your API key from https://app.horizondatawave.ai

3. **CRM Custom Fields Setup:**

   For Pipedrive, go to Settings → Data Fields → Contact Fields → + Add Field and create the following custom fields:
   - LinkedIn Profile: field type – Large text
   - Profile Summary: field type – Large text
   - LinkedIn Posts Summary: field type – Large text
   - Need Enrichment: field type – Single option (Yes/No)

   Detailed instructions for creating custom fields in Pipedrive: https://support.pipedrive.com/en/article/custom-fields

   For HubSpot, go to Settings → Properties → Create property and create the following properties for the Contact object:
   - linkedin_url: field type – Single-line text
   - profile_summary: field type – Multi-line text
   - linkedin_posts_summary: field type – Multi-line text
   - need_enrichment: field type – Checkbox (Boolean)

   Detailed instructions for creating properties in HubSpot: https://knowledge.hubspot.com/properties/create-and-edit-properties

4. **Import the Workflow:** Import the "HDW_CRM_Enrichment.json" file into your n8n instance.

5. **Activate Webhooks:** Enable the webhook triggers for your CRM to ensure the workflow activates correctly.

## Customization Options

### AI Agent Prompts

You can modify the system prompts in the "Data Enrichment AI Agent" nodes to:

- Change the focus of profile analysis
- Adjust the tone and detail level of summaries
- Customize what information is extracted from posts

### CRM Field Mapping

The workflow is pre-configured to update specific custom fields in Pipedrive and HubSpot.
Update the field/property mappings in:

- the "Update data in Pipedrive" nodes
- the "Update data in HubSpot" node

## Troubleshooting

### Common Issues

- **LinkedIn Profile Not Found:** Check whether the contact's email is their work email; consider adjusting the search parameters
- **Webhook Not Triggering:** Verify the webhook configuration in your CRM
- **Missing Custom Fields:** Ensure all required custom fields are created in your CRM with the correct names

### Rate Limits

- Be aware of LinkedIn API rate limits (managed by the HDW LinkedIn node)
- Consider implementing delays if processing large batches of contacts

## Best Practices

- Use enrichment flags to selectively update contacts rather than enriching all contacts
- Review and clean contact data in your CRM before enrichment
- Periodically review the AI-generated summaries to ensure quality and relevance
by Mark Shcherbakov
## Video Guide

I prepared a detailed guide showing the whole process of integrating the Binance API and storing data in Airtable to manage funding statements associated with tokens in a wallet.

Youtube Link

## Who is this for?

This workflow is ideal for developers, financial analysts, and cryptocurrency enthusiasts who want to automate the process of managing funding statements and token prices. It's particularly useful for those who need a systematic approach to tracking and reporting funding fees associated with tokens in their wallets.

## What problem does this workflow solve?

Managing funding statements and token prices across multiple platforms can be cumbersome and error-prone. This workflow automates the process, allowing users to seamlessly fetch funding fees from Binance and record them alongside token prices in Airtable, minimizing manual data entry and potential discrepancies.

## What this workflow does

This workflow integrates the Binance API with an Airtable database, facilitating the storage and management of funding statements linked to tokens in a wallet. The agent can:

- Fetch funding fees and current positions from Binance.
- Aggregate data to create structured funding statements.
- Insert records into Airtable, ensuring proper linkage between funding data and tokens.

The main stages are:

1. **API Authentication:** The workflow establishes authentication with the Binance API using a Crypto node to handle API keys and signatures, ensuring secure and verified requests.
2. **Data Collection:** It retrieves the necessary data, including funding fees and current positions, with properly formatted API requests to ensure seamless communication with Binance.
3. **Airtable Integration:** The workflow inserts aggregated funding statements and token data into the corresponding Airtable records, checking for existing tokens to avoid duplicate entries.

## Setup

1. **Set Up Airtable Database:** Create an Airtable base with tables for Funding Statements and Tokens.
2. **Generate Binance API Key:** Log in and create an API key with appropriate permissions.
3. **Set Up Authentication in n8n:** Use a Crypto node for Binance API authentication.
4. **Configure API Request to Binance:** Set the request method and headers for communication with the Binance API.
5. **Fetch Funding Fees and Current Positions:** Retrieve funding data and current positions.
6. **Aggregate and Create Statements:** Aggregate the data to create detailed funding statements.
7. **Insert Data into Airtable:** Input the structured data into Airtable and manage token records.
8. **Use the Get Price Node:** Implement a Get Price node to keep token prices current without additional setup.
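For orientation, Binance signs authenticated requests with an HMAC-SHA256 of the query string, which is the scheme the Crypto node implements here. A minimal JavaScript sketch, assuming the USDT-M futures income endpoint for funding fees (verify the endpoint and parameters against your own setup):

```javascript
// Sketch of Binance request signing: HMAC-SHA256 over the query string.
// Endpoint and params assume USDT-M futures funding-fee history; adjust as needed.
const crypto = require('crypto');

const apiKey = 'YOUR_API_KEY';       // placeholder
const apiSecret = 'YOUR_API_SECRET'; // placeholder

const query = `incomeType=FUNDING_FEE&timestamp=${Date.now()}`;
const signature = crypto
  .createHmac('sha256', apiSecret)
  .update(query)
  .digest('hex');

const url = `https://fapi.binance.com/fapi/v1/income?${query}&signature=${signature}`;
// Send the request with header: { 'X-MBX-APIKEY': apiKey }
```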
by David Ashby
Complete MCP server exposing 1 IP2WHOIS Domain Lookup API operation to AI agents.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add IP2WHOIS Domain Lookup credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the IP2WHOIS Domain Lookup API into an MCP-compatible interface for AI agents.

- **MCP Trigger:** Serves as your server endpoint for AI agent requests
- **HTTP Request Nodes:** Handle API calls to https://api.ip2whois.com/v2
- **AI Expressions:** Automatically populate parameters via $fromAI() placeholders
- **Native Integration:** Returns responses directly to the AI agent

## 📋 Available Operations (1 total)

🔧 General (1 endpoint)

- GET /: Lookup WHOIS Data

## 🤖 AI Integration

**Parameter Handling:** AI agents automatically provide values for:

- Path parameters and identifiers
- Query parameters and filters
- Request body data
- Headers and authentication

**Response Format:** Native IP2WHOIS Domain Lookup API responses with the full data structure

**Error Handling:** Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

- **Claude Desktop:** Add the MCP server URL to your configuration
- **Cursor:** Add the MCP server SSE URL to your configuration
- **Custom AI Apps:** Use the MCP URL as a tool endpoint
- **API Integration:** Direct HTTP calls to MCP endpoints

## ✨ Benefits

- **Zero Setup:** No parameter mapping or configuration needed
- **AI-Ready:** Built-in $fromAI() expressions for all parameters
- **Production Ready:** Native n8n HTTP request handling and logging
- **Extensible:** Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
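To give a concrete picture of the $fromAI() placeholders, a query parameter on the HTTP Request node might be populated like this; the parameter name and description are illustrative:

```
// n8n expression on a query parameter of the HTTP Request node.
// The AI agent fills in the value at call time; names here are illustrative.
={{ $fromAI('domain', 'The domain name to look up WHOIS data for', 'string') }}
```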
by Simeon
# 🔄 Reddit Content Operations via MCP Server

## 🧑‍💼 Who is this for?

This workflow is built for content creators, marketers, Reddit automation enthusiasts, and AI agent developers who want structured, programmable access to Reddit content. If you're researching niche communities, tracking trends, or automating Reddit engagement — this is for you.

## 💡 What problem is this workflow solving?

Reddit has valuable content scattered across subreddits, but manual analysis or engagement is inefficient. This workflow acts as a centralized API interface to:

- Query and manage Reddit posts
- Create, fetch, delete, and reply to comments
- Analyze subreddit metadata and behavior
- Enable AI agents to autonomously operate on Reddit data

It does this using an MCP (Model Context Protocol) Server over Server-Sent Events (SSE).

## ⚙️ What this workflow does

This template sets up a custom MCP Server that listens for JSON-based operation commands sent via SSE. Based on the operation, it routes the request to one of the following branches:

🟥 **Post CRUD**
- Create a new Reddit post
- Search posts across subreddits
- Fetch posts by ID
- Delete existing posts

🟩 **Comment CRUD**
- Create or reply to comments
- Fetch multiple comments from posts
- Delete specific comments

🟦 **Subreddit Read Operations**
- Get information about subreddits
- List subreddit posts
- Retrieve subreddit rules

## 🛠 Setup

1. Import this workflow into your self-hosted n8n instance.
2. Configure Reddit credentials (OAuth2).
3. Connect your input system to the MCP Server Trigger node via SSE.
4. Send operation payloads to the server like this:

```json
{
  "operation": "post_search",
  "params": {
    "query": "AI agents",
    "subreddit": "machinelearning"
  }
}
```

The workflow will route to the appropriate node based on the operation type.

## 🧩 Supported Operations

- post_create
- post_get_many
- post_search
- post_delete
- post_get_by_id
- comment_create
- comment_reply
- comment_get_many
- comment_delete
- subreddit_get_about
- subreddit_get_many
- subreddit_get_rules

## 🧠 How to customize this workflow to your needs

- Add new operations to the operation_switch node for additional API functionality.
- Chain results into Notion, Slack, Airtable, or external APIs.
- Integrate with OpenAI/GPT to summarize posts or filter content.
- Add logic to score and sort content by engagement, sentiment, or keywords.

## 🟨 Sticky Notes

- Each operation group is color-coded (Posts, Comments, Subreddits).
- Sticky Notes explain the purpose and dependencies of each section.
- Easy to maintain and extend with clear logical separation.

⚠️ This template uses a custom MCP Server node and only works in self-hosted n8n.

🖼 Workflow Preview
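As a further sketch of the payload shape, a comment operation might look like the following; the exact parameter names should be checked against the operation_switch node, so treat these fields as assumptions:

```json
{
  "operation": "comment_create",
  "params": {
    "postId": "t3_abc123",
    "text": "Great write-up, thanks for sharing!"
  }
}
```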