by Aitor | 1Node
# Talk to Your Apps: Building a Personal Assistant MCP Server with Google Gemini

Wouldn't it be cool to just tell your computer or phone to "schedule a meeting with Sarah next Tuesday at 3 PM" or "find John Doe's email address" and have it actually do it? That's the dream of a personal assistant! With n8n and the power of MCP and AI models like Google Gemini, you can build something pretty close to that. We've put together a workflow that shows how you can use a natural language chat interface to interact with your other apps, like your CRM, email, and calendar.

## What You Need to Get Started

Before you dive in, you'll need a few things:

- **n8n:** An n8n instance (either cloud or self-hosted) to build and run your workflow.
- **Google Gemini Access:** Access to the Google Gemini model via an API key.
- **Credentials for Your Apps:** API keys or login details for the specific CRM, email, and calendar services you want to connect (like Google Sheets for CRM, Gmail, Google Calendar, etc., depending on your chosen nodes).
- **A Chat Interface:** A way to send messages to n8n to trigger the workflow (e.g., via a chat app node or webhook).

## How It Works (In Simple Terms)

Imagine this workflow as a helpful assistant who sits between you and your computer.

**Step 1: You Talk, the AI Agent Listens.** It all starts when you send a message through your connected chat interface. Think of this as you speaking directly to your assistant.

**Step 2: The Assistant's Brain (Google Gemini).** Your message goes straight to the assistant's "brain." In this case, the brain is powered by a smart AI model like Google Gemini. In our template we are using the latest Gemini 2.5 Pro, but this is totally up to you. Experiment and track which model fits the kind of tasks you will pass to the agent. The brain's job is to understand exactly what you're asking for. Are you asking to create something? To find information? To update something? The brain also uses a "memory" so it can remember what you've talked about recently, making the conversation feel more natural. We use the default context window, which covers the past 5 interactions.

**Step 3: The Assistant Decides What Tool to Use.** Once the brain understands your request, the assistant figures out the best way to help you. It looks at the request and thinks, "Okay, to do this, I need to use one of my tools."

**Step 4: The Assistant's Toolbox (MCP & Your Apps).** Here's where the "MCP" part comes in. Think of MCP (Model Context Protocol) as the assistant's special toolbox. Inside this toolbox are connections to all the different apps and services you use: your CRM for contacts, your email service, and your calendar. The MCP system acts as a manager for these tools, making them available to the assistant whenever they're needed.

**Step 5: Using the Right Tool for the Job.** Based on what you asked for, the assistant picks the correct tool from the toolbox. If you asked to find a contact, it grabs the "Get Contact" node from the CRM section. If you wanted to schedule a meeting, it picks the "Create Event" node from the Calendar section. If you asked to draft an email, it uses the "Draft Email" node.

**Step 6: The Tool Takes Action.** Now the node (or set of nodes) gets to work and performs the action you requested within the specific app: the CRM tool finds or adds the contact, the email tool drafts the message, the calendar tool creates the event.

**Step 7: Task Completed!** Just like that, your request is handled automatically, all because you told your assistant what you wanted in plain language.

## Why This Is Awesome

This kind of workflow shows the power of combining AI with automation platforms like n8n. You can move beyond clicking buttons and filling out forms and instead interact with your digital life using natural conversation. n8n makes it possible to visually build these complex connections between your chat, the AI brain, and all your different apps.

## Taking It Further (Possible Enhancements)

This is just the start! You could enhance this personal assistant by:

- Connecting more apps and services (task managers, project tools, etc.).
- Adding capabilities to search the web or internal documents.
- Implementing more sophisticated memory or context handling.
- Getting a notification (e.g., in Slack or Microsoft Teams) when the AI agent completes each task.
- Allowing the assistant to ask clarifying questions if needed.
- Building a robust prompt for the AI agent.

## Ready to Automate Your Workflow?

Imagine the dozens of hours your team could save weekly by automating repetitive tasks through a simple, natural language interface. Need help? Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
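To make the routing decision in Step 3 concrete, here is a toy rule-based stand-in sketched in Python. In the real workflow, Gemini itself chooses among the MCP tools; the tool names and keywords below are purely hypothetical illustrations of the shape of that decision, not part of the template.

```python
# Toy sketch of the "which tool?" decision the Gemini agent makes.
# Tool names and keyword lists are hypothetical, for illustration only.
def route_request(message: str) -> str:
    """Map a chat message to a hypothetical MCP tool name."""
    text = message.lower()
    if any(w in text for w in ("meeting", "schedule", "calendar")):
        return "calendar.create_event"
    if any(w in text for w in ("email address", "contact", "phone")):
        return "crm.get_contact"
    if any(w in text for w in ("draft", "email", "reply")):
        return "email.draft"
    # The real agent would ask a clarifying question instead of guessing.
    return "fallback.ask_clarifying_question"
```

For example, `route_request("Schedule a meeting with Sarah next Tuesday at 3 PM")` routes to the calendar tool, while `route_request("Find John Doe's email address")` routes to the CRM tool.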
by Davide
This workflow allows users to convert a 2D image into a 3D model by integrating multiple AI and web services. The process begins with a user uploading or providing an image URL, which is then sent to a generative AI model capable of interpreting the content and generating a 3D representation in .glb format. The model is then stored and a download link is returned to the user.

## Main Steps

1. **Trigger Node:** Initiates the workflow via HTTP request, webhook, or manual execution.
2. **Image Upload or Input:** The image is acquired via direct upload or URL input.
3. **API Integration:** The image is sent to a 3D generation API (e.g., a service like Kaedim, Luma Labs, or a custom AI model).
4. **Model Generation:** The external API processes the image and creates a 3D model.
5. **File Storage:** The resulting 3D model is stored in cloud storage (e.g., S3, Google Drive, or a local server).
6. **Response to User:** A download link for the 3D model is returned to the user via the same communication channel (HTTP response, email, or chat).

## Advantages

- **Automation:** Eliminates the need for manual 3D modeling, saving time for artists, developers, and designers.
- **AI-Powered:** Leverages AI to generate realistic and usable 3D models from simple 2D inputs.
- **Scalability:** Can be triggered automatically and scaled up to handle many requests via n8n's automation.
- **Integration-Friendly:** Easily extendable with other services like Discord, Telegram, or marketplaces for 3D assets.
- **No-Code Configuration:** Built with n8n's visual interface, making it editable without programming knowledge.

## How It Works

1. **Trigger:** The workflow can be started manually ("When clicking 'Test workflow'") or automatically at scheduled intervals ("Schedule Trigger").
2. **Data Retrieval:** The "Get new image" node fetches data from a Google Sheet, including the model image, product image, and product ID.
3. **3D Image Creation:** The "Create 3D Image" node sends the image data to the Fal.run API (Trellis) to generate a 3D model.
4. **Status Check:** The workflow periodically checks the request status ("Get status" and "Wait 60 sec.") until the job is marked as "COMPLETED".
5. **Result Processing:** Once completed, the 3D model URL is retrieved ("Get Url 3D image"), the file is downloaded ("Get File 3D image") and uploaded to Google Drive ("Upload 3D Image").
6. **Sheet Update:** The final 3D model URL is written back to the Google Sheet ("Update result").

## Set Up Steps

1. **Prepare Google Sheet:** Create a Google Sheet with columns IMAGE MODEL and 3D RESULT (empty). Example sheet: Google Sheet Template.
2. **Obtain Fal.run API Key:** Sign up at Fal.ai and get an API key. Configure the Authorization header in the "Create 3D Image" node with `Key YOURAPIKEY`.
3. **Configure Workflow Execution:** Run manually via the Test workflow button, or set up the Schedule Trigger node for automation (e.g., every 5 minutes).
4. **Verify Credentials:** Ensure Google Sheets, Google Drive, and Fal.run API credentials are correctly set in n8n.

Once configured, the workflow processes new entries in the Google Sheet, generates 3D models, and updates the results automatically.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
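The "Get status" / "Wait 60 sec." loop can be sketched as a simple poll-and-decide routine. This is an illustration in Python, not the template's actual nodes; `fetch_status` stands in for the HTTP Request node, and the status field names are assumptions rather than the documented Fal.run response schema.

```python
import time

# Decide what the workflow should do next, mirroring the IF node after "Get status".
def next_action(status_payload: dict) -> str:
    state = status_payload.get("status")
    if state == "COMPLETED":
        return "download"        # proceed to "Get Url 3D image"
    if state in ("FAILED", "ERROR"):
        return "abort"           # stop and surface the error
    return "wait"                # loop back through "Wait 60 sec."

# Poll until the job finishes, mirroring the wait loop in the workflow.
def poll(fetch_status, poll_seconds=60, max_polls=30):
    for _ in range(max_polls):
        action = next_action(fetch_status())
        if action != "wait":
            return action
        time.sleep(poll_seconds)
    return "timeout"
```

The same pattern (check status, wait, re-check, with an upper bound on attempts) applies to any long-running generation API.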
by Davide
Drive-to-Store is a multi-channel marketing strategy that spans both the web and the physical context, with the aim of increasing the number of customers and sales in physical stores. This strategy guides potential customers from the online world to the physical point of sale by providing a coupon that can be spent in the store or on an e-commerce site. The basic idea is to have a landing page with a form and a series of unique coupons to assign to leads as a "reward" for filling out the form. This workflow is ideal for businesses looking to automate lead generation and management, especially when integrating with CRM systems like SuiteCRM and using Google Sheets for data tracking.

## How It Works

1. **Form Submission:** The workflow starts with the On form submission node, which triggers when a user submits a form on a landing page. The form collects the user's name, surname, email, and phone number.
2. **Form Data Processing:** The Form Fields node extracts and sets the form data (name, surname, email, and phone) for use in subsequent steps.
3. **Duplicate Lead Check:** The Duplicate Lead? node checks if the submitted email already exists in a Google Sheets document. If the email is found, the workflow responds with a "duplicate lead" message (Respond KO node) and stops further processing.
4. **Coupon Retrieval:** If the email is not a duplicate, the Get Coupon node retrieves a coupon code from the Google Sheets document based on the lead's email.
5. **Lead Creation in SuiteCRM:** The Create Lead SuiteCRM node creates a new lead in SuiteCRM using the form data and the retrieved coupon code. The lead includes first name, last name, email, phone number, and coupon code.
6. **Google Sheets Update:** The Update Sheet node updates the Google Sheets document with the newly created lead's details: name, surname, email, phone, coupon code, lead ID, and the current date and time.
7. **Response to Webhook:** The Respond OK node sends a success response back to the webhook, indicating that the lead was created successfully.

## Set Up Steps

1. **Configure Form Trigger:** Set up the On form submission node to collect user data (name, surname, email, and phone) via a web form.
2. **Set Up Google Sheets Integration:** Configure the Duplicate Lead?, Get Coupon, and Update Sheet nodes to interact with the Google Sheets document. Ensure the document contains columns for email, coupon, lead ID, and other relevant fields.
3. **Set Up SuiteCRM Authentication:** Configure the Token SuiteCRM node with the appropriate client credentials (client ID and client secret) to obtain an access token from SuiteCRM.
4. **Set Up Lead Creation in SuiteCRM:** Configure the Create Lead SuiteCRM node to send a POST request to SuiteCRM's API to create a new lead. Include the form data and coupon code in the request body.
5. **Set Up Webhook Responses:** Configure the Respond OK and Respond KO nodes to send appropriate JSON responses back to the webhook based on whether the lead was created or was a duplicate.
6. **Test the Workflow:** Submit a test form to ensure the workflow correctly checks for duplicates, retrieves a coupon, creates a lead in SuiteCRM, and updates the Google Sheets document.
7. **Activate the Workflow:** Once tested, activate the workflow to automate the handling of form submissions and lead creation.

## Key Features

- **Duplicate Lead Check:** Prevents duplicate leads by checking whether the email already exists in the Google Sheets document.
- **Coupon Assignment:** Retrieves a coupon code from Google Sheets and assigns it to the new lead.
- **SuiteCRM Integration:** Automatically creates a new lead in SuiteCRM with the form data and coupon code.
- **Data Logging:** Logs all lead details in a Google Sheets document for tracking and analysis.
- **Webhook Responses:** Provides immediate feedback on whether the lead was created successfully or was a duplicate.
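The duplicate-check and coupon-assignment logic can be sketched as follows. This is an illustrative Python stand-in using an in-memory list in place of the Google Sheets lookups; the assumption that the first unassigned coupon row is handed to the new lead is mine, based on the "series of unique coupons to assign to leads" described above, and may differ from the template's exact sheet logic.

```python
# Sketch of the "Duplicate Lead?" + "Get Coupon" branch (assumed logic).
def handle_submission(form: dict, sheet_rows: list[dict]) -> dict:
    email = form["email"].strip().lower()
    # Duplicate check: the Respond KO branch.
    if any(row.get("email", "").lower() == email for row in sheet_rows):
        return {"status": "KO", "message": "duplicate lead"}
    # Assumed coupon assignment: first coupon row not yet tied to an email.
    coupon = next((r["coupon"] for r in sheet_rows if not r.get("email")), None)
    if coupon is None:
        return {"status": "KO", "message": "no coupons left"}
    # The Respond OK branch; the real workflow also creates the SuiteCRM lead
    # and writes the row back to the sheet.
    return {"status": "OK", "coupon": coupon}
```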
by Mutasem
## Use case

Slackbots are super powerful. At n8n, we have been using them to get a lot done. But it can become hard to manage and maintain the many different operations that a workflow can do. This is the base workflow we use for our most powerful internal Slackbots. They handle a lot, from running e2e tests for a GitHub branch to deleting a user. By splitting the workflow into many subworkflows, we are able to handle each command separately, making it easier to debug as well as to support new use cases. In this template, you can find everything to set up your own Slackbot (and I made it simple: there's only one node to configure 😉). After that, you need to build your commands directly.

This bot can create a new thread on an alerts channel and respond there, or reply directly to the user. It responds to help requests with a help page and automatically handles unknown commands. It also supports flags and environment variables. For example, `/cloudbot-test info mutasem --full-info -e env=prod` would give you the corresponding info when calling the subworkflow.

## How to set up

1. Add the Slack command and point it to the webhook.
2. Add the following to the Set config node:
   - `alerts_channel` with the alerts channel to start threads on
   - `instance_url` with this instance's URL to make it easy to debug
   - `slack_token` with the Slack bot token to validate requests
   - `slack_secret_signature` with the Slack secret signature to validate requests
   - `help_docs_url` with the help URL to help users understand the commands
3. Build other workflows to call, and add them to `commands` in Set Config. Each command must be mapped to a workflow id with an Execute Workflow Trigger node.
4. Activate the workflow 🚀

## How to adjust

Add your own commands. Depending on your need, you might need to lock down who can call this.
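The command format shown above (subcommand, positional args, `--flags`, and `-e key=value` environment variables) can be parsed with a small tokenizer. This Python sketch is a hypothetical illustration of that parsing, not the template's actual Code node; it assumes the slash command itself (`/cloudbot-test`) has already been stripped by Slack.

```python
# Hypothetical parser for "info mutasem --full-info -e env=prod".
def parse_command(text: str) -> dict:
    parts = text.split()
    result = {"command": parts[0], "args": [], "flags": [], "env": {}}
    i = 1
    while i < len(parts):
        token = parts[i]
        if token == "-e":
            # Environment variable: "-e key=value" consumes two tokens.
            key, _, value = parts[i + 1].partition("=")
            result["env"][key] = value
            i += 2
        elif token.startswith("--"):
            result["flags"].append(token[2:])
            i += 1
        else:
            result["args"].append(token)
            i += 1
    return result
```

A router like this makes it easy to map `result["command"]` to the subworkflow id configured in Set Config.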
by ainabler
## Overall Description & Potential

<< What Does This Flow Do? >>

Overall, this workflow is an intelligent sales outreach automation engine that transforms raw leads from a form or a list into highly personalized, ready-to-send introductory email drafts. The process: it starts by fetching data, enriches it with in-depth AI research to uncover "pain points," and then uses those research findings to craft an email that is relevant to the solutions you offer.

This system solves a key problem in sales: the lack of time to conduct in-depth research on every single lead. By automating the research and drafting stages, the sales team can focus on higher-value activities, like engaging with "warm" prospects and handling negotiations. Using Google Sheets as the main dashboard allows the team to monitor the entire process, from lead entry, research status, and email drafts all the way to the send link, within a single, familiar interface.

<< Potential Future Enhancements >>

This workflow has a very strong foundation and can be developed into an even more sophisticated system:

1. **Full Automation (Zero-Touch):** Instead of generating a manual-click link, the output from the AI Agent can be piped directly into a Gmail or Microsoft 365 Email node to send emails automatically. A Wait node could add a delay of a few minutes or hours after the draft is created, preventing instant sending.
2. **Automated Follow-up Sequences:** The workflow can be extended to manage follow-up emails. By using a webhook to track email opens or replies, you could build logic like: "If the intro email is not replied to within 3 days, trigger the AI Agent again to generate follow-up email #1 based on a different template, and then send it."
3. **AI-Powered Lead Scoring:** After the research stage, the AI could be given the additional task of scoring leads (e.g., 1-10 or High/Medium/Low priority) based on how well the target company's profile matches your ideal customer profile (ICP). This helps the sales team prioritize the most promising leads.
4. **Full CRM Integration:** Instead of Google Sheets, the workflow could connect directly to HubSpot, Salesforce, or Pipedrive. It would pull new leads from the CRM, perform the research, draft the email, and log all activities (research results, sent emails) back to the contact's timeline in the CRM automatically.
5. **Multi-Channel Outreach:** Beyond email, the AI could be instructed to draft personalized LinkedIn connection request messages or WhatsApp messages. The workflow could then use the appropriate APIs to send these messages, expanding your outreach beyond just email.
by NanaB
This n8n workflow provides a comprehensive solution for user authentication and management, leveraging Airtable as the backend database. It includes flows for user sign-up and login, as well as sample CRUD operations: retrieving user details and updating user information.

YouTube video of me explaining the flow: https://www.youtube.com/watch?v=gKcGfyq3dPM

## How it Works

### User Sign-Up Flow

1. **Receives POST request:** A webhook listens for POST requests containing new user details (email, first name, last name, password).
2. **Checks for existing email:** The workflow queries Airtable to see if the submitted email already exists.
3. **Handles email in use:** If the email is found, it responds with `{"response": "email in use"}`.
4. **Creates new user:** If the email is unique, the password is SHA256 hashed (Base64 encoded), and the user's information (including the hashed password) is stored in Airtable. A successful response of `{"response": "success"}` is then sent.

### User Login Flow

1. **Receives POST request:** A webhook listens for POST requests with the user's email and password.
2. **Verifies user existence:** It checks Airtable for a user with the provided email. If no user is found, it responds with a failure message ("wrong email").
3. **Compares passwords:** If a user is found, the submitted password is hashed (SHA256, Base64 encoded) and compared with the stored hashed password in Airtable.
4. **Responds with JWT or error:** If the passwords match, a JWT token containing the user's ID and email is issued. If they don't match, a "wrong password" response is sent.

### Flows for a Logged-In User

These flows require a JWT-authenticated request.

**Get User Details:**
- Webhook (GET): Receives a JWT-authenticated request.
- Airtable (Read): Fetches the current user's record using `jwtPayload.id`.
- Set Node ("Specify Current Details"): Maps fields like "First Name," "Last Name," "Email," and "Date" from Airtable to a standard output format.

**Update User Details:**
- Webhook (POST): Receives updated user data (email, name, password).
- Airtable (Upsert): Updates the record matching `jwtPayload.id` using the submitted fields.
- Set Node ("Specify New Details"): Outputs the updated data in a standard format.

## Set Up Steps (Approx. 5 Minutes)

### Step 1: Set up your Airtable Base and Table

You'll need an Airtable base and a table to store your user data. Ensure your table has at least the following columns:

- **Email** (Single Line Text)
- **First Name** (Single Line Text)
- **Last Name** (Single Line Text)
- **Password** (Single Line Text; this will store the hashed password)
- **Date** (Date; optional, for user sign-up date)

### Step 2: Obtain an Airtable Personal Access Token

1. Go to the Airtable website and log in to your account.
2. Navigate to your personal access token page (usually found under your developer settings or by searching for "personal access tokens").
3. Click "Create new token."
4. Give your token a name (e.g., "n8n User Management").
5. Grant the necessary permissions. Scope: `data.records:read` and `data.records:write` for the specific base you will be using. Base: select the Airtable base where your user management table resides.
6. Generate the token and copy it immediately. You won't be able to see it again, so store it securely.

### Step 3: Create a JWT Auth Credential in n8n

1. In your n8n instance, go to "Credentials" (usually found in the left-hand sidebar).
2. Click "New Credential" and search for "JWT Auth".
3. Give the credential a name (e.g., "UserAuthJWT").
4. For the "Signing Secret," enter a strong, random string of characters. This secret will be used to sign and verify your JWT tokens. Keep it highly confidential.
5. Save the credential.

## Customization Options

This workflow is designed to be highly adaptable:

- **Database Integration:** Easily switch from Airtable to other databases like PostgreSQL, MySQL, MongoDB, or even Google Sheets by replacing the Airtable nodes with the appropriate database nodes in n8n.
- **Authentication Methods:** Extend the authentication to include multi-factor authentication (MFA), social logins (Google, Facebook), or integration with existing identity providers (IdPs) by adding additional nodes.
- **User Profile Fields:** Add or remove user profile fields (e.g., phone number, address, user roles) by adjusting the Airtable table columns and the Set nodes in the workflow.
- **Notification System:** Integrate notification systems (e.g., email, SMS) for events like new user sign-ups, password resets, or account changes.
- **Admin Panel:** Build an admin panel with n8n to manage users directly, including adding, deleting, or updating user records and resetting passwords.

This workflow provides a solid foundation for building robust user management systems, adaptable to a wide range of applications and security requirements.

## Need Assistance or Customization?

Do you have specific integrations in mind, or are you looking to add more user management features to this workflow? If you need help setting this up, or want to adapt it for a unique use case, don't hesitate to reach out! You can contact me directly at nanabrownsnr@gmail.com. I'd be glad to assist you.
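The sign-up/login hashing described above (SHA256, Base64 encoded) can be sketched in a few lines of Python. This is an illustration of the scheme, not the template's actual Code node; note that for production systems a slow KDF such as bcrypt or argon2 is generally preferred over plain SHA-256.

```python
import base64
import hashlib
import hmac

# SHA-256 hash of the password, Base64-encoded, matching the scheme above.
def hash_password(password: str) -> str:
    digest = hashlib.sha256(password.encode("utf-8")).digest()
    return base64.b64encode(digest).decode("ascii")

# Login comparison: hash the submitted password and compare with the stored hash.
def login_response(submitted: str, stored_hash: str) -> str:
    # Constant-time comparison avoids leaking information via timing.
    if hmac.compare_digest(hash_password(submitted), stored_hash):
        return "issue JWT"
    return "wrong password"
```

On a successful match, the workflow would then sign a JWT (with the credential's signing secret) containing the user's ID and email.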
by Hugues Stock
## What does this template do?

This workflow sets a small "lock" value in Redis so that only one copy of a long job can run at the same time. If another trigger fires while the job is still busy, the workflow sees the lock, stops early, and throws a clear error. This protects your data and keeps you from hitting rate limits. Because the workflow also stores simple progress flags ("working", "loading", "finishing"), you can poll the current status and show live progress for very long jobs.

## Use Case

Great when the same workflow can be called many times in parallel (for example by webhooks, cron jobs, or nested Execute Workflow calls) and you need an "only run once at a time" guarantee without building a full queue system.

## What the Workflow Does

- ⚡ Starts through an Execute Workflow Trigger called by another workflow
- 🔄 A Switch sends the run to Get, Set, or Unset actions
- 💾 Redis reads or writes a key named `process_status_<key>` with a time-to-live (default 600 s)
- 🚦 If nodes check the key and decide to continue or stop
- ⏱️ Wait nodes stand in for the slow part of your job (replace these with your real work)
- 📈 Updates the key with human-readable progress values that another workflow can fetch with `action = get`
- 🏁 When done, the lock is removed so the next run can start

## Apps & Services Used

- Redis
- Core n8n nodes (Switch, If, Set, Wait, Stop and Error)

## Pre-requisites

- A Redis server that n8n can reach
- Redis credentials stored in n8n
- A second workflow that calls this one and sends:
  - `action` set to `get`, `set`, or `unset`
  - `key` set to a unique name for the job
  - Optional `timeout` in seconds

## Customization Tips

- Increase or decrease the TTL in the Set Timeout node to match how long your job usually runs
- Add or rename status values ("working", "loading", "finishing", and so on) to show finer progress
- Replace Stop and Error with a Slack or email alert, or even push the extra trigger into a queue if you prefer waiting instead of failing
- Use different Redis keys if you need separate locks for different tasks
- Build a small "status endpoint" workflow that calls this one with `action = get` to display real-time progress to users

## Additional Use Cases

- 🛑 **Telegram callback spam filter:** If a Telegram bot sends many identical callbacks in a burst, call this workflow first to place a lock. Only the first callback will proceed; the rest will exit cleanly until the lock clears. This keeps your bot from flooding downstream APIs.
- 🧩 **External API rate-limit protection:** Run heavy API syncs one after the other so parallel calls do not break vendor rate limits.
- 🔔 **Maintenance window lock:** Block scheduled maintenance tasks from overlapping, making sure each window finishes before the next starts.
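In Redis terms, the lock acquisition this template performs boils down to `SET process_status_<key> working NX EX <ttl>`: succeed only if the key does not exist, and let it expire automatically. The sketch below illustrates that pattern in Python with a tiny in-memory stand-in for the Redis client (so it runs without a server); with a real client such as redis-py, `FakeRedis` would be replaced by the actual connection.

```python
import time

# Minimal stand-in implementing the SET NX EX semantics the lock relies on.
class FakeRedis:
    def __init__(self):
        self.store = {}  # key -> (value, expires_at or None)

    def set(self, key, value, nx=False, ex=None):
        now = time.monotonic()
        current = self.store.get(key)
        if current and current[1] is not None and current[1] <= now:
            current = None  # TTL elapsed: treat as absent
        if nx and current is not None:
            return None     # lock already held
        self.store[key] = (value, now + ex if ex else None)
        return True

    def delete(self, key):
        self.store.pop(key, None)

# Acquire the lock: True means this run may proceed, False means stop early.
def try_acquire(client, job_key: str, ttl: int = 600) -> bool:
    return bool(client.set(f"process_status_{job_key}", "working", nx=True, ex=ttl))
```

The TTL is the safety net: even if a run crashes before the unset step, the lock clears itself after `ttl` seconds.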
by Nazmy
## Bearer Token Validation

This n8n template helps you manage and validate tokens easily using:

- n8n as your backend workflow engine
- Airtable as your lightweight token store

## 🚀 What It Does

- Stores user tokens securely in Airtable with expiry or usage metadata.
- Validates incoming tokens in your workflows (e.g., webhook APIs).
- Rejects invalid or expired tokens automatically for security.
- Can be extended to generate, rotate, or revoke tokens for user management.

## How It Works

1. A Webhook node receives requests with a Bearer header.
2. An Airtable query looks up the provided token.
3. Validation logic (Code node):
   - Checks if the token exists.
   - Verifies expiry or usage limits if configured.
   - Returns success if valid, or an error describing the issue.

Note: this is the simplest way to do auth, kept minimal for clarity.

## Why Use This

- No need for a full backend to manage secure token validation.
- Clean, modular, and ready for your SaaS workflows.

Enjoy building secure automations with n8n + Airtable! 🚀

Built by: Nazmy
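The validation steps above can be sketched as a single function. This is an illustrative Python version of what the Code node checks; the record shape (`token` / `expires_at` fields) is an assumption about the Airtable schema, not the template's exact column names.

```python
from datetime import datetime, timezone

# Sketch of the bearer-token validation logic (assumed record fields).
def validate_bearer(auth_header: str, records: list[dict]) -> dict:
    if not auth_header or not auth_header.startswith("Bearer "):
        return {"ok": False, "error": "missing bearer token"}
    token = auth_header[len("Bearer "):].strip()
    record = next((r for r in records if r.get("token") == token), None)
    if record is None:
        return {"ok": False, "error": "unknown token"}
    expires_at = record.get("expires_at")  # ISO 8601 string, if configured
    if expires_at and datetime.fromisoformat(expires_at) < datetime.now(timezone.utc):
        return {"ok": False, "error": "token expired"}
    return {"ok": True}
```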
by Don Jayamaha Jr
This workflow powers the Binance Spot Market Quant AI Agent, acting as the Financial Market Analyst. It fuses real-time market structure data (price, volume, kline) with technical indicators across multiple timeframes (15m, 1h, 4h, 1d) and returns a structured trading outlook, perfect for intraday and swing traders who want actionable analysis in Telegram.

🔗 Requires the following sub-workflows to function:

- Binance SM 15min Indicators Tool
- Binance SM 1hour Indicators Tool
- Binance SM 4hour Indicators Tool
- Binance SM 1day Indicators Tool
- Binance SM Price/24hStats/Kline Tool

## ⚙️ How It Works

1. Triggered via webhook (typically by the Quant AI Agent).
2. Extracts the user's symbol and timeframe from the input (e.g., "DOGE outlook today").
3. Calls all linked sub-workflows to retrieve indicators and live price data.
4. Merges the data and formats a clean trading report using GPT-4o-mini.
5. Returns an HTML-formatted message suitable for Telegram delivery.

## 📥 Sample Input

{ "message": "SOLUSDT", "sessionId": "654321123" }

## ✅ Telegram Output Format

📊 SOLUSDT Market Snapshot
💰 Price: $156.75
📉 24h Stats: High $160.10 | Low $149.00 | Volume: 1.1M SOL
🧪 4h Indicators:
• RSI: 58.2 (Neutral-Bullish)
• MACD: Crossover Up
• BB: Squeezing Near Upper Band
• ADX: 25.7 (Rising Trend)
📈 Resistance: $163
📉 Support: $148

## 🔍 Use Cases

| Scenario | Outcome |
| --- | --- |
| User asks for "BTC outlook" | Returns 1h + 4h + 1d indicators + live price + key levels |
| Telegram bot prompt: "DOGE now" | Returns short-term 15m + 1h analysis snapshot |
| Strategy trigger inside n8n | Enables other workflows to consume structured signal data |

🎥 Watch Tutorial:

## 🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding or redistribution permitted.

🔗 For support: LinkedIn – Don Jayamaha
by n8n Team
## Who this template is for

This template is for developers or teams who need to convert CSV data into JSON format through an API endpoint, with support for both file uploads and raw CSV text input.

## Use case

Converting CSV files or raw CSV text into JSON format via a webhook endpoint, with error handling and notifications. This is particularly useful when you need to transform CSV data into JSON as part of a larger automation or integration process.

## How this workflow works

1. Receives POST requests through a webhook endpoint at `/tool/csv-to-json`.
2. Uses a Switch node to handle different input types:
   - File uploads (binary data)
   - Plain text CSV data
   - JSON format data
3. Processes the CSV data:
   - For files: uses the Extract From File node
   - For raw text: converts the text to CSV using a custom Code node that handles both comma and semicolon delimiters
4. Aggregates the processed data and returns:
   - Success response (200): converted JSON data
   - Error response (500): error message with details
5. In case of errors, sends notifications to a Slack error channel with execution details and a link to debug.

## Set up steps

1. Configure the webhook endpoint by deploying the workflow.
2. Set up the Slack integration for error notifications:
   - Update the Slack channel ID (currently set to "C0832GBAEN4")
   - Configure OAuth2 authentication for Slack
3. Test the endpoint, either with curl for file uploads:

```bash
curl -X POST "https://yoururl.com/webhook-test/tool/csv-to-json" \
  -H "Content-Type: text/csv" \
  --data-binary @path/to/your/file.csv
```

   Or send raw CSV data with the text/plain content type.
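The delimiter-handling Code node can be illustrated with a short sketch. This Python version shows the same logic (the real n8n Code node runs JavaScript): look at the header line, pick semicolon or comma as the delimiter, and emit one JSON object per row.

```python
import csv
import io
import json

# Sketch of the raw-text branch: delimiter detection + CSV-to-JSON conversion.
def csv_text_to_json(text: str) -> str:
    first_line = text.splitlines()[0] if text else ""
    # Whichever separator dominates the header line wins.
    delimiter = ";" if first_line.count(";") > first_line.count(",") else ","
    rows = list(csv.DictReader(io.StringIO(text), delimiter=delimiter))
    return json.dumps(rows)
```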
by Don Jayamaha Jr
A professional-grade AI automation system for spot market trading insights on Binance. It analyzes multi-timeframe technical indicators, live price/order data, and crypto sentiment, then delivers fully formatted Telegram-style trading reports.

🎥 Watch Tutorial:

## 🧩 Required Workflows

You must install and activate all of the following workflows for the system to function correctly:

| ✅ Workflow Name | 📌 Function Description |
| --- | --- |
| Binance Spot Market Quant AI Agent | Final AI orchestrator. Parses the user prompt and generates Telegram-ready reports. |
| Binance SM Financial Analyst Tool | Calls indicator tools and price/order data tools. Synthesizes structured inputs. |
| Binance SM News and Sentiment Analyst Webhook Tool | Analyzes crypto sentiment; gives a summary and headlines via POST webhook. |
| Binance SM Price/24hrStats/OrderBook/Kline Tool | Pulls price, order book, 24h stats, and OHLCV klines for 15m–1d. |
| Binance SM 15min Indicators Tool | Calculates 15m RSI, MACD, BBANDS, ADX, and SMA/EMA from Binance kline data. |
| Binance SM 1hour Indicators Tool | Same as above, but for the 1h timeframe. |
| Binance SM 4hour Indicators Tool | Same as above, but for the 4h timeframe. |
| Binance SM 1day Indicators Tool | Same as above, but for the 1d timeframe. |
| Binance SM Indicators Webhook Tool | Technical backend. Handles all webhook logic for each timeframe tool. |

## ⚙️ Installation Instructions

### Step 1: Import Workflows

1. Open your n8n Editor UI.
2. Import each workflow JSON file one by one.
3. Activate them, or ensure they're called via Execute Workflow.

### Step 2: Set Credentials

- **OpenAI API key** (GPT-4o recommended)
- **Binance endpoints** are public (no auth required)

### Step 3: Configure Webhook Endpoints

Deploy the Binance SM Indicators Webhook Tool and ensure the following paths are reachable:

- /webhook/15m
- /webhook/1h
- /webhook/4h
- /webhook/1d

### Step 4: Telegram Integration

1. Create a Telegram bot using @BotFather.
2. Add your Telegram API token to n8n credentials.
3. Replace the Telegram ID placeholder with your own.

### Step 5: Final Trigger

Trigger the Binance Spot Market Quant AI Agent manually or from Telegram. The agent:

1. Extracts the trading pair (e.g., BTCUSDT)
2. Calls all tools for market data and sentiment
3. Generates a clean, HTML-formatted Telegram report

## 💬 Telegram Report Output Format

BTCUSDT Market Report

Spot Strategy
• Action: Buy
• Entry: $63,800 | SL: $61,200 | TP: $66,500
• Rationale: MACD Crossover (1h), RSI Rebound from Oversold (15m), Sentiment: Bullish

Leverage Strategy
• Position: Long 3x
• Entry: $63,800
• SL/TP zones same as above

News Sentiment: Slightly Bullish
• "Bitcoin rallies as ETF inflows surge" – CoinDesk
• "Whales accumulate BTC at key support" – NewsBTC

## 🧠 System Overview

[Telegram Trigger] → [Session + Auth Logic] → [Binance Spot Market Quant AI Agent] → [Financial Analyst Tool + News Tool] → [All Technical Indicator Tools (15m, 1h, 4h, 1d)] → [OrderBook/Price/Kline Fetcher] → [GPT-4o Reasoning] → [Split & Send Message to Telegram]

## 🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding or resale permitted.

🔗 For support: LinkedIn – Don Jayamaha
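To give a flavor of what the per-timeframe indicator tools compute from Binance kline closes, here is a minimal RSI(14) sketch using Wilder's smoothing. It is a standard textbook formulation for illustration; the template's own indicator implementations may differ in details.

```python
# Minimal RSI using Wilder's smoothing over a list of closing prices.
def rsi(closes: list[float], period: int = 14) -> float:
    if len(closes) < period + 1:
        raise ValueError("need at least period + 1 closes")
    deltas = [closes[i + 1] - closes[i] for i in range(len(closes) - 1)]
    gains = [max(d, 0.0) for d in deltas]
    losses = [max(-d, 0.0) for d in deltas]
    # Seed with simple averages over the first `period` deltas...
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    # ...then apply Wilder's recursive smoothing for the rest.
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)
```

Running the same function over 15m, 1h, 4h, and 1d klines is exactly how one series of code serves all four timeframe tools.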
by Raquel Giugliano
This workflow automates currency rate uploads into SAP Business One via the Service Layer, using flexible input sources such as JSON (API), SQL Server, Google Sheets, or manual values. It leverages logic branching, AI validation, and logging for complete control and traceability.

++⚙️ HOW IT WORKS:++

🔹 1. Receive Data via Webhook
The workflow listens on the endpoint /formulario-datos via HTTP POST. The request body should include origen: one of JSON, SQL, GoogleSheets, or Manual. Depending on the value, the flow branches accordingly.

🔹 2. Authenticate with SAP Business One
A POST request is sent to SAP B1's Login endpoint. A session cookie (B1SESSION) is retrieved and used in all subsequent API calls.

🔹 3. Switch by Origin
The flow branches into four processing paths based on origen:
- JSON: The payload is normalized using OpenAI to extract an array of rates. Each rate is sent to SAP individually after parsing.
- SQL: The SQL query provided in the payload is executed on a connected Microsoft SQL Server. The results are checked by AI to validate the date format. If valid, rates are sent to SAP.
- GoogleSheets: Rates are pulled from a connected spreadsheet. Each entry is sent to SAP in sequence.
- Manual: Uses currency, rate, and rateDate directly from the webhook payload and sends the result directly to SAP.

🔹 4. AI-Powered Enhancements (Optional but enabled)
- Normalize JSON: Uses OpenAI (LangChain node) to convert any messy structure into a uniform array under the key rate.
- Date Formatting: Another OpenAI call ensures RateDate is in yyyyMMdd format (required by SAP), converting from ISO, timestamp, or other formats.

🔹 5. Send to SAP Business One (Service Layer)
All paths send a POST request to /SBOBobService_SetCurrencyRate with a payload such as:
{ "Currency": "USD", "Rate": "0.92", "RateDate": "20250612" }

🔹 6. Log Results
All success/failure results are appended to a Google Sheets log (LOGS_N8N). The log includes method, URL, sent payload, status code, and message.

++🛠 SETUP STEPS:++

1️⃣ Create Required Credentials
Go to Credentials > + New Credential and configure:
- SAP Business One (Service Layer): Type: HTTP Request Auth or Token. Base URL: https://<your-host>:50000/b1s/v1/. Provide Username, Password, and CompanyDB via variables or fields.
- Google Sheets: OAuth2 connection to a Google account with access.
- Microsoft SQL Server: SQL login credentials and host.
- OpenAI: API key with access to models like GPT-4o.

2️⃣ Environment Variables (Recommended)
Set these variables in n8n → Settings → Variables:
SAP_URL=https://<host>:50000/b1s/v1/
SAP_USER=your_username
SAP_PASSWORD=your_password
SAP_COMPANY_DB=your_companyDB

3️⃣ Prepare Google Sheets
- Sheet 1: RATE (for loading the data). Columns: Currency, Rate, RateDate
- Sheet 2: LOGS_N8N (to save the logs, success or failed). Columns: workflow, method, url, json, status_code, message

4️⃣ Activate and Test
Deploy the webhook and grab the URL.

++✅ BONUS++

- Built-in AI assistance for input validation and structure
- Logs all results for compliance and audit
- Flexible integration paths: perfect for hybrid or transitional systems
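The date-formatting requirement above (RateDate must reach SAP as yyyyMMdd) and the payload shape can be sketched deterministically. The template delegates the conversion to an OpenAI call; a plain-code version like this Python sketch covers the common input formats it mentions (already-formatted strings, ISO dates, Unix timestamps). The function names are illustrative, not part of the workflow.

```python
from datetime import datetime, timezone

# Normalize common date inputs to the yyyyMMdd format SAP B1 requires.
def to_sap_date(value) -> str:
    if isinstance(value, (int, float)):                 # Unix timestamp (seconds)
        return datetime.fromtimestamp(value, tz=timezone.utc).strftime("%Y%m%d")
    text = str(value).strip()
    if len(text) == 8 and text.isdigit():               # already yyyyMMdd
        return text
    # ISO 8601 date or datetime (Z suffix handled for older Pythons).
    return datetime.fromisoformat(text.replace("Z", "+00:00")).strftime("%Y%m%d")

# Build the body for POST /SBOBobService_SetCurrencyRate, as shown above.
def currency_rate_payload(currency: str, rate: str, rate_date) -> dict:
    return {"Currency": currency, "Rate": rate, "RateDate": to_sap_date(rate_date)}
```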