by Lakindu Siriwardana
📄 Automated Lease Renewal Offer by Email

✅ Features
- Automated lease offer generation using AI (Ollama model)
- Duplicate file check to avoid reprocessing the same customer
- Personalized offer letter creation based on customer details from Supabase
- PDF/text file conversion for formatted output
- Automatic Google Drive management for storing and retrieving files
- Email sending with the generated offer letter attached
- Seamless integration with Supabase, Google Drive, Gmail, and an AI LLM

⚙️ How It Works
1. Trigger: the workflow starts on form submission with customer details.
2. Customer lookup: searches Supabase for customer data and updates customer information if needed.
3. File search & duplication check: looks for existing lease offer files in Google Drive. If a duplicate is found, the old file is deleted before proceeding.
4. AI lease offer creation: uses the LLM Chain (offerLetter) to generate a customized lease renewal letter.
5. File conversion: converts the AI-generated text into a downloadable file format.
6. Upload to Drive: saves the new lease offer in Google Drive.
7. Email preparation: uses the Basic LLM Chain-email to draft the email body, then downloads the offer file from Drive and attaches it.
8. Email sending: sends the renewal offer email via Gmail to the customer.

🛠 Setup Steps
- Supabase connection: add Supabase credentials in n8n and ensure a customers table exists with the relevant columns.

🔜 Future Steps
- Add a specific letter template (organization template).
- PDF offer letter.
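The duplicate-check step can be sketched as a small Code-node snippet. This is an illustrative sketch, not the template's actual node code; the file-naming convention `lease-offer-<customerId>.pdf` is an assumption made for the example.

```javascript
// Sketch: decide whether a lease offer for this customer already exists
// among Google Drive search results, so the old file can be deleted first.
// The naming convention below is hypothetical.
function findDuplicateOffer(driveFiles, customerId) {
  const expectedName = `lease-offer-${customerId}.pdf`;
  return driveFiles.find((f) => f.name === expectedName) || null;
}

const files = [
  { id: "a1", name: "lease-offer-1001.pdf" },
  { id: "b2", name: "lease-offer-1002.pdf" },
];

const dup = findDuplicateOffer(files, "1001");
console.log(dup ? `delete ${dup.id} before regenerating` : "no duplicate");
```

In the real workflow the Drive search and deletion are handled by dedicated Google Drive nodes; this only shows the matching logic.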
by Jimleuk
This n8n template uses a Telegram chatbot to conduct a Product Satisfaction Survey, fetching questions from and storing answers in a Google Sheet. It augments an AI Agent to ask follow-up questions to engage the user and uncover more insights in their responses.

This template is intended to demonstrate how you'd realistically approach a workflow where there is structured conversation (static questions) but you still want to include a free-form element (follow-up questions), which can only be accomplished via AI.

Check out example survey results: https://docs.google.com/spreadsheets/d/e/2PACX-1vQWcREg75CzbZd8loVI12s-DzSTj3NE_02cOCpAh7umj0urazzYCfzPpYvvh7jqICWZteDTALzBO46i/pubhtml?gid=0&single=true

How it works
- A chat session is started with the user, who needs to enter the bot command "/next" to start the survey.
- Once started, the template pulls in questions from a Google Sheet to ask the user. Questions are asked in sequence from the left column to the right column.
- When the user answers a question, a text classifier node determines whether a follow-up question could be asked. If so, a mini conversation is initiated by the AI agent to get more details. If not, the survey proceeds to the next question.
- All answers and mini-conversations are recorded in the Google Sheet under the respective question.
- When all questions are answered, the template stops the survey and gives the user a chance to restart.

How to use
- You'll need to set up a Telegram bot (see docs).
- Create a Google Sheet with an ID column. Populate the rest of the columns with your survey questions (see sample).
- Ensure you have a Redis instance to capture state. Either self-host or sign up to Upstash for a free account.
- Update the "Set Variable" node with your Google Sheet ID and survey title.
- Share your bot to allow others to participate in your survey.
Requirements
- Telegram for the chatbot
- Google Sheets for survey questions and answers
- Redis for state management and chat memory
- Community+ license and above for the Execution Data node – you can remove this node if you don't have this license.

Customising this workflow
- Not using Telegram? This template technically works with other chat apps such as WhatsApp, WeChat, and even n8n's hosted chat!
- This state management pattern can also be applied to other use cases and scenarios. Try it for other types of surveys!
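The session-state pattern described above can be sketched in a few lines. Redis holds the state in the real workflow; a `Map` stands in here so the example is self-contained, and the key format and question list are illustrative, not the template's actual schema.

```javascript
// Sketch of per-chat survey state: which question the user is on.
const store = new Map(); // Redis stand-in: key -> value

const questions = ["How satisfied are you?", "What could we improve?"];

function nextQuestion(chatId) {
  const key = `survey:${chatId}:index`; // hypothetical key format
  const idx = Number(store.get(key) ?? 0);
  if (idx >= questions.length) return null; // survey finished -> offer restart
  store.set(key, idx + 1);
  return questions[idx];
}

console.log(nextQuestion("42")); // first question
console.log(nextQuestion("42")); // second question
console.log(nextQuestion("42")); // null -> survey complete
```

Because the state lives outside the workflow execution, each incoming Telegram message can resume the survey exactly where the user left off.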
by Vishal Kumar
Trigger
The workflow runs when a GitLab Merge Request (MR) is created or updated.

Extract & Analyze
It retrieves the code diff and sends it to Claude AI or GPT-4o for risk assessment and issue detection.

Generate Report
The AI produces a structured summary with:
- Risk levels
- Identified issues
- Recommendations
- Test cases

Notify Developers
The report is:
- Emailed to developers and QA teams
- Posted as a comment on the GitLab MR

Setup Guide
1. Connect GitLab: add GitLab API credentials and select repositories to track.
2. Configure AI analysis: enter an Anthropic (Claude) or OpenAI (GPT-4o) API key.
3. Set up notifications: add Gmail credentials and update the email distribution list.
4. Test & automate: create a test MR to verify analysis and email delivery.

Key Benefits
- **Automated Code Review** – AI-driven risk assessment and recommendations
- **Security & Compliance** – Identifies vulnerabilities before code is merged
- **Integration with GitLab CI/CD** – Works within existing DevOps workflows
- **Improved Collaboration** – Keeps developers and QA teams informed

Developed by Quantana, an AI-powered automation and software development company.
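The "Extract & Analyze" step amounts to turning the MR diff into a structured prompt. A minimal sketch, assuming the change objects use the `old_path`/`new_path`/`diff` field names from GitLab's merge request changes API; the prompt wording itself is an assumption, not the template's actual prompt.

```javascript
// Sketch: build a structured review prompt from GitLab MR change entries.
function buildReviewPrompt(changes) {
  const diffText = changes
    .map((c) => `--- ${c.old_path}\n+++ ${c.new_path}\n${c.diff}`)
    .join("\n");
  return [
    "Review the following merge request diff.",
    "Return: risk level, identified issues, recommendations, test cases.",
    "",
    diffText,
  ].join("\n");
}

const prompt = buildReviewPrompt([
  { old_path: "app.js", new_path: "app.js", diff: "@@ -1 +1 @@\n-var x=1\n+let x = 1;" },
]);
console.log(prompt.split("\n")[0]);
```

The resulting string is what gets sent to Claude or GPT-4o; asking for a fixed set of sections (risk level, issues, recommendations, test cases) is what makes the report structured.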
by Joseph LePage
Compare Local Ollama Vision Models for Image Analysis using Google Docs

Process images using locally hosted Ollama vision models to extract detailed descriptions, contextual insights, and structured data. Save results directly to Google Docs for efficient collaboration.

Who is this for?
This workflow is ideal for developers, data analysts, marketers, and AI enthusiasts who need to process and analyze images using locally hosted Ollama vision language models. It's particularly useful for tasks requiring detailed image descriptions, contextual analysis, and structured data extraction.

What problem is this workflow solving? / Use case
The workflow solves the challenge of extracting meaningful insights from images in exhaustive detail, such as identifying objects, analyzing spatial relationships, extracting textual elements, and providing contextual information. This is especially helpful for applications in real estate, marketing, engineering, and research.

What this workflow does
- Downloads an image file from Google Drive.
- Processes the image using multiple Ollama vision models (e.g., Granite3.2-Vision, Gemma3, Llama3.2-Vision).
- Generates detailed markdown-based descriptions of the image.
- Saves the output to a Google Docs file for easy sharing and further analysis.

Setup
- Ensure you have access to a local instance of Ollama: https://ollama.com/
- Pull the Ollama vision models.
- Configure your Google Drive and Google Docs credentials in n8n.
- Provide the image file ID from Google Drive in the designated node.
- Update the list of Ollama vision models.
- Test the workflow by clicking 'Test Workflow' to trigger the process.

How to customize this workflow to your needs
- Replace the image source with another provider if needed (e.g., AWS S3 or Dropbox).
- Modify the prompts in the "General Image Prompt" node to suit specific analysis requirements.
- Add additional nodes for post-processing or integrating results into other platforms like Slack or HubSpot.
Key Features
- **Detailed Image Analysis**: Extracts comprehensive details about objects, spatial relationships, text elements, and contextual settings.
- **Multi-Model Support**: Utilizes multiple vision models dynamically for optimal performance.
- **Markdown Output**: Formats results in markdown for easy readability and documentation.
- **Google Drive Integration**: Seamlessly downloads images and saves results to Google Docs.
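For reference, calling a locally hosted Ollama vision model boils down to a POST to its generate endpoint with the image supplied as base64. Treat this as a sketch: the model name and prompt are placeholders, and the n8n template uses its own Ollama nodes rather than raw HTTP.

```javascript
// Sketch: request body for Ollama's /api/generate with an image attached.
function buildOllamaVisionRequest(model, prompt, imageBase64) {
  return {
    model,                 // any pulled vision model, e.g. "llama3.2-vision"
    prompt,
    images: [imageBase64], // base64-encoded image bytes, no data: prefix
    stream: false,         // return one complete response
  };
}

const body = buildOllamaVisionRequest(
  "llama3.2-vision",
  "Describe this image in exhaustive detail as markdown.",
  "iVBORw0KGgo..." // truncated placeholder for real base64 data
);
console.log(body.model, body.images.length);
// send with: fetch("http://localhost:11434/api/generate",
//   { method: "POST", body: JSON.stringify(body) })
```

Iterating this request over a list of model names is how the "multi-model" comparison works: same image, same prompt, one response per model.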
by Jaruphat J.
Overview
This workflow automatically saves files received via the LINE Messaging API into Google Drive and logs the file details into a Google Sheet. It checks the file type against allowed types, organizes files into date-based folders and (optionally) file type–specific subfolders, and sends a reply message back to the LINE user with the file URL, or an error message if the file type is not permitted.

Who is this for?
- Developers & IT administrators: looking to integrate LINE with Google Drive and Sheets for automated file management.
- Businesses & marketing teams: that want to automatically archive media files and documents received from users via LINE.
- Anyone interested in no-code automation: users who want to leverage n8n's capabilities without heavy coding.

What Problem Does This Workflow Solve?
- Automated file organization: files received from LINE are automatically checked for allowed file types, then stored in a structured folder hierarchy in Google Drive (by date and/or file type).
- Data logging: each file upload is recorded in a Google Sheet, providing an audit trail with file names, upload dates, URLs, and types.
- Instant feedback: users receive an immediate reply via LINE confirming the file upload, or an error message if the file type is not allowed.

What This Workflow Does
1. Receives incoming requests: a webhook node ("LINE Webhook Listener") listens for POST requests from LINE, capturing file upload events and associated metadata.
2. Configuration loading: a Google Sheets node ("Get Config") reads configuration data (e.g., parent folder ID, allowed file types, folder organization settings, and credentials) from a pre-defined sheet.
3. Data merging & processing: the "Merge Event and Config Data" and "Process Event and Config Data" nodes merge and structure the event data with configuration settings. A "Determine Folder Info" node calculates folder names based on the configuration.
If Store by Date is enabled, it uses the current date (or a specified date) as the folder name. If Store by File Type is also enabled, it uses the file's type (e.g., image) for a subfolder.
4. Folder search & creation: the workflow searches for an existing date folder ("Search Date Folder"). If the date folder is not found, an IF node ("Check Existing Date Folder") routes to a "Create Date Folder" node. Similarly, for file type organization, the workflow uses a "Search FileType Folder" node (with appropriate conditions) to look for a subfolder, or creates it if not found. The "Set Date Folder ID" and "Set Image Folder ID" nodes capture and merge the resulting folder IDs. Finally, the "Config final ParentId" node sets the final target folder ID based on the configuration conditions:
   - Store by Date: TRUE, Store by File Type: TRUE: use the file type folder (inside the date folder).
   - Store by Date: TRUE, Store by File Type: FALSE: use the date folder.
   - Store by Date: FALSE, Store by File Type: TRUE: use the file type folder.
   - Store by Date: FALSE, Store by File Type: FALSE: use the Parent Folder ID from the configuration.
5. File retrieval and validation: an HTTP Request node ("Get File Binary Content") fetches the file's binary data from the LINE API. A Function node ("Validate File Type") checks whether the file's MIME type is included in the allowed list (e.g., "audio|image|video"). If not, it throws an error that is captured for the reply.
6. File upload and logging: the "Upload File to Google Drive" node uploads the validated binary file to the final target folder. After a successful upload, the "Log File Details to Google Sheet" node logs details such as file name, upload date, Google Drive URL, and file type into a designated Google Sheet.
7. User feedback: the "Check Reply Enabled Flag" node checks whether the reply feature is enabled.
Finally, the "Send LINE Reply Message" node sends a reply message back to the LINE user with either the file URL (if the upload was successful) or an error message (if the file type was not allowed).

Setup Instructions
1. Google Sheets setup: create a Google Sheet with two sheets:
   - config: include columns for Parent Folder Path, Parent Folder ID, Store by Date (boolean), Store by File Type (boolean), Allow File Types (e.g., "audio|image|video"), CurrentDate, Reply Enabled, and CHANNEL ACCESS TOKEN.
   - fileList: create headers for File Name, Date Uploaded, Google Drive URL, and File Type.
   For an example of the required format, check this Google Sheets template: Google Sheet Template
2. Google Drive credentials: set up and authorize your Google Drive credentials in n8n.
3. LINE Messaging API: configure your LINE Developer Console webhook to point to the n8n webhook URL ("Line Chat Bot" node). Ensure the proper Channel Access Token is stored in your Google Sheet.
4. n8n workflow import: import the provided JSON file into your n8n instance. Verify node connections and update any credential references as needed.
5. Test the workflow: send a test message via LINE to confirm that files are properly validated, uploaded, and logged, and that reply messages are sent.

How to Customize This Workflow
- Allowed file types: adjust the "Validate File Type" field in your config sheet to control which file types are accepted.
- Folder structure: modify the logic in the "Determine Folder Info" and subsequent folder nodes to change how folders are structured (e.g., use different date formats or add additional categorization).
- Logging: update the "Log File Details to Google Sheet" node if you wish to log additional file metadata.
- Reply messages: customize the reply text in the "Send LINE Reply Message" node to include more detailed information or instructions.
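The "Validate File Type" Function node can be sketched as below. The `"audio|image|video"` allow-list format comes from the config sheet described above; the exact node code is not shown in the template description, so this is an illustrative reconstruction.

```javascript
// Sketch: reject files whose MIME category is not in the configured allow-list.
function validateFileType(mimeType, allowTypes) {
  const allowed = allowTypes.split("|");      // "audio|image|video" -> ["audio","image","video"]
  const category = mimeType.split("/")[0];    // "image/jpeg" -> "image"
  if (!allowed.includes(category)) {
    // In the workflow this error is caught and turned into the LINE reply.
    throw new Error(`File type not allowed: ${mimeType}`);
  }
  return category;
}

console.log(validateFileType("image/jpeg", "audio|image|video")); // "image"
```

Throwing rather than returning a flag matches the description: the error message is what gets forwarded to the user when an unsupported file arrives.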
by lin@davoy.tech
This workflow template, "Chinese Translator via Line x OpenRouter (Text & Image)", is designed to provide seamless Chinese translation services directly within the LINE messaging platform. By integrating with OpenRouter.ai and advanced language models like Qwen, this workflow translates text or images containing Chinese characters into pinyin and English translations, making it an invaluable tool for language learners, travelers, and businesses operating in multilingual environments.

This template is ideal for:
- Language learners: who want to practice Chinese by receiving instant translations of text or images.
- Travelers: looking for quick translations of Chinese signs, menus, or documents while abroad.
- Educators: teaching Chinese language courses and needing tools to assist students with translations.
- Businesses: operating in multilingual markets and requiring efficient communication tools.
- Automation enthusiasts: seeking to build intelligent chatbots that can handle language translation tasks.

What Problem Does This Workflow Solve?
Translating Chinese text or images into English and pinyin can be challenging, especially for beginners or those without access to reliable translation tools. This workflow solves that problem by:
- Automatically detecting and translating text or images containing Chinese characters.
- Providing accurate translations in both pinyin and English for better comprehension.
- Supporting multiple input formats (text, images) to cater to diverse user needs.
- Sending replies directly to users via the LINE messaging platform, ensuring accessibility and ease of use.

What This Workflow Does
1. Receive messages via LINE webhook: the workflow is triggered when a user sends a message (text, image, or another type) to the LINE bot.
2. Display loading animation: a loading animation is displayed to reassure the user that their request is being processed.
3. Route input types: the workflow uses a Switch node to determine the type of input (text, image, or unsupported formats). If the input is text, it is sent to the OpenRouter.ai API for translation. If the input is an image, the workflow extracts the image content, converts it to base64, and sends it to the API for translation. Unsupported formats trigger a polite response indicating the limitation.
4. Translate content using OpenRouter.ai: the workflow leverages Qwen models from OpenRouter.ai to generate translations. For text inputs, it provides Chinese characters, pinyin, and English translations. For images, it extracts and translates the content using the qwen-VL model, which accepts image inputs.
5. Reply with translations: the translated content is formatted and sent back to the user via the LINE Reply API.

Setup Guide

Pre-Requisites
- Access to the LINE Developers Console to configure your webhook and channel access token.
- An OpenRouter.ai account with credentials to access Qwen models.
- Basic knowledge of APIs, webhooks, and JSON formatting.

Step-by-Step Setup
1. Configure the LINE webhook: go to the LINE Developers Console and set up a webhook to receive incoming messages. Copy the webhook URL from the Line Webhook node and paste it into the LINE Console. Remove any "test" configurations when moving to production.
2. Set up OpenRouter.ai: create an account on OpenRouter.ai and obtain your API credentials. Connect your credentials to the OpenRouter nodes in the workflow.
3. Test the workflow: simulate sending text or images to the LINE bot to verify that translations are processed and replied correctly.

How to Customize This Workflow to Your Needs
- Add more languages: extend the workflow to support additional languages by modifying the API calls.
- Enhance image processing: integrate more advanced OCR tools to improve text extraction from complex images.
- Customize responses: modify the reply format to include additional details, such as grammar explanations or cultural context.
- Expand use cases: adapt the workflow for specific industries, such as tourism or e-commerce, by tailoring the translations to relevant vocabulary.

Why Use This Template?
- Real-time translation: provides instant translations of text and images, improving user experience and accessibility.
- Multimodal support: handles both text and image inputs, catering to diverse user needs.
- Scalable: easily integrates into existing systems or scales to support multiple users and workflows.
- Customizable: tailor the workflow to suit your specific audience or industry requirements.
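The image-to-base64 conversion mentioned in the routing step can be sketched in one helper. `Buffer` is Node.js, which is what n8n Code nodes run on; the data-URL format shown is a common way to pass images to multimodal APIs, though the template's exact payload shape is not specified.

```javascript
// Sketch: turn downloaded image bytes into a base64 data URL for a
// multimodal model call.
function toBase64DataUrl(bytes, mimeType) {
  const b64 = Buffer.from(bytes).toString("base64");
  return `data:${mimeType};base64,${b64}`;
}

const fakeImageBytes = new Uint8Array([0xff, 0xd8, 0xff]); // JPEG magic bytes
console.log(toBase64DataUrl(fakeImageBytes, "image/jpeg"));
// -> "data:image/jpeg;base64,/9j/"
```

In the real workflow the bytes come from the LINE content API for the received message; everything after that is string handling.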
by Abdullah Maftah
Auto Source LinkedIn Candidates with GPT-4 Boolean Search & Google X-ray

How It Works
1. User input: the user pastes a job description or ideal candidate specifications into the workflow.
2. Boolean search string generation: OpenAI processes the input and generates a precise LinkedIn Boolean search string formatted as: site:linkedin.com/in ("Job Title" AND "Skill1" AND "Skill2"). This search string is optimized to find relevant LinkedIn profiles matching the provided criteria.
3. Google Sheet creation: a new Google Sheet is automatically created within a specified document to store extracted LinkedIn profile URLs.
4. Google search execution: the workflow sends a search request to Google using an HTTP node with the generated Boolean string.
5. Iterative search & data extraction: the workflow retrieves the first 10 results from Google. If the desired number of LinkedIn profiles has not been reached, the workflow loops, fetching the next set of 10 results until the If condition is met.
6. Data storage: the workflow extracts LinkedIn profile URLs from the search results and saves them to the newly created Google Sheet for further review.

Setup Steps
1. API key configuration: under "Credentials", add your OpenAI API key from your OpenAI account settings. This key is used to generate the LinkedIn Boolean search string.
2. Adjust search parameters: navigate to the "If" node and update the condition to define the desired number of LinkedIn profiles to extract. The default is 50, but you can set it to any number based on your needs.
3. Establish a Google Sheets connection: connect your Google Sheets account to the workflow and create a document to store the sourced LinkedIn profiles. The workflow automatically creates a new sheet for each new search, so no manual setup is needed.
4. Authenticate Google Search: Google search requires authentication for better results.
Use the Cookie-Editor browser extension to export your header string and enable authenticated Google searches within the workflow.
5. Run the workflow: execute the workflow and monitor the Google Sheet for newly added LinkedIn profiles.

Benefits
✅ Automates profile sourcing, reducing manual search time.
✅ Generates precise LinkedIn Boolean search strings tailored to job descriptions.
✅ Extracts and saves LinkedIn profiles efficiently for recruitment efforts.

This solution leverages OpenAI and advanced search techniques to enhance your talent sourcing process, making it faster and more accurate! 🚀
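The URL-extraction step can be illustrated with a simple pattern match. The template itself parses results differently (inside the workflow's extraction logic), so treat this as a sketch of the idea rather than the actual implementation.

```javascript
// Sketch: pull unique LinkedIn profile URLs out of a page of Google results.
function extractLinkedInProfiles(html) {
  const re = /https:\/\/[a-z]{0,3}\.?linkedin\.com\/in\/[A-Za-z0-9\-_%]+/g;
  return [...new Set(html.match(re) || [])]; // Set removes duplicate hits
}

const sample =
  '<a href="https://www.linkedin.com/in/jane-doe-123">Jane Doe</a> ' +
  '<a href="https://www.linkedin.com/in/jane-doe-123">dup</a>';
console.log(extractLinkedInProfiles(sample));
// -> ["https://www.linkedin.com/in/jane-doe-123"]
```

Deduplicating before writing to the sheet matters because the paginated loop (10 results at a time) can surface the same profile more than once.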
by Carlos Contreras
Introduction
This workflow is designed to create and attach notes or comments to any record in your Odoo instance. It acts as a sub-workflow that can be triggered by a main workflow to log messages or comments in a centralized manner. By leveraging the powerful Odoo API, this template ensures that updates to records are handled efficiently, providing an organized way to document important information related to your business processes.

Setup Instructions
1. Import the workflow: import the provided JSON file into your n8n instance.
2. Odoo credentials: ensure you have valid Odoo API credentials (e.g., "Roodsys Odoo Automation Account") configured in n8n.
3. Node configuration: verify that the "Odoo" node (consider renaming it to "Odoo Record Manager" for clarity) is set up with your server details and authentication parameters. Check that the workflow trigger ("When Executed by Another Workflow") is configured to receive input parameters from the parent workflow.
4. Execution trigger: this workflow is designed to be initiated by another workflow. Make sure the main workflow supplies the required inputs.

Workflow Details
- Trigger node: the workflow begins with the "When Executed by Another Workflow" node, which accepts three inputs:
  - rec_id: a numeric identifier for the Odoo record.
  - message: the text of the comment or note.
  - model: the specific Odoo model (e.g., rs.deployment.action.log) where the note should be attached.
- Odoo node: the second node in the workflow calls the Odoo API to create a new log message. It maps the inputs as follows: message_type is set to "comment"; model is assigned the provided model name; res_id is assigned the record ID (rec_id); body is assigned the message content.
- Additional information: a sticky note node is included to provide a brief overview of the workflow's purpose directly within the interface.

Input Parameters
- Record ID (rec_id): the unique identifier of the record in Odoo where the note will be added.
- Message (message): the content of the comment or note to be logged.
- Model (model): the Odoo model name indicating the context in which the note should be created (e.g., rs.deployment.action.log).

Usage Examples
- Internal logging: use the workflow to attach internal comments or logs to specific records, such as customer profiles, orders, or deployment logs.
- Audit trails: create a comprehensive audit trail by documenting changes or important events in Odoo records.
- Integration with other workflows: link this workflow with other automation processes in n8n (like email notifications, data synchronization, or reporting) to create a seamless integration across your systems.

Pre-conditions
- The Odoo instance must be accessible and correctly configured.
- API permissions and user roles should be validated to ensure the workflow has the necessary access rights.
- The workflow expects inputs from an external trigger or parent workflow.

Customization & Integration
This template offers several customization options to tailor it to your needs:
- Field customization: modify or add new fields to match your logging or commenting requirements.
- Node renaming: rename nodes for better clarity and consistency within your workflow ecosystem.
- Integration possibilities: easily integrate this workflow with other processes in n8n, such as triggering notifications or synchronizing data across different systems.

This sub-workflow receives data from a main workflow (for example, a record ID, a message, and the Odoo model) and creates a new note (or comment) in the corresponding Odoo record. Essentially, it acts as a centralized point for logging comments or notes in a specific Odoo model, ensuring that the information remains organized and easy to track.

Your model must inherit from `_inherit = ['portal.mixin', 'mail.thread.main.attachment']`.
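The field mapping the Odoo node performs is described above; as a sketch, it is a simple object transform. The field names (`message_type`, `model`, `res_id`, `body`) are the ones the template itself names; the server and credential details live in the n8n Odoo credential and are not shown here.

```javascript
// Sketch: map the sub-workflow's three inputs to the Odoo log-message fields.
function buildOdooNote({ rec_id, message, model }) {
  return {
    message_type: "comment",
    model,           // e.g. "rs.deployment.action.log"
    res_id: rec_id,  // record the note attaches to
    body: message,   // note content
  };
}

console.log(buildOdooNote({ rec_id: 7, message: "Deployed v2", model: "rs.deployment.action.log" }));
```

A parent workflow only needs to pass `rec_id`, `message`, and `model`; everything else is fixed by this mapping.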
by Anna Bui
🎯 Universal Meeting Transcript to LinkedIn Content

Automatically transform your meeting insights into engaging LinkedIn content with AI. Perfect for coaches, consultants, sales professionals, and content creators who want to share valuable insights from their meetings without the manual effort of content creation.

How it works
1. A calendar trigger detects when your coaching/meeting ends.
2. Waits for meeting completion, then sends you a form via email.
3. You provide the meeting transcript and specify post preferences.
4. AI analyzes the transcript using your personal brand guidelines.
5. Generates professional LinkedIn content based on real insights.
6. Creates organized Google Docs with both the transcript and the final post.
7. Sends you links to review and publish your content.

How to use
- Connect your Google Calendar and Gmail accounts.
- Update the calendar filter to match your meeting types.
- Customize the AI prompts with your brand voice and style.
- Replace email addresses with your own.
- Test with a sample meeting transcript.

Requirements
- Google Calendar (for meeting detection)
- Gmail (for form delivery and notifications)
- Google Drive & Docs (for content storage)
- LangChain AI nodes (for content generation)

Good to know
- AI processing may incur costs based on your LangChain provider.
- Works with any meeting platform – just copy/paste transcripts.
- Can be adapted to use webhooks from recording tools like Fireflies.ai.
- Memory nodes store your brand guidelines for consistent output.

Happy Content Creating!
by Miquel Colomer
This n8n workflow template automates the process of finding LinkedIn profiles for a person based on their name and company. It scrapes Google search results via Bright Data, parses the results with GPT-4o-mini, and delivers a personalized follow-up email with insights and suggested outreach steps.

🚀 What It Does
- Accepts a user-submitted form with a person's full name and company.
- Performs a Google search using Bright Data to find LinkedIn profiles and company data.
- Uses GPT-4o-mini to parse HTML results and identify matching profiles.
- Filters and selects the most relevant LinkedIn entry.
- Analyzes the data to generate a buyer persona and follow-up strategy.
- Sends a styled email with insights and outreach steps.

🛠️ Step-by-Step Setup
1. Deploy the form trigger to accept person data (name, position, company).
2. Build a Google search query from user input.
3. Scrape search results using Bright Data.
4. Extract HTML content using the HTML node.
5. Use GPT-4o-mini to parse LinkedIn entries and company insights.
6. Filter for matches based on user input.
7. Merge relevant data and generate personalized outreach content.
8. Send the email to a predefined address.
9. Show a final confirmation message to the user.

🧠 How It Works: Workflow Overview
- **Trigger:** When User Completes Form
- **Search:** Edit Url LinkedIn, Get LinkedIn Entry on Google, Extract Body and Title, Parse Google Results
- **Matching:** Extract Parsed Results, Filter, Limit, IF LinkedIn Profile is Found?
- **Fallback:** Form Not Found if no match
- **Company Lookup:** Edit Company Search, Get Company on Google, Parse Results, Split Out
- **Content Generation:** Merge, Create a Followup for Company and Person
- **Email Delivery:** Send Email, Form Email Sent

📨 Final Output
An HTML-styled email (using Tailwind CSS) with:
- Matched LinkedIn profile
- Company insights
- Persona-based outreach strategy

🔐 Credentials Used
- **BrightData account** for scraping Google search results
- **OpenAI account** for GPT-4o-mini-powered parsing and content generation
- **SMTP account** for sending follow-up emails

❓ Questions?
Template and node created by Miquel Colomer and n8nhackers. Need help customizing or deploying? Contact us for consulting and support.
by Luke
Automatically backs up your workflows to GitHub and generates documentation in a Notion database.

- Runs weekly, using the "internal-infra" tag to look for new or recently modified workflows
- Uses a Notion database page to hold the workflow summary, last updated date, and a link to the workflow
- Uses OpenAI's 4o-mini to generate a summarization of what the workflow does
- Stores a backup of the workflow in GitHub (a private repo is recommended)
- Sends a notification to a Slack channel for new or updated workflows

Who is this for
- Anyone seeking backup of their most important workflows
- Anyone seeking version control for their most important workflows

Credentials required
- n8n: you will need an n8n credential created so the workflow can query the n8n instance to find all active workflows with the "internal-infra" tag
- Notion: you will need a Notion credential created
- OpenAI: you will need an OpenAI credential, unless you intend on rewiring this with your AI of choice (Ollama, OpenRouter, etc.)
- GitHub: you will need a GitHub credential
- Slack: you will need a Slack credential; a bot / access token configuration is recommended

Setup

Notion
Create a database with the following columns. The column type is specified in [type].
- Workflow Name [text]
- isActive (dev) [checkbox]
- Error workflow setup [checkbox]
- AI Summary [text]
- Record last update [date/time]
- URL (dev) [text/url]
- Workflow created at [date/time]
- Workflow updated at [date/time]

Slack
Create a channel for updates to be posted into.

GitHub
Create a private repo for your workflows to be exported into.

N8N
1. Download & install the template.
2. Configure the blocks to use your own n8n, Notion, OpenAI & Slack credentials.
3. Edit the "Set Fields" block and change the URL to that of your n8n instance (cloud or self-hosted).
4. Edit the "Add to Notion" action and specify the database page you wish to update.
5. Edit the Slack actions to specify the channel you want Slack notifications posted to.
6. Edit the GitHub actions to specify the Repository Owner & Repository Name.

Sample output in Notion

Workflow diagram
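The tag-based lookup the workflow performs maps onto n8n's public REST API. A minimal sketch, assuming the `GET /api/v1/workflows?tags=...` endpoint and `X-N8N-API-KEY` header of n8n's public API; the template itself uses the built-in n8n node, which wraps the same call. The instance URL and key below are placeholders.

```javascript
// Sketch: request description for listing workflows that carry a given tag.
function buildWorkflowListRequest(baseUrl, tag) {
  return {
    url: `${baseUrl}/api/v1/workflows?tags=${encodeURIComponent(tag)}`,
    headers: { "X-N8N-API-KEY": "<your-api-key>" }, // placeholder
  };
}

const req = buildWorkflowListRequest("https://n8n.example.com", "internal-infra");
console.log(req.url);
// -> "https://n8n.example.com/api/v1/workflows?tags=internal-infra"
```

The returned workflow JSON is what gets summarized by OpenAI and committed to GitHub as the backup artifact.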
by Jimleuk
This n8n template demonstrates an approach to performing bot-to-human handoff using human-in-the-loop functionality as a switch.

In this experiment, we play with the idea of states we want our agent to be in, which control its interaction with the user.
- **First state** - the agent onboards the user by collecting their details for a sales inquiry. After which, they are handed off / transferred to a human to continue the call.
- **Second state** - the agent is essentially "deactivated", as further messages to the bot will not reach it. Instead, a canned response is given to the user. The human agent must "reactivate" the bot by completing the human-in-the-loop form and giving a summary of their conversation with the user.
- **Third state** - the agent is "reactivated" with context of the human-to-user conversation and is set to provide after-sales assistance. A tool is made available to the agent to again delegate back to the human agent when requested.

How it works
- This template uses Telegram to handle the interaction between the user and the agent.
- Each user message is checked for a session state to ensure it is guided to the right stage of the conversation. For this, we can use Redis as a simple key-value store.
- When no state is set, the user is directed through an onboarding step to attain their details. Once complete, the agent will "transfer" the user to a human agent - technically, all this involves is an update to the session state and a message to another chat forwarding the user's details.
- During this "human" state, the agent cannot reply to the user and must wait until the human "transfers" the conversation back. The human can do this by replying to the "human-in-the-loop" message with a summary of their conversation with the user.
- The session state now changes to "bot" and the context is implanted in the agent's memory so that the agent can respond to future questions.
At this stage of the conversation, the agent is now expected to handle and help the user with after-sales questions. The user can at any time request a transfer back to the human agent, repeating the previous steps as necessary.

How to use
- Plan your user journey! Here is a very basic example of a sales inquiry with at most 3 states. More thought should go into designs where many more states are involved.
- You may want to better log and manage session states so no user is left in limbo. Try connecting the users and sessions to your CRM.
- Note, the onboarding agent and after-sales agent have separate chat memories. When adding more agents, it is recommended to continue using separate chat memories to help focus between states.

Requirements
- Telegram for the chatbot & interface
- Redis for the session store and chat memory
- OpenAI for the AI agent

Customising this workflow
Not using Telegram? This template works with WhatsApp and other services with equivalent functionality.
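The three-state routing above can be sketched as a single switch over the session state, with a `Map` standing in for Redis so the example is self-contained. The state names ("onboarding", "human", "bot") and route labels are illustrative, not the template's exact values.

```javascript
// Sketch: route an incoming message based on the chat's session state.
const sessions = new Map(); // Redis stand-in: chatId -> state

function routeMessage(chatId) {
  const state = sessions.get(chatId) ?? "onboarding"; // no state yet -> onboard
  switch (state) {
    case "onboarding":
      return "onboarding-agent";  // collect details, then hand off to human
    case "human":
      return "canned-response";   // bot deactivated; human owns the chat
    case "bot":
      return "after-sales-agent"; // reactivated with the human's summary
  }
}

sessions.set("42", "human");
console.log(routeMessage("42")); // "canned-response"
console.log(routeMessage("99")); // "onboarding-agent"
```

Handoff and reactivation are then just writes to the store: the bot sets the state to "human" when transferring, and the human's form submission sets it back to "bot".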