by Alejandro AR
This simple workflow will fetch a YouTube playlist every n minutes and send the new items to a collection in Raindrop. You can connect any application at the end of the flow. Make sure to authenticate to YouTube using Google Auth, and to Raindrop using API credentials. Update the Playlist ID and the Raindrop collection.
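For reference, the Raindrop step boils down to a call like the minimal standalone sketch below (the token, collection ID, and the saveVideo helper are placeholders for illustration, not part of the template):

```javascript
// Hypothetical sketch of the call made for each new playlist item.
const RAINDROP_TOKEN = "your-raindrop-api-token"; // placeholder: create a token in Raindrop settings
const COLLECTION_ID = 12345678;                   // placeholder: your target collection id

async function saveVideo(videoId, title) {
  const res = await fetch("https://api.raindrop.io/rest/v1/raindrop", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${RAINDROP_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      link: `https://www.youtube.com/watch?v=${videoId}`,
      title,
      collection: { $id: COLLECTION_ID },
    }),
  });
  return res.json();
}

// saveVideo("dQw4w9WgXcQ", "Example video").then(console.log);
```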
by Sirisak Chantanate
Workflow Overview: Extracting text from images with AI is worthwhile because it requires no code. The workflow uses the Google Gemini 2.0 Flash model to pull the important text out of an image. Coding this without AI means writing many conditions and risking plenty of bugs; with Google Gemini no coding is needed, and even if the pay slip layout differs, Gemini still extracts the fields automatically.

Workflow description: The user sends a pay slip image or a text message to the chatbot via the Line Messaging API (create a Line Business ID here: Line Business). The workflow classifies the message as image or text. If the message is a pay slip image, it is processed with Gemini 2.0 Flash EXP, which extracts the important information and responds in JSON format, without any coding, using the following prompt: "Analyze image and then return in JSON Response that has the only following value: Status, From, To, Date, Amount". You can obtain a Google AI Studio API key from the following link: Google AI Studio API Key. Create a Google Sheet whose columns (Status, From, To, Date, Amount) match the fields defined in the AI prompt, as in the example. If the message is text, it is handled by the Gemini 2.0 Flash EXP model acting as an AI assistant; if it is an image, the important fields are extracted, a reply is sent to the user, and the data is inserted into Google Sheets.

Key Features: Extract text from images with No Code** Without n8n we would have to write code to extract text from images, but with n8n and Google Gemini 2.0 Flash EXP together no coding is needed, and the workflow handles slips and other documents from any vendor. Multipurpose Chatbot** the chatbot accepts both text and images, so there is no need to create multiple chatbot accounts. Reduce human error** the workflow lets any officer verify the document status when the job ends.

Note: You can change the extracted information by adjusting the prompt and renaming the Google Sheets columns accordingly.
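If you want to post-process Gemini's reply before the Google Sheets insert, a minimal n8n Code node could look like the sketch below (assuming the model's raw text lands in $json.text, which is an assumption about your node's output and should be adapted):

```javascript
// Minimal n8n Code node sketch. Assumption: the Gemini node's raw text reply is
// available as $json.text; adjust the field name to your node's actual output.
const raw = $json.text || "";
// Strip markdown code fences the model sometimes wraps around its JSON
const cleaned = raw.replace(/`{3}(json)?/gi, "").trim();

let slip;
try {
  slip = JSON.parse(cleaned);
} catch (e) {
  slip = { Status: "PARSE_ERROR", From: "", To: "", Date: "", Amount: "" };
}

// Return one item whose keys match the Google Sheets columns
return [{ json: {
  Status: slip.Status ?? "",
  From: slip.From ?? "",
  To: slip.To ?? "",
  Date: slip.Date ?? "",
  Amount: slip.Amount ?? "",
} }];
```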
by bangank36
This workflow retrieves all users from n8n, compares them against entries in a Google Sheets spreadsheet, and automatically creates new users when needed. Once new users are created, invitation emails are sent automatically. You can trigger the workflow manually or set it to run on a schedule to ensure continuous synchronization. Spreadsheet Template This workflow is designed to work with a Google Sheets structure inspired by Squarespace's newsletter block connection. You can modify the node settings to adapt to a different column format. 👉 Clone the sample sheet here Suggested columns: Submitted On Email Address Name Requirements Credentials To use this workflow, you need: n8n API Key – to read and create users in your n8n instance. Google Sheets API credentials – Required to get data from a spreadsheet. Configure Your n8n Instance To make this workflow work with your n8n instance, update the API endpoint: 🔧 Edit Global node 👇 Change n8n_url to match your instance URL: Authentication Guide Explore More Templates 👉 Check out my other n8n templates
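Under the hood, creating a user goes through the n8n Public API. A rough JavaScript equivalent of that call is sketched below (the URL, API key, and role value are placeholders; verify the exact endpoint against your instance's API reference):

```javascript
// Sketch of the user-creation call this workflow makes against the n8n Public API.
const n8n_url = "https://your-instance.example.com"; // same value as in the Edit Global node
const apiKey = "your-n8n-api-key";                    // placeholder

async function inviteUser(email) {
  const res = await fetch(`${n8n_url}/api/v1/users`, {
    method: "POST",
    headers: { "X-N8N-API-KEY": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify([{ email, role: "global:member" }]),
  });
  return res.json(); // newly created users receive an invitation email
}
```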
by PiAPI
What does this workflow do? This workflow converts orthographic three-view drawings into 360° rotation videos through PiAPI's GPT-4o-Image and Kling APIs (unofficial). It can be paired with our 3D Figurine Orthographic Views workflow for generation. Who is the workflow for? Designers**: Turn inspiration into 3D designs and make them spin to reveal concrete details in an efficient way. Online shoppers**: Show potential products from all angles in videos and preview the overall texture of models. Content Creators** (including toy bloggers): Make fun videos of collectible models. Step-by-step Instructions 1. Fill in the basic params with the X-API-Key of your PiAPI account and the 3-view image URL. 2. Click "Test workflow". 3. Get the final video in the last node. Use Case Input Image Output Video
by Sarfaraz Muhammad Sajib
This n8n workflow demonstrates how to build an automated AI chat system using OpenRouter.ai. It includes a manual trigger, sets a model and user message, sends a POST request to the OpenRouter chat API, and summarizes the response. Workflow Steps: Manual Trigger – Starts the workflow when executed manually. Set Node – Defines: Model: mistralai/mistral-small-3.2-24b-instruct:free Message: What is the meaning of life? HTTP Request – Sends a POST request to https://openrouter.ai/api/v1/chat/completions using Bearer Token Authentication with the model and message as JSON. Summarize – Extracts and summarizes the AI’s response (choices[0].message.content). Use Cases: AI chatbot automation Content summarization Testing AI prompts in real-time Educational demos using OpenRouter.ai Lightweight conversational tools with no external server
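The HTTP Request step is the heart of the workflow; a standalone JavaScript sketch of the same call looks roughly like this (the API key and the ask helper are placeholders):

```javascript
const OPENROUTER_API_KEY = "sk-or-..."; // placeholder

async function ask(message) {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENROUTER_API_KEY}`, // Bearer Token Authentication
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "mistralai/mistral-small-3.2-24b-instruct:free",
      messages: [{ role: "user", content: message }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // the field the Summarize step reads
}

// ask("What is the meaning of life?").then(console.log);
```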
by Rajeet Nair
📖 Description 🔹 How it works This workflow introduces an AI + Human-in-the-Loop pipeline for employee timesheet management. It combines the power of Google Drive, AI (OCR + LLM), and Gmail with a human review step to ensure accuracy and compliance. AI-Powered File Discovery Scans a Google Drive folder for new or updated timesheet files (PDF, Word, Excel, Images). AI Data Extraction Uses OCR and LLM (Mistral) to intelligently read and extract structured data. Supports multiple formats: PDF, Word (DOC/DOCX), Excel (XLS/XLSX), and Image files (JPG, PNG, scanned documents). Creates clean JSON with file details and timesheet logs (date, hours worked, tasks, notes). Smart Data Formatting Converts AI output into a clear HTML summary table for easy review. Flags potential anomalies (missing hours, duplicate dates, irregular entries). Human-in-the-Loop Verification Sends an approval email via Gmail containing: File metadata AI-generated HTML summary JSON attachment of raw extracted data HR/Managers review the summary and approve/reject before final actions occur. Post-Approval Automation (optional) Approved records can be saved in a separate Google Drive folder. Employees or HR receive confirmation emails. ⚙️ Set up steps Connect Credentials Add Google Drive and Gmail credentials in n8n. Configure Mistral (or any LLM) API credentials. Configure Google Drive In the “Search files and folders” node, replace the folderId with your company’s timesheet folder ID. Customize Extraction Schema Sticky notes explain how JSON output is structured. Adapt it for your organization’s needs (e.g., overtime, project codes). Set Up Human Verification Emails Update Gmail node recipients to your HR or approval team. Customize the email body (AI summary + JSON file attached). Activate & Test Enable the workflow. Upload a sample timesheet to trigger the AI + human verification loop. ⚡ Result: A robust AI + Human-in-the-Loop workflow that reduces repetitive data entry, prevents payroll errors, and gives HR full confidence before final approval.
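As an illustration of the anomaly-flagging step, here is a minimal Code node sketch (the entries, date, and hours field names are assumptions about the extracted JSON shape and should be adapted to your schema):

```javascript
// Flag missing hours and duplicate dates in the AI-extracted timesheet JSON.
const entries = $json.entries || [];
const seenDates = new Set();
const anomalies = [];

for (const e of entries) {
  if (!e.hours || Number(e.hours) <= 0) {
    anomalies.push({ date: e.date, issue: "missing or zero hours" });
  }
  if (seenDates.has(e.date)) {
    anomalies.push({ date: e.date, issue: "duplicate date" });
  }
  seenDates.add(e.date);
}

// Pass the original data through with the review flags attached
return [{ json: { ...$json, anomalies, needsReview: anomalies.length > 0 } }];
```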
by Davide
This workflow dynamically chooses between two new powerful Anthropic models — Claude Opus 4 and Claude Sonnet 4 — to handle user queries, based on their complexity and nature, maintaining scalability and context awareness with Anthropic web search function and Think tool. Key Advantages 🔁 Dynamic Model Selection Automatically routes each user query to either Claude Sonnet 4 (for routine tasks) or Claude Opus 4 (for complex reasoning), ensuring optimal performance and cost-efficiency. 🧠 AI Agent with Tool Use The AI agent can utilize a web search tool to retrieve up-to-date information and a Think tool for complex reasoning processes — improving response quality. 📎 Memory Integration Uses session-based memory to maintain conversational context, making interactions more coherent and human-like. 🧮 Built-in Calculation Tool Handles numeric queries using an integrated calculator tool, reducing the need for external processing. 📤 Structured Output Parser Ensures outputs are always well-structured and formatted in JSON, which improves consistency and downstream integrations. 🌐 Web Search Capability Supports real-time information retrieval for current events, statistics, or details not available in the AI’s base knowledge. Components Overview Trigger**: Listens for new chat messages. Routing Agent**: Analyzes the message and returns the best model to use. AI Agent**: Handles the conversation, decides when to use tools. Tools**: web_search for internet queries Think for reasoning Calculator for math tasks Models Used**: claude-sonnet-4-20250514: Optimized for general and business logic tasks. claude-opus-4-20250514: Best for deep, strategic, and analytical queries. How It Works Dynamic Model Selection The workflow begins when a chat message is received. The Anthropic Routing Agent analyzes the user's query to determine the most suitable model (either Claude Sonnet 4 or Claude Opus 4) based on the query's complexity and requirements. The routing agent uses predefined criteria to decide: Claude Sonnet 4: Best for standard tasks like real-time workflow routing, data validation, and routine business logic. Claude Opus 4: Reserved for complex scenarios requiring deep reasoning, advanced analysis, or high-impact decisions. Query Processing and Response Generation The selected model processes the query, leveraging tools like web_search for real-time information retrieval, Think for internal reasoning, and Calculator for numerical tasks. The AI Agent coordinates these tools, ensuring the response is accurate and context-aware. A Simple Memory node retains session context for coherent multi-turn conversations. The final response is formatted and returned to the user without intermediate steps or metadata. Set Up Steps Node Configuration Trigger: Configure the "When chat message received" node to handle incoming user queries. Routing Agent: Set up the "Anthropic Routing Agent" with the system message defining model selection logic. Ensure it outputs a JSON object with prompt and model fields. AI Model Nodes: Link the "Sonnet 4 or Opus 4" node to dynamically use the selected model. The "Sonnet 3.7" node powers the routing agent itself. Tool Integration Attach the "web_search" HTTP tool to enable internet searches, ensuring the API endpoint and headers (e.g., anthropic-version) are correctly configured. Connect auxiliary tools (Think, Calculator) to the "AI Agent" for extended functionality. Add the "Simple Memory" node to maintain conversation history. 
Credentials Provide an Anthropic API key to all nodes requiring authentication (e.g., model nodes, web search). Testing Activate the workflow and test with sample queries to verify: Correct model selection (e.g., Sonnet for simple queries, Opus for complex ones). Proper tool usage (e.g., web searches trigger when needed). Memory retention across chat turns. Deployment Once validated, set the workflow to active for live interactions. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
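For clarity, the routing agent's structured output is a small JSON object like the example below (the model IDs are the ones listed above; the prompt text is only illustrative):

```javascript
// Shape of the JSON the routing agent returns to the AI Agent node.
const routingOutput = {
  prompt: "Compare the long-term strategic risks of the two expansion plans.",
  model: "claude-opus-4-20250514" // "claude-sonnet-4-20250514" for routine queries
};
```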
by WeblineIndia
This smart automation workflow, created by the AI development team at WeblineIndia, helps with the daily collection and storage of weather data. Using the OpenWeatherMap API and Airtable, this solution gathers vital weather details such as temperature, humidity, and wind speed. The automation ensures daily updates, creating a dependable historical record of weather patterns for future reference and analysis. Steps: Set Schedule Trigger Configure a Cron node to trigger the workflow daily, for example, at 7 AM. Fetch Weather Data (HTTP Request) Use the HTTP Request node to retrieve weather data from the OpenWeatherMap API. Include your API key and query parameters (e.g., q=London, units=metric) to specify the city and desired units. Parse Weather Data Utilize a JSON Parse node to extract key weather details, such as temperature, humidity, and wind speed, from the API response. Store Data in Airtable Use the Airtable node to insert the parsed data into the designated Airtable table. Ensure proper mapping of fields like temperature, humidity, and wind speed. Save and Execute Save the workflow and activate it to ensure weather data is fetched and stored automatically every day. Outcome This robust solution, developed by WeblineIndia, reliably collects and archives daily weather data, providing businesses and individuals with an accessible record of weather trends for analysis and decision-making. About WeblineIndia We specialize in creating custom automation solutions and innovative software workflows to help businesses streamline operations and achieve efficiency. This weather data fetcher is just one example of our expertise in delivering value through technology.
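The parsing step can be done with a small Code node along these lines (the response fields follow OpenWeatherMap's current-weather endpoint; the output column names are assumptions and should match your Airtable base):

```javascript
// n8n Code node sketch for the "Parse Weather Data" step.
const w = $json; // output of the HTTP Request node

return [{ json: {
  City: w.name,
  Date: new Date(w.dt * 1000).toISOString().slice(0, 10),
  Temperature: w.main?.temp,     // °C when units=metric
  Humidity: w.main?.humidity,    // %
  WindSpeed: w.wind?.speed,      // m/s when units=metric
} }];
```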
by digi-stud.io
Adobe developer API Did you know that Adobe provides an API to perform all sorts of manipulations on PDF files: Split PDF, Combine PDF, OCR, Insert page, delete page, replace page, reorder page, Content extraction (text content, tables, pictures) ... The free tier allows up to 500 PDF operations / month. As it comes directly from Adobe, it often works better than other alternatives. Adobe documentation: https://developer.adobe.com/document-services/docs/overview/pdf-services-api/howtos/ https://developer.adobe.com/document-services/docs/overview/pdf-extract-api/gettingstarted/ What does this workflow do The API is a bit painful to use. To perform a transformation on a PDF it requires you to: Authenticate and get a temporary token, Register a new asset (file), Upload your PDF to the registered asset, Perform a query according to the transformation requested, Wait for the query to be processed by the Adobe backend, Download the result. This workflow is a generic wrapper that performs all these steps for any transformation endpoint. I usually call it from other workflows with an Execute Workflow node. Examples are given in the workflow. Example use case This service is useful, for example, to clean PDF data for an AI / RAG system. My favorite use case is extracting tables as images and forwarding the images to an AI for image recognition / description, which is often more accurate than feeding raw tabular data to an LLM.
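To make the six steps concrete, here is a condensed JavaScript sketch using the PDF Extract operation as an example. The endpoint paths and body fields are paraphrased from Adobe's documentation (linked above), so verify them there before relying on this:

```javascript
const CLIENT_ID = "your-client-id";         // placeholder
const CLIENT_SECRET = "your-client-secret"; // placeholder

async function extractPdf(pdfBuffer) {
  // 1. Authenticate and get a temporary token
  const tokenRes = await fetch("https://ims-na1.adobelogin.com/ims/token/v3", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "client_credentials",
      client_id: CLIENT_ID,
      client_secret: CLIENT_SECRET,
      scope: "openid,AdobeID,DCAPI",
    }),
  });
  const { access_token } = await tokenRes.json();
  const headers = {
    Authorization: `Bearer ${access_token}`,
    "x-api-key": CLIENT_ID,
    "Content-Type": "application/json",
  };

  // 2. Register a new asset
  const assetRes = await fetch("https://pdf-services.adobe.io/assets", {
    method: "POST", headers,
    body: JSON.stringify({ mediaType: "application/pdf" }),
  });
  const { uploadUri, assetID } = await assetRes.json();

  // 3. Upload the PDF to the pre-signed URI
  await fetch(uploadUri, {
    method: "PUT",
    headers: { "Content-Type": "application/pdf" },
    body: pdfBuffer,
  });

  // 4. Request the transformation (here: content extraction)
  const jobRes = await fetch("https://pdf-services.adobe.io/operation/extractpdf", {
    method: "POST", headers,
    body: JSON.stringify({ assetID, elementsToExtract: ["text", "tables"] }),
  });
  const pollUrl = jobRes.headers.get("location");

  // 5. Poll until the Adobe backend has processed the job
  let job;
  do {
    await new Promise(r => setTimeout(r, 2000));
    job = await (await fetch(pollUrl, { headers })).json();
  } while (job.status === "in progress");

  // 6. Download the result (result field names differ per operation)
  return fetch(job.content.downloadUri);
}
```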
by Davide
Workflow Overview This workflow automates the creation and management of a custom OpenAI Assistant for a travel agency ("Travel with us"), leveraging Google Drive for document storage. How It Works 1. Create the OpenAI Assistant Node**: OpenAI Action: Creates a custom assistant named "Travel with us" Assistant using the gpt-4o-mini model. Instructions: Respond only using the provided document (e.g., agency-specific info). Stay friendly, brief, and focused on travel-related queries. Ignore irrelevant questions politely. Credentials: Requires OpenAI API key. 2. Upload Agency Document Google Drive Node**: Action: Downloads a Google Doc as a PDF. OpenAI2 Node**: Action: Uploads the PDF to OpenAI with purpose: "assistants". Output: Generates a file_id. 3. Update the Assistant with the Document OpenAI Node**: Action: Updates the assistant to include the uploaded file. 4. Chat Interaction Chat Trigger**: Activates when a message is received ("When chat message received"). OpenAI Assistant Node**: Action: Uses the updated assistant to respond to user queries. Memory: Window Buffer Memory retains chat context for coherent conversations. Set Up Steps Prepare the Document: Store your travel agency guide in Google Drive (e.g., as a Google Doc). Update the Google Drive node with your document’s ID. Configure Credentials: Google Drive: Connect via OAuth2 (googleDriveOAuth2Api). OpenAI: Add your API key to all OpenAI nodes. Customize the Assistant: Modify the instructions in the OpenAI node to reflect your agency’s needs. Ensure the document includes FAQs, policies, and travel info. Test the Workflow: Trigger manually ("Test workflow") to create the assistant and upload the file. Send a chat message (e.g., "What are your travel packages?") to test responses. Dependencies Google Drive Account**: To store and retrieve the agency document. OpenAI API Access**: For assistant creation and file uploads.
by David Olusola
Overview A comprehensive educational workflow that demonstrates practical JavaScript usage in n8n's Code node through real-world business scenarios. Perfect for learning data manipulation, transformation, and automation patterns that you can immediately apply to client projects. What This Template Teaches: Data Filtering & Transformation - Filter employees by age, calculate bonuses, format contact information Statistical Analysis - Generate team statistics, averages, role distributions, and KPIs Multi-Format Export - Create CSV files, email lists, and API-ready payloads from raw data n8n Best Practices - Proper JSON handling, return formats, and data flow patterns How It Works: Manual Trigger starts the workflow with sample employee data Set Sample Data provides realistic business data (employees with roles, salaries, ages) Three Code Node Examples process the same data differently: Filter & Transform: Creates adult employee list with calculated bonuses Calculate Stats: Generates comprehensive team analytics and reports Format for Export: Prepares data for external systems (APIs, emails, CSV) Key Learning Points: Access input data using items[0].json.propertyName Return proper n8n format with [{ json: data }] structure Use JSON.parse() for string-to-object conversion Apply JavaScript array methods (filter, map, reduce) for data processing Handle multiple output scenarios and data aggregation Perfect For: n8n beginners learning Code node fundamentals Developers transitioning to n8n automation Client demos showing data processing capabilities Team training and onboarding sessions Foundation for building custom business automation workflows Business Use Cases: Transform this template for lead qualification, customer segmentation, report generation, data enrichment, and API integrations. Each Code node pattern can be adapted for different industries and automation needs.
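Here is a minimal version of the "Filter & Transform" pattern as it would appear in an n8n Code node (the employee field names follow the description above; the 10% bonus rate and email field are illustrative):

```javascript
// Access the sample data set by the previous node
const employees = items[0].json.employees;

const adults = employees
  .filter(e => e.age >= 18)                      // keep adult employees only
  .map(e => ({
    name: e.name,
    role: e.role,
    bonus: Math.round(e.salary * 0.1),           // illustrative bonus calculation
    contact: `${e.name} <${e.email ?? "n/a"}>`,  // formatted contact string
  }));

// Return in the format n8n expects: one item per employee
return adults.map(e => ({ json: e }));
```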
by Deborah
Use n8n to bring data from any API to your AI. This workflow uses the Chat Trigger to provide the chat interface, and the Custom n8n Workflow Tool to call a second workflow that calls the API. The second workflow uses AI functionality to refine the API request based on the user's query. It then makes an API call, and returns the response to the main workflow. This workflow is used in Advanced AI examples | Call an API to fetch data in the documentation. To use this workflow: Load it into your n8n instance. Add your credentials as prompted by the notes. Requires n8n 1.28.0 or above