by Yaron Been
This workflow provides automated access to the Creativeathive Lemaar Door Urban AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

**Overview**

This workflow automatically handles the complete generation process using the Creativeathive Lemaar Door Urban model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

**Key Capabilities**

- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

**Tools Used**

- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Creativeathive/lemaar-door-urban AI model
- **Creativeathive Lemaar Door Urban**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

**How to Install**

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

**Use Cases**

- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

**Connect with Me**

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
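The built-in retry logic mentioned above follows a common pattern that can be sketched as a small wrapper. This is a minimal illustration under assumed settings (attempt count, backoff delays), not the template's actual code:

```javascript
// Minimal sketch of retry-with-backoff error handling. The attempt count and
// delays are illustrative assumptions, not values from the actual template.
async function withRetry(fn, attempts = 3, baseDelayMs = 1000) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn(); // e.g. one Replicate API call
    } catch (err) {
      lastError = err;
      // Exponential backoff: wait 1s, 2s, 4s, ... between attempts.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError; // all attempts failed
}
```

In n8n the same effect is usually achieved with a node's built-in "Retry On Fail" setting; the sketch just shows what that setting does conceptually.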
by Agent Circle
This N8N template demonstrates how to use our tool to collect a list of videos from any YouTube channel - including video URLs, titles, descriptions, thumbnail links, publish dates, and more - all saved cleanly into a connected Google Sheet.

Use cases are many: whether you're a YouTube content strategist tracking competitors, a marketing team building dashboards from video metadata, an automation pro connecting YouTube to downstream workflows, or a researcher or analyst collecting structured video data at scale, this tool gives you what you need!

**How It Works**

1. The workflow begins when you click Execute Workflow or Test Workflow manually in N8N.
2. It reads the list of full channel URLs, custom channel URLs, or channel IDs from the Channel URLs tab in the connected Google Sheet. Only the channels with the Ready status will be processed.
3. A Switch node detects whether the input is a full/custom channel URL or a raw channel ID, and routes it accordingly:
   - If the input is already a channel ID, the tool prepares the data structure before sending it to the YouTube API call in the next step.
   - If the input is a full channel URL or a custom channel URL, the workflow extracts the username, sends an HTTP Request to the YouTube API to retrieve the corresponding Channel ID, and prepares the data structure before continuing.
4. Once a valid Channel ID is set, the tool sends a request to the YouTube API endpoint to retrieve a list of public videos. By default, the number of videos extracted per channel is limited to 10.
5. The API response is checked for success:
   - If successful, the video data is split into individual entries, cleaned, and added to the Videos tab in the connected Google Sheet. The original rows' status in the Channel URLs tab is marked as Finished.
   - If an error occurs, the rows' status in the Channel URLs tab is marked as Error for later review.

**How To Use**

1. Download the workflow package.
2. Import the workflow package into your N8N interface.
3. Duplicate the YouTube - Get Channel Videos Google Sheet template into your Google Sheets account.
4. Set up Google Cloud Console credentials in the following nodes in N8N, ensuring enabled access and suitable rights to the Google Sheets and YouTube services:
   - For Google Sheets access, ensure each node is properly connected to the correct tab in your connected Google Sheet:
     - Node Google Sheets - Get Channel URLs → connected to the Channel URLs tab
     - Node Google Sheets - Update Data → connected to the Videos tab
     - Node Google Sheets - Update Data - Success → connected to the Channel URLs tab
     - Node Google Sheets - Update Data - Error → connected to the Channel URLs tab
   - For YouTube access, set up a GET method to connect to the YouTube API in the following nodes:
     - Node HTTP Request - Get Channel ID
     - Node HTTP Request - Get Channel Videos
5. In your connected Google Sheet, enter the full channel URLs, custom channel URLs, or channel IDs that you want to crawl and set the rows' status to Ready.
6. Run the workflow by clicking Execute Workflow or Test Workflow in N8N.
7. View the results in your connected Google Sheet:
   - Successful fetches will update the original rows' status to Finished, and the videos' information will show up in the Videos tab.
   - If any URL or ID fails, the rows' status will be marked as Error.

**Requirements**

Basic setup in Google Cloud Console (OAuth or API Key method enabled) with access to YouTube and Google Sheets.

**How To Customize**

- By default, the workflow is manually triggered in N8N. However, you can automate the process by adding a Google Sheets trigger that monitors new entries automatically.
- If you want to fetch more video metadata, such as durations or view counts, you can expand the HTTP Request and post-processing nodes to include those.
- The workflow, by default, collects up to 10 videos per channel. If you'd like to fetch more, simply enter your desired video number limit in Column C of the Channel URLs tab in the connected Google Sheet. The tool will use that value when calling the YouTube API.

**Need Help?**

Join our community on different platforms for support, inspiration, and tips from others.

- Website: https://www.agentcircle.ai/
- Etsy: https://www.etsy.com/shop/AgentCircle
- Gumroad: http://agentcircle.gumroad.com/
- Discord Global: https://discord.gg/d8SkCzKwnP
- FB Page Global: https://www.facebook.com/agentcircle/
- FB Group Global: https://www.facebook.com/groups/aiagentcircle/
- X: https://x.com/agent_circle
- YouTube: https://www.youtube.com/@agentcircle
- LinkedIn: https://www.linkedin.com/company/agentcircle
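The Switch-node routing described above can be sketched as a small classifier. The patterns below are illustrative assumptions (raw channel IDs start with `UC`, custom URLs use `@handle` or `/c/name`); the template's exact rules may differ:

```javascript
// Hedged sketch of the Switch-node logic: decide whether an input row is a
// raw channel ID, a full channel URL, or a custom/@handle URL that still
// needs a YouTube API lookup to resolve into a channel ID.
function classifyChannelInput(input) {
  const value = input.trim();
  // Raw channel IDs are "UC" followed by 22 URL-safe characters.
  if (/^UC[\w-]{22}$/.test(value)) {
    return { type: "channelId", channelId: value };
  }
  // Full channel URL: https://www.youtube.com/channel/UC...
  const channelMatch = value.match(/youtube\.com\/channel\/(UC[\w-]{22})/);
  if (channelMatch) {
    return { type: "channelUrl", channelId: channelMatch[1] };
  }
  // Custom URL or handle: https://www.youtube.com/@Name or /c/Name.
  // These need the extra HTTP Request to resolve the channel ID.
  const handleMatch = value.match(/youtube\.com\/(?:@|c\/)([\w.-]+)/);
  if (handleMatch) {
    return { type: "handle", username: handleMatch[1] };
  }
  return { type: "unknown" };
}
```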
by David Roberts
LangChain is a framework for building AI functionality that uses large language models. By leveraging the functionality of LangChain, you can write even more powerful workflows. This workflow shows how you can write LangChain code within n8n, including importing LangChain modules. The workflow itself produces a summary of a YouTube video when given the video's ID. Note that to use this template, you need to be on n8n version 1.19.4 or later.
by Akhil Varma Gadiraju
**Workflow: HubSpot Contact Email Validation with Hunter.io**

**Overall Goal**

This workflow retrieves contacts from HubSpot that have an email address but haven't yet had their email validated by Hunter. It then iterates through each of these contacts, uses Hunter.io to verify their email, updates the contact record in HubSpot with the validation status and date, and finally sends a summary email notification upon completion.

**How it Works (Step-by-Step Breakdown)**

1. Node: "When clicking 'Test workflow'" (Manual Trigger)
   - **Type:** n8n-nodes-base.manualTrigger
   - **Purpose:** Start the workflow manually via the n8n interface.
   - **Output:** Triggers workflow execution.
2. Node: "HubSpot" (HubSpot)
   - **Type:** n8n-nodes-base.hubspot
   - **Purpose:** Fetch contacts from HubSpot.
   - **Configuration:**
     - Authentication: App Token
     - Operation: Search for contacts
     - Return All: True
     - Filter Groups: Contact HAS_PROPERTY email; Contact NOT_HAS_PROPERTY hunter_email_validation_status
   - **Output:** List of contact objects.
3. Node: "Loop Over Items" (SplitInBatches)
   - **Type:** n8n-nodes-base.splitInBatches
   - **Purpose:** Process each contact one-by-one.
   - **Configuration:** Options > Reset: false
   - **Output:** Output 1 to "Hunter"; Output 2 to "Send Email"
4. Node: "Hunter" (Inside the loop)
   - **Type:** n8n-nodes-base.hunter
   - **Purpose:** Verify the email with Hunter.io
   - **Configuration:**
     - Operation: Email Verifier
     - Email: {{ $json.properties.email }}
5. Node: "Add Hunter Details (Contact)" (HTTP Request - Inside the loop)
   - **Type:** n8n-nodes-base.httpRequest
   - **Purpose:** Update the HubSpot contact.
   - **Configuration:**
     - Method: PATCH
     - URL: https://api.hubapi.com/crm/v3/objects/contacts/{{ $('Loop Over Items').item.json.id }}
     - Headers: Content-Type: application/json
     - Body (JSON): { "properties": { "hunter_email_validation_status": "{{ $json.status }}", "hunter_verification_date": "{{ $now.format('yyyy-MM-dd') }}" } }
6. Node: "Wait" (Inside the loop)
   - **Type:** n8n-nodes-base.wait
   - **Purpose:** Avoid API rate limits.
   - **Configuration:** Wait for 1 second.
7. Node: "Replace Me" (NoOp - Inside the loop)
   - **Type:** n8n-nodes-base.noOp
   - **Purpose:** Junction node to complete the loop.
8. Node: "Send Email" (After the loop completes)
   - **Type:** n8n-nodes-base.emailSend
   - **Purpose:** Send a summary notification.
   - **Configuration:**
     - From Email: test@gmail.com
     - To Email: akhilgadiraju@gmail.com
     - Subject: "Email Verification Completed for Your HubSpot Contacts"
     - HTML: Formatted confirmation message

**Sticky Notes**

- "HubSpot": Create custom properties (hunter_email_validation_status, hunter_verification_date).
- "Add Hunter Details": Ensure field names match HubSpot properties.
- "Wait": Prevent API rate limits.

**How to Customize It**

- Trigger: Replace the Manual Trigger with a Schedule Trigger (Cron) for automation. Optionally use a HubSpot Trigger for new contact events.
- HubSpot Node: Create matching custom properties. Adjust filters and returned properties as needed.
- Hunter Node: Minimal customization needed.
- HTTP Request Node: Update JSON property names if renaming in HubSpot. Customize the date format as needed.
- Wait Node: Adjust the wait time to balance speed and API safety.
- Email Node: Customize email addresses, subject, and body. Add a dynamic contact count with a Set or Function node.
- Error Handling: Add Error Trigger nodes. Use If nodes inside the loop to act on certain statuses.

**Use Cases**

- Clean your email list.
- Enrich CRM data.
- Prep verified lists for campaigns.
- Automate contact hygiene on a schedule.

**Required Credentials**

- HubSpot App Token: used by the HubSpot node and HTTP Request node. Create a Private App in HubSpot with the required scopes.
- Hunter API: used by the Hunter node.
- SMTP: used by the Email Send node. Configure host, port, username, and password.

Made with ❤️ using n8n by Akhil.
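The PATCH body sent by the HTTP Request node can be sketched as a plain function. The two property names must match the custom HubSpot properties mentioned in the sticky notes, and the date format mirrors the workflow's `$now.format('yyyy-MM-dd')` expression:

```javascript
// Sketch of the JSON body the "Add Hunter Details (Contact)" node builds.
// Both property names must exist as custom properties in HubSpot.
function buildHubSpotPatch(verificationStatus, date = new Date()) {
  return {
    properties: {
      hunter_email_validation_status: verificationStatus,
      // Same 'yyyy-MM-dd' shape as the workflow's $now.format(...) expression.
      hunter_verification_date: date.toISOString().slice(0, 10),
    },
  };
}
```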
by Akhil Varma Gadiraju
**AI-Powered GitHub Commit Reviewer**

**Overview**

- Workflow Name: AI-Powered GitHub Commit Reviewer
- Author: Akhil
- Purpose: This n8n workflow triggers on a GitHub push event, fetches commit diffs, formats them into HTML, runs an AI-powered code review using a Groq LLM, and sends a detailed review via email.

**How It Works (Step-by-Step)**

1. GitHub Trigger Node
   - **Type:** n8n-nodes-base.githubTrigger
   - **Purpose:** Initiates the workflow on GitHub push events.
   - **Repo:** akhilv77/relevance
   - **Output:** JSON with commit and repo details.
2. Parser Node
   - **Type:** n8n-nodes-base.set
   - **Purpose:** Extracts key info (repo ID, name, commit SHA, file changes).
3. HTTP Request Node
   - **Type:** n8n-nodes-base.httpRequest
   - **Purpose:** Fetches commit diff details using the GitHub API.
   - **Auth:** GitHub OAuth2 API.
4. Code (HTML Formatter) Node
   - **Type:** n8n-nodes-base.code
   - **Purpose:** Formats commit info and diffs into styled HTML.
   - **Output:** HTML report of commit details.
5. Groq Chat Model Node
   - **Type:** @n8n/n8n-nodes-langchain.lmChatGroq
   - **Purpose:** Provides the AI model (llama-3.1-8b-instant).
6. Simple Memory Node
   - **Type:** @n8n/n8n-nodes-langchain.memoryBufferWindow
   - **Purpose:** Maintains memory context for the AI agent.
7. AI Agent Node
   - **Type:** @n8n/n8n-nodes-langchain.agent
   - **Purpose:** Executes the AI-based code review.
   - **Prompt:** Reviews for bugs, style, grammar, and security. Outputs styled HTML.
8. Output Parser Node
   - **Type:** n8n-nodes-base.code
   - **Purpose:** Combines the commit HTML with the AI review into one HTML block.
9. Gmail Node
   - **Type:** n8n-nodes-base.gmail
   - **Purpose:** Sends the review report via email.
   - **Recipient:** akhilgadiraju@gmail.com
10. End Workflow Node
    - **Type:** n8n-nodes-base.noOp
    - **Purpose:** Marks the end.

**Customization Tips**

- **GitHub Trigger:** Change the repo/owner or trigger events.
- **HTTP Request:** Modify the endpoint to get specific data.
- **AI Agent:** Update the prompt to focus on different review aspects.
- **Groq Model:** Swap for other supported LLMs if needed.
- **Memory:** Use a dynamic session key for per-commit reviews.
- **Email:** Change the recipient or email styling.

**Error Handling**

Use Error Trigger nodes to handle failures in:

- GitHub API requests
- LLM generation
- Email delivery

**Use Cases**

- Instant AI-powered feedback on code pushes.
- Pre-human review suggestions.
- Security and standards enforcement.
- Developer onboarding assistance.

**Required Credentials**

| Credential | Used By | Notes |
|-----------|---------|-------|
| GitHub API (ID PSygiwMjdjFDImYb) | GitHub Trigger | PAT with repo and admin:repo_hook |
| GitHub OAuth2 API | HTTP Request | OAuth2 token with repo scope |
| Groq - Akhil (ID HJl5cdJzjhf727zW) | Groq Chat Model | API Key from GroqCloud |
| Gmail OAuth2 - Akhil (ID wqFUFuFpF5eRAp4E) | Gmail | Gmail OAuth2 for sending email |

**Final Note**

Made with ❤️ using n8n by Akhil.
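The HTML Formatter Code node's job can be sketched roughly as follows. The field names follow the GitHub "get a commit" API response (`files[].filename`, `.status`, `.additions`, `.deletions`); the template's actual styling is richer than this minimal version:

```javascript
// Minimal sketch of formatting a commit's changed files into an HTML table.
function escapeHtml(text) {
  return text.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function filesToHtml(files) {
  const rows = files
    .map(
      (f) =>
        `<tr><td>${escapeHtml(f.filename)}</td><td>${f.status}</td>` +
        `<td>+${f.additions}/-${f.deletions}</td></tr>`
    )
    .join("");
  return `<table><tr><th>File</th><th>Status</th><th>Changes</th></tr>${rows}</table>`;
}
```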
by n8n Team
This n8n workflow, which runs every Monday at 5:00 AM, initiates a comprehensive process to monitor and analyze network security by scrutinizing IP addresses and their associated ports. It begins by fetching a list of watched IP addresses and expected ports through an HTTP request. Each IP address is then processed in a sequential loop.

For every IP, the workflow sends a GET request to Shodan, a renowned search engine for internet-connected devices, to gather detailed information about the IP. It then extracts the data field from Shodan's response, converting it into an array. This array contains information on all ports Shodan has data for regarding the IP.

A filter node compares the ports returned from Shodan with the expected list obtained initially. If a port doesn't match the expected list, it is retained for further processing; otherwise, it's filtered out. For each such unexpected port, the workflow assembles data including the IP, hostnames from Shodan, the unexpected port number, the service description, and detailed data from Shodan such as the HTTP status code, date, time, and headers. This collected data is then formatted into an HTML table, which is subsequently converted into Markdown format.

Finally, the workflow generates an alert in TheHive, a popular security incident response platform. The alert contains:

- A title indicating unexpected ports for the specific IP
- A description comprising the Markdown table with Shodan data
- Medium severity and the current date and time
- Tags, with Traffic Light Protocol (TLP) set to Amber
- A new status and the type 'Unexpected open port'
- The source set to n8n, with a unique source reference combining the IP with the current Unix time
- The follow and JSON parameters options enabled

This comprehensive workflow thus aids in the proactive monitoring and management of network security.
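The port-comparison step at the heart of this workflow reduces to a simple filter. A minimal sketch, assuming each entry of Shodan's `data` array carries a `port` field:

```javascript
// Keep only the ports Shodan reports that are NOT on the expected list;
// everything on the expected list is filtered out.
function unexpectedPorts(shodanData, expectedPorts) {
  return shodanData.filter((entry) => !expectedPorts.includes(entry.port));
}
```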
by Rajeet Nair
**Overview**

This workflow automates customer support ticket processing using AI-powered analysis, classification, and intelligent routing. It processes incoming tickets from email or webhook, translates messages when needed, analyzes sentiment and urgency, and routes tickets to auto-reply or escalation flows. The system also updates CRM platforms and logs observability metrics for monitoring. This enables faster response times, improved customer experience, and scalable support operations.

**How It Works**

- Input Sources: receives tickets via an IMAP Email Trigger or a webhook endpoint.
- Workflow Configuration: defines the CRM/Helpdesk API endpoint, escalation webhook URL, and observability logging endpoint.
- Data Cleaning & Normalization: extracts and cleans HTML content, then normalizes the ticket data (ticket ID, user email, message content, timestamp, source channel).
- Language Detection & Translation: detects the original language, translates the message into English if needed, and returns a confidence score.
- AI Support Intelligence: classifies each ticket by sentiment (positive/neutral/negative), urgency (low → critical), and category (billing, bug, technical, etc.), then generates a short summary, a churn risk score (0–1), and a recommended action path.
- Decision Routing: routes tickets based on the AI output — Auto Reply generates a response; Escalate / Critical sends the ticket to the team.
- Auto Reply Flow:
  - AI Reply Generation drafts a professional response using the ticket context, keeping the tone empathetic and actionable.
  - CRM/Helpdesk Update sends structured ticket data to the CRM: priority, category, sentiment, churn risk, and draft reply.
- Escalation Flow: sends high-priority tickets to the support team, including full ticket context and analysis.
- Observability & Monitoring: logs response time, escalation status, category & urgency, and sentiment & churn risk, and sends the data to an observability endpoint (optional).

**Setup Instructions**

1. Email / Webhook Setup: configure IMAP credentials OR a webhook endpoint (support-ticket).
2. AI Model Setup: add Anthropic or OpenAI credentials and connect models to the translation agent, intelligence agent, and reply generator.
3. CRM / Helpdesk Integration: set the API endpoint URL and configure headers and authentication.
4. Escalation Setup: add a webhook URL for team notifications (Slack, internal API, etc.).
5. Observability (Optional): configure a logging endpoint for metrics tracking.
6. Customize Prompts: adjust the system messages for translation, classification, and reply generation.

**Use Cases**

- AI-powered customer support automation
- SaaS support ticket triaging
- Multi-language support systems
- Helpdesk automation with CRM integration
- Customer churn risk detection workflows

**Requirements**

- Anthropic or OpenAI API key
- Email (IMAP) or webhook source
- CRM/helpdesk system API
- Optional observability/logging service
- n8n instance

**Key Features**

- Multi-channel ticket ingestion (email + webhook)
- Automatic language detection and translation
- AI-based sentiment, urgency, and category classification
- Intelligent routing (auto-reply vs escalation)
- AI-generated support replies
- CRM integration for structured ticket updates
- Observability and performance tracking

**Summary**

A powerful AI-driven support automation workflow that processes, analyzes, and routes customer tickets intelligently. It reduces manual workload, improves response speed, and enables scalable, data-driven support operations.
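The decision-routing step can be sketched as a small function over the AI classification output. The thresholds below are illustrative assumptions, not values from the template:

```javascript
// Hedged sketch of routing: escalate critical/high urgency, or negative
// tickets with high churn risk; everything else goes to the auto-reply flow.
function routeTicket({ sentiment, urgency, churnRisk }) {
  const mustEscalate =
    urgency === "critical" ||
    urgency === "high" ||
    (sentiment === "negative" && churnRisk >= 0.7);
  return mustEscalate ? "escalate" : "auto_reply";
}
```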
by Rahul Joshi
**Description**

Automate your weekly social media analytics with this end-to-end AI reporting workflow. 📊🤖 This system collects real-time Twitter (X) and Facebook metrics, merges and validates data, formats it with JavaScript, generates an AI-powered HTML report via GPT-4o, saves structured insights in Notion, and shares visual summaries via Slack and Gmail. Perfect for marketing teams tracking engagement trends and performance growth. 🚀💬

**What This Template Does**

1️⃣ Starts manually or on-demand to fetch the latest analytics data. 🕹️
2️⃣ Retrieves follower, engagement, and post metrics from both X (Twitter) and Facebook APIs. 🐦📘
3️⃣ Merges and validates responses to ensure clean, complete datasets. 🔍
4️⃣ Runs custom JavaScript to normalize and format metrics into a unified JSON structure. 🧩
5️⃣ Uses Azure OpenAI GPT-4o to generate a visually rich HTML performance report with tables, emojis, and insights. 🧠📈
6️⃣ Saves the processed analytics into a Notion "Growth Chart" database for centralized trend tracking. 🗂️
7️⃣ Sends an email summary report to the marketing team, complete with formatted HTML insights. 📧
8️⃣ Posts a concise Slack update comparing platform performance and engagement deltas. 💬
9️⃣ Logs any validation or API errors automatically into Google Sheets for debugging and traceability. 🧾

**Key Benefits**

✅ Centralizes all social metrics into a single automated flow.
✅ Delivers AI-generated HTML reports ready for email and dashboard embedding.
✅ Reduces manual tracking with Notion and Slack syncs.
✅ Ensures data reliability with built-in validation and error logging.
✅ Gives instant, visual insights for weekly marketing reviews.

**Features**

- Multi-platform analytics integration (Twitter X + Facebook Graph API).
- JavaScript node for dynamic data normalization.
- Azure OpenAI GPT-4o for HTML report generation.
- Notion database update for long-term trend storage.
- Slack and Gmail nodes for instant sharing and communication.
- Automated error capture to Google Sheets for workflow reliability.
- Visual, emoji-enhanced reporting with HTML formatting and insights.

**Requirements**

- Twitter OAuth2 API credentials for access to public metrics.
- Facebook Graph API access token for page insights.
- Azure OpenAI API key for GPT-4o report generation.
- Notion API credentials with write access to the "Growth Chart" database.
- Gmail OAuth2 credentials for report dispatch.
- Slack Bot Token with chat:write permission for posting analytics summaries.
- Google Sheets OAuth2 credentials for maintaining the error log.

**Environment Variables**

- TWITTER_API_KEY
- FACEBOOK_ACCESS_TOKEN
- AZURE_OPENAI_API_KEY
- NOTION_GROWTH_DB_ID
- GMAIL_REPORT_RECIPIENTS
- SLACK_REPORT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID

**Target Audience**

📈 Marketing and growth teams tracking cross-platform performance
💡 Social media managers needing automated reporting
🧠 Data analysts compiling weekly engagement metrics
💬 Digital agencies managing multiple brand accounts
🧾 Operations and analytics teams monitoring performance KPIs

**Step-by-Step Setup Instructions**

1️⃣ Connect all API credentials (Twitter, Facebook, Notion, Gmail, Slack, and Sheets).
2️⃣ Paste your Facebook Page ID and Twitter handle into the respective API nodes.
3️⃣ Verify your Azure OpenAI GPT-4o connection and the prompt text for HTML report generation.
4️⃣ Update your Notion database structure to match the "Growth Chart" property names.
5️⃣ Add your marketing email in the Gmail node and test delivery.
6️⃣ Specify the Slack channel ID where summaries will be posted.
7️⃣ Optionally, connect a Google Sheet tab for error tracking (error_id, message).
8️⃣ Execute the workflow once manually to validate the data flow.
9️⃣ Activate or schedule it for weekly or daily analytics automation. ✅
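The JavaScript normalization node (step 4️⃣) might look roughly like this. The input field names (`followers_count`, `fan_count`) are assumptions about the Twitter and Facebook API responses, not taken from the template:

```javascript
// Hedged sketch: merge Twitter and Facebook metrics into one unified record
// that downstream nodes (GPT-4o report, Notion, Slack) can consume.
function normalizeMetrics(twitter, facebook) {
  return {
    generatedAt: new Date().toISOString(),
    platforms: [
      {
        platform: "twitter",
        followers: twitter.followers_count ?? 0,
        engagements: twitter.engagements ?? 0,
      },
      {
        platform: "facebook",
        followers: facebook.fan_count ?? 0,
        engagements: facebook.engagements ?? 0,
      },
    ],
  };
}
```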
by Maximiliano Rojas-Delgado
**Turn Your Ideas into Videos—Right from Google Sheets!**

This workflow helps you make cool 5-second videos using Fal.AI and Kling 2.1, just by typing your idea into a Google Sheet. You can even choose if you want your video to have sound or not. It's super easy—no tech skills needed! And the best part? It's 4x cheaper than the Veo3 model, with similar quality!

**Why use this?**

- Just type your idea in a sheet—no fancy tools or uploads.
- Get a video link back in the same sheet.
- Works with or without sound—your choice!

**How does it work?**

1. You write your idea, pick the video shape, and say if you want sound (true or false) in the Google Sheet.
2. n8n reads your idea and asks Fal.AI to make your video.
3. When your video is ready, the link shows up in your sheet.

**What do you need?**

- A Google account and Google Sheets connected with a service account (check this link for reference)
- A copy of the following Google Spreadsheet: Spreadsheet to copy
- An OpenAI API key
- A Fal.AI account with some money in it

That's it! Just add your ideas and let the workflow make the videos for you. Have fun creating! If you have any questions, just contact me at max@nervoai.com
by Yulia
**Free template for voice & text messages with short-term memory**

This n8n workflow template is a blueprint for an AI Telegram bot that processes both voice and text messages. Ready to use with minimal setup. The bot remembers the last several messages (10 by default), understands commands, and provides responses in HTML. You can easily swap GPT-4 and Whisper for other language and speech-to-text models to suit your needs.

**Core Features**

- Text: send or forward messages
- Voice: transcription via Whisper
- Extend this template by adding LangChain tools.

**Requirements**

- Telegram Bot API
- OpenAI API (for GPT-4 and Whisper)

💡 New to Telegram bots? Check our step-by-step guide on creating your first bot and setting up OpenAI access.

**Use Cases**

- Personal AI assistant
- Customer support automation
- Knowledge base interface
- Integration hub for services that you use:
  - Connect to any API via HTTP Request Tool
  - Trigger other n8n workflows with Workflow Tool
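The short-term memory works as a sliding window over the conversation. A minimal sketch of the idea, not the template's actual implementation:

```javascript
// Append a message and keep only the last N (10 by default), so the model
// always sees a bounded, recent slice of the conversation.
function remember(history, message, windowSize = 10) {
  const next = [...history, message];
  return next.slice(-windowSize);
}
```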
by Samir Saci
**Context**

Hey! I'm Samir, a Supply Chain Data Scientist from Paris who spent six years in China studying and working while struggling to learn Mandarin. I know the challenges of mastering a complex language like Chinese, and my greatest support was flash cards. Therefore, I designed this workflow to support fellow Mandarin learners by automating flashcard creation using n8n, so they can focus more on learning and less on manual data entry.

📬 For business inquiries, you can reach me here.

**Who is this template for?**

This workflow template is designed for language learners and educators who want to automate the creation of flashcards for Mandarin (or any other language) using the Google Translate API, an AI agent for phonetic transcription and generating an illustrative sentence, and a free image retrieval API.

**Why?**

If you use the open-source application Anki, this workflow will help you automatically generate personalized study materials.

**How?**

Let us imagine you want to learn how to say the word Contract in Mandarin. The workflow will automatically:

1. Translate the word into Simplified Mandarin (Mandarin: 合同).
2. Provide the phonetic transcription (Pinyin: Hétóng).
3. Generate an example sentence (Example: 我们签订了一份合同.)
4. Download an illustrative picture (for example, a picture of a contract signature).

All these fields are automatically recorded in a Google Sheet, making it easy to import into Anki and generate flashcards instantly.

**What do I need to start?**

This workflow can be used with the free tier plans of the services used. It does not require any advanced programming skills.

Prerequisites:

- A Google Drive account with a folder including a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Google Translate API activated with OAuth2 credentials
- A free API key from pexels.com
- A Google Sheet with the required columns

**Next**

Follow the sticky notes to set up the parameters inside each node and get ready to pump up your learning skills. I have detailed the steps in a short tutorial 👇

🎥 Check My Tutorial

**Notes**

- This workflow can be used for any language. In the AI Agent prompt, you just need to replace the word pinyin with phonetic transcription.
- You can adapt the trigger to operate the workflow in the way you want. These operations can be performed by batch or triggered by Telegram, email, or webhook.
- If you want to learn more about how I used Anki flash cards to learn Mandarin: 🈷️ Blog Article about Anki Flash Cards
- This workflow has been created with N8N 1.82.1

Submitted: March 17th, 2025
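Anki can import tab-separated text, one note per line and one field per column, which is how the Google Sheet rows map onto flashcards. A minimal sketch with assumed field names (adjust to match your sheet's columns):

```javascript
// Hypothetical field names (word, translation, pinyin, example); the actual
// sheet columns may differ. One note per line, fields separated by tabs.
function toAnkiTsv(rows) {
  return rows
    .map((r) => [r.word, r.translation, r.pinyin, r.example].join("\t"))
    .join("\n");
}
```

For the Contract example above, the row would become one tab-separated line containing Contract, 合同, Hétóng, and the example sentence.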
by Harshil Agrawal
This workflow appends, looks up, updates, and reads data from a Google Sheets spreadsheet.

- Set node: The Set node is used to generate the data that we want to add to Google Sheets. Depending on your use case, you might have data coming from a different source. For example, you might be fetching data from a webhook call. Add the node that will fetch the data that you want to add to the Google Sheet. You can then use the Set node to set the data that you want to add to Google Sheets.
- Google Sheets node: This node will add the data from the Set node in a new row to the Google Sheet. You will have to enter the Spreadsheet ID and the Range to specify which sheet you want to add the data to.
- Google Sheets1 node: This node looks for a specific value in the Google Sheet and returns all the rows that contain the value. In this example, we are looking for the value Berlin in our Google Sheet. If you want to look for a different value, enter that value in the Lookup Value field, and specify the column in the Lookup Column field.
- Set1 node: The Set node raises the value of the rent by $100 for the houses in Berlin. We pass this new data to the next nodes in the workflow.
- Google Sheets2 node: This node will update the rent for the houses in Berlin with the new rent set in the previous node. We are mapping the rows with their ID. Depending on your use case, you might want to map the values with a different column. To set this, enter the column name in the Key field.
- Google Sheets3 node: This node returns the information from the Google Sheet. You can specify the columns that should get returned in the Range field. Currently, the node fetches the data for columns A to D. To fetch the data only for columns A to C, set the range to A:C.

This workflow can be broken down into different workflows, each with its own use case. For example, we can have a workflow that appends new data to a Google Sheet, and another workflow that looks up a certain value and returns it.

You can learn to build this workflow on the documentation page of the Google Sheets node.
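The Set1 step's rent adjustment amounts to mapping over the rows the lookup returned. A minimal sketch, with illustrative column names:

```javascript
// Raise the rent by a fixed amount for each matched row (houses in Berlin).
// Sheet values often arrive as strings, hence the Number(...) conversion.
function raiseRent(rows, amount = 100) {
  return rows.map((row) => ({ ...row, Rent: Number(row.Rent) + amount }));
}
```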