by vinci-king-01
How it works
This workflow automatically scrapes real estate listings from Zillow and sends them to a Telegram channel.

Key Steps
- Scheduled Trigger – Runs the workflow at specified intervals to find new listings.
- AI-Powered Scraping – Uses ScrapeGraphAI to extract property information from Zillow.
- Data Formatting – Processes and structures the scraped data for Telegram messages (a sketch follows below).
- Telegram Integration – Sends formatted listing details to your specified Telegram channel.

Set up steps
Setup time: 5-10 minutes
- Configure ScrapeGraphAI credentials – Add your ScrapeGraphAI API key.
- Set up Telegram connection – Connect your Telegram account and specify the target channel.
- Customize the Zillow URL – Update the URL to target specific locations or search criteria.
- Adjust schedule – Modify the trigger timing based on how frequently you want to check for new listings.
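The Data Formatting step is where scraped fields become a readable Telegram message. A minimal sketch of what that Code node might look like, assuming ScrapeGraphAI returns fields named address, price, beds, baths, and url (these field names are hypothetical and depend on your extraction prompt):

```javascript
// n8n Code node: turn each scraped listing into Telegram-ready text.
// Field names (address, price, beds, baths, url) are assumptions;
// adjust them to match what your ScrapeGraphAI prompt actually returns.
return $input.all().map(item => {
  const l = item.json;
  const text = [
    `🏠 ${l.address}`,
    `💰 ${l.price}`,
    `🛏 ${l.beds} bd · 🛁 ${l.baths} ba`,
    `🔗 ${l.url}`,
  ].join('\n');
  return { json: { text } };
});
```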
by Alex Kim
Automate Video Creation with Luma AI Dream Machine and Airtable (Part 2)

Description
This is the second part of the Luma AI Dream Machine automation. It captures the webhook response from Luma AI after video generation is complete, processes the data, and automatically updates Airtable with the video and thumbnail URLs. This completes the end-to-end automation for video creation and tracking.

👉 Airtable Base Template
👉 Tutorial Video

Setup
1. Luma AI Setup
- Ensure you’ve created an account with Luma AI and generated an API key.
- Confirm that the API key has permission to manage video requests.

2. Airtable Setup
Make sure your Airtable base includes the following fields (set up in Part 1). Use the Airtable Base Template linked above to simplify setup.
- **Generation ID** – To match incoming webhook data.
- **Status** – Workflow status (e.g., "Done").
- **Video URL** – Stores the generated video URL.
- **Thumbnail URL** – Stores the thumbnail URL.

3. n8n Setup
- Ensure that the n8n workflow from Part 1 is set up and configured.
- Import this workflow and connect it to the webhook callback from Luma AI.

How It Works
1. Webhook Trigger
The Webhook node listens for a POST response from Luma AI once video generation is finished. The response includes:
- Video URL – Direct link to the video.
- Thumbnail URL – Link to the video thumbnail.
- Generation ID – Used to match the record in Airtable.

2. Process Webhook Data
The Set node extracts the video data from the webhook response (a sketch follows below). The If node checks if the video URL is valid before proceeding.

3. Store in Airtable
The Airtable node updates the record with:
- Video URL – Direct link to the video.
- Thumbnail URL – Link to the video thumbnail.
- Status – Marked as "Done."
It uses the Generation ID to match and update the correct record.

Why This Workflow is Useful
✅ Automates the completion step for video creation
✅ Ensures accurate record-keeping by matching generation IDs
✅ Simplifies the process of managing and organizing video content
✅ Reduces manual effort by automating the update process

Next Steps
**Future Enhancements** – Adding more complex post-processing, video trimming, and multi-platform publishing.
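If you prefer a Code node over a Set node for step 2, a minimal sketch of extracting the fields might look like the following. The payload paths (body.id, body.assets.video, body.assets.image) are assumptions based on Luma's generation object; inspect a real callback from your account before relying on them:

```javascript
// n8n Code node (run once per item): pull the fields this workflow needs
// out of the Luma AI callback. The property paths below are assumptions;
// check an actual webhook body and adjust them accordingly.
const body = $input.item.json.body ?? $input.item.json;

return {
  json: {
    generationId: body.id,
    videoUrl: body.assets?.video ?? null,      // If node checks this is set
    thumbnailUrl: body.assets?.image ?? null,
  },
};
```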
by Open Paws
Who’s it for 🎯
This workflow is ideal for outreach specialists, fundraisers, campaigners, and professionals who want to build authentic connections by researching prospects deeply and strategically. It helps users understand prospects’ backgrounds, interests, and mutual connections to craft effective outreach.

How it works / What it does ⚙️
Using the Multi-tool Research Agent subworkflow, it analyzes both the prospector’s and prospect’s profiles, social media, and online presence. The workflow verifies identities, uncovers key connection points, and generates a comprehensive HTML report with actionable insights, conversation starters, and suggested engagement tactics.

How to set up 🛠️
1. Import this workflow and the Multi-tool Research Agent subworkflow.
2. Configure required API credentials.
3. Provide inputs: prospector and prospect names, social media URLs, and outreach goal.
4. Test the workflow to ensure accurate research and report generation.

Requirements 📋
- n8n instance with internet access
- Valid API keys
- Multi-tool Research Agent subworkflow installed and linked
- Optional email node for sending reports directly

How to customize 🔧
- Update input parameters to suit your outreach use case.
- Modify research prompts in the subworkflow for tone or focus.
- Customize the HTML report design for branding or format preferences.
- Attach an email node to send reports automatically or route output as needed.

Use this workflow to power personalized, strategic outreach with data-driven insights.
by Yaron Been
Automated system for monitoring and analyzing competitor activities, funding rounds, and market movements using CrunchBase data.

🚀 What It Does
- Tracks competitor funding rounds
- Monitors leadership changes
- Analyzes investment patterns
- Identifies new market entries
- Tracks product launches

🎯 Perfect For
- Startup founders
- Business strategists
- Market analysts
- Investment professionals
- Corporate development

⚙️ Key Benefits
✅ Competitive intelligence
✅ Early warning system
✅ Market trend analysis
✅ Strategic insights
✅ Time-saving automation

🔧 What You Need
- CrunchBase API access
- n8n instance
- Google Sheets (for data storage)
- Notification preferences

📊 Tracking Metrics
- Funding amounts and rounds
- Investor networks
- Hiring trends
- Market expansion
- Product updates

🛠️ Setup & Support
Quick Setup: Start tracking in 20 minutes with our step-by-step guide
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help

Gain a competitive edge with automated tracking and analysis of your competitors' activities and strategies.
by AlexAutomates
Auto-Categorize Outlook Emails with AI in n8n

How It Works
1. Trigger: The workflow starts with the Microsoft Outlook Trigger node, polling your inbox every minute for new emails.
2. Extract & Clean Email Content: The email’s key fields (from, subject, isRead, body) are extracted. The body is converted from HTML to Markdown, then sanitized to plain text for reliable AI processing.

Node Setup Details
Microsoft Outlook Trigger
- Resource: Message
- Operation: Trigger on new email
- Fields to Output: from, subject, isRead (optional), body
- Folders to Include: (set to your Inbox or specific folder IDs)

Markdown Node
- Input: {{ $json["body"] }} (HTML email body)
- Output Key: Email Body Markdown
- Purpose: Converts HTML to Markdown for easier downstream processing.

Sanitize Node (Code Node)
- Input: Email Body Markdown from the previous node
- Purpose: Cleans up Markdown; strips images, links, HTML tags, and table formatting; and truncates to 4000 characters.
- Sample JS Code:

```javascript
// Get the markdown content from the previous node
const markdownContent = $input.item.json["Email Body Markdown"];
```

Setup AI tools
- The Move Message and Get Folders Outlook tools are required; Get Contacts is optional. Set each field in the tools to "defined automatically by the model" and describe each field so the model understands how to use it.
- OpenRouter or other LLM model tool: you can use any client for this, but make sure to use a model that does well with tool calls (Claude, GPT-4.1, Gemini 2.5 Pro, etc.).

Best Practices & Notes
- **AI Prompt Engineering:** The AI is instructed to be conservative: never move emails from real people or saved contacts, and always explain its reasoning if it doesn’t move a message.
- This automation only works for NEW incoming messages.
- **Inbox Zero:** This system is designed to help you achieve and maintain Inbox Zero by keeping only actionable items in your main inbox.
- **Customization:** You can adjust the folder logic, add more categories, or tweak the AI prompt for your specific needs.
- **Privacy:** All processing happens within your n8n instance; no email data is stored outside your environment except for the AI call (which only receives sanitized, minimal content).
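The sample above only reads the input; the stripping and truncation the node description mentions still have to be implemented. A minimal sketch of what the full Sanitize node might look like, assuming the same input key (the regexes are illustrative, not exhaustive):

```javascript
// n8n Code node (run once per item): sanitize Markdown for the AI call.
const markdownContent = $input.item.json["Email Body Markdown"] || "";

const sanitized = markdownContent
  .replace(/!\[.*?\]\(.*?\)/g, "")      // strip images
  .replace(/\[(.*?)\]\(.*?\)/g, "$1")   // strip links, keep the link text
  .replace(/<[^>]+>/g, "")              // strip leftover HTML tags
  .replace(/^\s*\|.*\|\s*$/gm, "")      // drop table rows
  .replace(/\n{3,}/g, "\n\n")           // collapse runs of blank lines
  .trim()
  .slice(0, 4000);                      // truncate to 4000 characters

return { json: { ...$input.item.json, sanitizedBody: sanitized } };
```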
by Mirajul Mohin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform your video uploads into AI-powered summaries with key topic extraction and instant team notifications.

What this workflow does
- Monitors Google Drive for new video uploads
- Downloads and processes videos using VLM Run AI
- Generates intelligent summaries with key topics extracted
- Posts results to Slack for immediate team access

Setup
Prerequisites: Google Drive account, VLM Run API credentials, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

Quick Setup:
1. Configure Google Drive OAuth2 and create a video upload folder
2. Add VLM Run API credentials
3. Set up Slack integration for notifications
4. Update folder/channel IDs in workflow nodes
5. Test and activate

Perfect for
- Meeting recordings and training videos
- Webinar summaries and educational content
- Content analysis and team collaboration
- Any video content requiring quick insights

Key Benefits
- **Asynchronous processing** handles large files without timeouts
- **Multi-format support** for MP4, AVI, MOV, WebM, MKV
- **Instant team updates** via Slack notifications
- **Saves hours** of manual video review time

How to customize
Extend by adding:
- Video categorization and tagging
- Integration with project management tools
- Email notifications alongside Slack
- Searchable video databases with summaries

This workflow transforms lengthy videos into actionable insights, making your content instantly accessible and shareable with your team.
by Rosh Ragel
This workflow processes emails received in Gmail and adds the sender's name and email address to a MySQL database.

Use Cases:
- A sales or marketing agency can use this to automatically save client contact info to a database to build a list of leads
- Companies can use this to automatically save contacts to a database in case of Gmail data loss or losing access to their Gmail account
- Companies can build mailing lists to automatically send promotions to all of the clients who have contacted them in a given time period

Before using, you need to have:
- Gmail credential
- MySQL database credential
- A table in the MySQL database to store your contacts
  - The table should have a "name" column, which allows NULL values
  - The table should have an "email" column, which should be UNIQUE

How it works:
- The Gmail Trigger will listen for a new email every minute
- For each email, the Code node will extract the name and email address of the sender. If there is no name, it will return null (see the sketch below)
- The MySQL node will insert the new contact into a table in your database
- If the contact email already exists in your database, the MySQL node will update the contact name

How to use:
- Please set up the MySQL node by selecting the correct table to store contacts in
- Please choose your "email" column to match on
- Please choose your "name" column to store names

Customizing this Workflow:
You can customize this workflow to save more data to MySQL. In the MySQL node, click "Add Value", and choose one of the fields from the Gmail node to save in your database column. You can try saving the following items:
- Subject line
- MessageID
- ThreadID
- Snippet
- Recipient Info
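A minimal sketch of the extraction Code node described above. It assumes the Gmail Trigger exposes the sender in a From header shaped like "Jane Doe <jane@example.com>"; the exact field path depends on your trigger settings, so verify it against your own trigger output:

```javascript
// n8n Code node (run once per item): split "Jane Doe <jane@example.com>"
// into name and email. The input path ($json.from.value) is an assumption;
// check the actual Gmail Trigger output on your instance.
const from = $input.item.json.from?.value?.[0] ?? {};

return {
  json: {
    // Gmail omits the display name for some senders; return null in that
    // case so the NULL-able "name" column accepts the row.
    name: from.name && from.name.trim() !== "" ? from.name.trim() : null,
    email: (from.address || "").toLowerCase(),
  },
};
```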
by Satish
This n8n template demonstrates automating an appointment letter creation process using a template, with HR approval before emailing the appointment letter to the candidate.

How it works
1. Create an appointment letter template, e.g. "Appointment Letter.doc", on Google Drive.
2. Form Submission node – Create a form trigger with the required fields that need to be captured as part of the appointment letter, e.g. Candidate Name, Position Offered, Salary, Date of Joining, Candidate Email, etc.
3. Google Drive Copy node – Once the form is filled, it creates a candidate copy of the appointment letter by appending the candidate name to the file name, e.g. "Appointment Letter - <candidate name>.doc". This is stored on Google Drive.
4. Google Doc Update node – Fills the placeholders in the appointment letter with the candidate-specific details such as Candidate Name, Position Offered, Salary, Date of Joining, etc. (a sketch follows below).
5. Google Drive Download node – Creates a PDF version of the candidate's appointment letter, e.g. "Appointment Letter - <Candidate Name>.pdf", and downloads it.
6. Google Drive Upload node – Uploads the PDF to Google Drive.
7. Gmail Send Message node – Sends an email to HR requesting a review of the candidate's appointment letter, with the option to 'Approve' or 'Reject' it. This is the Human-In-The-Loop step.
8. If node (for routing) – Returns "true" if HR approves and "false" if HR rejects. If HR approves, go to Step 9 and Step 10.
9. Google Drive Download node – Gets the PDF file.
10. Gmail Send Message node – Sends an email to the candidate with the appointment letter (PDF) as the attachment.

How to use
The Form Trigger node is used as an example, but feel free to replace it with other triggers such as Google Sheets.

Create an Appointment Letter Google document with the following fields: Date, Candidate Name, Position Name, Fixed CTC, Joining Date, and To Be Signed By Date. See the sample letter format below:

<Appointment Letter.doc> (Google Document)

Appointment Letter
[Date]
Dear [Candidate Name],
Congratulations! We are pleased to offer you the [Position Name] at ABC Company.
Fixed CTC - [Fixed CTC]
Joining Date - [Joining Date]
Sign the letter by - [To be signed by Date]
Signature

Requirements
- Google Drive for uploading and downloading the file
- Gmail for sending emails
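The placeholder-filling step (step 4) maps form answers onto the bracketed tokens in the template. If you drive it with an HTTP Request node rather than the Google Docs node, the underlying Google Docs API call is batchUpdate with replaceAllText requests; a sketch of a Code node building that body, with hypothetical form field names:

```javascript
// n8n Code node: build a Google Docs batchUpdate body that swaps each
// [Placeholder] for the matching form answer. The form field names are
// hypothetical; align them with your Form Trigger fields.
const form = $input.item.json;

const placeholders = {
  "[Date]": new Date().toISOString().slice(0, 10),
  "[Candidate Name]": form["Candidate Name"],
  "[Position Name]": form["Position Offered"],
  "[Fixed CTC]": form["Salary"],
  "[Joining Date]": form["Date of Joining"],
  "[To be signed by Date]": form["To Be Signed By Date"],
};

return {
  json: {
    requests: Object.entries(placeholders).map(([text, replaceText]) => ({
      replaceAllText: {
        containsText: { text, matchCase: true },
        replaceText: String(replaceText ?? ""),
      },
    })),
  },
};
```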
by RedOne
Who is this for?
This workflow is designed for e-commerce store owners, operations managers, and developers who use Shopify as their e-commerce platform and want an automated way to track and analyze their order data. It is particularly useful for businesses that:
- Need a centralized view of all Shopify orders
- Want to analyze order trends without logging into Shopify
- Need to share order data with team members who don't have Shopify access
- Want to build custom reports based on order information

What Problem Is This Workflow Solving?
While Shopify provides excellent order management within its platform, many businesses need their order data available in other systems for various purposes:
- **Data accessibility**: Not everyone in your organization may have access to Shopify's admin interface
- **Custom reporting**: Google Sheets allows for flexible analysis and report creation
- **Data integration**: Having orders in Google Sheets makes it easier to combine with other business data
- **Backup**: Creates an additional backup of your critical order information

What This Workflow Does
This n8n workflow creates an automated bridge between your Shopify store and Google Sheets:
1. Listens for new order notifications from your Shopify store via webhooks
2. Processes the incoming order data and transforms it into a structured format (a sketch follows below)
3. Stores each new order in a dedicated Google Sheets spreadsheet
4. Sends real-time notifications to Telegram when new orders are received or errors occur

Setup
Create a Google Sheet
- Create a new Google Sheet to store your orders
- Add a sheet named "orders" with the following columns: orderId, orderNumber, created_at, processed, processed_at, json, customer, shippingAddress, lineItems, totalPrice, currency

Set Up Telegram Bot
- Create a Telegram bot using BotFather (send /newbot to @BotFather)
- Save your bot token for use in n8n credentials
- Start a chat with your bot and get your chat ID (you can use @userinfobot)

Configure the Workflow
- Set your Google Sheet ID in the "Edit Variables" node
- Enter your Telegram chat ID in the "Edit Variables" node
- Set up your Telegram API credentials in n8n

Configure Shopify Webhook
- In your Shopify admin, go to: Settings > Notifications > Webhooks
- Create a new webhook for "Order creation"
- Set the URL to your n8n webhook URL (from the "Receive New Shopify Order" node)
- Set the format to JSON

How to Customize This Workflow to Your Needs
- **Additional data**: Modify the "Transform Order Data to Standard Format" function to extract more Shopify data
- **Multiple sheets**: Duplicate the Google Sheets node to store different aspects of orders in separate sheets
- **Telegram messages**: Customize the text in Telegram nodes to include more details or rich formatting
- **Data processing**: Add nodes to perform calculations or transformations on order data
- **Additional notifications**: Add more channels like Slack, Discord, or SMS
- **Integrations**: Extend the workflow to send order data to other systems like CRMs, ERPs, or accounting software

Final Notes
This workflow serves as a foundation that you can build upon to create a comprehensive order management system tailored to your specific business needs.
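A minimal sketch of what the "Transform Order Data to Standard Format" Code node might do, mapping a Shopify order-creation webhook payload onto the sheet columns above. The field names used (id, order_number, customer, line_items, total_price, currency) follow Shopify's standard Order resource, but verify them against a real webhook body:

```javascript
// n8n Code node (run once per item): flatten a Shopify "Order creation"
// webhook payload into one row matching the "orders" sheet columns.
const order = $input.item.json.body ?? $input.item.json;

const customer = order.customer
  ? `${order.customer.first_name ?? ""} ${order.customer.last_name ?? ""}`.trim()
  : "";

return {
  json: {
    orderId: order.id,
    orderNumber: order.order_number,
    created_at: order.created_at,
    processed: false,
    processed_at: "",
    json: JSON.stringify(order),   // keep the full payload for reference
    customer,
    shippingAddress: order.shipping_address
      ? `${order.shipping_address.address1}, ${order.shipping_address.city}, ${order.shipping_address.country}`
      : "",
    lineItems: (order.line_items ?? [])
      .map(li => `${li.quantity} x ${li.title}`)
      .join("; "),
    totalPrice: order.total_price,
    currency: order.currency,
  },
};
```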
by n8n Team
This workflow creates a Jira issue when a new ticket is created in Zendesk. Subsequent comments on the ticket in Zendesk are added as comments to the issue in Jira.

Prerequisites
- Zendesk account and Zendesk credentials.
- Jira account and Jira credentials.
- Jira project to create issues in.

How it works
1. The workflow listens for new tickets in Zendesk.
2. When a new ticket is created, the workflow creates a new issue in Jira.
3. The Jira issue key is then saved in one of the ticket's fields (in setup we call this "Jira Issue Key").
4. The next time a comment is added to the ticket, the workflow retrieves the Jira issue key from the ticket's field and adds the comment to the issue in Jira.

Setup
This workflow requires that you set up a webhook in Zendesk. To do so, follow the steps below:
1. In the workflow, open the On new Zendesk ticket node and copy the webhook URL.
2. In Zendesk, navigate to Admin Center > Apps and integrations > Webhooks > Actions > Create Webhook.
3. Add all the required details, which can be retrieved from the On new Zendesk ticket node. The webhook URL gets added to the “Endpoint URL” field, and the “Request method” should match what is shown in n8n.
4. Save the webhook.
5. In Zendesk, navigate to Admin Center > Objects and rules > Business rules > Triggers > Add trigger.
6. Give the trigger a name such as “New tickets”.
7. Under “Conditions” in “Meet ALL of the following conditions”, add “Status is New”.
8. Under “Actions”, select “Notify active webhook” and select the webhook you created previously.
9. In the JSON body, add the following:

```json
{
  "id": "{{ticket.id}}",
  "comment": "{{ticket.latest_comment_html}}"
}
```

10. Save the Zendesk trigger.

You will also need to set up a field in Zendesk to store the Jira issue key. To do so, follow the steps below:
1. In Zendesk, navigate to Admin Center > Objects and rules > Tickets > Fields > Add field.
2. Use the text field option and give the field a name such as “Jira Issue Key".
3. Save the field.
4. In n8n, open the Update ticket node and select the field you created in Zendesk.
by Tony Duffy
Read and store IoT sensor data with the MQTT Trigger and InfluxDB
tonyduffy@protonmail.com

This workflow is for users who want a practical example of how to obtain data from remote IoT systems using the MQTT protocol in an n8n environment. The template provides typical n8n node implementation and configuration settings necessary to read and store IoT data.

The workflow reads the temperature and humidity data from a remote IoT system, in this case a DHT22 sensor connected to an ESP32 microcontroller. The data is parsed into the correct JSON format and then ingested into an InfluxDB data bucket. From there the stored temperature and humidity values can be displayed in real time. The workflow can be easily modified to read any MQTT-driven device data.

Remote IoT Sensor Setup
The ESP32 controller with the DHT22 sensor runs on a Wokwi simulator. The simulator uses MicroPython to publish an MQTT "wokwi-weather" topic with the temperature and humidity payloads to an online Mosquitto MQTT broker. The n8n MQTT Trigger node subscribes to the topic on the broker and reads the payload values when any changes are published. The Code node then prepares the payload in JSON format. The HTTP Request node ingests the data into an InfluxDB bucket.

How to customise this workflow to your needs

Wokwi IoT ESP32 simulator
You will need to set up a free account at Wokwi.com. Once created, search for the project "Micro-Python MQTT Weather Logger (ESP32)". When the MQTT weather logger project is open, change lines 28 and 29 to the following:

```
28 MQTT_CLIENT_ID = ""
29 MQTT_BROKER = "test.mosquitto.org"
```

You can then start the simulation by clicking on the green arrow; it will connect to the Mosquitto broker and the "wokwi-weather" topic will be published. By clicking on the DHT22 sensor, the temperature and humidity bar will appear and you can change the values to send updated payload values to the broker.

InfluxDB
You will require access to a functioning InfluxDB database to utilise this workflow.
Note: you will have to provide the following for the HTTP Request node to connect to InfluxDB:
- The URL and port of the desired InfluxDB (in this case the InfluxDB is running locally on port 8086, i.e. http://localhost:8086).
- An InfluxDB bucket for the data (in this case the created bucket name is "wokwi-data").
- The Organization ID of the InfluxDB. This can be obtained from the InfluxDB admin page.
- A generated API token to read and write to the InfluxDB bucket, created from the InfluxDB admin page.

n8n workflow
The MQTT Trigger node is configured to subscribe to the "wokwi-weather" topic on the test Mosquitto MQTT broker. It reads the temperature and humidity data sent by the ESP32. The Code node uses JavaScript to move the temperature and humidity payloads into JSON format; this is flexible and easily modified. The HTTP Request node posts the payloads to the InfluxDB bucket (a sketch of both steps follows below). When the above is configured, the workflow should function correctly.

Thanks to the many who have downloaded this template. Let me know what you would like to build next. Contact me at tonyduffy@protonmail.com
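A minimal sketch of the Code node plus the HTTP Request settings, assuming the simulator publishes a payload like {"temp": 24.5, "humidity": 40.2} (the exact keys depend on the Wokwi script). Note that InfluxDB v2's write endpoint accepts line protocol rather than raw JSON, so the Code node below also builds that string for the HTTP Request node to POST:

```javascript
// n8n Code node (run once per item): parse the MQTT payload and build an
// InfluxDB v2 line-protocol string. The payload key names ("temp",
// "humidity") are assumptions; match them to what the Wokwi script publishes.
const payload = typeof $input.item.json.message === "string"
  ? JSON.parse($input.item.json.message)
  : $input.item.json;

const line =
  `wokwi-weather temperature=${Number(payload.temp)},humidity=${Number(payload.humidity)}`;

return { json: { line } };

// HTTP Request node settings (InfluxDB v2 write API):
//   POST http://localhost:8086/api/v2/write?org=<ORG_ID>&bucket=wokwi-data&precision=s
//   Header: Authorization: Token <API_TOKEN>
//   Header: Content-Type: text/plain; charset=utf-8
//   Body:   {{ $json.line }}
```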
by Matheus Pedrosa
Who is this template for?
This template is ideal for n8n instance administrators, developers, and DevOps teams who need a proactive and organized way to monitor the health of their automations. If you want to be notified about failures as soon as they happen, without having to manually check execution logs, this workflow is for you.

What does this template do?
This workflow automates error monitoring on your n8n instance. Every hour, it performs the following steps:
1. Queries the n8n API to fetch all executions that have failed in the last hour.
2. Groups the errors by workflow to consolidate the information (a sketch follows below).
3. Builds a rich message for each failed workflow, including the error count.
4. Sends an alert to a Slack channel with a button to open the workflow directly, allowing for immediate investigation.

Requirements
Before you start, you will need to have the following configured in your n8n instance:
- **n8n API Credentials:** You need to generate an API key in your n8n instance settings so the workflow can query execution data.
- **Slack Credentials:** A configured Slack (OAuth2 API) credential to allow n8n to send messages to your workspace.

How to set it up
Setup is simple and only takes a few minutes:
1. Config node: In the node named "Config", you must set the value for the baseUrl to your n8n instance's URL (e.g., https://n8n.yourdomain.com). This is crucial for generating the correct workflow links in the Slack message.
2. Schedule Trigger: The workflow is pre-configured to run every hour. You can adjust the frequency in this node to fit your needs.
3. "Get Failed Executions" node (HTTP Request): Under Authentication, select 'Header Auth'. In the Credentials field, select your n8n API credential.
4. "Post to Slack" node (Slack): Select your Slack credential. In the Channel field, enter the name of the channel where error notifications should be sent (e.g., #n8n-alerts).
5. Activate the workflow! After these steps, just activate the workflow to start the automatic error monitoring.

How to customize the workflow
You can easily customize this template:
- **Change the Schedule:** Modify the Schedule Trigger node to run at different intervals (every 15 minutes, once a day, etc.).
- **Change the Notification Channel:** Instead of Slack, you can replace the last node to send notifications to Discord, Microsoft Teams, Telegram, or even email.
- **Add More Information:** You can modify the MakeMessage node that generates the message to include more details about the errors, such as the error message or the exact time of failure.
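A minimal sketch of the grouping step (step 2). It assumes the "Get Failed Executions" node calls GET /api/v1/executions?status=error on your instance and that each returned execution carries workflowId, workflowData.name, and startedAt; that matches the public n8n API, but verify it on your version:

```javascript
// n8n Code node (run once for all items): group failed executions by
// workflow and emit one item per workflow with an error count and a link.
// baseUrl is read from the "Config" node described above.
const baseUrl = $('Config').first().json.baseUrl;

const groups = {};
for (const item of $input.all()) {
  const exec = item.json;
  const id = exec.workflowId;
  if (!groups[id]) {
    groups[id] = {
      workflowId: id,
      workflowName: exec.workflowData?.name ?? id,
      errorCount: 0,
      lastFailure: exec.startedAt,
      url: `${baseUrl}/workflow/${id}`,   // used for the Slack button
    };
  }
  groups[id].errorCount += 1;
}

return Object.values(groups).map(g => ({ json: g }));
```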