by Marth
How it works

This automation helps revive expired property listings by:

- Reading listing data from a Google Sheet that tracks all properties.
- Filtering listings where the last_activity date is older than 30 days.
- Generating a personalized email using OpenAI (GPT-4) to re-engage the owner.
- Sending the email to the property owner using Gmail or SMTP.
- (Optional) Updating the listing's status to followed_up in the Sheet once the email is sent.

This workflow ensures no opportunity is missed by proactively reactivating cold leads.

Set Up Steps

1. Prepare your Google Sheet
   - Create a Google Sheet with these columns: title, owner_name, email, property_type, location, last_activity
   - Fill in sample data for testing.
2. Connect Google Sheets in n8n
   - Add a Google Sheets node.
   - Use the "Read Rows" operation to load the listing data.
3. Filter listings inactive for 30+ days
   - Use a Set node to convert last_activity to a Date.
   - Add an IF node or Code node to check if the listing is older than 30 days (a sketch of this check appears below).
4. Generate email content with OpenAI
   - Add an OpenAI node.
   - Use dynamic input (e.g. owner name, property type) to create a follow-up message.
5. Send the email
   - Add a Gmail node or SMTP node to send the email to the property owner.
6. (Optional) Update status
   - Use a Google Sheets "Update Row" node to change the listing's status to followed_up.
7. Test the full workflow
   - Manually trigger the workflow or schedule it to run daily/weekly.
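A minimal sketch of the 30-day check as an n8n Code node (run once for all items); it assumes each row's last_activity holds a date string that new Date() can parse:

```javascript
// Keep only listings whose last_activity is more than 30 days old.
// Assumes last_activity is a parseable date string, e.g. "2024-01-15".
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;
const cutoff = Date.now() - THIRTY_DAYS_MS;

return $input.all().filter((item) => {
  const lastActivity = new Date(item.json.last_activity).getTime();
  // Skip rows with unparseable dates rather than emailing them by mistake.
  if (Number.isNaN(lastActivity)) return false;
  return lastActivity < cutoff;
});
```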
by Yaron Been
Automated monitoring system that sends instant alerts when target companies make technology changes, delivered directly to your inbox or Slack.

🚀 What It Does
- Monitors technology stack changes
- Sends real-time email alerts
- Posts updates to Slack
- Tracks historical changes
- Filters by technology type

🎯 Perfect For
- Sales teams
- IT departments
- Competitive intelligence
- Technology vendors
- Market researchers

⚙️ Key Benefits
✅ Instant technology change alerts
✅ Multiple notification channels
✅ Historical tracking
✅ Customizable filters
✅ Team collaboration

🔧 What You Need
- BuiltWith API access
- Email service (SMTP/SendGrid)
- Slack workspace (optional)
- n8n instance

📊 Alerts Include
- Company name
- Technology changes
- Timestamp
- Impact assessment
- Direct links

🛠️ Setup & Support
Quick Setup: Get alerts in 15 minutes with our step-by-step guide
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help

Stay informed about technology changes that matter to your business with automated monitoring alerts and notifications.
by Abrar Sami
Auto-generate & post content using AI

This workflow helps you create daily content using just a topic prompt. It writes a tweet, generates an image, and publishes across Twitter, Facebook, and LinkedIn, all on autopilot.

How it works
- Triggers daily at 10 PM to start the flow
- Uses OpenAI to generate a niche topic title
- Writes a short-form post (tweet style) with hashtags (see the character-limit sketch below)
- Generates a Japanese anime-style image for visual context
- Saves everything in Google Sheets
- Publishes automatically on Twitter, LinkedIn, and Facebook

Set up steps
- You'll need OpenAI, Google Sheets, and social media credentials (Twitter, Facebook, LinkedIn)
- Takes about 10–15 minutes to configure if you already have the credentials ready
- Make sure your Sheet and API keys are properly linked before activating
- 📝 Keep detailed notes inside the workflow with sticky notes for easier handoff or collaboration.
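One detail worth handling before the Twitter publish step is the 280-character limit. A hypothetical Code node like the sketch below could trim the generated post before posting; the `post` and `hashtags` field names are assumptions, not part of the template:

```javascript
// Trim the AI-generated post so post + hashtags fit Twitter's 280-char limit.
// The `post` (string) and `hashtags` (array) field names are assumed here.
const MAX_TWEET = 280;

return $input.all().map((item) => {
  const hashtags = (item.json.hashtags || []).join(' ');
  const room = MAX_TWEET - hashtags.length - 1; // minus the separator space
  let post = item.json.post || '';
  if (post.length > room) {
    post = post.slice(0, room - 1).trimEnd() + '…';
  }
  return { json: { ...item.json, tweet: `${post} ${hashtags}`.trim() } };
});
```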
by Teddy
Scrape Trending Repositories from GitHub

Who is this for?
This workflow is designed for developers, researchers, and data analysts who need to track the latest trending repositories on GitHub. It is useful for anyone who wants to stay updated on popular open-source projects without manually browsing GitHub's trending page.

What problem is this workflow solving?
Manually checking GitHub's trending repositories daily can be time-consuming and inefficient. This workflow automates the extraction of trending repositories, providing structured data including repository name, author, description, programming language, and direct repository links.

What this workflow does
This workflow scrapes the trending repositories from GitHub's trending page and extracts essential metadata such as repository names, languages, descriptions, and URLs. It processes the extracted data and structures it into an easy-to-use format.

Setup
1. Ensure you have n8n installed and configured.
2. Import this workflow into your n8n instance.
3. Run the workflow manually or schedule it to execute at regular intervals.
4. (Optional) Customize the extracted data or integrate it with other systems.

How to customize this workflow to your needs
- Modify the HTTP request node to target different GitHub trending categories (e.g., specific programming languages).
- Add further processing steps such as filtering repositories by stars, forks, or specific keywords.
- Integrate this workflow with Slack, email, or a database to store or notify about trending repositories.

Workflow Steps
1. Trigger execution manually using the "When clicking 'Test workflow'" node.
2. Send an HTTP request to fetch GitHub's trending page using "Request to Github Trend".
3. Extract the trending repositories box from the HTML response using "Extract Box".
4. Extract all repository data including names, authors, descriptions, and languages using "Extract all repositories".
5. Convert extracted data into a structured list for easier processing using "Turn to a list" (sketched below).
6. Extract detailed repository information using "Extract repository data".
7. Format and set variables to ensure clean and structured data output using "Set Result Variables".

Note: Since GitHub's trending page updates dynamically, ensure you run this workflow periodically to capture the latest trends.
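A minimal sketch of the "Turn to a list" step, assuming the upstream HTML extraction returns parallel arrays (one per CSS selector); the names, descriptions, and languages field names are illustrative assumptions:

```javascript
// Zip the parallel arrays from the HTML Extract step into one item per repo.
// Field names (names/descriptions/languages) are assumptions for illustration.
const { names = [], descriptions = [], languages = [] } = $input.first().json;

return names.map((fullName, i) => {
  // Trending entries are formatted as "author / repo".
  const [author, repo] = fullName.split('/').map((s) => s.trim());
  return {
    json: {
      author,
      name: repo,
      description: (descriptions[i] || '').trim(),
      language: (languages[i] || '').trim(),
      url: `https://github.com/${author}/${repo}`,
    },
  };
});
```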
by Open Paws
Who's it for 🎯
This workflow is ideal for outreach specialists, fundraisers, campaigners, and professionals who want to build authentic connections by researching prospects deeply and strategically. It helps users understand prospects' backgrounds, interests, and mutual connections to craft effective outreach.

How it works / What it does ⚙️
Using the Multi-tool Research Agent subworkflow, it analyzes both the prospector's and prospect's profiles, social media, and online presence. The workflow verifies identities, uncovers key connection points, and generates a comprehensive HTML report with actionable insights, conversation starters, and suggested engagement tactics.

How to set up 🛠️
1. Import this workflow and the Multi-tool Research Agent subworkflow.
2. Configure required API credentials.
3. Provide inputs: prospector and prospect names, social media URLs, and outreach goal.
4. Test the workflow to ensure accurate research and report generation.

Requirements 📋
- n8n instance with internet access
- Valid API keys
- Multi-tool Research Agent subworkflow installed and linked
- Optional email node for sending reports directly

How to customize 🔧
- Update input parameters to suit your outreach use case.
- Modify research prompts in the subworkflow for tone or focus.
- Customize the HTML report design for branding or format preferences.
- Attach an email node to send reports automatically or route output as needed.

Use this workflow to power personalized, strategic outreach with data-driven insights.
by Parth Pansuriya
AI-Powered Daily Gmail Digest Summary using LangChain & OpenRouter

This n8n template helps you automatically summarize your daily Gmail messages using OpenRouter's GPT model via LangChain. It generates a structured email digest highlighting key information, tasks, issues, and action items, all delivered to your inbox every morning.

Who's it for
- Busy professionals who want a quick overview of their daily emails
- Founders or managers needing to track team or client communication
- Anyone looking to automate inbox triage and reduce time spent on emails

How it works / What it does
This n8n workflow runs every morning at 7 AM, automatically:
1. Fetches emails from the last 24 hours
2. Collects important fields: sender, subject, and snippets (see the sketch below)
3. Feeds them into an AI-powered agent (OpenRouter + LangChain)
4. The AI extracts key topics, tasks, deadlines, and issues, and formats the info clearly as a bullet-point summary
5. Sends the final summarized report to your inbox

How to set up
1. Clone or import the workflow into your n8n instance
2. Replace <Your Email ID> in the Code node with your actual Gmail address (or remove it if not needed)
3. Ensure your Gmail and OpenRouter credentials are set up in n8n
4. Update the recipient email in the Send Summary node if you want it sent to a fixed address
5. Activate the workflow once tested

How to customize the workflow
- **Change Summary Style:** Edit the system message in the LangChain Agent to match your tone (e.g. casual, business, detailed)
- **Adjust Digest Time:** Change the Schedule Trigger to any preferred hour
- **Customize Recipients:** Change or add recipients dynamically or statically in the Gmail send node
- **Filter Email Type:** Modify the Gmail query in the Code node to include filters like from:, is:unread, subject:project
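A minimal sketch of the field-collection step as a Code node; it assumes the Gmail node's simplified output exposes From, Subject, and snippet (the exact field names depend on the node's Simplify setting):

```javascript
// Reduce each Gmail message to just the fields the summarizer needs.
// Field names (From/Subject/snippet) assume the Gmail node's simplified output.
return $input.all().map((item) => ({
  json: {
    from: item.json.From,
    subject: item.json.Subject,
    snippet: item.json.snippet,
  },
}));
```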
by Rosh Ragel
This workflow processes emails received in Gmail and adds the sender's name and email address to a MySQL database.

Use Cases:
- A sales or marketing agency can use this to automatically save client contact info to a database to build a list of leads
- Companies can use this to automatically save contacts to a database in case of Gmail data loss or losing access to their Gmail account
- Companies can build mailing lists to automatically send promotions to all of the clients who have contacted them in a given time period

Before using, you need to have:
- Gmail credential
- MySQL database credential
- A table in the MySQL database to store your contacts
  - The table should have a "name" column, which allows NULL values
  - The table should have an "email" column, which should be UNIQUE

How it works:
1. The Gmail Trigger will listen for a new email every minute
2. For each email, the Code node will extract the name and email address of the sender; if there is no name, it will return null (sketched below)
3. The MySQL node will insert the new contact into a table in your database
4. If the contact email already exists in your database, the MySQL node will update the contact name

How to use:
- Set up the MySQL node by selecting the correct table to store contacts in
- Choose your "email" column to match on
- Choose your "name" column to store names

Customizing this Workflow:
You can customize this workflow to save more data to MySQL. In the MySQL node, click "Add Value", and choose one of the fields from the Gmail node to save in your database column. You can try saving the following items:
- Subject line
- MessageID
- ThreadID
- Snippet
- Recipient Info
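A minimal sketch of that extraction step, assuming the trigger exposes the sender as a From header in the usual Jane Doe &lt;jane@example.com&gt; format:

```javascript
// Parse the sender's From header, e.g. `Jane Doe <jane@example.com>`
// or a bare `jane@example.com`. Name is null when only an address is present.
return $input.all().map((item) => {
  const from = item.json.From || '';
  const match = from.match(/^\s*"?([^"<]*)"?\s*<([^>]+)>\s*$/);
  return {
    json: {
      name: match && match[1].trim() ? match[1].trim() : null,
      email: (match ? match[2] : from).trim().toLowerCase(),
    },
  };
});
```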
by Satish
This n8n template demonstrates automating an appointment letter creation process using a template, with HR approval before the appointment letter is emailed to the candidate.

How it works
1. Create an appointment letter template, e.g. "Appointment Letter.doc", on Google Drive.
2. Form Submission node - Create a form trigger with the required fields that need to be captured as part of the appointment letter, e.g. Candidate Name, Position offered, Salary, Date of Joining, Candidate email, etc.
3. Google Drive Copy node - Once the form is filled, it creates a candidate copy of the appointment letter by appending the candidate name to the file name, e.g. "Appointment Letter - <candidate name>.doc". This is stored on Google Drive.
4. Google Doc Update node - Fill the placeholders in the appointment letter with the candidate-specific details such as Candidate Name, Position offered, Salary, Date of Joining, etc. (see the sketch at the end of this section).
5. Google Drive Download node - Create a PDF version of the candidate's appointment letter, e.g. "Appointment Letter - <Candidate Name>.pdf", and download it.
6. Google Drive Upload node - Upload the PDF to Google Drive.
7. Gmail Send Message node - Send an email to HR requesting a review of the candidate's appointment letter, with the option to 'Approve' or 'Reject' it. This is the Human-In-The-Loop step.
8. If node (for routing) - Returns "true" if HR approves and "false" if HR rejects. If HR approves, continue to steps 9 and 10.
9. Google Drive Download node - Get the PDF file.
10. Gmail Send Message node - Send an email to the candidate with the appointment letter (PDF) as the attachment.

How to use
The Form trigger node is used as an example, but feel free to replace it with other triggers such as Google Sheets. Create an Appointment Letter Google document with the following fields: Date, Candidate Name, Position Name, Fixed CTC, Joining Date, and To be signed by Date. See the sample letter format below:

<Appointment Letter.doc> (Google Document)

Appointment Letter
[Date]
Dear [Candidate Name],
Congratulations! We are pleased to offer you the [Position Name] at ABC Company.
Fixed CTC - [Fixed CTC]
Joining Date - [Joining Date]
Sign the letter by - [To be signed by Date]
Signature

Requirements
- Google Drive for uploading and downloading the file
- Gmail for sending emails
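As a rough sketch of the placeholder-filling step: the Google Docs API (which the Google Doc Update node wraps) accepts replaceAllText requests like the ones built below. The form field names here are assumptions based on the fields listed above:

```javascript
// Build one replaceAllText request per placeholder in the letter template.
// Form field names ("Candidate Name", "Position offered", ...) are assumed.
const form = $input.first().json;

const placeholders = {
  '[Date]': new Date().toLocaleDateString('en-GB'),
  '[Candidate Name]': form['Candidate Name'],
  '[Position Name]': form['Position offered'],
  '[Fixed CTC]': form['Salary'],
  '[Joining Date]': form['Date of Joining'],
  '[To be signed by Date]': form['To be signed by Date'],
};

return [{
  json: {
    requests: Object.entries(placeholders).map(([text, replaceText]) => ({
      replaceAllText: {
        containsText: { text, matchCase: true },
        replaceText: String(replaceText ?? ''),
      },
    })),
  },
}];
```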
by RedOne
This workflow is designed for e-commerce store owners, operations managers, and developers who use Shopify as their e-commerce platform and want an automated way to track and analyze their order data. It is particularly useful for businesses that:

- Need a centralized view of all Shopify orders
- Want to analyze order trends without logging into Shopify
- Need to share order data with team members who don't have Shopify access
- Want to build custom reports based on order information

What Problem Is This Workflow Solving?
While Shopify provides excellent order management within its platform, many businesses need their order data available in other systems for various purposes:

- **Data accessibility:** Not everyone in your organization may have access to Shopify's admin interface
- **Custom reporting:** Google Sheets allows for flexible analysis and report creation
- **Data integration:** Having orders in Google Sheets makes it easier to combine with other business data
- **Backup:** Creates an additional backup of your critical order information

What This Workflow Does
This n8n workflow creates an automated bridge between your Shopify store and Google Sheets:

1. Listens for new order notifications from your Shopify store via webhooks
2. Processes the incoming order data and transforms it into a structured format (see the sketch below)
3. Stores each new order in a dedicated Google Sheets spreadsheet
4. Sends real-time notifications to Telegram when new orders are received or errors occur

Setup
1. Create a Google Sheet
   - Create a new Google Sheet to store your orders
   - Add a sheet named "orders" with the following columns: orderId, orderNumber, created_at, processed, processed_at, json, customer, shippingAddress, lineItems, totalPrice, currency
2. Set Up Telegram Bot
   - Create a Telegram bot using BotFather (send /newbot to @BotFather)
   - Save your bot token for use in n8n credentials
   - Start a chat with your bot and get your chat ID (you can use @userinfobot)
3. Configure the Workflow
   - Set your Google Sheet ID in the "Edit Variables" node
   - Enter your Telegram chat ID in the "Edit Variables" node
   - Set up your Telegram API credentials in n8n
4. Configure Shopify Webhook
   - In your Shopify admin, go to: Settings > Notifications > Webhooks
   - Create a new webhook for "Order creation"
   - Set the URL to your n8n webhook URL (from the "Receive New Shopify Order" node)
   - Set the format to JSON

How to Customize This Workflow to Your Needs
- **Additional data:** Modify the "Transform Order Data to Standard Format" function to extract more Shopify data
- **Multiple sheets:** Duplicate the Google Sheets node to store different aspects of orders in separate sheets
- **Telegram messages:** Customize the text in Telegram nodes to include more details or rich formatting
- **Data processing:** Add nodes to perform calculations or transformations on order data
- **Additional notifications:** Add more channels like Slack, Discord, or SMS
- **Integrations:** Extend the workflow to send order data to other systems like CRMs, ERPs, or accounting software

Final Notes
This workflow serves as a foundation that you can build upon to create a comprehensive order management system tailored to your specific business needs.
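A minimal sketch of the transform step, assuming Shopify's standard "Order creation" webhook payload (field names can vary slightly between API versions, so verify against your store before relying on them):

```javascript
// Map a Shopify "Order creation" webhook payload onto the sheet's columns.
// Payload field names follow Shopify's standard order webhook shape.
const order = $input.first().json;

return [{
  json: {
    orderId: order.id,
    orderNumber: order.order_number,
    created_at: order.created_at,
    processed: false,
    processed_at: '',
    json: JSON.stringify(order),
    customer: `${order.customer?.first_name ?? ''} ${order.customer?.last_name ?? ''}`.trim(),
    shippingAddress: order.shipping_address
      ? `${order.shipping_address.address1}, ${order.shipping_address.city}, ${order.shipping_address.country}`
      : '',
    lineItems: (order.line_items || []).map((li) => `${li.quantity}x ${li.title}`).join('; '),
    totalPrice: order.total_price,
    currency: order.currency,
  },
}];
```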
by Tony Duffy
Read and store IOT sensor data with the MQTT Trigger and InfluxDB
tonyduffy@protonmail.com

This workflow is for users wanting a practical example of how to obtain data from remote IOT systems using the MQTT protocol in an n8n environment. The template provides typical n8n node implementation and configuration settings necessary to read and store IOT data.

The workflow reads the temperature and humidity data from a remote IOT system, in this case a DHT22 sensor connected to an ESP32 microcontroller. The data is parsed into the correct JSON format and then ingested into an InfluxDB data bucket. From there the stored temperature and humidity values can be displayed in real time. The workflow can be easily modified to read any MQTT-driven device data.

Remote IOT Sensor Setup
The ESP32 controller with the DHT22 sensor runs on a Wokwi simulator. The simulator uses MicroPython to publish an MQTT "wokwi-weather" topic with the temperature and humidity payloads to an online Mosquitto MQTT broker. The n8n MQTT Trigger node subscribes to the topic on the broker and reads the payload values when any changes are published. The Code node then prepares the payload in JSON format. The HTTP Request node ingests the data into an InfluxDB bucket.

How to customise this workflow to your needs

Wokwi IOT ESP32 simulator
- You will need to set up a free account at Wokwi.com
- Once created, search for the project "Micro-Python MQTT Weather Logger (ESP32)"
- When the MQTT weather logger project is open, change lines 28 and 29 to the following:
  - 28 MQTT_CLIENT_ID = ""
  - 29 MQTT_BROKER = "test.mosquitto.org"
- You can then start the simulation by clicking on the green arrow; it will connect to the Mosquitto broker and the "wokwi-weather" topic will be published.
- By clicking on the DHT22 sensor, the temperature and humidity bar will appear and you can change the values to send updated payload values to the broker.

InfluxDB
You will require access to a functioning InfluxDB database to use this workflow. Note: you will have to provide the following for the HTTP Request node to connect to InfluxDB:
- The URL and port of the desired InfluxDB (in this case InfluxDB is running locally on port 8086, i.e. http://localhost:8086)
- An InfluxDB bucket for the data (in this case the created bucket name is "wokwi-data")
- The Organization ID of the InfluxDB, which can be obtained from the InfluxDB admin page
- A generated API token to read and write to the InfluxDB bucket, created from the InfluxDB admin page

n8n workflow
The MQTT Trigger node is configured to subscribe to the "wokwi-weather" topic on the test Mosquitto MQTT broker. It reads the temperature and humidity data sent by the ESP32. The Code node uses JavaScript to move the temperature and humidity payloads into JSON format; this is flexible and easily modified. The HTTP Request node posts the payloads to the InfluxDB bucket (see the sketch below). When the above is configured, the workflow should function correctly.

Thanks to the many who have downloaded this template. Let me know what you would like to build. Contact me at tonyduffy@protonmail.com
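A minimal sketch of the Code node, assuming the MQTT Trigger exposes the raw payload as a message string like {"temp": 24.5, "humidity": 40.1}. Note that InfluxDB's v2 write endpoint (POST /api/v2/write?org=&lt;org&gt;&bucket=wokwi-data&precision=s, with an Authorization: Token &lt;api-token&gt; header) expects line protocol in the request body, so the sketch formats the values accordingly for the HTTP Request node:

```javascript
// Parse the MQTT payload and emit InfluxDB line protocol for the HTTP
// Request node. The `message` field name and payload keys are assumptions.
const payload = JSON.parse($input.first().json.message);

// measurement: wokwi_weather; fields: temperature/humidity; timestamp in s.
const line =
  `wokwi_weather temperature=${payload.temp},humidity=${payload.humidity} ` +
  Math.floor(Date.now() / 1000);

return [{ json: { line } }];
```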
by Matheus Pedrosa
Who is this template for?
This template is ideal for n8n instance administrators, developers, and DevOps teams who need a proactive and organized way to monitor the health of their automations. If you want to be notified about failures as soon as they happen, without having to manually check execution logs, this workflow is for you.

What does this template do?
This workflow automates error monitoring on your n8n instance. Every hour, it performs the following steps:
1. Queries the n8n API to fetch all executions that have failed in the last hour.
2. Groups the errors by workflow to consolidate the information (see the sketch below).
3. Builds a rich message for each failed workflow, including the error count.
4. Sends an alert to a Slack channel with a button to open the workflow directly, allowing for immediate investigation.

Requirements
Before you start, you will need to have the following configured in your n8n instance:
- **n8n API Credentials:** You need to generate an API key in your n8n instance settings so the workflow can query execution data.
- **Slack Credentials:** A configured **Slack (OAuth2 API)** credential to allow n8n to send messages to your workspace.

How to set it up
Setup is simple and only takes a few minutes:
1. Config Node: In the node named "Config", you must set the value for the baseUrl to your n8n instance's URL (e.g., https://n8n.yourdomain.com). This is crucial for generating the correct workflow links in the Slack message.
2. Schedule Trigger: The workflow is pre-configured to run every hour. You can adjust the frequency in this node to fit your needs.
3. "Get Failed Executions" Node (HTTP Request): Under Authentication, select 'Header Auth'. In the Credentials field, select your n8n API credential.
4. "Post to Slack" Node (Slack): Select your Slack credential. In the Channel field, enter the name of the channel where error notifications should be sent (e.g., #n8n-alerts).
5. Activate the Workflow! After these steps, just activate the workflow to start the automatic error monitoring.

How to customize the workflow
You can easily customize this template:
- **Change the Schedule:** Modify the Schedule Trigger node to run at different intervals (every 15 minutes, once a day, etc.).
- **Change the Notification Channel:** Instead of Slack, you can replace the last node to send notifications to Discord, Microsoft Teams, Telegram, or even email.
- **Add More Information:** You can modify the MakeMessage node that generates the message to include more details about the errors, such as the error message or the exact time of failure.
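A minimal sketch of the grouping step, assuming the n8n public API's executions endpoint returns a data array where each execution carries a workflowId (and, depending on query options, the workflow name under workflowData):

```javascript
// Group failed executions by workflow and build one item per workflow,
// including a direct link. Assumes the API response shape described above.
const executions = $input.first().json.data || [];
const baseUrl = $('Config').first().json.baseUrl; // set in the "Config" node

const byWorkflow = {};
for (const exec of executions) {
  const id = exec.workflowId;
  byWorkflow[id] = byWorkflow[id] || { name: exec.workflowData?.name, count: 0 };
  byWorkflow[id].count += 1;
}

return Object.entries(byWorkflow).map(([workflowId, info]) => ({
  json: {
    workflowId,
    name: info.name || workflowId,
    errorCount: info.count,
    link: `${baseUrl}/workflow/${workflowId}`,
  },
}));
```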
by Artur
Overview
This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via Airtable, and sends new job opportunities to Slack.

Key Features:
- **Automated job retrieval** from Upwork via the Apify API
- **Duplicate filtering** using Airtable to store only unique jobs
- **Slack notifications** for new job postings
- **Runs every 20 minutes** during working hours (9 AM - 5 PM)

This workflow requires an active Apify subscription to function, as it uses the Apify Upwork API to fetch job listings.

Who is This For?
This workflow is ideal for:
- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

What Problem Does This Solve?
Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:
- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

How the Workflow Works
1. Schedule Trigger (Every 20 Minutes)
   - Triggers the workflow at 20-minute intervals
   - Ensures job searches are only executed during working hours (9 AM - 5 PM)
2. Query Upwork for Jobs
   - Uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python")
3. Find Existing Jobs in Airtable
   - Searches Airtable to check if a job (based on title and link) already exists
4. Filter Out Duplicate Jobs
   - The Merge node compares Upwork jobs with Airtable data
   - The IF node filters out jobs that are already stored in the database (see the sketch below)
5. Save Only New Jobs in Airtable
   - The Insert node adds only new job listings to the Airtable collection
6. Send a Slack Notification
   - If a new job is found, a Slack message is sent with job details

Setup Guide
Required API Keys:
- Upwork Scraper (Apify Token) - Get your token from Apify
- Airtable Credentials
- Slack API Token - Connect Slack to n8n and set the channel ID (default: #general)

Configuration Steps:
1. Modify search keywords in the 'Assign Parameters' node (startUrls)
2. Adjust the working hours in the 'If Working Hours' node
3. Set your Slack channel in the Slack node
4. Ensure Airtable is connected properly; you'll need to create a table with 'title' and 'link' columns
5. Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates constantly

How to Customize the Workflow
- **Change keywords:** Update the startUrls in the 'Assign Parameters' node to track different job categories
- **Change 'If Working Hours':** Modify conditions in the IF node to filter times based on your needs
- **Modify Slack Notifications:** Adjust the Slack message format to include additional job details

Why Use This Workflow?
- Automated job tracking without manual searches
- Prevents duplicate entries in Airtable
- Instant Slack notifications for new job opportunities
- Customizable - adapt the workflow to different job categories

Next Steps
- Run the workflow and test with a small set of keywords
- Expand job categories for better coverage
- Enhance notifications by integrating Telegram, Email, or a dashboard

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
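A minimal sketch of the duplicate filter as a single Code node (an alternative to the Merge + IF pair), assuming both the scraped jobs and the Airtable records carry a link field and that the Airtable search node is named 'Find Existing Jobs' (a hypothetical label):

```javascript
// Drop any scraped job whose link already exists in Airtable.
// The upstream node name 'Find Existing Jobs' is a hypothetical label.
const existing = new Set(
  $('Find Existing Jobs').all().map((i) => i.json.link),
);

return $input.all().filter((item) => !existing.has(item.json.link));
```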