by Shahrear
📜 AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)
⚙️ What This Workflow Does
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores the data, alerts your team, and schedules reminders through Google Sheets, Calendar, and Slack.
🧩 Requirements
**Google Drive OAuth2** for monitoring and downloads
**VLM Run API credentials** with Execute Agent access
**Google Sheets OAuth2** for structured record storage
**Google Calendar OAuth2** for key date reminders
**Slack API credentials** for team notifications
A reachable webhook URL (for receiving parsed contract data)
⚡ Quick Setup
Configure Google Drive OAuth2 and create an upload folder and a folder for saving extracted images.
Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
Add VLM Run API credentials for document parsing.
Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row so each new contract is added as a new row. Turn on Map Each Column Manually and match each contract field (like Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
Configure Slack for notifications.
⚙️ How It Works
Monitor Contract Uploads – Watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
Download Contract File – Automatically downloads new contracts for AI analysis.
VLM Run ContractParser – Sends the file to the VLM Run Execute Agent, which extracts structured contract data, including: Contract ID, Title, Parties (with roles), Property address, Effective date, Termination date, Rent, deposit, payment terms, and governing law.
Receive Contract Data – The webhook endpoint receives the structured JSON response.
Format Contract Data – Normalizes fields, formats dates, and prepares the record for storage (an illustrative sketch of this step appears at the end of this description).
Save to Expense Database (Google Sheets) – Appends extracted data to a master Google Sheet for centralized contract tracking.
Notify via Slack – Posts a concise summary to a Slack channel, showing key contract details for visibility.
Create Calendar Events – Automatically schedules Google Calendar events for: Effective Date, Termination Date, and a Renewal Reminder (60 days before termination).
💡 Why Use This Workflow
Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
**Zero missed deadlines** - automatic Google Calendar reminders keep your team on track.
**Instant team visibility** - Slack notifications keep legal, finance, and operations aligned.
**End-to-end automation** - no need for manual parsing, data entry, or follow-ups.
🧠 Perfect For
Legal teams automating contract intake and tracking
Real estate or lease management workflows
Finance or procurement teams needing expiration alerts
Organizations centralizing contract metadata in Sheets
🛠️ How to Customize
**Modify Extraction Fields** - Edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
**Change Storage** - Swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
**Customize Notifications** - Send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
**Add Calendar Events** - Auto-create events for reviews or payment milestones using extra date fields.
**Add Approvals or Signatures** - Insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.
⚠️ Community Node Disclaimer
This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
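The sketch below illustrates the Format Contract Data step referenced above. The field names and values are assumptions for illustration only; match them to your VLM Run agent's actual output schema.

```python
from datetime import datetime, timedelta

# Illustrative payload shape; the real VLM Run response fields may differ.
payload = {
    "contract_id": "LEASE-2024-0042",
    "title": "Commercial Lease Agreement",
    "parties": [{"name": "Acme Corp", "role": "Tenant"},
                {"name": "Main St Holdings", "role": "Landlord"}],
    "effective_date": "2024-07-01",
    "termination_date": "2025-06-30",
}

def normalize(record: dict) -> dict:
    """Format dates and derive the renewal reminder (60 days before termination)."""
    effective = datetime.strptime(record["effective_date"], "%Y-%m-%d")
    termination = datetime.strptime(record["termination_date"], "%Y-%m-%d")
    return {
        **record,
        "effective_date": effective.strftime("%Y-%m-%d"),
        "termination_date": termination.strftime("%Y-%m-%d"),
        "renewal_reminder": (termination - timedelta(days=60)).strftime("%Y-%m-%d"),
    }

print(normalize(payload))
```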
by Priya Jain
This workflow provides an OAuth 2.0 token refresh process that gives developers more control and visibility than n8n's built-in OAuth flow. In this template, I've used the Pipedrive API, but you can apply it to any app that requires the authorization_code grant for token access. This resolves the issue of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.
What you need to replicate this
Your database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also use a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
Create a client app for the application that you want to access via the API.
After duplicating the template:
a. Add credentials to your database and connect the DB nodes in all 3 workflows.
b. Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database."
c. Open your client app and follow the Pipedrive instructions to authenticate.
d. Click Install and test. This will save your initial refresh token and access token to the database.
How it operates
Workflow 1. Captures the authorization_code, generates the access_token and refresh_token, and saves them to the database.
Workflow 2. Your primary workflow that fetches or posts data to/from your application. Note the logic that adds an if condition for when an error occurs because of an invalid token; this triggers the third workflow to refresh the token.
Workflow 3. Handles the token refresh. Remember to send the unique ID to the webhook so it can fetch the necessary tokens from your table.
Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
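Workflow 3's refresh step boils down to one HTTP call. A minimal Python sketch, assuming Pipedrive's standard OAuth token endpoint and Basic auth with your client credentials (verify both against your app's documentation):

```python
import requests

# Assumed values; replace with your client app credentials and the refresh token
# stored in your database table.
TOKEN_URL = "https://oauth.pipedrive.com/oauth/token"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
refresh_token = "stored-refresh-token-from-your-database"

# Exchange the refresh token for a new access token (the Workflow 3 logic).
resp = requests.post(
    TOKEN_URL,
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "refresh_token", "refresh_token": refresh_token},
    timeout=30,
)
resp.raise_for_status()
tokens = resp.json()

# Persist both tokens back to your table, keyed by your unique ID.
print(tokens["access_token"], tokens.get("refresh_token"))
```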
by Daniel Ng
Auto Backup n8n Workflows to Google Drive
Imagine the sinking feeling: hours, weeks, or even months of meticulous work building your n8n workflows, suddenly gone. A server crash, an accidental deletion, data corruption, or an unexpected platform issue – and all your automated processes vanish. Without a reliable backup system, you're facing a complete rebuild from scratch, a scenario that's not just frustrating but can be catastrophic for business operations.
Furthermore, consider the daunting task of migrating your n8n instance to a new host or server. Manually exporting each workflow, one by one, then painstakingly importing them into the new environment is not only incredibly time-consuming, especially if you have tens or hundreds of workflows, but also highly prone to errors and omissions. You need a systematic, automated solution.
This workflow provides a robust solution for automatically backing up all your n8n workflows to Google Drive on a schedule (defaulting to every hour). It creates a uniquely named folder for each backup instance, incorporating the date and hour, and then systematically uploads each workflow as an individual JSON file. To manage storage space, the workflow also includes a cleanup mechanism that deletes backup folders older than a user-defined retention period (defaulting to 7 days). Ideally, this backup workflow should be used in conjunction with a restore solution like our "Restore Workflows from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.
Feature highlights
Triggers on a schedule (defaults to hourly).
Creates an n8n_backup_YYYY-MM-DD_HH folder in Google Drive.
Fetches all n8n workflows.
Saves each workflow as a JSON file to the new folder.
Deletes backup folders older than the "Coverage Period" (defaults to 7 days).
Who is this for?
This template is designed for:
**n8n Administrators and Developers:** who need a reliable, automated system to safeguard their workflows against accidental loss, corruption, or system issues.
**Proactive n8n Users:** who want to maintain a version history of their workflows, enabling easy rollback to previous configurations if necessary.
**Organizations:** seeking to implement disaster recovery and data integrity practices for their n8n automation infrastructure.
What problem is this workflow solving? / use case
This workflow directly addresses these critical risks and challenges by:
**Automating Backups:** eliminates the manual effort and inconsistency of ad-hoc backups, ensuring your workflows are regularly and reliably saved.
**Preventing Data Loss:** safeguards your valuable automation assets against unforeseen disasters by creating secure, versioned copies in Google Drive.
**Facilitating Migration & Recovery:** provides the foundational backups needed for a smoother, more systematic migration or a full disaster recovery, allowing you to restore your operations efficiently.
**Version Control:** by storing scheduled backups (defaulting to hourly), it allows you to access and restore previous versions of your workflows, offering an undo capability for significant changes or corruptions.
**Storage Management:** automatically removes old backups based on a configurable retention period, preventing excessive use of Google Drive storage while keeping a relevant history.
What this workflow does
Scheduled Trigger: runs automatically every hour.
Timestamping: fetches the current date and hour to create a unique name for the backup folder.
Folder Creation: creates a new folder in a specified Google Drive location, named in the format n8n_backup_YYYY-MM-DD_HH.
Workflow Retrieval: connects to your n8n instance via its API and fetches a list of all existing workflows.
Individual Backup: processes each workflow one by one: converts the workflow data to a binary JSON file, uploads the JSON file (named after the workflow) to the hourly backup folder in Google Drive, and includes a short wait step between uploads to respect potential API rate limits.
Old Backup Deletion: calculates a cut-off date based on the "Coverage Period" set in the "Settings" node (e.g., 7 days prior to the current date), searches Google Drive for backup folders (matching the naming convention) that are older than this cut-off date, and deletes them to free up storage space. A sketch of the naming and retention logic follows this description.
Step-by-step setup
Import Template: upload the provided JSON file into your n8n instance.
Configure Credentials: for the Google Drive nodes, create or select existing Google Drive OAuth2 API credentials; for the n8n node (the node that fetches workflows), configure n8n API credentials to allow the workflow to access your instance's workflow data.
Specify Google Drive Backup Location: open the "Google Drive Backup Folder Every Hour" node. Under the "Drive ID" parameter, select your drive from the list or provide its ID. Under the "Folder ID" parameter, select or input the ID of the parent folder in Google Drive where you want the n8n_backup_YYYY-MM-DD_HH folders to be created (e.g., a general "n8n_Backups" folder).
Set Backup Retention Period: open the "Settings" node and modify the value for "Coverage Period" (default is 7). This is the number of days backups are kept before being deleted.
Activate Workflow: toggle the "Active" switch for the workflow in your n8n dashboard.
How to customize this workflow to your needs
**Backup Frequency:** adjust the "Rule" in the **Schedule Trigger** node to change the backup interval (e.g., daily, specific times).
**Folder/File Naming:** modify the expressions in the "Parameters" tab of the **Google Drive Backup Folder Every Hour** node (for the folder name) or the **Google Drive Upload Workflows** node (for the file name) if you require a different naming convention.
**Targeted Backups:** to back up only specific workflows, insert a "Filter" node after the **n8n** node to filter workflows based on criteria like name, tags, or ID before they reach the "Move Binary Data" node.
**Wait Time:** the **Wait** node is set to 3 seconds between uploads. If you have a very large number of workflows or encounter rate limiting, you might adjust this duration.
**Error Workflow:** the workflow is pre-configured with an "Error Workflow" setting. Ensure this error workflow exists in your n8n instance, or update the setting to point to your preferred error handling workflow. This can be used to send notifications on failure.
Important Considerations
**Resource Usage:** while the workflow includes a wait step between individual workflow uploads to minimize load, backing up an extremely large number of workflows could still consume resources on your n8n instance and make many API calls to Google Drive. Monitor performance if you have thousands of workflows.
**Testing Restore Process:** regularly test restoring a few workflows from your Google Drive backups using the companion "Restore All n8n Workflows from Google Drive" template or a manual import. This verifies the integrity of your backups and ensures you can recover when needed.
**Workflow Modifications:** if you modify this backup workflow (e.g., change the folder naming convention), ensure your restore process or workflow is also updated to match these changes.
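A minimal sketch of just the folder-naming and retention logic described above (the Google Drive API calls are omitted; the default mirrors the "Settings" node):

```python
from datetime import datetime, timedelta

COVERAGE_PERIOD_DAYS = 7  # mirrors the "Coverage Period" default in the Settings node

now = datetime.now()
folder_name = now.strftime("n8n_backup_%Y-%m-%d_%H")   # e.g. n8n_backup_2025-01-31_14
cutoff = now - timedelta(days=COVERAGE_PERIOD_DAYS)

def is_expired(backup_folder_name: str) -> bool:
    """True if a folder named n8n_backup_YYYY-MM-DD_HH is older than the retention cutoff."""
    created = datetime.strptime(backup_folder_name, "n8n_backup_%Y-%m-%d_%H")
    return created < cutoff

print(folder_name, is_expired("n8n_backup_2024-01-01_00"))
```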
by Lorena
This workflow ensures gender inclusive language in Mattermost channels. If someone addresses the group with “guys” or “gals”, a bot promptly replies with: "May I suggest “folks” or “y'all”? We use gender inclusive language here. 😄".
**Webhook node**: triggers the workflow when a new message is posted in Mattermost.
**IF node**: verifies if the message includes the words "guys" or "gals". If false, it does not take any action. If true, it triggers the Mattermost node.
**Mattermost node**: posts the language warning message in the Mattermost channel.
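The IF node only needs a simple text check, but matching whole words avoids false positives on words that merely contain "guys" or "gals". A small Python sketch of that condition (the n8n IF node itself would use a comparable "contains" or regex rule):

```python
import re

# Word-boundary match so substrings inside other words don't trigger the bot.
PATTERN = re.compile(r"\b(guys|gals)\b", re.IGNORECASE)

def needs_reminder(message: str) -> bool:
    return bool(PATTERN.search(message))

print(needs_reminder("Hey guys, standup in 5"))  # True
print(needs_reminder("The GUI looks great"))     # False
```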
by Shrey
This workflow can be used to save all of your workflows in:
a raw state (as a JSON file in Dropbox)
an Airtable base, in a pre-designed format
It runs periodically (currently, every 30 minutes) and either updates an existing record or creates a new record in Airtable for each workflow. Here's the Airtable base to give you an idea: View Airtable base
Note: This workflow uses the "http://localhost:5678/rest" API, which the UI editor uses but which is still not officially supported. It may therefore suffer breaking changes at some point in the future, and the workflow might stop working.
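For reference, fetching the workflow list from that internal endpoint might look like the sketch below. Both the /workflows path and the "data" wrapper in the response are assumptions about the unofficial editor API, and n8n versions with user management enabled will require authentication (or the supported /api/v1 public API) instead:

```python
import requests

# Assumption: the internal editor API exposes GET /rest/workflows and wraps results
# in a "data" key; this is unofficial and may change between n8n versions.
BASE_URL = "http://localhost:5678/rest"

resp = requests.get(f"{BASE_URL}/workflows", timeout=30)
resp.raise_for_status()
body = resp.json()
workflows = body.get("data", body)

for wf in workflows:
    # Each workflow's JSON could be saved to Dropbox and upserted into Airtable.
    print(wf.get("id"), wf.get("name"))
```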
by Harshil Agrawal
This workflow sends NASA's Astronomy Picture of the Day to a channel on Telegram every day using the NASA node.
Cron node: triggers the workflow daily at 8 PM. You can update the time in the Cron node to trigger the workflow at your desired time.
NASA node: after the Cron node triggers the workflow, the NASA node fetches the Astronomy Picture of the Day from the NASA API. You can also get the binary file of the image; toggle Download Image to true to get the file.
Telegram node: sends the image to a Telegram channel. If you want to share the image on another platform, you can replace the Telegram node with the node of that platform. For example, if you want to post the image on a channel on Slack, replace the Telegram node with the Slack node.
You can learn to build this workflow on the documentation page of the NASA node.
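Outside n8n, the same APOD request is a single call to NASA's public API. A quick sketch (DEMO_KEY works for light testing; request a free key at api.nasa.gov for regular use):

```python
import requests

resp = requests.get(
    "https://api.nasa.gov/planetary/apod",
    params={"api_key": "DEMO_KEY"},
    timeout=30,
)
resp.raise_for_status()
apod = resp.json()

# The NASA node returns similar fields; "url" (or "hdurl") points to the image
# that the Telegram node sends to the channel.
print(apod["title"], apod["url"])
```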
by Harshil Agrawal
This workflow allows you to check whether a preview is available for a link and return the preview if it exists.
Peekalink node: checks if a preview is available for a URL. If a preview is available, the node returns true; otherwise, false.
IF node: checks the output from the previous node. If the condition is true, the node connected to the true branch is executed. If the condition is false, the node connected to the false branch is executed.
Peekalink1 node: fetches the preview of the URL. Based on your use-case, you can connect the Slack node, Mattermost node, etc. to get the response on these platforms.
NoOp node: adding this node here is optional, as the absence of this node won't make a difference to the functioning of the workflow. We've added it as it can sometimes help others understand the workflow better, visually.
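The availability check and the preview fetch map onto two Peekalink API calls. A hedged Python sketch; the endpoint paths, request body, and header name below are assumptions recalled from the Peekalink docs, so verify them against the current API reference:

```python
import requests

API_KEY = "your-peekalink-api-key"  # placeholder
HEADERS = {"X-API-Key": API_KEY}
link = "https://n8n.io"

# Availability check (mirrors the first Peekalink node feeding the IF node).
check = requests.post("https://api.peekalink.io/is-available/",
                      json={"link": link}, headers=HEADERS, timeout=30)
check.raise_for_status()

if check.json().get("isAvailable"):
    # Fetch the actual preview (mirrors the Peekalink1 node on the true branch).
    preview = requests.post("https://api.peekalink.io/",
                            json={"link": link}, headers=HEADERS, timeout=30)
    preview.raise_for_status()
    print(preview.json().get("title"))
```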
by Obsidi8n
How it Works
This n8n template makes it possible to send emails directly from your Obsidian notes. It leverages the Obsidian Post Webhook plugin, allowing seamless integration between your notes and the email workflow.
What it does:
Receives note content and metadata from Obsidian via a Webhook.
Parses YAML frontmatter to define email recipients, subject, and more.
Automatically processes attachments, encoding them into an email-friendly format.
Sends emails via Gmail and confirms the status back to Obsidian.
Includes a testing feature to verify everything works before going live.
Set-up Steps
Webhook Configuration: set your n8n POST Webhook URL in the Obsidian Post Webhook plugin settings.
Email Integration: add your Gmail credentials to the n8n email nodes.
Test the Workflow: run a test from Obsidian to ensure the template functions correctly.
Activate and Enjoy: start sending customized emails with attachments from your notes in no time!
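The frontmatter-parsing step can be illustrated outside n8n. The keys below ("to", "subject") are hypothetical; use whatever keys your Obsidian Post Webhook setup actually sends:

```python
import yaml  # PyYAML

# A note with a YAML frontmatter block, roughly as the plugin would post it (illustrative).
note = """---
to: someone@example.com
subject: Weekly notes
---
Hello from Obsidian!
"""

def split_frontmatter(text: str):
    """Return (metadata dict, body) for a note that starts with a YAML frontmatter block."""
    if text.startswith("---"):
        _, fm, body = text.split("---", 2)
        return yaml.safe_load(fm) or {}, body.strip()
    return {}, text

meta, body = split_frontmatter(note)
print(meta["to"], meta["subject"], body)
```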
by tanaypant
This workflow automatically queries a Postgres database to find outlier readings for which SMS notifications have not been sent. This is Workflow 2 in the blog tutorial Database activity monitoring and alerting.
Prerequisites
A Postgres database set up and credentials
A Twilio account and credentials
Nodes
Cron node: triggers the workflow every minute, so the database is queried at regular intervals.
Postgres nodes: extract values from, and update values in, the database.
Twilio node: sends an alert SMS about the outlier reading to a specified phone number.
Set node: sets the notification value to true.
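The Postgres nodes' query-then-update pattern can be sketched in plain Python with psycopg2. The connection string, table, and column names below are assumptions for illustration; the tutorial's own schema may differ:

```python
import psycopg2

# Illustrative connection and schema; adapt to the tutorial's database.
conn = psycopg2.connect("dbname=sensors user=n8n password=secret host=localhost")

LOW, HIGH = 10, 40  # acceptable reading range (illustrative bounds)

with conn, conn.cursor() as cur:
    # Outlier readings that have not yet triggered an SMS notification.
    cur.execute(
        """
        SELECT id, sensor_id, value
        FROM readings
        WHERE (value < %s OR value > %s) AND notification = false
        """,
        (LOW, HIGH),
    )
    for reading_id, sensor_id, value in cur.fetchall():
        # Here the Twilio node would send the alert SMS for this reading.
        cur.execute("UPDATE readings SET notification = true WHERE id = %s", (reading_id,))
```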
by Jenny
Vector Database as a Big Data Analysis Tool for AI Agents
Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images, and hence to many production use cases.
Uploading (image) datasets to Qdrant
Set up meta-variables for anomaly detection in Qdrant
Anomaly detection tool
KNN classifier tool
For anomaly detection
The first pipeline uploads an image dataset to Qdrant. The second pipeline sets up cluster (class) centres and cluster (class) threshold scores needed for anomaly detection. The third is the anomaly detection tool, which takes any image as input and uses all the preparatory work done with Qdrant to detect whether it's an anomaly with respect to the uploaded dataset.
For KNN (k nearest neighbours) classification
The first pipeline uploads an image dataset to Qdrant. This pipeline is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.
To recreate both
You'll have to upload the crops and lands datasets from Kaggle to your own Google Storage bucket, and re-create the APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API, and Google Cloud Storage.
[This workflow] KNN classification tool
This tool takes any image URL as input and returns the class of the object in the image, based on the dataset uploaded to Qdrant (lands). An image URL is received via the Execute Workflow Trigger and sent to the Voyage AI Multimodal Embeddings API to fetch its embedding. The image's embedding vector is then used to query Qdrant, returning a set of X similar images with pre-labeled classes. Majority voting is done over the classes of the neighbouring images. A loop resolves ties in the majority vote by increasing the number of neighbours to retrieve. When the loop finally resolves, the identified class is returned to the calling workflow.
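The majority-voting and tie-breaking loop can be expressed compactly. A sketch with illustrative labels only; the Voyage AI embedding call and the Qdrant query are omitted:

```python
from collections import Counter

def classify(labels_by_rank: list[str], k: int = 5) -> str:
    """Majority vote over the k nearest labels; on a tie, widen k, mirroring the loop."""
    k = min(k, len(labels_by_rank))
    while True:
        counts = Counter(labels_by_rank[:k]).most_common()
        if len(counts) == 1 or counts[0][1] > counts[1][1] or k >= len(labels_by_rank):
            return counts[0][0]      # clear winner, or no more neighbours left to add
        k += 2                       # tie: retrieve more neighbours and re-vote

# Labels of Qdrant hits sorted by similarity (illustrative values).
print(classify(["rice", "maize", "rice", "maize", "wheat", "rice", "rice"]))
```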
by Jenny
Vector Database as a Big Data Analysis Tool for AI Agents
Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images, and hence to many production use cases.
Uploading (image) datasets to Qdrant
Set up meta-variables for anomaly detection in Qdrant
Anomaly detection tool
KNN classifier tool
For anomaly detection
The first pipeline uploads an image dataset to Qdrant. The second pipeline sets up cluster (class) centres and cluster (class) threshold scores needed for anomaly detection. The third pipeline is the anomaly detection tool, which takes any image as input and uses all the preparatory work done with Qdrant to detect whether it's an anomaly with respect to the uploaded dataset.
For KNN (k nearest neighbours) classification
The first pipeline uploads an image dataset to Qdrant. The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.
To recreate both
You'll have to upload the crops and lands datasets from Kaggle to your own Google Storage bucket, and re-create the APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API, and Google Cloud Storage.
[This workflow] Anomaly Detection Tool
This is the tool that can be used directly to detect anomalous images (crops). It takes any image URL as input and returns a text message indicating whether what the image depicts is anomalous with respect to the crop dataset stored in Qdrant. The image URL is received via the Execute Workflow Trigger and used to generate an embedding vector with the Voyage AI Embeddings API. The returned vector is used to query the Qdrant collection to determine whether the given crop is known, by comparing its similarity to the threshold scores of each image class (crop type). If the image scores lower than all thresholds, the image is considered an anomaly for the dataset.
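The final threshold comparison is a simple check. A sketch with illustrative scores and thresholds; the embedding generation and Qdrant queries are omitted:

```python
# Per-class similarity thresholds prepared by the meta-variables pipeline (illustrative values).
class_thresholds = {"rice": 0.82, "maize": 0.79, "wheat": 0.85}

# Best similarity of the query image against each class centre in Qdrant (illustrative values).
query_scores = {"rice": 0.64, "maize": 0.58, "wheat": 0.61}

def is_anomaly(scores: dict, thresholds: dict) -> bool:
    """Anomalous if the image scores below the threshold of every known class."""
    return all(scores[c] < thresholds[c] for c in thresholds)

print("anomaly" if is_anomaly(query_scores, class_thresholds) else "known crop")
```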
by Léo Mathurin
✨ Try It Out!
Sync your Linear issues to Todoist automatically with this n8n workflow. When an issue is created, updated, or completed in Linear, a corresponding task is created, updated, or closed in Todoist.
⚙️ How It Works
Triggered by issue changes via linearTrigger
Routes based on the action (create, update, remove)
Checks if a matching Todoist task already exists (via the issue ID)
If the issue has a due date and is assigned to you (youremail@example.com), the task is created or updated accordingly
If the issue is marked Done, the Todoist task is closed
If it's deleted in Linear, the Todoist task is also removed
Sub-issues get enriched with their parent title for clarity
🛠️ Customization
Replace youremail@example.com with your Linear email in the IF nodes
Adjust which states are synced (e.g. “In Progress”, “Todo”...)
Customize the Todoist project, labels, or title formatting
⚠️ Bi-directional Sync?
This workflow is one-way (Linear ➜ Todoist). Bi-directional syncing might be possible but isn't handled here; it would be a cool upgrade!
✅ Requirements
n8n with OAuth2 access to Linear and Todoist
Your Linear email set in the workflow for filtering
A target Todoist project (default: Inbox)
💬 Need Help?
Ask in the n8n Forum or join the Discord. Happy Automating! 🚀
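The routing described above can be mirrored against Todoist's public REST v2 API. In this sketch, the Linear field names (state, dueDate, assigneeEmail) and the way the issue ID is kept in the task description are simplified assumptions about what the workflow maps; the Todoist endpoints are the documented REST v2 ones:

```python
import requests

TODOIST_TOKEN = "your-todoist-api-token"  # placeholder
HEADERS = {"Authorization": f"Bearer {TODOIST_TOKEN}"}
BASE = "https://api.todoist.com/rest/v2"
MY_EMAIL = "youremail@example.com"

def route(action: str, issue: dict, existing_task_id: str | None) -> None:
    """Create, update, close, or remove the Todoist task that mirrors a Linear issue."""
    if action == "remove" and existing_task_id:
        requests.delete(f"{BASE}/tasks/{existing_task_id}", headers=HEADERS, timeout=30)
    elif action in ("create", "update"):
        if issue.get("state") == "Done" and existing_task_id:
            requests.post(f"{BASE}/tasks/{existing_task_id}/close", headers=HEADERS, timeout=30)
        elif issue.get("dueDate") and issue.get("assigneeEmail") == MY_EMAIL:
            payload = {
                "content": issue["title"],
                "due_date": issue["dueDate"],
                "description": f"linear:{issue['id']}",  # keeps the issue ID for later matching
            }
            url = f"{BASE}/tasks/{existing_task_id}" if existing_task_id else f"{BASE}/tasks"
            requests.post(url, json=payload, headers=HEADERS, timeout=30)
```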