by Harshil Agrawal
This workflow allows you to send position updates of the ISS every minute to a topic in MQTT using the MQTT node.

- Cron node: triggers the workflow every minute.
- HTTP Request node: makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
- Set node: ensures that only the data we set in this node gets passed on to the next nodes in the workflow.
- MQTT node: publishes the data from the previous node to the iss-position topic. If you created a topic with a different name, use that topic instead.
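For reference, here is a minimal Code-node sketch of the payload shaping the Set node performs, assuming the wheretheiss.at response includes name, latitude, longitude, and timestamp (Unix epoch, seconds); it is an illustration, not the template itself:

```javascript
// Keep only the fields worth publishing to the MQTT topic.
const position = $input.first().json;

return [{
  json: {
    name: position.name,
    latitude: position.latitude,
    longitude: position.longitude,
    timestamp: position.timestamp, // Unix epoch in seconds
  },
}];
```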
by Jitesh Dugar
Newsletter Sign-up with Email Verification & Welcome Email Automation

Description
A complete, production-ready newsletter automation workflow that validates email addresses, sends personalized welcome emails, and maintains comprehensive logs in Google Sheets. Perfect for marketing teams, content creators, and businesses looking to build high-quality email lists with minimal manual effort.

Key Features

Email Verification
- **Real-time validation** using the Verifi Email API
- Checks email format (RFC compliance)
- Verifies domain existence and MX records
- Detects disposable/temporary email addresses
- Identifies potentially spoofed emails

Automated Welcome Emails
- **Personalized HTML emails** with the subscriber's first name
- Beautiful, mobile-responsive design with gradient headers
- Branded confirmation and unsubscribe links
- Sent via Gmail (or SMTP) automatically to valid subscribers

Smart Data Handling
- **Comprehensive logging** to Google Sheets with three separate tabs
- Handles incomplete submissions gracefully
- Preserves original user data throughout the verification process
- Tracks source attribution for multi-channel campaigns

Error Management
- Automatic retry logic on API failures
- Separate logging for different error types
- Detailed technical reasons for invalid emails
- No data loss, thanks to direct webhook referencing

Use Cases
- **Newsletter sign-ups** on websites and landing pages
- **Lead generation** forms with quality control
- **Marketing campaigns** requiring verified email lists
- **Community building** with automated onboarding
- **SaaS product launches** with email collection
- **Content creator** audience building
- **E-commerce** customer list management

What Gets Logged

Master Log (All Subscribers)
- Timestamp, name, email, verification result
- Verification score and email-sent status
- Source tracking, disposable status, domain info

Invalid Emails Log
- Detailed rejection reasons
- Technical diagnostic information
- MX record status, RFC compliance
- Provider information for troubleshooting

Invalid Submissions Log
- Incomplete form data
- Missing required fields
- Timestamp for follow-up

Technical Stack
- Trigger: Webhook (POST endpoint)
- Email Verification: Verifi Email API
- Email Sending: Gmail OAuth2 (or SMTP)
- Data Storage: Google Sheets (3 tabs)
- Processing: JavaScript Code nodes for data formatting

Setup Requirements
- Google Account - for Sheets and Gmail integration
- Verifi Email API key (https://verifi.email)
- Google Sheets - pre-configured with 3 tabs (template provided)
- 5-10 minutes - quick setup with step-by-step instructions included

Benefits
- Improve Email Deliverability - remove invalid emails before sending campaigns
- Reduce Bounce Rates - only send to verified, active email addresses
- Save Money - don't waste email credits on invalid addresses
- Better Analytics - track conversion rates by source
- Professional Onboarding - personalized welcome experience
- Scalable Solution - handles high-volume sign-ups automatically
- Data Quality - build a clean, high-quality subscriber list

Customization Options
- **Email Template** - fully customizable HTML design
- **Verification Threshold** - adjust score requirements
- **Brand Colors** - match your company branding
- **Confirmation Flow** - add double opt-in if desired
- **Multiple Sources** - track different signup forms
- **Language** - easily translate email content

What's Included
- Complete n8n workflow JSON (ready to import)
- Google Sheets template structure
- Responsive HTML email template
- Setup documentation with screenshots
- Troubleshooting guide
- Customization examples

Privacy & Compliance
- GDPR-compliant with unsubscribe links
- Secure data handling via OAuth2
- No data shared with third parties
- Audit trail in Google Sheets
- Easy data deletion/export

Quick Stats
- **12 Nodes** - fully automated workflow
- **3 Data Paths** - valid, invalid, and incomplete submissions
- **100% Uptime** - when properly configured
- **Instant Processing** - real-time email verification
- **Unlimited Scale** - based on your API limits

Perfect For
- Marketing Agencies
- SaaS Companies
- Content Creators
- E-commerce Stores
- Community Platforms
- Educational Institutions
- Membership Sites
- Newsletter Publishers

Why Use This Workflow?
Instead of manually verifying emails or dealing with bounce complaints, this workflow automates the entire process from sign-up to welcome email. Save hours of manual work, improve your email deliverability, and create a professional first impression with every new subscriber. Start building a high-quality email list today!
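To illustrate the "JavaScript Code nodes for data formatting" step, here is a hypothetical sketch that flattens the original webhook submission and the verification response into one Sheets row; the node name "Webhook" and the verification field names (result, score, disposable) are assumptions about this setup, so adjust them to your actual nodes and API output:

```javascript
// Merge the untouched webhook payload with the verification result
// so the Google Sheets row preserves the original user data.
const submission = $('Webhook').first().json.body ?? {};
const verification = $input.first().json;

return [{
  json: {
    timestamp: new Date().toISOString(),
    name: submission.name ?? '',
    email: submission.email ?? '',
    source: submission.source ?? 'unknown', // source attribution for campaigns
    result: verification.result,
    score: verification.score,
    disposable: verification.disposable,
  },
}];
```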
by Brian
This template automates posting to Instagram Business accounts and Facebook Pages using the Meta Graph API. It supports both short-lived and long-lived tokens, with a secure approach using System User tokens for reliable, ongoing automation. Includes detailed guidance for authentication, token refresh logic, and API use.

Features:
- Publish to Instagram via /media + /media_publish
- Post to Facebook Pages via /photos
- Long-lived token support via a Meta Business System User
- Token refresh support using staticData in n8n
- In-line sticky note instructions

Use Cases:
- Schedule and publish branded social media content
- Automate marketing flows with CRM + social sync
- Empower internal teams or clients to post without manual steps

Tags: Instagram, Facebook, Meta Graph API, Social Media, Token Refresh, Long-Lived Token, Marketing Automation, System User
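As a rough illustration of the staticData approach, here is a minimal Code-node sketch that caches the long-lived token between executions and flags when a refresh is due; the stored keys (accessToken, expiresAt) and the refresh margin are assumptions, not the template's exact implementation:

```javascript
// staticData persists between executions of an active workflow,
// so it can hold the long-lived token and its expiry.
const staticData = $getWorkflowStaticData('global');

const now = Date.now();
const REFRESH_MARGIN_MS = 7 * 24 * 60 * 60 * 1000; // refresh ~7 days before expiry

// expiresAt would be written by the refresh branch after each token exchange.
const needsRefresh =
  !staticData.accessToken ||
  !staticData.expiresAt ||
  staticData.expiresAt - now < REFRESH_MARGIN_MS;

return [{
  json: {
    accessToken: staticData.accessToken ?? null,
    needsRefresh, // route to the token-refresh branch when true
  },
}];
```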
by Shahrear
AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

What This Workflow Does
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores, alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

Requirements
- **Google Drive OAuth2** for monitoring and downloads
- **VLM Run API credentials** with Execute Agent access
- **Google Sheets OAuth2** for structured record storage
- **Google Calendar OAuth2** for key date reminders
- **Slack API credentials** for team notifications
- A reachable webhook URL (for receiving parsed contract data)

Quick Setup
1. Configure Google Drive OAuth2 and create an upload folder and a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.
4. Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row; this adds new contract details as new rows. Turn on Map Each Column Manually and match each contract field (such as Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
5. Configure Slack for notifications.

How It Works
1. Monitor Contract Uploads - watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
2. Download Contract File - automatically downloads new contracts for AI analysis.
3. VLM Run ContractParser - sends the file to the VLM Run Execute Agent, which extracts structured contract data, including: contract ID, title, parties (with roles), property address, effective date, termination date, rent, deposit, payment terms, and governing law.
4. Receive Contract Data - the webhook endpoint receives the structured JSON response.
5. Format Contract Data - normalizes fields, formats dates, and prepares the record for storage.
6. Save to Expense Database (Google Sheets) - appends extracted data to a master Google Sheet for centralized contract tracking.
7. Notify via Slack - posts a concise summary to a Slack channel, showing key contract details for visibility.
8. Create Calendar Events - automatically schedules Google Calendar events for the effective date, the termination date, and a renewal reminder (60 days before termination).

Why Use This Workflow
Manual contract management is error-prone and time-consuming; key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
- **Zero missed deadlines** - automatic Google Calendar reminders keep your team on track.
- **Instant team visibility** - Slack notifications keep legal, finance, and operations aligned.
- **End-to-end automation** - no need for manual parsing, data entry, or follow-ups.

Perfect For
- Legal teams automating contract intake and tracking
- Real estate or lease management workflows
- Finance or procurement teams needing expiration alerts
- Organizations centralizing contract metadata in Sheets

How to Customize
- Modify Extraction Fields: edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
- Change Storage: swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
- Customize Notifications: send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
- Add Calendar Events: auto-create events for reviews or payment milestones using extra date fields.
- Add Approvals or Signatures: insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

Community Node Disclaimer
This workflow uses community nodes (VLM Run) that may require additional permissions and custom setup.
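For orientation, here is a hypothetical sketch of what the "Format Contract Data" step could look like in a Code node; the input field names mirror the extraction list above but are assumptions about the parser's JSON, so adapt them to your actual response:

```javascript
// Normalize the parsed contract JSON before the Sheets/Calendar/Slack steps.
const contract = $input.first().json;

const toIsoDate = (value) => {
  const d = new Date(value);
  return isNaN(d.getTime()) ? null : d.toISOString().slice(0, 10);
};

const terminationDate = toIsoDate(contract.termination_date);

// Renewal reminder 60 days before termination, matching the description above.
let renewalReminder = null;
if (terminationDate) {
  const d = new Date(terminationDate);
  d.setDate(d.getDate() - 60);
  renewalReminder = d.toISOString().slice(0, 10);
}

return [{
  json: {
    contractId: contract.contract_id,
    title: contract.title,
    parties: (contract.parties ?? []).map((p) => `${p.name} (${p.role})`).join(', '),
    effectiveDate: toIsoDate(contract.effective_date),
    terminationDate,
    renewalReminder,
  },
}];
```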
by Priya Jain
This workflow provides an OAuth 2.0 token refresh process for better control. Developers can use it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. This template uses the Pipedrive API, but it works with any app that issues tokens via the authorization_code grant. It resolves the problem of manually refreshing an expired OAuth 2.0 token, or of n8n's native OAuth flow not working.

What you need to replicate this
- A database with a pre-existing table for storing authentication tokens and associated information. This example uses Supabase, but a self-hosted MySQL works too. Here's a quick video on setting up the Supabase table.
- A client app for the application you want to access via the API.

After duplicating the template:
1. Add credentials to your database and connect the DB nodes in all 3 workflows.
2. Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database."
3. Open your client app and follow the Pipedrive instructions to authenticate.
4. Click on Install and test. This saves your initial refresh token and access token to the database.

Please watch the YouTube video for a detailed demonstration of the workflow.

How it operates
- Workflow 1: captures the authorization_code, generates the access_token and refresh_token, and saves the tokens to the database.
- Workflow 2: your primary workflow that fetches or posts data to/from your application. Note the If condition that catches an invalid-token error and triggers the third workflow to refresh the token (see the sketch below).
- Workflow 3: handles the token refresh. Remember to send the unique ID to the webhook so it can fetch the right tokens from your table.

Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
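As a sketch of the refresh call Workflow 3 makes, the snippet below shows the values you would map into an HTTP Request node; the endpoint and Basic-auth scheme follow Pipedrive's OAuth documentation, and other providers differ, so treat this as an example rather than a drop-in:

```javascript
// Refresh-token exchange, Pipedrive-style. The refresh token is
// assumed to have been loaded from your database row upstream.
const refreshToken = $input.first().json.refresh_token;

return [{
  json: {
    method: 'POST',
    url: 'https://oauth.pipedrive.com/oauth/token',
    // Authenticate with HTTP Basic auth: client_id as user,
    // client_secret as password (the HTTP Request node supports this).
    contentType: 'application/x-www-form-urlencoded',
    body: `grant_type=refresh_token&refresh_token=${encodeURIComponent(refreshToken)}`,
  },
}];
```

On success, store the returned access_token and refresh_token back in your table, since many providers rotate the refresh token on every exchange.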
by Łukasz
What is This?
This automation simulates the Scrum Master role in daily meetings - essentially an AI Scrum Master drawing on several data sources. It is an intelligent support system for Scrum Masters that leverages data from Asana, Slack, and direct developer responses for comprehensive sprint status analysis and identification of areas requiring intervention. As such, it is useful for Scrum Masters, of course, but also for the Scrum Team, the Product Owner, and possibly the Business Owner.

Who is it For?
This automation is designed for Agile teams to support the Scrum Master role by collecting and analyzing data from various sources to identify potential impediments and support the team in sprint delivery.

How Does It Work?
The workflow has four main data entry points, launched either on-click or on workdays.
1. Collecting project section information from Asana. The automation retrieves the project structure, available sections, and their organization, allowing the AI to understand the team's work context.
2. Getting recently modified tasks in the Asana project. The system tracks changes in tasks, their status, assignments, and updates to detect potential delays or issues.
3. Obtaining communication from the team's Slack channel. The flow collects data about recent conversations, discussion threads, and team communication to identify warning signals or areas requiring attention.
4. Directly collecting responses from developers about the current sprint - their progress, impediments, concerns, and support needs.

All collected data is passed to an AI model that analyzes it within the Scrum methodology context and identifies:
- Potential impediments in sprint delivery
- Areas requiring Scrum Master intervention
- Recommendations for team support
- Warning signals regarding Sprint Goal achievement

The output is pushed to a Slack channel, so a later iteration of the same flow can potentially pick it up again via the Slack channel history (see the prompt-assembly sketch after this section).

Requirements
- Asana OAuth credentials
- OpenAI (or an alternative AI) for processing data
- A Slack app with the proper permissions: channels:history, chat:write, groups:history, im:history, mpim:history, users.profile:write, users:write

Configuration
- Set up the "Asana Project and Slack Channel" node. Provide the Asana project ID and Slack channel ID (optional).
- Set up the "Get Scrum Master Answers" node. It defines the daily questions/answers that are sent to the channel.

Alternative use
You can remove the whole "Ask Users Daily ScrumMaster Questions" part if you don't want to run it similarly to daily Scrum standups. In that case the flow essentially becomes a static analyzer of project status based on Slack and Asana.

Extensions and Customizations
There are many ways to extend this automation depending on team needs. For example, you can integrate additional project management tools, implement different notification schemes based on detected issue criticality, or adjust the data collection frequency to match the team's work rhythm.

Disclaimers and Notes
The whole automation rests on one important assumption: the project runs on a single Slack channel and a single Asana board. Of course this can be extended, but that is beyond the currently designed scope. Adding new sources for the AI to analyze should be fairly easy - just add another branch of data and push it into the AI prompt. This automation is a proof of concept and should not replace an actual Scrum Master.
The Scrum Master role extends far beyond data collection and analysis - it requires a deep understanding of team dynamics, business context, and interpersonal skills. As Scrum.org emphasizes, the Scrum Master doesn't need to be present during the Daily Scrum; their role is to ensure the meeting happens, while the developers are responsible for conducting it. Mindlessly executing daily questions without proper context analysis can lead to situations where the Scrum Master becomes a team manager instead of a facilitator of self-organization. A real Scrum Master analyzes much more than what this automation collects - they observe team dynamics, understand business context, identify the deeper root causes of problems, and support the team in developing self-organization skills. AI can be a valuable support tool, but it cannot replace the human intuition, empathy, and experience essential in this role. The automation should be treated as a tool supporting the team's work, providing additional insights and helping identify areas requiring attention, but always under the supervision and interpretation of an experienced Scrum practitioner.
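To make the "add another branch of data and push it into the AI prompt" idea concrete, here is a hypothetical Code-node sketch that merges the four data branches into one prompt; the node names are placeholders, so point them at your own Asana, Slack, and answers nodes:

```javascript
// Collect the output of each upstream branch by node name.
const sections = $('Get Asana Sections').all().map((i) => i.json);
const tasks = $('Get Recent Asana Tasks').all().map((i) => i.json);
const messages = $('Get Slack History').all().map((i) => i.json);
const answers = $('Get Scrum Master Answers').all().map((i) => i.json);

// Assemble a single prompt for the AI node.
const prompt = [
  'You are an AI Scrum Master. Analyze the sprint data below.',
  `Project sections:\n${JSON.stringify(sections, null, 2)}`,
  `Recently modified tasks:\n${JSON.stringify(tasks, null, 2)}`,
  `Slack channel history:\n${JSON.stringify(messages, null, 2)}`,
  `Developer answers:\n${JSON.stringify(answers, null, 2)}`,
  'Identify impediments, areas needing intervention, recommendations, and risks to the Sprint Goal.',
].join('\n\n');

return [{ json: { prompt } }];
```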
by Daniel Ng
Auto Backup n8n Workflows to Google Drive

Imagine the sinking feeling: hours, weeks, or even months of meticulous work building your n8n workflows, suddenly gone. A server crash, an accidental deletion, data corruption, or an unexpected platform issue - and all your automated processes vanish. Without a reliable backup system, you're facing a complete rebuild from scratch, a scenario that's not just frustrating but can be catastrophic for business operations.

Furthermore, consider the daunting task of migrating your n8n instance to a new host or server. Manually exporting each workflow, one by one, then painstakingly importing them into the new environment is not only incredibly time-consuming, especially if you have tens or hundreds of workflows, but also highly prone to errors and omissions. You need a systematic, automated solution.

This workflow provides a robust solution for automatically backing up all your n8n workflows to Google Drive on a schedule (default: every hour). It creates a uniquely named folder for each backup instance, incorporating the date and hour, and then systematically uploads each workflow as an individual JSON file. To manage storage space, the workflow also includes a cleanup mechanism that deletes backup folders older than a user-defined retention period (default: 7 days).

Ideally, this backup workflow should be used in conjunction with a restore solution like our "Restore Workflows from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

Feature highlights
- Triggers on a schedule (default: hourly).
- Creates an n8n_backup_YYYY-MM-DD_HH folder in Google Drive.
- Fetches all n8n workflows.
- Saves each workflow as a JSON file to the new folder.
- Deletes backup folders older than the "Coverage Period" (default: 7 days).

Who is this for?
This template is designed for:
- **n8n Administrators and Developers:** who need a reliable, automated system to safeguard their workflows against accidental loss, corruption, or system issues.
- **Proactive n8n Users:** who want to maintain a version history of their workflows, enabling easy rollback to previous configurations if necessary.
- **Organizations:** seeking to implement disaster recovery and data integrity practices for their n8n automation infrastructure.

What problem is this workflow solving? / use case
This workflow directly addresses these critical risks and challenges by:
- **Automating Backups:** eliminates the manual effort and inconsistency of ad-hoc backups, ensuring your workflows are regularly and reliably saved.
- **Preventing Data Loss:** safeguards your valuable automation assets against unforeseen disasters by creating secure, versioned copies in Google Drive.
- **Facilitating Migration & Recovery:** provides the foundational backups needed for a smoother, more systematic migration or a full disaster recovery, allowing you to restore your operations efficiently.
- **Version Control:** by storing scheduled backups (defaulting to hourly), it allows you to access and restore previous versions of your workflows, offering an undo capability for significant changes or corruptions.
- **Storage Management:** automatically removes old backups based on a configurable retention period, preventing excessive use of Google Drive storage while keeping a relevant history.

What this workflow does
1. Scheduled Trigger: runs automatically every hour.
2. Timestamping: fetches the current date and hour to create a unique name for the backup folder (see the date-handling sketch below).
3. Folder Creation: creates a new folder in a specified Google Drive location, named in the format n8n_backup_YYYY-MM-DD_HH.
4. Workflow Retrieval: connects to your n8n instance via its API and fetches a list of all existing workflows.
5. Individual Backup: processes each workflow one by one - converts the workflow data to a binary JSON file, uploads the JSON file (named after the workflow) to the hourly backup folder in Google Drive, and waits briefly between uploads to respect potential API rate limits.
6. Old Backup Deletion: calculates a cut-off date based on the "Coverage Period" set in the "Settings" node (e.g., 7 days prior to the current date), searches Google Drive for backup folders (matching the naming convention) older than this cut-off date, and deletes them to free up storage space.

Step-by-step setup
1. Import Template: upload the provided JSON file into your n8n instance.
2. Configure Credentials. Google Drive nodes: create or select existing Google Drive OAuth2 API credentials. n8n node (the node that fetches workflows): configure n8n API credentials to allow the workflow to access your instance's workflow data.
3. Specify Google Drive Backup Location: open the "Google Drive Backup Folder Every Hour" node. Under "Drive ID", select a drive from the list or provide its ID. Under "Folder ID", select or input the ID of the parent folder in Google Drive where the n8n_backup_YYYY-MM-DD_HH folders should be created (e.g., a general "n8n_Backups" folder).
4. Set Backup Retention Period: open the "Settings" node and modify the "Coverage Period" value (default: 7). This is the number of days backups are kept before being deleted.
5. Activate Workflow: toggle the "Active" switch for the workflow in your n8n dashboard.

How to customize this workflow to your needs
- **Backup Frequency:** adjust the "Rule" in the **Schedule Trigger** node to change the backup interval (e.g., daily, specific times).
- **Folder/File Naming:** modify the expressions in the "Parameters" tab of the **Google Drive Backup Folder Every Hour** node (for the folder name) or the **Google Drive Upload Workflows** node (for the file name) if you require a different naming convention.
- **Targeted Backups:** to back up only specific workflows, insert a Filter node after the **n8n** node to filter workflows by criteria like name, tags, or ID before they reach the Move Binary Data node.
- **Wait Time:** the **Wait** node is set to 3 seconds between uploads. If you have a very large number of workflows or encounter rate limiting, adjust this duration.
- **Error Workflow:** the workflow is pre-configured with an "Error Workflow" setting. Ensure this error workflow exists in your n8n instance, or update the setting to point to your preferred error handling workflow. This can be used to send notifications on failure.

Important Considerations
- **Resource Usage:** while the workflow includes a wait step between individual workflow uploads to minimize load, backing up an extremely large number of workflows could still consume resources on your n8n instance and make many API calls to Google Drive. Monitor performance if you have thousands of workflows.
- **Testing the Restore Process:** regularly test restoring a few workflows from your Google Drive backups using the companion "Restore All n8n Workflows from Google Drive" template or a manual import. This verifies the integrity of your backups and ensures you can recover when needed.
- **Workflow Modifications:** if you modify this backup workflow (e.g., change the folder naming convention), ensure your restore process or workflow is also updated to match these changes.
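For clarity, here is a small sketch of the naming and cut-off logic described above, assuming the Luxon-style $now helper that n8n exposes in expressions and Code nodes; the coveragePeriod constant mirrors the Settings node value:

```javascript
// Build the hourly folder name, e.g. n8n_backup_2024-05-01_14.
const folderName = `n8n_backup_${$now.toFormat('yyyy-MM-dd_HH')}`;

const coveragePeriod = 7; // days of backups to keep (the "Coverage Period")
const cutoff = $now.minus({ days: coveragePeriod });

// Because the format is zero-padded, a plain string compare against the
// cut-off prefix is enough to spot stale backup folders.
const cutoffPrefix = `n8n_backup_${cutoff.toFormat('yyyy-MM-dd_HH')}`;
const isStale = (name) => name < cutoffPrefix;

return [{ json: { folderName, cutoffPrefix, staleExample: isStale('n8n_backup_2020-01-01_00') } }];
```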
by Lorena
This workflow ensures gender-inclusive language in Mattermost channels. If someone addresses the group with "guys" or "gals", a bot promptly replies with: "May I suggest 'folks' or 'y'all'? We use gender inclusive language here."

- **Webhook node**: triggers the workflow when a new message is posted in Mattermost.
- **IF node**: verifies whether the message includes the words "guys" or "gals" (see the sketch below). If false, it does not take any action. If true, it triggers the Mattermost node.
- **Mattermost node**: posts the language warning message in the Mattermost channel.
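A minimal sketch of the check the IF node performs, written as a Code node for illustration; the word-boundary regex keeps the match to whole words:

```javascript
// Flag messages that use "guys" or "gals", case-insensitively.
const text = $input.first().json.text ?? '';
const needsNudge = /\b(guys|gals)\b/i.test(text);

return [{ json: { needsNudge } }];
```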
by Shrey
This workflow can be used to save all of your workflows in two places: a raw state (as JSON files in Dropbox) and an Airtable base, in a pre-designed format. It runs periodically (currently every 30 minutes) and, for each workflow, either updates the existing record in Airtable or creates a new one.

Here's the Airtable base to give you an idea: View Airtable base

Note: this workflow uses the "http://localhost:5678/rest" API, which the UI editor uses but which is still not officially supported. It may suffer breaking changes at some point in the future, and the workflow might then stop working.
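For context, a hedged sketch of the kind of call this relies on; since this REST API backs the n8n editor UI and is not a stable public contract, the path, auth requirements, and response shape may differ between n8n versions:

```javascript
// Fetch all workflows from the unofficial editor REST API.
// If your instance has authentication enabled, this request will also
// need a valid session cookie or auth header.
const response = await this.helpers.httpRequest({
  method: 'GET',
  url: 'http://localhost:5678/rest/workflows',
});

// The list has been observed under a "data" key; fall back just in case.
const workflows = response.data ?? response;
return workflows.map((wf) => ({ json: wf }));
```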
by Harshil Agrawal
This is a workflow that sends the daily Astronomy Picture of the Day to a channel on Telegram using the NASA node.

- Cron node: triggers the workflow daily at 8 PM. You can update the time in the Cron node to trigger the workflow at your desired time.
- NASA node: after the Cron node triggers the workflow, the NASA node fetches the Astronomy Picture of the Day from the NASA API. You can also get the binary file of the image - toggle Download Image to true to get the file.
- Telegram node: sends the image to a Telegram channel. If you want to share the image on another platform, replace the Telegram node with the node for that platform. For example, to post the image to a channel on Slack, replace the Telegram node with the Slack node.

You can learn to build this workflow on the documentation page of the NASA node.
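For reference, a sketch of the underlying APOD request the NASA node wraps; DEMO_KEY works for light testing, but you should use your own NASA API key in practice:

```javascript
// Fetch the Astronomy Picture of the Day from the public NASA API.
const apod = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://api.nasa.gov/planetary/apod',
  qs: { api_key: 'DEMO_KEY' },
});

// apod.url points at the image; title and explanation make a good caption.
return [{ json: { title: apod.title, url: apod.url, explanation: apod.explanation } }];
```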
by Harshil Agrawal
This workflow allows you to check whether a preview exists for a link and return the preview if it does.

- Peekalink node: checks whether a preview is available for a URL. If a preview is available the node returns true, otherwise false.
- IF node: checks the output from the previous node. If the condition is true, the node connected to the true branch is executed; if the condition is false, the node connected to the false branch is executed.
- Peekalink1 node: fetches the preview of the URL. Based on your use case, you can connect a Slack node, Mattermost node, etc. to receive the response on those platforms.
- NoOp node: adding this node is optional, as its absence won't change how the workflow functions. We've added it because it can sometimes help others understand the workflow better, visually.
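As a rough illustration of the availability check the Peekalink node performs, here is a hedged sketch; the endpoint, header, and response field reflect Peekalink's public docs at the time of writing and should be verified against your API version:

```javascript
// Ask Peekalink whether a preview exists for a link.
const check = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.peekalink.io/is-available/',
  headers: { 'X-API-Key': 'YOUR_PEEKALINK_KEY' }, // placeholder key
  body: { link: 'https://n8n.io' },
  json: true,
});

// check.isAvailable mirrors the true/false output the IF node routes on.
return [{ json: { isAvailable: check.isAvailable } }];
```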
by Obsidi8n
How it Works
This n8n template makes it possible to send emails directly from your Obsidian notes. It leverages the power of the Obsidian Post Webhook plugin, allowing seamless integration between your notes and the email workflow.

What it does:
- Receives note content and metadata from Obsidian via a webhook.
- Parses YAML frontmatter to define email recipients, subject, and more (see the sketch below).
- Automatically processes attachments, encoding them into an email-friendly format.
- Sends emails via Gmail and confirms the status back to Obsidian.
- Includes a testing feature to verify everything works before going live.

Set-up Steps
1. Webhook Configuration: set your n8n POST webhook URL in the Obsidian Post Webhook plugin settings.
2. Email Integration: add your Gmail credentials to the n8n email nodes.
3. Test the Workflow: run a test from Obsidian to ensure the template functions correctly.
4. Activate and Enjoy: start sending customized emails with attachments from your notes in no time!
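To illustrate the frontmatter convention, here is a self-contained sketch with a sample note and a naive parser; the key names (to, subject) are assumptions, so match them to whatever fields your parsing node actually expects:

```javascript
// A sample Obsidian note: YAML frontmatter between --- markers,
// followed by the note body that becomes the email content.
const note = `---
to: reader@example.com
subject: Notes from today
---
Hello from Obsidian!`;

// Split off the frontmatter block from the body.
const [, frontmatter = '', body = ''] = note.split(/^---$/m);

// Naive key: value parsing - fine for flat frontmatter like this.
const meta = {};
for (const line of frontmatter.trim().split('\n')) {
  const i = line.indexOf(':');
  if (i > -1) meta[line.slice(0, i).trim()] = line.slice(i + 1).trim();
}

return [{ json: { to: meta.to, subject: meta.subject, body: body.trim() } }];
```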