by Wayne Simpson
Automate your email management with this workflow, designed for freelancers and business professionals who receive high volumes of emails. By leveraging AI-powered categorisation and dynamic email processing, this template helps you organise your inbox and streamline communication for better efficiency and productivity. Check out the YouTube video for step-by-step setup instructions!

How it works:
- Fetch & Filter Emails: The workflow retrieves emails from your Microsoft Outlook account, filtering out flagged emails and those already categorised.
- Content Preparation: Each email is cleaned up and converted to a structured format using Markdown, making it easier for AI processing (see the sketch below).
- AI Categorisation: The content is analysed by an AI model, which categorises the emails into predefined categories (e.g., Action, Junk, Business, SaaS) based on their context and content.
- Email Categorisation & Folder Management: The categorised emails are updated in Microsoft Outlook and moved to the corresponding folders, such as "Junk Email" or "Receipts", based on the AI's classification.
- Conditional Processing & Final Checks: Additional checks and conditions ensure that only unread emails are processed, and errors are gracefully managed to maintain workflow stability.

Set up steps:
- Connect Microsoft Outlook: Link your Microsoft Outlook account using the built-in credentials node to enable email fetching, updating, and folder management.
- Configure AI Model (Ollama API): Set up the AI model by connecting to the Ollama API and choosing your desired language model for categorisation.
- Modify Email Categories (Optional): Customise the categories and subcategories within the workflow to suit your unique email management needs.
- Set Up Error Handling: Review the error handling node settings to ensure smooth workflow execution.

This template offers a robust solution for managing and organising your inbox, helping you save time and keep your focus on important emails.
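To illustrate the content-preparation step, here is a minimal n8n Code node sketch that flattens an HTML email body into clean text before it reaches the AI model. The field names (body.content, subject) are assumptions about the Outlook node's output, so check your own data first.

```javascript
// n8n Code node sketch; field names are illustrative, not guaranteed.
// Strips HTML and collapses whitespace so the AI model sees clean text.
return $input.all().map(item => {
  const html = item.json.body?.content ?? '';
  const text = html
    .replace(/<style[\s\S]*?<\/style>/gi, '') // drop embedded styles
    .replace(/<[^>]+>/g, ' ')                 // strip remaining tags
    .replace(/&nbsp;/g, ' ')                  // decode a common entity
    .replace(/\s+/g, ' ')                     // collapse whitespace
    .trim();

  return {
    json: {
      id: item.json.id,
      subject: item.json.subject,
      content: text,
    },
  };
});
```

In practice the template converts the result to Markdown; a regex cleanup like this is only a starting point, and an HTML-to-Markdown library preserves more structure for the model.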
by Darryn Balanco
This workflow automates the management of DigitalOcean Droplet snapshots by listing all droplets, filtering based on the number of snapshots, and deleting excess snapshots before creating new ones. It ensures your droplet snapshots stay organized and within a manageable limit, preventing unnecessary storage costs due to an excess of snapshots.

Who is this for?
This workflow is perfect for users managing DigitalOcean Droplets who want to automate snapshot creation and cleanup to save on storage costs and maintain efficient resource management. It's useful for DevOps teams, cloud administrators, or any developer leveraging DigitalOcean for their infrastructure.

What problem is this workflow solving?
When managing multiple DigitalOcean Droplets, snapshots can quickly accumulate, taking up space and increasing storage costs. Manually deleting and creating snapshots is time-consuming and inefficient. This workflow solves that problem by automating the snapshot management process, ensuring that no more than a defined number of snapshots are kept per droplet.

What this workflow does
- Runs every 48 hours: The workflow is triggered by a cron node that runs every 48 hours, ensuring timely snapshot management.
- List all droplets: The workflow retrieves all droplets in the DigitalOcean account.
- Retrieve snapshots: For each droplet, the workflow retrieves a list of existing snapshots.
- Filter snapshots: If the number of snapshots exceeds 4, the workflow filters for snapshots that need to be deleted (a sketch of this logic follows at the end of this description).
- Delete snapshots: Excess snapshots are automatically deleted based on the filter criteria.
- Create new snapshot: After cleaning up, the workflow creates a new snapshot for each droplet, ensuring that backups are always up to date.

Setup
- DigitalOcean API Key: You'll need to configure the HTTP Request nodes with your DigitalOcean API key. This key is required for authenticating requests to list droplets, retrieve snapshots, delete snapshots, and create new ones.
- Snapshot Threshold: By default, the workflow is set to keep no more than 4 snapshots per droplet. This can be adjusted by modifying the filter node conditions.
- Set Execution Frequency: The cron node is set to run every 48 hours, but you can adjust the timing to suit your needs.

How to customize this workflow
- **Adjust Snapshot Limit**: Change the value in the filter node if you want to keep more or fewer snapshots.
- **Modify Run Frequency**: The workflow runs every 48 hours by default. You can change the frequency in the cron node to run more or less often.
- **Enhance with Notifications**: You can add a notification node (e.g., Slack or email) to alert you when snapshots are deleted or created.

Workflow Summary
This workflow automates the management of DigitalOcean Droplet snapshots by keeping the number of snapshots under a defined limit, deleting the oldest ones, and ensuring new snapshots are created at regular intervals.
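As referenced in the filter step, here is a sketch of how the filter logic might look if implemented in a Code node. The snapshot fields (id, name, created_at) follow DigitalOcean's API response shape, but the exact input depends on your HTTP Request node's output.

```javascript
// n8n Code node sketch of the snapshot filter; adjust MAX_SNAPSHOTS to match
// your threshold (the template defaults to 4).
const MAX_SNAPSHOTS = 4;

const snapshots = $input.all().map(item => item.json);

// Sort oldest first so everything before the newest MAX_SNAPSHOTS is excess.
snapshots.sort((a, b) => new Date(a.created_at) - new Date(b.created_at));

const excess = snapshots.length > MAX_SNAPSHOTS
  ? snapshots.slice(0, snapshots.length - MAX_SNAPSHOTS)
  : [];

// Each returned item feeds the delete request that follows.
return excess.map(s => ({ json: { snapshot_id: s.id, name: s.name } }));
```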
by Darryn Balanco
This workflow automates the process of gathering LinkedIn advice articles, extracting their content, and generating unique contributions for each article using an AI model. The contributions are then posted to a Slack channel and a NocoDB database for record-keeping. The workflow is triggered weekly to ensure new articles are continuously collected and responded to.

Who is this for?
This workflow is designed for professionals, marketers, and content creators looking to boost their LinkedIn presence by regularly engaging with LinkedIn advice articles. It's especially useful for those who want to be seen as a "thought leader" or "top voice" in their niche by contributing relevant and unique advice to trending topics.

What problem is this workflow solving?
Manually searching for relevant LinkedIn articles, reading through them, and crafting thoughtful contributions can be time-consuming. This workflow solves that by automating the process of finding new articles, extracting key content, and generating AI-powered contributions. It helps users stay consistently active on LinkedIn, contributing value to trending discussions.

What this workflow does
- Triggers Weekly: The workflow is set to run every Monday at 8:00 AM.
- Search Google for LinkedIn Advice Articles: Uses a predefined Google search URL to find the latest LinkedIn advice articles based on the user's area of expertise.
- Extract LinkedIn Article Links: A code node extracts all LinkedIn advice article links from the search results (a sketch of this step follows at the end of this description).
- Retrieve Article Content: For each article link, the workflow retrieves the HTML content and extracts the article title, topics, and existing contributions.
- Generate AI-Powered Contributions: The workflow sends the extracted article content to an AI model, which generates unique, helpful advice for each topic within the article.
- Post to Slack & NocoDB: The AI-generated contributions, along with the article links, are posted to a designated Slack channel and stored in a NocoDB database for future reference.

Setup
- Google Search URL: Update the Google search URL with the relevant LinkedIn advice query for your field (e.g., "site:linkedin.com/advice 'marketing automation'").
- Slack Integration: Connect your Slack account and specify the Slack channel where you want the contributions to be posted.
- NocoDB Integration: Set up your NocoDB project to store the generated contributions along with the article titles and links.

How to customize this workflow
- **Change Search Terms**: Modify the Google search URL to focus on a different LinkedIn topic or expertise area.
- **Adjust Trigger Frequency**: The workflow is set to run weekly, but you can adjust the frequency by changing the schedule trigger.
- **Enhance Contribution Quality**: Customize the AI model's prompt to generate contributions that align with your brand voice or content strategy.

Workflow Summary
This workflow helps users maintain a consistent presence on LinkedIn by automating the discovery of new advice articles and generating unique contributions using AI. It is ideal for professionals who want to engage with LinkedIn content regularly without spending too much time manually searching and drafting responses.
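For the link-extraction step, here is a sketch of what the code node might contain, assuming the previous node returns the raw search results HTML in a data field (an assumption; match it to your HTTP Request node's output).

```javascript
// n8n Code node sketch: pull LinkedIn advice article links out of the HTML.
const html = $json.data ?? '';

// Advice articles live under linkedin.com/advice/<slug>.
const matches = html.match(/https:\/\/www\.linkedin\.com\/advice\/[\w-]+/g) ?? [];

// De-duplicate while preserving order, then emit one item per link.
const links = [...new Set(matches)];

return links.map(url => ({ json: { url } }));
```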
by Didac Fernandez
Nova AI Content Marketing Agent - LinkedIn & Facebook Automation

This n8n template demonstrates how to create a complete AI-powered social media content creation and scheduling system that generates platform-optimized posts for LinkedIn and Facebook with custom images and human approval workflows.

Possible use cases:
- Generate a full week of social media content from a single brand brief
- Create platform-specific content that maintains brand voice consistency
- Automate image generation with AI while maintaining quality control
- Schedule approved content across multiple social platforms
- Track and organize all content in centralized spreadsheets

How it works
- The automation starts with a form submission collecting 10 brand variables (name, industry, demographics, etc.)
- The Nova AI Agent analyzes the brand information and generates 6 distinct social media posts (3 LinkedIn professional, 3 Facebook community-focused)
- Content is split by platform and routed to separate image generation workflows
- Google Imagen 4 Ultra creates custom visuals for each post with platform-specific aspect ratios (a sketch of this call follows at the end of this description)
- Each generated image is sent to Slack for human approval via interactive forms
- If feedback is provided, NanoBanana AI edits the image based on natural language instructions
- Approved images are uploaded to Google Drive with organized naming conventions
- All content data is logged to Google Sheets with image URLs and scheduling information
- Final posts are scheduled via the Late API to the respective social platforms
- The workflow loops through each post individually for quality control

Requirements
- OpenRouter API credentials for GPT-5 Mini access
- Replicate API key for Google Imagen 4 Ultra and NanoBanana
- Slack OAuth2 credentials with bot permissions
- Google Drive OAuth2 credentials
- Google Sheets API access
- GetLate API key connected to LinkedIn and Facebook accounts
- Perplexity API for research enhancement (optional)

HOW TO USE

STEP 1 - Setup Form and Brand Variables
- Configure the Form Trigger webhook URL for brand data collection
- Update the 10 form fields with your specific industry placeholders
- Test the form submission to ensure data flows correctly

STEP 2 - Configure AI Services
- Add your OpenRouter API credentials to both Chat Model nodes
- Add your Replicate API key to the HTTP Header Auth credential
- Configure Perplexity API credentials for research functionality
- Set up custom session keys for memory management

STEP 3 - Setup Approval Workflow
- Add Slack OAuth2 credentials to both "Send message and wait" nodes
- Update the Slack channel ID to your preferred approval channel
- Configure the custom form fields for approval/feedback collection

STEP 4 - Configure Storage and Scheduling
- Add Google Drive OAuth2 credentials and update the target folder ID
- Add Google Sheets credentials and update the spreadsheet ID
- Get your Late API key from getlate.dev and add it to the HTTP Header Auth credential
- Update the Late accountId in both Schedule Post nodes with your platform IDs

STEP 5 - Customize Content Strategy
- Modify the Nova system prompt to match your brand voice requirements
- Adjust the visual style requirements in the AI Agent configuration
- Update posting date logic and timezone settings as needed
- Test the complete workflow with sample brand data
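For orientation, here is a plain-JavaScript sketch of the kind of request the image generation step sends through its HTTP Request node. The Replicate model path, the Prefer: wait header, the input fields, and the aspect-ratio values are assumptions based on Replicate's general API shape, not settings taken from the template, so verify them against your node configuration.

```javascript
// Sketch only: model path, input fields, and ratios are assumptions.
async function generateImage(prompt, platform, replicateToken) {
  const res = await fetch(
    'https://api.replicate.com/v1/models/google/imagen-4-ultra/predictions',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${replicateToken}`,
        'Content-Type': 'application/json',
        Prefer: 'wait', // ask Replicate to reply once the image is ready
      },
      body: JSON.stringify({
        input: {
          prompt,
          // Platform-specific aspect ratios (illustrative values)
          aspect_ratio: platform === 'linkedin' ? '1:1' : '4:3',
        },
      }),
    },
  );
  if (!res.ok) throw new Error(`Replicate request failed: ${res.status}`);
  return res.json(); // prediction object with the output image URL
}
```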
by Angel Menendez
Streamline Case Management in TheHive via Slack!

Our TheHive Slack Integration empowers SOC analysts by allowing them to efficiently manage and update case attributes directly within Slack, reducing the need to switch contexts and enhancing response time.

Key Features:
- **Direct Case Management**: Modify case details such as assignee, severity, status, and more through intuitive form inputs embedded within Slack messages.
- **Seamless Integration**: Assumes matching email addresses between TheHive and Slack users for straightforward assignee updates. Note: Ensure email consistency to avoid assignment errors.
- **Instant Case Actions**: Quickly close cases as false positives or adjust threat levels with minimal clicks, directly impacting case status in TheHive and reflecting updates immediately in Slack.
- **Task Management**: Add tasks to cases through a user-friendly modal popup, fostering better task tracking and delegation within your team.

Operational Benefits:
- **Efficiency**: Enables analysts to perform multiple case actions without leaving Slack, streamlining workflows and saving valuable time.
- **Accuracy**: Reduces the chances of human error by providing a controlled interface for case updates.
- **Agility**: Enhances the SOC team's agility by providing tools for rapid response and case management, crucial for effective security operations.

Setup Tips:
- Verify that all SOC team members have matching email IDs in TheHive and Slack.
- Familiarize your team with the Slack form inputs and ensure they understand the importance of accurate data entry.
- Regularly review and update the integration settings to accommodate any changes in your security operations protocols.

Need Help?
For detailed setup instructions or troubleshooting, refer to our Integration Guide or reach out on our Support Forum.

Leverage this integration to maximize your SOC team's efficiency and responsiveness, ensuring that case management is as streamlined and effective as possible.
by Alexandra Spalato
YouTube Content Repurposing Automation

Who's it for
This workflow is for content creators, marketers, agencies, coaches, and businesses who want to maximize their YouTube content ROI by automatically generating multiple content assets from single videos. It's especially useful for professionals who want to:
- Repurpose YouTube videos into blogs, social posts, newsletters, and tutorials without manual effort
- Scale their content production across multiple channels and platforms
- Create consistent, high-quality content derivatives while saving time and resources
- Build automated content systems that generate multiple revenue streams
- Maintain an active presence across social media, email, and blog platforms simultaneously

What problem is this workflow solving
Content creators face significant challenges when trying to maximize their video content:
- Time-intensive manual repurposing: Converting one YouTube video into multiple content formats traditionally requires hours of manual writing, editing, and formatting across different platforms.
- Inconsistent content quality: Manual repurposing often leads to varying quality levels and missed opportunities to optimize content for specific platforms.
- High costs for content services: Hiring ghostwriters or content agencies to repurpose videos can cost thousands of dollars monthly.
- Scaling bottlenecks: Manual processes prevent creators from efficiently scaling their content across multiple channels and formats.
This workflow solves these problems by automatically extracting YouTube video transcripts, using AI to generate multiple high-quality content formats (tutorials, blog posts, social media content, newsletters), and organizing everything in Airtable for easy management and distribution.

How it works
- Automated Video Processing: Starts with a manual trigger and retrieves YouTube URLs from your Airtable configuration, processing only videos marked as "selected" while filtering out those marked for deletion.
- Intelligent Transcript Extraction: Uses the Scrape Creators API to extract video transcripts, automatically cleaning and formatting the text for optimal AI processing and content generation.
- Multi-Format Content Generation: Leverages OpenRouter models, so you can easily test different AI models and choose the one that delivers the best results for your needs:
  - Step-by-step tutorials with code snippets and technical details
  - YouTube scripts with hooks, titles, and conclusions
  - Blog posts optimized for lead generation
  - Structured summaries with key takeaways
  - LinkedIn posts with engagement triggers
  - Newsletter content for email marketing
  - Twitter/X posts for social media
- Smart Content Filtering: Processes only the content types you've selected in Airtable, ensuring efficient resource usage and faster execution times.
- Automated Content Organization: Matches and combines all generated content pieces by URL, then updates your Airtable with complete, ready-to-use content assets organized by type and source video (a sketch of this merge step follows below).
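The merge step could be sketched as an n8n Code node like the following. It assumes each incoming item carries the source url plus one generated content field (an assumption about the upstream output shape).

```javascript
// n8n Code node sketch: combine all generated pieces by source video URL.
const byUrl = {};

for (const item of $input.all()) {
  const { url, ...fields } = item.json;
  byUrl[url] = { ...(byUrl[url] ?? { url }), ...fields };
}

// One consolidated item per video, ready for the Airtable update node.
return Object.values(byUrl).map(json => ({ json }));
```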
How to set up

Required credentials
- **OpenRouter API key**
- **Airtable Personal Access Token**
- **Scrape Creators API Key** - For YouTube transcript extraction and processing

Airtable base setup
Create an Airtable base with one main table, Videos:
- **title** (Single line text): Video title for reference
- **url** (URL): YouTube video URL to process
- **Status** (Single select): Options: "selected", "delete", "processed"
- **output** (Multiple select): Content types to generate: summary, tutorial, blog-post, linkedin, newsletter, tweeter, youtube
- **summary** (Long text): Generated video summary
- **tutorial** (Long text): Generated step-by-step tutorial
- **key_take_aways** (Long text): Extracted key insights
- **blog_post** (Long text): Generated blog post content
- **linkedin** (Long text): LinkedIn post content
- **newsletter** (Long text): Email newsletter content
- **tweeter** (Long text): Twitter/X post content
- **youtube_titles** (Long text): YouTube video title suggestions
- **youtube_hook** (Long text): Video opening hooks
- **youtube_steps** (Long text): Video step breakdowns
- **youtube_conclusion** (Long text): Video ending/CTAs

API Configuration
Scrape Creators Setup:
- Sign up for the Scrape Creators API
- Obtain your API key from the dashboard
- Configure the HTTP Request node with your credentials
- Set the endpoint to: https://api.scrapecreators.com/v1/youtube/video/transcript (a sketch of this request follows at the end of this section)
OpenRouter Setup:
- Create an OpenRouter account and generate an API key

Workflow Configuration
- Import the workflow JSON into your n8n instance
- Update all credential references with your API keys
- Configure the Airtable nodes with your base and table IDs
- Test the workflow with a single video URL first

Requirements
- **n8n instance** (self-hosted or cloud)
- **Active API subscriptions** for OpenRouter (or the LLM of your choice), Airtable, and Scrape Creators
- **YouTube video URLs** - Must be publicly accessible videos with available transcripts
- **Airtable account** - Free tier sufficient for most use cases

How to customize the workflow

Modify content generation prompts
Edit the LLM Chain nodes to customize content style and format:
- **Tutorial node**: Adjust technical depth and formatting preferences
- **Blog post node**: Modify tone, length, and CTA strategies
- **LinkedIn node**: Customize engagement hooks and professional tone
- **Newsletter node**: Tailor subject lines and email marketing approach

Adjust AI model selection
Update the OpenRouter Chat Model to use different models.

Add new content formats
Create additional LLM Chain nodes for new content types:
- Instagram captions
- TikTok scripts
- Podcast descriptions
- Course outlines
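As referenced in the Scrape Creators setup, here is a plain-JavaScript sketch of the transcript request. The endpoint is the one quoted above; the query parameter name and the x-api-key header are assumptions, so confirm them against the Scrape Creators docs.

```javascript
// Sketch of the HTTP Request node's call; header and param names are assumed.
async function fetchTranscript(videoUrl, apiKey) {
  const url = new URL('https://api.scrapecreators.com/v1/youtube/video/transcript');
  url.searchParams.set('url', videoUrl); // the YouTube URL stored in Airtable

  const res = await fetch(url, { headers: { 'x-api-key': apiKey } });
  if (!res.ok) throw new Error(`Transcript request failed: ${res.status}`);
  return res.json(); // expected to contain the transcript for the AI prompts
}
```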
by Angel Menendez
Analyze Emails for Security Insights

Who is this for?
This workflow is ideal for security teams, IT Ops professionals, and managed service providers (MSPs) responsible for monitoring and validating email traffic. It's especially useful for organizations that need to identify potential phishing attempts, spam, or compromised accounts by analyzing email headers and IP reputation.

What problem is this workflow solving?
This workflow helps identify malicious or suspicious emails by verifying email authentication headers (SPF, DKIM, DMARC) and analyzing the reputation of the originating IP address. By automating these checks, it reduces manual analysis time and flags potential threats efficiently.

What this workflow does
- **Email Monitoring:** Polls a specified Microsoft Outlook folder for new emails in real-time.
- **Header Analysis:** Retrieves and processes email headers to extract critical information such as authentication results and the sender's IP address.
- **IP Reputation Check:** Leverages external APIs (IP Quality Score and IP-API) to analyze the originating IP for potential spam or malicious activity.
- **Authentication Validation:** Validates SPF, DKIM, and DMARC headers, determining if the email passes industry-standard authentication protocols (a sketch of this check follows at the end of this description).
- **Data Aggregation and Reporting:** Combines all analyzed data into a unified format, ready for reporting or integration into downstream systems.
- **Webhook Integration:** Outputs the findings via a webhook, enabling integration with alerting tools or security information and event management (SIEM) platforms.

Setup
- Connect to Outlook: Configure the Microsoft Outlook trigger node with valid OAuth2 credentials. Specify the email folder to monitor for new messages.
- API Keys (Optional): Obtain an API key for IP Quality Score (https://ipqualityscore.com). Ensure the IP-API endpoint is accessible. This step is optional, as ipqualityscore.com provides a limited number of free lookups each month. See more details here.
- Webhook Configuration: Set up a webhook endpoint to receive the output of the workflow.
- Optional Adjustments: Customize polling intervals in the trigger node. Modify header filters or extend the validation logic as needed.

How to customize this workflow to your needs
- **Add Alerts:** Use the Respond to Webhook node to trigger notifications in Slack, email, or any other communication channel.
- **Integrate with SIEM:** Forward the workflow output to SIEM tools like Splunk or the ELK Stack for further analysis.
- **Modify Validation Rules:** Update the SPF, DKIM, or DMARC logic in the Set nodes to align with your organization's security policies.
- **Expand IP Analysis:** Add more APIs or services to enrich IP reputation data, such as VirusTotal or AbuseIPDB.

This workflow provides a robust foundation for email security monitoring and can be tailored to fit your organization's unique requirements. With its modular design and integration options, it's a versatile tool to enhance your cybersecurity operations.
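The authentication-validation step might look like this as an n8n Code node. The header structure (an array of name/value pairs) is an assumption about the upstream node's output.

```javascript
// n8n Code node sketch: read SPF/DKIM/DMARC verdicts from the
// Authentication-Results header, e.g. "spf=pass ...; dkim=pass ...; dmarc=pass".
const headers = $json.headers ?? [];
const authResults = headers.find(
  h => h.name.toLowerCase() === 'authentication-results',
)?.value ?? '';

const verdict = protocol => {
  const match = authResults.match(new RegExp(`${protocol}=(\\w+)`, 'i'));
  return match ? match[1].toLowerCase() : 'none';
};

return [{
  json: {
    spf: verdict('spf'),
    dkim: verdict('dkim'),
    dmarc: verdict('dmarc'),
    passedAll: ['spf', 'dkim', 'dmarc'].every(p => verdict(p) === 'pass'),
  },
}];
```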
by Angel Menendez
Analyze Emails for Security Insights

Who is this for?
This workflow is ideal for IT professionals, security analysts, and organizations looking to enhance their email security practices. It is particularly useful for those who need to analyze Gmail email headers for IP tracking, spoofing detection, and sender reputation assessment.

What problem is this workflow solving?
Email spoofing and phishing attacks are significant cybersecurity threats. By analyzing email headers, this workflow provides detailed insights into an email's origin, authentication status, and the reputation of the sending IP address. It helps detect potential spoofing attempts and assess the trustworthiness of incoming emails.

What this workflow does
This n8n workflow automates the process of analyzing email headers received in Gmail. It performs the following key functions:
- Triggering and Email Header Extraction: It monitors Gmail inboxes for new emails and extracts their headers for analysis.
- Authentication Analysis: It validates SPF, DKIM, and DMARC authentication results to ensure the email adheres to industry-standard security protocols.
- IP Analysis: The workflow extracts the originating IP address and evaluates its reputation and geographic details using external APIs (a sketch of this lookup follows at the end of this description).
- Reputation Scoring: It integrates with IP Quality Score to detect spam activity and assess the sender's reputation.
- Consolidation and Webhook Response: All results are aggregated into a single JSON response, making it easy to integrate with third-party platforms or tools for further automation.

Setup
- Authenticate Gmail: Configure the Gmail Trigger node with your Gmail account credentials.
- API Keys (Optional): Obtain an API key for IP Quality Score (https://ipqualityscore.com). Ensure the IP-API endpoint is accessible. This step is optional, as ipqualityscore.com provides a limited number of free lookups each month. See more details here.
- Activate the Workflow: Ensure the workflow is active to process incoming emails in real-time.

How to customize this workflow to your needs
- **Add Alerts:** Use the Gmail - Respond to Webhook node to trigger notifications in Slack, email, or any other communication channel.
- **Integrate with SIEM:** Forward the workflow output to SIEM tools like Splunk or the ELK Stack for further analysis.
- **Modify Validation Rules:** Update the SPF, DKIM, or DMARC logic in the Set nodes to align with your organization's security policies.
- **Expand IP Analysis:** Add more APIs or services to enrich IP reputation data, such as VirusTotal or AbuseIPDB.

This workflow provides a robust foundation for email security monitoring and can be tailored to fit your organization's unique requirements. With its modular design and integration options, it's a versatile tool to enhance your cybersecurity operations.
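The IP lookup combines two services; a plain-JavaScript sketch follows. The IP-API endpoint is its public one; the IPQualityScore URL shape and response fields are assumptions, so verify them against the ipqualityscore.com docs.

```javascript
// Sketch of the IP analysis step; the IPQS URL and fields are assumed.
async function analyzeIp(ip, ipqsKey) {
  // Geographic and ISP details (free public endpoint, no key required)
  const geo = await (await fetch(`http://ip-api.com/json/${ip}`)).json();

  // Reputation and spam score (assumed URL shape)
  const rep = await (
    await fetch(`https://ipqualityscore.com/api/json/ip/${ipqsKey}/${ip}`)
  ).json();

  return {
    country: geo.country,
    isp: geo.isp,
    fraudScore: rep.fraud_score,   // assumed response field
    recentAbuse: rep.recent_abuse, // assumed response field
  };
}
```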
by Dat Proto
Introduction
This workflow backs up all of your existing workflows to a single GitHub repository. Backup folder names are based on the current backup date and follow the default format "yyyy/MM/dd" (set up at the "Create sub path" node). Throughout the backup process, n8n informs the user via Discord with clear messages about started, successful, and failed backups.

Tech Stack
The following nodes / services / libraries are used in this workflow:
Nodes:
- Discord: To send messages to the configured channel.
- N8N: To get all workflows' information.
- Github: To store backup data.
- Code: To run the data comparison (existing vs. latest workflow data); a sketch follows at the end of this description.
- Wait: To avoid the Discord message rate limit.
External libraries:
- Underscore.js: A JavaScript library that provides many common utility functions, saving time when using the Code node.

Guideline
- Open the "Config" node and set up the following information:
  - repo_owner: Your GitHub username.
  - repo_name: The repository where you want to store the workflow backup data.
- Open the "Create sub path" node and change the naming and path format of the backup folder(s).
- Set up custom messages in the Discord nodes:
  - Starting Message: n8n informs the user when the workflow starts.
  - Inform Success Flows: After each successful backup, n8n notifies the user.
  - Inform Failed Flows: After each failed backup, n8n notifies the user so they can take appropriate action.
  - Completed Notifications: At the end, the workflow gives the user a summary.
- Set up the "Schedule Trigger" node to change the default automated backup time.

Screenshots
Discord output
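The Code node comparison mentioned in the Tech Stack can be as simple as a deep-equality check with Underscore.js. A sketch, assuming the two workflow JSON payloads arrive under the field names shown (both are assumptions) and that external libraries are enabled for the Code node:

```javascript
// n8n Code node sketch: compare the stored backup with the live workflow.
const _ = require('underscore'); // requires external libraries to be enabled

const existing = $json.github_workflow; // assumed field: JSON fetched from GitHub
const latest = $json.n8n_workflow;      // assumed field: JSON from the N8N node

// Deep comparison covers nodes, connections, and settings alike.
const changed = !_.isEqual(existing, latest);

return [{ json: { name: latest?.name, changed } }];
```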
by Boriwat Chanruang
Who is this for?
This workflow is designed for:
- **Content creators**, artists, or hobbyists looking to experiment with AI-generated art.
- **Small business owners** or **marketers** using LEGO-style designs for branding or promotions.
- **Developers** or **AI enthusiasts** wanting to automate image transformations through messaging platforms like LINE.

What problem is this workflow solving?
- Simplifies the process of creating custom AI-generated LEGO-style images.
- Automates the manual effort of transforming user-uploaded images into AI-generated artwork.
- Bridges the gap between messaging platforms (LINE) and advanced AI tools (DALL·E).
- Provides a seamless system for users to upload an image and receive an AI-transformed output without technical expertise.

What this workflow does
- Image Upload via LINE: Users send an image to the LINE chatbot.
- AI-Powered Prompt Creation: GPT generates a prompt to describe the uploaded image for LEGO-style conversion.
- AI Image Generation: DALL·E 3 processes the prompt and creates a LEGO-style isometric image.
- Image Delivery: The generated image is returned to the user in LINE (a sketch of this reply call follows at the end of this description).

Setup
Prerequisites
- **LINE Developer Account** with API credentials.
- Access to the OpenAI API with DALL·E and GPT-4 capabilities.
- A configured n8n instance to run this workflow.
Steps
- Environment Setup: Add your LINE API token and OpenAI credentials as environment variables (LINE_API_TOKEN, OPENAI_API_KEY) in n8n.
- Configure LINE Webhook: Point the LINE webhook to your n8n instance.
- Connect OpenAI: Set up OpenAI API credentials in the workflow nodes for GPT-4 and DALL·E.
- Test Workflow: Upload a sample image in LINE and ensure it returns the LEGO-style AI image.

How to customize this workflow to your needs
- **Localization**: Modify response messages in LINE to fit your audience's language and tone.
- **Integration**: Add nodes to send notifications through other platforms like Slack or email.
- **Image Style**: Replace the LEGO-style image prompt with other artistic styles or themes.

Advanced Use Cases
- Art Contests: Users upload images and receive AI-enhanced outputs for community voting or branding.
- Marketing Campaigns: Quickly generate creative visual content for ads and promotions using customer-submitted photos.
- Education: Use the workflow to teach students about AI-generated art and automation through a hands-on approach.

Tips for Optimization
- **Error Handling**: Add fallback nodes to handle invalid images or API errors gracefully.
- **Logging**: Implement a logging mechanism to track requests and outputs for debugging and analytics.
- **Scalability**: Use queue-based systems or cloud scaling to handle large volumes of image requests.

Enhancements
- Add sticky notes in n8n to provide inline instructions for configuring each node.
- Create a tutorial video or documentation for first-time users to set up and customize the workflow.
- Include advanced filters to allow users to select from multiple styles beyond LEGO (e.g., pixel art, watercolor).

This workflow enables seamless interaction between messaging platforms and advanced AI capabilities, making it highly versatile for various creative and business applications.
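To make the image-delivery step concrete, here is a plain-JavaScript sketch of the reply call to the LINE Messaging API. The endpoint and message shape follow LINE's documented reply API; the replyToken and image URL are values earlier nodes would supply.

```javascript
// Sketch of replying to the user with the generated image via LINE.
async function replyWithImage(replyToken, imageUrl, channelAccessToken) {
  const res = await fetch('https://api.line.me/v2/bot/message/reply', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${channelAccessToken}`,
    },
    body: JSON.stringify({
      replyToken,
      messages: [{
        type: 'image',
        originalContentUrl: imageUrl, // the DALL·E output URL
        previewImageUrl: imageUrl,    // LINE also requires a preview image
      }],
    }),
  });
  if (!res.ok) throw new Error(`LINE reply failed: ${res.status}`);
}
```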
by Mario
Purpose
This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. All tasks can be managed in Todoist, and Fantastical can additionally be used to manage scheduled tasks & events all together.

Demo & Explanation

How it works
- The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist.
- To avoid overwrites by late-arriving webhook calls, the current task is retrieved from both sides every time.
- Redis is used to prevent endless loops, since an update in one system triggers another webhook call again. Using the ID of the task, the trigger is locked down for 15 seconds (see the sketch at the end of this description).
- Depending on the detected changes, the other side is updated accordingly. Generally, Notion is treated as the main source.
- Using an "Obsolete" Status, it is guaranteed that tasks never get deleted entirely by accident.
- The Todoist ID is stored in the Notion task, so the two stay linked together.
- An additional full sync workflow runs daily and fixes inconsistencies, if any occurred, since webhooks cannot be trusted entirely.
- Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook.
- Another tiny workflow helps with generating a global config, which is used by all workflows for mapping purposes.

Mapping (Notion >> Todoist)
- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

Current limitations
- Changes to the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
- Subtasks are not linked automatically to their parent yet.
- Recurring tasks are not supported yet.
- Task names do not support URLs yet.

Prerequisites
Notion
- A database must already exist (get a basic template here) with the following properties (case matters!):
  - Text: "Name"
  - Status: "Status", containing at least the options "Backlog", "In progress", "Done", "Obsolete"
  - Select: "Priority", containing the options "do first", "urgent", "important"
  - Date: "Due"
  - Checkbox: "Focus"
  - Text: "Todoist ID"
Todoist
- A project must already exist with the same sections as defined by the Status options in Notion (except Done and Obsolete).
Redis
- Create a free Redis Cloud instance or self-host.

Setup
The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following steps:
- Set up credentials for Notion (access token), Todoist (access token), and Redis - you can also create empty credentials and populate these later during further setup.
- Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance - otherwise you need to map the credentials of many nodes.
- Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

How to use
You can apply changes (create, update, delete) to tasks both in Notion and Todoist, which then get synced over within a couple of seconds (this is handled by the differential realtime sync). The daily running full sync resolves possible discrepancies in Todoist and sends a summary via email if anything needed to be updated. In case that contains an unintended change, you can jump to the task directly from the email to fix it manually.
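The 15-second Redis lock described in "How it works" boils down to a SET with NX and EX flags. A sketch using the node-redis client (the template uses Redis nodes instead, and the key naming here is an assumption):

```javascript
// Sketch of the loop-prevention lock: SET ... NX EX 15 succeeds only if the
// key does not exist yet, so the echo webhook our own update triggers within
// the next 15 seconds is ignored.
import { createClient } from 'redis';

async function shouldProcess(taskId) {
  const redis = createClient({ url: process.env.REDIS_URL });
  await redis.connect();

  const acquired = await redis.set(`task-lock:${taskId}`, '1', {
    NX: true,
    EX: 15, // lock expires on its own after 15 seconds
  });

  await redis.quit();
  return acquired === 'OK'; // false means an update is already in flight
}
```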
by Sarfaraz Muhammad Sajib
This automation workflow captures incoming chat messages from your Tawk.to live chat widget and sends alert emails via Gmail to notify your support team instantly. It is designed to help you respond promptly to visitors and improve your customer support experience.

Prerequisites
- **Tawk.to account:** You must have an active Tawk.to account with a configured live chat widget on your website.
- **Gmail account:** A Gmail account with API access enabled and configured in n8n for sending emails.
- **n8n instance:** Access to an n8n workflow automation instance where you will import and configure this workflow.

Step-by-Step Setup Instructions

1. Configure Tawk.to Webhook
- Log in to your Tawk.to dashboard.
- Navigate to Administration > Webhooks.
- Click Add Webhook and enter the following:
  - URL: Your n8n webhook URL from the Receive Tawk.to Request node (e.g., https://your-n8n-instance.com/webhook/a4bf95cd-a30a-4ae0-bd2a-6d96e6cca3b4)
  - Method: POST
  - Events: Select the chat message event (e.g., Visitor Message or Chat Message Received)
- Save the webhook configuration.

2. Configure Gmail Credentials in n8n
- In your n8n instance, go to Credentials.
- Add a new Gmail OAuth2 credential:
  - Follow Google's instructions to create a project, enable the Gmail API, and obtain a client ID and secret.
  - Authenticate and authorize n8n to send emails via your Gmail account.

3. Import and Activate Workflow
- Import the provided workflow JSON into n8n.
- Verify the Receive Tawk.to Request webhook node path matches the webhook URL configured in Tawk.to.
- Enter the email address you want the alerts sent to in the Send alert email node's sendTo parameter.
- Activate the workflow.

Workflow Explanation
- Receive Tawk.to Request: This webhook node listens for POST requests from Tawk.to containing chat message data.
- Format the message: Extracts relevant data from the incoming payload, such as chat ID, visitor name, country, and message text, and assigns them to new fields for easy use downstream (a sketch of this step follows at the end of this description).
- Send alert email: Uses the Gmail node to send a notification email to your support team with all relevant chat details formatted in a clear, concise text email.

Customization Guidance
- **Email Recipient:** Update the sendTo field in the Send alert email node to specify your support team's email address.
- **Email Content:** Modify the message template in the Send alert email node's message parameter to suit your tone or include additional details like timestamps or chat URLs.
- **Additional Processing:** You can extend the workflow by adding nodes for logging chats, triggering Slack notifications, or storing messages in a database.

By following these instructions, your support team will receive immediate email alerts whenever a new chat message arrives on your website, improving response times and customer satisfaction.
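For the "Format the message" step, here is a minimal n8n Code node sketch. The payload field names (chatId, visitor, message) are assumptions about Tawk.to's webhook body, so inspect a real request before relying on them.

```javascript
// n8n Code node sketch: lift the fields the alert email needs out of the
// Tawk.to webhook payload (field names assumed).
const body = $json.body ?? $json;

return [{
  json: {
    chatId: body.chatId,
    visitorName: body.visitor?.name ?? 'Unknown visitor',
    visitorCountry: body.visitor?.country ?? 'Unknown',
    message: body.message?.text ?? '',
  },
}];
```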