by Jimleuk
This n8n workflow builds an appointment-scheduling AI agent which can:

- Take enquiries from prospective customers and help them book an appointment by checking appointment availability.
- Send follow-up messages to re-engage leads where no appointment is booked.
- Reschedule or even cancel a booking for the user after an appointment is made, without human intervention.

For small outfits, this workflow could contribute the extra "man-power" required to increase business sales.

The sample Airtable can be found here: https://airtable.com/appO2nHiT9XPuGrjN/shroSFT2yjf87XAox

2024-10-22: Updated to Cal.com API v2.

## How it works

- The customer sends an enquiry via SMS to trigger our workflow. For this trigger, we'll use a Twilio webhook (see the payload-parsing sketch at the end of this template).
- The prospective or existing customer's number is logged in an Airtable base, which we'll use to track all our enquiries.
- Next, the message is sent to our AI agent, which can reply to the user and decide whether an appointment booking can be made. The reply is sent via SMS using Twilio.
- A schedule trigger, which runs every day, checks our chat logs for prospective customers who have yet to book an appointment but still show interest.
- This list is sent to our AI agent, which formulates a personalised follow-up message for each lead, asking whether they want to continue with the booking. The follow-up interaction is logged so as not to send too many messages to the customer.

## Requirements

- A Twilio account to receive customer messages.
- An Airtable account and base to use as our datastore for enquiries.
- A Cal.com account to use as our scheduling service.
- An OpenAI account for our AI model.

## Customising this workflow

- Not using Airtable? Swap it out for your CRM of choice, such as HubSpot, or your own service.
- Not using Cal.com? Swap it out for an API-enabled service such as Acuity Scheduling, or your own service.
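As a rough illustration of the first step, here is a minimal n8n Code-node sketch that pulls the sender and message text out of the inbound Twilio webhook. `From`, `Body`, and `MessageSid` are standard Twilio form fields; the output key names are assumptions for this example.

```javascript
// n8n Code node: normalise the inbound Twilio SMS webhook.
// Twilio posts form-encoded fields; in n8n's Webhook node they
// typically arrive under $json.body.
const payload = $input.first().json.body;

return [{
  json: {
    phone: payload.From,          // customer's number, e.g. "+15551234567"
    message: payload.Body,        // the SMS text the agent replies to
    messageSid: payload.MessageSid,
    receivedAt: new Date().toISOString(),
  },
}];
```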
by Mario
## Purpose

This workflow automatically creates tasks from forwarded emails, similar to Asana, but better. Emails are processed by AI and converted into actionable tasks. In addition, this workflow is built so that multiple users can share this single process by setting up their individual configuration through a user-friendly portal (internal tool), instead of having to manage their own workflows.

## Demo

## How it works

- One Gmail account is used to process inbound mail from different users.
- A custom web portal enables users to define "routes". That's where the mapping between an automatically generated Gmail alias and a Notion database URL, including the personal API token, happens.
- Using a Gmail trigger, new entries are split by the email alias, so the corresponding route can be retrieved from the database connected to the portal (see the alias-extraction sketch at the end of this template).
- Every email then gets processed by AI to generate an actionable task, along with a short summary of the original email and some metadata.
- Based on a predefined structure, a new page is created in the corresponding Notion database.
- Finally, the email is marked as "processed" in Gmail.
- If an error happens, the route gets paused to prevent a possible overflow, and the user is notified by email.

## Setup

1. Create a new Google account (alternatively, you can use an existing one and set up rules to keep your inbox organized).
2. Create two labels in Gmail: "Processed" and "Error".
3. Clone this Softr template, including the Airtable dataset, and publish the application.
4. Clone this workflow and choose credentials (Gmail, Airtable).
5. Follow the additional instructions provided within the workflow notes.
6. Enable the workflow, so it runs automatically in the background.

## How to use

1. Open the published Softr application.
2. Register as a new user.
3. Create a new route containing the Notion API key and the Notion database URL.
4. Expand the new entry to copy the email address.
5. Save the address as a new contact in your email provider of choice.
6. Forward an email to it and watch how it gets converted into an actionable task.

## Disclaimer

- Airtable was chosen so you can set up this template fairly quickly. It is advisable to replace the persistence layer with something you own, like a self-hosted SQL server, since we are dealing with sensitive information from multiple users.
- This solution is only meant for building internal tools, unless you own an embed license for n8n.
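To make the routing step concrete, here is a small sketch of extracting a Gmail "+" alias from the recipient address. It assumes routes are generated as plus-addresses (e.g. `inbox+route42@gmail.com`); the exact header path depends on your Gmail trigger's output shape.

```javascript
// n8n Code node: extract the Gmail alias used to route an email.
const to = $input.first().json.To || ''; // e.g. "Tasks <inbox+route42@gmail.com>"

const match = to.match(/\+([a-z0-9_-]+)@/i);
if (!match) {
  throw new Error(`No routing alias found in recipient: ${to}`);
}

// "route42" → look up the matching route record in the portal's database
return [{ json: { alias: match[1] } }];
```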
by Jimleuk
This n8n template is designed to assist with and improve customer support team capacity by automating the resolution of long-lived and forgotten JIRA issues.

## How it works

- A schedule trigger runs daily to check for long-lived unresolved issues and imports them into the workflow.
- Each issue is handled as a separate subworkflow by using an Execute Workflow node. This allows parallel processing.
- A report is generated from the issue using its comment history, allowing the issue to be classified by AI, which determines the state and progress of the issue.
- If determined to be resolved, sentiment analysis is performed to track customer satisfaction. If negative, a Slack message is sent to escalate; otherwise the issue is closed automatically.
- If no response has been initiated, an AI agent will attempt to search for and resolve the issue itself, using similar resolved issues or the Notion database. If a solution is found, it is posted to the issue, which is then closed.
- If the issue is blocked and waiting for responses, a reminder message is added.

## How to use

- This template searches for JIRA issues which are older than 7 days and not in the "Done" status (see the query sketch at the end of this template). Ensure some issues meet these criteria, or adjust the search query to suit.
- Works best if you frequently have long-lived issues that need resolving.
- Ensure the Notion tool is configured so it does not read documents you didn't intend it to, i.e. private and/or internal documentation.

## Requirements

- JIRA for issue management
- OpenAI for the LLM
- Slack for notifications

## Customising this workflow

Why not try classifying issues as they are created? One use case may be quality control, such as ensuring reporting criteria are adhered to, summarising and rephrasing issues for easier reading, or adjusting priority.
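For illustration, a search like the one this template performs can be expressed with JQL against the JIRA REST API. This is a sketch only: the host, credentials, and result handling are placeholders, and your instance may need a different field set.

```javascript
// Find issues created more than 7 days ago that aren't in a Done status.
const jql = 'created <= -7d AND statusCategory != Done ORDER BY created ASC';

const res = await fetch(
  `https://your-domain.atlassian.net/rest/api/2/search?jql=${encodeURIComponent(jql)}&maxResults=50`,
  {
    headers: {
      Authorization: `Basic ${btoa('email@example.com:API_TOKEN')}`, // placeholder credentials
      Accept: 'application/json',
    },
  },
);
const { issues } = await res.json();
console.log(issues.map((i) => i.key)); // e.g. ["SUP-101", "SUP-87", ...]
```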
by Darryn Balanco
This workflow automates the process of gathering LinkedIn advice articles, extracting their content, and generating unique contributions for each article using an AI model. The contributions are then posted to a Slack channel and a NocoDB database for record-keeping. The workflow is triggered weekly to ensure new articles are continuously collected and responded to.

## Who is this for?

This workflow is designed for professionals, marketers, and content creators looking to boost their LinkedIn presence by regularly engaging with LinkedIn advice articles. It's especially useful for those who want to be seen as a "thought leader" or "top voice" in their niche by contributing relevant and unique advice to trending topics.

## What problem is this workflow solving?

Manually searching for relevant LinkedIn articles, reading through them, and crafting thoughtful contributions can be time-consuming. This workflow solves that by automating the process of finding new articles, extracting key content, and generating AI-powered contributions. It helps users stay consistently active on LinkedIn, contributing value to trending discussions.

## What this workflow does

1. **Triggers weekly**: The workflow is set to run every Monday at 8:00 AM.
2. **Searches Google for LinkedIn advice articles**: Uses a predefined Google search URL to find the latest LinkedIn advice articles based on the user's area of expertise.
3. **Extracts LinkedIn article links**: A Code node extracts all LinkedIn advice article links from the search results (see the sketch at the end of this template).
4. **Retrieves article content**: For each article link, the workflow retrieves the HTML content and extracts the article title, topics, and existing contributions.
5. **Generates AI-powered contributions**: The workflow sends the extracted article content to an AI model, which generates unique, helpful advice for each topic within the article.
6. **Posts to Slack & NocoDB**: The AI-generated contributions, along with the article links, are posted to a designated Slack channel and stored in a NocoDB database for future reference.

## Setup

- **Google search URL**: Update the Google search URL with the relevant LinkedIn advice query for your field (e.g., "site:linkedin.com/advice 'marketing automation'").
- **Slack integration**: Connect your Slack account and specify the Slack channel where you want the contributions to be posted.
- **NocoDB integration**: Set up your NocoDB project to store the generated contributions along with the article titles and links.

## How to customize this workflow

- **Change search terms**: Modify the Google search URL to focus on a different LinkedIn topic or expertise area.
- **Adjust trigger frequency**: The workflow is set to run weekly, but you can adjust the frequency by changing the schedule trigger.
- **Enhance contribution quality**: Customize the AI model's prompt to generate contributions that align with your brand voice or content strategy.

## Workflow summary

This workflow helps users maintain a consistent presence on LinkedIn by automating the discovery of new advice articles and generating unique contributions using AI. It is ideal for professionals who want to engage with LinkedIn content regularly without spending too much time manually searching and drafting responses.
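Here is a minimal sketch of the link-extraction Code node described in step 3. The input property name (`data`) and the exact URL shapes Google returns are assumptions; adjust the regex to the page you actually fetch.

```javascript
// n8n Code node: pull linkedin.com/advice links out of fetched search HTML.
const html = $input.first().json.data || '';

const links = [...html.matchAll(/https:\/\/www\.linkedin\.com\/advice\/[^\s"'&]+/g)]
  .map((m) => m[0]);

// De-duplicate before fanning out one item per article.
return [...new Set(links)].map((url) => ({ json: { url } }));
```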
by Harsh Maniya
# 🤖 Universal E-Commerce AI Assistant (Shopify, WooCommerce & RAG)

This powerful n8n workflow deploys a sophisticated, multi-talented AI chatbot designed to streamline your e-commerce and customer support operations. The AI assistant can intelligently understand user queries and route them to the correct specialized agent, whether it's for Shopify, WooCommerce, or general knowledge questions answered by a Retrieval-Augmented Generation (RAG) system.

This template automates responses to a wide range of inquiries, from checking Shopify order statuses with GraphQL to fetching product lists from WooCommerce, and even answering general questions by looking up information in a Pinecone vector database.

## How It Works ⚙️

The workflow operates in a series of logical steps, starting from the moment a user sends a message.

1. 💬 **Chat Trigger**: The workflow activates when a user sends a message in the n8n chat interface. It captures the user's input and a unique session ID to track the conversation.
2. 🧠 **Intelligent Routing**: The user's query is first sent to a Router Agent powered by GPT-4o-mini. This agent's sole purpose is to classify the intent of the message and output one of three keywords: SHOPIFY, WOOCOMMERCE, or None of them.
3. 🔀 **Conditional Branching**: Based on the Router's output, a series of IF nodes direct the conversation down one of three paths: the General Queries path, the Shopify path, or the WooCommerce path.
4. 📚 **General Queries (RAG)**: If the query is not about e-commerce, it's handled by a RAG agent.
   - Embedding: The user's question is converted into a vector embedding using AWS Bedrock.
   - Retrieval: The workflow searches a Pinecone Vector Store to find the most relevant information from your knowledge base.
   - Generation: A GPT-4o-mini agent receives the context from Pinecone and generates a comprehensive, helpful answer.
5. 🛍️ **E-Commerce Specialists**: If the query is about Shopify or WooCommerce, it's passed to a dedicated agent.
   - Shopify Agent: This agent uses Google Gemini and has a suite of tools to manage Shopify tasks. It can Get Order info, Fetch All Products, or run complex queries using the powerful GraphQL tool.
   - WooCommerce Agent: This agent also uses Google Gemini and is equipped with tools to Fetch Order Details and Fetch All Products from a WooCommerce store.
6. 🗣️ **Conversation Memory**: Each agent (Router, General, Shopify, WooCommerce) is connected to its own Memory node. This allows the chatbot to remember previous parts of the conversation for a more natural and context-aware interaction.
7. 🏁 **Merge & Respond**: All three paths converge at a final Merge node. This ensures that no matter which agent handled the request, the final answer is streamlined into a single output and sent back to the user in the chat.

## Nodes Used 🔗

**Triggers:**

- Chat Trigger: Starts the workflow when a chat message is received.

**AI & Agents:**

- AI Agent: Four separate agents for Routing, Shopify, WooCommerce, and General Queries.
- OpenAI Chat Model: Uses GPT-4o-mini for the Router and General Queries agents.
- Google Gemini Chat Model: Uses Google Gemini for the Shopify and WooCommerce agents.

**Tools & Data:**

- Shopify Tool: To get products and order information from Shopify.
- WooCommerce Tool: To get products and order information from WooCommerce.
- GraphQL Tool: For advanced, custom queries to the Shopify API.
- Pinecone Vector Store: To retrieve context for the RAG agent.
- AWS Bedrock Embeddings: To create vector embeddings for Pinecone.

**Logic & Memory:**

- IF Node: To conditionally route the workflow.
- Merge Node: To consolidate the different branches before ending.
- Window Buffer Memory: Four nodes to provide conversational memory to each agent.

## Setup Guide 🛠️

To use this workflow, you'll need to configure several nodes with your own credentials and settings.

### 1. AI Model Credentials

- **OpenAI**: Create an API key in your OpenAI Platform dashboard. Add this credential to the Router Model and GPT-4o-mini nodes.
- **Google Gemini**: Create an API key in your Google AI Studio dashboard. Add this credential to the Shopify Chat Model and WooCommerce Chat Model nodes.

### 2. E-Commerce Platform Credentials

- **Shopify**: You will need a Shopify Access Token. Follow the n8n documentation to generate one. Add the credential to the Fetch All Products and Get Order info nodes.
- **WooCommerce**: Create API credentials from your WordPress dashboard. Add the credential to the Fetch All Products2 and Fetch Order Details nodes.

### 3. RAG System Credentials (Pinecone & AWS)

- **Pinecone**: Sign up for a Pinecone account and create an API key. Add your Pinecone credentials in n8n. In the Pinecone Vector Store node, set the pineconeIndex to the name of your index. You must have a pre-existing index with data for the RAG to work.
- **AWS**: Create an AWS account and an IAM user with programmatic access to Amazon Bedrock. Add your AWS credentials in n8n. Select your AWS credentials in the AWS Bedrock Embeddings node.

### 4. GraphQL Node Configuration

In the GraphQL node, you must update the endpoint URL. Replace the placeholder https://{subdomain}.myshopify.com/admin/api/2025-04/graphql.json with your own Shopify store's GraphQL API endpoint.
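As a rough sketch of the kind of query the GraphQL tool can run against the endpoint configured above: the store domain and token below are placeholders, and the exact fields you need may differ from this example.

```javascript
// Look up one order's fulfillment status via the Shopify Admin GraphQL API.
const query = `
  query ($q: String!) {
    orders(first: 1, query: $q) {
      edges { node { name displayFulfillmentStatus createdAt } }
    }
  }`;

const res = await fetch(
  'https://your-store.myshopify.com/admin/api/2025-04/graphql.json',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Shopify-Access-Token': 'shpat_xxx', // placeholder token
    },
    body: JSON.stringify({ query, variables: { q: 'name:#1001' } }),
  },
);
console.log((await res.json()).data.orders.edges);
```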
by Solomon
This n8n workflow automates lead extraction from Google Maps, enriches the data with AI, and stores the results for cold outreach. It uses the Bright Data community node and the Bright Data MCP for scraping and AI message generation.

## How it works

1. **Form submission**: The user provides a Google Maps starting location, keyword, and country.
2. **Bright Data scraping**: The Bright Data community node triggers a Maps scraping job, monitors progress, and downloads the results.
3. **AI message generation**: Uses the Bright Data MCP with LLMs to create a personalized cold-call script and talking points for each lead.
4. **Database storage**: Enriched leads and scripts are upserted to Supabase (see the sketch at the end of this template).

## How to use

Set up all the credentials, create your Postgres table, and submit the form. The rest happens automatically.

## Requirements

- An LLM account (OpenAI, Gemini…) for API usage.
- A Bright Data account for API and MCP usage.
- A Supabase account (or another Postgres database) to store the information.
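For illustration, the final upsert step might look like the following, using the supabase-js client outside n8n for clarity. The `leads` table and its column names are hypothetical; match them to the Postgres table you created.

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://xyz.supabase.co', 'SERVICE_ROLE_KEY');

const { error } = await supabase
  .from('leads')
  .upsert(
    {
      place_id: 'ChIJ-example',        // stable key from the Maps scrape
      name: 'Example Bakery',
      phone: '+1 555 0100',
      cold_call_script: '…generated by the LLM…',
    },
    { onConflict: 'place_id' },        // update the row if it already exists
  );
if (error) throw error;
```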
by Alex Kim
# Printify Automation - Update Title and Description Workflow

This n8n workflow automates the process of retrieving products from Printify, generating optimized product titles and descriptions, and updating them back to the platform. It leverages OpenAI for content generation and integrates with Google Sheets for tracking and managing updates.

## Features

- **Integration with Printify**: Fetch shops and products through Printify's API (see the sketch at the end of this template).
- **AI-powered optimization**: Generate engaging product titles and descriptions using OpenAI's GPT model.
- **Google Sheets tracking**: Log and manage updates in Google Sheets.
- **Custom brand guidelines**: Ensure a consistent tone by incorporating brand-specific instructions.
- **Loop processing**: Iteratively process each product in batches.

## Workflow Structure

### Nodes Overview

1. **Manual Trigger**: Manually start the workflow for testing purposes.
2. **Printify - Get Shops**: Retrieves the list of shops from Printify.
3. **Printify - Get Products**: Fetches product details for each shop.
4. **Split Out**: Breaks down the product list into individual items for processing.
5. **Loop Over Items**: Iteratively processes products in manageable batches.
6. **Generate Title and Desc**: Uses OpenAI GPT to create optimized product titles and descriptions.
7. **Google Sheets Integration**:
   - Trigger: Monitors Google Sheets for changes.
   - Log Updates: Records product updates, including old and new titles/descriptions.
8. **Conditional Logic**: If nodes ensure products are ready for updates and stop processing once completed.
9. **Printify - Update Product**: Sends updated titles and descriptions back to Printify.
10. **Brand Guidelines + Custom Instructions**: Sets the brand tone and seasonal instructions.

## Setup Instructions

### Prerequisites

- **n8n instance**: Ensure n8n is installed and configured.
- **Printify API key**: Obtain an API key from your Printify account and add it to n8n under HTTP Header Auth.
- **OpenAI API key**: Obtain an API key from OpenAI and add it to n8n under OpenAI API.
- **Google Sheets integration**: Share your Google Sheets with the Google API service account and configure Google Sheets credentials in n8n.

### Workflow Configuration

1. **Set brand guidelines**: Update the Brand Guidelines + Custom Instructions node with your brand name, tone, and seasonal instructions.
2. **Batch size**: Configure the Loop Over Items node for optimal batch sizes.
3. **Google Sheets configuration**: Set the correct Google Sheets document and sheet names in the integration nodes.
4. **Run the workflow**: Start manually or configure the workflow to trigger automatically.

## Key Notes

- **Customization**: Modify the API calls to support other platforms like Printful or Vistaprint.
- **Scalability**: Use batch processing for efficient handling of large product catalogs.
- **Error handling**: Configure retries or logging for any failed nodes.

## Output Examples

### Optimized Content Example

- **Input title**: "Classic White T-Shirt"
- **Generated title**: "Stylish Classic White Tee for Everyday Wear"
- **Input description**: "Plain white T-shirt made of cotton."
- **Generated description**: "Discover comfort and style with our classic white tee, crafted from premium cotton for all-day wear. Perfect for casual outings or layering."

## Next Steps

- **Monitor updates**: Use Google Sheets to review logs of updated products.
- **Expand integration**: Add support for more Printify shops or integrate with other platforms.
- **Enhance AI prompts**: Customize prompts for different product categories or seasonal needs.

Feel free to reach out for additional guidance or troubleshooting!
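A minimal sketch of the two Printify reads this workflow performs, using Printify's public v1 REST endpoints. The bearer token is a placeholder, and the response shapes shown (an array of shops, a paginated `data` array of products) should be verified against the current API docs.

```javascript
const headers = { Authorization: 'Bearer PRINTIFY_API_TOKEN' };

// 1. List shops on the account.
const shops = await (
  await fetch('https://api.printify.com/v1/shops.json', { headers })
).json();

// 2. List products for the first shop.
const shopId = shops[0].id;
const products = await (
  await fetch(`https://api.printify.com/v1/shops/${shopId}/products.json`, { headers })
).json();

console.log(products.data.map((p) => p.title)); // titles to optimize
```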
by Alexandra Spalato
# YouTube Content Repurposing Automation

## Who's it for

This workflow is for content creators, marketers, agencies, coaches, and businesses who want to maximize their YouTube content ROI by automatically generating multiple content assets from single videos. It's especially useful for professionals who want to:

- Repurpose YouTube videos into blogs, social posts, newsletters, and tutorials without manual effort
- Scale their content production across multiple channels and platforms
- Create consistent, high-quality content derivatives while saving time and resources
- Build automated content systems that generate multiple revenue streams
- Maintain an active presence across social media, email, and blog platforms simultaneously

## What problem is this workflow solving

Content creators face significant challenges when trying to maximize their video content:

- **Time-intensive manual repurposing**: Converting one YouTube video into multiple content formats traditionally requires hours of manual writing, editing, and formatting across different platforms.
- **Inconsistent content quality**: Manual repurposing often leads to varying quality levels and missed opportunities to optimize content for specific platforms.
- **High costs for content services**: Hiring ghostwriters or content agencies to repurpose videos can cost thousands of dollars monthly.
- **Scaling bottlenecks**: Manual processes prevent creators from efficiently scaling their content across multiple channels and formats.

This workflow solves these problems by automatically extracting YouTube video transcripts, using AI to generate multiple high-quality content formats (tutorials, blog posts, social media content, newsletters), and organizing everything in Airtable for easy management and distribution.

## How it works

1. **Automated video processing**: Starts with a manual trigger and retrieves YouTube URLs from your Airtable configuration, processing only videos marked as "selected" while filtering out those marked for deletion.
2. **Intelligent transcript extraction**: Uses the Scrape Creators API to extract video transcripts, automatically cleaning and formatting the text for optimal AI processing and content generation (see the request sketch at the end of this template).
3. **Multi-format content generation**: Leverages OpenRouter models, so you can easily test different AI models and choose the one that delivers the best results for your needs:
   - Step-by-step tutorials with code snippets and technical details
   - YouTube scripts with hooks, titles, and conclusions
   - Blog posts optimized for lead generation
   - Structured summaries with key takeaways
   - LinkedIn posts with engagement triggers
   - Newsletter content for email marketing
   - Twitter/X posts for social media
4. **Smart content filtering**: Processes only the content types you've selected in Airtable, ensuring efficient resource usage and faster execution times.
5. **Automated content organization**: Matches and combines all generated content pieces by URL, then updates your Airtable with complete, ready-to-use content assets organized by type and source video.
## How to set up

### Required credentials

- **OpenRouter API key**
- **Airtable Personal Access Token**
- **Scrape Creators API key** - For YouTube transcript extraction and processing

### Airtable base setup

Create an Airtable base with one main table, Videos, containing:

- **title** (Single line text): Video title for reference
- **url** (URL): YouTube video URL to process
- **Status** (Single select): Options: "selected", "delete", "processed"
- **output** (Multiple select): Content types to generate - summary, tutorial, blog-post, linkedin, newsletter, tweeter, youtube
- **summary** (Long text): Generated video summary
- **tutorial** (Long text): Generated step-by-step tutorial
- **key_take_aways** (Long text): Extracted key insights
- **blog_post** (Long text): Generated blog post content
- **linkedin** (Long text): LinkedIn post content
- **newsletter** (Long text): Email newsletter content
- **tweeter** (Long text): Twitter/X post content
- **youtube_titles** (Long text): YouTube video title suggestions
- **youtube_hook** (Long text): Video opening hooks
- **youtube_steps** (Long text): Video step breakdowns
- **youtube_conclusion** (Long text): Video endings/CTAs

### API configuration

**Scrape Creators setup:**

1. Sign up for the Scrape Creators API.
2. Obtain your API key from the dashboard.
3. Configure the HTTP Request node with your credentials.
4. Set the endpoint to: https://api.scrapecreators.com/v1/youtube/video/transcript

**OpenRouter setup:**

1. Create an OpenRouter account and generate an API key.

### Workflow configuration

1. Import the workflow JSON into your n8n instance.
2. Update all credential references with your API keys.
3. Configure the Airtable nodes with your base and table IDs.
4. Test the workflow with a single video URL first.

## Requirements

- **n8n instance** (self-hosted or cloud)
- **Active API subscriptions** for OpenRouter (or the LLM of your choice), Airtable, and Scrape Creators
- **YouTube video URLs** - Must be publicly accessible videos with available transcripts
- **Airtable account** - Free tier sufficient for most use cases

## How to customize the workflow

### Modify content generation prompts

Edit the LLM Chain nodes to customize content style and format:

- **Tutorial node**: Adjust technical depth and formatting preferences
- **Blog post node**: Modify tone, length, and CTA strategies
- **LinkedIn node**: Customize engagement hooks and professional tone
- **Newsletter node**: Tailor subject lines and email marketing approach

### Adjust AI model selection

Update the OpenRouter Chat Model to use different models.

### Add new content formats

Create additional LLM Chain nodes for new content types:

- Instagram captions
- TikTok scripts
- Podcast descriptions
- Course outlines
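For orientation, the transcript request made by the HTTP Request node might look like the following. Only the endpoint URL comes from this template; the `x-api-key` header name and `url` query parameter are assumptions based on typical Scrape Creators usage, so check their docs before relying on them.

```javascript
const videoUrl = 'https://www.youtube.com/watch?v=dQw4w9WgXcQ';

const res = await fetch(
  `https://api.scrapecreators.com/v1/youtube/video/transcript?url=${encodeURIComponent(videoUrl)}`,
  { headers: { 'x-api-key': 'SCRAPE_CREATORS_API_KEY' } }, // assumed auth header
);
const data = await res.json();
// Downstream nodes clean this text before handing it to the LLM chains.
console.log(data);
```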
by Mark Shcherbakov
## Video Guide

I prepared a detailed video guide to help you set up your workflow effectively, enabling you to extract insights from YouTube for content generation using an AI agent.

## Who is this for?

This workflow is ideal for content creators, marketers, and analysts looking to enhance their YouTube strategies through data-driven insights. It's particularly beneficial for individuals wanting to understand audience preferences and improve their video content.

## What problem does this workflow solve?

Navigating the content generation and optimization process can be complex, especially without significant audience insight. This workflow automates insight extraction from YouTube videos and comments, empowering users to create more engaging and relevant content effectively.

## What this workflow does

The workflow integrates various APIs to gather insights from YouTube videos, enabling automated comment analysis, video transcription, and thumbnail evaluation. The main functionalities include:

- Extracting user preferences from comments.
- Transcribing video content for enhanced understanding.
- Analyzing thumbnails via AI for maximum viewer engagement insights.

Key features:

- **AI insights extraction**: Automatically pulls comments and metrics from selected YouTube creators to evaluate trends and gaps.
- **Dynamic video planning**: Uses transcriptions to help creators outline video scripts and topics based on audience interest.
- **Thumbnail assessment**: Provides analysis of thumbnail designs to improve click-through rates and viewer attraction.

## Setup

1. **API setup**: Create a Google Cloud project and enable the YouTube Data API. Generate an API key to be included in your workflow requests.
2. **YouTube creator and video selection**: Start by defining a request to identify top creators based on their video views. Capture the YouTube video IDs for further analysis of comments and other video metrics.
3. **Comment analysis**: Gather comments associated with the selected videos and analyze them for user insights (see the sketch at the end of this template).
4. **Video transcription**: Utilize the insights from transcriptions to formulate content plans.
5. **Thumbnail analysis**: Evaluate your video thumbnails by submitting the URL through the OpenAI API to gain insights into their effectiveness.
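As a sketch of the comment-gathering step, here is a call against the YouTube Data API v3 `commentThreads` endpoint. The API key and video ID are placeholders.

```javascript
const params = new URLSearchParams({
  part: 'snippet',
  videoId: 'VIDEO_ID',
  maxResults: '100',
  order: 'relevance',
  key: 'YOUTUBE_API_KEY',
});

const res = await fetch(`https://www.googleapis.com/youtube/v3/commentThreads?${params}`);
const { items } = await res.json();

// Flatten to plain text for the AI analysis step.
const comments = items.map((i) => i.snippet.topLevelComment.snippet.textDisplay);
console.log(comments.length, 'comments fetched');
```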
by Hojjat Jashnniloofar
## Overview

This n8n template helps you automatically search LinkedIn jobs. It uses AI (Gemini or OpenAI) to match your resume with each job description, write a sample cover letter for each job, and update the jobs Google Sheet. You can receive daily matched LinkedIn job alerts via Telegram.

## Prerequisites

- AI API key from one model, such as Google Gemini or OpenAI
- Telegram Bot Token - Create via @BotFather
- Google Sheets - OAuth2 credentials
- Google Drive - OAuth2 credentials

## Setup

### 1. Upload your resume

Upload your CV in PDF format to Google Drive and configure the Google Drive node to read your resume from the list of Google Drive files. You need to configure Google Drive OAuth2 and grant access to your drive first. You can find useful information about how to configure a Google OAuth2 API key in the n8n documentation.

### 2. Create the Google Sheet

You need to create a Google Sheet document consisting of two sheets: one sheet to define the job filter criteria, and a second sheet to store the job search results. You can download this Google Sheet Template and copy it to your personal space. Then you can add your job filters in the Google Sheet. You can search jobs by keywords, location, remote type, job type, and easy apply (see the URL sketch at the end of this template). You need to configure Google Sheets OAuth2 and grant access to your drive first.

### 3. Configure the Telegram Bot

You need to create a new Telegram bot via @BotFather, insert the API key in the Telegram node, and set the TELEGRAM_CHAT_ID to your Telegram ID.
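To illustrate how filter criteria from the sheet could map to a LinkedIn search, here is a small sketch. The column names are hypothetical, and LinkedIn's query parameters (`f_WT` for remote type, `f_AL` for easy apply) are undocumented and may change.

```javascript
// Turn one filter row from the sheet into a LinkedIn job-search URL.
const row = { keywords: 'automation engineer', location: 'Berlin', remote: true, easyApply: true };

const params = new URLSearchParams({
  keywords: row.keywords,
  location: row.location,
});
if (row.remote) params.set('f_WT', '2');    // 2 = remote (assumed mapping)
if (row.easyApply) params.set('f_AL', 'true');

console.log(`https://www.linkedin.com/jobs/search/?${params}`);
```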
by n8n Team
This workflow sends an OpenAI GPT reply when an email is received from specific email recipients. It then saves the initial email and the GPT response to an automatically generated Google spreadsheet. Subsequent GPT responses will be added to the same spreadsheet. Additionally, when feedback is given for any of the GPT responses, it will be recorded to the spreadsheet, which can then be used later to fine-tune the GPT model.

## Prerequisites

- OpenAI credentials
- Google credentials

## How it works

This workflow is essentially a two-in-one workflow. It triggers off two different nodes and has very different functionality for each trigger.

The flow triggered from the On email received node is as follows:

1. Triggers off on the On email received node.
2. Extracts the email body from the email.
3. Generates a response from the email body using the OpenAI node.
4. Replies to the email sender using the Send reply to recipient node. A feedback link is also included in the email body, which will trigger the On feedback given node. This is used to fine-tune the GPT model (see the link-building sketch at the end of this template).
5. Saves the email body and OpenAI response to a Google Sheet. If a sheet does not exist, it will be created.

The flow triggered from the On feedback given node is as follows:

1. Triggers off when a feedback link is clicked in the emailed GPT response.
2. The feedback, either positive or negative, for that specific GPT response is then recorded to the Google Sheet.
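For illustration, the feedback links embedded in the reply could be assembled like this. The webhook path and query parameter names here are assumptions; they must match whatever the On feedback given webhook node actually expects.

```javascript
const base = 'https://your-n8n.example.com/webhook/feedback'; // placeholder webhook URL
const responseId = 'row-42'; // identifies the GPT response in the sheet

const approve = `${base}?id=${responseId}&feedback=positive`;
const reject = `${base}?id=${responseId}&feedback=negative`;

// Append this footer to the generated reply before sending it.
const footer = `\n\n---\nWas this reply helpful? Yes: ${approve} | No: ${reject}`;
```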
by Mario
## Purpose

This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. All tasks can be managed in Todoist, and additionally Fantastical can be used to manage scheduled tasks and events all together.

## Demo & Explanation

## How it works

- The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist.
- To avoid overwrites by late-arriving webhook calls, the current task is retrieved from both sides every time.
- Redis is used to prevent endless loops, since an update in one system triggers another webhook call in turn. Using the ID of the task, the trigger is locked down for 15 seconds (see the lock sketch at the end of this template).
- Depending on the detected changes, the other side is updated accordingly.
- Generally, Notion is treated as the main source. Using an "Obsolete" status, it is guaranteed that tasks never get deleted entirely by accident.
- The Todoist ID is stored in the Notion task, so the two stay linked together.
- An additional full-sync workflow runs daily and fixes inconsistencies, if any occurred, since webhooks cannot be trusted entirely.
- Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook.
- Another tiny workflow helps generate a global config, which is used by all workflows for mapping purposes.

### Mapping (Notion >> Todoist)

- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

### Current limitations

- Changes to the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
- Subtasks are not linked automatically to their parent yet.
- Recurring tasks are not supported yet.
- Task names do not support URLs yet.

## Prerequisites

### Notion

A database must already exist (get a basic template here) with the following properties (case matters!):

- Text: "Name"
- Status: "Status", containing at least the options "Backlog", "In progress", "Done", "Obsolete"
- Select: "Priority", containing the options "do first", "urgent", "important"
- Date: "Due"
- Checkbox: "Focus"
- Text: "Todoist ID"

### Todoist

A project must already exist with the same sections as defined in the Status property in Notion (except Done and Obsolete).

### Redis

Create a free Redis Cloud instance or self-host.

## Setup

The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following steps:

1. Set up credentials for Notion (access token), Todoist (access token), and Redis - you can also create empty credentials and populate them later during further setup.
2. Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance - otherwise you would need to map the credentials of many nodes.
3. Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

## How to use

You can apply changes (create, update, delete) to tasks both in Notion and Todoist, which then get synced over within a couple of seconds (this is handled by the differential realtime sync).

The daily running full sync resolves possible discrepancies in Todoist and sends a summary via email if anything needed to be updated. In case that contains an unintended change, you can jump to the task from the email directly to fix it manually.
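A minimal sketch of the per-task webhook lock described above, shown with the node-redis client for clarity (the template itself uses n8n's Redis node). The key name is illustrative; the 15-second TTL mirrors the lock in the description.

```javascript
import { createClient } from 'redis';

const redis = createClient({ url: 'redis://localhost:6379' });
await redis.connect();

async function shouldProcess(taskId) {
  // SET ... NX EX 15 only succeeds if no lock exists yet, so the echo
  // webhook fired by our own update is ignored for 15 seconds.
  const acquired = await redis.set(`sync-lock:${taskId}`, '1', { NX: true, EX: 15 });
  return acquired === 'OK';
}

if (await shouldProcess('notion-task-abc123')) {
  // ...update the other system here...
}
```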