by Zacharia Kimotho
This workflow makes it easy to keep track of the stock market and receive a daily email summarizing the highlights: what happened, key insights, and trends.

**Setup Guide**

- **Schedule Trigger:** Define the schedule (days, times, intervals).
- **Stock data:** Replace the sample stock data with your desired stock list (ticker, name, etc.) in JSON format.
- **Split Out:** Splits out the fields to produce a clean list of the stocks to monitor.
- **set keyword node:** Extracts the stock ticker from each item and sets it on the `keyword` property.
- **Financial times scraper:** Triggers the Bright Data Datasets API to scrape financial data (a configuration sketch appears after this guide). Set the node as below:
  - Method: POST
  - URL: `https://api.brightdata.com/datasets/v3/trigger`
  - Query Parameters: `dataset_id` (replace with your Bright Data dataset ID), `include_errors: true`, `type: discover_new`, `discover_by: keyword`
  - Headers: `Authorization: Bearer YOUR_BRIGHTDATA_API_KEY` (replace with your Bright Data API key)
  - Body (JSON): `={{ $('set keyword').all().map(item => item.json)}}`
  - Execute Once: checked
- **Get progress node:** Checks whether the Bright Data scraping job is complete or still running.
  - URL: `https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}`
  - Headers: `Authorization: Bearer YOUR_BRIGHTDATA_API_KEY` (replace with your Bright Data API key)
- **Get snapshot + data:** Retrieves the scraped data from the Bright Data API.
  - URL: `https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}`
  - Query Parameters: `format: json`
  - Headers: `Authorization: Bearer YOUR_BRIGHTDATA_API_KEY` (replace with your Bright Data API key)
- **Aggregate:** Combines the data from each stock item into a single object.
- **Update to sheet:** Adds all items to the sheet. Make a copy of the template sheet before mapping the data.
- **create summary node:** Generates a summary of the scraped stock data using the Google Gemini AI model and notifies you via Gmail.
  - Prompt Type: define
  - Text: Customize the prompt to define the AI's role, input format, tasks, output format (HTML email), and constraints.
- **Google Sheets:** Appends the scraped data to a Google Sheet. Set this node to auto-map so it adjusts to the fields returned by the request.

**Important Notes**

- Remember to replace placeholder values (API keys, dataset IDs, email addresses, Google Sheet IDs) with your actual values.
- Review and customize the AI prompt in the "create summary" node to achieve the desired email summary output.
- Consider adding error handling for a more robust workflow.
- Monitor API usage to avoid rate limits.
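For orientation, here is a minimal sketch of the "Financial times scraper" request written as a plain JavaScript object. The property names (`qs`, `body`) are illustrative shorthand, not the exact n8n HTTP Request node parameter names:

```js
// Sketch of the Bright Data trigger request (property names are illustrative,
// not the exact n8n HTTP Request node parameters).
const triggerRequest = {
  method: 'POST',
  url: 'https://api.brightdata.com/datasets/v3/trigger',
  qs: {
    dataset_id: 'YOUR_DATASET_ID', // replace with your Bright Data dataset ID
    include_errors: 'true',
    type: 'discover_new',
    discover_by: 'keyword',
  },
  headers: { Authorization: 'Bearer YOUR_BRIGHTDATA_API_KEY' },
  // n8n expression: sends every keyword item from the "set keyword" node
  body: "={{ $('set keyword').all().map(item => item.json) }}",
};
```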
by Leonard
**Open Deep Research - AI-Powered Autonomous Research Workflow**

**Description**
This workflow automates deep research by leveraging AI-driven search queries, web scraping, content analysis, and structured reporting. It enables autonomous research with iterative refinement, allowing users to collect, analyze, and summarize high-quality information efficiently.

**How it works**
- 🔹 **User Input:** The user submits a research topic via a chat message.
- 🧠 **AI Query Generation:** A Basic LLM node generates up to four refined search queries to retrieve relevant information.
- 🔎 **SerpAPI Google Search:** The workflow loops through each generated query and retrieves top search results using the SerpAPI API.
- 📄 **Jina AI Web Scraping:** Extracts and summarizes webpage content from the URLs obtained via SerpAPI.
- 📊 **AI-Powered Content Evaluation:** An AI Agent evaluates the relevance and credibility of the extracted content.
- 🔁 **Iterative Search Refinement:** If the AI finds insufficient or low-quality information, it generates new search queries to improve results.
- 📜 **Final Report Generation:** The AI compiles a structured markdown report, including sources with citations.

**Set Up Instructions**
🚀 Estimated setup time: ~10-15 minutes

✅ **Required API Keys:**
- SerpAPI → for Google Search results
- Jina AI → for text extraction
- OpenRouter → for AI-driven query generation and summarization

⚙️ **n8n Components Used:**
- AI Agents with memory buffering for iterative research
- Loops to process multiple search queries efficiently
- HTTP Requests for direct API interactions with SerpAPI and Jina AI (see the sketch after this section)

📝 **Recommended Enhancements:**
- Add sticky notes in n8n to explain each step for new users
- Implement Google Drive or Notion integration to save reports automatically

🎯 **Ideal for:**
- ✔️ Researchers & Analysts - automate background research
- ✔️ Journalists - quickly gather reliable sources
- ✔️ Developers - learn how to integrate multiple AI APIs into n8n
- ✔️ Students - speed up literature reviews

🔗 Completely free and open-source! 🚀
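As a reference for the two HTTP Request nodes, here is a minimal sketch of the SerpAPI and Jina AI calls as plain JavaScript objects. The object shapes are illustrative, not the exact n8n node parameters; the endpoints follow the public SerpAPI and Jina Reader documentation:

```js
// Sketch of the two external calls (illustrative shapes, not exact n8n parameters).
const serpApiSearch = {
  method: 'GET',
  url: 'https://serpapi.com/search.json',
  qs: {
    engine: 'google',
    q: 'your generated search query',
    api_key: 'YOUR_SERPAPI_KEY',
  },
};

// Jina Reader: prefix any URL with https://r.jina.ai/ to get clean, LLM-ready text.
const jinaScrape = {
  method: 'GET',
  url: 'https://r.jina.ai/https://example.com/article-from-serp-results',
  headers: { Authorization: 'Bearer YOUR_JINA_API_KEY' }, // optional, raises rate limits
};
```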
by Jaruphat J.
⚠️ **Important Disclaimer:** This template is only compatible with a self-hosted n8n instance using a community node.

**Who is this for?**
This workflow is ideal for digital content creators, marketers, social media managers, and automation enthusiasts who want to produce fully automated vertical video content featuring inspirational or motivational quotes. Specifically tailored for the Thai language, it effectively demonstrates integration of AI-generated imagery, video, ambient sound, and visually appealing quote overlays.

**What problem is this workflow solving?**
Manually creating high-quality, vertically formatted quote videos is often repetitive, time-consuming, and involves multiple tedious steps like selecting suitable visuals, editing audio tracks, and correctly overlaying text. Additionally, manual uploading to platforms like YouTube and maintaining accurate content records are prone to errors and inefficiencies.

**What this workflow does:**
- Fetches a quote, author, and scenic background description from a Google Sheet.
- Automatically generates a vertical background image using the Flux AI (txt2img) API.
- Transforms the AI-generated image into a subtly animated cinematic vertical video using the Kling video-generation API.
- Generates an immersive, ambient background sound using ElevenLabs' sound generation API.
- Dynamically overlays the selected Thai-language quote and author text onto the generated video using FFmpeg, ensuring visually appealing typography (e.g., Kanit font).
- Automatically uploads the final video to YouTube.
- Updates the resulting YouTube video URL back to the Google Sheet, keeping your content records current and well-organized.

**Setup Requirements:**
- A self-hosted n8n instance, as the execution of FFmpeg commands is not supported on n8n Cloud.
- FFmpeg installed on your self-hosted environment.
- API keys and accounts set up for Flux, Kling, ElevenLabs, Google Sheets, Google Drive, and YouTube.

**Google Sheets Setup:**
Your Google Sheet must include these columns:
- **Index** - unique identifier for each quote
- **Quote (Thai)** - quote text in Thai (or your chosen language)
- **Pen Name (Thai)** - author or pen name of the quote's creator
- **Background (EN)** - short English description of the scene (e.g., "sunrise over mountains")
- **Prompt (EN)** - detailed English prompt describing the image/video scene (e.g., "peaceful sunrise with misty mountains")
- **Background Image** - URL of the AI-generated image (updated automatically)
- **Background Video** - URL of the generated video (updated automatically)
- **Music Background** - URL of the generated ambient audio (updated automatically)
- **Video Status** - YouTube URL (updated automatically after upload)

A ready-to-use Google Sheets template is provided [here (provide your actual link)]. To help you get started quickly, you can use this template spreadsheet.

**Next steps:**
- Authenticate Google Sheets, Google Drive, YouTube API, Flux AI, Kling API, and ElevenLabs API within n8n.
- Ensure FFmpeg supports fonts compatible with your chosen language (for Thai, the "Kanit" font is recommended).
- Prepare your Google Sheet with the desired quotes, authors, and image/video prompts.

**How to customize this workflow to your needs:**
- **Fonts:** Adjust font type, size, color, and positioning within the provided FFmpeg commands in the workflow's code nodes (see the sketch after this list). Verify that selected fonts properly support your target language.
- **Media Customization:** Customize the scene descriptions in your Google Sheet to change the image/video backgrounds automatically generated by AI.
- **Quote Management:** Easily manage, add, or update quotes and associated details directly via Google Sheets without workflow modifications.
- **Audio Ambiance:** Customize or adjust the ambient sound prompt for ElevenLabs within the workflow's HTTP Request node to match your video's desired mood.

**Benefits of using AI-generated content and localized fonts:**
Leveraging AI-generated visual and audio elements along with localized fonts greatly enhances audience engagement by creating visually appealing, professional-quality content tailored specifically for your target audience. This automated workflow drastically reduces production time and manual effort, enabling rapid, consistent content creation optimized for platforms such as YouTube Shorts, Instagram Reels, and TikTok.
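For orientation, here is a minimal sketch of what the text-overlay step can look like when assembled in an n8n code node. The file paths, font location, and output name are assumptions for illustration; the `drawtext` filter options themselves are standard FFmpeg:

```js
// Sketch of an FFmpeg drawtext overlay command. Paths and the font location
// are assumptions — adjust them to your self-hosted environment.
const quote = 'ความพยายามอยู่ที่ไหน ความสำเร็จอยู่ที่นั่น';
const command = [
  'ffmpeg -y -i background.mp4 -i ambient.mp3',
  `-vf "drawtext=fontfile=/usr/share/fonts/truetype/kanit/Kanit-Regular.ttf:` +
    `text='${quote}':fontcolor=white:fontsize=64:` +
    `x=(w-text_w)/2:y=(h-text_h)/2"`, // center the quote on the vertical video
  '-shortest output.mp4', // stop when the shorter input (audio or video) ends
].join(' ');
```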
by Obsidi8n
**How it works:**
- Send notes from Obsidian via Webhook to start the audio conversion.
- OpenAI converts your text to natural-sounding audio and generates episode descriptions.
- Audio files are stored in Cloudinary and automatically attached to your notes in Obsidian.
- A professional podcast feed is generated, compatible with all major podcast platforms (Apple, Spotify, Google).

**Set up steps:**
1. Install and configure the Post Webhook Plugin in Obsidian.
2. Set up Custom Auth credentials in n8n for Cloudinary using the following JSON:

```json
{
  "name": "Cloudinary API",
  "type": "httpHeaderAuth",
  "authParameter": {
    "type": "header",
    "key": "Authorization",
    "value": "Basic {{Buffer.from('your_api_key:your_api_secret').toString('base64')}}"
  }
}
```

3. Configure the podcast feed metadata (title, author, cover image, etc.).

**Note:** The second flow is a generic Podcast Feed module that can be reused in any "[...]-to-Podcast" workflow. It generates a standard RSS feed from Google Sheets data and podcast metadata, making it compatible with all major podcast platforms.
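The feed metadata in step 3 can be kept in one place. Here is an illustrative sketch of the fields a standard podcast RSS feed typically expects; the names and values below are examples, not fields this template mandates:

```js
// Illustrative podcast metadata (example values — adjust to your show).
const feedMetadata = {
  title: 'My Obsidian Notes Podcast',
  author: 'Your Name',
  description: 'Audio versions of my Obsidian notes.',
  link: 'https://example.com/podcast',
  imageUrl: 'https://example.com/cover.jpg', // square cover art, e.g. 3000x3000 px
  language: 'en',
  explicit: false, // maps to the itunes:explicit tag in the generated feed
};
```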
by RealSimple Solutions
**Who Is This For?**
This workflow is designed for AI engineers, automation specialists, and content creators who need a scalable system to dynamically manage prompts stored in GitHub. It eliminates manual updates, enforces required variable checks, and ensures that AI interactions always receive fully processed prompts. 🚀

**What Problem Does This Solve?**
Manually managing AI prompts can be inefficient and error-prone. This workflow:
✅ Fetches dynamic prompts from GitHub
✅ Auto-populates placeholders with values from the setVars node
✅ Ensures all required variables are present before execution
✅ Processes the formatted prompt through an AI agent

🛠 **How This Workflow Works**
This workflow consists of three key branches, ensuring smooth prompt retrieval, variable validation, and AI processing.

1️⃣ **Retrieve the Prompt from GitHub** (HTTP Request → Extract from File → SetPrompt)
- The workflow starts manually or via an external trigger.
- It fetches a text-based prompt stored in a GitHub repository.
- The Extract from File node retrieves the content from the GitHub file.
- The SetPrompt node stores the prompt, making it accessible for processing.

📌 Note: The prompt must contain variables in n8n expression format (e.g., `{{ $json.company }}`) so they can be dynamically replaced.

2️⃣ **Extract & Auto-Populate Variables** (Check All Prompt Vars → Replace Variables)
- A Code node scans the prompt for placeholders in the n8n expression format (`{{ $json.variableName }}`); a sketch of this node appears after the setup instructions below.
- The workflow compares the required variables against the setVars node:
  - ✅ If all variables are present, it proceeds to variable replacement.
  - ❌ If any variables are missing, the workflow stops and returns an error listing them.
- The Replace Variables node replaces all placeholders with values from setVars.

📌 Example of a properly formatted GitHub prompt:
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.
This ensures seamless replacement when processed in n8n.

3️⃣ **AI Processing & Output** (AI Agent → Prompt Output)
- The Set Completed Prompt node stores the final, processed prompt.
- The AI Agent node (Ollama Chat Model) processes the prompt.
- The Prompt Output node returns the fully formatted response.

📌 Optional: Modify this to use OpenAI, Claude, or other AI models.

⚠️ **Error Handling: Missing Variables**
If a required variable is missing, the workflow stops execution and provides an error message:
⚠️ Missing Required Variables: ["launch_date"]
This ensures no incomplete prompts are sent to AI agents.

✅ **Example Use Case**
📜 GitHub prompt file (using n8n expressions):
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.

🔹 Variables in the setVars node:

```json
{
  "company": "PropTechPro",
  "features": "AI-powered Property Management",
  "launch_date": "March 15, 2025"
}
```

✅ Successful output:
Hello PropTechPro, your product AI-powered Property Management launches on March 15, 2025.

🚨 Error output (if launch_date is missing):
⚠️ Missing Required Variables: ["launch_date"]

🔧 **Setup Instructions**

1️⃣ **Connect Your GitHub Repository**
- Store your prompt in a public or private GitHub repo.
- The workflow will fetch the raw file using the GitHub API.

2️⃣ **Configure the SetVars Node**
- Define the required variables in the SetVars node.
- Make sure the variable names match those used in the prompt.

3️⃣ **Test & Run**
- Click Test Workflow to execute.
- If variables are missing, it will show an error.
- If everything is correct, it will output the fully formatted prompt.
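As a reference, the Check All Prompt Vars Code node can be as small as the following sketch. This is an assumed implementation for illustration; the node and field names may differ from the template's actual code:

```js
// Sketch of the placeholder check (assumed implementation, not the template's code).
const prompt = $('SetPrompt').first().json.prompt; // node/field names are assumptions
const vars = $('setVars').first().json;

// Collect every {{ $json.variableName }} placeholder in the prompt.
const required = [...prompt.matchAll(/\{\{\s*\$json\.(\w+)\s*\}\}/g)].map(m => m[1]);
const missing = required.filter(name => vars[name] === undefined);

if (missing.length > 0) {
  throw new Error(`⚠️ Missing Required Variables: ${JSON.stringify(missing)}`);
}

// Replace each placeholder with its value from setVars.
const completed = prompt.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_, name) => vars[name]);
return [{ json: { completed } }];
```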
⚡ **How to Customize This Workflow**
💡 **Need CRM or Database Integration?** Connect the setVars node to an Airtable, Google Sheets, or HubSpot API to pull variables dynamically.
💡 **Want to Modify the AI Model?** Replace the Ollama Chat Model with OpenAI, Claude, or a custom LLM endpoint.

📌 **Why Use This Workflow?**
✅ No Manual Updates Required - fetches prompts dynamically from GitHub.
✅ Prevents Broken Prompts - ensures required variables exist before execution.
✅ Works for Any Use Case - handles AI chat prompts, marketing messages, and chatbot scripts.
✅ Compatible with All n8n Deployments - works on Cloud, Self-Hosted, and Desktop versions.
by Nick Saraev
**AI Facebook Ad Spy Tool with Apify, OpenAI, Gemini & Google Sheets**

**Categories:** Competitive Intelligence, Marketing Automation, AI Analysis

This workflow creates a comprehensive Facebook ad spy tool that scrapes competitor ads from Facebook's ad library and generates detailed analysis with rewritten versions. The system processes text, image, and video ads using different AI models, providing strategic intelligence for PPC agencies and marketers. Built to be sold as a premium service for $2,000+, this tool combines web scraping, multi-modal AI analysis, and competitor intelligence into one powerful automation.

**Benefits**
- **Complete Competitive Intelligence** - analyze competitor strategies across all ad formats (text, image, video)
- **Multi-Modal AI Analysis** - uses GPT-4 Vision for images and Gemini for video content understanding
- **Automated Ad Rewriting** - generates inspired variations of successful competitor ads
- **Quality Filtering** - targets high-performing advertisers with significant page likes
- **Scalable Processing** - handles hundreds of competitor ads with detailed strategic analysis
- **Premium Service Potential** - easily sold to agencies and marketers for $2,000+ implementations

**How It Works**

*Facebook Ad Library Scraping:*
- Connects to Facebook's public ad library through Apify's specialized scraper
- Searches for active ads using customizable keywords and targeting parameters
- Extracts comprehensive ad data including creative assets, targeting info, and engagement metrics
- Filters results to focus on high-quality advertisers with substantial page followings

*Intelligent Content Routing:*
- Automatically categorizes ads into text-only, image-based, or video content types (see the routing sketch at the end of this overview)
- Routes each ad type to specialized processing pipelines optimized for that content format
- Ensures appropriate AI models are used for each type of creative analysis
- Maintains data integrity while processing different content formats simultaneously

*Advanced Video Analysis Pipeline:*
- Downloads video ads directly from Facebook's content delivery network
- Uploads videos to Google Drive for temporary storage and processing
- Initiates Gemini AI video upload sessions for multi-modal analysis
- Uses Gemini's advanced video understanding to generate detailed content descriptions
- Processes video narrative, visual elements, messaging strategy, and target audience insights

*Image and Text Processing:*
- Analyzes image ads using GPT-4 Vision for comprehensive visual content understanding
- Processes text-only ads using GPT-4 for messaging strategy and copywriting analysis
- Identifies key persuasion techniques, target demographics, and messaging frameworks
- Generates detailed competitive intelligence reports for each ad format

*Strategic Intelligence Generation:*
- Creates comprehensive summaries analyzing competitor messaging strategies and target audiences
- Generates rewritten ad copy that captures successful elements while avoiding direct copying
- Produces recreation prompts for images and videos that can be used with AI generation tools
- Organizes all insights in a structured Google Sheets database for easy analysis and reporting
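The Switch node's routing decision boils down to checking which creative assets an ad carries. A minimal sketch follows; the field names are assumptions about the Apify scraper's output shape, not its documented schema:

```js
// Sketch of the content-type routing. Field names are assumptions about the
// Apify output — inspect a sample item before relying on them.
function routeAd(ad) {
  if (Array.isArray(ad.videos) && ad.videos.length > 0) return 'video';
  if (Array.isArray(ad.images) && ad.images.length > 0) return 'image';
  return 'text'; // no creative assets found — treat as a text-only ad
}
```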
**Required Setup Configuration**

*Apify Integration:*
- Sign up for an Apify account and obtain an API key
- Replace `<your-apify-api-key-here>` in the "Run Ad Library Scraper" node
- Customize the Facebook Ad Library search URLs with your target keywords and regions

*AI Service Configuration:*
- **OpenAI API:** set up for text analysis and image understanding with GPT-4 Vision
- **Gemini API:** configure for advanced video content analysis and description; replace `<your-gemini-api-key-here>` in all Gemini-related nodes

*Google Services Setup:*
- **Google Drive:** configure OAuth for temporary video storage during Gemini processing
- **Google Sheets:** create a results database with the proper column structure for ad intelligence storage

*Facebook Ad Library Search Configuration:*
- Customize the search parameters in the Apify scraper

*Google Sheets Database Structure:*
Create a sheet with these columns (an example row follows the list):
- `ad_archive_id` - unique Facebook ad identifier
- `page_id` - advertiser's Facebook page ID
- `page_name` - advertiser's business name
- `page_url` - link to the advertiser's Facebook page
- `type` - ad format (text, image, or video)
- `date_added` - when the ad was analyzed
- `summary` - detailed competitive intelligence analysis
- `rewritten_ad_copy` - AI-generated inspired version
- `image_prompt` - description for recreating image ads
- `video_prompt` - description for recreating video ads
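For reference, a single populated row of the intelligence database might look like this; all values are invented examples:

```js
// One example row for the results sheet (all values invented for illustration).
const exampleRow = {
  ad_archive_id: '1234567890',
  page_id: '987654321',
  page_name: 'Example Fitness Co.',
  page_url: 'https://www.facebook.com/examplefitness',
  type: 'image',
  date_added: '2024-06-01',
  summary: 'Targets busy professionals with a before/after visual and urgency framing.',
  rewritten_ad_copy: 'Short on time? Get stronger in 20 minutes a day...',
  image_prompt: 'Split-frame before/after photo of a home workout, bright studio light.',
  video_prompt: '', // empty for non-video ads
};
```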
**Business Use Cases**
- **PPC Agencies** - offer comprehensive competitor analysis services to clients for strategic advantage
- **Marketing Teams** - research competitor strategies and messaging before launching new campaigns
- **E-commerce Businesses** - analyze successful ads in your industry for creative inspiration
- **SaaS Companies** - study how competitors position their products and target audiences
- **Course Creators** - research educational content marketing approaches and messaging strategies
- **Affiliate Marketers** - identify successful promotional strategies and high-converting ad formats

**Difficulty Level:** Advanced
**Estimated Build Time:** 3-4 hours
**Monthly Operating Cost:** ~$200 (Apify + OpenAI + Gemini + Google Workspace APIs)

**Watch My Complete Build Process**
Want to see exactly how I built this entire Facebook ad spy system from scratch? I walk through the complete development process live, including API integrations, multi-modal AI setup, error handling, and the exact business strategy for selling this as a premium service.
🎥 Watch My Live Build: "Build A Facebook Ads Spy Tool With N8N (Sell for $2k+)"
This comprehensive tutorial shows the real development process, including complex API orchestration, multi-modal AI integration, and proven strategies for monetizing competitive intelligence systems.

**Set Up Steps**

*Apify Scraper Configuration:*
- Set up an Apify account and configure the Facebook Ad Library scraper
- Customize search parameters for your target industries and regions
- Configure result limits and filtering parameters for quality control
- Test the scraper with sample searches to verify data quality

*Multi-Modal AI Setup:*
- Configure OpenAI API credentials for text and image analysis
- Set up Gemini API access for advanced video content understanding
- Configure appropriate rate limits and error handling for API stability
- Test AI analysis with sample ads to optimize prompt quality

*Google Services Integration:*
- Set up Google Drive OAuth for temporary video storage during processing
- Create the Google Sheets database with the proper column structure for intelligence storage
- Configure sharing permissions and access controls for team collaboration
- Test the complete data flow from scraping to final intelligence reports

*Quality Control and Filtering:*
- Configure the page-likes threshold in the "Filter For Likes" node (1,000+ recommended for quality)
- Adjust content routing logic in the Switch node based on your analysis needs
- Set up error handling and retry logic for reliable large-scale processing
- Test the complete workflow with various ad types to ensure proper routing

*Advanced Customization:*
- Customize AI prompts for your specific industry analysis needs
- Configure additional filtering criteria beyond page likes
- Set up automated scheduling for regular competitor monitoring
- Add custom fields to the database for tracking specific competitive metrics

**Advanced Features**
Scale the system with additional capabilities:
- **Industry-Specific Analysis** - customize prompts and filters for different verticals
- **Trend Tracking** - monitor messaging changes over time for strategic insights
- **Performance Correlation** - cross-reference ad engagement with business outcomes
- **Alert Systems** - notify when competitors launch new campaign types
- **Custom Reporting** - generate client-ready intelligence reports automatically
- **Integration Extensions** - connect to CRM and marketing platforms for strategic workflow

**Important Considerations**
- **API Rate Limits** - built-in delays and error handling prevent service interruptions
- **Content Rights** - the system generates inspired variations, not direct copies, for legal compliance
- **Data Storage** - organize the intelligence database for easy client reporting and analysis
- **Scalability** - batch processing handles hundreds of ads efficiently without blocking
- **Quality Assurance** - filtering logic ensures analysis focuses on successful, high-quality advertisers

**Why This System Works**
The competitive advantage lies in comprehensive multi-modal analysis:
- Complete format coverage - analyzes text, image, and video ads with appropriate AI models
- Strategic depth - goes beyond basic scraping to provide actionable intelligence
- Automation scale - processes competitor research that would take weeks manually
- Premium positioning - advanced AI analysis justifies higher service pricing
- Immediate value - clients receive actionable insights within hours of setup

**Check Out My Channel**
For more advanced automation systems that generate real business results and premium service opportunities, explore my YouTube channel where I share proven strategies for building profitable automation businesses.
by Julian Kaiser
This automated workflow scrapes and processes the monthly "Who is Hiring" thread from Hacker News, transforming raw job listings into structured data for analysis or integration with other systems. Perfect for job seekers, recruiters, or anyone looking to monitor tech job market trends.

**How it works**
- Automatically fetches the latest "Who is Hiring" thread from Hacker News
- Extracts and cleans relevant job posting data using the HN API
- Splits and processes individual job listings into a structured format
- Parses key information like location, role, requirements, and company details
- Outputs clean, structured data ready for analysis or export

**Set up steps**
- Configure API access to [Hacker News](https://github.com/HackerNews/API) (no authentication required)
- Follow the steps to get your cURL command from https://hn.algolia.com/ (a request sketch appears after this section)
- Set up your desired output format (JSON structured data or a custom format)
- Optional: configure additional parsing rules for specific job listing information
- Optional: set up integration with your preferred storage or analysis tools

The workflow transforms unstructured job listings into clean, structured data following this pattern:
- **Input:** raw HN thread comments
- **Process:** extract, clean, and parse text
- **Output:** structured job listing data

This template saves hours of manual work collecting and organizing job listings, making it easier to track and analyze tech job opportunities from Hacker News's popular monthly hiring threads.
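As a reference, the Algolia-powered HN search API can locate the latest thread and then return it with all comments. A minimal sketch, with query parameters following the public Algolia HN API docs and an illustrative object shape:

```js
// Find the most recent "Who is hiring?" thread via the Algolia HN search API.
const findThread = {
  method: 'GET',
  url: 'https://hn.algolia.com/api/v1/search_by_date',
  qs: {
    query: 'Ask HN: Who is hiring?',
    tags: 'story,author_whoishiring', // the official monthly threads use this account
    hitsPerPage: 1,
  },
};

// Then fetch the full thread (story plus nested comments) by its objectID:
// GET https://hn.algolia.com/api/v1/items/{objectID}
```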
by Franz
🕸️ **Dynamic Website Change Monitor with Smart Email Alerts**

Never miss important website updates again! This workflow automatically tracks changes on dynamic websites (think React apps, JavaScript-heavy sites) and sends you instant email notifications when something changes. Perfect for keeping tabs on competitors, monitoring product updates, or staying on top of important announcements.

✨ **What makes this special?**
- 🚀 **Handles Dynamic Websites:** uses the Firecrawl API to scrape JavaScript-rendered content that basic scrapers can't touch
- 📧 **Smart Email Alerts:** only sends notifications when content actually changes (no spam!)
- 📊 **Historical Tracking:** keeps a complete log of all changes in Google Sheets
- 🛡️ **Bulletproof:** continues working even if one part fails
- ⚡ **Ready to Deploy:** webhook-triggered, perfect for cron jobs or external schedulers

🎯 **Perfect for monitoring:**
- Competitor pricing pages
- Job board postings
- Product availability updates
- News sites for breaking stories
- API documentation changes
- Terms of service updates

🛠️ **What you'll need to get started:**

*API Accounts & Keys:*
1. **Firecrawl Account** 🔥
   - Sign up at firecrawl.dev
   - Grab your API key from the dashboard
   - Create a "Bearer Auth" credential in n8n
2. **Google Cloud Setup** ☁️
   - Enable the Google Sheets API
   - Enable the Gmail API
   - Set up OAuth2 credentials
   - Add both as credentials in n8n
3. **Google Sheets Document** 📋
   - Create a new spreadsheet
   - Add two tabs: "Log" and "comparison"
   - Follow the structure outlined in the workflow notes

🚀 **How it works:**
1. Webhook receives trigger → starts the monitoring process
2. Firecrawl scrapes the website → gets fresh content, even JavaScript-rendered (see the request sketch after this section)
3. Smart comparison → checks against previously stored content
4. Change detected? → if yes, send email + log everything
5. Update storage → prepares for the next monitoring cycle

⚙️ **Setup Steps:**
1. Import this workflow into your n8n instance
2. Configure credentials for Firecrawl, Google Sheets, and Gmail
3. Update the target URL in the Firecrawl node
4. Set your email address in the Gmail node
5. Create your Google Sheets with the required structure
6. Test it manually first, then activate!

🎨 **Customize it your way:**
- **Target any website** by updating the URL
- **Change email templates** to match your style
- **Adjust monitoring frequency** with external cron jobs
- **Switch between markdown/HTML** extraction formats
- **Fine-tune change detection** sensitivity

🔧 **Troubleshooting:**
- **Firecrawl errors?** Check your API key and rate limits
- **Google Sheets issues?** Verify OAuth permissions and sheet structure
- **Email not sending?** Check Gmail API quotas and spam folders
- **Webhook problems?** Make sure the workflow is activated

Ready to never miss another website change? Let's get this automation running! 🎉
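The scrape itself is a single authenticated POST. A minimal sketch, with the endpoint and fields following Firecrawl's public v1 API at the time of writing; the object shape is illustrative, not the exact n8n node parameters:

```js
// Sketch of the Firecrawl scrape request (illustrative shape, not exact n8n config).
const firecrawlScrape = {
  method: 'POST',
  url: 'https://api.firecrawl.dev/v1/scrape',
  headers: {
    Authorization: 'Bearer YOUR_FIRECRAWL_API_KEY',
    'Content-Type': 'application/json',
  },
  body: {
    url: 'https://example.com/pricing', // the page you want to monitor
    formats: ['markdown'],              // or ['html'] for HTML extraction
  },
};
// The returned markdown is what gets compared against the previously
// stored version in the "comparison" sheet tab.
```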
by Ranjan Dailata
**Who is this for?**
This workflow is designed for professionals and teams who need real-time, structured insights from Perplexity Search results without manual effort.

**What problem is this workflow solving?**
This n8n workflow solves the problem of automating Perplexity Search result extraction, cleanup, summarization, and AI-enhanced formatting for downstream use, such as sending the results to a webhook or another system.

**What this workflow does**
- **Automates Perplexity Search via Bright Data:** uses Bright Data's proxy-based SERP API to run a Google Search query programmatically, making the process repeatable and scriptable with different search terms and regions/zones.
- **Cleans and Extracts Useful Content:** the Readable Data Extractor uses LLM-based cleaning to remove HTML/CSS/JS from the response and extract pure text, converting messy, unstructured web content into a structured, machine-readable format.
- **Summarizes Search Results:** through the Gemini Flash + Summarization Chain, it generates a concise summary of the search results. Ideal for users who don't have time to read full pages of results.
- **Formats Data Using an AI Agent:** the AI Agent acts like a virtual assistant that understands the search results, formats them in a readable, JSON-compatible form, and prepares them for webhook delivery.
- **Delivers Results to a Webhook:** sends the final summary + structured search results to a webhook (your app, a Slack bot, Google Sheets, or a CRM).

**Setup**
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token. (A request sketch appears after this section.)
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Perplexity Search Request node with the prompt you wish to search for.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

**How to customize this workflow to your needs**

1. **Change the Perplexity Search input**
   - Default: searches a fixed query or dataset.
   - Customize: accept input from a Google Sheet, Airtable, or a form; auto-trigger searches based on keywords or schedules.
2. **Customize the summarization style (LLM output)**
   - Default: general summary using Google Gemini or OpenAI.
   - Customize: add a tone (formal, casual, technical, executive summary, etc.); focus on specific sections (pricing, competitors, FAQs, etc.); translate the summaries into multiple languages; add bullet points, pros/cons, or insight tags.
3. **Choose where the results go**
   - Options: email, Slack, Notion, Airtable, Google Docs, or a dashboard.
   - Auto-create content drafts for WordPress or newsletters.
   - Feed into CRM notes or attach to Salesforce leads.
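The Header Auth credential from step 3 feeds a request like the following. A minimal sketch: the zone name and target URL are example values, and the endpoint and fields follow Bright Data's public Web Unlocker documentation:

```js
// Sketch of a Web Unlocker request (zone name and target URL are example values).
const unlockerRequest = {
  method: 'POST',
  url: 'https://api.brightdata.com/request',
  headers: {
    Authorization: 'Bearer YOUR_WEB_UNLOCKER_TOKEN',
    'Content-Type': 'application/json',
  },
  body: {
    zone: 'your_web_unlocker_zone', // the zone created in step 2
    url: 'https://www.perplexity.ai/search?q=top+ai+trends+2025',
    format: 'raw',                  // return the raw page response
  },
};
```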
by max e
Turn plain-language chat like "Tomorrow 9 AM: write blog post" into neatly organised Todoist tasks with GPT-4o and n8n, zero code.

🪄 **Ultimate Personal Todoist Agent**
Turn natural-language requests into perfectly-organized Todoist tasks, all on autopilot inside n8n.

> "Add Finish quarterly report by Friday afternoon" → the agent creates the task, sets the due date & priority, and even drops it into the right project. ✨

🌟 **Why this workflow rocks**
- **All-in-one Todoist super-powers** – create, update, complete, move, archive… every major Todoist endpoint is wired up (tasks, projects, sections, labels, comments).
- **LLM-powered intent detection** – an OpenAI model interprets plain-English (or emoji-filled!) messages so you don't have to remember slash-commands.
- **Minimal setup** – just two credentials and you're live.
- **Battle-tested building block** – use it as-is, or plug the Todoist Agent node into your own agents & chatbots.

🛠️ **What you'll need**

| Credential | Where it's used | How to set it up |
| --- | --- | --- |
| OpenAI API | Orchestrator & LLM nodes | Paste your OpenAI secret key into an OpenAI credential in n8n. |
| Todoist OAuth2 | Todoist node and HTTP Request node | Log in to Todoist from your browser to set up the credential in n8n. |

> That's it: no webhooks, no extra secrets.
> Tested with *gpt-4o-latest*, the fastest & most accurate model in our trials.

⚡ **Quick-start (5 minutes)**
1. Import the JSON template (hit ▶️ Try it out on the n8n template page or drag-drop the file into your canvas).
2. Select your credentials in the two credential dropdowns.
3. Click Test workflow.
4. In the sample Function node, tweak the message field (e.g. "Tomorrow at 9 am: write blog post").
5. Run → watch your new Todoist task appear (a sketch of the underlying API call follows this section).
6. (Optional) Swap the Function node for your favourite chat trigger (Telegram, Slack, WhatsApp, Discord, you name it).

Boom, your personal Todoist genie is alive! 🧞♂️

🧩 **How it works (under the hood)**

```
[Trigger / Chat message]
        │
        ▼
[🗂️ Orchestrator Agent] ← OpenAI Chat Model + Short-term Memory
        │ ↳ Parses intent & entities
        ▼
[🤖 Todoist Agent] ← 15+ Todoist endpoints
        │ ↳ Executes the right call (create, update, complete, etc.)
        ▼
[Done ✅]
```

The Orchestrator is an example. In production you can drop it and simply expose the Todoist Agent as a tool for any other agent workflow.

🎛️ **Customising & extending**

| Idea | How to do it |
| --- | --- |
| Notion / Sheets sync | After the Todoist Agent node, add a Notion or Google Sheets node to log completed items. |
| Voice commands | Swap the chat trigger for a Speech-to-Text node (e.g. Whisper). |

🤝 **Need custom automations?**
Want me to build or tweak something for you? → Email maxemelyanenko@gmail.com and let's make it happen!

⚠️ **What's not included (yet)**
- Shared projects & other Todoist Pro/Business endpoints.
- File attachments in comments.
- Editing comments.

Pull requests welcome! 🙌
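For a sense of what the Todoist Agent executes behind the scenes, here is a minimal sketch of a task-creation call against the Todoist REST API. The endpoint and field names follow Todoist's public v2 REST docs; the object shape is illustrative, not the exact n8n node configuration:

```js
// Sketch of the REST call behind "Tomorrow 9 AM: write blog post".
const createTask = {
  method: 'POST',
  url: 'https://api.todoist.com/rest/v2/tasks',
  headers: {
    Authorization: 'Bearer YOUR_TODOIST_TOKEN',
    'Content-Type': 'application/json',
  },
  body: {
    content: 'Write blog post',
    due_string: 'tomorrow at 9am', // Todoist parses natural-language due dates
    priority: 3,                   // 1 (normal) to 4 (urgent)
  },
};
```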
by Arlin Perez
**Sort New Gmail messages by category with AI**

👥 **Who's it for**
This workflow is perfect for individuals or teams who receive a high volume of emails 📥 and want to automatically organize them into Gmail labels 🏷️ using AI. No coding required!

For sorting existing email messages in your Gmail inbox, please use this free workflow: Categorize and Label Existing Gmail Emails Automatically with GPT-4o mini.

🤖 **What it does**
It automatically processes new Gmail emails, skips those that already have labels, sends the content to an AI Agent powered by GPT-4o mini 🧠, and applies a relevant label based on the content. All labels must exist in Gmail beforehand.

⚙️ **How it works**
1. 📬 **Gmail Trigger** – activates on each new email received.
2. 🚫 **Filter** – skips emails that already have a label.
3. 🧠 **AI Agent (GPT-4o mini)** – analyzes the message and decides which label fits best.
4. 🧾 **Structured Output Parser** – formats the AI output into clean JSON (see the sketch after this section).
5. 🔀 **Switch Node** – routes each email to the correct label path based on the AI result.
6. 🏷️ **Gmail Nodes** – assign the Gmail label to the original email.

📋 **Requirements**
- Gmail account connected to n8n
- Pre-created labels in Gmail matching the AI categories
- OpenAI credentials with GPT-4o mini access
- n8n's AI Agent & Structured Output Parser nodes

🛠️ **How to set up**
1. Open the workflow and adjust the trigger interval (e.g., every minute, every few hours, or custom via cron ⏱️).
2. Check that the Filter skips emails with existing labels.
3. Define your categories in the AI Agent prompt and make sure they match the Gmail labels.
4. Configure the Switch Node conditions for each category.
5. Ensure each Gmail Label Node applies the correct label.
6. Save and activate the workflow ✅

🎨 **How to customize the workflow**
- Add or remove categories in the AI prompt & Switch Node.
- Fine-tune the prompt instructions to match your specific use case.
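The Structured Output Parser pins the AI's answer to a predictable shape so the Switch node can route on a single field. An illustrative example follows; the category names are placeholders and must match your existing Gmail labels exactly:

```js
// Illustrative parsed output from the AI Agent. The Switch node routes on the
// "category" value, so it must exactly match one of your Gmail labels.
const exampleOutput = {
  category: 'Finance', // e.g. one of: 'Finance', 'Work', 'Personal', 'Newsletters'
};
```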
by AK Pasnoor
Put your productivity on autopilot with this workflow.

**How it works**
This workflow generates a beautifully formatted daily briefing email every morning at 6:00 AM by combining your Todoist tasks and Google Calendar events, summarizing them with GPT-4o, and sending them as a clean HTML email. It includes:
- Auto-fetching today's tasks and events
- Formatting them for context (see the sketch after the setup steps)
- Generating a motivational summary with GPT-4o
- Converting the output into styled HTML
- Emailing it to you daily

**Set up steps**
1. Connect your Google Calendar and Todoist accounts.
2. Set your project ID in the Todoist node.
3. Customize the OpenAI prompt or email template if needed.
4. Enable the Schedule Trigger to automate daily runs.

All configuration logic and summaries are explained in sticky notes inside the workflow. No external tools required. Just plug, personalize, and automate your day!
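The "formatting for context" step boils down to flattening both sources into one prompt-ready string. A minimal n8n Code-node sketch; the node names ("Todoist", "Google Calendar") and field names are assumptions, so adjust them to your workflow:

```js
// Sketch of a Code node that merges tasks and events into one context string
// for the GPT-4o prompt. Node and field names are assumptions.
const tasks = $('Todoist').all()
  .map(item => `- ${item.json.content}`)
  .join('\n');

const events = $('Google Calendar').all()
  .map(item => `- ${item.json.summary} at ${item.json.start?.dateTime ?? item.json.start?.date}`)
  .join('\n');

return [{ json: { context: `Today's tasks:\n${tasks}\n\nToday's events:\n${events}` } }];
```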