by Parth Pansuriya
# Analyze Ingredient Photos using Telegram & Gemini AI

## Who's it for
- Skincare enthusiasts who want to know if a product is safe.
- Food or supplement buyers checking ingredient safety.
- Parents reviewing kids' products.
- Anyone wanting quick ingredient analysis before using or buying a product.

## How it works / What it does
1. **Telegram Input** – The user sends a photo of a product label or a text list of ingredients.
2. **Photo Handling** – The workflow checks whether the message contains a photo. If yes, it retrieves the file and extracts the ingredients using Google Gemini AI. If no, it handles text, greetings, and off-topic queries.
3. **Caption Branching** – With a caption, the bot gives a Use / Do Not Use recommendation plus a reason. Without a caption, it gives Advantages, Disadvantages, Recommended For, and Not Recommended For (3 points each).
4. **Response on Telegram** – Sends a friendly, structured response back to the user.

## How to set up
1. Import this workflow JSON into n8n.
2. Create and connect a Telegram bot via BotFather, then paste the API token into the Telegram credentials.
3. Add Google Gemini (PaLM) API credentials inside n8n.
4. Activate the workflow and send your first product photo via Telegram!

## Requirements
- n8n instance (self-hosted or cloud).
- Telegram bot token.
- Google Gemini API credentials.

## How to customize the workflow
- **Change AI Instructions** – Update system messages to tweak the tone (more technical, casual, or medical).
- **Adjust Output Format** – Edit the Telegram response nodes for shorter or longer answers.
- **Expand Analysis** – Add extra categories (e.g., allergens, environmental impact).
- **Multi-language Support** – Modify prompts to output in the user's preferred language.
- **Add Database Logging** – Connect to MySQL or PostgreSQL to save conversations (user queries + AI responses).
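The photo and caption branching can be sketched as a small routing function. This is a minimal sketch assuming the standard Telegram Bot API message payload (the `photo` array and `caption` field come from Telegram; the branch labels are illustrative, not the workflow's node names):

```javascript
// Route an incoming Telegram message the way the workflow's If nodes do:
// text-only messages go to the text handler, photos split on caption.
function routeMessage(message) {
  const hasPhoto = Array.isArray(message.photo) && message.photo.length > 0;
  if (!hasPhoto) return { branch: "text" }; // greetings / typed ingredient lists
  // Telegram sends several sizes of the same photo; the last is the largest.
  const fileId = message.photo[message.photo.length - 1].file_id;
  return {
    branch: message.caption ? "photo_with_caption" : "photo_without_caption",
    fileId,
  };
}
```

A captioned label photo would route to the Use / Do Not Use branch, while the same photo without a caption would get the four-part breakdown.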
by Yuvraj Singh
## Purpose
This solution lets you manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. It is a two-way sync with partial support for recurring tasks.

## How it works
- The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist.
- To avoid overwrites by late-arriving webhook calls, the current state of the task is retrieved from both sides on every run.
- Redis is used to prevent endless loops, since an update in one system triggers another webhook call in the other. Using the ID of the task, the trigger is locked for 80 seconds.
- Depending on the detected changes, the other side is updated accordingly. Generally, Notion is treated as the main source.
- An "Obsolete" status guarantees that tasks never get deleted entirely by accident.
- The Todoist ID is stored in the Notion task, so the two stay linked.
- An additional full-sync workflow runs daily to fix any inconsistencies that occurred, since webhooks cannot be trusted entirely.
- Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook. Another tiny workflow helps generate a global config, which is used by all workflows for mapping purposes.

## Mapping (Notion → Todoist)
- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

## Current limitations
- Changes to the same task cannot be made simultaneously in both systems within a 15–20 second time frame.
- Subtasks are not linked automatically to their parent yet.
- Task names do not support URLs yet.

## Credentials
Follow the videos to set up credentials for Notion (access token), Todoist (access token), and Redis:
- **Todoist**: Follow this video to obtain a Todoist API token. (Todoist Credentials.mp4)
- **Notion**: Follow this video to get a Notion integration secret.
- **Redis**: Follow this video to set up Redis.

## Setup
The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following steps:
1. Set up credentials for Notion (access token), Todoist (access token), and Redis. You can also create empty credentials and populate them later during further setup.
2. Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance; otherwise you need to map the credentials of many nodes.
3. Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

## How to use
You can apply changes (create, update, delete) to tasks in both Notion and Todoist, which then get synced over within a couple of seconds (handled by the differential realtime sync). The daily full sync resolves possible discrepancies in Todoist.

This workflow incorporates ideas and techniques inspired by Mario (https://n8n.io/creators/octionic/), whose expertise with specific nodes helped shape parts of this automation. Significant enhancements and customizations have been made to deliver a unique and improved solution.
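The 80-second loop guard can be sketched as follows. This is an in-memory stand-in for the Redis lock (the real workflow uses Redis so the lock is shared across executions; the key naming and function name are illustrative):

```javascript
// A task ID is locked for 80 seconds after we push an update; webhook
// calls arriving while the lock is held are treated as echoes of our
// own change and dropped, which breaks the update -> webhook -> update loop.
const LOCK_TTL_MS = 80_000;
const locks = new Map(); // taskId -> expiry timestamp (ms)

function acquireSyncLock(taskId, now = Date.now()) {
  const expiry = locks.get(taskId);
  if (expiry !== undefined && expiry > now) return false; // still locked: skip
  locks.set(taskId, now + LOCK_TTL_MS); // take the lock for 80s
  return true;
}
```

In Redis the same behavior is a single `SET key value NX EX 80`, which only succeeds if no lock exists and expires automatically.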
by Axiomlab.dev
# Tasks Briefing

This template posts a clean, Slack-ready morning summary of your Google Tasks due today. It fetches tasks, filters only those due "today" in your timezone, asks a local LLM (via LangChain + Ollama) to produce a short summary (no steps, just a concise brief), strips any hidden <think> blocks, and delivers the message to your chosen Slack channel.

## How it works
1. **Trigger at Morning (Cron)** – runs at 7:00 AM (you can change the hour) to kick things off daily.
2. **Get many tasks (Google Tasks node)** – pulls tasks from your selected Google Tasklist.
3. **Code (Filter Due Today)** – normalizes dates to your timezone, keeps only tasks due today, and emits a fallback flag if none exist.
4. **If** – routes: True (has tasks) continues to the LLM summary path; False (no tasks) sends a "No tasks due today" message to Slack.
5. **Code (Build LLM Prompt)** – builds a compact, Markdown-only prompt for the model (no tool calls).
6. **Basic LLM Chain (LangChain) + Ollama Model** – generates a short summary for Slack.
7. **Code (Cleanup)** – removes any <think>…</think> content if the model includes it.
8. **Send a message (Slack)** – posts the final brief to your Slack channel.

## Required credentials
- **Google Tasks OAuth2 API** – to read tasks from your Google Tasklist.
- **Slack API** – to post the summary into a channel.
- **Ollama** – local model endpoint (e.g., qwen3:4b); used by the LangChain LLM nodes.

## Setup instructions

**Google Tasks credential**
1. In Google Cloud Console: enable the Google Tasks API, create an OAuth Client (Web), and set the redirect URI shown by n8n.
2. In n8n Credentials, add Google Tasks OAuth2 API with scope https://www.googleapis.com/auth/tasks (read/write) or https://www.googleapis.com/auth/tasks.readonly (read-only).
3. In the Get many tasks node, select your credential and your Tasklist.

**Slack credential & channel**
1. In n8n Credentials, add Slack API (a bot/user token with chat:write).
2. In the Send a message nodes, select your Slack credential and set the Channel (e.g., #new-leads).

**Ollama model (LangChain)**
1. Ensure Ollama is running on your host (default http://localhost:11434).
2. Pull a model (e.g., ollama pull qwen3:4b) or use another supported model (llama3:8b, etc.).
3. In the Ollama Model node, select your Ollama credential and set the model name to match what you pulled.

**Timezone & schedule**
- The Cron node is set to 7:00 AM. Adjust as needed.
- The Code (Filter Due Today) node is configured for Asia/Dhaka; change the TZ constant if you prefer a different timezone.

**(Optional) Cleanup safety**
- The template includes a Code (Cleanup) node that strips <think>…</think> blocks from model output. Keep this connected before the Slack node.

**Test the flow**
Run the workflow once manually:
- If you have tasks due today, you should see a concise summary posted to your Slack channel.
- If none are due, you'll receive a friendly "No tasks due today" message.

**Activate**
When everything looks good, toggle the workflow Active to receive the daily summary automatically.
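The Filter Due Today logic can be sketched like this. It assumes the Google Tasks API task shape (`due` is an RFC 3339 timestamp) and mirrors the template's Asia/Dhaka default; the task data in the usage example is illustrative:

```javascript
// Keep only tasks whose due date falls on "today" in the target
// timezone, and emit the fallback flag the If node routes on.
const TZ = "Asia/Dhaka";

// Render a date as YYYY-MM-DD in the target timezone
// (the en-CA locale formats dates in ISO order).
function localDay(date, tz = TZ) {
  return new Intl.DateTimeFormat("en-CA", { timeZone: tz }).format(date);
}

function filterDueToday(tasks, now = new Date()) {
  const today = localDay(now);
  const due = tasks.filter((t) => t.due && localDay(new Date(t.due)) === today);
  return { tasks: due, noTasks: due.length === 0 };
}
```

Normalizing both "now" and each due date through the same timezone avoids off-by-one-day bugs when the server runs in UTC.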
by riandra
# Turn Internet Into Database — n8n Workflow

## Description
This n8n template automates the entire process of turning any website into a structured database — no manual scraping required. It uses MrScraper's AI-powered agents to crawl a domain, extract listing pages, scrape detail pages, and export everything into Google Sheets with an email notification via Gmail. Whether you're building a real estate database, product catalog, job board aggregator, or competitor price tracker, this workflow handles the full pipeline end-to-end.

## How it works
- **Phase 1 – Discover URLs (Crawling):** The Map Agent crawls your target domain and discovers all relevant URLs based on your include/exclude patterns. It returns a clean list of listing/search page URLs.
- **Phase 2 – Scrape Listing Pages:** The workflow loops through each discovered listing URL and runs the Listing Agent to extract all detail page URLs. Duplicates are automatically removed.
- **Phase 3 – Scrape Detail Pages:** Each detail URL is looped through the General Agent, which extracts structured fields (title, price, location, description, etc.). Nested JSON is automatically flattened into clean, spreadsheet-ready rows.
- **Phase 4 – Export & Notify:** Scraped records are appended or upserted into Google Sheets using a unique key. Once complete, a Gmail notification is sent with a run summary.

## How to set up
1. Create 3 scrapers in your MrScraper account:
   - Map Agent Scraper (for crawling/URL discovery)
   - Listing Agent Scraper (for extracting detail URLs from listing pages)
   - General Agent Scraper (for extracting structured data from detail pages)
   Copy the scraperId for each — you'll need these in n8n.
2. Enable AI Scraper API access in your MrScraper account settings.
3. Add your credentials in n8n: MrScraper API token, Google Sheets OAuth2, and Gmail OAuth2.
4. Configure the Map Agent node: set your target domain URL (e.g. https://example.com), set includePatterns to match listing pages (e.g. /category/), and adjust maxDepth, maxPages, and limit as needed.
5. Configure the Listing Agent node: enter the Listing scraperId and set maxPages based on how many pages per listing URL to scrape.
6. Configure the General Agent node: enter the General scraperId.
7. Connect Google Sheets: enter your spreadsheet and sheet tab URL, and choose an append or upsert strategy (recommended: upsert by url).
8. Configure Gmail: set the recipient email, subject line, and message body.

## Requirements
- **MrScraper** account with API access enabled
- **Google Sheets** (OAuth2 connected)
- **Gmail** (OAuth2 connected)

## Good to know
- The workflow uses batch looping, so large sites with hundreds of pages are handled gracefully without overloading.
- The Flatten Object node automatically normalizes nested JSON — no manual field mapping needed for most sites.
- Set a unique match key (e.g. url) in the Google Sheets upsert step to avoid duplicate rows on re-runs.
- Scraping speed and cost will depend on MrScraper's pricing plan and the number of pages processed.

## Customising this workflow
- **Different site types:** Works for real estate listings, job boards, e-commerce catalogs, directory sites, and more — just adjust your URL patterns.
- **Add filtering:** Insert a Code or Filter node after Phase 3 to drop incomplete records before saving.
- **Schedule it:** Replace the manual trigger with a Schedule Trigger to run daily or weekly and keep your database fresh automatically.
- **Multi-site:** Duplicate the Phase 1–3 branches to scrape multiple domains in a single workflow run.
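The nested-JSON flattening that turns detail-page records into spreadsheet rows can be sketched as a dot-path flattener. This is illustrative, not the exact behavior of the Flatten Object node:

```javascript
// Join nested keys with dots so each scraped record becomes one flat,
// sheet-ready row: { price: { amount: 100 } } -> { "price.amount": 100 }.
function flatten(obj, prefix = "", out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value && typeof value === "object" && !Array.isArray(value)) {
      flatten(value, path, out); // recurse into nested objects
    } else {
      out[path] = value; // scalars and arrays become cell values
    }
  }
  return out;
}
```

The resulting flat keys map directly to Google Sheets column headers, which is why no manual field mapping is needed for most sites.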
by Ramdoni
# 📸💰 Log receipt transactions from Telegram images to Excel using OCR and AI

OCR + AI + duplicate protection (n8n workflow).

## Overview
This workflow automatically converts receipt images or payment proofs sent via Telegram into structured financial records stored in Excel 365.

Core flow: Take Photo → OCR → AI Structuring → Validation → Duplicate Check → Append to Excel

The system is designed to reduce manual data-entry errors and improve financial visibility by turning unstructured receipt images into reliable financial records.

## How it works
This workflow captures receipt images sent to a Telegram bot and converts them into structured financial records in Excel. When a user sends an image, the workflow downloads the file and extracts the text using OCR. The extracted text is then analyzed by an AI model (OpenAI or Gemini) to identify key transaction details such as date, amount, merchant, and transaction type. The workflow validates the extracted data and generates a unique duplicate key. It then checks Excel to ensure the transaction has not already been recorded. If no duplicate is found, the transaction is appended to the Excel sheet. The user receives a confirmation message in Telegram indicating whether the transaction was successfully recorded or rejected due to missing data or duplication.
## Workflow nodes
The workflow contains the following nodes:
1. Telegram Trigger
2. Edit Fields (extract chat_id and image_id)
3. Condition (validate image exists)
4. Telegram Get File
5. Tesseract OCR
6. AI Agent (ChatGPT / Gemini)
7. Edit Field (JSON output parsing)
8. Excel 365 – Get Configuration
9. Edit Field – Normalize & generate duplicate_key
10. Condition – Validate required fields
11. Excel 365 – Look up duplicate_key
12. Condition – Duplicate check
13. Excel 365 – Append transaction
14. Telegram – Send confirmation message

## 🧠 What this workflow does
This workflow automatically:
- Captures receipt images from Telegram
- Extracts text using OCR
- Converts OCR text into structured transaction data using AI
- Validates required financial fields
- Prevents duplicate transaction entries
- Saves valid records into Excel
- Sends confirmation messages to Telegram

## Setup steps
1. Create a Telegram bot using @BotFather and copy the bot token.
2. Add Telegram credentials in n8n using the bot token.
3. Prepare an Excel file in OneDrive or SharePoint with a TRANSACTIONS sheet.
4. Configure Microsoft Excel credentials in n8n.
5. Add either OpenAI or Google Gemini credentials for the AI parsing step.
6. If running self-hosted n8n, ensure Tesseract OCR is installed.
7. Activate the workflow and send a receipt image to the Telegram bot to test.
## 🎯 Key benefits
- Reduce financial recording errors
- Eliminate repetitive manual data entry
- Improve cash-flow visibility
- Simple photo-based financial recording
- Scalable foundation for AI-powered financial automation

## 🚀 Use cases
This workflow is useful for:
- Small business owners
- Online sellers
- Freelancers
- Finance administrators
- Personal expense tracking

## 📌 Notes
- Duplicate detection is based on a composite transaction key
- OCR accuracy depends on image quality
- AI parsing improves recognition compared to OCR-only approaches

## 🧱 Requirements
To run this workflow you need:
- n8n (Cloud or self-hosted)
- Telegram bot token
- Microsoft Excel 365 account
- OpenAI API key or Google Gemini API key
- Tesseract OCR (for self-hosted installations)

## 🏁 Result
After setup, users can simply send receipt photos to Telegram and the workflow will automatically convert them into structured financial records stored in Excel. No manual bookkeeping required. Built for automation lovers, small businesses, and teams who want simple but powerful financial tracking using n8n.
by Daniel Turgeman
## How it works
1. A webhook receives a chatbot message or demo-request form with an email address.
2. The email is validated and cleaned, then Lusha enriches the lead.
3. A priority score is calculated based on seniority, company size, and request type.
4. The lead is upserted into HubSpot; high-priority leads go to #urgent-demos on Slack, others to #demo-requests.

## Set up steps
1. Install the Lusha community node.
2. Add your Lusha API, Slack, and HubSpot credentials.
3. Point your chatbot or demo form webhook to the n8n endpoint.
4. Customize the priority scoring thresholds and Slack channels.
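The priority scoring can be sketched like this. The template says the score combines seniority, company size, and request type, but the weights, thresholds, field names, and channel cutoff below are all illustrative assumptions you would tune to your own funnel:

```javascript
// Score an enriched lead and pick the Slack channel for it.
function scoreLead(lead) {
  let score = 0;
  const seniority = (lead.seniority || "").toLowerCase();
  if (["c-level", "vp", "director"].includes(seniority)) score += 40;
  else if (seniority === "manager") score += 20;
  if ((lead.companySize || 0) >= 200) score += 30; // assumed size threshold
  if (lead.requestType === "demo") score += 30;    // demos outrank chat pings
  return {
    score,
    channel: score >= 70 ? "#urgent-demos" : "#demo-requests",
  };
}
```

Keeping the scoring in one pure function makes the "customize thresholds" step a matter of editing a few constants in a single Code node.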
by Avkash Kakdiya
## How it works
This workflow automates the complete employee leave-approval process from submission to final resolution. Employees submit leave requests through a form; each request is summarized professionally using AI and sent for approval via email. The workflow waits for the approver's response and then either sends an approval confirmation or schedules a clarification discussion automatically. All communication is handled consistently, with no manual follow-ups required.

## Step-by-step

**Step 1: Capture leave request, generate summary, and request approval**
- **On form submission** – Captures employee details, leave dates, reason, and task handover information.
- **AI Agent** – Generates a professional, manager-ready summary of the leave request.
- **OpenAI Chat Model** – Provides the language model used to generate the summary.
- **Structured Output Parser** – Extracts the email subject and HTML body from the AI response.
- **Send message and wait for response** – Emails the summary to the approver and pauses the workflow until approval or rejection.
- **If** – Routes the workflow based on the approval decision.

**Step 2: Notify employee or schedule discussion automatically**

Approved path:
- **Send a message** – Sends an official leave-approval email to the employee.

Clarification or rejection path:
- **Booking Agent** – Determines the next business day and finds the first available 10-minute slot.
- **OpenAI** – Applies scheduling logic to select the earliest valid slot.
- **Get Events** – Fetches existing calendar events to avoid conflicts.
- **Check Availability** – Confirms free time within working hours.
- **Output Parser** – Extracts the final meeting start time.
- **Send a message1** – Emails the employee with the scheduled discussion details.

## Why use this?
- Eliminate manual approval follow-ups and email back-and-forth
- Ensure consistent, professional communication for every leave request
- Automatically handle both approvals and clarification scenarios
- Reduce manager effort with AI-generated summaries
- Schedule discussions without manual calendar coordination
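The first-free-slot search on the clarification path can be sketched as an interval sweep. Working hours of 09:00–17:00 and the minutes-since-midnight event format are assumptions; the real workflow delegates this decision to the Booking Agent:

```javascript
// Find the start of the first free 10-minute slot, given the day's
// busy intervals (in minutes since midnight) from Get Events.
const SLOT_MIN = 10, WORK_START = 9 * 60, WORK_END = 17 * 60;

function firstFreeSlot(busy) {
  const sorted = [...busy].sort((a, b) => a.start - b.start);
  let cursor = WORK_START;
  for (const ev of sorted) {
    if (ev.start - cursor >= SLOT_MIN) return cursor; // gap before this event
    cursor = Math.max(cursor, ev.end);                // skip past the event
  }
  return cursor + SLOT_MIN <= WORK_END ? cursor : null; // after the last event
}
```

Sorting the events first means a single pass with a moving cursor is enough; overlapping events are handled by the `Math.max`.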
by Avkash Kakdiya
## How it works
This workflow runs on a daily schedule to monitor users currently on a trial plan. It fetches user data from MongoDB, calculates each user's trial stage, and assigns a trigger label such as Day 3, Day 7, Day 13, or Last Day. Based on this stage, the workflow sends a targeted email using Gmail. This ensures consistent engagement and improves trial-to-paid conversion without manual effort.

## Step-by-step

**Fetch and filter trial users**
- **Schedule Trigger** – Runs the workflow daily at a fixed time.
- **Find documents** – Retrieves users with an active trial plan from MongoDB.
- **Code in JavaScript** – Calculates trial progress and assigns a trigger type (day_3, day_7, day_13, last_day).

**Send stage-based emails**
- **Loop Over Items** – Iterates through each filtered user.
- **Switch** – Routes users based on their trigger type.
- **Send day 3 email** – Sends an onboarding encouragement email.
- **Send day 7 mail** – Sends a mid-trial engagement email.
- **Send day 13 mail** – Sends a near-expiry reminder email.
- **Send last day email** – Sends a final urgency email before the trial ends.
- **Merge** – Combines all branches and continues looping through the remaining users.

## Why use this?
- Automates user engagement throughout the trial lifecycle
- Improves activation and feature adoption with timely nudges
- Increases conversion rates with strategic email timing
- Eliminates manual tracking of trial users
- Scales easily for large SaaS user bases
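The trigger-type calculation in the JavaScript Code node can be sketched as follows, assuming a 14-day trial and a `trialStart` field on each MongoDB document (both assumptions; the template doesn't specify the trial length or field names):

```javascript
// Map a user's trial progress to the Switch node's trigger types.
const TRIAL_DAYS = 14;
const DAY_MS = 24 * 60 * 60 * 1000;

function trialTrigger(trialStart, now = new Date()) {
  // Day 1 is the start day itself.
  const day = Math.floor((now - new Date(trialStart)) / DAY_MS) + 1;
  if (day === 3) return "day_3";
  if (day === 7) return "day_7";
  if (day === 13) return "day_13";
  if (day === TRIAL_DAYS) return "last_day";
  return null; // no email today
}
```

Users returning `null` are simply filtered out before the Switch node, so each daily run only emails users who hit a milestone that day.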
by Niclas Aunin
# LinkedIn Content Generation Workflow

## Summary
An automated workflow that transforms Notion content notes into publication-ready LinkedIn posts using Claude AI. It monitors a Notion database and generates multiple variations based on structured outlines, so the author can pick the one they like most.

## Use cases
- Automate LinkedIn content creation from a content-planning database.
- Generate multiple post variations from a single outline.
- Maintain a consistent voice and formatting across all posts.
- Scale content production while preserving quality.

## How it works
1. **Trigger** – Monitors the Notion "Content Plan" database hourly for updates.
2. **Conditional Check** – Verifies the "LinkedIn Post (Main)" tag and "Ready for Writing" status.
3. **Main Post** – Claude generates a single post from the project name and notes.
4. **Outline Analysis** – A parallel process creates 3 distinct post concepts with different angles.
5. **Multi-Post Generation** – Each outline becomes a complete LinkedIn post.
6. **Save to Notion** – All posts are automatically saved to the database.

AI setup:
- Claude Sonnet 4.5 (claude-sonnet-4-5-20250929)
- Main post: temperature 0.8 (creative)
- Multi-post: default temperature (consistent)

## How to use
1. Set up a content database in Notion, or link your existing one: use the field mapping as outlined below, or update the field mapping in the n8n template.
2. Add content to Notion: Project name (topic), Notes (article content/key points), Tag: "LinkedIn Post (Main)", Status: "Ready for Writing".
3. The workflow triggers automatically (hourly check).
4. Retrieve the posts from the Notion database.
5. Review and publish to LinkedIn.

## Requirements
Credentials:
- Notion API (access to the Content Plan database)
- Anthropic API key
- OpenAI API key

Notion database (connect the database), required properties:
- Project name (text)
- Notes (rich text)
- Tags (multi-select with "LinkedIn Post (Main)")
- Status (select with "Ready for Writing")

Notes:
- Posts are optimized for an 1800-character limit.
- Generates both single posts and multi-angle variations.
by WeblineIndia
This n8n workflow automatically fetches all interview events scheduled for today from a specified Google Calendar and sends a personalized email to each interviewer. The email contains a formatted HTML table listing their interview details, including meeting times, Google Meet links, and attendees with their response status. This ensures all interviewers are informed daily at 8:00 AM IST without any manual coordination.

## Who's it for
- **Interviewers** who want a quick morning packet instead of opening multiple calendar tabs.
- **Recruiters/coordinators** who need a reliable, zero-friction daily brief for interviewers.
- **Teams** that paste CV/notes links directly into calendar events (no file search required).

## How it works
1. Cron triggers daily at 08:00 IST.
2. Google Calendar reads today's events from the Interviews calendar.
3. A Code step parses each event to identify the interviewers and extract candidate details, the meeting link, and any CV: / Notes: links from the description, and builds a table to share via Gmail.
4. A grouping step compiles a per-interviewer list for the day.
5. Gmail sends one digest per interviewer.

## How to set up
1. Ensure all interviews are scheduled on the Google Calendar named Interviews and that interviewers are added as attendees.
2. Add CV: <url> and Notes: <url> in the event description when available.
3. Import the workflow and add credentials: Google Calendar (OAuth) and SMTP/Gmail for sending the email digests.
4. Keep the default 08:00 IST schedule in the Cron node or adjust as needed.

## Requirements
- Google Workspace account with access to the Interviews calendar.
- Gmail sender account for digests (App Password if using 2FA).
- n8n instance (cloud or self-hosted).

## Steps
1. **Trigger Schedule** – Node: Schedule Trigger. Purpose: starts the workflow daily at **8:00 AM**.
2. **Fetch Interview Events** – Node: Google Calendar (Fetch Interview Events). Purpose: retrieves all events (interviews) from the configured calendar. Output: event details including summary, time, and organizer email.
3. **Group & Format Schedule** – Node: HTML Table (JavaScript Code node). Purpose: groups events by interviewer email and generates an HTML schedule table. Output fields: interviewer_email, subject, htmlContent.
4. **Send Personalized Emails** – Node: Gmail. Purpose: sends the formatted interview schedule to each interviewer's email address. Send To: dynamically set using ={{ $json.interviewer_email }}. Subject: "Interview Reminder". Body: ={{ $json.htmlContent }} (HTML).

## How to customize the workflow
- **Parsing rules:** If your event titles follow a pattern (e.g., Onsite – {Candidate} – {Role}), tweak the regex in the Code node.
- **Attendee logic:** Refine how interviewers are detected (e.g., filter only accepted responses, or include tentative).
- **Digest format:** Adjust the table columns/order, or add role/team labels from the title.
- **Schedule:** Duplicate the Cron node for regional time zones or add a midday refresh.

## Add-ons to level up the workflow with additional nodes
- **Reminder pings:** Add 10-minute pre-interview reminders via Email or Slack/Teams.
- **Conflict alerts:** Flag if an interviewer is double-booked within a 15-minute window.
- **Feedback follow-up:** After the scheduled time, DM or email a standardized feedback-form link.
- **Drive search (optional):** If you later want to auto-attach CVs, add a Google Drive search step (by candidate name) in a designated folder.

## Common troubleshooting points
- **No events found:** Confirm the calendar name is **Interviews** and that events exist **today**.
- **Missing links:** If CV/notes links aren't in the email, add CV:/Notes: links to the event description.
- **Email not sent:** Verify the SMTP credentials, from-address permissions, and any sending limits.
- **Time mismatch:** Confirm the workflow timezone and Calendar times are set to **Asia/Kolkata** (or adjust).

## A short note
If you need help tailoring the parsing rules, adjusting the schedule, or troubleshooting any step, feel free to reach out; we'll be happy to help.
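The per-interviewer grouping can be sketched like this. The event shape (attendees with an `email` field) follows the Google Calendar API; treating internal-domain attendees as the interviewers is an illustrative rule, not the workflow's exact detection logic:

```javascript
// Bucket today's events per interviewer email so each person
// receives exactly one digest.
function groupByInterviewer(events, organizerDomain) {
  const digests = new Map();
  for (const ev of events) {
    for (const att of ev.attendees || []) {
      // Assume internal attendees are the interviewers (illustrative).
      if (!att.email.endsWith(`@${organizerDomain}`)) continue;
      if (!digests.has(att.email)) digests.set(att.email, []);
      digests.get(att.email).push({ summary: ev.summary, start: ev.start });
    }
  }
  return [...digests].map(([interviewer_email, rows]) => ({ interviewer_email, rows }));
}
```

Each returned item maps to one outgoing Gmail message, with `rows` rendered into the HTML schedule table.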
by Dahiana
# YouTube Transcript Extractor

This n8n template demonstrates how to extract transcripts from YouTube videos using two different approaches: automated Google Sheets monitoring and direct webhook API calls.

Use cases: content creation, research, accessibility, meeting notes, content repurposing, SEO analysis, or building transcript databases for analysis.

## How it works
- **Google Sheets integration:** Monitor a sheet for new YouTube URLs and automatically extract transcripts.
- **Direct API access:** Send YouTube URLs via webhook and get instant transcript responses.
- **Smart parsing:** Extracts the video ID from various YouTube URL formats (youtube.com, youtu.be, embed).
- **Rich metadata:** Returns the video title, channel, publish date, duration, and category alongside the transcript.
- **Fallback handling:** Gracefully handles videos without available transcripts.

## Two workflow paths
1. **Automated sheet processing:** Add URLs to a Google Sheet → auto-extract → save results to the sheet.
2. **Webhook API:** Send a POST request with the video URL → get an instant transcript response.

## How to set up
1. Replace the "Dummy YouTube Transcript API" credentials with your YouTube Transcript API key.
2. Create your own Google Sheet with columns: "url" (input sheet) and "video title", "transcript" (results sheet).
3. Update the Google Sheets credentials to connect your sheets.
4. Test each workflow path separately.
5. Customize the webhook path and authentication as needed.

## Requirements
- YouTube Transcript API access (youtube-transcript.io or similar)
- Google Sheets API credentials (for the automated workflow)
- n8n instance (cloud or self-hosted)
- YouTube videos

## How to customize
- Modify the transcript processing in the Code nodes.
- Add additional metadata extraction.
- Connect to other storage solutions (databases, CMS).
- Add text analysis or summarization steps.
- Set up notifications for new transcripts.
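The smart parsing step can be sketched as a small function that pulls the 11-character video ID out of the URL shapes listed above. This is a sketch of the idea, not the template's exact Code node:

```javascript
// Extract the video ID from watch, youtu.be, and embed URLs;
// return null for anything unrecognized (fallback handling).
function extractVideoId(url) {
  const patterns = [
    /youtube\.com\/watch\?.*v=([A-Za-z0-9_-]{11})/, // youtube.com/watch?v=ID
    /youtu\.be\/([A-Za-z0-9_-]{11})/,               // youtu.be/ID
    /youtube\.com\/embed\/([A-Za-z0-9_-]{11})/,     // youtube.com/embed/ID
  ];
  for (const re of patterns) {
    const m = url.match(re);
    if (m) return m[1];
  }
  return null;
}
```

Both workflow paths can share this one function, since the sheet rows and the webhook payload both ultimately carry a URL string.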
by Oneclick AI Squad
This n8n workflow monitors blood stock levels daily and sends alerts when availability is low. It fetches data from a Google Sheet, checks the stock status, and notifies via WhatsApp.

## Key features
- **Daily monitoring**: Checks blood stock every day.
- **Automated alerts**: Sends notifications when stock is low.
- **Real-time updates**: Uses live data from Google Sheets.
- **Efficient delivery**: Alerts are sent instantly via WhatsApp.
- **Continuous check**: Loops to ensure ongoing monitoring.

## Workflow process
1. **Daily Check Blood Stock**: Triggers the workflow daily.
2. **Fetch Blood Stock**: Reads data from a Google Sheet.
3. **Get All Stock**: Collects all available blood stock details.
4. **Check Stock Availability**: Analyzes stock levels against the low thresholds.
5. **Send Alert Message**: Sends WhatsApp alerts if stock is low.

## Sheet columns
- **Blood Type**: Type of blood (e.g., A+, O-).
- **Quantity**: Current stock amount.
- **Threshold**: Minimum acceptable stock level.
- **Last Updated**: Date and time of the last update.
- **Status**: Current status (e.g., Low, Sufficient).

## Setup instructions
1. **Import workflow**: Add the workflow to n8n via the import option.
2. **Configure sheet**: Set up a Google Sheet with blood stock data.
3. **Set up WhatsApp**: Configure WhatsApp API credentials in n8n.
4. **Activate**: Save and enable the workflow.
5. **Test**: Simulate low stock to verify alerts.

## Requirements
- **n8n instance**: Hosted or cloud-based n8n setup.
- **Google Sheets**: Access with stock data.
- **WhatsApp API**: Integration for sending alerts.
- **Admin access**: For monitoring and updates.

## Customization options
- **Adjust threshold**: Change the low-stock limits.
- **Add channels**: Include email or SMS alerts.
- **Update frequency**: Modify the daily trigger time.
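The Check Stock Availability step can be sketched as a comparison of the Quantity and Threshold columns described above (the row values and alert wording here are illustrative):

```javascript
// Return one alert line per blood type at or below its threshold;
// an empty array means no WhatsApp message needs to be sent.
function findLowStock(rows) {
  return rows
    .filter((r) => Number(r.Quantity) <= Number(r.Threshold))
    .map((r) => `⚠️ ${r["Blood Type"]}: ${r.Quantity} units left (threshold ${r.Threshold})`);
}
```

Coercing both columns with `Number()` guards against the sheet returning quantities as strings.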