by Jayraj Pamnani
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Workflow Description: Startup Idea Finder (n8n)

What This Workflow Does
This n8n workflow helps entrepreneurs discover startup ideas by automatically scraping top posts from multiple subreddits that often discuss unmet needs or problems (e.g., posts containing phrases like “Why is there no tool that”, “I wish there was an app for”, “someone should make”, etc.). The workflow extracts key information from these posts and sends it to Google’s Gemini 1.5 Flash-8b AI model, which analyzes the problem and suggests possible solutions or startup ideas. All relevant data and AI-generated insights are then saved to a Google Sheet for easy review and tracking.

How It Works (Step-by-Step)
1. Manual Trigger: The workflow starts with a manual trigger.
2. Reddit Scraping: It queries multiple subreddits for top posts matching specific keywords that indicate a problem or unmet need.
3. Merge & Edit Fields: The results are merged and filtered to keep only the necessary fields: title, selftext, ups, created, and url.
4. AI Analysis: The filtered post data is sent to the Gemini 1.5 Flash-8b model with a prompt asking for: an explanation of the core problem, whether existing solutions exist, a new startup idea if not, the target user, and an implementation overview.
5. Google Sheets Logging: Both the original post data and the AI’s output are appended as a new row in a Google Sheet for future reference.

APIs & Credentials Needed
To use this workflow, you must set up the following credentials in your n8n instance:
- Reddit API: for scraping subreddit posts.
- Google Gemini (PaLM) API: for AI-powered analysis and idea generation.
- Google Sheets API: for saving results to your spreadsheet.
Google Sheets Setup
Before running the workflow, create a Google Sheet with the following columns (in this order): title, selftext, ups, created, url, output. The workflow will automatically append new rows with the scraped post data and the AI-generated output.

Summary
This workflow is a powerful tool for anyone looking to systematically discover and analyze real-world problems discussed online, and to generate actionable startup ideas using AI. Just set up your credentials, prepare your Google Sheet, and you’re ready to start finding your next big idea!
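The “Merge & Edit Fields” step above keeps only five fields per post. A minimal sketch of that step as plain JavaScript (the item shape mirrors Reddit post fields; how it is wired into the Code node is an assumption):

```javascript
// Keep only the fields the workflow logs to Google Sheets.
// In n8n this is the field-filtering ("Edit Fields") step.
function pickFields(post) {
  const { title, selftext, ups, created, url } = post;
  return { title, selftext, ups, created, url };
}

// In a Code node this would run per item, e.g.:
// return $input.all().map((item) => ({ json: pickFields(item.json) }));
```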
by Hirokazu Kawamoto
How it works
Send a corporate website URL via chat. The AI will investigate the company website on your behalf and return the extracted company information. Since this is set up as a conversational workflow, retrying or trying another URL is simple.

How to use
To get started, set up the credential in the Gemini node attached to the AI Agent node. You can obtain an API key from Google AI Studio. Once configured, the workflow will run when you send a corporate website URL (e.g., https://example.com/) via chat.

Customizing this workflow
You can change the settings in the Config node: modify targetCompanyFields to customize which company data fields are extracted, or modify language to receive the results in a language other than English.
by Cordexa Technologies
Description
Automatically fetch, rank, and summarize the top financial stories from curated RSS feeds each day, then deliver a concise AI-written digest to Telegram and log the run to Google Sheets.

What This Template Does
- Runs on a daily schedule and reads 8 curated financial RSS feeds.
- Loops through each feed with the native RSS Read node to pull recent articles.
- Tags each article with source name and source tier for ranking.
- Scores and ranks stories by source quality, recency, and finance-related keyword relevance.
- Sends the top stories to NVIDIA NIM using Mistral Large 3 to generate a concise digest.
- Builds a Telegram-ready message with a customizable header and footer.
- Sends the digest to a Telegram personal chat or public channel.
- Logs successful runs to Digest_Log and delivery failures to Error_Log in Google Sheets.

Key Benefits
- Free-tier-friendly setup using NVIDIA NIM, Telegram Bot API, and Google Sheets.
- Easy to adapt to other niches by changing RSS feeds, ranking logic, and branding.
- Reduces manual news scanning by producing a concise daily digest automatically.
- Works for both personal use and public Telegram channel delivery.
- Includes logging for both successful runs and delivery failures.

Features
- 8 pre-configured financial RSS feeds with tier-based scoring.
- Native n8n RSS Read node for feed ingestion.
- NVIDIA NIM integration using Mistral Large 3 via HTTP Header Auth.
- Customizable digest header and footer in a single Code node.
- Google Sheets logging with Digest_Log and Error_Log tabs.
- Error handling for Telegram delivery failures.

Requirements
- An n8n instance.
- An NVIDIA NIM API key.
- A Telegram bot token and a chat ID or channel username.
- A Google account with Google Sheets OAuth connected in n8n.

Target Audience
- Investors and traders who want a curated daily briefing.
- Finance professionals tracking market-moving news.
- Founders and operators monitoring business and macro signals.
- No-code builders creating niche Telegram digest workflows.
- Agencies building white-label news automations.

Step-by-Step Setup Instructions
1. Create an HTTP Header Auth credential for NVIDIA NIM: header name Authorization, value Bearer YOUR_NVIDIA_API_KEY.
2. Create a Telegram bot with @BotFather and get your chat ID or channel username.
3. Open RSS Feed Config1 and set telegramBotToken, telegramChatId, googleSheetId, and any RSS feeds you want to add or replace.
4. Create a Google Sheet with two tabs: Digest_Log and Error_Log.
5. Connect Google Sheets OAuth in n8n.
6. Run the workflow manually once before activating the schedule.
7. Confirm one successful run appears in Digest_Log.
8. Trigger one failure test and confirm a row appears in Error_Log.

Built by Cordexa Technologies
https://cordexa.tech
cordexatech@gmail.com
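The tier/recency/keyword ranking described above could be sketched roughly as follows. The weights, the tier mapping, and the keyword list here are illustrative assumptions, not the template’s actual values:

```javascript
// Illustrative article scoring: higher-tier sources, fresher articles, and
// finance-keyword matches all raise the score. Weights are assumptions.
const KEYWORDS = ["fed", "inflation", "earnings", "rates", "stocks", "market"];

function scoreArticle(article, now = Date.now()) {
  // Source quality: tier 1 sources weighted highest.
  const tierScore = { 1: 30, 2: 20, 3: 10 }[article.tier] || 0;

  // Recency: linear decay to zero over 24 hours.
  const ageHours = (now - new Date(article.pubDate).getTime()) / 36e5;
  const recencyScore = Math.max(0, 24 - ageHours);

  // Keyword relevance: 5 points per matched finance keyword.
  const text = `${article.title} ${article.summary || ""}`.toLowerCase();
  const keywordScore = KEYWORDS.filter((k) => text.includes(k)).length * 5;

  return tierScore + recencyScore + keywordScore;
}
```

Sorting the tagged articles by this score descending and slicing the top N gives the shortlist handed to the model.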
by Ruthwik
⚡ Next-Gen Customer Support: Two-Way WhatsApp + Telegram Integration for 10k+ Clients

Who is this workflow for
This workflow is designed for **customer support teams, e-commerce founders, and operations managers** who want to handle thousands of customer queries seamlessly. Instead of building a brand-new chat application, it leverages WhatsApp (where customers already are) and Telegram (where your support team operates) to create a scalable, topic-based support system. If you are a brand handling 1000s of daily WhatsApp customer messages and need a structured way to map each customer into a dedicated support thread without chaos, this workflow is for you.

What it does / How it works
This two-way n8n automation bridges WhatsApp and Telegram by creating one Telegram forum topic per customer and syncing messages both ways.

Incoming WhatsApp → Telegram
- When a new WhatsApp message arrives, the workflow checks if the customer already has a topic in Telegram.
- If yes → the message is forwarded into that existing topic.
- If no → a new topic is created automatically, the mapping is saved in the database, and the message is posted there.
- Result: every customer has a dedicated thread in your Telegram supergroup.

Outgoing Telegram → WhatsApp
- When a support agent replies in a Telegram topic, the workflow looks up the linked WhatsApp number.
- The reply is sent back to the customer on WhatsApp, preserving context.
- Result: two-way synced conversations without building a custom app.

How to set it up
1. Configure WhatsApp Cloud API: create a Meta Developer account, register a WhatsApp Business number, and generate an access token and phone number ID.
2. Configure Telegram Bot: use BotFather to create a bot and enable it in a **Telegram Supergroup with Topics**. Get the chat_id and allow the bot to create/send messages in topics.
3. Database (Supabase/Postgres): create a table wa_tg_threads to map phone_e164 ↔ telegram_topic_id ↔ supergroup_id.
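The lookup-or-create logic behind the wa_tg_threads mapping can be sketched as below. A Map stands in for the Supabase table, and `createTopic` stands in for the Telegram createForumTopic call; both are simplifications for illustration:

```javascript
// Look up an existing Telegram topic for a WhatsApp number, or create one
// and persist the mapping. `db` is an in-memory stand-in for the
// wa_tg_threads table; `createTopic` is an injected async function that
// would call Telegram's createForumTopic API in the real workflow.
async function getOrCreateTopic(db, phoneE164, supergroupId, createTopic) {
  const existing = db.get(phoneE164);
  if (existing !== undefined) {
    return { topicId: existing, created: false };
  }
  const topicId = await createTopic(phoneE164, supergroupId);
  db.set(phoneE164, topicId); // persist: phone_e164 -> telegram_topic_id
  return { topicId, created: true };
}
```

The same mapping is read in reverse (topic id → phone number) when an agent replies in a topic and the reply must go back out over WhatsApp.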
n8n Workflows
- Workflow A: WhatsApp → Telegram. Trigger: WhatsApp Webhook. Steps: look up the customer → if a topic exists, send to it; otherwise create a topic → save the mapping → forward the message.
- Workflow B: Telegram → WhatsApp. Trigger: Telegram Webhook. Steps: filter only topic replies → look up the mapping → send the WhatsApp message.

Testing
- Send a WhatsApp message → check that a Telegram topic is created.
- Reply in the Telegram topic → ensure the customer receives the WhatsApp reply.

Requirements
- A free or paid n8n instance (self-hosted or cloud).
- **WhatsApp Cloud API credentials** (phone number ID + access token).
- **Telegram Bot token** with access to a **Supergroup with Topics** enabled.
- A Postgres/Supabase database to store thread mappings.
- Basic familiarity with editing HTTP Request nodes in n8n.

How to customize the workflow
- **Brand personalization:** pre-populate first message templates (thank you, order status, delivery updates).
- **Routing rules:** assign specific agents to certain topics by ID ranges.
- **Integrations:** extend to CRMs (HubSpot, Zoho) or support platforms (Freshdesk, Zendesk).
- **Notifications:** push high-priority WhatsApp queries into Slack/Teams for instant alerts.
- **Archival:** auto-close inactive topics after N days and mark customers as dormant.

Why Telegram instead of building a new app
The client's requirement was clear: **use an existing, reliable, and scalable chat platform** instead of building a new app from scratch.
- **Telegram Supergroups with Topics** scale to 100,000+ members and millions of messages, making them ideal for managing 10k+ customer threads.
- Agents don't need to install or learn a new tool; they continue inside Telegram, which is fast, free, and mobile-friendly.
- Building a custom chat app would require authentication, push notifications, scaling infra, and UX, all of which Telegram solves out of the box.
This decision **saves development cost, accelerates deployment, and provides proven scalability**.
Why this improves support productivity
- **Organized by customer:** each WhatsApp number has its own Telegram topic.
- **No missed messages:** agents can quickly scroll topics without drowning in one endless chat.
- **Two-way sync:** replies flow back to WhatsApp seamlessly.
- **Scales automatically:** handle 10k+ conversations without losing track.
- **Leverages existing tools:** WhatsApp (customers) + Telegram (agents).

Result: **faster responses, better tracking, and zero need to reinvent chat software.**
by Kevin Meneses
How it works
- Runs on a schedule and iterates a watchlist of symbols (e.g., BTC/ETH/SOL).
- For each symbol, requests intraday 1h OHLCV from EODHD.
- A Code node computes Wilder’s RSI(14) and detects 30/70 crossings.
- When a signal appears, the bot sends a Telegram alert (HTML message) with price, RSI (prev → now), timestamp, and a “View chart” button that opens the pair on TradingView (BINANCE/USD).

Set up steps (≈10–15 min)
1. Prereqs: n8n (cloud or self-hosted), EODHD API key, Telegram bot + your chat_id.
2. Env vars: set EODHD_TOKEN and TELEGRAM_CHAT_ID on your n8n instance.
3. Credentials: add your Telegram credential (bot token).
4. Import the workflow JSON.
5. Edit Fields node: adjust the symbol array to your watchlist.
6. Schedule Trigger: choose how often to run (e.g., every 5–10 min).
7. Test: temporarily flip the Code node’s FORCE_ALERT flag to true to verify Telegram delivery, then set it back to false.
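For reference, Wilder’s RSI(14) with 30/70 crossing detection can be computed like this. This is a sketch of the standard formula; the template’s own Code node may differ in naming and alert wording:

```javascript
// Wilder's RSI: seed the average gain/loss with a simple mean over the
// first `period` price changes, then apply Wilder smoothing.
function wilderRSI(closes, period = 14) {
  if (closes.length < period + 1) return [];
  let gain = 0, loss = 0;
  for (let i = 1; i <= period; i++) {
    const d = closes[i] - closes[i - 1];
    if (d > 0) gain += d; else loss -= d;
  }
  let avgGain = gain / period, avgLoss = loss / period;
  const rsi = (g, l) => (l === 0 ? 100 : 100 - 100 / (1 + g / l));
  const out = [rsi(avgGain, avgLoss)];
  for (let i = period + 1; i < closes.length; i++) {
    const d = closes[i] - closes[i - 1];
    avgGain = (avgGain * (period - 1) + Math.max(d, 0)) / period;
    avgLoss = (avgLoss * (period - 1) + Math.max(-d, 0)) / period;
    out.push(rsi(avgGain, avgLoss));
  }
  return out;
}

// Signal only when RSI crosses a threshold between the previous and
// current bar, so each crossing fires a single alert.
function crossSignal(prev, now, low = 30, high = 70) {
  if (prev >= low && now < low) return "oversold";     // crossed down through 30
  if (prev <= high && now > high) return "overbought"; // crossed up through 70
  return null;
}
```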
by oka hironobu
Who is this for
Team leads, project managers, and operations staff who want to automate meeting documentation. Useful for any team that records meetings and needs structured notes with clear action items.

What this workflow does
This workflow accepts a meeting recording upload via a web form. The recording is uploaded to the Gemini Files API for audio analysis. Gemini AI generates a structured summary including key decisions, action items with assignees, and follow-up topics. A Notion page is created with the complete notes, and the team is notified on Slack with a summary and the action item list.

Setup
1. Add a Google Gemini API credential for file upload and audio analysis.
2. Add a Notion API credential and create a database with columns: Title, Date, Summary, Action Items, Status.
3. Add a Slack OAuth2 credential and set your meetings channel.

Requirements
- Google Gemini API key (supports audio file analysis)
- Notion workspace with API integration enabled
- Slack workspace with OAuth2 app

How to customize
- Edit the analysis prompt in "Analyze recording with Gemini" to focus on specific meeting types (standup, retrospective, planning).
- Change the Gemini model to a larger variant for longer recordings.
- Add a Google Calendar integration to automatically match recordings to calendar events.

Important disclaimer
AI-generated summaries may not capture every detail or nuance from the recording. Always review the notes before sharing externally or making decisions based on them.
by Budi SJ
Personalized Weather Assistant with Google Calendar, WeatherAPI, AI & Telegram

This workflow automates the delivery of a personalized daily agenda by combining events from Google Calendar with real-time local weather conditions. Using AI-powered summarization and Telegram integration, users receive a friendly and motivating message every morning with everything needed to plan the day effectively. Perfect for professionals or individuals who want an overview of their schedule and weather in one place.

🛠️ Key Features
- Triggered automatically every morning by schedule
- Fetches Google Calendar events for today and tomorrow
- Retrieves weather conditions (temperature, humidity, wind, UV index) using WeatherAPI based on event location
- Uses an AI Agent to generate a concise, human-friendly agenda summary
- Sends the summary via Telegram bot
- If no location is available, delivers a simplified agenda without weather

🔧 Requirements
- Google Calendar OAuth2 credentials connected to n8n
- WeatherAPI key (weatherapi.com)
- Telegram Bot Token and user chat_id
- OpenRouter API Key (openrouter.ai)

🧩 Setup Instructions
1. Timezone: adjust the timezone in the Set Timezone node.
2. Google Calendar: add Google Calendar OAuth2 credentials and set your primary calendar ID in the Get many events node.
3. WeatherAPI: replace the API key in the HTTP Request node with your WeatherAPI key.
4. OpenRouter: create credentials in n8n and connect them to the OpenRouter Chat Model nodes.
5. Telegram: add your bot token and chat_id to both Telegram nodes.
6. Deploy: activate the workflow. You’ll start receiving personalized daily Telegram messages.
by Alex Berman
Who is this for
This template is for B2B sales teams, SDRs, growth marketers, and founders who maintain a spreadsheet of prospects and need verified contact details (emails and mobile numbers) without manual research.

How it works
1. Reads a list of contacts (first name, last name, company domain) from a Google Sheet.
2. Formats the contacts and submits them to the ScraperCity Email Finder API to discover business email addresses.
3. Polls until the email-finder job completes, then downloads and parses the results.
4. Submits the found emails to the ScraperCity Mobile Finder API to look up phone numbers.
5. Polls until the mobile-finder job completes, then downloads and parses the results.
6. Submits all found emails to the ScraperCity Email Validator API for deliverability and catch-all checks.
7. Polls until validation completes, merges all enriched data together, and writes the final enriched rows back to a Google Sheet.

How to set up
1. Add your ScraperCity API key as an HTTP Header Auth credential named "ScraperCity API Key".
2. Set your input Google Sheets document ID and sheet name in the "Configure Workflow" node.
3. Set your output Google Sheets document ID and sheet name in the same node.
4. Click "Execute workflow" to run.

Requirements
- ScraperCity account with Email Finder, Mobile Finder, and Email Validator products enabled.
- Google Sheets OAuth2 credential connected to n8n.
- Input sheet with columns: first_name, last_name, domain.

How to customize the workflow
- Swap Google Sheets for Airtable or a webhook trigger.
- Add a Filter node after validation to keep only emails with status "valid".
- Extend the output Set node to include additional fields from either API response.
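Each of the three enrichment stages follows the same submit-then-poll pattern. A generic sketch of the polling half, with the ScraperCity status check abstracted into an injected `checkStatus` function (the real endpoint URLs and response shapes are not shown in this template, so the `{ done, result }` shape is an assumption):

```javascript
// Poll an async job until it reports completion, or give up after maxTries.
// `checkStatus` is an injected async function returning { done, result };
// in the workflow it would be an HTTP Request to the job-status endpoint.
async function pollJob(checkStatus, { intervalMs = 5000, maxTries = 60 } = {}) {
  for (let attempt = 1; attempt <= maxTries; attempt++) {
    const status = await checkStatus(attempt);
    if (status.done) return status.result;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`job did not complete after ${maxTries} polls`);
}
```

In n8n this pattern is usually built from a Wait node plus an If node looping back to the status request; the function above is just the equivalent logic in one place.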
by Abrar Sami
How it works
1. Fetches a blog post’s HTML from your blog URL using an HTTP Request node
2. Extracts readable content using Cheerio (Code node)
3. Saves the raw blog text to Airtable
4. Translates the content to a language of your choice using Google Translate
5. Updates the same Airtable record with the translated version in a different column

Set up steps
- Estimated setup time: 15–20 minutes (includes connecting Airtable and Google Translate credentials)
- You’ll need an Airtable base with HTML and TRANSLATED fields, or use this pre-made base: Airtable Template
- Simply add your blog post URL inside the HTTP Request node
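The template uses Cheerio for the extraction step; as a rough, dependency-free illustration of the same idea, readable text can be pulled out of raw HTML like this (a simplified sketch, not a replacement for a proper HTML parser):

```javascript
// Strip scripts, styles, and tags from raw HTML and collapse whitespace,
// leaving approximately the readable text. A real Code node using Cheerio
// would instead select content elements and call .text().
function extractText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ")       // drop remaining tags
    .replace(/&nbsp;/g, " ")        // decode a few common entities
    .replace(/&amp;/g, "&")
    .replace(/&lt;/g, "<")
    .replace(/&gt;/g, ">")
    .replace(/\s+/g, " ")
    .trim();
}
```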
by Mohammad Jibril
Overview
This template creates a smart FAQ bot on Telegram, powered by Google Gemini for intelligent answers and Supabase to store user data. The workflow can distinguish between new and existing users.

How It Works
1. Trigger: the workflow starts when a user sends any message to the Telegram bot.
2. Check User: it looks up the user's chat_id in a Supabase telegram_users table.
3. Route (New vs. Existing):
   - New user (true path): if the user is not found, the workflow saves their chat_id to Supabase and sends a one-time welcome message.
   - Existing user (false path): if the user exists, the workflow proceeds to the AI step.
4. Generate Answer: it loads a predefined FAQ context and combines it with the user's question. This is sent to Google Gemini via the AI Agent node.
5. Send Response: the AI-generated answer is sent back to the user on Telegram.

Setup Instructions
1. Telegram: connect your Telegram credentials to the Telegram Trigger and both Send a text message nodes.
2. Supabase: connect your Supabase credentials to the Find user in DB and Create a row nodes. You MUST have a table named telegram_users with a column named chat_id (type: text or varchar).
3. Google Gemini: connect your Google Gemini (PaLM API) credentials to the Google Gemini Chat Model node.
4. Customization (REQUIRED): open the Set FAQ Context node and replace its contents with questions (Q) and answers (A) appropriate for your bot. Change the text in the Send a text message (Welcome Message) node as you want.
by SOLOVIEVA ANNA
Who this workflow is for
This template is for teams who want a lightweight “daily icebreaker” in Slack and creators who’d like to build a reusable trivia database over time. It works well for remote teams, communities, and any workspace that enjoys a quick brain teaser each day.

What this workflow does
The workflow fetches a random multiple-choice question from the Open Trivia Database (OpenTDB), posts a nicely formatted trivia message to a Slack channel, and logs the full question and answers into a Google Sheets spreadsheet. Over time, this creates a searchable “trivia archive” you can reuse for quizzes, content, or community events.

How it works
1. A Schedule Trigger runs once per day at a time you define.
2. A Set node randomly chooses a difficulty level (easy, medium, or hard).
3. A Switch node routes to the matching OpenTDB HTTP request.
4. Each branch normalizes the API response into common fields (timestamp, date, difficulty, category, question, correct, incorrect, messageTitle, messageBody).
5. A Merge node combines the three branches into a single stream.
6. Slack posts the trivia message.
7. Google Sheets appends the same data as a new row.

How to set up
1. Connect your Slack OAuth2 credentials and choose a target channel.
2. Connect your Google Sheets credentials and select the spreadsheet and sheet.
3. Adjust the schedule (time and frequency) to match your use case.

How to customize
- Change the Slack message format (for example, add emojis or hints).
- Filter categories or difficulty levels instead of picking them fully at random.
- Add additional logging (e.g., user reactions, answer stats) in Sheets or another datastore.
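The random-difficulty pick and the per-difficulty OpenTDB request can be sketched as below. The URL follows OpenTDB’s public api.php format; the function names are illustrative:

```javascript
// Randomly choose a difficulty; `rand` is injectable so the choice is
// testable (defaults to Math.random() as a Set node expression would).
function pickDifficulty(rand = Math.random()) {
  const levels = ["easy", "medium", "hard"];
  return levels[Math.floor(rand * levels.length)];
}

// Build the OpenTDB request URL for one multiple-choice question.
function buildUrl(difficulty) {
  return `https://opentdb.com/api.php?amount=1&type=multiple&difficulty=${difficulty}`;
}
```

Note that OpenTDB returns HTML-encoded strings (e.g. `&quot;`), so the normalization branches typically decode entities before posting to Slack.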
by DIGITAL BIZ TECH
SharePoint → Supabase → Google Drive Sync Workflow

Overview
This workflow is a multi-system document synchronization pipeline built in n8n, designed to automatically sync and back up files between Microsoft SharePoint, Supabase/Postgres, and Google Drive. It runs on a scheduled trigger, compares SharePoint file metadata against your Supabase table, downloads new or updated files, uploads them to Google Drive, and marks records as completed, keeping your databases and storage systems in sync.

Workflow Structure
- **Data Source:** SharePoint REST API for recursive folder and file discovery.
- **Processing Layer:** n8n logic for filtering, comparison, and metadata normalization.
- **Destination Systems:** Supabase/Postgres for metadata, Google Drive for file backup.

SharePoint Sync Flow (Frontend Flow)
- **Trigger:** Schedule Trigger. Runs at fixed intervals (customizable) to start synchronization.
- **Fetch Files:** Microsoft SharePoint HTTP Request. Recursively retrieves folders and files using SharePoint’s REST API: /GetFolderByServerRelativeUrl(...)?$expand=Files,Folders,Folders/Files,Folders/Folders/Folders/Files
- **Filter Files:** filter files. A Code node that flattens nested folders and filters unwanted file types: excludes system or temporary files (~$) and the extensions .db, .msg, .xlsx, .xlsm, .pptx.
- **Normalize Metadata:** normalize last modified date. Ensures a consistent Last_modified_date format for accurate comparison.
- **Fetch Existing Records:** Supabase (Get). Retrieves current entries from n8n_metadata to compare against SharePoint files.
- **Compare Datasets:** Compare Datasets. Detects new or modified files based on UniqueId, Last_modified_date, and Exists, and routes only changed entries forward for processing.

File Processing Engine (Backend Flow)
- **Loop:** Loop Over Items2. Iterates through each new or updated file detected.
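The “filter files” Code node described above can be sketched like this, assuming the SharePoint `$expand` response shape with nested `Files` and `Folders` arrays (the exact property names in the live response may differ):

```javascript
// Extensions the sync deliberately skips, per the workflow's filter rules.
const EXCLUDED = [".db", ".msg", ".xlsx", ".xlsm", ".pptx"];

// Recursively flatten a SharePoint folder tree into one list of files.
function flattenFiles(folder, out = []) {
  for (const f of folder.Files || []) out.push(f);
  for (const sub of folder.Folders || []) flattenFiles(sub, out);
  return out;
}

// Drop system/temporary files (~$ prefix) and excluded extensions.
function keepFile(file) {
  const name = file.Name || "";
  if (name.startsWith("~$")) return false;
  return !EXCLUDED.some((ext) => name.toLowerCase().endsWith(ext));
}
```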
- **Build Metadata:** get metadata and Set metadata. Constructs the final metadata fields: file_id, file_title, file_url, file_type, foldername, last_modified_date. Generates fileUrl using UniqueId and ServerRelativeUrl if missing.
- **Upsert Metadata:** Insert Document Metadata. Inserts or updates file records in Supabase/Postgres (n8n_metadata table). Operation: upsert with id as the primary matching key.
- **Download File:** Microsoft SharePoint HTTP Request1. Fetches the binary file directly from SharePoint using its ServerRelativeUrl.
- **Rename File:** rename files. Renames each downloaded binary file to its original file_title before upload.
- **Upload File:** Upload file. Uploads the renamed file to Google Drive (My Drive → root folder).
- **Mark Complete:** Postgres. Updates the Supabase/Postgres record, setting Loading Done = true.
- **Optional Cleanup:** Supabase1. Deletes obsolete or invalid metadata entries when required.

Integrations Used

| Service | Purpose | Credential |
|---------|---------|------------|
| Microsoft SharePoint | File retrieval and download | microsoftSharePointOAuth2Api |
| Supabase / Postgres | Metadata storage and synchronization | Supabase account 6 ayan |
| Google Drive | File backup and redundancy | Google Drive account 6 rn dbt |
| n8n Core | Flow control, dataset comparison, batch looping | Native |

System Prompt Summary
> “You are a SharePoint document synchronization workflow. Fetch all files, compare them to database entries, and only process new or modified files. Download files, rename correctly, upload to Google Drive, and mark as completed in Supabase.”

Workflow rule summary:
> “Maintain data integrity, prevent duplicates, handle retries gracefully, and continue on errors.
> Skip excluded file types and ensure reliable backups between all connected systems.”

Key Features
- Scheduled automatic sync across SharePoint, Supabase, and Google Drive
- Intelligent comparison to detect only new or modified files
- Idempotent upsert for consistent metadata updates
- Configurable file exclusion filters
- Safe rename + upload pipeline for clean backups
- Error-tolerant and fully automated operation

Summary
> A reliable SharePoint-to-Google Drive synchronization workflow built with n8n, integrating Supabase/Postgres for metadata management. It automates file fetching, filtering, downloading, uploading, and marking as completed, ensuring your data stays mirrored across platforms. Perfect for enterprises managing document automation, backup systems, or cross-cloud data synchronization.

Need Help or More Workflows?
Want to customize this workflow for your organization? Our team at Digital Biz Tech can extend it for enterprise-scale document automation, RAG, and social media automation. We can help you set it up for free, from connecting credentials to deploying it live.

Contact: rajeet.nair@digitalbiz.tech
Website: https://www.digitalbiz.tech
LinkedIn: https://www.linkedin.com/company/digital-biz-tech/
You can also DM us on LinkedIn for any help.