by Felix
## How It Works

This workflow automates multi-currency expense tracking via Telegram. Send a receipt photo to your bot, and it automatically extracts the invoice details, converts the amount to EUR using a live exchange rate, and logs everything straight into Google Sheets.

Flow overview:
1. User sends a receipt photo via Telegram
2. easybits Extractor reads the document and returns structured data
3. The data is normalised and cleaned
4. The exchange rate is fetched (with fallback if needed)
5. The amount is converted to EUR
6. The result is appended to Google Sheets

## Step-by-Step Setup Guide

### 1. Set Up Your easybits Extractor Pipeline

Before connecting this workflow, you need a configured extraction pipeline on easybits.

1. Go to extractor.easybits.tech and click "Create a Pipeline".
2. Fill in the Pipeline Name and Description – describe the type of document you're processing (e.g. "Invoice / Receipt").
3. Upload a sample receipt or invoice as your reference document.
4. Click "Map Fields" and define the following fields to extract:
   - invoice_number (String) – the unique identifier of the invoice, e.g. INV-20240301
   - currency (String) – the currency code found on the invoice, e.g. USD
   - amount (Number) – the total amount due on the invoice, e.g. 149.99
5. Click "Save & Test Pipeline" in the Test tab to verify the extraction works correctly.

### 2. Connect the easybits Node in n8n

Once you have finalized your pipeline, go back to your dashboard and click Pipelines in the left sidebar, then click "View Pipeline" on the pipeline you want to connect. On the Pipeline Details page, you will find:

- **API URL:** https://extractor.easybits.tech/api/pipelines/[YOUR_PIPELINE_ID]
- **API Key:** your unique authentication token

Copy both values and enter them into the "easybits Extractor" HTTP Request node in the workflow.

> To keep in mind: each pipeline has its own API Key and Pipeline ID. If you have multiple pipelines (for example, one for receipts and one for invoices), you will need separate credentials for each.

> Important: when adding your API Key, set the Credential Type to Bearer Auth and paste your API Key as the Bearer Token value.

### 3. Connect Your Telegram Bot

1. Open the Telegram: Receipt Photo node.
2. Connect your Telegram Bot credentials (Bot Token from @BotFather).
3. Make sure "Download" is enabled under Additional Fields so the image binary is forwarded correctly.

### 4. Connect Google Sheets

1. Open the Append row in sheet node.
2. Connect your Google Sheets account via OAuth2.
3. Select your target spreadsheet and sheet. Make sure your sheet has at least these two columns: Vendor Name and Overall Due.

### 5. Activate the Workflow

1. Click the "Active" toggle in the top-right corner of n8n to enable the workflow.
2. Send a receipt photo to your Telegram bot to test it end to end.
3. Check your Google Sheet – a new row with the invoice reference and EUR amount should appear.
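The Bearer Auth setup from step 2 boils down to a single authenticated HTTP request. A minimal sketch of how the node's request is assembled (the pipeline ID and key below are placeholders, and the exact request body depends on your pipeline configuration):

```javascript
// Sketch of the request the "easybits Extractor" HTTP Request node sends.
// PIPELINE_ID and API_KEY are placeholders -- copy the real values from
// your Pipeline Details page.
function buildExtractorRequest(pipelineId, apiKey) {
  return {
    url: `https://extractor.easybits.tech/api/pipelines/${pipelineId}`,
    method: 'POST',
    headers: {
      // n8n's "Bearer Auth" credential type produces exactly this header
      Authorization: `Bearer ${apiKey}`,
    },
    // In the real node, the receipt image is attached as binary data.
  };
}
```

If extraction fails with a 401, this header is the first thing to check: the token must be the pipeline's own API Key, not one from another pipeline.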
by Yaron Been
# Amazon Competitive Gap & Assortment Intelligence

## Workflow Description
This workflow automatically scrapes competitor product data from Amazon and identifies gaps in your assortment, pricing, and positioning. It helps merchandising and product teams spot opportunities they are missing before competitors fill them.

## Overview
This workflow uses Bright Data to scrape Amazon product pages, then normalizes the data and feeds it to AI for competitive gap analysis. It identifies:

- Missing product variants
- Bundle expansion ideas
- Positioning gaps
- Pricing weaknesses

Each opportunity is scored and prioritized — high-impact gaps are routed to dedicated sheets, while standard opportunities are logged separately. All results are sent to Google Sheets dashboards for structured decision-making.

## Tools Used
- **n8n**: automation platform that orchestrates the workflow
- **Bright Data**: scrapes Amazon product data at scale without getting blocked
- **OpenRouter**: AI-powered competitive clustering, gap detection, and opportunity scoring
- **Google Sheets**: logs missing variants, bundle opportunities, pricing gaps, and errors

## How to Install
1. **Import the Workflow.** Download the .json file and import it into your n8n instance.
2. **Configure Bright Data.** Add your Bright Data API credentials to the Bright Data node.
3. **Configure OpenRouter.** Add your OpenRouter API key for AI competitive analysis.
4. **Set Up Google Sheets.** Create a spreadsheet following the "Google Sheets Setup" sticky note inside the workflow, then connect each Google Sheets node to your document.
5. **Customize.** Edit the configuration node to define:
   - Target Amazon product URL
   - Category scope
   - Competitive depth
   - Opportunity scoring thresholds

## Use Cases
- **Merchandising Teams:** discover product variants competitors carry that are missing from your catalog.
- **Pricing Analysts:** detect pricing gaps and positioning weaknesses relative to competitors in your category.
- **Product Managers:** find bundle and cross-sell opportunities based on real competitive data.
- **Category Managers:** track assortment gaps across an entire product category to prioritize expansion.
- **Ecommerce Strategy:** build a data-driven competitive intelligence layer for smarter assortment and pricing decisions.

## Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

## Tags
#n8n #automation #brightdata #webscraping #competitiveanalysis #pricingintelligence #assortmentplanning #ecommerce #amazondata #productgaps #pricingstrategy #competitortracking #merchandising #bundleopportunities #n8nworkflow #workflow #nocode #businessintelligence #marketresearch #pricingoptimization #categorymanagement #retailintelligence #competitivelandscape #productexpansion #ecommerceautomation
by Madame AI
# Auto-post curated remote jobs to Telegram with BrowserAct & Gemini

This workflow acts as an intelligent job board curator for your Telegram community. It scrapes multiple sources (Remotive, SimplyHired), uses AI to filter out spam and low-quality listings, formats the best jobs into professional posts, and publishes them automatically on a schedule.

## Target Audience
Community managers running job boards, recruiters, and developers building niche job aggregators.

## How it works
1. **Scheduled Fetch:** every 12 hours, the workflow triggers BrowserAct to scrape the latest job listings from Remotive and SimplyHired in parallel.
2. **Merge & Aggregate:** the raw job data from both sources is combined into a single list.
3. **AI Curation:** an AI Agent (using Google Gemini) reviews each job. It removes duplicates, filters out gigs paying less than $20/hr, and discards low-quality descriptions.
4. **Format Content:** the AI rewrites the remaining jobs into clean, engaging HTML summaries suitable for Telegram.
5. **Publish:** the workflow loops through the curated list and sends each job to your Telegram channel, pausing between messages to avoid rate limits.

## How to set up
1. **Configure Credentials:** connect your Telegram, BrowserAct, and Google Gemini accounts in n8n.
2. **Prepare BrowserAct:** ensure you have the Automated Remote Job Fetching & Filtering for Telegram Feed templates (for Remotive and SimplyHired) saved in your BrowserAct account.
3. **Configure Telegram:** ensure your bot is an admin in the target channel and add the Chat ID to the Send a text message node.
4. **Activate:** turn on the workflow.

## Requirements
- **BrowserAct** account with the **Automated Remote Job Fetching & Filtering for Telegram Feed** templates
- **Telegram** account (Bot Token)
- **Google Gemini** account

## How to customize the workflow
- **Add More Sources:** duplicate the BrowserAct nodes to scrape additional sites like We Work Remotely or LinkedIn.
- **Refine Filters:** update the system prompt in the AI Agent node to filter by specific keywords (e.g., "Python", "Senior") or locations.
- **Change Frequency:** adjust the Schedule Trigger to run more or less frequently depending on your needs.

## Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video
Build an AI-Powered Remote Job Aggregator (Remotive & SimplyHired)
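In this workflow the de-duplication and rate filtering are performed by the Gemini agent via its prompt, but two of those rules are deterministic and can be sketched as a plain Code-node pre-filter. The field names (`title`, `company`, `hourlyRate`) are assumptions, not the actual BrowserAct output schema:

```javascript
// Illustrative deterministic version of two of the AI curation rules:
// drop exact duplicates (by title + company, case-insensitive) and
// drop listings paying less than $20/hr.
function curateJobs(jobs, minRate = 20) {
  const seen = new Set();
  return jobs.filter((job) => {
    const key = `${job.title}|${job.company}`.toLowerCase();
    if (seen.has(key)) return false; // duplicate listing
    seen.add(key);
    return job.hourlyRate >= minRate; // filter low-paying gigs
  });
}
```

Running a cheap deterministic pass like this before the AI Agent can reduce token usage, leaving Gemini to judge the fuzzier quality criteria.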
by TAKUTO ISHIKAWA
# Judge AI math RPG answers and update quest status in Google Sheets

## Who it's for
This template is for educators, parents, or self-learners who want to gamify their study routines. It is Part 2 of the "AI Math RPG" system: it handles quiz judgment and status updates without spending expensive AI tokens on basic math checks.

## How it works
1. When a user submits their answer via an n8n Form, the workflow searches Google Sheets for their pending quest.
2. A fast, reliable IF node checks whether the user's answer matches the correct answer generated previously.
3. If correct, it updates the quest status to solved in Google Sheets (preventing infinite EXP farming) and then uses a Basic LLM Chain to generate an enthusiastic, RPG-style victory fanfare.
4. If incorrect, it returns a friendly "try again" message.

## How to set up
1. Ensure you have set up Part 1 of this system (Generate AI math RPG quests from study logs).
2. Connect your Google Sheets credential and replace ENTER_YOUR_SPREADSHEET_ID_HERE with your actual Sheet ID.
3. Connect your OpenAI or OpenRouter credential in the LLM node.
4. Open the "Quiz Answer Form" and enter your user ID and answer to test the battle!

## Requirements
- A Google account (for Google Sheets)
- An OpenAI or OpenRouter API key

## How to customize the workflow
You can easily customize the "Generate Victory Message" prompt to match different themes, like a sci-fi battle, a magic school, or historical events!
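The IF node's answer check is a simple comparison, but form input arrives as text, so comparing numerically avoids false negatives like "7.0" vs "7". A minimal sketch (the field names are illustrative, not the template's actual expressions):

```javascript
// Illustrative version of the IF node's answer comparison.
// Coercing both sides to numbers makes "7.0" match "7" while still
// rejecting non-numeric input.
function isCorrect(userAnswer, correctAnswer) {
  const a = Number(String(userAnswer).trim());
  const b = Number(String(correctAnswer).trim());
  return Number.isFinite(a) && a === b;
}
```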
by Panth1823
Stop manually checking dozens of career pages. This workflow runs every morning, hits the public APIs of 8+ ATS platforms and job boards, normalizes every listing into a single clean schema, and syncs everything to Supabase and Google Sheets, deduplicated and ready to query.

## Who it's for
Job seekers, recruiters, or career platforms that want a consolidated, up-to-date feed of openings from specific companies, without scraping, without API keys, and without paying for a jobs aggregator.

## How it works
1. A Schedule Trigger fires daily at 8 AM IST
2. A Company List code node defines all sources grouped by ATS type (Greenhouse, Lever, Ashby, Workable, SmartRecruiters, RemoteOK, and board APIs like Remotive, Himalayas, Arbeitnow, Jobicy)
3. A Prepare Request node builds the correct API URL and headers for each source, including multi-page pagination for SmartRecruiters (up to 500 jobs via offset) and Himalayas (up to 500 via page param)
4. An HTTP Request node fetches all sources in batches of 5
5. A Parse + Enrich + Filter node normalizes the divergent JSON structures into a unified schema — resolving ISO country codes, Indian city detection, salary parsing across all formats, and domain-based filtering
6. Deduplicated results are upserted to a Supabase (Postgres) table and written to Google Sheets

## ATS platforms supported
Greenhouse, Lever, Ashby, Workable, SmartRecruiters, RemoteOK, Remotive, Himalayas, Arbeitnow, Jobicy

## Normalized output schema
job_id, title, company, location, country, salary, job_type, apply_url, posted_at, source_ats

## Setup
1. Open the Company List node and edit the sources array: add or remove companies and their ATS slugs
2. Update ALLOWED_DOMAINS in the Parse node to filter by location or job type relevant to you
3. Add your Supabase credentials in the Postgres node and confirm your table name and schema match the output fields
4. Connect your Google Sheets credentials and set the target spreadsheet and sheet ID
5. (Optional) Adjust pagination limits per source in the Prepare Request node

## Requirements
- Self-hosted or cloud n8n instance
- Supabase project with a jobs table
- Google Sheets with headers matching the normalized schema
- No external API keys required: all sources use public endpoints
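The Parse + Enrich + Filter step maps each source's divergent JSON into the unified schema. A minimal sketch for one source, assuming input fields follow Greenhouse's public job-board API (`id`, `title`, `location.name`, `absolute_url`, `updated_at`); adapt the mapping per source:

```javascript
// Sketch of normalizing one Greenhouse-style listing into the workflow's
// unified schema. Country resolution and salary parsing happen in later
// enrichment steps, so they default to null here.
function normalizeGreenhouseJob(raw, company) {
  return {
    job_id: `greenhouse-${raw.id}`,
    title: raw.title,
    company,
    location: raw.location ? raw.location.name : null,
    country: null,   // resolved later via ISO country-code lookup
    salary: null,    // Greenhouse boards rarely expose salary
    job_type: null,
    apply_url: raw.absolute_url,
    posted_at: raw.updated_at,
    source_ats: 'greenhouse',
  };
}
```

Prefixing `job_id` with the ATS name keeps IDs unique across sources, which is what makes the downstream Supabase upsert deduplicate correctly.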
by David Olusola
# 📉 Buy the Dip Alert (Telegram/Slack/SMS)

## 📌 Overview
This workflow automatically notifies you when Bitcoin or Ethereum drops more than a set percentage in the last 24 hours. It’s ideal for traders who want to stay ready for buy-the-dip opportunities without constantly refreshing charts.

## ⚙️ How it works
1. Schedule Trigger — runs every 30 minutes (adjustable).
2. HTTP Request (CoinGecko) — fetches BTC & ETH prices and 24h % change.
3. Code Node (“Dip Check”) — compares changes against your dip threshold.
4. IF Node — continues only if the dip condition is true.
5. Notification Node — sends an alert via Telegram, Slack, or SMS (Twilio).

Example output:
> Dip Alert — BTC –3.2%, ETH –2.8%
> Not financial advice.

## 🛠 Setup Guide

**1) Dip threshold**
Open the Code node and change the line
`const DIP = -2.5; // trigger if 24h drop <= -2.5%`
to your preferred dip value (e.g., –5 for a 5% drop).

**2) Choose your alert channel**
- Telegram: add your bot token & chat ID.
- Slack: connect the Slack API & set the channel name.
- Twilio: configure SID, token, and from/to numbers.

**3) Test**
- Temporarily set DIP to 0 to force an alert.
- Run once from the Code node → confirm the alert message text.
- Execute the Notification node → confirm delivery to your channel.

## 🎛 Customization
- Cadence: change the Schedule Trigger (every 5m, 15m, hourly, etc.).
- Coins: extend the CoinGecko call (add solana, bnb) and update the Code node logic.
- Multiple alerts: duplicate the IF → Notification branch for different thresholds (minor vs major dip).
- Combine with the “Threshold Alerts” workflow to cover both upside breakouts and downside dips.
- Storage: log alerts into Google Sheets to track dip history.

## 🧩 Troubleshooting
- No alerts firing: check the CoinGecko API response in Execution Data.
- Wrong %: CoinGecko returns usd_24h_change directly — no math needed.
- Duplicate alerts: add a debounce using a Sheet/DB to store the last fired time.
- Telegram not posting: confirm the bot has access to your channel/group.
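The “Dip Check” logic above can be sketched as a small Code-node function. The response shape (`usd_24h_change` keyed by coin id) matches what CoinGecko’s `/simple/price` endpoint returns with `include_24hr_change=true`, but verify it against your own Execution Data:

```javascript
// Sketch of the "Dip Check" Code node: return one alert entry per coin
// whose 24h change is at or below the threshold.
const DIP = -2.5; // trigger if 24h drop <= -2.5%

function checkDips(prices, dip = DIP) {
  return Object.entries(prices)
    .filter(([, data]) => data.usd_24h_change <= dip)
    .map(([coin, data]) => ({
      coin,
      change: Number(data.usd_24h_change.toFixed(1)), // round for the message
    }));
}
```

Adding more coins (solana, bnb) needs no code change here; only the CoinGecko request has to list the extra ids.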
by Sergey Skorobogatov
# GiggleGPTBot — Witty Telegram Bot with AI & Postgres

## 📝 Overview
GiggleGPTBot is a witty Telegram bot built with n8n, OpenRouter, and Postgres. It delivers short jokes, motivational one-liners, and playful roasts, responds to mentions, and posts scheduled witty content. The workflow also tracks user activity and provides lightweight statistics and leaderboards.

## ✨ Features
- 🤖 **AI-powered humor engine** — replies with jokes, motivation, random witty lines, or sarcastic roasts.
- 💬 **Command support** — /joke, /inspire, /random, /roast, /help, /stats, /top.
- 🎯 **Mention detection** — replies when users tag @GiggleGPTBot.
- ⏰ **Scheduled posts** — morning jokes, daily motivation, and random wisdom at configured times.
- 📊 **User analytics** — counts messages, commands, and reactions, and generates leaderboards.
- 🗄️ **Postgres persistence** — robust schema with tables for messages, responses, stats, and schedules.

## 🛠️ How It Works

**Triggers**
- Telegram Trigger — receives all messages and commands from a chat.
- Schedule Trigger — runs hourly to check for planned posts.

**Processing**
- A Switch routes commands (/joke, /inspire, /random, /roast, /help, /stats, /top).
- Chat history fetches the latest context.
- Mention Analysis determines whether the bot was mentioned.
- Generating an information response builds replies for /help, /stats, /top.
- AI nodes (AI response to command, AI response to mention, AI post generation) craft witty content via OpenRouter.

**Persistence**
- Init Database ensures tables exist (user_messages, bot_responses, bot_commands, message_reactions, scheduled_posts, user_stats).
- Logging nodes update stats and store every bot/user interaction.

**Delivery**
- Replies are sent back via Telegram Send nodes (Send AI response, Send info reply, Reply to Mention, Submit scheduled post).

## ⚙️ Setup Instructions
1. Create a Telegram bot with @BotFather and get your API token.
2. Add credentials in n8n:
   - Telegram API (your bot token)
   - OpenRouter (API key from openrouter.ai)
   - Postgres (use your own DB; Supabase works well)
3. Run the Init Database node once to create all required tables.
4. (Optional) Seed the schedule with the Adding a schedule node — it inserts:
   - Morning joke at 06:00
   - Daily motivation at 09:00
   - Random wisdom at 17:00
   (Adjust chat_id to your group/channel ID.)
5. Activate the workflow and connect Telegram via Webhook or Polling.

## 📊 Database Schema
- **user_messages** — stores user chat messages.
- **bot_responses** — saves bot replies.
- **bot_commands** — logs command usage.
- **message_reactions** — tracks reactions.
- **scheduled_posts** — holds scheduled jokes/wisdom/motivation.
- **user_stats** — aggregates per-user message/command counts and activity.

## 🔑 Example Commands
- /joke → witty one-liner with light irony.
- /inspire → short motivational phrase.
- /random → unexpected witty remark.
- /roast → sarcastic roast (no offensive targeting).
- /stats → shows your personal stats.
- /top → displays the leaderboard.
- /help → lists available commands.
- @GiggleGPTBot + message → the bot replies in context.

## 🚀 Customization Ideas
- Add new command categories (/quote, /fact, /news).
- Expand analytics with reaction counts or streaks.
- Localize prompts into multiple languages.
- Adjust CRON schedules for posts.

## ✅ Requirements
- Telegram Bot token
- OpenRouter API key
- Postgres database

📦 Import this workflow, configure credentials, run the DB initializer — and your witty AI-powered Telegram companion is ready!
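The Switch node's command routing can be sketched in a few lines. One detail worth knowing: in group chats Telegram appends the bot's username to commands ("/joke@GiggleGPTBot"), so the suffix must be stripped before matching:

```javascript
// Minimal sketch of the command routing performed by the Switch node.
const COMMANDS = ['/joke', '/inspire', '/random', '/roast', '/help', '/stats', '/top'];

function routeCommand(text) {
  // First token of the message, with any "@BotName" suffix removed.
  const first = (text || '').trim().split(/\s+/)[0].split('@')[0];
  return COMMANDS.includes(first) ? first : null; // null => not a command
}
```

A `null` result is where the mention-analysis branch takes over instead of the command branch.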
by David Olusola
# WordPress Daily News Digest Generator

## Overview
This automation fetches trending tech news every morning, uses AI to create engaging blog posts from each article, and publishes them directly to your WordPress site.

## What it does
1. Fetches the top 10 US technology news stories every day at 8 AM via NewsAPI
2. Splits articles into individual items for processing
3. Processes each article through a loop system
4. AI creates expanded, engaging blog posts (600–800 words) from each news article
5. Parses the AI response to extract clean titles and content
6. Publishes individual blog posts to WordPress automatically

## Setup Required

**NewsAPI Configuration**
- Get a free API key from newsapi.org (1,000 requests/day free)
- Replace YOUR_API_KEY in the HTTP Request URL with your actual key
- Customize the country/category parameters in the URL if needed

**WordPress Connection**
- Configure WordPress credentials in the "Publish to WordPress" node
- Enter your WordPress site URL, username, and password/app password

**AI Configuration**
- Set up Google Gemini API credentials
- Connect the Gemini model to the "AI News Summarizer" node

## Customization Options
- Publishing Schedule: modify the schedule trigger (default: daily 8 AM)
- News Sources: change country, category, or pageSize in the NewsAPI URL
- Content Style: adjust the AI system message for different writing tones
- Post Status: change from "publish" to "draft" for manual review

## Testing
- Run the workflow manually to test all connections
- Verify news articles are fetched correctly
- Check that blog posts appear properly on your WordPress site

## Features
- Automatic daily content creation
- AI-generated unique titles and expanded content
- Loop processing for multiple articles per day
- Duplicate content filtering (removes incomplete articles)
- SEO-optimized blog post formatting
- Automatic tagging and categorization

## Customization
- Change news categories (technology, business, science, etc.)
- Adjust posting frequency (hourly, twice daily, etc.)
- Modify AI prompts for different writing styles
- Add custom categories and tags
- Change article limits (currently 5 articles max)

## Need Help?
Reach out for n8n coaching or one-on-one consultation.
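The "parses the AI response to extract clean titles and content" step can be sketched as a Code-node function. It assumes the AI is prompted to return the post with a `TITLE:` marker on the first line followed by the body; adjust the marker to whatever your Gemini system message actually specifies:

```javascript
// Sketch of splitting a generated post into { title, content }.
// Assumes the model's output starts with "TITLE: ..." on line one --
// this convention is a prompt choice, not a Gemini guarantee.
function parseAiPost(raw) {
  const lines = raw.trim().split('\n');
  const title = lines[0].replace(/^title:\s*/i, '').trim();
  const content = lines.slice(1).join('\n').trim();
  return { title, content };
}
```

Posts with an empty parsed title or body are good candidates for the duplicate/incomplete-article filter mentioned under Features.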
by Felix
## How it works

I wanted to avoid the end-of-month rush to log expenses. I tried existing expense apps but found them either too expensive for what they offer, or frustrating with inconsistent extraction results. That is why I built my own Telegram expense bot that:

- Lets users send receipt photos or PDFs via Telegram
- Automatically extracts vendor, amount, date, and category using AI
- Applies expense rules like partial reimbursement rates (for example, 80% for phone bills)
- Organizes expenses into monthly Google Sheets tabs
- Asks for clarification when the category is unclear
- Supports flexible descriptions via Telegram caption
- Sends a confirmation message with expense details

The whole extraction process takes about 10 seconds and is fully GDPR compliant. No coding. No manual typing. Just snap and send.

## Step-by-step guide

### Initial Setup
1. Import the JSON workflow.
2. Sign up and log in to easybits at https://extractor.easybits.tech
3. Create a pipeline by uploading an example receipt and mapping the fields you want to extract:
   - vendor_name
   - total_amount
   - currency
   - transaction_date
   - category
   - extraction_confidence

For more details, visit our Quick Start Guide.

### Get Your easybits Credentials
1. Once you have finalized your pipeline, go back to your dashboard and click Pipelines in the left sidebar.
2. Click View Pipeline on the pipeline you want to connect.
3. On the Pipeline Details page, you will find:
   - **API URL:** https://extractor.easybits.tech/api/pipelines/[YOUR_PIPELINE_ID]
   - **API Key:** your unique authentication token
4. Copy both values and enter them into the "Extract with easybits" HTTP Request node.

> To keep in mind: each pipeline has its own API Key and Pipeline ID. If you have multiple pipelines (for example, one for receipts and one for invoices), you will need separate credentials for each.

> Important: to integrate your API Key, set it up in the following format: Bearer [YOUR_API_KEY]

### Set Up Telegram Bot
1. Open Telegram and search for @BotFather.
2. Send /newbot and follow the prompts.
3. Copy your Bot Token and add it to the Telegram credentials in n8n.

### Connect Google Sheets
1. Create a new spreadsheet for expenses.
2. Copy the Spreadsheet ID from the URL.
3. Update the Google Sheets nodes with your Spreadsheet ID.

### Go Live
Activate the workflow and send your first receipt photo to your Telegram bot.
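The expense-rule step (for example, reimbursing phone bills at 80%) can be sketched as a small Code-node function. The category names and rates below are examples only; the real rules live in the workflow and should reflect your own policy:

```javascript
// Sketch of a partial-reimbursement rule. Categories not listed in
// RATES are reimbursed in full.
const RATES = { phone: 0.8 }; // e.g. 80% reimbursable for phone bills

function applyReimbursement(expense) {
  const rate = RATES[expense.category] ?? 1;
  return {
    ...expense,
    // round to 2 decimals for the sheet
    reimbursable_amount: Math.round(expense.total_amount * rate * 100) / 100,
  };
}
```

Keeping the original `total_amount` alongside `reimbursable_amount` makes the monthly Google Sheets tabs auditable against the receipts.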
by EoCi - Mr.Eo
## How It Works

    Telegram Trigger → Sub-Workflow (Separate Text/Files) → IF (Is PDF?)
      ├── True: Extract PDF → Set Text → Code (Clean) → Done!
      └── False: NoOp (Ignore)

**9 nodes**: lightweight and efficient (no loops).

**Key steps**:
1. Triggers on Telegram messages and downloads files.
2. A sub-workflow splits text and files.
3. (Optional but recommended) An IF node checks the MIME type/extension for PDFs.
4. Text is extracted via the "Extract From File" node.
5. The AI writes a summary.
6. The bot responds to the user.

## Setup Instructions

**IMPORTANT**: this workflow requires an n8n instance configured with a public/production webhook. Recommendation: Ngrok or Cloudflare Tunnel.

1. **Add the Telegram credential.** Go to Credentials → New Credential → Telegram API and paste your Bot Token.
2. **Configure the Trigger.** Open the Telegram Trigger node, ensure Trigger On contains Message, and enable Download under Additional Fields so file payloads are attached to the node output.
3. **(Optional) Enable the sub-workflow** Automate Telegram Message Processing - Separate Text and Files 💬📁. If you need both message text and attachments, enable this node and import the referenced workflow into your workspace.
4. **Deploy the workflow and activate it.**

## Testing
Send a message with a PDF attachment to your Telegram bot. The bot should reply in the chat with a summary of the PDF content.

## Nodes Used
- **Telegram Trigger**: listens for incoming messages and downloads attachments.
- **Execute Workflow**: calls a sub-workflow to separate text and file data (recommended for reliable file handling).
- **If**: checks the MIME type to ensure the file is a PDF.
- **Extract From File**: converts the PDF binary into text.
- **Set**: assignments to organize the text data.
- **Code**: cleans the text (removes excess newlines) to prepare it for the LLM.
- **Basic LLM Chain**: orchestrates the AI prompt and processing.
- **AI Chat Model**: the LLM provider used for high-speed inference.
- **Telegram**: sends the final summary back to the user.
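The Code node's cleanup step (removing excess newlines before the text reaches the LLM) can be sketched as follows; the exact rules in the template may differ slightly, but the idea is the same:

```javascript
// Sketch of the Code node that tidies extracted PDF text for the LLM:
// normalize line endings, strip trailing spaces, and collapse long runs
// of blank lines into a single blank line.
function cleanPdfText(text) {
  return text
    .replace(/\r\n/g, '\n')      // normalize Windows line endings
    .replace(/[ \t]+\n/g, '\n')  // strip trailing whitespace per line
    .replace(/\n{3,}/g, '\n\n')  // collapse 3+ newlines to one blank line
    .trim();
}
```

Trimming the text this way reduces token usage and keeps the prompt focused on the document's actual content.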
## Output Example
The bot will reply with a message formatted like this:

> Title: Q3 Financial Report
> Type: Financial Report
> Exec Summary: Revenue increased by 15% due to new product lines...
> Key Insights:
> • Growth in APAC region.
> • Reduced operational costs by 5%.

## 🙏 Thank You for Trying This Workflow!
Your time and trust mean a lot! I truly appreciate you using this template. Your feedback shapes future updates:
- 💡 Suggestions for improvement
- 🆕 Ideas for new features
- 📝 Requests for other automation workflows

Please share your thoughts! Every idea helps shape the next update.

## 🙋‍♂️ Join & Follow For More Free Templates!
Discord Community: We Work Together
- Get help, share builds, collaborate!
- Daily tips, tutorials, and updates

Thank you again for being part of this journey! 🚀 Together, we automate better! 🤖✨
by jellyfish
This workflow automates the process of monitoring Twitter accounts for intelligence gathering. It fetches new tweets from specified accounts via RSS, uses a powerful AI model (Google Gemini) to analyze the content based on your custom prompts, and sends formatted alerts to a Telegram chat for high-priority findings.

## Key Features
- **Scheduled Execution:** runs automatically at your desired interval.
- **Dynamic Configuration:** manage which Twitter accounts to follow and which AI prompts to use directly from a Postgres database.
- **AI-Powered Analysis:** leverages Google Gemini to extract summaries and keywords and assign an importance level to each tweet.
- **Duplicate Prevention:** keeps track of the last processed tweet so you only get new updates.
- **Customizable Alerts:** sends well-structured, easy-to-read notifications to Telegram.

## Setup Required
1. **Postgres Database:** set up a table to store your configuration (see the Sticky Note in the workflow for the required schema).
2. **RSSHub:** you need access to an RSSHub instance to convert Twitter user timelines into RSS feeds.
3. **Credentials:** add your credentials for Postgres, Google AI (Gemini), and your Telegram Bot in n8n.
4. **Configuration:** update the placeholder values in the RSS and Telegram nodes (e.g., your RSSHub URL, your Telegram Chat ID).
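The duplicate-prevention step above amounts to keeping only RSS items newer than the last processed tweet stored in Postgres. A minimal sketch; the item field name (`isoDate`) follows common RSS-feed output and may differ for your RSSHub instance:

```javascript
// Sketch of duplicate prevention: drop any feed item at or before the
// timestamp of the last tweet already processed.
function filterNewItems(items, lastProcessedDate) {
  const cutoff = new Date(lastProcessedDate).getTime();
  return items.filter((it) => new Date(it.isoDate).getTime() > cutoff);
}
```

After sending alerts, the workflow would write the newest item's timestamp back to the Postgres configuration table so the next run picks up from there.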
by Ejaz
## How it works
1. **Run workflow on schedule** fires on a set interval to pull Reddit accounts from a Google Sheets spreadsheet (filtered to exclude shadowbanned accounts)
2. **Smart action calculator** randomly selects 3–8 accounts and decides whether each should **post** or **comment**, respecting cooldown timers (1–3 hr gap for posts, 30–120 min gap for comments) and only operating during active hours (midnight–noon)
3. **IP validation loop** routes each account through a proxy, verifies the IP against the account's creation IP using httpbin, and skips the account if there's a match (to avoid fingerprint overlap)
4. **Multilogin browser profile launch** opens an anti-detect browser session per account via the Multilogin API, then connects to the browser's DevTools WebSocket
5. **AI Agent (DeepSeek + Browser MCP)** autonomously navigates Reddit, reads subreddit rules, scans recent posts, and either creates a new text post or writes a context-aware comment — all with human-like scroll behavior and natural language
6. **Post-action processing** parses the AI's output to extract karma stats, permalinks, and success/failure status, then updates the Google Sheet with timestamps, karma totals, and links
7. **Profile cleanup** closes the Multilogin browser profile after each account finishes, then loops to the next account

## Setup steps
Allow about 20 minutes to configure all credentials and services.
1. Connect your Google Sheets service account and point it to your Reddit accounts spreadsheet (columns: multilogin_profile_id, proxy_provider, shadowban?, account_id, account_password, creation_ip, karma, posts_made_today, comments_made_today, time_of_post, time_of_comment, last_allocated_ip, posts_links, comments_links, row_number)
2. Set up your Multilogin API bearer token credential and update the folder ID in the "Open Multilogin Profile" node URL
3. Add your DeepSeek API credential for both AI Agent model nodes
4. Install the Browser MCP community node (n8n-nodes-browser-mcp) and ensure the MCP server is running at the configured baseUrl
5. Update the proxy URL in the "Get Proxy Exit IP" HTTP Request node with your actual proxy credentials
6. Adjust the Run workflow on schedule interval to your desired frequency
7. Review the sticky notes inside the workflow for detailed logic explanations