by Rapiwa
## Who is this for?
This workflow is for Shopify store owners, customer success, and marketing teams who want to automatically check customers' WhatsApp numbers and send personalized messages with discount codes for canceled orders. It helps recover lost sales by reaching out with special offers.

## What This Workflow Does
- Automatically checks for canceled orders on a schedule
- Fetches canceled orders from Shopify
- Creates personalized recovery messages based on customer data
- Verifies customers' WhatsApp numbers via Rapiwa
- Logs results in Google Sheets: "Verified & Sent" for successful messages, "Unverified & Not Sent" for unverified numbers

## Requirements
- Shopify store with API access enabled
- Shopify API credentials with access to orders and customer data
- Rapiwa account and a valid Bearer token
- Google account with Sheets access and OAuth2 credentials

## Setup plan
1. **Add your credentials**
   - Rapiwa: Create an HTTP Bearer credential in n8n and paste your token (example name: Rapiwa Bearer Auth).
   - Google Sheets: Add an OAuth2 credential (example name: Google Sheets).
2. **Set up Shopify**
   - Replace your_shopify_domain with your real Shopify domain.
   - Replace your_shop_access-token with your actual Shopify API token.
3. **Set up Google Sheets**
   - Update the example spreadsheet ID and sheet gid with your own.
   - Make sure your sheet's column headers match the mapping keys exactly: same spelling, same case, no extra spaces.
4. **Configure the Schedule Trigger**
   - Choose how often you want the workflow to check for canceled orders (daily, weekly, etc.).
5. **Check the HTTP Request nodes**
   - Verify endpoint: should call Rapiwa's verifyWhatsAppNumber.
   - Send endpoint: should use Rapiwa's send-message API with your template (includes customer name, reorder link, discount code).

## Google Sheet Column Structure
The Google Sheets nodes in the flow append rows with these columns. Make sure the sheet headers match exactly.
A Google Sheet formatted like this ➤ sample

| Name | Number | Item Name | Coupon | Item Link | Validity | Status |
| --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | verified | sent |
| Abdul Mannan | 8801322827790 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | unverified | not sent |

## Important Notes
- Do not hard-code API keys or tokens; always use n8n credentials.
- Google Sheets column header names must match the mapping keys used in the nodes. Trailing spaces are a common accidental problem; trim them in the spreadsheet or adjust the mapping.
- Update the message templates if you need to reference different data.
- The workflow processes canceled orders in batches to avoid rate limits. Adjust the batch size if needed.

## Useful Links
- **Install Rapiwa**: How to install Rapiwa
- **Dashboard**: https://app.rapiwa.com
- **Official Website**: https://rapiwa.com
- **Documentation**: https://docs.rapiwa.com
- **Shopify API Documentation**: https://shopify.dev/docs/admin-api

## Support & Help
- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
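To make the sheet mapping concrete, here is a small sketch of how the "create personalized recovery message" step might map one canceled Shopify order onto the columns above. The input field names (customer, line_items, reorder_link, discount_code) are illustrative assumptions, not the exact keys used by the workflow's nodes.

```python
# Hypothetical mapping from a canceled-order record to the sheet columns.
# Field names on the input dict are assumptions for illustration only.
def build_recovery_row(order):
    customer = order["customer"]
    item = order["line_items"][0]
    return {
        "Name": f'{customer["first_name"]} {customer["last_name"]}',
        "Number": customer["phone"].lstrip("+"),   # Rapiwa expects digits only
        "Item Name": item["title"],
        "Coupon": order.get("discount_code", "REORDER5"),
        "Item Link": order["reorder_link"],
        "Validity": "",   # filled in after the Rapiwa verify call
        "Status": "",     # "sent" / "not sent" after the send step
    }

row = build_recovery_row({
    "customer": {"first_name": "Abdul", "last_name": "Mannan", "phone": "+8801322827799"},
    "line_items": [{"title": "Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB"}],
    "reorder_link": "https://example.myshopify.com/cart/123:1",
})
```

The Validity and Status cells stay empty at this stage because the workflow only knows them after the verify and send calls complete.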
by TheStock
MLB "Hits" Workflow: Overview
• Pulls today's MLB schedule incl. probablePitcher + lineups (statsapi.mlb.com)
• Batches season stats for all involved players
• Builds pitcher vs. batter matchup rows

Setup Steps:
1) Google Sheets OAuth2 → point to your Google Sheet → tab name
2) Create Telegram bot credential + chatId
3) Optional: tweak ERA/OPS thresholds or Top N in nodes 6 or 8

Optional Setup: Conditional formatting in the sheet for a color scale on the best stat in each column, plus lefty vs. righty highlighting.

COMING SOON! Results Tracker tab plus Apps Script and formulas.
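The ERA/OPS threshold idea in setup step 3 can be sketched as a simple filter over the matchup rows. The threshold values and row field names below are assumptions for illustration, not the nodes' actual configuration.

```python
# Illustrative sketch of threshold filtering on pitcher-vs-batter rows:
# keep rows where the pitcher is hittable (ERA at or above a cutoff) and
# the batter is productive (OPS at or above a cutoff), then take the top N.
def top_matchups(rows, min_pitcher_era=4.5, min_batter_ops=0.750, top_n=5):
    keep = [r for r in rows
            if r["pitcher_era"] >= min_pitcher_era
            and r["batter_ops"] >= min_batter_ops]
    return sorted(keep, key=lambda r: r["batter_ops"], reverse=True)[:top_n]
```

Loosening min_pitcher_era or min_batter_ops widens the candidate pool; top_n controls how many rows land in the sheet.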
by NAZIA AI ACADEMY
## How it works
This workflow lets users generate AI images directly from Telegram messages using:
- **Google Gemini API** – to convert text into detailed, high-quality image prompts
- **Pollinations API** – to generate free AI images from the prompts
- **Telegram Bot** – to interact with users and return generated images instantly

It's fully automated using n8n: from text message to stunning image, all in one flow. Perfect for creators, content marketers, or anyone wanting quick visuals on the go.

## Set up steps
🧩 Estimated setup time: ~10–15 minutes
1. Create a Telegram bot via @BotFather, copy your token, and set up the Telegram Trigger node in n8n with your credentials.
2. Set up the Google Gemini API via Google AI Studio or Cloud Console. Make sure your API key is added in the credentials section of the Gemini node.
3. Customize the prompt structure or image size in the Fields - Set Values or Prompt Agent node.
4. (Optional) Enable Save to Disk if you want to keep a local copy of every image.
5. Deploy and run the workflow. Done 🎉

🛠️ All technical details and logic are fully documented inside the workflow using sticky notes.

## ⚠️ Requirements
- n8n (self-hosted or Cloud)
- Telegram Bot Token
- Google Gemini API key (with billing enabled; includes some free usage)
- No key needed for the Pollinations API. It's 100% free 🆓
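Pollinations serves images from a plain GET URL containing the URL-encoded prompt, which is why no API key is needed. A minimal sketch of building that request URL (the width/height query parameters are an assumption to illustrate sizing; check the Pollinations docs for the options your node uses):

```python
# Build a keyless Pollinations image URL from a text prompt.
from urllib.parse import quote

def pollinations_url(prompt, width=1024, height=1024):
    # Spaces, commas, etc. must be percent-encoded into the path segment.
    return (f"https://image.pollinations.ai/prompt/{quote(prompt)}"
            f"?width={width}&height={height}")

url = pollinations_url("a cozy cabin in a snowy forest, golden hour")
```

An HTTP Request node fetching this URL returns the generated image bytes directly, ready to forward to Telegram.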
by Yves Tkaczyk
Automated image processing for e-commerce product catalog

## Use cases
Monitor a Google Drive folder, process each image based on the prompt defined in Workflow Configuration, and save the new image to the specified output Google Drive folder. Maintain a processing log in Google Sheets.

👍 This use case can be extended to any scenario requiring batch image processing, for example, unifying the look and feel of team photos on a company website.

## How it works
- Trigger: watches a Google Drive folder for new or updated files.
- Downloads the image, processes it using Google Gemini (Nano Banana), and uploads the new image to the specified output folder.

## How to use
1. Google Drive and Google Sheets nodes: create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials. Update all Google Drive and Google Sheets nodes (6 nodes total) to use these credentials.
2. Gemini AI node: create Google Gemini (PaLM) API credentials. Read more about Google Gemini (PaLM) credentials. Update the Edit Image node to use the Gemini API credentials.
3. Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration (see right ➡️). Ensure the spreadsheet can be accessed as Editor by the account used for the Google credentials.
4. Create input and output directories in Google Drive. Ensure these directories are accessible by the account used for the credentials.
5. Update the File Created, File Updated, and Workflow Configuration nodes following the steps in the green notes (see right ➡️).

## Requirements
- Google account with Google API access
- Google AI Studio account with the ability to create a Google Gemini API key
- Basic n8n knowledge: understanding of triggers, expressions, and credential management

## Who's it for
Anyone wanting to batch process images for a product catalog. Other use cases are applicable. Please reach out if you need help customizing this workflow.

## 🔒 Security
All credentials are stored securely using n8n's credential system.
The only potentially sensitive information stored in the workflow is the Google Drive folder and Sheet IDs. These should be secured according to your organization's needs.

Need help? Reach out on LinkedIn or ask in the Forum!
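The processing log mentioned in the use case can be as simple as one appended row per file. A sketch of a plausible row shape (the column names here are assumptions; match them to whatever headers you set up in the Google Sheets Configuration step):

```python
# Hypothetical shape of one processing-log row appended to Google Sheets.
from datetime import datetime, timezone

def log_row(file_id, file_name, status, output_id=""):
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source_file_id": file_id,
        "file_name": file_name,
        "status": status,            # e.g. "processed" or "failed"
        "output_file_id": output_id, # empty when processing failed
    }
```

Logging both the source and output file IDs makes it easy to audit which images in the output folder came from which originals.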
by Ninja - Abbas
## Overview
This workflow automates quiz delivery from a Google Sheet directly into a Telegram group. It ensures that quizzes are posted one by one as polls, and once a quiz is sent, its status in the sheet is automatically updated to prevent duplication. If no pending quiz is available, the workflow notifies a separate Telegram channel to refill the sheet.

## Target Audience
This template is designed for teachers, community managers, and Telegram group admins who want an easy way to run quizzes, trivia games, or knowledge checks without manual posting.

## Problem It Solves
Manually sending quizzes is repetitive and prone to mistakes (like re-posting the same question). This template ensures quizzes are sent in order, tracked, and managed automatically with minimal human effort.

## Requirements
- An active Google account with Google Sheets enabled
- A Telegram bot token (via BotFather)
- Chat IDs for both the quiz group and the notification group

## Google Sheet Structure
Create a sheet with the following columns:

quiz_number | question | option_a | option_b | option_c | option_d | status

**status**: use 🟨 for pending quizzes, ✅ once completed

## Setup Instructions
1. Copy the workflow into your n8n instance.
2. Add your Google Sheets credentials.
3. Replace the YOUR_SHEET_ID placeholder with your sheet ID.
4. Set environment variables:
   - TELEGRAM_BOT_TOKEN
   - TELEGRAM_CHAT_ID (for the group)
   - TELEGRAM_NOTIFY_CHAT_ID (for refill notifications)
5. Run the workflow.

## Customization Options
- Adjust the sheet name if not using "Sheet1"
- Change the emoji markers (🟨 / ✅) to your preferred system
- Modify the Telegram notification message
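The core "pick the next pending quiz, then mark it done" logic can be sketched in a few lines, assuming rows shaped like the sheet columns above. In the actual workflow the status change is a Google Sheets update node, not an in-memory mutation.

```python
# Minimal sketch of the pending-quiz selection and status update.
PENDING, DONE = "🟨", "✅"

def next_pending(rows):
    """Return the first quiz row still marked pending, or None if the sheet needs a refill."""
    return next((r for r in rows if r["status"] == PENDING), None)

rows = [
    {"quiz_number": 1, "question": "2+2?", "status": DONE},
    {"quiz_number": 2, "question": "Capital of France?", "status": PENDING},
]
quiz = next_pending(rows)
if quiz:
    quiz["status"] = DONE   # in n8n: update the row's status cell after sending the poll
```

When next_pending returns None, the workflow takes the refill-notification branch instead of posting a poll.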
by Trung Tran
Beginner's Tutorial: Manage Azure Storage Account Container & Blob with n8n

> This beginner-friendly n8n workflow shows you how to generate AI images using OpenAI, store them in Azure Blob Storage, and manage blob containers, all with zero code.

## 👤 Who's it for
This workflow is perfect for:
- **Beginners** learning Azure + OpenAI integration
- **No-code developers** experimenting with image generation
- **Cloud learners** who want hands-on Blob Storage use cases
- Anyone who wants to automate storing AI-generated content in the cloud

## ⚙️ How it works / What it does
1. 🖱️ Trigger the workflow manually using the Execute Workflow node.
2. ✏️ Use the Edit Fields node to input:
   - containerName (e.g., demo-images)
   - imageIdea (e.g., "a robot holding a coffee cup")
3. 📦 Create a new Azure Blob container (Create container).
4. 🤖 Use an OpenAI-powered Prompt Generation Agent to craft the perfect image prompt.
5. 🎨 Generate an image using OpenAI's DALL·E model.
6. ☁️ Upload the generated image to Azure Blob Storage (Create Blob).
7. 📂 List blobs in the container (Get many blobs).
8. 🧹 Delete any blob as needed (Delete Blob).
9. (Optional) 🗑️ Remove the entire container (Delete container).

## 🔧 How to set up
### 🧠 Set up OpenAI
1. Create an OpenAI account and get your API key.
2. In n8n, go to Credentials → OpenAI and paste your key.

### 🪣 Set up Azure Blob Storage
1. Log in to your Azure Portal.
2. Create a Storage Account (e.g., mystorageaccount).
3. Go to the Access Keys tab and copy:
   - Storage Account Name
   - Key1
4. In n8n, create a new Azure Blob Storage credential using:
   - Account Name = your storage account name
   - Access Key = key1 value

> 📝 This demo uses Access Key authentication. You can also configure Shared Access Signatures (SAS) or OAuth in production setups.

### Run the Workflow
1. Enter your image idea and container name.
2. Click "Execute Workflow" to test it.
## 📋 Requirements

| Requirement | Description |
| --- | --- |
| Azure Storage Account | With container-level read/write access |
| OpenAI API Key | For image and prompt generation |
| n8n Version | v1.0+ recommended |
| Image Credits | OpenAI charges tokens for DALL·E image creation |

## 🛠️ How to customize the workflow
### 🧠 Adjust Prompt Generation
Update the Prompt Agent to include:
- A specific style (3D, anime, cyberpunk)
- Brand elements
- Multiple language options

### 📁 Organize by Date/User
Modify the containerName to auto-include:
- Date (e.g., images-2025-08-20)
- Username or session ID

### 📤 Send Image Output
- Add Slack, Telegram, or Email nodes to deliver the image
- Create public links using Azure's blob permissions

### 🔁 Cleanup Logic
- Auto-delete blobs after X days
- Add versioning or backup logic
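The "Organize by Date/User" customization can be sketched as a tiny naming helper. Azure container names must be lowercase, 3-63 characters, and limited to letters, digits, and hyphens, which the ISO date format below satisfies; the "images" prefix is just an example.

```python
# Derive a date-stamped container name like the images-2025-08-20 example above.
from datetime import date

def dated_container(prefix="images", today=None):
    d = today or date.today()
    return f"{prefix}-{d.isoformat()}"
```

In n8n you would compute the equivalent value with an expression in the Edit Fields node and feed it to the Create container node.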
by Aliz
This workflow automates the daily backfill of Google Analytics 4 (GA4) data into BigQuery. It fetches 13 essential pre-processed reports (including User Acquisition, Traffic, and E-commerce), uploads them to automatically created tables in BigQuery, and then sends an alert in Telegram.

## How it works
- **Configuration:** You define your Project ID, Dataset, and Date Range in a central "Config" node.
- **Parallel Fetching:** The workflow runs 13 parallel API calls to GA4 to retrieve key reports (e.g., ga4_traffic_sources, ga4_ecommerce_items).
- **Dynamic Tables:** It automatically checks whether the target BigQuery table exists and creates it with the correct schema if it's missing.
- **Telegram Alerts:** After execution, it sends a summary message to Telegram indicating success or failure for the day's run.

## Set up steps
1. **Google Credentials (OAuth):** This workflow uses n8n's built-in "Google OAuth2 API" credential. You do not need a Service Account key. Connect your Google account and ensure you grant scopes for the Google Analytics API and BigQuery API.
2. **Config Node:** Open the "Backfill Config" node and fill in:
   - GA4 Property ID
   - Google Cloud Project ID
   - BigQuery Dataset ID
3. **Telegram Setup (Optional):** If you want alerts, configure the Telegram node with your Bot Token and Chat ID. If not, you can disable or remove this node.
4. **Schedule:** By default, this is set to run daily. It is recommended to use a date expression (e.g., Today - 2 Days) to allow GA4 time to process data fully before fetching.
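The "Today - 2 Days" recommendation can be sketched as a small date helper; GA4 report data for a given day is typically not finalized until roughly 48 hours later, so fetching with a two-day lag avoids partial numbers. The two-day default here mirrors the suggestion above, not a GA4 guarantee.

```python
# Compute the backfill date with a processing lag, in the YYYY-MM-DD
# format GA4 Data API date ranges use.
from datetime import date, timedelta

def backfill_date(today=None, lag_days=2):
    d = today or date.today()
    return (d - timedelta(days=lag_days)).isoformat()
```

In the workflow itself this would be an n8n date expression in the Config node rather than a code node, but the arithmetic is the same.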
by Dahiana
Generate hyper-realistic images from Telegram messages with Nano Banana 2

Send any image description to your Telegram bot and receive a hyper-realistic AI-generated photo back in seconds.

## How it works
1. A user sends a natural-language image request to the Telegram bot.
2. The bot confirms receipt.
3. Gemini Pro 3 expands the request into a detailed JSON prompt: focal length, aperture, ISO, lighting behavior, material physics, etc.
4. Gemini Flash generates the image and returns it as a base64 string.
5. The image is sent back as a photo in the same Telegram chat.

## Setup steps
1. **Telegram**: Create a bot via @BotFather, copy the token, and add it as a Telegram credential in n8n. Connect it to the Telegram Trigger node and both Telegram send nodes.
2. **OpenRouter**: Add your OpenRouter API key (or any other tool you use) as a credential in n8n. Connect it to the Expand to JSON Prompt and Generate Image nodes.
3. Activate the workflow and send your bot an image description to test.

## Optional
Modify the config node with your system prompt preferences for a more diverse pool of results.
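To make step 3 concrete, here is one plausible shape for the expanded JSON prompt. The exact keys are whatever your system prompt instructs the model to emit; everything below is an illustrative assumption.

```python
# Hypothetical expanded prompt produced by the "Expand to JSON Prompt" step.
import json

prompt = {
    "subject": "portrait of a street musician at dusk",
    "camera": {"focal_length_mm": 85, "aperture": "f/1.8", "iso": 400},
    "lighting": "warm tungsten key light, soft ambient fill",
    "materials": "worn brass saxophone, matte wool coat",
}
payload = json.dumps(prompt)   # serialized and passed to the image model
```

Keeping camera and lighting parameters as structured fields (rather than free text) is what pushes the image model toward photographic realism.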
by Sankar Battula
Send a product name to Telegram, compare offers across Amazon, Walmart, and Google Shopping, and get the best sensible buying recommendation with a confidence level, price range, and a direct purchase link.

## What this template does
This template transforms Telegram into a smart product price comparison tool. Once a user sends a product name, the workflow simultaneously searches Amazon, Walmart, and Google Shopping, standardizes all results into a unified format, and delivers the most practical buying recommendation. Rather than blindly picking the lowest price, an AI step filters out poor matches, suspiciously cheap listings, and mismatched product variants to ensure the recommendation is trustworthy.

## How it works
1. A user sends a product name to the Telegram bot.
2. The workflow queries Walmart, Amazon, and Google Shopping in parallel.
3. Each marketplace response is cleaned and mapped into a consistent offer structure.
4. All offers are merged and consolidated into a single dataset.
5. An AI agent evaluates price, seller, reviews, and match quality to identify the best sensible offer.
6. A Telegram message is sent back with the recommended price, seller, price range, offer count, confidence level, reasoning, and a direct purchase link.

## Requirements
Before using this template, connect the following credentials:
- Telegram API
- OpenAI
- SerpApi

You'll also need to add your own SerpApi key, review the marketplace query settings, and adjust or remove any fixed search location values as needed.

## Example input
DJI Osmo Pocket 3 Creator Combo

## Example output
🔎 DJI Osmo Pocket 3 (Creator Combo / Osmo Pocket 3 family)
✅ Best Price: $557.91
🏪 Seller: B&Z Legend
📊 Range: $6.85 - $1349.99
🧩 Comparable Offers: 69
🎯 Confidence: Medium
📝 Why this was chosen: Chose the lowest sensible price among New/unspecified offers: B&Z Legend at $557.91. Much cheaper listings (< $200) appear anomalous or risky (very low prices, unclear sellers).
There are lower open-box offers (e.g., $498.75), but they carry added risk; established retailers (Best Buy, the official DJI store, OE USA) list in the $569–$999 range if you prefer retailer security.
🔗 Link: https://www.walmart.com/ip/DJI-Osmo-Pocket-3-Creator-Combo-4K-120fps-Camera-1-CMOS-3-Axis-Stabilization-Face-Object-Tracking-Mic-Clear-Sound-Vlogging-Photography/10583721243?classType=VARIANT

If no strong match is found, the workflow prompts the user to try a more specific product name.

## Good to know
Results may vary depending on marketplace availability, search query quality, and regional settings. Using specific product names typically yields more accurate recommendations than broad or generic searches.
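The "filter suspiciously cheap listings" idea can be sketched as a median-based sanity check before picking the minimum. The 40% floor ratio below is an illustrative assumption; in the template this judgment is made by the AI agent, which also weighs seller reputation and condition, not just price.

```python
# Drop offers far below the median price before choosing the cheapest one.
from statistics import median

def best_sensible_price(prices, floor_ratio=0.4):
    m = median(prices)
    sensible = [p for p in prices if p >= m * floor_ratio]
    return min(sensible)
```

With the offer range from the example output ($6.85 to $1349.99), the $6.85 outlier falls well below 40% of the median and is discarded rather than recommended.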
by Patrick Jennings
Sleeper NFL Team Chatbot Starter

A Telegram chatbot built to look up your fantasy football team in the Sleeper app and return your roster details, player names, positions, and team info. This starter workflow is perfect for users who want a simple, conversational way to view their Sleeper team in-season or pre-draft.

## What It Does
When a user types their Sleeper username into Telegram, this workflow:
1. Extracts the username from Telegram
2. Pulls their Sleeper User ID
3. Retrieves their Leagues and selects the first one (by default)
4. Pulls the full league Rosters
5. Finds the matching roster owned by that user
6. Uses player_ids to look up full player info from a connected database (e.g. Airtable or Google Sheets)
7. Returns a clean list of player names, positions, and teams via Telegram

## Requirements
To get this running, you'll need:
- A Telegram bot (set up through BotFather)
- A Sleeper Fantasy Football account
- A synced player database that matches player_id to full player details (we recommend using the companion template: Sleeper NFL Players Daily Sync)

## Setup Instructions
1. Import the workflow into your n8n instance
2. Add the required credentials:
   - Telegram (API key from BotFather)
   - Airtable (or replace with another database method like Google Sheets or an HTTP request to a hosted JSON file)
3. Trigger the workflow by sending your exact Sleeper username to the bot
4. Your full team roster will return as a formatted message

> If the user is in multiple Sleeper leagues, the current logic returns the first league found.

## Example Output
You have 19 players on your roster: Cam Akers (RB - NO), Jared Goff (QB - DET), ...

## Customization Notes
- Replace the Telegram Trigger with any other input method (webhook, form input, etc.)
- Replace the Airtable node with Google Sheets, a SQL DB, or even a local file if preferred
- You can hardcode a Sleeper username if you're using this for a single user

## Related Templates
Sleeper NFL Players Daily Sync (syncs player_id to player name, position, team). Create the Player Sync first, then either integrate it into this template or create a subworkflow from it and use the most recent data set.

## Difficulty Rating & Comment (from the author)
3 out of 10 if this ain't your first rodeo, respectfully. Just a little more work on adding the Players Sync as your data table and knowing how to GET from Sleeper. If you use Sleeper for fantasy football, let's go win some games!
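The three GET calls in "What It Does" chain together like this. The endpoints are Sleeper's public read-only API (no key required); the season value and the sample roster data below are illustrative.

```python
# Sketch of the Sleeper lookup chain: user -> leagues -> rosters,
# then match the roster owned by the looked-up user.
BASE = "https://api.sleeper.app/v1"

def sleeper_urls(username, season="2024"):
    # Placeholders {user_id} / {league_id} are filled from the previous response.
    return {
        "user": f"{BASE}/user/{username}",
        "leagues": f"{BASE}/user/{{user_id}}/leagues/nfl/{season}",
        "rosters": f"{BASE}/league/{{league_id}}/rosters",
    }

def my_roster(rosters, user_id):
    """Find the roster whose owner_id matches the looked-up user, or None."""
    return next((r for r in rosters if r["owner_id"] == user_id), None)

rosters = [{"owner_id": "a1", "players": ["4034"]},
           {"owner_id": "b2", "players": ["4046", "6794"]}]
roster = my_roster(rosters, "b2")
```

The player IDs in the matched roster are then resolved to names and positions through your synced player database.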
by ARofiqi Maulana
## 🚀 Overview
This workflow automatically generates IELTS practice content using AI and sends it to Telegram on a schedule. It covers three key IELTS skills:
- 🧠 Grammar Practice (Monday)
- ✍️ Writing Task (Wednesday)
- 📘 Reading Practice (Friday)

The workflow is designed to deliver structured, clean, and readable exercises directly to your Telegram chat.

## ⚙️ How it works
1. The Schedule Trigger runs the workflow on specific days
2. The "Select Test by Day" node determines which practice to generate
3. AI generates structured IELTS content in JSON format
4. The output is parsed and formatted into readable messages
5. Long messages are split into chunks
6. The content is sent to Telegram

## 🧩 Features
- AI-powered IELTS content generation
- Clean JSON parsing and formatting
- Telegram-ready message delivery
- Automatic scheduling by day
- Supports multiple IELTS skills in one workflow

## 🛠 Setup
1. Create a Telegram bot via @BotFather
2. Copy your Bot Token
3. Add Telegram credentials in n8n
4. Add your AI credentials (Gemini or OpenAI)
5. Replace the Chat ID in the "Send" nodes with your own

## 📩 Telegram Setup
To get your Chat ID:
1. Send a message to your bot
2. Open: https://api.telegram.org/bot<TOKEN>/getUpdates
3. Copy the chat.id

## ⚠️ Important Notes
- This workflow uses a fixed Chat ID for scheduled messages
- Make sure to replace the Chat ID before running
- Do not hardcode sensitive credentials
- Ensure the AI output format remains consistent (JSON)

## 💡 Customization
You can easily customize:
- Topics and difficulty level
- Message format
- Schedule timing
- Adding new modules (e.g., Speaking, Vocabulary)

## 🎯 Use Cases
- IELTS learners who want daily practice
- Teachers sharing exercises with students
- Automated English learning systems
- AI-powered educational bots

## 🚀 Creator Note
This workflow demonstrates how to combine AI generation, structured parsing, and messaging automation into a scalable learning system. Feel free to adapt and extend it for your own use case.
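Step 5 ("long messages are split into chunks") exists because Telegram caps a single message at 4096 characters. A minimal sketch of newline-aware chunking, which avoids breaking an exercise mid-line (the workflow's own code node may split differently):

```python
# Split long text into Telegram-sized chunks, preferring newline boundaries.
TELEGRAM_LIMIT = 4096

def split_message(text, limit=TELEGRAM_LIMIT):
    chunks, current = [], ""
    for line in text.split("\n"):
        if len(current) + len(line) + 1 > limit and current:
            chunks.append(current)
            current = line
        else:
            current = f"{current}\n{line}" if current else line
    if current:
        chunks.append(current)
    return chunks
```

Each chunk is then sent as a separate Telegram message in order, so a full Reading passage arrives intact.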
by Panth1823
Stop manually checking dozens of career pages. This workflow runs every morning, hits the public APIs of 8+ ATS platforms and job boards, normalizes every listing into a single clean schema, and syncs everything to Supabase and Google Sheets, deduplicated and ready to query.

## Who it's for
Job seekers, recruiters, or career platforms that want a consolidated, up-to-date feed of openings from specific companies without scraping, without API keys, and without paying for a jobs aggregator.

## How it works
1. A Schedule Trigger fires daily at 8 AM IST
2. A Company List code node defines all sources grouped by ATS type (Greenhouse, Lever, Ashby, Workable, SmartRecruiters, RemoteOK, and board APIs like Remotive, Himalayas, Arbeitnow, Jobicy)
3. A Prepare Request node builds the correct API URL and headers for each source, including multi-page pagination for SmartRecruiters (up to 500 jobs via offset) and Himalayas (up to 500 via page param)
4. An HTTP Request node fetches all sources in batches of 5
5. A Parse + Enrich + Filter node normalizes all divergent JSON structures into a unified schema, resolving ISO country codes, Indian city detection, salary parsing across all formats, and domain-based filtering
6. Deduplicated results are upserted to a Supabase (Postgres) table and written to Google Sheets

## ATS platforms supported
Greenhouse, Lever, Ashby, Workable, SmartRecruiters, RemoteOK, Remotive, Himalayas, Arbeitnow, Jobicy

## Normalized output schema
job_id, title, company, location, country, salary, job_type, apply_url, posted_at, source_ats

## Setup
1. Open the Company List node and edit the sources array: add or remove companies and their ATS slugs
2. Update ALLOWED_DOMAINS in the Parse node to filter by location or job type relevant to you
3. Add your Supabase credentials in the Postgres node and confirm your table name and schema match the output fields
4. Connect your Google Sheets credentials and set the target spreadsheet and sheet ID
5. (Optional) Adjust pagination limits per source in the Prepare Request node

## Requirements
- Self-hosted or cloud n8n instance
- Supabase project with a jobs table
- Google Sheets with headers matching the normalized schema
- No external API keys required; all sources use public endpoints
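The normalization step is the heart of the workflow: each ATS returns a different JSON shape, and the Parse node maps them all onto the unified schema listed above. A sketch for two of the sources (the raw shapes below are simplified subsets of the Greenhouse job-board and Lever postings payloads, not their full structures, and only a few schema fields are shown):

```python
# Map divergent ATS payloads onto a shared subset of the unified schema.
def normalize(raw, source):
    if source == "greenhouse":
        return {"job_id": str(raw["id"]), "title": raw["title"],
                "location": raw["location"]["name"],
                "apply_url": raw["absolute_url"], "source_ats": "greenhouse"}
    if source == "lever":
        return {"job_id": raw["id"], "title": raw["text"],
                "location": raw["categories"]["location"],
                "apply_url": raw["hostedUrl"], "source_ats": "lever"}
    raise ValueError(f"unknown source: {source}")
```

Because every source collapses to the same keys, the downstream dedupe, Supabase upsert, and Sheets append steps never need source-specific logic.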