by Felix
## How it works

I wanted to avoid the end-of-month rush to log expenses. I tried existing expense apps but found them either too expensive for what they offer or frustrating because of inconsistent extraction results. So I built my own Telegram expense bot that:

- Lets users send receipt photos or PDFs via Telegram
- Automatically extracts vendor, amount, date, and category using AI
- Applies expense rules such as partial reimbursement rates (for example, 80% for phone bills)
- Organizes expenses into monthly Google Sheets tabs
- Asks for clarification when the category is unclear
- Supports flexible descriptions via the Telegram caption
- Sends a confirmation message with the expense details

The whole extraction process takes about 10 seconds and is fully GDPR compliant. No coding. No manual typing. Just snap and send.

## Step-by-step guide

### Initial setup

1. Import the JSON workflow.
2. Sign up and log in to easybits at https://extractor.easybits.tech
3. Create a pipeline by uploading an example receipt and mapping the fields you want to extract:
   - vendor_name
   - total_amount
   - currency
   - transaction_date
   - category
   - extraction_confidence

For more details, visit our Quick Start Guide.

### Get your easybits credentials

1. Once you have finalized your pipeline, go back to your dashboard and click **Pipelines** in the left sidebar.
2. Click **View Pipeline** on the pipeline you want to connect.
3. On the Pipeline Details page, you will find:
   - **API URL:** https://extractor.easybits.tech/api/pipelines/[YOUR_PIPELINE_ID]
   - **API Key:** your unique authentication token
4. Copy both values and add them to the "Extract with easybits" HTTP Request node.

Keep in mind: each pipeline has its own API Key and Pipeline ID. If you have multiple pipelines (for example, one for receipts and one for invoices), you will need separate credentials for each.
> **Important:** When entering your API Key, use the following format: `Bearer [YOUR_API_KEY]`

### Set up the Telegram bot

1. Open Telegram and search for @BotFather.
2. Send /newbot and follow the prompts.
3. Copy your Bot Token and add it to the Telegram credentials in n8n.

### Connect Google Sheets

1. Create a new spreadsheet for expenses.
2. Copy the Spreadsheet ID from the URL.
3. Update the Google Sheets nodes with your Spreadsheet ID.

### Go live

Activate the workflow and send your first receipt photo to your Telegram bot.
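To make the credential setup concrete, here is a minimal JavaScript sketch of calling the easybits pipeline endpoint and applying a partial-reimbursement rule. Only the `Bearer [YOUR_API_KEY]` header format comes from this guide; the request/response shape and the rate table are illustrative assumptions, not official easybits documentation.

```javascript
// Hypothetical sketch: call the easybits extraction pipeline, then apply
// a partial-reimbursement rule to the extracted amount.
// PIPELINE_ID / API_KEY are placeholders; the POST body shape is an assumption.
const PIPELINE_ID = "YOUR_PIPELINE_ID";
const API_KEY = "YOUR_API_KEY";

// Example reimbursement rates per category (80% for phone bills, as above).
const RATES = { phone: 0.8, travel: 1.0, meals: 0.5 };

function applyReimbursement(amount, category) {
  const rate = RATES[category] ?? 1.0; // unknown categories: reimburse fully
  return Math.round(amount * rate * 100) / 100;
}

async function extractReceipt(fileBuffer) {
  // The API key must be sent as "Bearer [YOUR_API_KEY]"
  const res = await fetch(
    `https://extractor.easybits.tech/api/pipelines/${PIPELINE_ID}`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${API_KEY}` },
      body: fileBuffer,
    }
  );
  // Expected mapped fields: vendor_name, total_amount, currency,
  // transaction_date, category, extraction_confidence
  return res.json();
}

console.log(applyReimbursement(52.4, "phone")); // 41.92
```

In the workflow, the HTTP Request node does the equivalent of `extractReceipt`, and a Code node can apply `applyReimbursement` before the row is written to the monthly sheet.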
by Rapiwa
## Who is this for?

This workflow is for Shopify store owners, customer success teams, and marketing teams who want to automatically verify customers' WhatsApp numbers and send personalized messages with discount codes for canceled orders. It helps recover lost sales by reaching out with special offers.

## What this workflow does

- Automatically checks for canceled orders on a schedule
- Fetches canceled orders from Shopify
- Creates personalized recovery messages based on customer data
- Verifies customers' WhatsApp numbers via Rapiwa
- Logs results in Google Sheets: "Verified & Sent" for successful messages, "Unverified & Not Sent" for unverified numbers

## Requirements

- Shopify store with API access enabled
- Shopify API credentials with access to orders and customer data
- Rapiwa account and a valid Bearer token
- Google account with Sheets access and OAuth2 credentials

## Setup plan

1. **Add your credentials**
   - Rapiwa: create an HTTP Bearer credential in n8n and paste your token (example name: Rapiwa Bearer Auth).
   - Google Sheets: add an OAuth2 credential (example name: Google Sheets).
2. **Set up Shopify**
   - Replace `your_shopify_domain` with your real Shopify domain.
   - Replace `your_shop_access-token` with your actual Shopify API token.
3. **Set up Google Sheets**
   - Update the example spreadsheet ID and sheet gid with your own.
   - Make sure your sheet's column headers match the mapping keys exactly: same spelling, same case, and no extra spaces.
4. **Configure the Schedule Trigger**
   - Choose how often the workflow should check for canceled orders (daily, weekly, etc.).
5. **Check the HTTP Request nodes**
   - Verify endpoint: should call Rapiwa's verifyWhatsAppNumber.
   - Send endpoint: should use Rapiwa's send-message API with your template (customer name, reorder link, discount code).

## Google Sheet column structure

The Google Sheets nodes in the flow append rows with the columns below. Make sure the sheet headers match exactly.
A Google Sheet formatted like this ➤ sample

| Name | Number | Item Name | Coupon | Item Link | Validity | Status |
| --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | verified | sent |
| Abdul Mannan | 8801322827790 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | unverified | not sent |

## Important notes

- Do not hard-code API keys or tokens; always use n8n credentials.
- Google Sheets column header names must match the mapping keys used in the nodes.
- Trailing spaces are a common accidental problem; trim them in the spreadsheet or adjust the mapping.
- Update the message templates if you need to reference different data.
- The workflow processes canceled orders in batches to avoid rate limits. Adjust the batch size if needed.

## Useful links

- **Install Rapiwa:** How to install Rapiwa
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com
- **Shopify API Documentation:** https://shopify.dev/docs/admin-api

## Support & help

- **WhatsApp:** Chat on WhatsApp
- **Discord:** SpaGreen Community
- **Facebook Group:** SpaGreen Support
- **Website:** https://spagreen.net
- **Developer Portfolio:** Codecanyon SpaGreen
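The branching that feeds the "Validity" and "Status" columns can be sketched as a small routing function. It assumes Rapiwa reports verification as `data.exists === "true"` (a string, not a boolean); check the Rapiwa docs for the exact response shape of your account.

```javascript
// Decide which branch a customer takes based on the Rapiwa verify response.
// The "exists" flag is assumed to arrive as the string "true"/"false".
function routeByVerification(apiResponse) {
  const verified = apiResponse?.data?.exists === "true";
  return verified
    ? { send: true, validity: "verified", status: "sent" }
    : { send: false, validity: "unverified", status: "not sent" };
}
```

In the workflow, an If node implements the same check; the returned fields correspond to the sheet columns shown above.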
by Yves Tkaczyk
# Automated image processing for e-commerce product catalogs

## Use cases

Monitor a Google Drive folder, process each image based on the prompt defined in Workflow Configuration, and save the new image to the specified output Google Drive folder. Maintain a processing log in Google Sheets.

👍 This use case can be extended to any scenario requiring batch image processing, for example unifying the look and feel of team photos on a company website.

## How it works

- **Trigger:** watches a Google Drive folder for new or updated files.
- Downloads the image, processes it using Google Gemini (Nano Banana), and uploads the new image to the specified output folder.

## How to use

1. **Google Drive and Google Sheets nodes:**
   - Create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials.
   - Update all Google Drive and Google Sheets nodes (6 nodes total) to use these credentials.
2. **Gemini AI node:**
   - Create Google Gemini (PaLM) API credentials. Read more about Google Gemini (PaLM) credentials.
   - Update the Edit Image node to use the Gemini API credentials.
3. Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration (see right ➡️). Ensure the spreadsheet can be accessed as Editor by the account used for the Google credentials.
4. Create input and output directories in Google Drive. Ensure these directories are accessible by the account used for the credentials.
5. Update the File Created, File Updated, and Workflow Configuration nodes following the steps in the green notes (see right ➡️).

## Requirements

- Google account with Google API access
- Google AI Studio account with the ability to create a Google Gemini API key
- Basic n8n knowledge: understanding of triggers, expressions, and credential management

## Who's it for

Anyone wanting to batch-process images for a product catalog; other use cases apply as well. Please reach out if you need help customizing this workflow.

## 🔒 Security

All credentials are stored securely using n8n's credential system.
The only potentially sensitive information stored in the workflow is the Google Drive folder and Sheet IDs. These should be secured according to your organization's needs.

## Need help?

Reach out on LinkedIn or ask in the Forum!
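For orientation, here is a hedged sketch of the request body the Edit Image step would send to Gemini. The model name (`gemini-2.5-flash-image`, i.e. Nano Banana) and the endpoint are assumptions based on Google's public generateContent REST API; in n8n the Gemini node builds this request for you.

```javascript
// Build a generateContent request that pairs the configured prompt with the
// downloaded image (base64-encoded). Field names follow the Gemini REST API.
function buildEditRequest(prompt, base64Image, mimeType = "image/png") {
  return {
    contents: [
      {
        parts: [
          { text: prompt }, // prompt from the Workflow Configuration node
          { inline_data: { mime_type: mimeType, data: base64Image } },
        ],
      },
    ],
  };
}

// Assumed endpoint (check the current Gemini docs):
// POST https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash-image:generateContent
// with the API key in the "x-goog-api-key" header.
```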
by JJ Tham
# Automate Google Ads Search Term Analysis and Send Insights to Slack

Stop manually digging through endless Google Ads search term reports! 📊 This workflow puts your brand campaign analysis on autopilot, acting as an AI-powered performance marketer that works for you 24/7.

This template fetches your recent search term data, uses AI to identify wasted ad spend and new keyword opportunities, and delivers a concise, actionable report directly to your Slack channel, complete with buttons to approve the changes.

## ⚙️ How it works

This workflow connects to your Google Ads account to pull search term data from your brand campaigns. It then feeds this data to Google Gemini with a specific prompt to:

1. **Identify non-brand keywords:** isolate all search terms that are not related to your brand.
2. **Calculate wasted spend:** find terms with zero conversions and sum up the total cost.
3. **Flag opportunities:** highlight non-brand terms that are converting, for manual review.
4. **Send to Slack:** format the findings into a clear, easy-to-read Slack message with interactive buttons to approve adding the wasteful terms as negative keywords.

## 👥 Who's it for?

- **PPC & SEM managers:** save hours each week by automating the search query mining process.
- **Performance marketers:** instantly spot and plug budget leaks in your brand campaigns.
- **Digital marketing agencies:** provide proactive, data-driven insights to clients with zero manual effort.

## 🛠️ How to set up

This is an advanced workflow with several connection points. Setup involves connecting your Google Ads account, providing your Manager and Client IDs, specifying which campaign and brand terms to analyze, configuring the direct API call with your developer token, and finally connecting your Slack workspace.

👉 For a detailed, step-by-step guide, refer to the yellow sticky note inside the workflow.
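The three analysis steps that the Gemini prompt performs can also be expressed deterministically. The sketch below uses simplified row fields (`term`, `costMicros`, `conversions`) and a hypothetical brand-term list; in the Google Ads API, `cost_micros` is cost in millionths of the account currency.

```javascript
// Classify search terms: non-brand terms with zero conversions are wasted
// spend (negative-keyword candidates); converting non-brand terms are
// opportunities flagged for manual review.
const BRAND_TERMS = ["acme", "acme shoes"]; // hypothetical brand terms

function analyzeSearchTerms(rows) {
  const nonBrand = rows.filter(
    (r) => !BRAND_TERMS.some((b) => r.term.toLowerCase().includes(b))
  );
  const wasted = nonBrand.filter((r) => r.conversions === 0);
  const opportunities = nonBrand.filter((r) => r.conversions > 0);
  const wastedSpend = wasted.reduce((s, r) => s + r.costMicros, 0) / 1e6;
  return { wastedSpend, negatives: wasted.map((r) => r.term), opportunities };
}
```

The workflow's AI step produces the same three outputs, plus the natural-language summary that goes to Slack.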
by Aziz dev
## Description

This workflow automates daily reporting of Google Ads campaign performance. It pulls click and conversion data from the Google Ads API, merges both datasets, and stores the results in Notion databases and Google Sheets. It includes a campaign-level log and a daily performance summary. The workflow runs automatically every day at 08:00 AM, helping marketing teams maintain a consistent, centralized reporting system without manual effort.

## How it works

1. **Scheduled trigger at 08:00 AM:** the workflow begins with a Schedule Trigger node that runs once per day at 08:00.
2. **Set yesterday's date:** a Set node defines a variable for the target date (yesterday), which is used in the API queries.
3. **Query Google Ads API (clicks & cost):** the first HTTP request pulls campaign-level metrics: campaign.id, campaign.name, metrics.clicks, metrics.impressions, metrics.cost_micros.
4. **Query Google Ads API (conversions):** the second HTTP request pulls conversion-related data: metrics.conversions, segments.conversion_action_name.
5. **Split and merge:** both responses are split into individual campaign rows and merged on campaign.id and segments.date.
6. **Store campaign-level data:** stored in the Notion database "Google Ads Campaign Tracker" and appended to the Google Sheets tab "Campaign Daily Report".
7. **Generate a daily summary:** a Code node calculates daily totals across all campaigns (total impressions, clicks, conversions, cost, and unique conversion types). The summary is stored in the Notion database "Google Ads Daily Summary" and the Google Sheets tab "Summary Report".

## Setup steps

### 1. Schedule the workflow

- The workflow is triggered by a Schedule Trigger node.
- Set the schedule to run every day at 08:00 AM.
- Connect it to the Set Yesterday Date node.

### 2. Google Ads API access

- Create a Google Ads developer account and obtain a developer token.
- Set up OAuth2 credentials with the Google Ads scope.
- In n8n, configure the Google Ads OAuth2 API credential.
- Ensure the HTTP request headers include:
  - developer-token
  - login-customer-id
  - Content-Type: application/json

### 3. Notion database setup

Create two databases in Notion:

- **Google Ads Campaign Tracker** with fields: Campaign Name, Campaign ID, Impressions, Clicks, Cost, Conversion Type, Conversions, Date
- **Google Ads Daily Summary** with fields: Date, Total Impressions, Total Clicks, Total Conversions, Total Cost, Conversion Types

Share both databases with your Notion integration.

### 4. Google Sheets setup

- Create a spreadsheet with two tabs:
  - Campaign Daily Report → for campaign-level rows
  - Summary Report → for daily aggregated metrics
- Match all column headers to the workflow fields.
- Connect your Google account to n8n using Google Sheets OAuth2.

## Output summary

**Notion databases:**

- Google Ads Campaign Tracker: stores individual campaign metrics
- Google Ads Daily Summary: stores daily totals and conversion types

**Google Sheets tabs:**

- Campaign Daily Report: per-campaign data
- Summary Report: aggregated daily performance
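The split-and-merge step and the daily-summary Code node can be sketched as follows. Field names are simplified stand-ins for the Google Ads API fields listed above (e.g. `costMicros` for `metrics.cost_micros`).

```javascript
// Join clicks/cost rows with conversion rows on campaignId + date, then
// compute the daily totals the summary step writes to Notion and Sheets.
function mergeReports(clickRows, convRows) {
  const byKey = new Map(
    clickRows.map((r) => [
      `${r.campaignId}|${r.date}`,
      { ...r, conversions: 0, conversionTypes: [] },
    ])
  );
  for (const c of convRows) {
    const row = byKey.get(`${c.campaignId}|${c.date}`);
    if (!row) continue; // conversion for a campaign missing from the click report
    row.conversions += c.conversions;
    row.conversionTypes.push(c.conversionActionName);
  }
  return [...byKey.values()];
}

function dailySummary(rows) {
  return rows.reduce(
    (acc, r) => ({
      impressions: acc.impressions + r.impressions,
      clicks: acc.clicks + r.clicks,
      conversions: acc.conversions + r.conversions,
      cost: acc.cost + r.costMicros / 1e6, // cost_micros -> currency units
      conversionTypes: [...new Set([...acc.conversionTypes, ...r.conversionTypes])],
    }),
    { impressions: 0, clicks: 0, conversions: 0, cost: 0, conversionTypes: [] }
  );
}
```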
by Nishant Rayan
# Create Video with HeyGen and Upload to YouTube

## Overview

This workflow automates creating an AI-generated avatar video with HeyGen and uploading it directly to YouTube. By sending text input via a webhook, the workflow generates a video with a chosen avatar and voice, waits for processing, downloads the completed file, and publishes it to your configured YouTube channel. This template is ideal for automating content creation pipelines, such as daily news updates, explainer videos, or narrated scripts, without manual intervention.

## Use case

- **Marketing teams:** automate explainer or promotional video creation from text input.
- **Content creators:** generate AI-based avatar videos for YouTube directly from scripts.
- **Organizations:** streamline video generation for announcements, product updates, or tutorials.

Instead of recording and editing videos manually, this template lets you feed text content into a webhook and have a ready-to-publish video on your YouTube channel within minutes.

## How it works

1. **Webhook trigger:** the workflow starts when text content and a title are sent to the webhook endpoint.
2. **Code node:** cleans and formats the input text by removing unnecessary newlines and returns it with the title.
3. **Set node:** prepares HeyGen parameters, including the API key, avatar ID, voice ID, title, and content.
4. **HeyGen API call:** sends the request to generate a video with the provided avatar and voice.
5. **Wait node:** pauses briefly to allow HeyGen to process the video.
6. **Video status check:** polls HeyGen to check whether the video has finished processing.
7. **Conditional check:** if the video is still processing, the workflow loops back to wait; once complete, it moves forward.
8. **Download node:** retrieves the generated video file.
9. **YouTube upload node:** uploads the video to your YouTube channel with the provided title and default settings.

## Requirements

- **HeyGen API key:** required to authenticate with HeyGen's video generation API.
- **HeyGen avatar & voice IDs:** unique identifiers for the avatar and voice you want to use.
- **YouTube OAuth2 credentials:** a connected account for video uploads.

## Setup instructions

1. **Import the workflow:** download and import this template JSON into your n8n instance.
2. **Configure the webhook:** copy the webhook URL from n8n and use it to send requests with `title` and `content`. Example payload:

   ```json
   {
     "title": "Tech News Update",
     "content": "Today's top story is about AI advancements in video generation..."
   }
   ```

3. **Add HeyGen credentials:**
   - Insert your HeyGen API key in the Set node under `x-api-key`.
   - Provide your chosen `avatar_id` and `voice_id` from HeyGen.
   - To find them, first retrieve your API key from the HeyGen dashboard. Then send a GET request to https://api.heygen.com/v2/avatars to list available avatars with their `avatar_id`, and a GET request to https://api.heygen.com/v2/voices to list voices with their `voice_id`. Copy the IDs you want and paste them into the Set HeyGen Parameters node in your n8n workflow.
4. **Set up YouTube credentials:**
   - In the Google Cloud Console, enable YouTube Data API v3 and create an OAuth Client ID (choose Web Application and add the redirect URI `https://<your-n8n-domain>/rest/oauth2-credential/callback`).
   - Copy the Client ID and Client Secret, then create new YouTube OAuth2 API credentials in n8n. Enter the values, authenticate with your Google account to grant upload permissions, and test the connection. Once complete, the YouTube node is ready to upload videos automatically.
5. **Activate the workflow:** once configured, enable the workflow. Sending a POST request to the webhook with `title` and `content` triggers the full process.
## Notes

- You can adjust the video dimensions (default: 1280x720) in the HeyGen API request.
- Processing time varies with script length; the workflow uses a wait-and-poll loop until the video is ready.
- The default YouTube upload category is Education (28) and the region is US. Both can be customized in the YouTube node.
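The wait-and-poll loop can be sketched like this. The status endpoint path and the response fields (`data.status`, `data.video_url`) are assumptions based on HeyGen's typical API shape; verify them against the current HeyGen docs before relying on this.

```javascript
// Decide what to do after each status poll. Response fields are assumed.
function nextAction(statusResponse) {
  const status = statusResponse?.data?.status;
  if (status === "completed") return "download";
  if (status === "failed") return "abort";
  return "wait"; // "processing"/"pending": loop back to the Wait node
}

// Assumed status endpoint; check HeyGen's docs for your API version.
async function waitForVideo(videoId, apiKey, delayMs = 10000) {
  for (;;) {
    const res = await fetch(
      `https://api.heygen.com/v1/video_status.get?video_id=${videoId}`,
      { headers: { "x-api-key": apiKey } }
    );
    const body = await res.json();
    const action = nextAction(body);
    if (action === "download") return body.data.video_url;
    if (action === "abort") throw new Error("HeyGen video generation failed");
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

In n8n, the Wait node plus the If node implement the same loop declaratively, so no code is required.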
by SpaGreen Creative
# Shopify Order Fulfillment & Send Tracking Link via WhatsApp Using Rapiwa API

## Who is this for?

This n8n workflow automatically sends WhatsApp notifications to customers when their Shopify orders are fulfilled. It extracts order details, validates customer phone numbers for WhatsApp compatibility using the Rapiwa API, sends tracking information via WhatsApp, and logs all interactions in Google Sheets with the appropriate verification status.

## What this workflow does

This workflow listens for new order fulfillments on Shopify and automatically sends a WhatsApp message with tracking details to customers. It uses the Rapiwa API to verify whether the customer's number is on WhatsApp, formats the data, sends the message, and logs everything to Google Sheets for tracking and auditing.

## Key features

- **Webhook-triggered:** activates on new Shopify fulfillment events
- **Phone number validation:** uses Rapiwa to check WhatsApp compatibility
- **Tracking message automation:** sends real-time tracking messages via WhatsApp
- **Data cleaning:** formats phone numbers and customer data
- **Smart branching:** separates verified and unverified WhatsApp users
- **Google Sheets logging:** stores data with status labels for all messages
- **Rate-limit protection:** a Wait node spaces out API calls
- **Dual sheet logging:** maintains separate records for verified and unverified numbers

## Requirements

- An n8n instance (self-hosted or cloud)
- A Shopify store with REST API access enabled
- A Rapiwa.com account with a valid Bearer token and a connected, verified WhatsApp number
- A Google Sheet with the columns shown in the sample below
- **Google Sheets OAuth2 credentials** set up in n8n
- **Shopify API credentials** added to n8n
- **Rapiwa Bearer token** added as an httpBearerAuth credential

## How to use

### Step-by-step setup

1. **Connect Shopify to n8n:** use the Shopify Trigger node and set the event to `fulfillments/create` to capture new fulfillment events.
2. **Extract webhook data:** use a Code node to format the webhook response and capture order, customer, and tracking details.
3. **Fetch complete order information:** add an HTTP Request node using the Shopify Admin API, including the order ID, to retrieve the customer phone, email, and product details.
4. **Clean the phone number:** use a Code node to remove non-numeric characters, format the number to the international standard, and combine the customer's first and last name.
5. **Batch-process orders:** use the Split In Batches node to handle customers one by one.
6. **Validate the WhatsApp number:** use Rapiwa's `/verify-whatsapp` endpoint with a Bearer token to check whether the number exists on WhatsApp.
7. **Branch conditionally:** use an If node: if `data.exists === "true"`, take the verified path; otherwise take the unverified path.
8. **Send the WhatsApp message** with personalized tracking info:

   > Hi [Customer Name],
   > Good news! Your order has just been fulfilled.
   > Tracking Number: [Tracking Number]
   > Track your package here: [Tracking URL]
   > Thank you for shopping with us.
   > -Team SpaGreen Creative

9. **Log data to Google Sheets:** log verified and unverified entries in separate sheets, including all relevant customer and tracking data.
10. **Add a delay between messages:** use the Wait node to avoid rate limits on the Rapiwa API.

## Google Sheet column reference

A Google Sheet formatted like this ➤ Sample

| customer_id | name | email | number | tracking_company | tracking_number | tracking_url | product_title | status |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 8986XXXX06 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-02 | https://traxxxG-OT-02 | S25 Ultra 5G Smartphone | verified |
| 883XXX7982 | Abdul Mannan | contact@spagreen.net | 8801322827799 | Amazon Logistics | SG-OT-N03 | https://traxxxGOT-N03 | Samsung Galaxy S24 Ultra | verified |

## Workflow logic summary

1. Shopify webhook trigger on order fulfillment
2. Extract the webhook payload
3. Fetch order and customer details
4. Clean and format the phone number
5. Split into single-item batches
6. Check WhatsApp validity via Rapiwa
7. If verified: send the WhatsApp message and log to the verified sheet
8. If not verified: skip the message and log to the unverified sheet
9. Add a delay with the Wait node
10. Repeat for the next fulfillment

## Customization ideas

- Modify the WhatsApp message to include the delivery date or store contact
- Send different messages for different product categories
- Use product_type or shipping_zone to trigger separate workflows
- Add admin alerts for unverified numbers
- Store the message delivery status (e.g., success, failed)

## Notes & warnings

- Rapiwa is an unofficial WhatsApp API; delivery reliability is not guaranteed.
- The Google Sheet column name must include the trailing space where noted.
- The Wait node may need a longer delay for high-volume stores.
- Always format phone numbers in international format (e.g., 8801XXXXXXXXX).
- The Shopify API version used is 2025-07; update it as newer versions release.
- You must comply with WhatsApp's terms and data privacy laws when messaging users.

## Useful links

- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

## Support

- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
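The phone-cleaning Code node described in the setup (remove non-numeric characters, format to the international standard, combine first and last name) might look like the sketch below. The default country code (`880` for Bangladesh, matching the sample data) is an example assumption; adapt it to your store's market.

```javascript
// Normalize a raw phone number to international digits-only format.
function cleanPhone(raw, countryCode = "880") {
  let digits = String(raw).replace(/\D/g, ""); // drop non-numeric characters
  if (digits.startsWith("00")) digits = digits.slice(2); // 00880... -> 880...
  if (digits.startsWith("0")) digits = countryCode + digits.slice(1); // local -> intl
  return digits;
}

// Combine first and last name, tolerating missing parts.
function fullName(customer) {
  return [customer.first_name, customer.last_name].filter(Boolean).join(" ").trim();
}
```

Example: `cleanPhone("+880 1322-827799")` and `cleanPhone("01322827799")` both yield `8801322827799`, the format Rapiwa expects.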
by Oriol Seguí
# Web Consultation & Crawling Chatbot with Google Sheets Memory

## Who is this workflow for?

This workflow is designed for SEO analysts, content creators, marketing agencies, and developers who need to index a website and then interact with its content as if it were a chatbot.

⚠ Note: if the site contains many pages, AI token consumption can generate high costs, especially during the initial crawling and analysis phase.

## 1. Initial mode (first use with a URL)

When the user enters a URL for the first time:

1. URL validation using AI (gpt-5-nano).
2. Automatic sitemap discovery via robots.txt.
3. Relevant sitemap selection (pages, posts, categories, or tags) using GPT-4o, according to the configured options. (An "OPTIONS" node lets you choose precisely which types of URLs to process.)
4. Crawling of all selected pages:
   - Downloads the HTML of each page.
   - Converts the HTML to Markdown.
   - AI analysis to extract the detected language, heading hierarchy (H1, H2, etc.), internal and external links, and a content summary.
5. Structured storage in Google Sheets with these columns:
   - Lang
   - H1 and hierarchy
   - External URLs
   - Internal URLs
   - Summary
   - Content
   - Data schema (flag to enable agent mode)

When finished, the sheet is marked with Data schema = true, signaling that the site is indexed.

## 2. Agent mode (subsequent queries)

If the URL has already been indexed (Data schema = true), the chat becomes a LangChain agent that:

- Reads the database in Google Sheets.
- Can perform real-time HTTP requests when it needs updated information.
- Responds as if it were the website, using stored and live data.

This lets the user ask questions such as:

- "What's on the contact page?"
- "How many external links are there on the homepage?"
- "Give me all the H1 headings from the services pages"
- "What CTA would you suggest for my page?"
- "How would you expand X content?"

## Use cases

- Build a chatbot that answers questions about a website's content.
- Index and analyze full websites for future queries.
- SEO tool to list headings, links, and content summaries.
- Assistant for quick exploration of a site's structure.
- Generate improvement recommendations and content strategies from site data.
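Sitemap discovery via robots.txt (step 2 of the initial mode) boils down to reading the `Sitemap:` lines, which the crawler then hands to the AI for selection. A minimal sketch:

```javascript
// Extract sitemap URLs from a robots.txt body. The "Sitemap:" directive is
// matched case-insensitively, as it appears with varying casing in the wild.
function sitemapsFromRobots(robotsTxt) {
  return robotsTxt
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => /^sitemap:/i.test(line))
    .map((line) => line.slice("sitemap:".length).trim());
}
```

The workflow would fetch `https://<site>/robots.txt` with an HTTP Request node and feed the resulting URL list to the sitemap-selection step.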
by Hybroht
Using the Mistral API, this n8n workflow automates collecting, filtering, analyzing, and summarizing news articles from multiple sources. The sources come from pre-built RSS feeds and a custom DuckDuckGo node, which you can change as needed. It delivers the most relevant news of the day in a concise manner.

## How it works

1. The workflow runs each weekday at noon.
2. News is gathered from RSS feeds and a custom DuckDuckGo node, using HTTP GET requests when needed.
3. News items not from today, or containing unwanted keywords, are filtered out.
4. A first AI agent selects the top news from the titles alone and generates a general title and summary.
5. A second AI agent summarizes the full content of the selected top news articles.
6. The general summary and title are combined with the top 10 news summaries into a final output.

## Requirements

- An active n8n instance (self-hosted or cloud).
- The custom DuckDuckGo node installed: n8n-nodes-duckduckgo-search
- A Mistral API key.
- The sub-workflow configured for content that requires HTTP GET requests; it is provided in the template itself.

## Fair notice

This is an older version of the template. A newer version, no longer restricted to tech news, adds capabilities such as communication through different channels (email, social media) and advanced keyword filtering. It was recently published on n8n; you can find it here. If you are interested or would like to discuss specific needs, feel free to contact us.
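The filtering step (dropping items not from today or containing unwanted keywords) can be sketched as a Code-node-style function. The item fields (`title`, `content`, `pubDate`) mirror typical RSS feed items; the keyword list is an example.

```javascript
// Keep only items published today (local time) whose title/content contain
// none of the unwanted keywords.
function filterNews(items, bannedKeywords, today = new Date()) {
  const dayStart = new Date(today.getFullYear(), today.getMonth(), today.getDate());
  const dayMs = 24 * 60 * 60 * 1000;
  return items.filter((item) => {
    const pub = new Date(item.pubDate);
    const sameDay = pub >= dayStart && pub - dayStart < dayMs;
    const text = `${item.title} ${item.content ?? ""}`.toLowerCase();
    const banned = bannedKeywords.some((k) => text.includes(k.toLowerCase()));
    return sameDay && !banned;
  });
}
```

The surviving items are what the first AI agent ranks by title before the second agent summarizes the winners.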
by Jan Oberhauser
This workflow:

- Receives data from an incoming HTTP request (set up to use the Respond to Webhook node)
- Creates dummy data
- Converts the JSON to XML, which gets returned
- Responds to the webhook, returning the data along with its content type
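The JSON-to-XML step is handled by n8n's XML node; as an illustration of what it produces, here is a minimal stand-in (no attribute support and no XML escaping, so it is a sketch rather than a complete converter):

```javascript
// Minimal JSON -> XML stand-in: objects become nested elements, arrays repeat
// the tag, primitives become text content. No attributes, no XML escaping.
function toXml(value, tag) {
  if (Array.isArray(value)) {
    return value.map((v) => toXml(v, tag)).join("");
  }
  if (value !== null && typeof value === "object") {
    const inner = Object.entries(value)
      .map(([key, v]) => toXml(v, key))
      .join("");
    return `<${tag}>${inner}</${tag}>`;
  }
  return `<${tag}>${String(value)}</${tag}>`;
}
```

The Respond to Webhook node would then return such a string with a `Content-Type: application/xml` header, which is the "content type of the data" the workflow sends back.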
by Ludwig
## How it works

As n8n instances scale, teams often lose track of sub-workflows: who uses them, where they are referenced, and whether they can be safely updated. This leads to inefficiencies such as unnecessary copies of workflows or reluctance to modify existing ones.

This workflow solves that problem by:

1. Fetching all workflows and identifying which ones execute others.
2. Verifying that referenced sub-workflows exist.
3. Building a caller/sub-workflow dependency graph for visibility.
4. Automatically tagging sub-workflows based on their parent workflows.
5. Providing a chart visualization to highlight the most-used sub-workflows.

## Set-up steps

Estimated time: ~10–15 minutes

1. Set up n8n API credentials to allow access to workflows and tags.
2. Replace `instance_url` with your n8n instance URL.
3. Run the workflow to analyze dependencies and generate the graph.
4. Review and validate the assigned tags for sub-workflows.
5. (Optional) Enable the pie chart visualization to see the most-used sub-workflows.

This workflow is essential for enterprise teams managing large n8n instances: it prevents workflow duplication, reduces uncertainty around dependencies, and allows safe, informed updates to sub-workflows.
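The dependency analysis can be sketched from the workflow JSON returned by the n8n API. The node type string `n8n-nodes-base.executeWorkflow` is the standard Execute Workflow node; note that newer n8n versions wrap the called workflow ID in an object rather than a plain string, which the sketch accounts for.

```javascript
// Scan each workflow's nodes for Execute Workflow nodes and record which
// workflow IDs they call (caller -> sub-workflow edges).
function buildDependencyGraph(workflows) {
  const graph = {};
  for (const wf of workflows) {
    const calls = (wf.nodes ?? [])
      .filter((n) => n.type === "n8n-nodes-base.executeWorkflow")
      .map((n) => {
        const wid = n.parameters?.workflowId;
        return typeof wid === "object" && wid !== null ? wid.value : wid;
      })
      .filter(Boolean);
    if (calls.length > 0) graph[wf.id] = calls;
  }
  return graph;
}

// Invert the graph to count how often each sub-workflow is used
// (the data behind the pie chart visualization).
function usageCounts(graph) {
  const counts = {};
  for (const subs of Object.values(graph)) {
    for (const id of subs) counts[id] = (counts[id] ?? 0) + 1;
  }
  return counts;
}
```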
by Holger
## How it works

This RSS reader retrieves links from a Google Sheets file and goes through each link to fetch the items that are younger than 3 days, saves them in a second Google Sheets file, and then deletes all older entries from that second file. Retrieval can take a while because of the delays added to avoid Google API rate limiting, depending on the number of news feeds retrieved.

*A detailed description is in the sticky notes inside the workflow!*
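The "younger than 3 days" check can be sketched as a small predicate that both the save step and the cleanup (delete) step can share:

```javascript
// True if the item was published within the last maxAgeDays (default 3).
// Items with future dates are treated as not fresh.
function isFresh(pubDate, now = new Date(), maxAgeDays = 3) {
  const ageMs = now - new Date(pubDate);
  return ageMs >= 0 && ageMs < maxAgeDays * 24 * 60 * 60 * 1000;
}
```

A Code node would keep items where `isFresh(item.pubDate)` is true, and the cleanup step would delete rows where it is false.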