by OwenLee
🤯 Problem of Traditional Bookkeeping
- 🔀 Context switch kills the habit: Because bookkeeping lives outside the apps you use every day, you postpone it → forget to log.
- 🧱 High input friction: You're forced to fill rigid fields (amount/category/date/notes…), which is slow and discouraging for quick capture.
- 🎙️💸 Weak or pricey natural-language options: A few tools support voice/chat, but they're often expensive, and the experience is hit-or-miss.
- 🔒📦 Limited data ownership: Records live on third-party servers, so privacy and control are diluted.

📲 How This Workflow Fixes It
- 💬 Put the capture back where you already are: Log expenses directly inside Telegram (or other channels) in a familiar chat—no new app to learn.
- ⚡ Ultra-low-friction, unstructured input: Send text, a voice note, or a receipt photo—the flow extracts amount · item · date, supports multiple languages and relative dates, and can split multiple expenses from one message.
- 🗂️📝 Your data, your sheet: Final records are written to your own Google Sheet (columnar fields or a JSON column). You keep full control.

🔗 Demo Google Sheet: click me

👥 Who Is This For
- 😤 Anyone fed up with traditional bookkeeping but curious about an AI-assisted, chat-based way to log expenses.
- 🤖 People who tried AI bookkeeping apps but found them pricey, inflexible, or clunky.
- 💵 Bookkeeping beginners who want frictionless capture first, simple review and categorization later.

🧩 How It Works
- 💬 Captures expenses from Telegram (text, voice note, or receipt photo).
- 🔎 Normalizes inputs into raw text (uses Gemini to transcribe voice and extract text from images).
- 🧠 Parses amount · item · date with an LLM expense parser (see the sketch below).
- 📊 Appends tidy rows to Google Sheets.
- 🔔 Sends a Telegram confirmation summarizing exactly what was recorded.

🛠️ How to Set Up
1) 🔑 Connect credentials (once): TELEGRAM_BOT_TOKEN, LLM_API_KEY, GOOGLE_SHEETS_OAUTH
2) 🚀 Quick Start
- **Setup:** Create a Google Sheet to store **Log Expense** data and configure it in n8n.
- **Telegram:** Fill in and verify the **Telegram chatId**.
- **Remember to enable the workflow!**

🧰 How to Customize the Workflow
- 📝 Other user interaction channels: Add Gmail, Slack, or a website Webhook to accept email/command/form submissions that map into the same parser.
- 🌍 Currency: Extract and store currency in its own column (e.g., MYR, USD); keep the amount numeric only (no symbols).
- 🔎 Swap in higher-accuracy OCR/STT to reduce errors.

📩 Help
Contact: owenlzyxg@gmail.com
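To make the parsing step concrete, here is a minimal n8n Code-node sketch of how the LLM parser's output might be normalized into sheet-ready rows. The `llmOutput` field, the relative-date handling, and the row shape are illustrative assumptions, not the template's actual node code:

```javascript
// Hypothetical Code-node sketch: normalize LLM parser output into sheet rows.
// Assumes the LLM returns a JSON array like:
//   [{ "amount": 12.5, "item": "coffee", "date": "yesterday" }, ...]
const raw = JSON.parse($json.llmOutput); // assumed field name

function resolveRelativeDate(text) {
  const today = new Date();
  if (/yesterday/i.test(text)) today.setDate(today.getDate() - 1);
  else if (!/today/i.test(text)) {
    const parsed = new Date(text);
    if (!isNaN(parsed)) return parsed.toISOString().slice(0, 10);
  }
  return today.toISOString().slice(0, 10);
}

return raw
  .filter((e) => typeof e.amount === 'number' && e.amount > 0 && e.item)
  .map((e) => ({
    json: {
      amount: e.amount, // numeric only, no currency symbols
      item: String(e.item).trim(),
      date: resolveRelativeDate(String(e.date ?? 'today')),
    },
  }));
```

Because the parser already splits multiple expenses into an array, each element becomes its own n8n item and therefore its own appended row.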
by takuma
Who is this for
This template is perfect for:
- **Market Researchers** tracking industry trends.
- **Tech Teams** wanting to stay updated on specific technologies (e.g., "AI", "Cybersecurity").
- **Content Creators** looking for curated news topics.
- **Busy Professionals** who need a high-signal, low-noise news digest.

What it does
- Fetches News: Pulls daily articles via NewsAPI based on your chosen keyword (default: "technology"); see the request sketch below.
- AI Filtering: Uses an AI Agent (via OpenRouter) to filter out low-quality or irrelevant clickbait.
- Daily Digest (Slack): Summarizes the top 3 articles in English, translates the summaries to Japanese using DeepL (optional), and posts both versions to a Slack channel.
- Data Archiving (Sheets): Extracts structured data (Title, Author, Summary, URL) and saves it to Google Sheets.
- Weekly Trend Report: Every Monday, it reads the past week's data from Google Sheets and uses AI to generate a high-level trend report and strategic insights.

How to set up
1. Configure Credentials: You will need API keys/auth for NewsAPI, OpenRouter (or OpenAI), DeepL, Google Sheets, and Slack.
2. Set up the Google Sheet: Create a sheet with the following headers in the first row: title, author, summary, url.
3. Map the Sheet: In the "Append row in sheet" and "Read sheet (weekly)" nodes, select your file and map the columns.
4. Define the Keyword: Open the "Set Keyword" node and change chatInput to the topic you want to track (e.g., "Crypto", "SaaS", "Climate Change").
5. Slack Setup: Select your desired channel in the Slack nodes.

Requirements
- **n8n** (self-hosted or Cloud)
- **NewsAPI** key (free tier available)
- **OpenRouter** (or any LangChain-compatible Chat Model like OpenAI)
- **DeepL** API key (for translation)
- **Google Sheets** account
- **Slack** workspace

How to customize
- **Change the Language:** Remove the DeepL node if you only want English, or change the target language code.
- **Adjust the Prompt:** Modify the "AI Agent (Filter)" system message to change how strict the news filtering is.
- **Change Schedule:** Adjust the Cron nodes to run at your preferred time (currently set to daily 8 AM and weekly Monday 9 AM).
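The "Fetches News" step amounts to a single request against NewsAPI's everything endpoint. As a rough sketch of what the fetch assembles (the key handling, page size, and field selection are illustrative, not the template's exact node configuration):

```javascript
// Illustrative sketch of the NewsAPI request behind "Fetches News".
// NEWSAPI_KEY is an assumed environment variable; see https://newsapi.org/docs.
const keyword = 'technology';
const url =
  'https://newsapi.org/v2/everything' +
  `?q=${encodeURIComponent(keyword)}` +
  '&sortBy=publishedAt&pageSize=20&language=en' +
  `&apiKey=${process.env.NEWSAPI_KEY}`;

const res = await fetch(url);
const { articles } = await res.json();

// Keep only the fields the digest and the sheet actually need.
const items = articles.map((a) => ({
  title: a.title,
  author: a.author,
  summary: a.description,
  url: a.url,
}));
console.log(items.slice(0, 3)); // the top 3 feed the daily Slack digest
```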
by Cheng Siong Chin
Introduction
Automate peer review assignment and grading with AI-powered evaluation. Designed for educators managing collaborative assessments efficiently.

How It Works
A webhook receives assignments; the workflow distributes them, has AI generate review rubrics, emails reviewers, collects responses, calculates scores, stores results, emails reports, updates dashboards, and posts analytics to Slack.

Workflow Template
Webhook → Store Assignment → Distribute → Generate Review Rubric → Notify Slack → Email Reviewers → Prepare Response → Calculate Score → Store Results → Check Status → Generate Report → Email Report → Update Dashboard → Analytics → Post to Slack → Respond to Webhook

Workflow Steps
- Receive & Store: Webhook captures assignments, stores data.
- Distribute & Generate: Assigns peer reviewers, AI creates rubrics.
- Notify & Email: Alerts via Slack, sends review requests.
- Collect & Score: Gathers responses, calculates peer scores (see the sketch below).
- Report & Update: Generates reports, emails results, updates dashboard.
- Analyze & Alert: Posts analytics to Slack, confirms completion.

Setup Instructions
1. Webhook & Storage: Configure the endpoint, set up the database.
2. AI Configuration: Add your OpenAI key, customize rubric prompts.
3. Communication: Connect Gmail and Slack credentials.
4. Dashboard: Link your analytics platform, configure metrics.

Prerequisites
- OpenAI API key
- Gmail account
- Slack workspace
- Database or storage system
- Dashboard tool

Use Cases
- University peer review assignments
- Corporate training evaluations
- Research paper assessments

Customization
- Multi-round review cycles
- Custom scoring algorithms
- LMS integration (Canvas, Moodle)

Benefits
- Eliminates manual distribution
- Ensures consistent evaluation
- Provides instant feedback and analytics
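The "Calculate Score" step can be pictured as a small aggregation over rubric ratings. A minimal Code-node sketch, assuming each response carries an `assignmentId`, a `reviewer`, and an array of numeric `ratings` (all assumed field names):

```javascript
// Hypothetical sketch of "Calculate Score": average each reviewer's rubric
// ratings, then average across reviewers per assignment.
const reviews = $input.all().map((i) => i.json);
// e.g., [{ assignmentId: 'A1', reviewer: 'r1', ratings: [4, 5, 3] }, ...]

const byAssignment = {};
for (const r of reviews) {
  const mean = r.ratings.reduce((a, b) => a + b, 0) / r.ratings.length;
  (byAssignment[r.assignmentId] ??= []).push(mean);
}

return Object.entries(byAssignment).map(([assignmentId, means]) => ({
  json: {
    assignmentId,
    reviewerCount: means.length,
    peerScore: +(means.reduce((a, b) => a + b, 0) / means.length).toFixed(2),
  },
}));
```

A custom scoring algorithm (weighted criteria, dropped outliers, reviewer calibration) would replace the simple mean here.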
by InfyOm Technologies
✅ What problem does this workflow solve?
Tracking what people say about your brand on Twitter can be overwhelming, especially when important mentions slip through the cracks. This workflow automates the process: it scrapes Twitter mentions, analyzes sentiment using OpenAI, logs everything in a Google Sheet, and sends real-time Slack alerts for negative tweets. No manual monitoring needed.

⚙️ What does this workflow do?
- Runs on a schedule to monitor Twitter mentions or hashtags.
- Uses Apify to scrape tweets based on brand keywords.
- Filters out tweets already processed (avoids duplicates).
- Performs sentiment analysis with OpenAI (Positive, Neutral, Negative).
- Logs tweet content, sentiment, and reply (if any) in a Google Sheet.
- Sends an instant Slack notification for negative tweets.
- Generates thank-you replies for positive tweets and logs them.

🔧 Setup Instructions
- 🗓 Schedule Trigger: Use the Cron node to schedule checks (e.g., every hour, daily).
- 🐦 Apify Twitter Scraper Setup: Sign up on Apify, generate your Apify API Token, and use it in the HTTP node to run the actor and get tweet results.
- 🤖 OpenAI Sentiment Analysis: Get your API key from OpenAI.
- 📄 Google Sheet Configuration: Prepare a Google Sheet with this sample format and connect it using the Google Sheets node in n8n.
- 💬 Slack Notifications: Connect your Slack workspace via the Slack node and set up the channel where negative tweets should be sent as alerts.

🧠 How it Works
1. Scheduled Run: Triggered at a fixed interval using the Schedule (Cron) node.
2. Scrape Mentions from Twitter: The Apify actor runs and collects recent tweets mentioning your brand or using your hashtag. Links to the tweets are extracted.
3. Filter Previously Seen Tweets: Each tweet is checked against the Google Sheet. If already present, it's skipped to avoid duplicate analysis (see the sketch below).
4. Analyze Sentiment with OpenAI: New tweets are classified as ✅ Positive, ⚪ Neutral, or ❌ Negative.
5. Store Results in Google Sheet: The tweet link, content, and sentiment are stored in a row. If sentiment is positive, a thank-you reply is also generated and saved.
6. Notify Slack for Negative Tweets: When a tweet is tagged Negative, a Slack message is sent to the designated channel with the tweet link.

👤 Who can use this?
This workflow is ideal for:
- 📢 Social Media Teams
- 🧠 PR and Brand Managers
- 🧑‍💻 Solo Founders
- 🏢 Startups & SaaS Companies
Stay ahead of your brand's reputation—automatically.

🛠 Customization Ideas
- 🎯 Add filters for specific campaign hashtags.
- 📬 Send weekly summary reports via email.
- 📥 Auto-open support tickets for negative mentions.
- 🗣 Expand sentiment categories with more detailed tagging.

🚀 Ready to get started?
Just plug in:
- 🔑 Your Apify API Token
- 🔑 Your OpenAI API Key
- 📄 Your Google Sheet
- 💬 Your Slack Channel
Then deploy the workflow, and let it monitor Twitter for you!
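The duplicate filter in step 3 boils down to a set lookup against the sheet. A minimal Code-node sketch, where the node names (`Apify Scraper`, `Read Sheet`) and the `tweet_link` column are assumptions about this template's layout:

```javascript
// Hypothetical sketch of "Filter Previously Seen Tweets": compare scraped
// tweet URLs against links already logged in the Google Sheet.
const scraped = $('Apify Scraper').all().map((i) => i.json);
const logged = new Set(
  $('Read Sheet').all().map((i) => i.json.tweet_link) // assumed column name
);

// Pass through only tweets the sheet has not seen yet.
return scraped
  .filter((t) => !logged.has(t.url))
  .map((t) => ({ json: { tweet_link: t.url, content: t.text } }));
```

Using a Set keeps the lookup O(1) per tweet, which matters once the sheet accumulates thousands of logged mentions.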
by AbSa~
🚀 Overview
This workflow automates video uploads from Telegram directly to Google Drive, complete with smart file renaming, Google Sheets logging, and AI assistance via Google Gemini. It's perfect for creators, educators, or organizations that want to streamline video submissions and file management.

⚙️ How It Works
1. Telegram Trigger -> starts the workflow when a user sends a video file to your Telegram bot.
2. Switch Node -> detects the file type or command and routes the flow accordingly.
3. Get File -> downloads the Telegram video file.
4. Upload to Google Drive -> automatically uploads the video to your chosen Drive folder.
5. Smart Rename -> the file name is auto-formatted using dynamic logic (date, username, or custom tags); see the sketch below.
6. Google Sheets Logging -> appends or updates upload data (e.g., filename, sender, timestamp) for easy tracking.
7. AI Agent Integration -> uses Google Gemini AI connected to Data Vidio memory to analyze or respond intelligently to user queries.
8. Telegram Notification -> sends confirmation or status messages back to Telegram.

🧠 Highlights
- Seamlessly integrates Telegram → Google Drive → Google Sheets → Gemini AI
- Supports file update or append mode
- Auto-rename logic via the Code node
- Works with custom memory tools for smarter AI responses
- Easy to clone and adapt; just connect your own credentials

🪄 Ideal Use Cases
- Video assignment submissions for schools or academies
- Media upload management for marketing teams
- Automated video archiving and AI-assisted review
- Personal Telegram-to-Drive backup assistant

🧩 Setup Tips
1. Copy and use the provided Google Sheet template (SheetTemplate).
2. Configure your Telegram Bot token, Google Drive, and Sheets credentials.
3. Update the AI Agent node with your Gemini API key and connect the Data Vidio sheet.
4. Test with a sample Telegram video before full automation.
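The Smart Rename step is a good candidate for a Code node like the one below. This is a sketch only: the exact filename pattern, the `.mp4` extension, and the 40-character tag cap are assumptions, though the `message.from.username`, `message.date`, and `message.caption` paths match the Telegram Bot API payload:

```javascript
// Hypothetical Smart Rename sketch: build a Drive-friendly filename from the
// incoming Telegram message.
const msg = $json.message ?? {};
const user = (msg.from?.username || 'anonymous').toLowerCase();
const date = new Date((msg.date ?? Date.now() / 1000) * 1000)
  .toISOString()
  .slice(0, 10); // YYYY-MM-DD
const tag = (msg.caption || 'video')
  .trim()
  .replace(/[^\w\s-]/g, '') // strip characters that clutter Drive search
  .replace(/\s+/g, '-')
  .slice(0, 40);

return [{ json: { fileName: `${date}_${user}_${tag}.mp4` } }];
```

Leading with the date keeps Drive folders naturally sorted by submission day.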
by n8nwizard
📌 Overview
This advanced multi-phase n8n workflow automates the complete research, analysis, and ideation pipeline for a YouTube strategist. It scrapes competitor channels, analyzes top-performing titles and thumbnails, identifies niche trends, gathers audience sentiment from comments, and produces data-driven content ideas—automatically writing them into a structured Google Sheets dashboard.

This system is ideal for:
- YouTube creators
- Agencies
- Content strategists
- Automation engineers
- Anyone who wants to generate YouTube content ideas backed by real data

🧠 High-Level Architecture
The workflow is split into 5 phases, each building on the previous:
- Phase 1 — Niche Outliers (Input: User Form): Scrapes 3 high-quality channels from your niche, extracts their outlier videos, and analyzes why they work.
- Phase 2 — Broad Niche Insights (Weekly): Scrapes the top trending content in your broad niche (e.g., "AI", "fitness", "personal finance") and logs weekly insights.
- Phase 3 — Niche Insights (Daily): Scrapes the top videos in your specific micro-niche daily to keep track of content momentum.
- Phase 4 — Comment Analysis: Analyzes real comments from your channel to understand what your audience likes, dislikes, and wants more of.
- Phase 5 — Content Ideation: Generates 3 highly optimized title + thumbnail concepts using all prior insights.
Everything is automatically logged into a Google Sheets dashboard.

🧩 Phase-by-Phase Breakdown

⭐ Phase 1 — Niche Outliers (Form Trigger)
- User enters 3 YouTube channel URLs in a form.
- Workflow scrapes each channel using the Apify YouTube Scraper.
- Filters for top-performing videos (see the outlier-filter sketch below).
- Extracts: title, views, likes, thumbnail, URL.
- AI analyzes power words in titles (OpenRouter/GPT-4.1-mini) and thumbnail attention hooks (OpenAI Vision).
- All insights are appended to the "Niche Outliers" sheet.
Purpose: Understand what the best creators in your niche are doing.

🌐 Phase 2 — Broad Niche Insights (Weekly — Sundays @ 5 AM)
- Scrapes the top videos for a broad niche (e.g., "artificial intelligence").
- Analyzes title structure, power words, and thumbnail cues.
- Writes weekly insights to the "Broad Niche Weekly" sheet.
Purpose: Stay informed about macro-level trends.

🎯 Phase 3 — Niche Insights (Daily @ 6 AM)
- Scrapes the top videos in your specific micro-niche (e.g., "n8n automations").
- Runs title + thumbnail analysis.
- Appends daily results to "Niche Daily"; results feed directly into Phase 5.
Purpose: Track daily momentum and trending formats.

💬 Phase 4 — Comment Analysis (Channel Feedback)
- Scrapes your channel's latest 5 videos and extracts up to 30 comments from each.
- Aggregates comments; AI identifies what viewers love, what they dislike, and what they are asking for.
- Stores patterns in the "Comment Analysis" sheet.
Purpose: Understand real audience demand.

💡 Phase 5 — Content Ideation Using AI
Using insights from all previous phases (top titles, power words, thumbnail patterns, daily niche trends, audience comment analysis, channel positioning), the Creative Agent produces 3 optimized video titles and 3 matching thumbnail concepts. These are appended to the "Ideation" sheet, and a Slack notification is sent when ideation is ready.
Purpose: Fully automated content idea generation.

🗂️ Outputs in Google Sheets
The workflow populates these tabs:
- 📌 Niche Outliers (top competitor videos)
- 📌 Broad Niche Weekly (weekly trend analysis)
- 📌 Niche Daily (daily trend analysis)
- 📌 Comment Analysis (audience sentiment)
- 📌 Ideation (final titles + thumbnails)

🔧 What This Workflow Automates
✔ Competitor analysis
✔ Thumbnail + title breakdowns
✔ Daily niche tracking
✔ Weekly niche tracking
✔ Viewer sentiment analysis
✔ Fully AI-generated content ideas
✔ Automatic data logging to Google Sheets
✔ Slack notifications
This is essentially a 24/7 AI YouTube strategist.

⚙️ Setup Requirements
- **Apify API Key** (used in 5 scraper nodes)
- **OpenRouter API Key** (for GPT-4.1-mini intelligence)
- **OpenAI API Key** (for thumbnail image analysis)
- **Google Sheets OAuth2 Credential**
- Make a copy of the provided sheet template.
- Fill the Set nodes: Phase II — broad niche (e.g., "AI"); Phase III — micro-niche (e.g., "n8n automations"); Phase IV — your channel URL; Phase V — your channel description.

🧪 Testing Guide
1. Test the Form Trigger with 3 competitor channel URLs.
2. Test both Schedule Triggers (weekly + daily) manually.
3. Verify the Sheets are receiving rows.
4. Run the full pipeline end-to-end.
5. Confirm the Slack notification.
Everything should chain together smoothly.

🎉 Final Result
By the end of this workflow, you have a 🧠 data-driven YouTube strategy system that studies your niche, finds outliers, understands your audience, detects trends, and generates smart content ideas every day.
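One common way to implement the Phase 1 outlier filter is to compare each video's views against the channel's median. The sketch below assumes a 3x-median threshold and the field names listed above; the template's actual cutoff may differ:

```javascript
// Hypothetical outlier-filter sketch: keep videos whose views exceed the
// channel median by a multiplier.
const videos = $input.all().map((i) => i.json);
// e.g., [{ title, views, likes, thumbnail, url }, ...]

const sorted = videos.map((v) => v.views).sort((a, b) => a - b);
const median = sorted[Math.floor(sorted.length / 2)] || 1;

return videos
  .filter((v) => v.views >= median * 3) // "outlier" = 3x median views (assumed)
  .map((v) => ({
    json: { ...v, outlierScore: +(v.views / median).toFixed(1) },
  }));
```

A median baseline is more robust than a mean here, since a single viral video would otherwise drag the average up and hide other outliers.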
by Oneclick AI Squad
This automated workflow monitors your website's keyword rankings daily and sends instant alerts to your team when significant ranking drops occur. It fetches current ranking positions, compares them with historical data, and triggers notifications through Slack and email when keywords drop beyond your defined threshold.

Good to know
- The workflow uses SERP API for accurate ranking data; API costs apply based on your usage volume.
- Ranking checks are performed daily to avoid overwhelming search engines with requests.
- The system tracks ranking changes over time and maintains historical data for trend analysis.
- Slack integration requires workspace permissions and proper bot configuration.
- False positives may occur due to personalized search results or data center variations.

How it works
- **Daily SEO Check Trigger** initiates the workflow on a scheduled basis.
- **Get Keywords Database** retrieves your keyword list and current ranking data.
- **Filter Active Keywords Only** processes only keywords marked as active for monitoring.
- **Fetch Google Rankings via SERP API** gets current ranking positions for each keyword.
- **Wait For Response** pauses until the ranking results come back from the API.
- **Parse Rankings & Detect Changes** compares new rankings with historical data and identifies significant drops.
- **Filter Significant Ranking Drops** isolates keywords that dropped beyond your threshold (e.g., 5+ positions); see the sketch below.
- **Send Slack Ranking Alert** notifies your team channel about ranking drops.
- **Send Email Ranking Alert** sends detailed email reports to stakeholders.
- **Update Rankings in Google Sheet** saves new ranking data for historical tracking.
- **Generate SEO Monitoring Summary** creates a comprehensive report of all ranking changes.

How to use
1. Import the workflow into n8n and configure your SERP API credentials.
2. Set up your Google Sheet with the required keyword database structure.
3. Configure the Slack webhook URL and email SMTP settings.
4. Define your ranking-drop threshold (recommended: 5+ position drops).
5. Test the workflow with a small keyword set before full deployment.
6. Schedule the workflow to run daily during off-peak hours.

Requirements
- **SERP API account** with sufficient credits for daily keyword checks
- **Google Sheets access** for keyword database and ranking storage
- **Slack workspace** with webhook permissions for team notifications
- **Email service** (SMTP or API) for stakeholder alerts
- **Keywords database** properly formatted in Google Sheets

Database/Sheet Columns
Required Google Sheet: "Keywords Database". Create a Google Sheet with the following columns:

| Column Name | Description | Example |
|-------------|-------------|---------|
| keyword | Target keyword to monitor | "best seo tools" |
| domain | Your website domain | "yourwebsite.com" |
| current_rank | Latest ranking position | 5 |
| previous_rank | Previous day's ranking | 3 |
| status | Monitoring status | "active" |
| target_url | Expected ranking URL | "/best-seo-tools-guide" |
| search_volume | Monthly search volume | 1200 |
| difficulty | Keyword difficulty score | 65 |
| date_added | When keyword was added | "2025-01-15" |
| last_checked | Last monitoring date | "2025-07-30" |
| drop_threshold | Custom drop alert threshold | 5 |
| category | Keyword grouping | "Product Pages" |

Customising this workflow
- **Modify ranking thresholds** in the "Filter Significant Ranking Drops" node to adjust sensitivity (e.g., 3+ positions vs 10+ positions).
- **Add competitor monitoring** by duplicating the SERP API node and tracking competitor rankings for the same keywords.
- **Customize alert messages** in the Slack and email nodes to include your brand voice and specific stakeholder information.
- **Extend to multiple search engines** by adding Bing or Yahoo ranking checks alongside Google.
- **Implement ranking improvement alerts** to celebrate when keywords move up significantly.
- **Add mobile vs desktop tracking** by configuring separate SERP API calls for different device types.
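A minimal sketch of the "Filter Significant Ranking Drops" logic, using the sheet columns above. Treating a missing rank as "not found" and the per-keyword fallback to a default threshold are assumptions, not necessarily the template's exact rules:

```javascript
// Hypothetical drop-filter sketch: keep only keywords whose rank worsened by
// at least their threshold (a larger rank number = a worse position).
const DEFAULT_THRESHOLD = 5;

return $input.all()
  .filter(({ json: k }) => {
    if (!k.current_rank || !k.previous_rank) return false; // not found / new
    const drop = k.current_rank - k.previous_rank;
    return drop >= (k.drop_threshold || DEFAULT_THRESHOLD);
  })
  .map(({ json: k }) => ({
    json: {
      keyword: k.keyword,
      domain: k.domain,
      dropped_by: k.current_rank - k.previous_rank,
      current_rank: k.current_rank,
      previous_rank: k.previous_rank,
    },
  }));
```

Reading the threshold from the `drop_threshold` column lets high-value keywords alert on small slips while long-tail terms stay quiet.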
by Hardikkumar
This workflow is the AI analysis and alerting engine for a complete social media monitoring system. It's designed to work with data scraped from X (formerly Twitter) using a tool like the Apify Tweet Scraper, which logs the data into a Google Sheet. The workflow then automatically analyzes new tweets with Google Gemini and sends tailored alerts to Slack.

How it works
This workflow automates the analysis and reporting part of your social media monitoring:
- **Tweet Hunting:** It finds tweets for the query entered in the Set node and passes the data to Google Sheets.
- **Fetches New Tweets:** It gets all new rows from your Google Sheet that haven't been processed yet (it looks for "Notmarked" in the 'action taken' column).
- **Prepares for AI:** It combines the data from all new tweets into a single, clean prompt for the AI to analyze.
- **AI Analysis with Gemini:** It sends the compiled data to Google Gemini, asking for a full summary report and a separate, machine-readable JSON list of any urgent items.
- **Splits the Response:** The workflow separates the AI's text summary from the JSON data for urgent alerts (see the sketch below).
- **Sends Notifications:** The high-level summary is sent to a general Slack channel (e.g., #brand-alerts), and each urgent item is sent as a separate, detailed alert to a high-priority Slack channel (e.g., #urgent).

Set up steps
It should take about 5–10 minutes to get this workflow running.
1. Prerequisite — Data Source: Ensure you have a Google Sheet being populated with tweet data. For complete automation, set up a new Google Sheet with the same structure for saving the tweet data and run the Tweet Scraper on a schedule.
2. Configure Credentials: Make sure you have credentials set up in your n8n instance for Google Sheets, Google Gemini (PaLM) API, and Slack.
3. Google Sheets Node ("Get row(s) in sheet"): Select your Google Sheet containing the tweet data, choose the specific sheet name from the dropdown, and ensure your sheet has a column named action taken so the filter works correctly.
4. Google Gemini Chat Model Node: Select your Google Gemini credential from the dropdown.
5. Slack Nodes ("Send a message" & "Send a message1"): In the first Slack node, choose the channel for the summary report; in the second, choose the channel for urgent alerts.
6. Save and Activate: Once configured, save your workflow and turn it on!
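The response-splitting step can be sketched as extracting the first JSON array from the Gemini reply and treating the rest as the summary. This assumes the prompt instructs the model to append the urgent-items array after the report; the `text` field and `urgentItems` name are assumptions as well:

```javascript
// Hypothetical "Splits the Response" sketch: separate the text summary from
// the machine-readable JSON list of urgent items.
const reply = $json.text ?? '';

// Grab the first JSON array in the reply, e.g. [{"tweet": "...", ...}].
const match = reply.match(/\[[\s\S]*\]/);
let urgent = [];
if (match) {
  try {
    urgent = JSON.parse(match[0]);
  } catch (e) {
    urgent = []; // malformed JSON from the model; fall back to summary only
  }
}

const summary = match ? reply.replace(match[0], '').trim() : reply.trim();
return [{ json: { summary, urgentItems: urgent } }];
```

Downstream, the summary feeds the first Slack node and each `urgentItems` entry becomes its own high-priority alert.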
by Emilio Loewenstein
This workflow automates customer email support by combining Gmail, AI classification, and a knowledge base to provide instant, accurate, and friendly responses. It's designed for businesses that want to improve customer satisfaction while reducing manual workload.

🚀 How it Works
1. Gmail Trigger: The workflow listens for new incoming Gmail messages.
2. Text Classification: Each email is classified using AI as either Customer Support or Other. If it's Other, the workflow stops; if it's Customer Support, the email continues to the AI agent (see the routing sketch below).
3. AI Agent with Knowledge Base: The AI agent reads the customer's message, searches the Pinecone knowledge base for FAQs and policies, generates a helpful, polite, and detailed reply using an OpenRouter model, and signs off as Mrs. Helpful from Tech Haven Solutions.
4. Reply to Gmail: The drafted email is automatically sent back to the customer.

💡 Value
✅ Save Time – No more manual triaging and drafting of replies.
✅ Consistency – Every answer is based on your own FAQs/policies.
✅ Customer Satisfaction – Fast, friendly, and accurate responses 24/7.
✅ Scalable – Handle higher email volume without scaling headcount.

🔑 Credentials Needed
To use this workflow, connect the following accounts:
- **Gmail OAuth2** → for receiving and sending emails.
- **OpenRouter API** → for text classification and AI-generated replies.
- **OpenAI API** → for embeddings (to connect FAQs with AI).
- **Pinecone API** → for storing and retrieving knowledge base content.

🛠 Example Use Case
A customer writes:
> "Hi, I placed an order last week but haven't received a shipping confirmation yet. Can you check the status?"
The workflow will:
1. Detect it's a support-related email.
2. Retrieve order policy information from the knowledge base.
3. Generate a friendly response explaining order tracking steps.
4. Automatically reply to the customer in Gmail.

⚡️ Setup Instructions
1. Import this workflow into your n8n instance.
2. Connect your Gmail, OpenRouter, OpenAI, and Pinecone credentials.
3. Populate your Pinecone knowledge base with FAQs and policies.
4. Activate the workflow.
From now on, all support-related emails will be automatically answered by your AI-powered support agent.
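The classification gate in step 2 reduces to a simple route on the classifier's label. As a minimal Code-node sketch, where the `category` field and the label strings are assumptions about what the classifier emits:

```javascript
// Hypothetical routing sketch: pass only "Customer Support" emails onward.
const items = $input.all();

const support = items.filter(
  (i) => (i.json.category || '').toLowerCase() === 'customer support'
);

// Emails labeled "Other" simply end here; n8n drops the empty branch.
return support.map((i) => ({
  json: {
    from: i.json.from,
    subject: i.json.subject,
    body: i.json.body,
  },
}));
```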
by Davide
This is an example of an advanced automated data extraction and enrichment pipeline built with ScrapeGraphAI. Its primary purpose is to systematically scrape the n8n community workflows website, extract detailed information about recently added workflows, process that data using multiple AI models, and store the structured results in a Google Sheets spreadsheet. This workflow demonstrates a sophisticated use of n8n that moves beyond simple API calls into intelligent, AI-driven web scraping and data processing, turning unstructured website content into valuable, structured business intelligence.

Key Advantages
✅ Full Automation: Once triggered (manually or on a schedule via the Schedule Trigger node), the entire process runs hands-free, from data collection to spreadsheet population.
✅ Powerful AI-Augmented Scraping: It doesn't just scrape raw HTML. It uses multiple AI agents (Google Gemini, OpenAI) to:
- Understand page structure to find the right data on the main list.
- Clean and purify content from individual pages, removing irrelevant information.
- Perform precise information extraction to parse unstructured text into structured JSON data based on a defined schema (author, price, etc.).
- Generate intelligent summaries, adding significant value by explaining each workflow's purpose in Italian.
✅ Robust and Structured Data Output: The Structured Output Parser and Information Extractor nodes ensure the data is clean, consistent, and ready for analysis. The workflow outputs perfectly formatted JSON that maps directly to spreadsheet columns (see the schema sketch below).
✅ Scalability via Batching: The Split In Batches and Loop Over Items nodes allow the workflow to process a dynamically sized list of workflows. Whether there are 5 or 50 new workflows, it will process each one sequentially without failing.
✅ Effective Data Integration: It seamlessly integrates with Google Sheets, acting as a simple and powerful database. This makes the collected data immediately accessible, shareable, and available for visualization in tools like Looker Studio.
✅ Resilience to Website Changes: By using AI models that understand content and context (like "find the 'Recently Added' section" or "find the author's name"), the workflow is more resilient to minor cosmetic changes on the target website than traditional CSS/XPath selectors.

How It Works
The workflow operates in two main phases.

Phase 1: Scraping the Main List
1. Trigger: The workflow can be started manually ("Execute Workflow") or automatically on a schedule.
2. Scraping: The "Scrape main page" node (using ScrapeGraphAI) fetches and converts the https://n8n.io/workflows/ page into clean Markdown format.
3. Data Extraction: An LLM chain ("Extract 'Recently added'") analyzes the Markdown. It is specifically instructed to identify all workflow titles and URLs within the "Recently Added" section and output them as a structured JSON array named workflows.
4. Data Preparation: The resulting array is set as a variable and then split out into individual items, preparing them for one-by-one processing.

Phase 2: Processing Individual Workflows
1. Loop: The "Loop Over Items" node iterates through each workflow URL obtained from Phase 1.
2. Scrape & Clean Detail Page: For each URL, the "Scrape single Workflow" node fetches the detail page. Another LLM chain ("Main content") cleans the resulting Markdown, removing superfluous content and focusing only on the core article text.
3. Information Extraction: The cleaned Markdown is passed to an "Information Extractor" node, which uses a language model to locate and structure specific data points (title, URL, ID, author, categories, price) into a defined JSON schema.
4. Summarization: The cleaned Markdown is also sent to a Google Gemini node ("Summarization content"), which generates a concise Italian summary of the workflow's purpose and the tools used.
5. Data Consolidation & Export: The extracted information and the generated summary are merged into a single data object. Finally, the "Add row" node maps all this data to the appropriate columns and appends it as a new row in a designated Google Sheet.

Set Up Steps
To run this workflow, you need to configure the following credentials in your n8n instance:
1. ScrapeGraphAI Account: The "Scrape main page" and "Scrape single Workflow" nodes require valid ScrapeGraphAI API credentials named ScrapegraphAI account. Install the related community node.
2. Google Gemini Account: Multiple nodes ("Google Gemini Chat Model", "Summarization content", etc.) require API credentials for Google Gemini named Google Gemini(PaLM) (Eure).
3. OpenAI Account: The "OpenAI Chat Model1" node requires API credentials for OpenAI named OpenAi account (Eure).
4. Google Sheets Account: The "Add row" node requires OAuth2 credentials for Google Sheets named Google Sheets account. You must also ensure the node is configured with the correct Google Sheet ID and that the sheet has a worksheet named Foglio1 (or update the node to match your sheet's name).

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
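To illustrate what the Information Extractor is asked to produce, here is the kind of JSON schema it might be configured with. The exact property set and which fields are required are assumptions inferred from the fields listed above:

```javascript
// Illustrative schema sketch for the Information Extractor node.
const extractorSchema = {
  type: 'object',
  properties: {
    title:      { type: 'string' },
    url:        { type: 'string' },
    id:         { type: 'string' },
    author:     { type: 'string' },
    categories: { type: 'array', items: { type: 'string' } },
    price:      { type: 'string' }, // e.g., "Free" or a dollar amount
  },
  required: ['title', 'url', 'author'], // assumed minimum for a usable row
};

console.log(JSON.stringify(extractorSchema, null, 2));
```

Pinning the output to a schema like this is what lets the "Add row" node map fields to columns without any further cleanup.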
by Weiser22
Shopify Multilingual Product Copy with n8n & Gemini 2.5 Flash-Lite
Created by Weiser22 · Last update 2025-09-02
Categories: E-commerce, Product Content, Translation, Computer Vision

Description
Generate language-specific Shopify product copy (ES, DE, EN, FR, IT, PT) from each product's main image and metadata. The workflow performs a vision analysis to extract objective, verifiable details, then produces product names, descriptions, and handles per language, and stores the results in Google Sheets for review or publishing.

Good to know
- **Model:** models/gemini-2.5-flash-lite (supports image input). Confirm pricing/limits in your account before scaling.
- **Image requirement:** products should have images[0].src; add a fallback if some products lack a primary image.
- **Sheets mapping:** the sheet node uses Auto-map; ensure your matching column aligns with the field you emit (id vs product_id).
- **Strict output:** the Agent enforces a multilingual JSON contract (es, de, en, fr, it, pt), each with shopify_product_name, shopify_description, handle.

How it works
1. **Manual Trigger:** start a test run on demand.
2. **Get many products (Shopify):** fetch products and their images.
3. **Analyze image (Gemini Vision):** send images[0].src with an objective, 3–5 sentence prompt.
4. **AI Agent (Gemini Chat):** merge Shopify fields + vision text under anti-hallucination rules and a strict JSON schema.
5. **Structured Output Parser:** validates the exact JSON shape.
6. **Expand Languages & Sanitize (Code):** split into 6 items and normalize handles/HTML content as needed (see the handle sketch below).
7. **Append row in sheet (Google Sheets):** add one row per language to your spreadsheet.

Requirements
- Shopify Access Token with product read permissions.
- Google AI Studio (Gemini) API key for the Vision + Chat Model nodes.
- Google Sheets credentials (OAuth or Service Account) with access to the target spreadsheet.

How to use
1. Connect credentials: Shopify, Gemini (same key for Vision and Chat), and Google Sheets.
2. Configure nodes:
   - Get many products: adjust limit/filters.
   - Analyze image: verify ={{ $json.images[0].src }} resolves to a public image URL.
   - AI Agent & Parser: keep the strict JSON contract as provided.
   - Code (Expand & Sanitize): emits product_id, lang, handle, shopify_product_name, shopify_description, base_handle_es.
   - Google Sheets (Append): set documentId and tab name; confirm the matching column.
3. Run a test: execute the workflow and confirm six rows per product (one per language) appear in the sheet.

Data contract (Agent output)

    {
      "es": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
      "de": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
      "en": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
      "fr": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
      "it": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
      "pt": {"shopify_product_name": "", "shopify_description": "", "handle": ""}
    }

Customising this workflow
- **Publish to Shopify:** after review in Sheets, add a product.update step to write finalized copy/handles.
- **Handle policy:** tweak slug rules (diacritics, separators, max length) in the Code node to match store conventions.
- **No-image fallback:** add an IF/Switch to skip vision when images[0].src is missing and generate copy from title + body only.
- **Tone/length:** adjust temperature and token limits on the Chat Model for brand fit.

Troubleshooting
- **No rows in Sheets:** confirm the spreadsheet ID, tab name, Auto-map status, and that the matching column matches your emitted field.
- **Vision errors:** ensure images[0].src is reachable.
- **Parser failures:** the Agent must return **bare JSON** with the six root keys and three fields per language—no extra text.
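For the handle policy, the sanitization inside the "Expand Languages & Sanitize" Code node might look like the sketch below. The 60-character cap is an assumption; the Unicode-normalization approach is a standard way to strip diacritics in JavaScript:

```javascript
// Hypothetical handle-sanitization sketch: strip diacritics, collapse
// separators, and cap the length to fit store conventions.
function toHandle(name, maxLen = 60) {
  return name
    .normalize('NFD')                // split letters from their accents
    .replace(/[\u0300-\u036f]/g, '') // drop the accent marks (é -> e)
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')     // any run of other chars -> one dash
    .replace(/^-+|-+$/g, '')         // trim leading/trailing dashes
    .slice(0, maxLen);
}

// e.g., toHandle('Càmara Réflex Edición Pro') === 'camara-reflex-edicion-pro'
console.log(toHandle('Càmara Réflex Edición Pro'));
```

Applying the same function to all six languages keeps handles ASCII-safe and consistent across locales.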
by David Olusola
GPT-4o Resume Screener with Error Handling - Google Sheets & Drive Pipeline

How it works
Enterprise-grade resume screening automation built for production environments. This workflow combines intelligent AI analysis with comprehensive error handling to ensure reliable processing of candidate applications. Every potential failure point is monitored with automatic recovery and notification systems.

Core workflow steps:
1. Intelligent Email Processing - Monitors Gmail with attachment validation and file type detection.
2. Robust File Handling - Multi-format support with upload verification and extraction validation.
3. Quality-Controlled AI Analysis - GPT-4o evaluation with output validation and fallback mechanisms.
4. Verified Data Extraction - Contact and qualification extraction with data integrity checks (see the validation sketch below).
5. Dual Logging System - Success tracking in the main dashboard, error logging in a separate audit trail.

Error Recovery Features:
- Upload failure detection with retry mechanisms
- Text extraction validation with quality thresholds
- AI processing timeout protection and fallback responses
- Data validation before final logging
- Comprehensive error notification and tracking system

Set up steps
Total setup time: 25-35 minutes

Core Credentials Setup (8 minutes)
- Gmail OAuth2 with attachment permissions
- Google Drive API with folder creation rights
- Google Sheets API with read/write access
- OpenAI API key with GPT-4o model access

Primary Configuration (12 minutes)
1. Configure monitoring systems - Set up the Gmail trigger with error detection.
2. Establish the file processing pipeline - Create Drive folders for resumes and configure upload validation.
3. Deploy the dual spreadsheet system - Set up the main tracking sheet and the error logging sheet.
4. Initialize AI processing - Configure GPT-4o with structured output parsing and timeout settings.
5. Customize job requirements - Update role specifications and scoring criteria.

Error Handling Setup (10 minutes)
1. Configure error notifications - Set the administrator email for failure alerts.
2. Set up the error logging spreadsheet - Create an audit trail for failed processing attempts.
3. Customize timeout settings - Adjust processing limits based on expected file sizes.
4. Test error pathways - Validate the notification system with sample failures.

Advanced Customization (5 minutes)
- Modify validation thresholds for resume quality
- Adjust the AI prompt for industry-specific requirements
- Configure custom error messages and escalation rules
- Set up automated retry logic for transient failures

Production-Ready Features:
- Comprehensive logging for compliance and auditing
- Graceful degradation when services are temporarily unavailable
- Detailed error context for troubleshooting
- Scalable architecture for high-volume processing

Template Features

Enterprise Error Management
- Multi-layer validation at every processing stage
- Automatic error categorization and routing
- Administrative alerts with detailed context
- Separate error logging for audit compliance
- Timeout protection preventing workflow hangs

Advanced File Processing
- Upload success verification before processing
- Text extraction quality validation
- Resume content quality thresholds
- Corrupted file detection and handling
- Format conversion error recovery

Robust AI Integration
- GPT-4o processing with output validation
- Structured response parsing with error checking
- AI timeout protection and fallback responses
- Failed analysis logging with manual review triggers
- Retry logic for transient API failures

Production Monitoring
- Real-time error notifications via email
- Comprehensive error logging dashboard
- Processing success/failure metrics
- Failed resume tracking for manual review
- Audit trail for compliance requirements

Data Integrity Controls
- Pre-logging validation of all extracted data
- Missing information detection and flagging
- Contact information verification checks
- Score validation and boundary enforcement
- Duplicate detection and handling

Designed for HR departments and recruiting agencies that need reliable, scalable resume processing with enterprise-level monitoring and error recovery capabilities.
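The pre-logging validation layer can be pictured as a small gate like the one below. The field names (`name`, `email`, `score`) and the 0-100 score range are assumptions about what GPT-4o is asked to extract, not the template's actual node code:

```javascript
// Hypothetical pre-logging validation sketch: check extracted candidate
// fields before a row is written, and flag failures for the error sheet.
function validateCandidate(c) {
  const errors = [];
  if (!c.name?.trim()) errors.push('missing name');
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(c.email ?? '')) errors.push('invalid email');
  if (typeof c.score !== 'number' || c.score < 0 || c.score > 100)
    errors.push('score out of bounds');
  return errors;
}

return $input.all().map((item) => {
  const errors = validateCandidate(item.json);
  return {
    json: {
      ...item.json,
      valid: errors.length === 0, // an IF node routes on this flag:
      errors: errors.join('; '),  // true -> main sheet, false -> error log
    },
  };
});
```

Keeping the validation verdict on the item (rather than dropping bad items outright) is what makes the dual logging possible: every resume ends up in exactly one of the two sheets.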