by ueharayuuki
Who's it for

This template is perfect for anyone struggling with habit consistency who wants a fun, engaging way to maintain daily routines. It's ideal for productivity enthusiasts, fitness beginners, remote workers, students, or anyone who enjoys gamification and RPG elements. If you've ever wished your daily tasks felt more like an adventure game, this workflow is for you.

How it works

The workflow runs automatically every morning at your chosen time (default: 6 AM) and transforms your daily habits into an RPG adventure:

1. **Daily Trigger** - scheduled to run every morning.
2. **Player Stats Generation** - creates random level, XP, and streak data (in production, connect this to a database).
3. **Quest Generation** - assigns daily "quests" based on the day of week (weekday routines, weekend specials, Monday goals).
4. **Quote Fetching** - gets a motivational quote from a free API.
5. **Achievement Processing** - checks for milestone achievements (7-day streak, 30-day streak, level 10).
6. **Email Creation** - builds a beautiful HTML email with a game-like design.
7. **Email Delivery** - sends the quest email via Gmail.
8. **Stats Logging** - records execution statistics.

The system includes separate habit sets for weekdays and weekends, random bonus quests, a rank system (D to SSS), achievement unlocks, and progress tracking with visual elements such as progress bars.
Setup steps

Setup takes approximately 5 minutes:

1. Import the workflow into your n8n instance.
2. **Connect Gmail** - click "Create New" in the Gmail node credentials and authenticate via OAuth2.
3. **Update recipient email** - change "your-email@gmail.com" to your actual email in the "Send Quest Email" node.
4. **Customize habits (optional)** - edit the quest arrays in the "Generate Daily Quests" node.
5. **Test the workflow** - click "Execute Workflow" to send a test email.
6. **Activate** - toggle the workflow to "Active" when ready for daily automation.

Requirements

- Gmail account with OAuth2 authentication
- n8n instance (cloud or self-hosted)
- No external API keys required (uses the free Quotable API)
- No database required (uses random data for demonstration)

How to customize the workflow

Modify daily habits - edit the questDatabase object in the "Generate Daily Quests" node:

- Add your own habits with custom names.
- Adjust XP and coin rewards.
- Change difficulty levels (Easy, Medium, Hard, Epic, Bonus).
- Set different quests for weekdays vs. weekends.

Visual customization - in the "Create Email Template" node:

- Modify color schemes in the CSS.
- Adjust font sizes and layouts.
- Change emoji icons for quests.
- Update achievement thresholds.

Timing and schedule - in the "Daily Morning Trigger" node:

- Change the trigger hour (default: 6 AM).
- Adjust the timezone if needed.
- Set different schedules for weekdays/weekends.

Motivational content:

- Update daily motivation messages for each day of the week.
- Customize achievement names and descriptions.
- Modify rank titles and progression.
- Add your own fallback quotes.

This workflow brings the addictive nature of RPG games to your daily habits, making routine tasks feel like an epic adventure. Perfect for anyone who wants to level up their life, one quest at a time!
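The quest-generation step above can be sketched as an n8n Code-node-style JavaScript snippet. This is a minimal illustration only: the questDatabase entries, rewards, and the 50% bonus-quest chance are assumptions for demonstration, not the template's exact data.

```javascript
// Illustrative questDatabase; the real node's habit names and rewards differ.
const questDatabase = {
  weekday: [
    { name: 'Morning stretch', xp: 10, coins: 5, difficulty: 'Easy' },
    { name: 'Deep work block', xp: 30, coins: 15, difficulty: 'Medium' },
  ],
  weekend: [
    { name: 'Long walk', xp: 20, coins: 10, difficulty: 'Easy' },
  ],
  bonus: [
    { name: 'Cold shower', xp: 50, coins: 25, difficulty: 'Epic' },
  ],
};

function buildDailyQuests(date, db) {
  const day = date.getDay(); // 0 = Sunday, 6 = Saturday
  const isWeekend = day === 0 || day === 6;
  const quests = [...(isWeekend ? db.weekend : db.weekday)];
  // Assumed 50% chance of appending a random bonus quest.
  if (Math.random() < 0.5 && db.bonus.length > 0) {
    quests.push(db.bonus[Math.floor(Math.random() * db.bonus.length)]);
  }
  return quests;
}

// Monday, July 1, 2024 → weekday quest set.
const todayQuests = buildDailyQuests(new Date(2024, 6, 1), questDatabase);
```

Swapping the arrays for your own habits is all the customization step amounts to; the day-of-week branch is what gives weekdays and weekends different quest sets.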
by Cheng Siong Chin
Introduction

Automates gold market tracking with AI forecasting: it collects live prices, financial news, and macro indicators (inflation, interest rates, employment) to produce real-time insights and trend predictions for analysts and investors.

How It Works

Every 6 hours, the system fetches market data and news → runs AI sentiment and trend analysis → generates a concise forecast report → publishes it to WordPress → alerts users via Slack or email.

Workflow Template

Trigger → Fetch → Format → Merge → AI Analyze → Report → Publish → Notify

Workflow Steps

- **Schedule**: Executes automatically every 6 hours using a Cron trigger.
- **Fetch**: Retrieves live gold prices (MetalPriceAPI), financial headlines (NewsAPI), and macroeconomic indicators (FRED).
- **Format & Merge**: Cleans, normalizes, and merges all data into a single structured dataset for AI analysis.
- **AI Analyze (OpenAI)**: Performs sentiment, trend, and correlation analysis to forecast short-term gold price movements.
- **Report Generation**: Creates a concise summary report with forecasts, insights, and confidence metrics.
- **Publish & Notify**: Automatically posts to WordPress and sends alerts via Slack and email to keep analysts updated.

Setup

1. Add API keys: MetalPrice, NewsAPI, FRED, OpenAI, WordPress, Slack, Gmail.
2. Configure the scheduling interval, API endpoints, and authentication in n8n.
3. Predefine the WordPress post format and Slack message templates for smooth automation.

Prerequisites

n8n v1.0+, API keys, OAuth credentials, and internet access.

Use Cases

Investment forecasting, financial newsletter automation, and market monitoring dashboards.

Customization

Add cryptocurrency or stock tracking, modify the AI prompts, or route summaries to Telegram, Notion, or Google Sheets.

Benefits

Saves analyst time, ensures consistent insights, improves accuracy, and delivers timely, AI-driven financial intelligence.
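The "Format & Merge" step can be sketched as a Code-node-style helper that collapses the three API payloads into one object for the AI prompt. The field names here (price, articles, inflation, and so on) are assumptions for illustration, not the exact MetalPriceAPI / NewsAPI / FRED response schemas.

```javascript
// Hypothetical merge of the three data sources into one structured record.
function mergeMarketData(priceResp, newsResp, macroResp) {
  return {
    fetchedAt: new Date().toISOString(),
    goldUsdPerOz: Number(priceResp.price),            // assumed price field
    headlines: (newsResp.articles || [])               // assumed NewsAPI-like shape
      .slice(0, 5)
      .map(a => a.title),
    macro: {
      inflationYoY: macroResp.inflation,               // assumed FRED-derived fields
      interestRate: macroResp.interestRate,
      unemployment: macroResp.unemployment,
    },
  };
}

const merged = mergeMarketData(
  { price: 2350.4 },
  { articles: [{ title: 'Gold steadies as yields dip' }] },
  { inflation: 3.2, interestRate: 5.5, unemployment: 3.9 }
);
```

A single normalized object like this is what makes the downstream AI Analyze prompt deterministic: the model always sees the same keys regardless of which API returned what.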
by Ruthwik
🚀 AI-Powered WhatsApp Customer Support for Shopify Brands

This n8n template builds a WhatsApp support copilot that answers **order status** and **product availability** questions from Shopify using LLM "agents," then replies to the customer in WhatsApp or routes the conversation to human support.

Use cases

- "Where is my order?" → live status + tracking link
- "What are your best-selling T-shirts?" → in-stock sizes & variants
- Greetings / small talk → welcome message
- Anything unclear → handoff to the support channel

Good to know

- WhatsApp Business conversations are billed by Meta/Twilio/Exotel; plan accordingly.
- The Shopify Admin API has rate limits (leaky bucket), so stagger requests.
- LLM usage incurs token costs; cap max tokens and enable caching where possible.
- Avoid sending PII to the model; only pass minimal order/product fields.

How it works

1. **WhatsApp Trigger** - receives an incoming message (e.g., "Where is my order?").
2. **Get Customer from Shopify → Customer Details → Normalize Input** - looks up the customer by phone and formats the query (lower-casing, emoji and punctuation normalization).
3. **Switch (intent router)** - classifies the message into welcome, orderStatusQuery, productQuery, or supportQuery.
4. **Welcome path** - welcome message → polite greeting → (noop placeholder).
5. **Order status path (Orders Agent)**:
   - Orders Agent (LLM + Memory) interprets the user request and extracts the needed fields.
   - Get Customer Orders (HTTP to Shopify) fetches the user's latest order(s).
   - Structured Output Parser cleans the agent's output into a strict schema.
   - Send Order Status (WhatsApp message) returns status, ETA, and tracking link.
6. **Products path (Products Agent)**:
   - Products Agent (LLM + Memory) turns the question into a product query.
   - Get Products from Shopify (HTTP) pulls best sellers / inventory & sizes.
   - Structured Output Parser formats name, price, sizes, and stock.
7. **Send Products message (WhatsApp)** - sends a tidy, human-readable reply.
8. **Support path** - "Send a message to support" posts the transcript/context to your agent/helpdesk channel and informs the user that a human will respond.

How to use

1. Replace the manual/WhatsApp trigger with your live WhatsApp number/webhook.
2. Set env vars/credentials: Shopify domain + Admin API token, WhatsApp provider keys, LLM key (OpenAI/OpenRouter), and (optionally) your support channel webhook.
3. Edit the message templates for tone, add your brand name, and localize if needed.
4. Test with samples: "Where is my order?", "Show best sellers", "Hi".

Requirements

- WhatsApp Business API (Meta/Twilio/Exotel)
- Shopify store + Admin API access
- LLM provider (OpenAI/OpenRouter, etc.)
- Slack webhook for human handoff

Prerequisites

- Active WhatsApp Business Account connected via an API provider (Meta, Twilio, or Exotel).
- **Shopify Admin API credentials** (API key, secret, store domain).
- **Slack OAuth app** or webhook for human support escalation.
- API key for your LLM provider (OpenAI, OpenRouter, etc.).

Customising this workflow

- Add intents: returns/exchanges, COD confirmation, address changes.
- Enrich product replies with images, price ranges, and "Buy" deep links.
- Add multilingual support by detecting locale and templating responses.
- Log all interactions to a DB/Sheet for analytics and quality review.
- Guardrails: confidence thresholds → fallback to support; redact PII; retry on API errors.
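The "Normalize Input" step (and the routing the Switch performs) can be sketched in Code-node JavaScript. The template itself uses LLM agents and a Switch node for classification; the keyword rules below are a simplified, hypothetical stand-in just to show the normalization and the four intent buckets.

```javascript
// Lower-case, strip common emoji ranges, collapse punctuation and whitespace.
function normalize(text) {
  return text
    .toLowerCase()
    .replace(/[\u{1F300}-\u{1FAFF}\u{2600}-\u{27BF}]/gu, '')
    .replace(/[!?.,]+/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

// Simplified keyword router standing in for the LLM-backed Switch node.
function routeIntent(text) {
  const t = normalize(text);
  if (/\b(hi|hello|hey)\b/.test(t)) return 'welcome';
  if (/\border\b|\btracking\b|where is my/.test(t)) return 'orderStatusQuery';
  if (/\bproduct|t-shirt|best.?sell|stock|size/.test(t)) return 'productQuery';
  return 'supportQuery';
}
```

Anything the rules cannot place falls through to supportQuery, which mirrors the template's "anything unclear → handoff" behaviour.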
by Davide
This workflow automates generating advertising (ADV) images with Seedream v4 AI and publishing them directly to social media (Instagram and Facebook via Upload-Post). It creates an AI image from a user's text prompt and up to 6 reference images, triggered by a web form submission.

Key Advantages

- ✅ **Automated Image Creation** - generates high-quality, consistent visuals from multiple references without manual editing.
- ✅ **Seamless Social Media Publishing** - automatically posts to Instagram and Facebook with minimal effort.
- ✅ **SEO-Optimized Titles** - improves post reach with AI-generated, keyword-friendly titles.
- ✅ **Scalable Workflow** - can be triggered manually, on a schedule, or via form submissions.
- ✅ **Time-Saving** - reduces manual steps from design to publishing, enabling faster content production.
- ✅ **Multi-Platform Support** - easily extendable to other platforms (TikTok, LinkedIn, etc.) with the Upload-Post API.

How It Works

1. **Form Trigger**: A user fills out a form with a "Prompt" (text description) and a list of "Reference images" (comma-separated URLs).
2. **Data Processing**: The workflow converts the submitted image URL string into a proper array for the AI API.
3. **AI Image Generation**: The workflow sends the prompt and image URLs to the fal.ai API (specifically, the ByteDance Seedream model) to generate a new, consistent image.
4. **Status Polling**: It periodically checks the status of the AI job until image generation is COMPLETED.
5. **Result Retrieval**: Once complete, it fetches the URL of the generated image and downloads the image file itself.
6. **SEO Title Generation**: The original user prompt is sent to OpenAI's GPT-4o-mini model to generate an optimized, engaging social media title.
7. **Cloud Backup**: The generated image is uploaded to a specified Google Drive folder for storage.
8. **Social Media Posting**: Finally, the workflow posts the downloaded image file to both Instagram and Facebook via the Upload-Post.com API, using the AI-generated title.

Set Up Steps

To make this workflow functional, configure the following third-party services and their corresponding credentials in n8n.

1. **Obtain a fal.ai API key**: Create an account at fal.ai and locate your API key in your account settings. In the "Create Video" and "Get status" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Key YOUR_FAL_AI_API_KEY.
2. **Configure the Upload-Post.com API**: Create an account at Upload-Post.com and get your API key. Create a profile within the Upload-Post app (e.g., test1); this profile manages your social account connections. In both the "Post to Instagram" and "Post to Facebook" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Apikey YOUR_UPLOAD_POST_API_KEY. Crucially, in the same nodes, find the user parameter in the body and replace the placeholder YOUR_USERNAME with the profile name you created (e.g., test1).
3. **Configure OpenAI/OpenRouter (optional, for title generation)**: The "Generate title" node uses an OpenAI-compatible API; the provided example uses OpenRouter. Ensure you have valid credentials (e.g., for OpenRouter or directly for OpenAI) configured in n8n and selected in this node.
4. **Configure Google Drive (optional, for backup)**: The "Upload Image" node requires Google OAuth credentials. Set up a Google Cloud project, enable the Drive API, and create OAuth 2.0 credentials in the n8n settings. Authenticate and select the desired destination folder in the node's parameters.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
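The "Data Processing" step (turning the form's comma-separated "Reference images" string into the array the fal.ai API expects) can be sketched like this. The function name and the 6-image cap follow the description above; treat the exact validation as an assumption.

```javascript
// Split a comma-separated URL string into a clean array of at most `max` URLs.
function toImageUrlArray(raw, max = 6) {
  return raw
    .split(',')
    .map(u => u.trim())
    .filter(u => /^https?:\/\//.test(u)) // drop empties and non-URLs
    .slice(0, max);
}

const urls = toImageUrlArray('https://a.com/1.png, https://a.com/2.png,, not-a-url');
```

Filtering before slicing means stray commas or malformed entries in the form input never reach the API call.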
by takuma
Who is this for

This template is perfect for:

- **Market Researchers** tracking industry trends.
- **Tech Teams** wanting to stay updated on specific technologies (e.g., "AI", "Cybersecurity").
- **Content Creators** looking for curated news topics.
- **Busy Professionals** who need a high-signal, low-noise news digest.

What it does

1. **Fetches News**: Pulls daily articles via NewsAPI based on your chosen keyword (default: "technology").
2. **AI Filtering**: Uses an AI Agent (via OpenRouter) to filter out low-quality or irrelevant clickbait.
3. **Daily Digest (Slack)**: Summarizes the top 3 articles in English, translates the summaries to Japanese using DeepL (optional), and posts both versions to a Slack channel.
4. **Data Archiving (Sheets)**: Extracts structured data (Title, Author, Summary, URL) and saves it to Google Sheets.
5. **Weekly Trend Report**: Every Monday, reads the past week's data from Google Sheets and uses AI to generate a high-level trend report and strategic insights.

How to set up

1. **Configure Credentials**: You will need API keys/auth for NewsAPI, OpenRouter (or OpenAI), DeepL, Google Sheets, and Slack.
2. **Set up the Google Sheet**: Create a sheet with the following headers in the first row: title, author, summary, url.
3. **Map the Sheet**: In the "Append row in sheet" and "Read sheet (weekly)" nodes, select your file and map the columns.
4. **Define the Keyword**: Open the "Set Keyword" node and change chatInput to the topic you want to track (e.g., "Crypto", "SaaS", "Climate Change").
5. **Slack Setup**: Select your desired channel in the Slack nodes.

Requirements

- **n8n** (self-hosted or cloud)
- **NewsAPI** key (free tier available)
- **OpenRouter** (or any LangChain-compatible chat model such as OpenAI)
- **DeepL** API key (for translation)
- **Google Sheets** account
- **Slack** workspace

How to customize

- **Change the Language**: Remove the DeepL node if you only want English, or change the target language code.
- **Adjust the Prompt**: Modify the "AI Agent (Filter)" system message to change how strict the news filtering is.
- **Change the Schedule**: Adjust the Cron nodes to run at your preferred times (currently set to Daily 8 AM and Weekly Monday 9 AM).
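The Data Archiving step (mapping an article plus its AI summary onto the sheet's four columns) can be sketched as a small Code-node helper. The article shape below follows NewsAPI's documented fields (title, author, url); the 'Unknown' fallback for a missing author is an assumption.

```javascript
// Map a NewsAPI-style article + AI summary to the sheet columns: title, author, summary, url.
function toSheetRow(article, aiSummary) {
  return {
    title: article.title ?? '',
    author: article.author ?? 'Unknown',   // NewsAPI often returns author: null
    summary: aiSummary.trim(),
    url: article.url ?? '',
  };
}

const row = toSheetRow(
  { title: 'AI chips surge', author: null, url: 'https://example.com/a' },
  ' Chipmakers report record demand. '
);
```

Keeping the object keys identical to the sheet headers means the Google Sheets node can auto-map columns without manual field mapping.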
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Accelerate your research analysis with this Automated Research Intelligence System! This workflow uses AI and web scraping to analyze research papers and articles, extracting key insights, validating content quality, and generating comprehensive research documents. Perfect for research teams, academics, and AI enthusiasts staying current with the latest developments in artificial intelligence and machine learning.

What This Template Does

1. Triggers via form submission for on-demand research URL analysis.
2. Validates URL accessibility and prepares it for processing.
3. Uses the Decodo scraper to extract research content from target URLs.
4. Analyzes research papers with AI for comprehensive understanding.
5. Validates summaries for accuracy, completeness, and relevance.
6. Generates key insights and actionable takeaways from the research.
7. Creates professional Google Docs with formatted research summaries.
8. Evaluates research quality with an AI-powered rating system.
9. Saves all research to Google Sheets for historical tracking.
10. Sends Slack alerts for high-quality research findings (9+ rating).
Key Benefits

- Automated research analysis saves hours of manual reading time
- AI-powered insight extraction from complex research papers
- Quality validation ensures accurate and relevant summaries
- Centralized research database for team collaboration
- Real-time alerts for breakthrough research findings
- Professional documentation generated automatically

Features

- Form-based trigger for easy research submission
- URL validation and accessibility checking
- AI-powered research analysis and summarization
- Decodo web scraping for reliable content extraction
- Multi-stage validation for accuracy and relevance
- Automated Google Docs report generation
- Quality assessment with a structured rating system
- Google Sheets integration for research tracking
- Slack notifications for premium research findings
- Quality-threshold filtering for optimal results

Requirements

- Decodo API credentials for research scraping
- OpenAI API credentials for AI analysis
- Google Docs OAuth2 credentials for document creation
- Google Sheets OAuth2 credentials with edit access
- Slack Bot Token with chat:write permission
- Environment variables for configuration settings

Target Audience

- AI research teams and data scientists
- Academic researchers and university labs
- Machine learning engineers and developers
- Technology innovation teams
- Research and development departments
- Content creators in the AI/ML space

Step-by-Step Setup Instructions

1. Connect Decodo API credentials for research scraping functionality.
2. Set up OpenAI credentials for AI analysis and quality assessment.
3. Configure Google Docs for automated research document generation.
4. Add Google Sheets credentials for research tracking and history.
5. Set up Slack credentials for high-quality research alerts.
6. Customize quality thresholds for research ratings (default: 6+ for processing, 9+ for alerts).
7. Test with sample research URLs to verify analysis and formatting.
8. Deploy the form for team access to research analysis requests.
9. Monitor the research database for trends and insights.

Pro Tip: Use coupon code "YARON" for free Decodo credits to enhance your research intelligence capabilities!

This workflow transforms complex research into actionable intelligence with automated analysis, quality validation, and professional documentation!
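The two quality thresholds described in the setup steps (6+ to process, 9+ to alert) amount to a small gate function; a sketch of that routing, with the defaults from the description:

```javascript
// Route a research item by its AI quality rating.
// Defaults mirror the template's stated thresholds: 6+ process, 9+ alert.
function routeByQuality(rating, { processAt = 6, alertAt = 9 } = {}) {
  if (rating >= alertAt) return 'document+alert'; // Google Doc + Slack alert
  if (rating >= processAt) return 'document';     // Google Doc only
  return 'discard';                               // below threshold
}
```

Raising processAt makes the pipeline stricter about what gets documented; raising alertAt reserves Slack pings for only the very best findings.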
by Rahul Joshi
Description

Synchronize OKRs (Objectives and Key Results) between Monday.com and Jira to automatically calculate progress variance, update dashboards, and share variance reports via Slack and Outlook. This workflow gives teams accurate, real-time visibility into performance metrics and project alignment, without manual reconciliation. 🎯📈💬

What This Template Does

1. Triggers daily at a scheduled time to fetch the latest OKRs from Monday.com. ⏰
2. Extracts Key Results and their linked Jira epic keys from the OKR board. 🔗
3. Fetches the corresponding Jira epic details, such as status, assignee, and last updated date. 🧩
4. Merges Monday.com KR data with Jira epic progress through SQL-style joins. 📋
5. Calculates real-time progress and variance against target goals. 📊
6. Updates Monday.com KR items with actual progress, variance percentage, and status ("On Track", "At Risk", or "Ahead"). 🔄
7. Aggregates all KR data into a consolidated report for communication. 📦
8. Sends formatted variance reports to Slack and Outlook, with summaries of owner, progress, and variance metrics. 📢

Key Benefits

✅ Automates end-to-end OKR and Jira synchronization
✅ Eliminates manual progress-tracking errors
✅ Provides daily visibility into team and project health
✅ Enables proactive risk detection via variance thresholds
✅ Keeps all stakeholders updated via Slack and Outlook
✅ Centralizes OKR performance metrics for reporting

Features

- Daily scheduled trigger for automatic OKR sync
- Monday.com → Jira data integration via API
- Real-time variance computation logic
- Automatic updates of OKR fields in Monday.com
- SQL-style data merging and aggregation
- Slack notifications with variance summaries
- Outlook email digest with formatted HTML tables

Requirements

- Monday.com API credentials with board access
- Jira API credentials with permission to view epics
- Slack Bot token with chat:write permissions
- Microsoft Outlook OAuth2 credentials for sending emails
- Environment variables for board, channel, and recipient configuration

Target Audience

- Product and engineering teams managing OKRs across platforms 🎯
- Project managers tracking cross-tool performance metrics 📋
- Leadership teams needing automated OKR reporting 💼
- Operations and strategy teams monitoring execution health 🧭

Step-by-Step Setup Instructions

1. Connect your Monday.com, Jira, Slack, and Outlook credentials in n8n. 🔑
2. Replace MONDAY_BOARD_ID, GROUP_ID, and the column identifiers with your own. 🧩
3. Set environment variables for SLACK_CHANNEL_ID and REPORT_RECIPIENT_EMAIL. 💬
4. Adjust the cron expression to define your sync frequency (e.g., daily at 9 AM). ⏰
5. Test the workflow with a single OKR item to confirm successful synchronization. 🧠
6. Enable the workflow to automate daily OKR variance tracking and reporting. ✅
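The variance calculation in Step 5 and the status classification in Step 6 can be sketched together. The ±10% band used to decide "At Risk" vs. "Ahead" is an assumption for illustration; the template's actual thresholds may differ and can be tuned via variance thresholds as noted in the benefits.

```javascript
// Compare actual KR progress (e.g., % of Jira epic work done) against target.
// The ±10-point band is a hypothetical threshold, not the template's exact value.
function computeVariance(actualPct, targetPct, band = 10) {
  const variancePct = actualPct - targetPct;
  let status = 'On Track';
  if (variancePct < -band) status = 'At Risk';
  else if (variancePct > band) status = 'Ahead';
  return { actualPct, targetPct, variancePct, status };
}
```

The returned object maps directly onto the three Monday.com fields the workflow updates: actual progress, variance percentage, and status.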
by Cheng Siong Chin
Introduction Automate peer review assignment and grading with AI-powered evaluation. Designed for educators managing collaborative assessments efficiently. How It Works Webhook receives assignments, distributes them, AI generates review rubrics, emails reviewers, collects responses, calculates scores, stores results, emails reports, updates dashboards, and posts analytics to Slack. Workflow Template Webhook → Store Assignment → Distribute → Generate Review Rubric → Notify Slack → Email Reviewers → Prepare Response → Calculate Score → Store Results → Check Status → Generate Report → Email Report → Update Dashboard → Analytics → Post to Slack → Respond to Webhook Workflow Steps Receive & Store: Webhook captures assignments, stores data. Distribute & Generate: Assigns peer reviewers, AI creates rubrics. Notify & Email: Alerts via Slack, sends review requests. Collect & Score: Gathers responses, calculates peer scores. Report & Update: Generates reports, emails results, updates dashboard. Analyze & Alert: Posts analytics to Slack, confirms completion. Setup Instructions Webhook & Storage: Configure endpoint, set up database. AI Configuration: Add OpenAI key, customize rubric prompts. Communication: Connect Gmail, Slack credentials. Dashboard: Link analytics platform, configure metrics. Prerequisites OpenAI API key Gmail account Slack workspace Database or storage system Dashboard tool Use Cases University peer review assignments Corporate training evaluations Research paper assessments Customization Multi-round review cycles Custom scoring algorithms LMS integration (Canvas, Moodle) Benefits Eliminates manual distribution Ensures consistent evaluation Provides instant feedback and analytics
by Haruki Kuwai
Extract business card data from Telegram to Google Sheets

Who's it for

Teams that receive business cards digitally (sales, marketing, back-office) and want a plug-and-play way to capture contacts into a sheet without manual typing.

What it does / How it works

This workflow ingests a business card sent to your Telegram bot, detects whether the message contains an image or text, extracts key fields with an AI Vision Agent (company, full name, department, job title, postal code, address, phone, mobile, fax, email, website), and appends or updates a contact row in Google Sheets automatically.

How to set up

1. Connect Telegram (bot token) and enable file download.
2. Connect your AI provider (OpenRouter or equivalent) used by the AI Vision Agent.
3. Connect Google Sheets and select your spreadsheet + sheet tab.
4. Rename nodes clearly and keep the sticky notes: one overview note (this description) plus step notes.
5. Test by sending a sample card image to your bot and verify that the row is appended/updated.

Requirements

- Telegram Bot API credential
- AI chat/vision credential
- Google Sheets OAuth credential and an accessible spreadsheet

How to customize the workflow

- Map fields to your sheet headers (add/remove columns as needed).
- Adjust the system prompt to prefer your locale or specific field formats.
- Change the matching key for the update logic (e.g., company name or email).
- Add downstream steps (CRM push, dedupe rules, notifications).

Security note: Do not hardcode API keys or include real IDs/emails. Use credentials and environment configs only.
Example of the structured JSON the AI Vision Agent returns:

```json
[
  {
    "company_name": "Example Company Ltd.",
    "department": "Sales",
    "job_title": "Sales Manager",
    "full_name": "Taro Yamada",
    "postal_code": "100-0001",
    "address": "1-1-1 Marunouchi, Chiyoda-ku, Tokyo",
    "phone_number": "+81-3-0000-0000",
    "mobile_phone_number": "+81-90-0000-0000",
    "fax_number": "+81-3-1111-1111",
    "email": "example@company.com",
    "website_url": "https://example.com"
  }
]
```

Troubleshooting

Nothing appears in Google Sheets. Solution: Verify that your Google Sheets credentials are correctly authorized. Confirm that the Spreadsheet ID and Sheet Name in the node match your target file. Make sure the Google Sheets node is connected downstream of the AI Vision Agent. If the workflow runs successfully but nothing is added, check whether the matching column (company_name) already exists; in appendOrUpdate mode it will only update that row.

AI returns incomplete or invalid data. Solution: Review the system prompt in the AI Vision Agent to ensure it instructs the model to return a structured JSON object with all required fields (company name, full name, department, job title, address, etc.). If the result is partial, verify the image quality of the uploaded business card; low-contrast or skewed images can reduce OCR accuracy. You can also reduce the temperature in the AI node to make output more deterministic.

Workflow doesn't start automatically. Solution: Check that the workflow is activated (the toggle is ON in the top right of n8n). Verify that the Webhook URL is correctly registered in Telegram's bot settings. Run the workflow manually once to ensure all credentials and nodes are configured correctly.
by Davide
This is an example of an advanced, automated data extraction and enrichment pipeline built with ScrapeGraphAI. Its primary purpose is to systematically scrape the n8n community workflows website, extract detailed information about recently added workflows, process that data using multiple AI models, and store the structured results in a Google Sheets spreadsheet.

This workflow demonstrates a sophisticated use of n8n that moves beyond simple API calls into intelligent, AI-driven web scraping and data processing, turning unstructured website content into valuable, structured business intelligence.

Key Advantages

✅ **Full Automation**: Once triggered (manually or on a schedule via the Schedule Trigger node), the entire process runs hands-free, from data collection to spreadsheet population.

✅ **Powerful AI-Augmented Scraping**: It doesn't just scrape raw HTML. It uses multiple AI agents (Google Gemini, OpenAI) to:

- Understand page structure to find the right data on the main list.
- Clean and purify content from individual pages, removing irrelevant information.
- Perform precise information extraction, parsing unstructured text into structured JSON data based on a defined schema (author, price, etc.).
- Generate intelligent summaries, adding significant value by explaining each workflow's purpose in Italian.

✅ **Robust and Structured Data Output**: The Structured Output Parser and Information Extractor nodes ensure the data is clean, consistent, and ready for analysis. They output perfectly formatted JSON that maps directly to spreadsheet columns.

✅ **Scalability via Batching**: The Split In Batches and Loop Over Items nodes let the workflow process a dynamically sized list of workflows. Whether there are 5 or 50 new workflows, it processes each one sequentially without failing.

✅ **Effective Data Integration**: It integrates seamlessly with Google Sheets, which acts as a simple but powerful database.
This makes the collected data immediately accessible, shareable, and available for visualization in tools like Looker Studio.

✅ **Resilience to Website Changes**: By using AI models that understand content and context (e.g., "find the 'Recently Added' section" or "find the author's name"), the workflow is more resilient to minor cosmetic changes on the target website than traditional CSS/XPath selectors.

How It Works

The workflow operates in two main phases.

Phase 1: Scraping the Main List

1. **Trigger**: The workflow can be started manually ("Execute Workflow") or automatically on a schedule.
2. **Scraping**: The "Scrape main page" node (using ScrapeGraphAI) fetches the https://n8n.io/workflows/ page and converts it into clean Markdown.
3. **Data Extraction**: An LLM chain ("Extract 'Recently added'") analyzes the Markdown. It is specifically instructed to identify all workflow titles and URLs within the "Recently Added" section and output them as a structured JSON array named workflows.
4. **Data Preparation**: The resulting array is set as a variable and then split into individual items, preparing them for one-by-one processing.

Phase 2: Processing Individual Workflows

1. **Loop**: The "Loop Over Items" node iterates through each workflow URL obtained in Phase 1.
2. **Scrape & Clean Detail Page**: For each URL, the "Scrape single Workflow" node fetches the detail page. Another LLM chain ("Main content") cleans the resulting Markdown, removing superfluous content and keeping only the core article text.
3. **Information Extraction**: The cleaned Markdown is passed to an "Information Extractor" node, which uses a language model to locate and structure specific data points (title, URL, ID, author, categories, price) into a defined JSON schema.
4. **Summarization**: The cleaned Markdown is also sent to a Google Gemini node ("Summarization content"), which generates a concise Italian summary of the workflow's purpose and the tools it uses.
5. **Data Consolidation & Export**: The extracted information and the generated summary are merged into a single data object. Finally, the "Add row" node maps all this data to the appropriate columns and appends it as a new row in a designated Google Sheet.

Set Up Steps

To run this workflow, configure the following credentials in your n8n instance:

1. **ScrapeGraphAI Account**: The "Scrape main page" and "Scrape single Workflow" nodes require valid ScrapeGraphAI API credentials named ScrapegraphAI account. Install the related community node.
2. **Google Gemini Account**: Multiple nodes ("Google Gemini Chat Model", "Summarization content", etc.) require API credentials for Google Gemini named Google Gemini(PaLM) (Eure).
3. **OpenAI Account**: The "OpenAI Chat Model1" node requires API credentials for OpenAI named OpenAi account (Eure).
4. **Google Sheets Account**: The "Add row" node requires OAuth2 credentials for Google Sheets named Google Sheets account. Ensure the node is configured with the correct Google Sheet ID and that the sheet has a worksheet named Foglio1 (or update the node to match your sheet's name).

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
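The batching that makes Phase 2 scale can be sketched as a plain chunking helper: the workflows array extracted in Phase 1 is cut into batches (size 1 here, matching a Split In Batches node configured for single items) and each batch is then handled sequentially. This is only an illustration of the pattern, not the node's internals.

```javascript
// Split an array into batches of `size`, as Split In Batches does conceptually.
function chunk(items, size = 1) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const workflows = [
  { title: 'Workflow A', url: 'https://n8n.io/workflows/1' },
  { title: 'Workflow B', url: 'https://n8n.io/workflows/2' },
];
const batches = chunk(workflows, 1);
```

Because the loop runs per batch, the pipeline behaves identically whether Phase 1 finds 5 or 50 new workflows.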
by Ranjan Dailata
This workflow automates competitor keyword research using an OpenAI LLM and Decodo for intelligent web scraping.

Who this is for

- SEO specialists, content strategists, and growth marketers who want to automate keyword research and competitive intelligence.
- Marketing analysts managing multiple clients or websites who need consistent SEO tracking without manual data pulls.
- Agencies or automation engineers using Google Sheets as an SEO data dashboard for keyword monitoring and reporting.

What problem this workflow solves

Tracking competitor keywords manually is slow and inconsistent, and most SEO tools provide limited API access or lack contextual keyword analysis. This workflow solves that by:

- Automatically scraping any competitor's webpage with Decodo.
- Using OpenAI GPT-4.1-mini to interpret keyword intent, density, and semantic focus.
- Storing structured keyword insights directly in Google Sheets for ongoing tracking and trend analysis.

What this workflow does

1. **Trigger**: Manually start the workflow or schedule it to run periodically.
2. **Input Setup**: Define the website URL and target country (e.g., https://dev.to, france).
3. **Data Scraping (Decodo)**: Fetch competitor web content and metadata.
4. **Keyword Analysis (OpenAI GPT-4.1-mini)**: Extract primary and secondary keywords, identify focus topics and semantic entities, generate a keyword density summary and SEO strength score, and recommend optimization and internal-linking opportunities.
5. **Data Structuring**: Clean and convert the GPT output into JSON format.
6. **Data Storage (Google Sheets)**: Append the structured keyword data to a Google Sheet for long-term tracking.

Setup

Prerequisites

- n8n account with workflow editor access
- Decodo API credentials
- OpenAI API key
- Google Sheets account connected via OAuth2

Make sure to install the Decodo community node.

Create a Google Sheet

- Add columns for primary_keywords, seo_strength_score, keyword_density_summary, etc.
- Share it with your n8n Google account.
Connect Credentials: Add credentials for:

- Decodo API: register, log in, and obtain the Basic Authentication token via the Decodo Dashboard.
- OpenAI API (for GPT-4.1-mini)
- Google Sheets OAuth2

Configure Input Fields: Edit the "Set Input Fields" node to set your target site and region.

Run the Workflow: Click Execute Workflow in n8n, then view the structured results in your connected Google Sheet.

How to customize this workflow

- Track Multiple Competitors: Use a Google Sheet or CSV list of URLs and loop through them with the Split In Batches node.
- Add Language Detection: Add a Gemini or GPT node before keyword analysis to detect the content language and adjust prompts.
- Enhance the SEO Report: Expand the GPT prompt to include backlink insights, metadata optimization, or readability checks.
- Integrate Visualization: Connect your Google Sheet to Looker Studio for SEO performance dashboards.
- Schedule Auto-Runs: Use the Cron node to run weekly or monthly for competitor keyword refreshes.

Summary

This workflow automates competitor keyword research using Decodo for intelligent web scraping, OpenAI GPT-4.1-mini for keyword and SEO analysis, and Google Sheets for live tracking and reporting. It's a complete AI-powered SEO intelligence pipeline, ideal for teams that want actionable insights on keyword gaps, optimization opportunities, and content focus trends without relying on expensive SEO SaaS tools.
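The "Track Multiple Competitors" customization can be sketched as a batching loop, mirroring what the Split In Batches node does inside n8n. The URL list and batch size here are placeholders, and the inner call is where the Decodo scrape and GPT analysis steps would run.

```python
def batched(urls, size):
    """Yield successive batches of at most `size` URLs, like Split In Batches."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# Placeholder competitor list; in the workflow this would come from a sheet or CSV.
competitors = ["https://dev.to", "https://example.com", "https://news.ycombinator.com"]

for batch in batched(competitors, 2):
    for url in batch:
        # Each URL would be passed to the Decodo scrape + keyword-analysis steps here.
        print("analyzing", url)
```

Batching matters mainly for rate limits: running the scrape and LLM calls in small groups avoids hammering the Decodo and OpenAI APIs when the competitor list grows.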
by InfyOm Technologies
✅ What problem does this workflow solve? Tracking what people say about your brand on Twitter can be overwhelming, especially when important mentions slip through the cracks. This workflow automates the process: it scrapes Twitter mentions, analyzes sentiment using OpenAI, logs everything in a Google Sheet, and sends real-time Slack alerts for negative tweets. No manual monitoring needed.

⚙️ What does this workflow do?

- Runs on a schedule to monitor Twitter mentions or hashtags.
- Uses Apify to scrape tweets based on brand keywords.
- Filters out tweets already processed (avoids duplicates).
- Performs sentiment analysis with OpenAI (Positive, Neutral, Negative).
- Logs tweet content, sentiment, and reply (if any) in a Google Sheet.
- Sends an instant Slack notification for negative tweets.
- Generates thank-you replies for positive tweets and logs them.

🔧 Setup Instructions

- 🗓 Schedule Trigger: Use the Cron node to schedule checks (e.g., every hour, daily).
- 🐦 Apify Twitter Scraper: Sign up on Apify, generate your Apify API token, and use it in the HTTP node to run the actor and fetch tweet results.
- 🤖 OpenAI Sentiment Analysis: Get your API key from OpenAI.
- 📄 Google Sheet: Prepare a Google Sheet with the sample format and connect it using the Google Sheets node in n8n.
- 💬 Slack Notifications: Connect your Slack workspace via the Slack node and set up the channel where negative-tweet alerts should be sent.

🧠 How it Works

1. Scheduled Run: Triggered at a fixed interval using the Schedule (Cron) node.
2. Scrape Mentions from Twitter: The Apify actor runs and collects recent tweets mentioning your brand or using your hashtag; links to the tweets are extracted.
3. Filter Previously Seen Tweets: Each tweet is checked against the Google Sheet; if already present, it is skipped to avoid duplicate analysis.
4. Analyze Sentiment with OpenAI: New tweets are classified as ✅ Positive, ⚪ Neutral, or ❌ Negative.
5.
Store Results in Google Sheet: The tweet link, content, and sentiment are stored in a row. If the sentiment is positive, a thank-you reply is also generated and saved.
6. Notify Slack for Negative Tweets: When a tweet is tagged Negative, a Slack message with the tweet link is sent to the designated channel.

👤 Who can use this? This workflow is ideal for:

- 📢 Social media teams
- 🧠 PR and brand managers
- 🧑‍💻 Solo founders
- 🏢 Startups & SaaS companies

Stay ahead of your brand's reputation, automatically.

🛠 Customization Ideas

- 🎯 Add filters for specific campaign hashtags.
- 📬 Send weekly summary reports via email.
- 📥 Auto-open support tickets for negative mentions.
- 🗣 Expand sentiment categories with more detailed tagging.

🚀 Ready to get started? Just plug in your Apify API token, your OpenAI API key, your Google Sheet, and your Slack channel. Then deploy the workflow, and let it monitor Twitter for you!
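The core loop of this template (skip already-seen tweets, classify the rest, route Negative ones to Slack) can be sketched as below. This is a hedged sketch: `fake_classify` stands in for the OpenAI call, `seen_links` stands in for the Google Sheet lookup, and the tweet shapes are illustrative.

```python
# Links already logged in the sheet (stand-in for the duplicate check).
seen_links = {"https://twitter.com/u/status/1"}

def route(tweets, classify):
    """Dedupe tweets, classify sentiment, and collect Negative ones for Slack."""
    alerts, rows = [], []
    for t in tweets:
        if t["link"] in seen_links:
            continue  # already analyzed on a previous run: skip
        sentiment = classify(t["text"])
        rows.append({"link": t["link"], "text": t["text"], "sentiment": sentiment})
        if sentiment == "Negative":
            alerts.append(t["link"])  # would become a Slack channel message
    return rows, alerts

# Toy classifier in place of the OpenAI sentiment node.
fake_classify = lambda text: "Negative" if "broken" in text else "Positive"

tweets = [
    {"link": "https://twitter.com/u/status/1", "text": "old tweet"},
    {"link": "https://twitter.com/u/status/2", "text": "your app is broken"},
]
rows, alerts = route(tweets, fake_classify)
print(alerts)  # ["https://twitter.com/u/status/2"]
```

In the actual workflow, `rows` corresponds to the Google Sheet append and `alerts` to the Slack node's input; checking duplicates before classifying also keeps OpenAI costs down.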