by Pramod Kumar Rathoure
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## 🚀 Automating Google Maps Lead Generation with n8n + Apify

Finding quality leads can be time-consuming. What if you could scrape restaurant data from Google Maps, filter the best ones, and email a Morning Brew–style newsletter automatically? That's exactly what this n8n workflow does.

## 🔎 What This Workflow Does

- Takes a location input (Bangkok or Bareilly in this case)
- Runs a Google Maps scraper via an Apify Actor
- Extracts restaurant essentials (name, category, rating, reviews, address, phone, Google Maps link)
- Sorts & filters results (only high-review, highly rated places)
- Saves data to Airtable for lead management
- Uses AI to generate a Morning Brew–style HTML email newsletter
- Emails the newsletter automatically to your chosen recipients

## 🛠️ Workflow Breakdown

### 1. Form Trigger
- User selects a location from a dropdown (Bangkok or Bareilly)
- Submits the form to kickstart the process

### 2. Google Maps Scraper
- Powered by Apify
- Collects up to 1,000 restaurants with details: Name, Category, Price Range, Rating, Reviews, Address, Phone, Google Maps URL
- Skips closed places and pulls detailed contact data

### 3. Extract & Transform Data
- An n8n Set node extracts only the essentials
- Formats them into a clean text block (Restaurant_Data)

### 4. Sort & Filter
- **Sorted by**: Review Count (descending), then Rating (descending)
- **Filter**: Only restaurants with **500+ reviews**

### 5. Airtable Lead Storage
- Each record is saved to the Google Map Leads - Restaurants Airtable table
- Fields include: Title, Category, Price Range, Rating, Review Count, Address, Phone, Location

### 6. AI-Powered Newsletter
- n8n's LangChain + OpenAI node generates an HTML newsletter
- **Tone**: Breezy, witty, like Morning Brew
- **Content**: Sorted restaurant picks with ratings, reviews, and contact links
- Output is JSON with "Subject" and "Body"

### 7. Automatic Email
- A Gmail node sends the newsletter directly to your inbox
- Example recipient: prath002@gmail.com

## 🎯 Why This Workflow Rocks

- **End-to-End Automation**: From scraping → filtering → emailing, no manual effort
- **Lead Enrichment**: Only keeps high-quality restaurants with strong social proof
- **Scalable**: Works for any city you plug into the form
- **Engaging Output**: AI crafts the results into a ready-to-send newsletter

## 🔮 Next Steps

- Add more cities to the dropdown for multi-location scraping
- Customize the email template for branding
- Integrate with CRM tools for automated outreach

👉 With just a few clicks, you can go from raw Google Maps data → polished newsletter → fresh leads in Airtable. That's the power of n8n + Apify + AI.
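The Sort & Filter step above can be sketched as a plain JavaScript function, as you might write it in an n8n Code node. The field names (`reviewsCount`, `rating`, `title`) are illustrative; the actual Apify scraper output may name them differently.

```javascript
// Keep only restaurants with enough reviews, then sort by review count
// (descending) and break ties by rating (descending).
function sortAndFilterRestaurants(restaurants, minReviews = 500) {
  return restaurants
    .filter((r) => (r.reviewsCount ?? 0) >= minReviews)
    .sort((a, b) => b.reviewsCount - a.reviewsCount || b.rating - a.rating);
}

const sample = [
  { title: "A", rating: 4.2, reviewsCount: 1200 },
  { title: "B", rating: 4.8, reviewsCount: 300 },
  { title: "C", rating: 4.9, reviewsCount: 1200 },
];
// "B" is dropped (under 500 reviews); "C" outranks "A" on rating.
const picks = sortAndFilterRestaurants(sample);
```

In the actual workflow this logic is split across the Sort and Filter nodes rather than written as code, but the resulting ordering is the same.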
by Maksym Brashenko
## Advanced n8n Workflow Sync with GitHub

A robust workflow to back up and synchronize your n8n workflows to a GitHub repository, with intelligent change detection and support for file renames.

## 🎯 Who's it for?

This workflow is for n8n administrators, developers, and power users who want a reliable, automated way to:

- Keep a version-controlled history of their workflows.
- Collaborate on workflows using GitHub's infrastructure.
- Prevent data loss and have a disaster recovery plan for their n8n instance.

## ✨ Key Features (What it does)

- **Intelligent Sync**: Backs up all your n8n workflows to a designated GitHub repository.
- **Human-Readable Filenames**: Saves workflows with filenames based on their actual names in n8n (e.g., My Awesome Workflow.json).
- **Reliable Matching**: Uses the unique n8n workflow ID to reliably track files, even if their names change.
- **Rename Detection**: If you rename a workflow in n8n, it intelligently deletes the old file and creates a new one in a single logical operation.
- **Efficient Commits**: Commits changes to GitHub only when there are actual modifications to a workflow's logic or structure. It performs a deep comparison, ignoring metadata changes.
- **Clear Commit History**: Generates clean, informative commit messages:
  - create: workflowName
  - update: workflowName
  - rename: oldName - newName

## ⚙️ How It Works (Simple Steps)

1. **Get n8n Workflows**: The workflow starts by fetching all your current workflows from n8n.
2. **Get GitHub Files**: At the same time, it lists all existing workflow files from your GitHub repository.
3. **Compare & Decide**: It then compares each n8n workflow with its GitHub counterpart. It checks if anything changed, if it was renamed, or if it's new.
4. **Take Action**:
   - If a workflow is new, it's created on GitHub.
   - If a workflow is updated, the file content is changed on GitHub.
   - If a workflow was renamed, the old file is deleted and a new one is created.
   - If nothing changed, the workflow is skipped.
5. **Send Report**: Finally, it can send a summary report to Telegram about what happened.

## 🚀 How to Set Up

1. **Credentials**:
   - GitHub: Go to Credentials > New and add your GitHub credentials. You'll need a token with repo permissions.
   - n8n API: In the same Credentials section, create n8n API credentials. You'll need your n8n instance's Base URL and an API key (you can create one in your n8n user settings).
   - Telegram (optional): If you want notifications, add your Telegram Bot credentials.
2. **Configure the Workflow**: Open the Configuration node (the green one at the start) and fill in the following values:
   - repo.owner: Your GitHub username or organization name.
   - repo.name: The name of the repository for backups.
   - repo.path: The folder inside the repository to store workflows (e.g., workflows/).
   - report.tg.chatID (optional): Your Telegram chat ID for notifications. Set to 0 to disable.
   - report.verbose: Set to true to receive a report even if there were no changes.
3. **Connect Credentials**: Select your newly created credentials in the following nodes:
   - Get all workflows: Select your n8n API credentials.
   - All GitHub nodes (e.g., List files, Create new file): Select your GitHub credentials.
   - Send a message (Telegram): Select your Telegram credentials.
4. **Set the Schedule**: In the Schedule Trigger node, configure how often you want the backup to run (e.g., every hour, once a day).
5. **Activate the Workflow**: Save the workflow and toggle it to "Active".

## 🔧 How to Customize

- **Change Report Destination**: The final part of the workflow sends a report to Telegram. You can easily replace the Send a message node with a node for Slack, Discord, or email to change where notifications go. The message is pre-formatted in the Render summary node.

## 💡 What's Next? (Future Updates)

This workflow is actively maintained! Here's a sneak peek at what's planned for future versions:

- **Automatic Archive Handling**: The next major update will introduce logic to automatically detect when a workflow is archived in n8n and move it to a dedicated archived/ folder in your GitHub repository, keeping your main backup directory clean.
- **Performance Optimizations**: I'm exploring ways to reduce API traffic by intelligently checking for changes before fetching full workflow data.

To get the latest version with these features when it's released, be sure to follow my profile for new workflow publications!
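The "efficient commits" idea described above (commit only on real changes, ignoring metadata) can be sketched as follows; the metadata field names (`updatedAt`, `versionId`) are assumptions for illustration, not the workflow's exact code.

```javascript
// Strip volatile metadata before comparing, so that only changes to a
// workflow's actual logic or structure trigger a commit.
function stripMetadata(workflow) {
  const { updatedAt, versionId, ...rest } = workflow;
  return rest;
}

// Deep comparison via serialization of the metadata-free objects.
function hasRealChanges(n8nWorkflow, githubWorkflow) {
  return (
    JSON.stringify(stripMetadata(n8nWorkflow)) !==
    JSON.stringify(stripMetadata(githubWorkflow))
  );
}
```

A serialization-based comparison like this assumes consistent key ordering; the real workflow may use a more robust structural diff.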
by Calistus Christian
## What this workflow does

- Pulls free security/tech headlines from multiple RSS feeds (e.g., CISA, BleepingComputer, Krebs, SecurityWeek, Ars Technica, TechCrunch, Hacker News).
- De-duplicates stories, keeps only the last 24 hours, and limits to a manageable number.
- Uses OpenAI to write a concise brief with sections and "Why it matters."
- Sends a clean HTML email via Gmail.

Category: Security / News
Time to set up: ~10–15 minutes
Difficulty: Beginner–Intermediate
Cost: Mostly free (OpenAI tokens + Gmail)

## What you'll need

- n8n (recent version)
- OpenAI credentials
- Gmail (or SMTP) credentials
- A few free RSS feed URLs (swap in/out as you like)

## Set up steps

1. Trigger – Add a Cron node to run daily (pick your time and timezone).
2. Fetch – Add one RSS Read node per source and connect all to a Merge node (append mode).
3. De-duplicate (this run) – Add Remove Duplicates and compare on a stable key (prefer the article URL).
4. Freshness – Add an IF node to pass only items published in the last 24 hours.
5. Limit – Add a Limit node to cap the total items (e.g., 25).
6. Summarize – Add OpenAI → Message a model to produce a JSON brief with subject + HTML body.
7. Email – Add Gmail → Send to deliver the brief to your inbox.

## Tips & troubleshooting

- If everything gets discarded at de-dupe while testing, switch to "within current input" or reset the node's stored values.
- If no items pass the IF, widen the date window temporarily (some feeds publish late).
- If the email arrives blank, ensure the Gmail email type is set to HTML and the subject/body fields map to the model's output.

## Sources you can start with (swap freely)

CISA, BleepingComputer, KrebsOnSecurity, SecurityWeek, Ars Technica (Security), TechCrunch (Security), Hacker News (front page).
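The freshness check behind the IF node boils down to a simple date comparison, sketched here as a standalone helper (the function name and `windowHours` parameter are illustrative):

```javascript
// True if the item's publish date falls within the last `windowHours`
// (default 24), and is not in the future.
function isFresh(pubDate, now = new Date(), windowHours = 24) {
  const ageMs = now.getTime() - new Date(pubDate).getTime();
  return ageMs >= 0 && ageMs <= windowHours * 60 * 60 * 1000;
}
```

Widening the troubleshooting "date window" mentioned above is just a matter of raising `windowHours` (or the equivalent IF-node threshold).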
by Rakin Jakaria
## 📄 AI Invoice Agent

The AI Invoice Agent automates invoice creation, email delivery, and status tracking for client billing. It ensures invoices are generated, sent professionally, and updated in Google Sheets with minimal manual work.

## 🔹 How It Works

1. **Trigger** – Activated manually (Execute Workflow) when you want to process invoices.
2. **Fetch Invoices** – Reads client invoice data from a Google Sheet (Client Invoices).
3. **Filter Pending Invoices** – Passes through only invoices with Status = Pending.
4. **Prepare Invoice Data** – Collects and formats details: Invoice ID, Client Name & Address, Project Name, Amount (USD), Invoice Date (today's date), Due Date (7 days later).
5. **Loop Over Invoices** – Processes each invoice one by one.
6. **AI Email Draft** – Uses GPT-4.1-mini to generate a polite, professional email. Tone: friendly but business-oriented. Signed as Upward Engine Team.
7. **Extract Email Parts** – Separates subject and body from the AI output using an Information Extractor.
8. **Generate Invoice PDF** – Uses CraftMyPDF to create a formatted invoice PDF with company details (Upward Engine), client details, Invoice ID, Date, Due Date, amount due, and a footer message.
9. **Send Email to Client** – Sends the invoice email via Gmail, attaching the PDF invoice.
10. **Update Invoice Status** – Updates Google Sheets to mark the invoice as Completed. Saves Invoice ID, Date, Due Date, and the updated status.
11. **Loop Continuation** – Continues until all pending invoices are processed.

## 🔹 Tools & Integrations

- **Google Sheets** → Stores client & invoice data
- **Filter Node** → Selects only Pending invoices
- **GPT-4.1-mini (OpenAI)** → Generates professional emails
- **Information Extractor** → Separates subject & body
- **CraftMyPDF** → Creates PDF invoices
- **Gmail** → Sends invoice emails with PDF attachments

## 🔹 Example Workflow

✅ Google Sheets: Invoice marked as Pending
➡️ AI generates email → "Invoice INV-1023 for Web Design Project – Due Sep 5"
➡️ PDF invoice created & attached
➡️ Email sent to client with subject + body
➡️ Status updated in Google Sheet → Completed

⚡ This agent ensures zero missed invoices, professional client communication, and up-to-date tracking — fully automated for agencies and small businesses.
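The date logic in the Prepare Invoice Data step (invoice date today, due date seven days later) can be sketched like this; the helper name and output field names are illustrative, not the workflow's actual expressions.

```javascript
// Compute invoice and due dates as YYYY-MM-DD strings.
function invoiceDates(today = new Date()) {
  const due = new Date(today);
  due.setDate(due.getDate() + 7); // due date is 7 days after the invoice date
  const fmt = (d) => d.toISOString().slice(0, 10);
  return { invoiceDate: fmt(today), dueDate: fmt(due) };
}
```

In n8n these values would typically be produced with expressions like `$now` plus an interval, but the arithmetic is the same.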
by Mezie
## What it does

Submit a LinkedIn profile URL through a form. The workflow finds their email and company info using Wiza, then researches the prospect and their company with Perplexity AI to uncover recent news, growth signals, and pain points. Your choice of AI model uses that research to write a personalized icebreaker email with a relevant hook. The finished draft shows up in your Gmail inbox, ready to review and send.

## Who's it for

Sales teams, recruiters, and marketers scaling personalized outreach without manual research.

## Requirements

- n8n (self-hosted or cloud)
- Wiza API key
- OpenAI API key
- Perplexity API key
- Gmail OAuth2 credentials

## How to set up

1. Import the workflow JSON into n8n
2. Configure Wiza, OpenAI, Perplexity, and Gmail credentials
3. Create Leads and Case Studies data tables in n8n
4. Update the business context in the "Your Offer" node
5. Activate the workflow and use the form URL

## How to customize

- Modify email templates in the "Ice Breaker Email Generator" prompt
- Update the business profile and case studies for relevance
- Adjust AI model settings for tone and creativity
by Davide
This workflow automates the creation of realistic multi-speaker podcasts using the ElevenLabs v3 API by reading a script from Google Sheets and saving the final MP3 file to Google Drive.

1. **Data Source** – Dialogue scripts are stored in a Google Sheet. Each row contains: speaker name (optional), Voice ID (from ElevenLabs), and the text to be spoken.
2. **Data Preparation** – The workflow transforms the spreadsheet content into the proper JSON format required by the ElevenLabs API.
3. **Podcast Generation** – ElevenLabs' Eleven v3 model converts the prepared text into expressive, natural-sounding dialogue. It supports not only speech but also non-verbal cues and audio effects (e.g., [laughs], [whispers], [clapping]).
4. **File Storage** – The generated audio file is automatically uploaded to Google Drive, organized by timestamped filenames.

## Key Advantages

- **Seamless Automation** – From dialogue writing to final audio upload, everything runs automatically in one workflow.
- **Multi-Speaker Support** – Easily assign different voices to multiple characters for dynamic conversations.
- **Expressive & Realistic Output** – Supports emotions, speech styles, and ambient effects, making podcasts more immersive.
- **Flexible Content Input** – Scripts can be collaboratively written and edited in Google Sheets, with no technical knowledge required.
- **Scalable & Reusable** – Can generate multiple podcast episodes in seconds, ideal for content creators, educators, or businesses.
- **Cloud Integration** – Final audio files are securely stored in Google Drive, ready to be shared or published.

## How It Works

The workflow processes a structured script from a spreadsheet and uses AI to generate a realistic conversation.

1. **Manual Trigger**: The workflow is started manually by a user clicking "Execute workflow" in n8n.
2. **Get Dialogue**: The "Get dialogue" node fetches the podcast script data from a specified Google Sheet. The sheet should contain columns for Speaker (optional), Voice ID, and the dialogue Input text.
3. **Prepare Dialogue**: The "Code" node transforms the raw sheet data into the precise JSON format required by the ElevenLabs API. It creates an array of objects where each object contains the text and the corresponding voice_id for each line of dialogue.
4. **Generate Podcast**: The "HTTP Request" node sends a POST request to the ElevenLabs Text-to-Dialogue API endpoint (/v1/text-to-dialogue). It sends the transformed dialogue array in the request body, instructing the API to generate a single audio file with a conversation between the specified voices.
5. **Upload File**: The "Upload file" node takes the audio file response from ElevenLabs and saves it to a designated folder in Google Drive.

## Set Up Steps

To use this workflow, you must complete the following configuration steps:

1. **Prepare the Google Sheet**:
   - Clone the template: Duplicate the provided Google Sheet template into your own Google Drive.
   - Fill the script:
     - Column A (Speaker): Optional. Add speaker names for your reference (e.g., "Host", "Guest").
     - Column B (Voice ID): Mandatory. Enter the unique Voice ID for each line from ElevenLabs.
     - Column C (Input): Mandatory. Write the dialogue text for each speaker. You can use [non-speech audio events] like [laughs] or [whispers] to add expression.
2. **Configure ElevenLabs API Credentials**:
   - Log in to or create a free account on ElevenLabs.
   - Edit the "Generate podcast" node's credentials.
   - Create an HTTP Header Auth credential named "ElevenLabs API". Set the Name to xi-api-key and the Value to your actual ElevenLabs API key.
3. **Configure Google Services**:
   - Google Sheets: Ensure the "Get dialogue" node has valid OAuth credentials and that the documentId points to your copy of the script sheet.
   - Google Drive: Ensure the "Upload file" node has valid OAuth credentials and that the folderId points to the correct Google Drive folder where you want the audio files saved.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
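A minimal sketch of the "Prepare Dialogue" transformation, assuming the sheet columns from the template (Speaker, Voice ID, Input) and an `inputs` array of `{ text, voice_id }` objects in the request body; the exact payload shape should be checked against the ElevenLabs Text-to-Dialogue API reference.

```javascript
// Map spreadsheet rows into the dialogue payload for the API request.
// Rows missing a mandatory Voice ID or Input are skipped.
function buildDialoguePayload(rows) {
  return {
    inputs: rows
      .filter((row) => row["Voice ID"] && row.Input)
      .map((row) => ({
        text: row.Input, // may contain cues like [laughs] or [whispers]
        voice_id: row["Voice ID"],
      })),
  };
}

const payload = buildDialoguePayload([
  { Speaker: "Host", "Voice ID": "voice-id-1", Input: "Welcome back! [laughs]" },
  { Speaker: "Guest", "Voice ID": "", Input: "row skipped: no voice ID" },
]);
```

Inside an actual n8n Code node you would read the rows from `$input.all()` and return the payload as item JSON, but the mapping itself is the same.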
by Kamran habib
## n8n Workflow: AI Reddit Problem Detection & Auto-Solution Commenter 🤖

This n8n workflow automates Reddit community engagement by detecting posts that discuss problems and automatically replying with AI-generated solutions — powered by Google Gemini. It's designed for developers, automation creators, and brands who want to provide helpful, automated responses to Reddit users discussing issues in their niche.

## How It Works

1. The workflow starts with a Manual Trigger (When clicking 'Execute workflow').
2. **Search for a Post**: It scans the r/n8n subreddit (or any subreddit you set) for recent posts containing the keyword "Why I stopped using".
3. **Filter Posts (If Node)**: Filters for posts that have 2 or more upvotes and non-empty text, ensuring only quality discussions are analyzed.
4. **Edit Fields**: Extracts post details such as title, body text, upvotes, creation time, and subreddit ID for AI processing.
5. **AI Agent + Google Gemini Chat Model**: The first AI node analyzes the post and decides whether it describes a problem or frustration related to AI automation. Gemini responds with "Yes" or "No."
6. **Conditional Branch (If1 Node)**: If "Yes," the post is confirmed as discussing a problem, and the workflow triggers the second AI Agent.
7. **AI Agent 2 + Gemini**: The second AI node uses Gemini to generate a helpful and concise solution addressing the issue mentioned in the Reddit post (for example, offering a fix, suggestion, or new idea).
8. **Merge & Log Data**: The AI's findings (post details + solution) are merged and saved into a connected Google Sheet for tracking community insights.
9. **Comment on Reddit**: The workflow automatically posts the AI-generated solution as a comment reply on the original Reddit thread, engaging users directly.

## How To Use

1. Import the provided JSON workflow into your n8n dashboard.
2. Set up the required credentials:
   - Reddit OAuth2 API – for searching and posting comments.
   - Google Gemini (PaLM) API – for AI text analysis and solution generation.
   - Google Sheets API – for logging post data and AI results.
3. Adjust the subreddit name, search keyword, or prompts to fit your niche.
4. Click Execute Workflow to run the automation.

## Requirements

- Reddit Developer Account (OAuth2 credentials).
- Google Gemini (PaLM) API account for AI processing.
- Google Sheets account for saving analysis results.

## How To Customize

- Change the search keyword (e.g., "help with automation," "issue with API," etc.).
- Modify the AI prompts to tailor the solution style (technical, friendly, educational, etc.).
- Edit the Google Sheet fields to capture more or fewer details.
- Enable/disable the comment node if you want to manually approve replies before posting.
- Adjust the Gemini model name (e.g., models/gemini-2.0-flash) or parameters for faster or more creative responses.
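The If-node condition (2+ upvotes, non-empty text) can be sketched as a small predicate; the field names `ups` and `selftext` mirror Reddit's API, but how the workflow actually maps them is an assumption here.

```javascript
// Keep only posts with at least `minUps` upvotes and a non-blank body,
// so that link-only or low-engagement posts are skipped.
function isQualityPost(post, minUps = 2) {
  return (post.ups ?? 0) >= minUps && (post.selftext ?? "").trim() !== "";
}
```

Raising `minUps` is a quick way to trade volume for discussion quality when you customize the keyword search.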
by Rosh Ragel
## Automatically Send Square Summary Report for Yesterday's Sales via Gmail

## What It Does

This workflow automatically connects to the Square API and generates a daily sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run daily and pull the previous day's sales into a CSV file, which is then sent to a manager/finance team for analysis.

This workflow builds on my previous template, which allows users to automatically pull data from the Square API into n8n for processing. (See here: https://n8n.io/workflows/6358)

## Prerequisites

To use this workflow, you'll need:

- A Square API credential (configured as a Header Auth credential)
- A Gmail credential

## How to Set Up Square Credentials

1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to Authorization
4. Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

## How It Works

1. **Trigger**: The workflow runs every day at 4:00 AM
2. **Fetch Locations**: An HTTP request retrieves all Square locations linked to your account
3. **Fetch Orders**: For each location, an HTTP request pulls completed orders for the specified report_date
4. **Filter Empty Locations**: Locations with no sales are ignored
5. **Aggregate Sales Data**: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report
6. **Create CSV File**: A CSV file is created containing the relevant data
7. **Send Email**: An email is sent to the chosen third party

## Example Use Cases

- Automatically send Square sales data to management to improve the quality of planning and scheduling decisions
- Automatically send data to an external third party, such as a landlord or agent, who is paid via commission
- Automatically send data to a bookkeeper for entry into QuickBooks

## How to Use

1. Configure both HTTP Request nodes to use your Square API credential
2. Set the workflow to Active so it runs automatically
3. Enter the email address of the person you want to send the report to and update the message body
4. If you want to remove the n8n attribution, you can do so in the last node

## Customization Options

- Add pagination to handle locations with more than 1,000 orders per day
- Instead of a daily summary, you can modify this workflow to produce a weekly summary once a week

## Why It's Useful

This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data — whether for operations, finance, or performance monitoring.
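A minimal sketch of the kind of aggregation the Code node performs, assuming Square's convention of reporting money amounts in the smallest currency unit (cents) under `total_money.amount`; the summary fields here are illustrative, and the real node reproduces the full Sales Summary breakdown.

```javascript
// Fold a location's completed orders into a simple daily summary,
// converting cents to whole currency units for the CSV report.
function summarizeOrders(orders) {
  const summary = { orderCount: 0, grossSales: 0 };
  for (const order of orders) {
    summary.orderCount += 1;
    summary.grossSales += order.total_money?.amount ?? 0;
  }
  summary.grossSales /= 100;
  return summary;
}
```

The same fold extends naturally to discounts, taxes, and tips by adding the corresponding money fields to the accumulator.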
by Oneclick AI Squad
This automated n8n workflow leverages AI to monitor construction site progress using daily site photos, providing real-time updates and detailed tracking. The system analyzes images, generates progress summaries, and appends data to a Google Sheet for comprehensive project management.

## Good to Know

- Supports daily photo analysis for construction progress tracking
- Utilizes Google Gemini AI for intelligent image processing and analysis
- Includes memory to maintain context of ongoing construction updates
- Automates data appending to Google Sheets for real-time tracking
- Integrates with scheduled triggers for consistent daily updates

## How It Works

1. **Schedule Trigger** – Initiates the workflow daily to check for new site photos
2. **List a file** – Retrieves the latest photos from the designated drive folder
3. **Track All Images** – Passes each photo to the AI for individual analysis
4. **AI Analysis** – Processes images using Google Gemini to detect construction progress
5. **Append data to a sheet** – Updates a Google Sheet with the analyzed progress data
6. **Send email** – Notifies stakeholders with a daily progress update

## How to Use

1. Import the workflow into n8n
2. Configure the Google Drive API for photo access
3. Set up the Google Gemini AI model for image analysis
4. Configure the Google Sheets API for data appending
5. Set up email notification settings
6. Test with sample photos and monitor updates
7. Adjust AI parameters for accuracy as needed

## Requirements

- Access to the Google Drive API
- Google Gemini AI model integration
- Google Sheets API access
- Email service configuration
- Scheduled trigger setup in n8n

Sheet columns: **Date**, **Photo ID**, **Progress Summary**, **Status**, **Notes**

## Customizing This Workflow

- Modify AI prompts for specific construction milestones
- Adjust photo analysis frequency based on project needs
- Configure custom Google Sheet columns for additional data
- Set up custom email templates for stakeholder updates
- Integrate additional AI models for enhanced analysis
by Yang
## Who is this for?

This workflow is perfect for content marketers, bloggers, SEO professionals, and virtual assistants who need to transform keyword research into complete blog posts without spending hours writing and formatting.

## What problem is this workflow solving?

Writing a blog post from scratch requires research, summarizing content, and structuring it into a polished article. This workflow automates that process by taking a single keyword, fetching related news articles, cleaning the data, and generating a professional blog draft automatically in Google Docs.

## What this workflow does

The workflow begins when a keyword is submitted through a form. It expands the keyword into trending suggestions using Dumpling AI Autocomplete, then fetches recent news articles with Dumpling AI Google News. Articles are filtered to include only those published within the last 1–2 days, then scraped and cleaned for quality text. The aggregated content is sent to OpenAI, which generates a polished blog draft with a clear title. Finally, the draft is saved directly into Google Docs for easy editing and publishing.

## Nodes Overview

1. **Form Trigger – Form Submission (Keywords)**: Starts the workflow when a keyword is submitted through a form.
2. **HTTP Request – Dumpling AI Autocomplete**: Expands the keyword into multiple trending search suggestions.
3. **Split Out – Split Autocomplete Suggestions**: Breaks the list of autocomplete suggestions into individual items for processing.
4. **Loop – Loop Suggestions**: Iterates through each suggestion to process articles separately.
5. **Wait – Delay Between Requests**: Adds a pause to avoid sending too many requests at once.
6. **HTTP Request – Dumpling AI Google News**: Fetches recent news articles for each suggestion.
7. **Split Out – Split News Articles**: Splits the returned news results into individual articles.
8. **Code – Filter Articles (1–2 Days Old)**: Keeps only articles that are between 1 and 2 days old for fresh content.
9. **Limit – Limit Articles**: Restricts the workflow to the top 2 articles for each suggestion.
10. **HTTP Request – Dumpling AI Scraper**: Scrapes and cleans the full text content from the article URLs.
11. **Code – Clean & Prepare Article Content**: Removes clutter like links, images, and unrelated sections to ensure clean input.
12. **Aggregate – Aggregate Articles**: Combines the cleaned article content into one dataset.
13. **OpenAI – Generate Blog Draft**: Uses OpenAI to create a polished blog post draft and title in Markdown format.
14. **Google Docs – Create Blog File**: Creates a new Google Doc with the generated blog title.
15. **Google Docs – Insert Blog Content**: Inserts the full blog draft into the created document.

## 📝 Notes

- Set up Dumpling AI and generate your API key from: Dumpling AI
- OpenAI must be connected with an active API key for blog generation.
- Google Docs must be connected with write permissions to create and update blog posts.
- You can adjust the article filter (currently set to 1–2 days old) in the code node depending on your needs.
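The 1–2-day age filter from the Code node can be sketched like this; the `publishedAt` field name is illustrative, since the actual key depends on the Dumpling AI Google News response.

```javascript
// Keep only articles whose age falls between minDays and maxDays.
// `now` is injectable so the bounds can be tested deterministically.
function filterByAge(articles, minDays = 1, maxDays = 2, now = Date.now()) {
  const day = 24 * 60 * 60 * 1000;
  return articles.filter((a) => {
    const age = now - new Date(a.publishedAt).getTime();
    return age >= minDays * day && age <= maxDays * day;
  });
}
```

Adjusting the filter window mentioned in the Notes section is just a matter of changing `minDays`/`maxDays`.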
by Baris Cem Ant
## Workflow Objective

This n8n workflow automates the entire content creation process by monitoring specified RSS feeds for new articles. It then leverages Google Gemini AI to generate comprehensive, SEO-optimized blog posts inspired by these articles, creates unique cover images, and distributes the final content as a JSON file to stakeholders via Telegram. The primary goal is to automate the end-to-end content pipeline, saving significant time and ensuring a consistent output of high-quality content.

## Step-by-Step Breakdown

1. **Monitor News Sources (RSS Triggers)**: The workflow is triggered periodically (e.g., hourly, weekly) by multiple RSS Feed nodes that monitor sources like Search Engine Journal and TechCrunch for new publications.
2. **Prevent Duplicate Content (Deduplication)**: For each new article fetched from the RSS feeds, the workflow checks an AWS DynamoDB database to see if the article's URL has been processed before. If the link already exists in the database, the process for that item is halted, and a debug notification is sent to Telegram via the "Telegram Debugger" node. This prevents the generation of duplicate content.
3. **AI-Powered Content Generation (Gemini Content Generation)**: If the article is new, its link is passed to a Google Gemini node. Using a highly detailed and structured prompt, Gemini generates a complete blog post in a specific JSON format. This output includes a title, meta description, SEO-friendly slug, a descriptive prompt for generating a cover image, and the full markdown body of the article (including an introduction, subheadings, conclusion, FAQ section, etc.).
4. **Data Cleaning and Parsing (JSON Parser)**: The raw text response from the AI is processed by a "Code" node. This custom script cleans the output—removing markdown code blocks and fixing potential syntax errors—and reliably parses it into a valid JSON object, ensuring the data is clean for subsequent steps.
5. **Image Generation and Cloud Storage**: The image_generation_prompt from the parsed JSON is sent to another Google Gemini node configured for image generation, creating a 1200x630 cover image for the blog post. The newly created image is renamed using the slug. Finally, the image is uploaded to a cloud storage service like Cloudflare R2. If the upload fails, an error message is sent to Telegram.
6. **Final Data Assembly and Distribution**: The generated text content is merged with the URL of the uploaded image to create the final, complete blog post data object. This entire data structure is converted into a JSON file named [slug].json. In the final step, this JSON file is sent as a document to the designated recipients via the Telegram nodes.

## Technologies and Services Used

- **Trigger**: RSS Feed Reader
- **Artificial Intelligence**: Google Gemini (for both text and image generation)
- **Database**: AWS DynamoDB (for content deduplication)
- **Cloud Storage**: Cloudflare R2 (S3-compatible)
- **Notification & Distribution**: Telegram
- **Data Processing**: n8n's native nodes (Merge, If, Set, Code)
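A minimal sketch of the fence-stripping-and-parsing step performed by the "Code" node; the real script also repairs syntax errors in the model output, which this sketch omits.

```javascript
// Remove markdown code fences (``` or ```json) wrapping the model's
// response, then parse the remaining text as JSON.
function parseModelJson(raw) {
  const cleaned = raw.replace(/`{3}(?:json)?/g, "").trim();
  return JSON.parse(cleaned);
}

// Simulated model output: a JSON object wrapped in a ```json fence.
const fence = "`".repeat(3);
const raw = fence + "json\n" + JSON.stringify({ title: "Hello", slug: "hello" }) + "\n" + fence;
const parsed = parseModelJson(raw);
```

Models frequently wrap structured output in fences even when asked not to, so this normalization makes the downstream image and file steps reliable.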
by Candra Reza
Unleash the full potential of your website's search engine performance and user experience with this all-in-one n8n automation template. Designed for SEO professionals and webmasters, this suite provides meticulous on-page and technical SEO auditing, deep insights into Core Web Vitals (LCP & INP), and an intelligent AI-powered chatbot for instant insights and troubleshooting.

## Key Features

- **Comprehensive On-Page SEO Audit**: Automatically checks for missing or malformed titles, meta descriptions, H1s (including multiple H1s), missing alt text on images, and canonical tag issues.
- **Detailed Technical SEO Scan**: Verifies HTTPS implementation, robots.txt accessibility and content, and sitemap.xml presence.
- **Core Web Vitals Monitoring**: Leverages Google PageSpeed Insights to continuously track and alert on critical performance metrics like Largest Contentful Paint (LCP) and Interaction to Next Paint (INP).
- **AI-Powered Analysis & Recommendations**: Integrates advanced AI models (ChatGPT, Claude, or Gemini) to analyze audit findings, provide actionable recommendations for improvements, and even suggest better alt text for images based on content context.
- **Intelligent SEO Chatbot**: A dynamic chatbot triggered by webhooks understands natural language queries, extracts entities (URLs, keywords, SEO topics), and provides instant, AI-generated answers about SEO best practices, Core Web Vitals explanations, or even specific site data (via Google Search Console integration).
- **Automated Reporting & Alerts**: Logs all audit data to Google Sheets for historical tracking and sends real-time Slack alerts for critical SEO issues or performance degradations.

Streamline your SEO workflow, ensure optimal website health, and react swiftly to performance challenges. This template is your ultimate tool for staying ahead in the competitive digital landscape.