by mourya
## What it does

- Captures leads from Facebook Ads
- Sends an instant WhatsApp message to each new lead
- Waits and checks for bookings
- If the lead books at any point during the 5-day nurturing period, the workflow stops the follow-ups and sends confirmations to both the lead and the gym owner
- If the lead has not booked, it continues to nurture them with daily reminders for 5 days
by Automation for you
# Automated AI Article Generation from Google Sheets to WordPress

## Short Description

Transform a Google Sheet into an automated content factory! This workflow reads article topics, scrapes source content, uses AI to create original articles, and publishes drafts to WordPress automatically.

## Full Description

This workflow automates the entire content creation pipeline by connecting Google Sheets, web scraping, AI content generation, and WordPress publishing. It's designed for content marketers, bloggers, and news publishers who need to scale their content production efficiently.

The system monitors a Google Sheet for new article ideas, processes source URLs through a dual-AI system for summarization and content creation, then automatically generates WordPress drafts while tracking everything back to the spreadsheet.

## Who's It For

- **Content marketing agencies** managing multiple clients
- **Bloggers** looking to scale their content output
- **News publishers** automating article aggregation
- **SEO specialists** creating keyword-optimized content
- **Digital marketers** running content campaigns

## How It Works

1. **Sheet Monitoring**: Watches Google Sheets for rows marked "New" in the Flow Status column
2. **Content Processing**: Fetches and analyzes source articles using dual AI agents
3. **Article Generation**: Creates SEO-optimized articles with proper formatting and structure
4. **WordPress Integration**: Automatically publishes drafts to your WordPress site
5. **Status Tracking**: Updates the sheet with progress and final draft links

## How to Set Up

### Prerequisites

- Google Sheets API access (OAuth2)
- OpenAI API key
- WordPress REST API credentials
- Source URLs for article inspiration

### Configuration Steps

1. Clone the workflow into your n8n instance
2. Connect credentials for Google Sheets, OpenAI, and WordPress
3. Update the Google Sheet ID in all Sheet nodes to point to your document
4. Configure the sheet columns to match: Topic, Source, Flow Status, Publish Status, Publish Link
5. Test with one row marked as "New" in your sheet

## Requirements

### n8n Nodes Used

- Google Sheets (read/update operations)
- HTTP Request (web scraping)
- OpenAI/LangChain (AI content processing)
- WordPress (draft creation)
- Code node (content formatting)
- If node (error handling)
- SplitInBatches (item processing)

### External Services

- Google Sheets with the column structure listed above
- OpenAI API access
- WordPress installation with the REST API enabled

## How to Customize the Workflow

### Content Style Adjustments

Modify the "Article Creator" AI node's system prompt to change:

- Writing tone and style
- SEO keyword density
- Article structure and headings
- Call-to-action format

### Source Processing

Adjust the "Article Summarizer" node to:

- Handle different website structures
- Extract specific content elements
- Modify the markdown output format

### Publishing Options

Customize the "Create a Draft" WordPress node to:

- Change the post status from "draft" to "publish"
- Assign different authors or categories
- Add custom fields or tags

### Error Handling

Modify the conditional logic in the "If" node to handle different failure scenarios or add additional validation steps.

**Note:** This workflow uses community nodes (LangChain/OpenAI) and requires a self-hosted n8n instance.

The workflow features comprehensive error handling, real-time status tracking, and batch processing for efficient content pipeline management.
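The "Create a Draft" step boils down to one call against the WordPress REST API. As a minimal sketch of what the n8n WordPress node does on your behalf (the site URL and Application Password here are placeholders), the request looks like this:

```python
import base64
import json

def build_wp_draft_request(site_url, username, app_password, title, html_body):
    """Assemble the REST call behind draft creation:
    POST /wp-json/wp/v2/posts with status set to 'draft'.
    WordPress Application Passwords use HTTP Basic auth."""
    token = base64.b64encode(f"{username}:{app_password}".encode()).decode()
    return {
        "url": f"{site_url}/wp-json/wp/v2/posts",
        "headers": {
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"title": title, "content": html_body, "status": "draft"}),
    }

req = build_wp_draft_request(
    "https://example.com", "bot", "app-password", "My Article", "<p>Body</p>"
)
print(req["url"])  # https://example.com/wp-json/wp/v2/posts
```

Changing `"status": "draft"` to `"publish"` is all the "Publishing Options" customization above amounts to at the API level.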
by Takumi Oku
## Who is this for

This workflow is designed for Innovation Managers, Tech Transfer Offices, and Business Development Representatives looking to find commercial partners for new technologies.

## What it does

This template automates the process of scouting startups that might be a good fit for NASA patents.

1. **Search**: It fetches patents from the NASA Tech Transfer API based on a keyword you define.
2. **Find**: It searches Google to identify startups operating in related fields.
3. **Enrich**: It crawls each identified startup's website to extract context about their business.
4. **Analyze**: Using OpenAI, it scores the "fit" between the patent and the startup and drafts a personalized outreach email.
5. **Save**: High-scoring leads are enriched with LinkedIn company pages and saved directly to a Notion database.

## How to set up

1. **Configuration**: In the Configuration node, set the keyword variable to the technology topic you want to search for (e.g., "robotics").
2. **NASA API**: Get a free API key from api.nasa.gov and enter it in the NASA Patents API node parameters.
3. **Apify**: Connect your Apify account credential. You will need credits to run the google-search-scraper and website-content-crawler actors.
4. **OpenAI**: Connect your OpenAI credential.
5. **Notion**: Create a database with the following properties and connect it in the Create Notion Lead node:
   - Company (Text)
   - Website (URL)
   - LinkedIn (URL)
   - Email (Email)
   - Score (Number)
   - Draft Email (Text)
   - NASA Tech (Text)

## Requirements

- **NASA API Key**: Free to obtain.
- **Apify Account**: Requires the google-search-scraper and website-content-crawler actors.
- **OpenAI API Key**: For analysis and text generation.
- **Notion Account**: To store the leads.
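The NASA Patents API node issues a plain GET against the Tech Transfer endpoint. A hedged sketch of the URL it builds, assuming the documented quirk that the search term is passed as a bare query token rather than a named parameter:

```python
from urllib.parse import quote

def build_patent_search_url(keyword, api_key):
    """Mirror what the 'NASA Patents API' node requests, e.g.
    https://api.nasa.gov/techtransfer/patent/?robotics&api_key=DEMO_KEY
    (DEMO_KEY works for light testing before you register your own key)."""
    return f"https://api.nasa.gov/techtransfer/patent/?{quote(keyword)}&api_key={api_key}"

print(build_patent_search_url("robotics", "DEMO_KEY"))
```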
by Lucas Hideki
## How it works

1. A Webhook receives a job ID and a list of candidate IDs from your database.
2. If the job has no template yet, Prompt 0 reads the job description, automatically extracts mandatory requirements, differentials, and behavioral competencies, and sets the weight of each criterion.
3. For each candidate, 3 prompts run sequentially with accumulated context:
   - **Prompt 1** scores the candidate (0-100) against the job template using calibration anchors to avoid score inflation, plus a breakdown score per criterion.
   - **Prompt 2** receives the score as context and identifies strengths with concrete resume evidence, separating critical gaps (missing mandatory requirements) from secondary gaps (missing differentials).
   - **Prompt 3** receives the gaps as context and generates personalized interview questions for that specific candidate, not generic HR templates.
4. Results are saved directly to PostgreSQL after each candidate.
5. When all candidates are processed, Prompt 4 automatically generates an executive summary of the entire pool with recommendations on who to interview.

## Set up steps

1. Add your OpenAI credentials to all AI nodes (~2 min).
2. Add your PostgreSQL credentials to all Postgres nodes (~2 min).
3. Create the required tables using the SQL schema provided in the workflow sticky note (~5 min).
4. Trigger via POST `/webhook/cv-analyze` with `{ "job_id": 1, "candidate_ids": [1, 2, 3] }`.
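The trigger in step 4 above is a plain JSON POST. A minimal client sketch using only the standard library (the base URL is a placeholder for your n8n instance):

```python
import json
from urllib import request

def build_cv_analyze_trigger(base_url, job_id, candidate_ids):
    """Prepare the POST /webhook/cv-analyze call with the documented body
    shape; send it with urllib.request.urlopen(req)."""
    body = json.dumps({"job_id": job_id, "candidate_ids": candidate_ids}).encode()
    return request.Request(
        f"{base_url}/webhook/cv-analyze",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_cv_analyze_trigger("https://n8n.example.com", 1, [1, 2, 3])
print(req.full_url, req.get_method())
```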
by Lucas Hideki
## How it works

1. Any external system triggers a reminder via webhook with a tenant token: the workflow validates the token, fetches the tenant's channel config and message template from PostgreSQL, renders the message with event variables, and sends it immediately.
2. A schedule trigger runs every minute and queries events approaching their deadline window per tenant; idempotency via a reminders_sent table ensures the same reminder is never sent twice.
3. A built-in n8n form lets you register new tenants with their channel, message template, and timing rules, so no external backend is needed.
4. Every send attempt is logged to the database with status, message sent, and error details.

## Set up steps

1. Add your PostgreSQL credentials to all Postgres nodes (~2 min).
2. Add your Telegram credentials to the Send Message node (~2 min).
3. Create the required tables using the SQL schema provided in the workflow sticky note (~10 min).
4. Register your first tenant at `/form/multi-tenant-register`.
5. Send events via POST `/webhook/multi-tenant-webhook` with the `x-tenant-token` header.
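The reminders_sent idempotency pattern is worth seeing concretely. This is an illustrative sketch using SQLite as a stand-in for Postgres (where the equivalent is `INSERT ... ON CONFLICT DO NOTHING`); the table shape is hypothetical, so use the schema from the workflow's sticky note:

```python
import sqlite3

# A UNIQUE key over the reminder identity makes recording a reminder atomic:
# the second attempt is a no-op, so the same reminder is never sent twice.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE reminders_sent (
    tenant_id INTEGER, event_id INTEGER, reminder_type TEXT,
    UNIQUE (tenant_id, event_id, reminder_type))""")

def should_send(tenant_id, event_id, reminder_type):
    """True only the first time this exact reminder is recorded."""
    cur = db.execute(
        "INSERT OR IGNORE INTO reminders_sent VALUES (?, ?, ?)",
        (tenant_id, event_id, reminder_type))
    db.commit()
    return cur.rowcount == 1  # 1 row inserted = first time; 0 = duplicate

print(should_send(1, 42, "24h"))  # first attempt: send
print(should_send(1, 42, "24h"))  # duplicate: skipped
```

Because the check and the record are a single statement, the guard holds even if two schedule runs overlap.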
by Ertay Kaya
# Apple App Store Connect: Featuring Nominations Report

This workflow automates tracking and reporting app nominations submitted to Apple for App Store featuring consideration. It connects to the App Store Connect API to fetch your list of apps and submitted nominations, stores the data in a MySQL database, and generates a report of all nominations. The report is then exported as a CSV file and can be automatically shared via Google Drive and Slack.

## Key features

- Authenticates with App Store Connect using JWT.
- Fetches all apps and submitted nominations, including details and related in-app events (API documentation: https://developer.apple.com/documentation/appstoreconnectapi/featuring-nominations).
- Stores and updates app and nomination data in MySQL tables.
- Generates a comprehensive nominations report with app and nomination details.
- Exports the report as a CSV file.
- Shares the report automatically to Google Drive and Slack.
- Runs on a weekly schedule, but can also be triggered manually.

## Setup Instructions

1. Obtain your App Store Connect API credentials (Issuer ID, Key ID, and private key) from your Apple Developer account.
2. Set up a MySQL database and configure the connection details in the workflow's MySQL node(s).
3. (Optional) Connect your Google Drive and Slack accounts using the respective n8n nodes if you want to share the report automatically.
4. Update any credentials in the workflow to match your setup.
5. Activate the workflow and set the schedule as needed.

This template is ideal for teams who regularly submit apps or updates for featuring on the App Store and want to keep track of their nomination history and status in a structured, automated way.
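The JWT authentication mentioned above follows Apple's documented token format: an ES256-signed token whose audience is `appstoreconnect-v1` and whose lifetime may not exceed 20 minutes. A sketch of the claim set (signing with your .p8 private key is left to a JWT library such as PyJWT and not shown here):

```python
import time

def build_asc_token_claims(issuer_id, key_id, lifetime=1200):
    """Header and payload for an App Store Connect API token.
    Sign the pair with ES256 using the .p8 private key; Apple rejects
    tokens whose lifetime exceeds 20 minutes (1200 seconds)."""
    now = int(time.time())
    header = {"alg": "ES256", "kid": key_id, "typ": "JWT"}
    payload = {
        "iss": issuer_id,              # Issuer ID from App Store Connect
        "iat": now,
        "exp": now + lifetime,
        "aud": "appstoreconnect-v1",
    }
    return header, payload

header, payload = build_asc_token_claims("57246542-96fe-1a63-e053", "2X9R4HXF34")
print(header["alg"], payload["aud"])  # ES256 appstoreconnect-v1
```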
by sebastian pineda
# AI-Powered Hardware Store Assistant with PostgreSQL & MCP

Supercharge your customer service with this conversational AI agent! This n8n workflow provides a complete solution for a hardware store chatbot that connects to a PostgreSQL database in real time. It uses Google Gemini for natural language understanding and the MCP (Model Context Protocol) nodes to securely expose database operations as tools for the AI agent.

## Key Features

- **Conversational Product Queries**: Allow users to ask for products by name, category, description, or even technical notes.
- **Real-time Inventory & Pricing**: The agent fetches live data directly from your PostgreSQL database, ensuring accurate stock and price information.
- **Automatic Quote Generation**: Ask the agent to create a detailed quote for a list of materials, and it will calculate quantities and totals.
- **Smart Project Advice**: The agent is primed with a system message to act as an expert, helping users calculate materials for projects (e.g., "How much drywall do I need for a 10x12 foot room?").

## Tech Stack & Core Components

### Technologies Used

- **PostgreSQL**: For storing and managing product data.
- **Google Gemini API**: The large language model that powers the agent's conversational abilities.
- **MCP (Model Context Protocol)**: Securely exposes database queries as callable tools without exposing credentials directly to the agent.

### n8n Nodes Used

- `@n8n/n8n-nodes-langchain.agent`: The core AI agent that orchestrates the workflow.
- `@n8n/n8n-nodes-langchain.chatTrigger`: To start a conversation.
- `@n8n/n8n-nodes-langchain.lmChatGoogleGemini`: The connection to the Google Gemini model.
- `n8n-nodes-base.postgresTool`: Individual nodes for querying products by ID, name, category, etc.
- `@n8n/n8n-nodes-langchain.mcpTrigger`: Exposes the PostgresTool nodes.
- `@n8n/n8n-nodes-langchain.mcpClientTool`: Allows the AI agent to consume the tools exposed by the MCP Trigger.

## How to Get Started: Setup & Configuration

Follow these steps to get your AI assistant up and running:

1. **Configure your database**: This template assumes a PostgreSQL database named `bd_ferreteria` with a `productos` table. You can adapt the PostgresTool nodes to match your own schema.
2. **Set up credentials**: Create and assign your PostgreSQL credentials to each of the six PostgresTool nodes, and your Google Gemini API credentials in the Language Model (Google Gemini) node.
3. **Review the system prompt**: The main AI Agent node has a detailed system prompt that defines its persona and capabilities. Feel free to customize it to better fit your business's tone and product line.
4. **Activate the workflow**: Save and activate the workflow. You can now start interacting with your new AI sales assistant through the chat interface!

## Use Cases & Customization

While designed for a hardware store, this template is highly adaptable. You can use it for:

- Any e-commerce store with a product database (e.g., electronics, clothing, books).
- An internal IT support bot that queries a database of company assets.
- A booking assistant that checks availability in a database of appointments or reservations.
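The quote generation described above is simple arithmetic once the agent has resolved products to unit prices. A minimal sketch, where the price lookup is a hypothetical stand-in for a query against the `productos` table:

```python
def build_quote(items, price_list):
    """items: list of (product_name, quantity) pairs.
    price_list: hypothetical name -> unit price mapping standing in for
    the per-product Postgres lookup the agent's tools perform."""
    lines, total = [], 0.0
    for name, qty in items:
        subtotal = round(qty * price_list[name], 2)
        lines.append({"product": name, "qty": qty, "subtotal": subtotal})
        total = round(total + subtotal, 2)
    return {"lines": lines, "total": total}

quote = build_quote(
    [("drywall sheet", 12), ("joint compound", 2)],
    {"drywall sheet": 12.50, "joint compound": 18.00},
)
print(quote["total"])  # 186.0
```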
by Avkash Kakdiya
## How it works

This workflow automatically pulls daily signup stats from your PostgreSQL database and shares them with your team across multiple channels. Every morning, it counts the number of new signups in the last 24 hours, formats the results into a concise report, and posts it to Slack, Microsoft Teams, and Telegram. This ensures your entire team stays updated on customer growth without manual queries or reporting.

## Step-by-step

1. **Daily Trigger & Data Fetching**
   - The Daily Report Trigger runs at 9:00 AM each day.
   - The Fetch Signup Count node queries the customers table in PostgreSQL.
   - It calculates the number of new signups in the last 24 hours using the created_at timestamp column.
2. **Report Preparation**
   - The Prepare Report Message node formats the results into a structured message containing the report date, the signup count, and a clear summary line: "Daily Signup Report - New signups in the last 24h: X".
3. **Multi-Channel Delivery**
   - The prepared message is sent to Slack, Microsoft Teams, and Telegram simultaneously, so every team receives the update in its preferred communication tool.

## Why use this?

- Automates daily customer growth reporting.
- Eliminates manual SQL queries and report sharing.
- Keeps the whole team aligned with real-time growth metrics.
- Delivers updates across Slack, Teams, and Telegram at once.
- Provides simple, consistent reporting every day.
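The two data steps above can be sketched compactly. The SQL in the comment is an assumed shape of the Fetch Signup Count query (the template's actual column and table names may differ), and the formatter mirrors the summary line the Prepare Report Message node produces:

```python
from datetime import date

# Likely shape of the Fetch Signup Count query (Postgres):
#   SELECT COUNT(*) AS signup_count
#   FROM customers
#   WHERE created_at >= NOW() - INTERVAL '24 hours';

def prepare_report_message(signup_count, report_date=None):
    """Format the daily summary line from the query result."""
    d = (report_date or date.today()).isoformat()
    return f"Daily Signup Report ({d}) - New signups in the last 24h: {signup_count}"

print(prepare_report_message(17, date(2024, 5, 1)))
# Daily Signup Report (2024-05-01) - New signups in the last 24h: 17
```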
by Jitesh Dugar
Transform raw product images into fully optimized e-commerce listings in seconds. This workflow automates the bridge between a photo upload and a live product page by combining UploadToURL for hosting, GPT-4o Vision for content generation, and native integrations for Shopify and WooCommerce.

## What This Workflow Does

Turns a single product photo into a comprehensive, SEO-ready store listing:

1. **Captures Product Assets**: Receives an image via mobile upload (binary) or a remote URL via Webhook.
2. **Instant CDN Hosting**: UploadToURL hosts the image and generates a permanent, high-speed link for your store.
3. **Vision AI Analysis**: GPT-4o Vision "looks" at the product to generate titles, HTML descriptions, SEO tags, and even suggested categories.
4. **Smart Platform Routing**: Automatically detects your target platform and formats the data for:
   - **Shopify**: Creates products via GraphQL-compatible REST with full SEO metafields.
   - **WooCommerce**: Creates listings via REST API with Yoast SEO support and marketing blurbs.
5. **Data Enrichment**: Sanitizes SKUs, coerces pricing, and maps inventory data for a production-ready entry.

## Key Features

- **Seamless Asset Hosting**: Uses the UploadToURL community node to eliminate the need for manual cloud storage management.
- **Zero Copywriting Required**: AI generates 5-point bullet features, SEO titles (max 70 chars), and rich HTML descriptions.
- **Dual-Platform Support**: Toggle between Shopify and WooCommerce within a single workflow.
- **Automated Slugs**: Generates URL-friendly "handles" based on AI-suggested product names.
- **Robust Error Handling**: Centralized logic to catch upload or API failures and return structured feedback.

## Perfect For

- **E-commerce managers**: Adding hundreds of products without manual data entry.
- **Dropshippers**: Quickly importing products from supplier URLs with fresh, unique AI copy.
- **Retailers**: Taking photos of new stock on a phone and pushing them live to the store instantly.
- **Agencies**: Automating catalog management for multiple client stores.

## What You'll Need

### Required Integrations

- **UploadToURL**: To host product images and provide public CDN links.
- **n8n Community Node**: `n8n-nodes-uploadtourl` must be installed.
- **OpenAI API**: GPT-4o Vision for image analysis and copywriting.
- **Shopify or WooCommerce**: Credentials for your specific store platform.

### Optional Integrations

- **Google Sheets**: To log all generated product data for an offline backup.
- **Slack**: To notify the team whenever a new product "Draft" is created.

## Quick Start

1. **Import Template**: Copy the JSON and import it into your n8n instance.
2. **Install Node**: Verify the UploadToURL community node is installed.
3. **Set Credentials**: Connect your UploadToURL, OpenAI, and store (Shopify/WooCommerce) accounts.
4. **Set Default Platform**: Configure the DEFAULT_PLATFORM variable (shopify/woocommerce).
5. **Test Upload**: Send a POST request with an image and price to the Webhook URL.
6. **Go Live**: Switch to "Active" to begin your automated catalog expansion.

## Customization Options

- **Pricing Logic**: Add a node to calculate dynamic markups or currency conversions.
- **Publishing Workflow**: Set publishImmediately to false to create all AI products as "Drafts" for human review.
- **Image Processing**: Add watermarking or resizing steps before uploading to the CDN.
- **Multi-Store Routing**: Use tags to route products to different regional store locations.

## Expected Results

- **95% reduction** in manual listing time (from 15 minutes to 30 seconds per product).
- **SEO-optimized listings** from day one with zero manual keyword research.
- **Professional, consistent descriptions** across your entire product catalog.
- **Immediate mobile-to-store** capability for on-the-go inventory management.

## Use Cases

**High-Volume Inventory**: A warehouse team snaps photos of 50 new arrivals; the workflow creates 50 draft listings with descriptions and prices ready for final approval.

**Competitor Migration**: Input a list of product image URLs from a supplier site; the AI rewrites all titles and descriptions to ensure unique content for SEO.

**Boutique E-commerce**: Small business owners can manage their entire store from their smartphone by simply "sharing" a photo to the n8n webhook.

## Pro Tips

- **High-res images**: Better image quality results in significantly more accurate AI feature extraction.
- **SKU naming**: Send a custom SKU in the webhook to maintain sync with your physical inventory or ERP system.
- **Confidence scores**: The AI returns a confidenceScore; you can set a filter to only auto-publish products with a score above 0.9.

Ready to automate your storefront? Import this template and connect UploadToURL to start building your AI-driven product catalog today.

Questions about store-specific fields? Detailed sticky notes inside the workflow explain how to map custom attributes for both Shopify and WooCommerce.
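The confidence-score filter from the Pro Tips can be expressed as a one-line gate, e.g. in a Code or If node. This is an assumed shape of the AI output (only the confidenceScore field is documented above):

```python
def publish_decision(ai_result, threshold=0.9):
    """Return 'publish' only when the vision model's confidenceScore clears
    the threshold; anything else stays a draft for human review."""
    return "publish" if ai_result.get("confidenceScore", 0.0) >= threshold else "draft"

print(publish_decision({"confidenceScore": 0.95}))  # publish
print(publish_decision({"confidenceScore": 0.62}))  # draft
print(publish_decision({}))                         # draft (missing score is unsafe)
```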
by CapSolver
## How it works

1. Triggers at a regular interval or via a webhook request.
2. Solves the AWS WAF challenge, then makes a request to fetch the product page.
3. Extracts product data from the retrieved HTML page.
4. Compares the current and previously stored data to detect any changes.
5. Sends an alert if the data has changed; otherwise logs that nothing changed.
6. Returns results if triggered via webhook.

## Setup steps

- [ ] Configure schedule settings in the 'Every 6 Hours' node.
- [ ] Set up AWS WAF credentials in the 'Solve AWS WAF' nodes.
- [ ] Input the target URL in the 'Fetch Product Page' nodes.
- [ ] Configure the webhook URL in the 'Receive Monitor Request' node.

## Customization

Adjust the target site URL in 'Fetch Product Page' and related nodes to match different sites or specific pages.
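The comparison in step 4 can be done without storing full snapshots: hash the extracted fields canonically and compare against the hash from the previous run. A minimal sketch (the field names in the example dict are illustrative):

```python
import hashlib
import json

def detect_change(previous_hash, product_data):
    """Return (changed, new_hash); persist new_hash for the next run.
    sort_keys makes the serialization canonical, so the same data
    always produces the same hash regardless of dict ordering."""
    snapshot = json.dumps(product_data, sort_keys=True).encode()
    new_hash = hashlib.sha256(snapshot).hexdigest()
    return new_hash != previous_hash, new_hash

changed, h = detect_change(None, {"price": "19.99", "stock": "in stock"})
print(changed)  # True on the first run (no stored hash yet)
changed, h = detect_change(h, {"price": "19.99", "stock": "in stock"})
print(changed)  # False: nothing changed, so no alert
```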
by Madame AI
# Product Hunt Launch Monitor: Scraping & Summarizing Product Hunt Feedback

This n8n template provides automated competitive intelligence by scraping and summarizing Product Hunt launch feedback with a specialized AI analyst. It is essential for product managers, marketing teams, and founders who need to quickly gather and distill actionable insights from competitor launches to inform their own product strategy and positioning.

## Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

## How it works

1. The workflow can be triggered manually but is designed to be easily switched to a Schedule Trigger for continuous competitive monitoring.
2. A Google Sheets node fetches a list of product names you wish to monitor, which the workflow processes in a loop.
3. A BrowserAct node then initiates a web scraping task to collect all the public comments from the specified Product Hunt launch page.
4. An AI Agent, powered by Google Gemini, acts as a competitive intelligence analyst, processing the raw comments.
5. The AI distills the feedback into a structured format: a concise summary, the key positive and negative feedback, and recommendations for how a similar product could succeed.
6. The structured analysis is saved to a Google Sheet for easy review and tracking.
7. Finally, a Slack notification confirms that the Product Hunt results have been processed and updated.

## Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "Product Hunt Launch Monitor" template
- **BrowserAct** n8n community node (n8n Nodes BrowserAct)
- **Gemini** account for the AI Agent
- **Google Sheets** credentials for input and saving the analysis
- **Slack** credentials for sending notifications

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

## Workflow Guidance and Showcase

Steal Your Competitor's Weaknesses (Product Hunt + BrowserAct + n8n)
by Kornel Dubieniecki
# AI LinkedIn Content Assistant using Bright Data and NocoDB

## Who's it for

This template is designed for creators, founders, and automation builders who publish regularly on LinkedIn and want to analyze their content performance using real data. It's especially useful for users who are already comfortable with n8n and want to build data-grounded AI assistants instead of relying on generic prompts or manual spreadsheets.

## What this workflow does

This workflow builds an AI-powered LinkedIn content assistant backed by real engagement data. It automatically:

- Scrapes LinkedIn posts and engagement metrics using Bright Data
- Stores structured post data in NocoDB
- Enables an AI chat interface in n8n to query and analyze your content
- Returns insights based on historical performance (not hallucinated data)

You can ask questions like:

- "Which posts performed best last month?"
- "What content got the most engagement?"
- "What should I post next?"

## Requirements

- Self-hosted or cloud n8n instance
- **Bright Data**: LinkedIn scraping & data extraction
- **NocoDB**: Open-source Airtable-style database
- **OpenAI API**: For AI reasoning & insights

## Setup

1. Import the workflow into your n8n instance
2. Open the Config node and fill in the required variables
3. Connect your credentials for Bright Data, NocoDB, and the OpenAI API
4. Activate the workflow and run the scraper once to populate data

## How to customize the workflow

You can extend this template by:

- Adding new metrics or post fields in NocoDB
- Scheduling regular data refreshes
- Changing the AI system prompt to match your content strategy
- Connecting additional channels (email, Slack, dashboards)

This template is fully modular and designed to be adapted to your workflow.

## Questions or Need Help?

For setup help, customization, or advanced AI workflows, join my free community: Tech Builders Club.

Happy building!

- Kornel Dubieniecki