by Influencers Club
How it works: Discover and filter creators using social platform data, find lookalikes, and add them to your CRM, whether for an influencer marketing platform or for creator outreach. A step-by-step workflow to discover new creators across multiple social platforms (Instagram, TikTok, YouTube, Twitter, OnlyFans, Twitch, and more) with profiles, analytics, and metrics via the influencers.club API, find similar creators, and create or update records in HubSpot. Set up: HubSpot (can be swapped for any CRM, such as Salesforce, or for Google Sheets) and the influencers.club API.
by Kevin Meneses
What this workflow does
This template extracts high-intent SEO keywords from any web page and turns them into a ranked keyword list you can use for content planning, landing pages, and SEO strategy. It runs in 3 phases:
1. Scrape the target URL with Decodo (Decodo – Web Scraper for n8n)
2. Use AI to extract seed keywords and understand the page topic
3. Enrich each seed keyword with real Google SERP data via SerpApi (related searches + questions + competitors), then apply a JavaScript scoring system to rank the best opportunities
The final output is saved to Google Sheets as a clean table of ranked keywords.
Who this workflow is for
SEO consultants and agencies; SaaS marketers and growth teams; founders validating positioning and messaging; content teams looking for "what people actually search for". This workflow is especially useful when you want keywords with commercial / solution intent, not generic single-word terms.
Workflow overview
Phase 1 — Scrape & clean page content: reads the URL from Google Sheets, scrapes the page via Decodo, and cleans the HTML into plain text (token-friendly).
Phase 2 — AI keyword extraction: the AI returns structured JSON with the brand / topic, 5–10 mid-tail seed keywords, and intent + audience hints.
Phase 3 — SERP enrichment + scoring: uses SerpApi to fetch related searches, People Also Ask questions, and competitor domains, then scores and ranks keywords based on source type (related searches / PAA / organic), frequency across seeds, modifiers (pricing, best, free, docs, etc.), and mid-tail length preference.
Setup (step by step)
1) Google Sheets (input): create a sheet with a column named urls, one URL per row.
2) Google Sheets (output): create an output sheet with columns such as keyword, score, intent_hint, and source_type. Tip: clear the output sheet before each run if you want a clean export.
3) Decodo: add your Decodo credentials. The URL is taken automatically from Google Sheets (Decodo – Web Scraper for n8n).
4) SerpApi: add your SerpApi key in the SerpApi node.
5) AI Model: connect your preferred AI model (Gemini / OpenAI). The prompt is optimized to output valid JSON only.
Self-hosted disclaimer This is a community template. You must configure your own credentials (Google Sheets, Decodo, SerpApi, AI). Results depend on page accessibility and page content quality.
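As an illustration of the Phase 3 scoring idea described above, here is a minimal JavaScript sketch. The function name, weights, and modifier list are invented for the example and are not the template's actual code:

```javascript
// Hypothetical scoring sketch: ranks a keyword by SERP source type,
// frequency across seeds, commercial-intent modifiers, and mid-tail length.
const MODIFIERS = ["pricing", "best", "free", "docs", "vs", "alternative"];

function scoreKeyword(keyword, sourceType, frequency) {
  let score = 0;
  // Source type: related searches and People Also Ask carry stronger intent.
  if (sourceType === "related") score += 3;
  else if (sourceType === "paa") score += 2;
  else score += 1; // organic competitor keywords
  // Frequency across the seed keywords.
  score += frequency * 2;
  // Commercial / solution-intent modifiers.
  const lower = keyword.toLowerCase();
  score += MODIFIERS.filter((m) => lower.includes(m)).length * 2;
  // Mid-tail preference: 2-4 words score best.
  const words = keyword.trim().split(/\s+/).length;
  if (words >= 2 && words <= 4) score += 2;
  return score;
}
```

The exact weights in the real template may differ; the point is that each signal contributes additively so the ranked output favors mid-tail, high-intent phrases.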
by koichi nagino
Overview This workflow, "Mood Graph Studio," offers a comprehensive solution to track and visualize your emotional well-being. By simply inputting a single sentence about your mood, this template uses AI to perform sentiment analysis, generates a visual graph via Wolfram Alpha, provides personalized feedback, and logs everything to Google Sheets. It is designed for anyone interested in mindfulness, self-reflection, or quantified self-tracking. How It Works The workflow is divided into two main API functionalities plus a manual trigger for easy testing. Analyze a Single Mood (/mood endpoint) An AI Agent (OpenAI) quantifies your mood text into valence (positivity) and energy (activity). A query is sent to Wolfram Alpha to generate a simple linear graph based on the score. A second AI Agent provides short, encouraging advice in Japanese. The complete entry is logged as a new row in Google Sheets. Returns a JSON response containing the full analysis and the graph image. Generate Mood History Graph (/history endpoint) Retrieves historical mood data for a specified user from Google Sheets. A Code node formats the data into a time-series plot query. **Wolfram Alpha** generates a line graph visualizing the mood trend over time. The resulting graph is automatically posted to Slack. How to Set Up 1. Credentials You must add your own credentials for the following services in the respective nodes: **OpenAI**: used in both Chat Model nodes. **Google Sheets**: used in the "Log Mood" and "Get History" nodes. **Slack**: used in the "Send History" node. 2. Wolfram Alpha App ID This workflow uses the HTTP Request node to call the Wolfram Alpha API. Get a free App ID from the Wolfram|Alpha Developer Portal. Paste your App ID into the appid parameter value in both Generate...Graph nodes. 3. Google Sheet Configuration Create a new Google Sheet. Paste the Sheet ID into the Document ID field in both Google Sheets nodes.
**Crucially**, ensure the first row has the following headers exactly (case-sensitive): userId, moodText, valence, energy, createdAt, wolframQuery, feedback. How to Use For testing: use the Manual Trigger, modify the sample text in the "Set Test Data" node, and click "Execute Workflow" on the canvas. For production: activate the workflow and send POST requests to the Production URL of the Webhook nodes.
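A production call to the /mood endpoint might be built like the sketch below. The webhook path and request field names are assumptions that mirror the sheet headers above; check your Webhook node's configured path and expected body before relying on them:

```javascript
// Hedged sketch: builds a POST request for the /mood webhook.
// "webhook/mood" and the userId/moodText fields are assumed names.
function buildMoodRequest(baseUrl, userId, moodText) {
  return {
    url: `${baseUrl}/webhook/mood`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId, moodText }),
  };
}

// Usage: const req = buildMoodRequest("https://n8n.example.com", "u1", "Feeling calm");
// then: const res = await fetch(req.url, req);
```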
by Shashwat Singh
This workflow automatically detects duplicate files uploaded to a specific Google Drive folder by generating an MD5 hash of each file and comparing it against a Supabase database. If a duplicate is found, the file is moved to a dedicated Duplicates folder and a Slack notification is sent. All events, including unique uploads, duplicates, race conditions, and errors, are logged for audit purposes. It is designed for teams that handle high file volumes and need reliable, content-based deduplication instead of simple filename checks. How it works Monitors a specific Google Drive folder for new files. Normalizes file metadata and downloads the binary content. Generates an MD5 hash from the file binary. Checks Supabase to see if the hash already exists. If duplicate, moves the file to a Duplicates folder and sends a Slack alert. If unique, stores the hash in Supabase. Logs every outcome, including errors and race conditions, in an audit table. Setup steps Connect your Google Drive account and select the folder to monitor. Connect your Supabase account and create the required tables: file_hashes and dedup_audit_log. Connect your Slack account and select a channel for duplicate alerts. Update the Duplicates folder ID in the Google Drive Move node. Setup typically takes 10 to 15 minutes if your Supabase project is ready.
by Madame AI
Auto-generate WordPress posts and social media updates from links with BrowserAct This workflow acts as an intelligent content engine. Simply send a link to your Telegram bot (e.g., a product page or news article), and it will automatically scrape the content, rewrite it into a high-quality blog post using AI, generate custom artwork, and publish it to WordPress while simultaneously creating an engaging Telegram update. Target Audience Content marketers, bloggers, and social media managers looking to repurpose content at scale. How it works Analyze Input: The workflow receives a message via Telegram. An AI Agent classifies it to determine if it's a request to process a link or just a chat. Scrape & Clean: If a link is detected, BrowserAct scrapes the target page. An AI "Content Editor" then cleans the text and structures it into a blog format. Visual Creativity: A separate AI Agent analyzes the original images from the article to understand the style, then generates a unique, artistic image prompt. Generate Assets: Text: An AI writes the full blog post (HTML formatted for WordPress) and a punchy Telegram caption. Image: Google Gemini generates a new header image based on the artistic prompt. Publish: WordPress: The workflow uploads the generated image and publishes the new article. Telegram: It sends the generated image and the engaging caption back to your chat. How to set up Configure Credentials: Connect your Telegram, WordPress, BrowserAct, Google Gemini, and OpenRouter accounts in n8n. Prepare BrowserAct: Ensure the Telegram and WordPress Post Architect template is saved in your BrowserAct account. Configure Telegram: Ensure your bot is created via BotFather and the API token is added to the Telegram credentials. Activate: Turn on the workflow. Test: Send an article link to your bot to see it transform into a new post. Requirements **BrowserAct** account with the **Telegram and WordPress Post Architect** template. **WordPress** account (Application Password).
**Telegram** account (Bot Token). **Google Gemini** & **OpenRouter** accounts. How to customize the workflow Change Image Style: Modify the system prompt in the Image Prompt agent to enforce a specific art style (e.g., "Minimalist Line Art" or "Cyberpunk"). Add SEO: Update the Generate Web Structure agent to automatically include SEO keywords or meta descriptions in the JSON output. Multi-Platform: Add a Twitter or LinkedIn node at the end to cross-post the Telegram update to other social networks. Need Help? How to Find Your BrowserAct API Key & Workflow ID How to Connect n8n to BrowserAct How to Use & Customize BrowserAct Templates
by Ali HAIDER
Receive WhatsApp messages via Whapi, generate AI replies with a local Ollama model, log conversations in Google Sheets, and auto-capture leads — all without touching a cloud LLM. This n8n template builds a fully automated WhatsApp AI CRM using Whapi.cloud for messaging and Ollama for 100% local AI inference — no OpenAI costs, no data leaving your server. How it works A Webhook node receives inbound WhatsApp messages from Whapi.cloud. A Code node extracts the sender's phone, name, message text, and filters out outbound/non-text messages. An IF node ensures only real inbound text messages from customers are processed. Google Sheets is used to fetch that customer's full conversation history, enabling memory across sessions. A Code node builds a full prompt — system instructions + conversation history + new message — passed to the AI model. Ollama (via LangChain LLM Chain node) generates a contextual reply using a local model (default: gemma3:1b). The user message and AI reply are each appended to Google Sheets as conversation history logs. A separate Google Sheets upsert captures or updates the lead record with phone and name. The AI reply is sent back to the customer via Whapi's HTTP API. How to use Set up a Whapi.cloud account and connect a WhatsApp number. Point the webhook to your n8n webhook URL. Create a Google Sheet with a History tab (columns: Phone, Name, Role, Message, Timestamp) and a Leads tab (columns: Phone, Name, CreatedAt). Add your Google Sheets credentials and replace YOUR_GOOGLE_SHEET_ID in the relevant nodes. Run Ollama locally or on your server. Pull the model: ollama pull gemma3:1b. Update the model name in the Ollama node if using a different model. Customise the system prompt inside the Build AI Prompt node to match your business (real estate, support, bookings, etc.). Activate the workflow and send a WhatsApp message to test. 
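The prompt-assembly step described in "How it works" can be sketched like this. The row field names follow the History tab columns above (Role, Message), but the function shape is illustrative rather than the template's actual code:

```javascript
// Illustrative sketch of the "Build AI Prompt" step: system instructions,
// then the Google Sheets conversation history, then the new message.
function buildPrompt(systemPrompt, historyRows, newMessage) {
  const history = historyRows
    .map((row) => `${row.Role}: ${row.Message}`)
    .join("\n");
  return `${systemPrompt}\n\nConversation so far:\n${history}\n\nCustomer: ${newMessage}\nAssistant:`;
}
```

Passing the assembled string to the LLM Chain node gives the local Ollama model memory across sessions without any vector store: the sheet is the memory.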
Requirements Whapi.cloud account (WhatsApp Business API) Ollama running locally or on a self-hosted server Google Sheets (with OAuth2 credentials connected in n8n) Customising this workflow Switch AI models: Swap gemma3:1b for any Ollama-supported model like llama3, mistral, or phi3 depending on your hardware. Change the industry: Edit the system prompt in Build AI Prompt to serve any business — bookings, customer support, sales qualification, etc. Upgrade the CRM: Replace Google Sheets with Airtable, Notion, or a real CRM (HubSpot, Pipedrive) by swapping out the Sheets nodes. Add handoff logic: Insert a condition to escalate to a human agent if the message contains keywords like "speak to someone" or "human". Multi-language: The system prompt already instructs the AI to reply in the customer's language — no extra setup needed. Who is this for It's designed for service businesses (real estate, consultants, agencies) that want to respond to inbound WhatsApp leads instantly, log conversations, and build a simple CRM — all from a single workflow.
by Cheng Siong Chin
How It Works The workflow starts with a scheduled trigger that activates at set intervals. Behavioral data from multiple sources is parsed and sent to the MCDN routing engine, which intelligently assigns leads to the right teams based on predefined rules. AI-powered scoring evaluates each prospect's potential, ensuring high-quality leads are prioritized. The results are synced to the CRM, and updates are reflected on an analytics dashboard for real-time visibility. Setup Steps Trigger: Define schedule frequency. Data Fetch: Configure APIs for all behavioral data sources. MCDN Router: Set routing rules, thresholds, and team assignments. AI Models: Connect OpenAI/NVIDIA APIs and configure scoring prompts. CRM Integration: Enter credentials for Salesforce, HubSpot, or other CRMs. Dashboard: Link to analytics tools like Tableau or Google Sheets for reporting. Prerequisites API credentials: NVIDIA AI, OpenAI, CRM platform; data sources; spreadsheet/analytics access. Use Cases Lead prioritization for sales teams; customer segmentation; automated routing. Customization Adjust routing rules, add custom scoring models, modify team assignments, expand data sources, integrate additional AI providers. Benefits Reduces manual lead routing by 90%; improves scoring accuracy; accelerates the sales cycle; enables data-driven team assignments.
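Threshold-based routing of the kind the MCDN engine applies can be illustrated as follows. The scores, thresholds, and team names here are invented for the example and are not part of the template:

```javascript
// Hypothetical rule-based lead routing: first rule whose threshold the
// AI score clears wins; everything below the lowest threshold is nurtured.
function routeLead(lead, rules) {
  for (const rule of rules) {
    if (lead.score >= rule.minScore) return rule.team;
  }
  return "nurture"; // below every threshold
}

// Rules ordered from highest threshold to lowest.
const rules = [
  { minScore: 80, team: "enterprise" },
  { minScore: 50, team: "smb" },
];
```

In the real workflow these rules live in the MCDN Router node and the score comes from the AI scoring step; the point is that routing stays a pure, auditable function of score and rules.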
by vinci-king-01
Certification Requirement Tracker with Rocket.Chat and GitLab
⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.
This workflow automatically scrapes certification-issuing bodies once a year, detects any changes in certification or renewal requirements, creates a GitLab issue for the responsible team, and notifies the relevant channel in Rocket.Chat. It helps professionals and compliance teams stay ahead of changing industry requirements and never miss a renewal.
Pre-conditions/Requirements
Prerequisites: an n8n instance (self-hosted or n8n.cloud); the ScrapeGraphAI community node installed and activated; a Rocket.Chat workspace with an Incoming Webhook or user credentials; a GitLab account with at least one repository and a Personal Access Token (PAT); access URLs for all certification bodies or industry associations you want to monitor.
Required Credentials
**ScrapeGraphAI API Key** – enables web scraping services
**Rocket.Chat Credentials** – either a Webhook URL, or Username & Password / Personal Access Token
**GitLab Personal Access Token** – to create issues and comments via the API
Specific Setup Requirements

| Service | Requirement | Example/Notes |
| --- | --- | --- |
| Rocket.Chat | Incoming Webhook URL OR user credentials | https://chat.example.com/hooks/abc123… |
| GitLab | Personal Access Token with api scope | Generate at Settings → Access Tokens |
| ScrapeGraphAI | Domain whitelist (if running behind a firewall) | Allow outbound HTTPS traffic to target sites |
| Cron Schedule | Annual (default) or custom interval | 0 0 1 1 * for 1-Jan every year |

How it works
This workflow automatically scrapes certification-issuing bodies once a year, detects any changes in certification or renewal requirements,
creates a GitLab issue for the responsible team, and notifies the relevant channel in Rocket.Chat. It helps professionals and compliance teams stay ahead of changing industry requirements and never miss a renewal. Key Steps: **Scheduled Trigger**: fires annually (or any chosen interval) to start the check. **Set Node – URL List**: stores an array of certification-body URLs to scrape. **Split in Batches**: iterates over each URL for scraping. **ScrapeGraphAI**: extracts requirement text, effective dates, and renewal info. **Code Node – Diff Checker**: compares the newly scraped data with last year's GitLab issue (if any) to detect changes. **IF Node – Requirements Changed?**: routes the flow based on change detection. **GitLab – Create/Update Issue**: opens a new issue or comments on an existing one with details of the change. **Rocket.Chat – Notify Channel**: sends a message summarizing any changes and linking to the GitLab issue. **Merge Node**: collects all branch results for a final summary report. Set up steps Setup Time: 15-25 minutes Install Community Node: In n8n, navigate to Settings → Community Nodes and install "ScrapeGraphAI". Add Credentials: a. In Credentials, create "ScrapeGraphAI API". b. Add your Rocket.Chat Webhook or PAT. c. Add your GitLab PAT with api scope. Import Workflow: Copy the JSON template into n8n (Workflows → Import). Configure URL List: Open the Set – URL List node and replace the sample array with real certification URLs. Adjust Cron Expression: Double-click the Schedule Trigger node and set your desired frequency. Customize Rocket.Chat Channel: In the Rocket.Chat – Notify node, set the channel or use an incoming webhook. Run Once for Testing: Execute the workflow manually to ensure issues and notifications are created as expected. Activate Workflow: Toggle Activate so the schedule starts running automatically.
Node Descriptions
Core Workflow Nodes:
**stickyNote – Workflow Notes**: contains a high-level diagram and documentation inside the editor.
**Schedule Trigger** – initiates the yearly check.
**Set (URL List)** – holds certification body URLs and meta info.
**SplitInBatches** – iterates through each URL in manageable chunks.
**ScrapeGraphAI** – scrapes each certification page and returns structured JSON.
**Code (Diff Checker)** – compares the current scrape with historical data.
**If – Requirements Changed?** – switches path based on the diff result.
**GitLab** – creates or updates issues, attaches the JSON diff, sets labels (certification, renewal).
**Rocket.Chat** – posts a summary message with links to the GitLab issue(s).
**Merge** – consolidates batch results for final logging.
**Set (Success)** – formats a concise success payload.
Data Flow: Schedule Trigger → Set (URL List) → SplitInBatches → ScrapeGraphAI → Code (Diff Checker) → If → GitLab / Rocket.Chat → Merge
Customization Examples
Add Additional Metadata to GitLab Issue

```javascript
// Inside the GitLab "Create Issue" node
{
  "title": `Certification Update: ${$json.domain}`,
  "description": `What's Changed?\n${$json.diff}\n\n_Last checked: ${$now}_`,
  "labels": "certification,compliance," + $json.industry
}
```

Customize Rocket.Chat Message Formatting

```javascript
// Rocket.Chat node → JSON parameters
{
  "text": `:bell: Certification Update Detected\n>${$json.domain}\n>See the GitLab issue: ${$json.issueUrl}`
}
```

Data Output Format
The workflow outputs structured JSON data:

```json
{
  "domain": "example-cert-body.org",
  "scrapeDate": "2024-01-01T00:00:00Z",
  "oldRequirements": "Original text …",
  "newRequirements": "Updated text …",
  "diff": "- Continuous education hours increased from 20 to 24\n- Fee changed to $200",
  "issueUrl": "https://gitlab.com/org/compliance/-/issues/42",
  "notification": "sent"
}
```

Troubleshooting
Common Issues
No data returned from ScrapeGraphAI – Confirm the target site is publicly accessible and not blocking bots.
Whitelist the domain or add proper headers via ScrapeGraphAI options. GitLab issue not created – Check that the PAT has api scope and the project ID is correct in the GitLab node. Rocket.Chat message fails – Verify webhook URL or credentials and ensure the channel exists. Performance Tips Limit the batch size in SplitInBatches to avoid API rate limits. Schedule the workflow during off-peak hours to minimize load. Pro Tips: Store last-year scrapes in a dedicated GitLab repository to create a complete change log history. Use n8n’s built-in Execution History Pruning to keep the database slim. Add an Error Trigger workflow to notify you if any step fails.
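The "Code (Diff Checker)" idea, comparing this year's scraped requirement text with last year's, can be sketched as a line-level set difference. This is a minimal illustration; a production node might use a proper diff library instead:

```javascript
// Minimal diff sketch: reports requirement lines added or removed
// between the previous scrape and the current one.
function diffRequirements(oldText, newText) {
  const clean = (text) =>
    new Set(text.split("\n").map((l) => l.trim()).filter(Boolean));
  const oldLines = clean(oldText);
  const newLines = clean(newText);
  const added = [...newLines].filter((l) => !oldLines.has(l));
  const removed = [...oldLines].filter((l) => !newLines.has(l));
  return { changed: added.length > 0 || removed.length > 0, added, removed };
}
```

The `changed` flag is what the IF node would branch on; `added`/`removed` feed the GitLab issue body.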
by Afigo Sam
🚀 Overview This workflow automatically discovers trending topics, generates engaging social media content using AI, and publishes posts to X (Twitter) and Facebook via Buffer. It combines trend monitoring + AI content generation + automated publishing into one seamless pipeline, helping creators and marketers stay consistent and relevant without manual effort. 🎯 Who is this for Content creators & social media managers Agencies managing multiple accounts Developers building AI-powered automation systems Anyone who wants to post consistently on trending topics ⚙️ What this workflow does Fetches trending topics (e.g., from X or external sources) Filters and formats relevant trends Uses Google Gemini to generate: X (Twitter) tweets Facebook posts Thread posts Formats content for each platform Publishes posts automatically via Buffer API ✨ Key Features Fully automated content pipeline Multi-platform posting (X + Facebook + Thread) AI-generated threads + captions Easily customizable prompts and filters Scalable for multiple niches or accounts 🔧 Setup Instructions Credentials required Google Gemini API key Apify API key Buffer API access token Configuration steps Add your API credentials to the respective nodes Customize the prompt templates for your niche Define your preferred trend source or keywords Set posting frequency (via Cron or trigger node) Optional customization Add moderation/approval step before posting Filter trends by region, hashtags, or keywords Adjust tone (professional, funny, viral, etc.) and language 🧠 How it works (high-level) The workflow monitors trending data, processes it into structured input, and feeds it into AI models to generate platform-specific content. The output is then formatted and sent to Buffer for scheduled or immediate publishing. 
⚠️ Notes & Limitations AI-generated content may require review for accuracy Rate limits may apply depending on APIs used Ensure Buffer supports your connected accounts Trending data source reliability may vary 👋 Need help or want to customize this workflow? 📺 Contact: Fiverr 📩 Consultation: Book Appointment
by Wan Dinie
AI Website Analyzer to Product Ideas with FireCrawl and GPT-4.1 This n8n template demonstrates how to use AI to analyze any website and generate product ideas or summaries based on the website's content and purpose. Use cases are many: Try analyzing competitor websites, discovering product opportunities, understanding business models, or generating insights from landing pages! Good to know At time of writing, Firecrawl offers up to 500 free API calls. See Firecrawl Pricing for updated info. OpenAI API costs vary by model. GPT-3.5 is cheaper while GPT-4 and above offer deeper analysis but cost more per request. How it works We'll collect a website URL via a manual form trigger. The URL is sent to the Firecrawl API, which deeply crawls and analyzes the website content. Firecrawl returns the scraped data, including page structure, content, and metadata. The scraped data is then sent to OpenAI's API with a custom prompt. OpenAI generates an AI-powered summary analyzing what the website is doing, its purpose, and potential product ideas. The final output is displayed or can be stored for further use. How to use The manual trigger node is used as an example, but feel free to replace this with other triggers such as webhook or even a form. You can analyze multiple URLs by looping through a list, but of course, the processing will take longer and cost more. Requirements Firecrawl API key (get free 500 calls at https://firecrawl.dev) OpenAI API key for AI analysis Valid website URLs to analyze Customizing this workflow Change the output format from HTML to JSON, Markdown, or plain text by editing the Firecrawl parameters. Modify the AI prompt to focus on specific aspects like pricing strategy, target audience, or UX analysis. Upgrade to GPT-4.1, GPT-5.1, or GPT-5.2 for more advanced and detailed analysis. Add a webhook trigger to analyze websites automatically from other apps or services. 
Store results in a database like Supabase or Google Sheets for tracking competitor analysis over time.
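The Firecrawl call made by the HTTP Request step can be sketched as below. The endpoint path and body shape are assumptions based on Firecrawl's v1 scrape API, so verify them against the current Firecrawl documentation before use:

```javascript
// Hedged sketch: builds the HTTP request the workflow would send to
// Firecrawl to scrape a target URL as markdown (endpoint shape assumed).
function buildFirecrawlRequest(apiKey, targetUrl) {
  return {
    url: "https://api.firecrawl.dev/v1/scrape",
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ url: targetUrl, formats: ["markdown"] }),
  };
}
```

The returned markdown (plus metadata) is what gets passed to the OpenAI prompt for analysis; switching `formats` changes the output shape as described in the customization notes above.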
by Peliqan
How it works This template is an end-to-end demo of a chatbot using business data from multiple sources (e.g. Notion, Chargebee, HubSpot, etc.) with RAG + SQL. Peliqan.io is used as a "cache" of all business data. Peliqan uses one-click ELT to sync all your business data to its built-in data warehouse, allowing for fast and accurate RAG and Text-to-SQL queries. The workflow writes source data to Supabase as a vector store, for RAG searches by the chatbot. The source URL (e.g. the URL of a Notion page) is added in metadata. The AI Agent decides for each question whether to use RAG, Text-to-SQL, or a combination of both. Text-to-SQL is performed via the Peliqan node, added as a tool to the AI Agent. The user's question in natural language is converted to an SQL query by the AI Agent. The query is executed by Peliqan.io on the source data and the result is interpreted by the AI Agent. RAG is typically used to answer knowledge questions, often on non-structured data (Notion pages, Google Drive, etc.). Text-to-SQL is typically used to answer analytical questions, for example "Show a list of customers with the number of open support tickets and add customer revenue based on invoiced amounts". Preconditions You signed up for a Peliqan.io free trial account. You have one or more data sources, e.g. a CRM, ERP, accounting software, files, Notion, Google Drive, etc. Set up steps Sign up for a free trial on peliqan.io: https://peliqan.io Add one or more sources in Peliqan (e.g. HubSpot, Pipedrive...) Copy your Peliqan API key under settings and use it here to add a Peliqan connection Run the "RAG" workflow to feed Supabase, changing the name of the table in the Peliqan node "Get table data". Update the list of tables and columns that can be used for SQL in the System Message of the AI Agent. Visit https://peliqan.io/n8n for more information. Disclaimer: This template contains a community node and therefore only works for n8n self-hosted users.
by Jason Stelo
What This Workflow Does You have a Google Sheet where you type in a person's name and set their status to Pending. This workflow checks that sheet on a schedule, finds anyone marked Pending, and automatically researches them using the Perplexity AI API. The results get written back to the same row: current company, location, job title, industry, and social media. When done, the status flips to Done. If anything goes wrong or a field comes back empty, you get a Slack notification. Setup Steps Google Sheet Create a sheet with two tabs. The first tab is your main data sheet with these columns: Name, Status, Current Company, Location, Current Title, Industry, Socials / Others, Error Log. The second tab is called Config and is where you write your research questions using [NAME] as a placeholder for the person's name. Credentials You need three credentials set up in n8n: Google Sheets OAuth2, a Perplexity API key set up as HTTP Header Auth with the value Bearer YOUR_API_KEY, and a Slack API connection pointed at the channel you want alerts sent to. Placeholders After importing the JSON, find and replace all placeholders with your real values: your Google Sheet ID, credential IDs, and Slack channel ID. Run It Type a name in Column A, set Column B to Pending, and execute the workflow. Everything else is automatic; the workflow runs on whatever schedule you set (every 15 minutes is a good default, depending on your usage).
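The [NAME] placeholder convention from the Config tab can be expanded per person with a one-line helper. This is an illustrative sketch, not the workflow's actual code:

```javascript
// Replaces every [NAME] placeholder in a Config-tab question with the
// person's name before the question is sent to Perplexity.
function fillTemplate(question, name) {
  return question.split("[NAME]").join(name);
}
```

Using split/join rather than a regex avoids having to escape the square brackets in the placeholder.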