by Dean Pike
Convert any website into a searchable vector database for AI chatbots. Submit a URL, choose a scraping scope, and this workflow handles everything: scraping, cleaning, chunking, embedding, and storing in Supabase.

What it does
- Scrapes websites using Apify (3 modes: full site unlimited, full site limited, single URL)
- Cleans content (removes navigation, footers, ads, cookie banners, etc.)
- Chunks text (800 characters, markdown-aware)
- Generates embeddings (Google Gemini, 768 dimensions)
- Stores everything in a Supabase vector database

Requirements
- Apify account + API token
- Supabase database with the pgvector extension
- Google Gemini API key

Setup
1. Create a Supabase documents table with an embedding column (vector(768)). Run this SQL query in your Supabase project to enable the vector store setup.
2. Add your Apify API token to all three "Run Apify Scraper" nodes.
3. Add your Supabase and Gemini credentials.
4. Test with a small site (5-10 pages) or a single page/URL first.

Next steps
Connect your vector store to an AI chatbot for RAG-powered Q&A, or build semantic search features into your apps.

Tip: Start with page limits to test content quality before full-site scraping. Review the chunks in Supabase and adjust the Apify filters if needed for better vector embeddings.

Sample Outputs
- Apify actor "runs" in the Apify dashboard, triggered by this workflow
- Supabase documents table with the scraped website content ingested as chunks with vector embeddings
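For reference, a typical setup query for this schema looks like the following sketch, based on the standard Supabase vector store pattern and adjusted to 768 dimensions for Gemini embeddings. Defer to the SQL shipped with the template if it differs:

```sql
-- Enable pgvector and create the documents table (768 dims for Gemini)
create extension if not exists vector;

create table documents (
  id bigserial primary key,
  content text,          -- the chunk text
  metadata jsonb,        -- source URL, chunk position, etc.
  embedding vector(768)
);

-- Similarity search function used by vector store integrations
create function match_documents (
  query_embedding vector(768),
  match_count int default null,
  filter jsonb default '{}'
) returns table (id bigint, content text, metadata jsonb, similarity float)
language plpgsql
as $$
#variable_conflict use_column
begin
  return query
  select id, content, metadata,
    1 - (documents.embedding <=> query_embedding) as similarity
  from documents
  where metadata @> filter
  order by documents.embedding <=> query_embedding
  limit match_count;
end;
$$;
```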
by Rully Saputra
Automated SEO Watchlist: Continuous Audits Powered by Decodo, Gemini and Google Sheets

Automate continuous SEO audits with Decodo and Gemini AI — live data, smart insights, and Google Sheets tracking with team alerts.

Who’s it for
This workflow is designed for SEO specialists, marketing teams, agencies, and website owners who want an effortless, automated way to monitor SEO health. It’s perfect for ongoing audits, content monitoring, and proactive SEO management — without the manual workload.

How it works / What it does
Every five days, the workflow:
1. Reads a list of URLs from Google Sheets.
2. Uses Decodo to fetch live on-page data — titles, meta descriptions, headings, schema, links, and Core Web Vitals.
3. Passes that data to Gemini AI for an advanced SEO analysis and scoring based on key factors (content, metadata, links, speed, and structure).
4. Parses results via a Structured Output Parser for clean JSON output.
5. Stores findings in Google Sheets and sends a Telegram alert when the audit completes.

Why Decodo matters
Decodo is the backbone of this workflow. It powers the real-time page inspection, ensuring Gemini AI has complete, accurate data to analyze. Decodo transforms static audits into live, intelligent monitoring — making your SEO insights far more actionable and reliable.

How to set up
1. Connect your Decodo API credentials.
2. Add your Google Sheets URL list.
3. Configure your Telegram bot credentials.
4. Enable the workflow — it runs automatically every 5 days.

Requirements
- Decodo API credentials
- Google Sheets OAuth connection
- Telegram Bot token
- n8n instance (Cloud or Self-hosted)

How to customize the workflow
- Change the trigger interval in the Schedule Trigger node.
- Modify the SEO Analyzer (LLM Chain) weights for different scoring.
- Extend the Store Result node to integrate with dashboards or databases.
- Adjust the AI prompt for additional SEO checks (e.g., backlinks, readability, image optimization).

✅ Highlights
- Automated SEO auditing
- Real-time data from Decodo
- Smart analysis powered by Gemini AI
- Structured reporting in Google Sheets
- Team notifications via Telegram
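The Structured Output Parser pins the model to a fixed JSON shape. A hypothetical example of a per-URL audit record (field names are illustrative, not the template's actual schema):

```json
{
  "url": "https://example.com/pricing",
  "overallScore": 78,
  "scores": {
    "content": 82,
    "metadata": 70,
    "links": 75,
    "speed": 80,
    "structure": 83
  },
  "issues": [
    "Meta description exceeds 160 characters",
    "Missing H1 heading"
  ],
  "recommendations": [
    "Shorten the meta description",
    "Add a single descriptive H1"
  ]
}
```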
by Madame AI
Analyze job market data with AI to find matching jobs

This n8n template helps you stay on top of the job market by matching scraped job offers against your resume using an AI Agent. It's perfect for job seekers, recruiters, or market analysts who need to find specific job opportunities without manually sifting through countless listings.

Steps to Take
1. **Create a BrowserAct workflow:** Set up the **Job Market Intelligence** template in your BrowserAct account.
2. **Add your BrowserAct token:** Connect your BrowserAct account credentials to the **HTTP Request** node.
3. **Update the workflow ID:** Change the workflow_id value in the **HTTP Request** node to match the one from your BrowserAct workflow.
4. **Connect Gemini:** Add your Google Gemini credentials and update your **resume** inside the prompt in the **AI Agent** node.
5. **Configure Telegram:** Connect your Telegram account and add your channel ID to the **Send a text message** node.

How it works
- The workflow is triggered manually by clicking "Execute workflow," but you can easily set it to run on a schedule.
- It uses an HTTP Request node to start a web scraping task via the BrowserAct API to collect the latest job offers.
- A series of If and Wait nodes monitor the scraping job, ensuring the full data is ready before proceeding.
- An AI Agent node, powered by Google Gemini, processes the job listings and filters them down to the best matches for your resume.
- A Code node then transforms the AI's output into a clean, readable format (see the sketch below).
- Finally, the filtered job offers are sent directly to you via Telegram.

Requirements
- **BrowserAct** API account
- The BrowserAct **"Job Market Intelligence"** template
- **Gemini** account
- **Telegram** credentials

Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase
- Never Manually Search for a Job Again (AI Automation Tutorial)
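A minimal sketch of that Code node ("run once for all items" mode), assuming the agent returns a JSON array. The field names (title, company, url, matchReason) are illustrative, not the template's actual schema:

```javascript
// Turn the AI Agent's JSON output into one readable Telegram message.
const raw = $input.first().json.output ?? '[]';
const matches = typeof raw === 'string' ? JSON.parse(raw) : raw;

const lines = matches.map((job, i) =>
  `${i + 1}. ${job.title} at ${job.company}\n${job.url}\nWhy it fits: ${job.matchReason}`
);

return [{ json: { message: lines.join('\n\n') || 'No matching jobs found today.' } }];
```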
by Madame AI
Find & Qualify Funded Leads with BrowserAct & Gemini This n8n template helps you find new investment leads by automatically scraping articles for funding announcements and analyzing them with an AI Agent. This workflow is ideal for venture capitalists, sales teams, or market researchers who need to automatically track and compile lists of recently funded companies. Self-Hosted Only This Workflow uses a community contribution and is designed and tested for self-hosted n8n instances only. How it works The workflow is triggered manually but can be set to a Cron node to run on a schedule. A Google Sheet node loads a list of keywords (e.g., "Series A," "Series B") and geographic locations to search for. The workflow loops through each keyword, initiating BrowserAct web scraping tasks to collect relevant articles. A second set of BrowserAct nodes patiently monitors the scraping jobs, waiting for them to complete before proceeding. Once all articles are collected, they are merged and fed into an AI Agent node, powered by Google Gemini. The AI Agent processes the articles to identify companies that recently received funding, extracting the Company Name, the Field of Investment, and the source URL. A Code node transforms the AI's JSON output into a clean, itemized format. An If node filters out any entries where no company was found, ensuring data quality. The qualified leads are automatically added or updated in a Google Sheet, matching by "Company" to prevent duplicates. Finally, a Slack message is sent to a channel to notify your team that the lead list has been updated. Requirements BrowserAct** API account for web scraping BrowserAct** n8n Community Node -> (n8n Nodes BrowserAct) BrowserAct* "Funding Announcement to Lead List (TechCrunch)*" Template (or a similar scraping workflow) Gemini** account for the AI Agent Google Sheets** credentials for input and output Slack** credentials for sending notifications Need Help? How to Find Your BrowseAct API Key & Workflow ID How to Connect n8n to Browseract How to Use & Customize BrowserAct Templates How to Use the BrowserAct n8n Community Node Workflow Guidance and Showcase How to Automatically Find Leads from Funding News (n8n Workflow Tutorial)
by Madame AI
AI-Powered Top GitHub Talent Sourcing (by Language & Location) to Google Sheets

This n8n template is a powerful talent sourcing engine that finds, analyzes, and scores GitHub contributors using a custom AI formula. It is ideal for technical recruiters, hiring managers, and team leads who want to build a pipeline of qualified candidates based on specific technical skills and location.

Self-Hosted Only
This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

How it works
- The workflow runs on a Schedule Trigger (e.g., hourly) to constantly find new candidates.
- A BrowserAct node ("Run a workflow task") initiates a scraping job on GitHub based on your criteria (e.g., "Python" developers in "Berlin").
- A second BrowserAct node ("Get details") waits for the scraping to complete. If the job fails, a Slack alert is sent.
- A Code node processes the raw scraped data, splitting the list of developers into individual items.
- An AI Agent, powered by Google Gemini, analyzes each profile. It scores their resume/summary and calculates a final weighted FinalScore based on their followers, repositories, and resume quality (see the sketch below).
- The structured and scored candidate data is then saved to a Google Sheet, using the "Name" column to prevent duplicates.
- A final Slack message is sent to notify you that the GitHub contributors list has been successfully updated.

Requirements
- **BrowserAct** API account for web scraping
- The BrowserAct **"Source Top GitHub Contributors by Language & Location"** template
- **BrowserAct** n8n Community Node -> (n8n Nodes BrowserAct)
- **Gemini** account for the AI Agent
- **Google Sheets** credentials for saving leads
- **Slack** credentials for sending notifications

Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

Workflow Guidance and Showcase
- Automate Talent Sourcing: Find GitHub Devs with n8n & BrowserAct
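To illustrate the idea of a weighted FinalScore, here is a hypothetical Code-node version. The weights and the 0-100 normalisation are assumptions for illustration; the template computes its score inside the AI Agent prompt, so this is not its exact formula:

```javascript
// Weighted candidate score: followers, repositories, and an AI-assigned
// resume quality score (0-100) combined into one FinalScore.
const WEIGHTS = { followers: 0.3, repos: 0.2, resume: 0.5 };

return $input.all().map((item) => {
  const p = item.json;
  const followersScore = Math.min(100, (p.followers / 1000) * 100);
  const reposScore = Math.min(100, (p.repositories / 50) * 100);
  const resumeScore = p.resumeScore ?? 0; // assigned by the AI Agent

  const FinalScore =
    WEIGHTS.followers * followersScore +
    WEIGHTS.repos * reposScore +
    WEIGHTS.resume * resumeScore;

  return { json: { ...p, FinalScore: Math.round(FinalScore) } };
});
```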
by System Admin
Grab our list of chats from Airtable to send a random recipe. If the chat ID isn't in our Airtable, we add it. This is to send a new recipe daily. Recipe data comes from the Spoonacular food API: https://spoonacular.com/food-api/docs. https://spoo...
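A plain Node.js (18+) sketch of the Spoonacular call behind a workflow like this, using the documented GET /recipes/random endpoint; SPOONACULAR_API_KEY is a placeholder for your own key, and in n8n this would typically be an HTTP Request node instead:

```javascript
// Fetch one random recipe from Spoonacular (see the API docs linked above).
const res = await fetch(
  `https://api.spoonacular.com/recipes/random?number=1&apiKey=${process.env.SPOONACULAR_API_KEY}`
);
const { recipes } = await res.json();

// title and sourceUrl are standard fields on Spoonacular's recipe object.
console.log(recipes[0].title, recipes[0].sourceUrl);
```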
by Feras Dabour
AI LinkedIn Content Bot with Approval Loop

This n8n workflow turns your Telegram messenger into a personal assistant for creating and publishing LinkedIn posts. You simply send an idea as a text or voice message, collaboratively edit the AI's suggestion in a chat, and then publish the finished post directly to LinkedIn just by saying "Okay."

What You'll Need to Get Started

This workflow connects three different services, so you will need API credentials for each:

- **Telegram Bot API key:** Get this by talking to the "BotFather" on Telegram. It will guide you through creating your new bot and provide the API token. (New chat with the Telegram BotFather)
- **OpenAI API key:** Required for the "Speech to Text" and "AI Agent" nodes. You'll need an OpenAI account to generate this key. [OpenAI API Platform](https://platform.openai.com)
- **Blotato API key:** This service is used to publish the final post to LinkedIn. You'll need a Blotato account with your LinkedIn profile connected there to get the key. (Blotato platform for social media publishing)

Once you have these keys, add them to the corresponding credentials in your n8n instance.

How the Workflow Operates, Step by Step

Here is a detailed breakdown of how the workflow processes your request and handles the publishing.

1. Input & Initial Processing

This phase captures your idea and converts it into usable text.

| Node Name | Role in Workflow |
| :--- | :--- |
| Start: Telegram Message | This Telegram Trigger node initiates the entire process upon receiving any message from you in the bot. |
| Prepare Input | Consolidates the message content, ensuring the AI receives only one clean text input. |
| Check: ist it a Voice? | Checks the incoming message for text. If the text is empty, it proceeds to voice handling. |
| Get Voice File | If a voice note is detected, this node downloads the raw audio file from Telegram. |
| Speech to Text | Uses the OpenAI Whisper API to convert the downloaded audio file into a text string. |

2. AI Core & Iteration Loop

This is the central dialogue system where the AI drafts the content and engages in the feedback loop.

| Node Name | Role in Workflow |
| :--- | :--- |
| AI: Draft & Revise Post | The main logic agent. It analyzes your request, applies the "System Prompt" rules, drafts the post, and handles revisions based on your feedback. |
| OpenAI Chat Model | Defines the large language model (LLM) used for generating and revising the post. |
| Window Buffer Memory | A memory buffer that stores the last turns of the conversation, allowing the AI to maintain context when you request changes (e.g., "Make it shorter"). |
| Check if Approved | This crucial node detects the specific JSON structure the AI outputs only when you provide an approval keyword (like "ok" or "approved"). A sketch of this check appears at the end of this description. |
| Post Suggestion Or Ask For Approval | Sends the AI's post draft back to your Telegram chat for review and feedback. |

AI Agent System Prompt (Internal Instructions - English)

The agent operates under a strict prompt that dictates its behavior and formatting (found within the AI: Draft & Revise Post node):

> You are a LinkedIn Content Creator Agent for Telegram.
> Keep the confirmation process, but change the output format as follows: > > Your Task > Analyze the user's message: > > * Topic > * Goal (e.g., reach, show expertise, recruiting, personal branding, leads) > * Target Audience > * Tonality (e.g., factual, personal, bold, inspiring) > > Create a LinkedIn post as ONE continuous text: > > * Strong hook in the first 1–2 lines. > * Clear main part with added value, story, example, or insight. > * Optional Call-to-Action (e.g., question to the community, invitation to exchange). > * Integrate hashtags at the end of the post (5–12 suitable hashtags, mix of niche + somewhat broader). > * Readable on LinkedIn: short paragraphs, emojis only sparingly. > > Present the suggestion to the user in the following format: > > Headline: Post Proposal: > Below that, the complete LinkedIn post (incl. hashtags at the end in the same text). > > Ask for feedback: > For example: > "Any changes? (Tone, length, formality, personal vs. professional, more technical content, different hashtags?)" > > If the user requests changes: > Adjust the post specifically based on the feedback. > Again, output only: > Post Proposal: > the revised complete post. > > If the user says “approved”, “ok”, “sounds good”, or similar: > Return exclusively this JSON, without additional text, without Markdown: > > > { > "Post": "The final LinkedIn post as one text, including hashtags at the end" > } > > > Important: > > * Never output JSON before approval, only normal suggestion text. > * The final output after approval consists of only one field: Post. 3\. Publishing & Status Check Once approved, the workflow handles the publication and monitors the post's status in real-time. | Node Name | Role in Workflow | | :--- | :--- | | Approval: Extract Final Post Text | Parses the incoming JSON, extracting only the clean text ready for publishing. | | Create post with Blotato | Uses the Blotato API to upload the finalized content to your connected LinkedIn account. | | Give Blotat 5s :) | A brief pause to allow the publishing service to start processing the request. | | Check post status | Checks back with Blotato to determine if the post is published, in progress, or failed. | | Published? | Checks if the status is "published" to send the success message. | | In Progress? | Checks if the post is still being processed. If so, it loops back to the next wait period. | | Give Blotat other 5s :) | Pauses the workflow before re-checking the post status, preventing unnecessary API calls. | 4\. Final Notification | Node Name | Role in Workflow | | :--- | :--- | | Send a confirmation message | Sends a confirmation message and the direct link to the published LinkedIn post. | | Send an error message | Sends a notification if the post failed to upload or encountered an error during processing. | 🛠️ Personalizing Your Content Bot The true power of this n8n workflow lies in its flexibility. You can easily modify key components to match your unique brand voice and technical preferences. 1\. Tweak the Content Creator Prompt The personality, tone, and formatting rules for your LinkedIn content are all defined in the System Prompt. Where to find it: Inside the AI: Draft & Revise Post node, under the System Message setting. What to personalize: Adjust the tone, change the formatting rules (e.g., number of hashtags, required emojis), or insert specific details about your industry or target audience. 2\. Switch the AI Model or Provider You can easily swap the language model used for generation. 
Where to find it: The OpenAI Chat Model node.
What to personalize:
- Model: Swap the default model for a more powerful or faster alternative (e.g., the gpt-4 family, or models from other providers if you change the node).
- Provider: You can replace the entire Langchain block (including the AI Model and Window Buffer Memory nodes) with an equivalent block using a different provider's Chat/LLM node (e.g., Anthropic, Cohere, or Google Gemini), provided you set up the corresponding credentials and context flow.

3. Modify Publishing Behavior (Schedule vs. Post)

The final step currently publishes immediately, but you might prefer to schedule posts.

Where to find it: The Create post with Blotato node.
What to personalize: Consult the Blotato documentation for alternative operations. Instead of the "Create Post" operation (which often posts immediately), you can typically select a "Schedule Post" or "Add to Queue" operation within the Blotato node. If scheduling, add a step (e.g., a Set node or another agent prompt) before publishing to calculate and pass a Scheduled Time parameter to the Blotato node.
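For reference, here is a minimal sketch of the approval-detection idea from "Check if Approved", written as an n8n Code node. This is an assumption of how such a check can be implemented, not the template's exact expression: the agent returns plain text while drafting and the {"Post": "..."} JSON only after approval, so parsing succeeds exactly once the user has approved.

```javascript
// Try to parse the agent output; JSON with a "Post" field means approved.
const raw = $input.first().json.output ?? '';

let finalPost = null;
try {
  const parsed = JSON.parse(raw);
  if (typeof parsed.Post === 'string') finalPost = parsed.Post;
} catch (e) {
  // Not JSON yet -> still in the draft/revision loop.
}

return [{ json: { approved: finalPost !== null, post: finalPost, draft: raw } }];
```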
by Onur
Template Description:

> Never run out of high-quality LinkedIn content again. This workflow is a complete content factory that takes a simple topic from a Google Sheet, uses AI to research a trending angle, writes a full post, generates a unique and on-brand image, and publishes it directly to your LinkedIn profile.

This template is designed for brands and creators who want to maintain a consistent, high-quality social media presence with minimal effort. The core feature is its ability to generate visuals that adhere to a specific, customizable brand style guide.

🚀 What does this workflow do?
- **Pulls content ideas** from a Google Sheet acting as your content calendar.
- **Uses an AI Researcher** (OpenAI + SerpAPI) to find the most recent and engaging news or trends related to your topic.
- **Employs an AI Writer** to draft a complete, professional LinkedIn post with a catchy title, engaging text, and relevant hashtags.
- **Generates a unique, on-brand image** for every post using Replicate, based on a customizable style guide (colors, composition, mood) defined within the workflow.
- **Publishes the post with its image** directly to your LinkedIn profile.
- **Updates the status** in your Google Sheet to "done" to avoid duplicate posts.

🎯 Who is this for?
- **Marketing Teams:** Automate your content calendar and ensure brand consistency across all visuals.
- **Social Media Managers:** Save hours of research, writing, and design work.
- **Solopreneurs & Founders:** Maintain an active, professional LinkedIn presence without a dedicated content team.
- **Content Creators:** Scale your content production and focus on strategy instead of execution.

✨ Benefits
- **End-to-End Automation:** From a single keyword to a published post, the entire process is automated.
- **Brand Consistency:** The AI image generator follows a strict, customizable style guide, ensuring all your visuals feel like they belong to your brand.
- **Always Relevant Content:** The AI research step ensures your posts are based on current trends and news, increasing engagement.
- **Massive Time Savings:** Automates research, copywriting, and graphic design in one seamless flow.
- **Content Calendar Integration:** Easily manage your content pipeline using a simple Google Sheet.

⚙️ How it Works
1. Get Topic: The workflow fetches the next "Pending" topic from your Google Sheet.
2. AI Research: An AI Agent uses SerpAPI to research the topic and identify a viral angle.
3. AI Writing: A second AI Agent takes the research and writes the full LinkedIn post.
4. Generate Image Prompt: A Code node constructs a detailed prompt, merging the post's content with your defined brand style guide.
5. Generate Image: The prompt is sent to Replicate. The workflow waits and checks periodically until the image is ready.
6. Publish: The generated text and image are published to your LinkedIn account.
7. Update Status: The workflow archives the image to Google Drive and updates the topic's status in your Google Sheet to "done".

📋 n8n Nodes Used
- Google Sheets
- Langchain Agent (with OpenAI & SerpAPI)
- Code
- HTTP Request
- Wait / If
- LinkedIn
- Google Drive

🔑 Prerequisites
- An active n8n instance.
- **Google Account** with Sheets & Drive access (OAuth2 credentials).
- **OpenAI Account & API Key**.
- **SerpAPI Account & API Key** (for the research tool).
- **Replicate Account & API Token**.
- **LinkedIn Account** (OAuth2 credentials).
- A **Google Sheet** with "Topic" and "Status" columns.

🛠️ Setup
1. Import the workflow into your n8n instance.
2. Configure all credentials: Go through the workflow and connect your credentials for Google, OpenAI, SerpAPI, Replicate, and LinkedIn in their respective nodes.
3. Link your Google Sheet: In the 1. Get Pending Topic... node, select your spreadsheet and sheet. Do the same for the final 8. ...Update Status node.
4. Customize your brand style (highly recommended): In the 4. Generate Branded Image Prompt (Code) node, edit the fixedImageStyleDetails variable. Change the RAL color codes and descriptive words to match your brand's visual identity. A hypothetical sketch of this variable follows below.
5. Populate your content calendar: Add topics to your Google Sheet and set their status to "Pending".
6. Activate the workflow!
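To illustrate step 4, here is a hypothetical shape for the fixedImageStyleDetails variable and the prompt assembly. The structure, RAL codes, and field names are illustrative assumptions; open the Code node to see the template's actual variable and edit its values:

```javascript
// Brand style guide merged into every image prompt (values are examples).
const fixedImageStyleDetails = `
Style guide:
- Primary color: RAL 5015 (sky blue), accent: RAL 1023 (traffic yellow)
- Composition: clean, minimal, generous negative space
- Mood: optimistic, professional, modern tech
- No text or logos inside the image
`;

const postTitle = $input.first().json.title ?? ''; // field name assumed

return [{
  json: {
    imagePrompt: `Editorial illustration for a LinkedIn post titled "${postTitle}". ${fixedImageStyleDetails}`,
  },
}];
```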
by plemeo
Who’s it for
Social-media managers, growth hackers, and brands who want to keep their Instagram accounts active by auto-liking posts from specific profiles they track—without scrolling feeds manually.

How it works / What it does
1. Schedule Trigger runs every 2 hours.
2. Profile Post Extractor pulls up to 20 recent posts from each Instagram profile in your CSV.
3. Select Cookie rotates Instagram session cookies.
4. Get Random Post picks one and checks it against instagram_posts_already_liked.csv (see the sketch below).
5. Builds instagram_posts_to_like.csv and uploads it to SharePoint.
6. Phantombuster Autolike Agent likes the post.
7. Liked URLs are appended to prevent duplicates.
8. Wait nodes throttle activity (~12 likes/profile/day).

How to set up
1. Add credentials: Phantombuster API, SharePoint OAuth2.
2. In SharePoint › “Phantombuster” folder create:
   - instagram_session_cookies.txt (one per line)
   - instagram_posts_already_liked.csv (header postUrl)
   - profiles_instagram.csv with profile URLs
3. Adjust the schedule if needed.
4. Activate the workflow—likes will run automatically.

Requirements
- n8n 1.33+
- Phantombuster Growth plan
- Microsoft 365 SharePoint tenant

How to customize
- Add/remove tracked profiles in profiles_instagram.csv.
- Adjust throttle by changing Wait intervals.
- Swap SharePoint for Google Drive/Dropbox if needed.
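A minimal sketch of the Get Random Post step as an n8n Code node, under assumed names: the upstream node reading the CSV is called "Read already liked CSV" here (hypothetical), and posts carry a postUrl field matching the CSV header:

```javascript
// Pick one random post that has not been liked before.
const alreadyLiked = new Set(
  $('Read already liked CSV').all().map((i) => i.json.postUrl)
);

const candidates = $input.all().filter((i) => !alreadyLiked.has(i.json.postUrl));
if (candidates.length === 0) return [];

const pick = candidates[Math.floor(Math.random() * candidates.length)];
return [{ json: { postUrl: pick.json.postUrl } }];
```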
by Eugen
👥 Who the Automation is for

This automation is perfect for bloggers, solopreneurs, business owners, and marketing teams who want to scale SEO content creation. Instead of spending hours on research and drafting, you can go from a single keyword idea to a ready-to-edit WordPress draft in minutes.

⚙️ How the Automation Works
1. Collect keywords in a Google Sheet and mark the ones you want as “prioritized.”
2. Click “Prepare Content” → your keyword(s) are sent to n8n.
3. n8n pulls the top 10 Google SERP results (see the sketch below).
4. AI analyzes competitors (tone, content type, gaps) and creates a content brief.
5. Another AI generates a blog draft based on the brief.
6. The draft is automatically uploaded to WordPress and your sheet updates.

👉 In short: Keyword → SERP → Brief → Draft → WordPress.

🛠 How to Set Up

Full Setup Guide
1. Copy the Google Sheets template.
2. Import the workflow into n8n.
3. Add your API keys: Google Custom Search, Claude AI, and WordPress credentials.
4. Test the webhook connection from Google Sheets.

🎉 Done — you now have a one-click pipeline from keyword idea to WordPress draft.
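A plain Node.js (18+) sketch of the SERP fetch via Google's Custom Search JSON API, the service the setup asks keys for. GOOGLE_CSE_KEY and GOOGLE_CSE_CX are placeholders for your API key and Programmable Search Engine ID; in the workflow this is typically an HTTP Request node:

```javascript
// Fetch the top 10 organic results for one keyword.
const keyword = 'best crm for freelancers'; // example keyword from the sheet

const url =
  'https://www.googleapis.com/customsearch/v1' +
  `?key=${process.env.GOOGLE_CSE_KEY}&cx=${process.env.GOOGLE_CSE_CX}` +
  `&q=${encodeURIComponent(keyword)}&num=10`;

const data = await (await fetch(url)).json();

// title, link, and snippet are standard fields on each result item.
const results = (data.items ?? []).map((r) => ({
  title: r.title,
  link: r.link,
  snippet: r.snippet,
}));
console.log(results);
```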
by Rakin Jakaria
Who this is for
This workflow is for digital marketing agencies or sales teams who want to automatically find business leads based on industry & location, gather their contact details, and send personalized cold emails — all from one form submission.

What this workflow does
This workflow starts every time someone submits the Lead Machine Form. It then:
- **Scrapes business data** (company name, website, phone, address, category) using **Apify** based on business type & location.
- **Extracts the best email address** from each business website using **Google Gemini AI**.
- **Stores valid leads** in **Google Sheets**.
- **Generates cold email content** (subject + body) with AI based on your preferred tone (Friendly, Professional, Simple).
- **Sends the cold email** via Gmail.
- **Updates the sheet** with send status & timestamp.

Setup
To set this workflow up:
1. Form Trigger – Customize the “Lead Machine” form fields if needed (Business Type, Location, Lead Number, Email Style).
2. Apify API – Add your Apify Actor endpoint URL in the HTTP Request node.
3. Google Gemini – Add credentials for extracting email addresses.
4. Google Sheets – Connect your sheet for storing leads & email status.
5. OpenAI – Add your credentials for cold email generation.
6. Gmail – Connect your Gmail account for sending cold emails.

How to customize this workflow to your needs
- Change the AI email prompt to reflect your brand’s voice and offer.
- Add filters to only target leads that meet specific criteria, e.g., website must exist or email must be verified (a minimal sketch follows below).
- Modify the Google Sheets structure to track extra info like “Follow-up Date” or “Lead Source”.
- Switch Gmail to another email provider if preferred.
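A minimal sketch of such a filter as an n8n Code node, an optional addition rather than part of the template; the email field name is an assumption:

```javascript
// Keep only leads where the AI-extracted email looks syntactically valid.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

return $input.all().filter((item) => {
  const email = String(item.json.email ?? '').trim().toLowerCase();
  return EMAIL_RE.test(email);
});
```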
by moosa
This workflow monitors product prices from BooksToScrape and sends alerts to a Discord channel via webhook when competitors' prices are lower than ours.

🧩 Nodes Used
- Schedule (for a daily or other required schedule)
- If nodes (to check whether checked or unchecked data exists)
- HTTP Request (to fetch the product page)
- Extract HTML (to extract the product price)
- Code (to clean and extract just the price number; see the sketch below)
- Discord Webhook (to send Discord alerts)
- Sheets (to extract and update rows)

🚀 How to Use
1. Replace the Discord webhook URL with your own.
2. Customize the scraping URL if you're monitoring a different site (and adapt the sheet I used).
3. Run the workflow manually or on a schedule.

⚠️ Important
- Do not use this for commercial scraping without permission.
- Ensure the site allows scraping (this example is for learning only).
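A minimal sketch of the price-cleaning Code node, assuming the Extract HTML node outputs a string like "£51.77" in a field named price:

```javascript
// Strip currency symbols and convert the extracted price to a number.
return $input.all().map((item) => {
  const raw = String(item.json.price ?? '');
  const price = parseFloat(raw.replace(/[^0-9.]/g, ''));
  return { json: { ...item.json, price } };
});
```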