by galelem
This n8n workflow automates the entire pipeline of generating, formatting, and publishing SEO-rich blog posts to a Blogger site, ideal for auto service businesses.

**What it does:**
- ⏱ Runs on a schedule via the Schedule Trigger
- 📰 Fetches trending news from Mediastack (technology category)
- 🖼 Generates relevant images using the Pexels API
- 🧠 Creates SEO-optimized content using AI agents (LangChain & OpenRouter)
- 📝 Formats content into Blogger-compatible HTML, including title, metadata, images, FAQs, and internal linking
- 🔄 Posts directly to Blogger via the authenticated Google Blogger API (see the sketch below)
- 📢 Sends Telegram notifications with previews and publishing confirmations
- 🔐 Uses secure credentials (no hardcoded API keys)

**Ideal for:**
- Bloggers and marketers looking to automate content creation
- Auto repair, dealership, or detailing businesses maintaining a content strategy
- Agencies managing multiple Blogger-based SEO campaigns
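For orientation, here is a minimal sketch of the final publishing step against the Blogger API v3, assuming an OAuth2 access token and a blog ID are already available. Function and variable names are illustrative, not taken from the workflow itself.

```typescript
// Minimal sketch of the "post to Blogger" step (Blogger API v3), assuming an OAuth2
// access token and blog ID are already available. Names here are illustrative.
interface BloggerPost {
  kind: "blogger#post";
  title: string;
  content: string; // Blogger-compatible HTML produced by the AI formatting step
}

async function publishToBlogger(blogId: string, accessToken: string, post: BloggerPost) {
  // Insert a new post into the given blog
  const res = await fetch(`https://www.googleapis.com/blogger/v3/blogs/${blogId}/posts/`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(post),
  });
  if (!res.ok) throw new Error(`Blogger API error: ${res.status}`);
  return res.json(); // contains the published post's id and URL
}
```

In the workflow itself this call is made by an HTTP Request node using stored Google credentials rather than hand-written code.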
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically discovers and collects information about Stack Overflow user profiles for lead generation. It saves you time by eliminating the need to manually browse through developer profiles and provides a centralized database of potential leads with their technical expertise.

**Overview**
This workflow automatically scrapes Stack Overflow user profiles and extracts key information like developer names, locations, reputation scores, and technical tags. It uses Bright Data to access Stack Overflow without being blocked and AI to intelligently parse user data into a structured format (see the sketch after this entry).

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping Stack Overflow user profiles without being blocked
- **OpenAI**: AI agent for intelligent data extraction and parsing
- **Google Sheets**: For storing and organizing lead information

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and specify the target spreadsheet
5. Customize: Adjust the Stack Overflow URL and user criteria you want to target

**Use Cases**
- **Recruitment Teams**: Find developers with specific technical skills for hiring
- **Business Development**: Identify potential clients or partners in the tech industry
- **Sales Teams**: Build targeted outreach lists for developer-focused products
- **Research**: Gather data on developer communities and skill distributions

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #stackoverflow #leadgeneration #brightdata #webscraping #developers #recruitment #businessdevelopment #salesleads #n8nworkflow #workflow #nocode #leadautomation #developerscraping #techtalent #userprofiles #aiautomation #datamining #prospecting #outreach #techrecruiting #developerleads #stackoverflowscraping #profilescraping #leadcollection #techcommunity #developerdatabase #automatedleads #intelligentscraping
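As a rough illustration of the AI parsing step, the sketch below shows one way to turn raw profile text (already fetched via Bright Data) into a structured lead record using the OpenAI chat completions API. The field names, model choice, and prompt are illustrative assumptions, not the exact prompt used in the workflow.

```typescript
// Hedged sketch of the "parse user data into structured format" step.
// Field names, model, and prompt wording are illustrative assumptions.
interface StackOverflowLead {
  name: string;
  location: string | null;
  reputation: number;
  topTags: string[];   // e.g. ["python", "django", "postgresql"]
  profileUrl: string;
}

async function extractLead(rawProfileText: string, apiKey: string): Promise<StackOverflowLead> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "Extract the developer's name, location, reputation score, top tags, and profile URL from the text. Reply with JSON only.",
        },
        { role: "user", content: rawProfileText },
      ],
      response_format: { type: "json_object" }, // ask for machine-readable output
    }),
  });
  const data = await res.json();
  return JSON.parse(data.choices[0].message.content) as StackOverflowLead;
}
```

Each extracted record then maps one-to-one onto a row appended to the Google Sheet.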
by Davide
This workflow is designed to generate SEO-friendly content with DeepSeek R1 (or V3), publish it on WordPress, and update a Google Sheets document with the details of the created post. Below is a detailed analysis of what each node in the workflow does.

**How It Works**
- **Triggering the Workflow**: The workflow starts with a Manual Trigger node, which is activated when the user clicks "Test workflow" in the n8n interface.
- **Fetching Data**: The Get Ideas node retrieves data from a Google Sheets document. It reads a specific sheet and filters the data based on the "ID POST" column, returning the first matching row.
- **Setting the Prompt**: The Set your prompt node extracts the PROMPT field from the Google Sheets data and assigns it to a variable for use in subsequent steps.
- **Generating Content**: The Generate article node uses an AI model (DeepSeek) to create an SEO-friendly article based on the prompt. The article includes an introduction, 2-3 chapters, and a conclusion, formatted in HTML. The Generate title node uses the same AI model to generate a concise, SEO-optimized title for the article.
- **Publishing on WordPress**: The Create post on WordPress node creates a new draft post on WordPress using the generated title and article content.
- **Generating and Uploading an Image**: The Generate Image node creates a photorealistic image based on the article title using an AI model (OpenAI). The Upload image node uploads the generated image to WordPress as a media file. The Set Image node assigns the uploaded image as the featured image for the WordPress post.
- **Updating Google Sheets**: The Update Sheet node updates the original Google Sheets document with the post details, including the title, post ID, creation date, and row number.

**Set Up Steps**
1. Configure Google Sheets Integration: Set up the Google Sheets node to connect to your Google account and specify the document ID and sheet name to read from and update.
2. Set Up AI Models: Configure the OpenAI nodes (for generating the article, title, and image) with the appropriate API credentials and model settings (e.g., deepseek-reasoner for text generation).
3. Configure WordPress Integration: Set up the WordPress node with your WordPress site's API credentials to allow creating posts and uploading media.
4. Define the Prompt and Content Structure: In the Set your prompt node, ensure the prompt variable is correctly mapped to the data from Google Sheets. In the Generate article and Generate title nodes, define the instructions for the AI model to generate the desired content.
5. Set Up Image Generation: Configure the Generate Image node with the appropriate prompt and image settings (e.g., size, quality, style).
6. Configure HTTP Requests for Media Upload: Set up the Upload image and Set Image nodes to use the WordPress REST API for uploading and assigning the featured image (see the sketch below).
7. Map Data for Google Sheets Update: In the Update Sheet node, map the relevant fields (e.g., title, post ID, date) to the appropriate columns in the Google Sheets document.
8. Test and Activate the Workflow: Run the workflow manually to ensure all steps execute correctly. Once verified, activate the workflow for automated execution.

**Overall purpose of the workflow**
This workflow automates the creation of SEO-friendly content for a WordPress blog. Starting from a prompt extracted from a Google Sheets document, it generates an article, a title, and an image, publishes the post on WordPress, and updates the Google Sheets document with the details of the created post. This process is useful for blog managers who want to automate content creation and publishing.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
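To make the WordPress steps concrete, here is a hedged sketch of the three REST calls the workflow performs (create draft, upload image, set featured image), assuming a Node.js runtime and WordPress Application Password credentials. The site URL and credential names are placeholders.

```typescript
// Rough sketch of the WordPress publishing steps via the REST API.
// Site URL and credentials are placeholders; assumes a Node.js runtime.
const WP = "https://example.com/wp-json/wp/v2";
const auth = "Basic " + Buffer.from("USERNAME:APP_PASSWORD").toString("base64");

async function createDraftPost(title: string, contentHtml: string) {
  const res = await fetch(`${WP}/posts`, {
    method: "POST",
    headers: { Authorization: auth, "Content-Type": "application/json" },
    body: JSON.stringify({ title, content: contentHtml, status: "draft" }),
  });
  return res.json(); // contains the new post's id
}

async function uploadImage(imageBuffer: Buffer, filename: string) {
  const res = await fetch(`${WP}/media`, {
    method: "POST",
    headers: {
      Authorization: auth,
      "Content-Type": "image/png",
      "Content-Disposition": `attachment; filename="${filename}"`,
    },
    body: imageBuffer,
  });
  return res.json(); // contains the media item's id
}

async function setFeaturedImage(postId: number, mediaId: number) {
  // Attach the uploaded media item as the post's featured image
  await fetch(`${WP}/posts/${postId}`, {
    method: "POST",
    headers: { Authorization: auth, "Content-Type": "application/json" },
    body: JSON.stringify({ featured_media: mediaId }),
  });
}
```

In n8n these calls correspond to the Create post on WordPress, Upload image, and Set Image nodes respectively.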
by Brian
This template automates posting to Instagram Business and Facebook Pages using the Meta Graph API. It supports both short-lived and long-lived tokens, with a secure approach using System User tokens for reliable, ongoing automation. Includes detailed guidance for authentication, token refresh logic, and API use.

**Features:**
- 📸 Publish to Instagram via /media + /media_publish (see the sketch below)
- 📘 Post to Facebook Pages via /photos
- 🔐 Long-lived token support via Meta Business System User
- ♻️ Token refresh support using staticData in n8n
- 🧠 In-line sticky note instructions

**Use Cases:**
- Schedule and publish branded social media content
- Automate marketing flows with CRM + social sync
- Empower internal teams or clients to post without manual steps

Tags: Instagram, Facebook, Meta Graph API, Social Media, Token Refresh, Long-Lived Token, Marketing Automation, System User
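The sketch below outlines the two-step Instagram publish flow the template uses (/media, then /media_publish), assuming an Instagram Business account ID and a valid long-lived access token. The Graph API version and variable names are illustrative choices.

```typescript
// Minimal sketch of the Instagram publish flow: create a media container, then publish it.
// Account ID, token, and API version are placeholders.
const GRAPH = "https://graph.facebook.com/v19.0";

async function publishInstagramPhoto(
  igUserId: string,
  accessToken: string,
  imageUrl: string,
  caption: string,
) {
  // Step 1: POST /{ig-user-id}/media creates a container for the image
  const createRes = await fetch(`${GRAPH}/${igUserId}/media`, {
    method: "POST",
    body: new URLSearchParams({ image_url: imageUrl, caption, access_token: accessToken }),
  });
  const { id: creationId } = await createRes.json();

  // Step 2: POST /{ig-user-id}/media_publish publishes the container
  const publishRes = await fetch(`${GRAPH}/${igUserId}/media_publish`, {
    method: "POST",
    body: new URLSearchParams({ creation_id: creationId, access_token: accessToken }),
  });
  return publishRes.json(); // contains the published media ID
}
```

Facebook Page posts follow the same pattern with a single POST to /{page-id}/photos using the Page access token.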
by Zacharia Kimotho
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**What does this workflow do?**
This workflow speeds up the analysis of the top-ranking titles and meta descriptions, identifying the patterns and styles that will help you rank on Google for a given keyword.

**How does it work?**
We provide a keyword of interest in a Google Sheet. When executed, the workflow scrapes the top 10 pages using the Bright Data SERP API, analyses the style and patterns of the top-ranking pages, and generates a new title and meta description.

**Technical setup**
- Make a copy of this Google Sheet
- Add your desired keywords to the relevant cell/row
- Set your Bright Data credentials
- Update the zone to your preset zone
- The results are returned as JSON. You can change this in the URL https://www.google.com/search?q={{ $json.search_term .replaceAll(" ", "+")}}&start=0&brd_json=1 by removing the brd_json=1 query parameter (see the sketch below)
- Store the generated results on the duplicated sheet
- Run the workflow

**Setting up the SERP scraper in Bright Data**
- On Bright Data, go to the Proxies & Scraping tab
- Under SERP API, create a new zone
- Give it a suitable name and description (the default is serp_api)
- Add this to your account
- Add your credentials as a header credential and rename it to "Bright Data API"
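For context, here is a minimal sketch of what the SERP request looks like outside of n8n, assuming Bright Data's direct API access endpoint for a SERP zone. The endpoint, zone name, and the organic/title/description field names are assumptions and may differ from your account's setup.

```typescript
// Hedged sketch of the SERP fetch through a Bright Data SERP zone.
// Zone name and token are placeholders; brd_json=1 asks Google results to be parsed as JSON.
async function fetchSerp(searchTerm: string, apiToken: string, zone = "serp_api") {
  const url = `https://www.google.com/search?q=${encodeURIComponent(searchTerm)}&start=0&brd_json=1`;
  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiToken}`, "Content-Type": "application/json" },
    body: JSON.stringify({ zone, url, format: "raw" }), // "raw" returns the target's JSON body
  });
  const serp = await res.json();
  // Keep only what the analysis needs: ranking titles and meta descriptions.
  // Field names ("organic", "title", "description", "link") are assumptions.
  return (serp.organic ?? []).slice(0, 10).map((r: any) => ({
    title: r.title,
    description: r.description,
    link: r.link,
  }));
}
```

The trimmed list of titles and descriptions is what the AI step analyses for patterns before writing a new title and meta description.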
by Shahrear
📜 AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

**⚙️ What This Workflow Does**
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores, alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

**🧩 Requirements**
- **Google Drive OAuth2** for monitoring and downloads
- **VLM Run API credentials** with Execute Agent access
- **Google Sheets OAuth2** for structured record storage
- **Google Calendar OAuth2** for key date reminders
- **Slack API credentials** for team notifications
- A reachable Webhook URL (for receiving parsed contract data)

**⚡ Quick Setup**
1. Configure Google Drive OAuth2 and create an upload folder plus a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.
4. Configure Google Sheets and Calendar. For Google Sheets, pick your Google Sheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row (this will add new contract details as new rows). Turn on Map Each Column Manually and match each contract field (like Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
5. Configure Slack for notifications.

**⚙️ How It Works**
1. Monitor Contract Uploads: Watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
2. Download Contract File: Automatically downloads new contracts for AI analysis.
3. VLM Run ContractParser: Sends the file to the VLM Run Execute Agent, which extracts structured contract data, including Contract ID, Title, Parties (with roles), Property address, Effective date, Termination date, Rent, deposit, payment terms, and governing law.
4. Receive Contract Data: The webhook endpoint receives the structured JSON response.
5. Format Contract Data: Normalizes fields, formats dates, and prepares for storage.
6. Save to Expense Database (Google Sheets): Appends extracted data to a master Google Sheet for centralized contract tracking.
7. Notify via Slack: Posts a concise summary to a Slack channel, showing key contract details for visibility.
8. Create Calendar Events: Automatically schedules Google Calendar events for the Effective Date, the Termination Date, and a Renewal Reminder (60 days before termination); a sketch of this step follows below.

**💡 Why Use This Workflow**
Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
- **Zero missed deadlines**: automatic Google Calendar reminders keep your team on track.
- **Instant team visibility**: Slack notifications keep legal, finance, and operations aligned.
- **End-to-end automation**: no need for manual parsing, data entry, or follow-ups.

**🧠 Perfect For**
- Legal teams automating contract intake and tracking
- Real estate or lease management workflows
- Finance or procurement teams needing expiration alerts
- Organizations centralizing contract metadata in Sheets

**🛠️ How to Customize**
- Modify Extraction Fields: Edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
- Change Storage: Swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
- Customize Notifications: Send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
- Add Calendar Events: Auto-create events for reviews or payment milestones using extra date fields.
- Add Approvals or Signatures: Insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

**⚠️ Community Node Disclaimer**
This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
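As a small illustration of the Create Calendar Events step, the sketch below derives the renewal reminder date (60 days before termination) from the parsed contract data. The interface fields mirror the extracted metadata but are illustrative; in the workflow this logic would live in the Format Contract Data step feeding the Calendar node.

```typescript
// Sketch of deriving calendar entries from parsed contract data.
// Field names are illustrative and mapped from the webhook payload.
interface ContractDates {
  contractId: string;
  title: string;
  effectiveDate: string;   // ISO date, e.g. "2026-01-01"
  terminationDate: string; // ISO date
}

function buildCalendarEntries(c: ContractDates) {
  const termination = new Date(c.terminationDate);
  const reminder = new Date(termination);
  reminder.setDate(reminder.getDate() - 60); // renewal reminder 60 days before termination

  const isoDay = (d: Date) => d.toISOString().slice(0, 10);
  // Each entry maps onto one Google Calendar event created by the Calendar node
  return [
    { summary: `${c.title} (${c.contractId}): effective`, date: isoDay(new Date(c.effectiveDate)) },
    { summary: `${c.title} (${c.contractId}): terminates`, date: isoDay(termination) },
    { summary: `${c.title} (${c.contractId}): renewal reminder`, date: isoDay(reminder) },
  ];
}
```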
by Anir Agram
🛡️📥 Telegram Invoice Agent → 🔎 OCR → 🤖 AI Parsing → 📄 Google Sheets + 🗂️ Drive

**What this workflow does**
- 🤖 Captures invoices from Telegram and auto-downloads PDFs/images.
- 🔎 Runs OCR, then uses AI to structure clean invoice fields.
- 📄 Appends parsed data to a Google Sheets "Invoice Database."
- 🗂️ Uploads the original file to Google Drive with a neat name.
- 💬 Sends a friendly Telegram summary with totals, due date, notes, and link.

**Why it's useful**
- ⚡ Faster bookkeeping with zero manual copy-paste.
- 🧱 Consistent schema for reliable reporting and pivots.
- 👥 Team-friendly drop-and-log via Telegram.
- 🧩 Easy to extend with approvals, ERP/CRM sync, or vendor routing.

**How it works**
1. 📲 Telegram Trigger → file received.
2. 🌐 HTTP OCR (OCR.space) → text extracted (see the sketch below).
3. 🤖 AI Agent → maps to strict JSON schema.
4. 📄 Google Sheets → appends structured row.
5. 🗂️ Google Drive → saves original invoice.
6. 💬 Telegram → concise confirmation and links.

**What you'll need**
- 🤖 Telegram Bot token.
- 🔑 OCR API key (OCR.space: free tier; upgrade for volume/accuracy).
- 🔐 Google OAuth for Sheets + Drive.
- 🧠 LLM account (e.g., Gemini/OpenAI-compatible).

**Setup steps**
1. 🔗 Connect credentials: Telegram, Google, OCR, AI.
2. 📄 Prepare Sheet columns: Invoice Number, Date, Total Amount ($), Billing Address, Due Date, Notes.
3. 🧭 Update sheet ID and Drive folder ID.
4. 🧪 Test: send a sample invoice and validate OCR, AI output, row append, and Drive link.

**Customization ideas**
- 🎯 Higher-accuracy OCR: swap to Google Vision.
- 📊 Line items: extract into a second tab for analytics.
- ✅ Approvals: add Telegram keyboard confirmation before write.
- 🧯 Robustness: IF/Retry on empty OCR; user prompt to retake photo.

**Who it's for**
- 🧑‍💻 Freelancers/agencies needing fast invoice intake via Telegram.
- 🧾 Small finance teams wanting a searchable ledger with links to originals.
- 🏗️ Builders extending to ERPs/CRMs and custom accounting flows.

Want help customizing? 📧 anirpoke@gmail.com 🔗 Linkedin
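Here is a hedged sketch of the OCR step, assuming the Telegram file URL is already known and an OCR.space API key is available. The optional parameters shown (table mode, engine choice) are reasonable defaults, not necessarily what the workflow ships with.

```typescript
// Rough sketch of the OCR.space call used to extract invoice text.
// File URL and API key are placeholders; optional params are assumptions.
async function ocrInvoice(fileUrl: string, apiKey: string): Promise<string> {
  const form = new URLSearchParams({
    apikey: apiKey,
    url: fileUrl,
    isTable: "true", // helps with invoice-style tabular layouts
    OCREngine: "2",
  });
  const res = await fetch("https://api.ocr.space/parse/image", { method: "POST", body: form });
  const data = await res.json();
  if (data.IsErroredOnProcessing) {
    throw new Error(String(data.ErrorMessage ?? "OCR failed"));
  }
  // Concatenate text from all parsed pages before handing it to the AI Agent
  return (data.ParsedResults ?? []).map((r: any) => r.ParsedText).join("\n");
}
```

The returned plain text is what the AI Agent then maps into the strict JSON schema before the Sheets append.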
by Davide
This workflow streamlines your WooCommerce product creation process by integrating directly with Google Sheets. Simply input product details into your spreadsheet, and the workflow takes care of the rest, automatically creating new products on your WooCommerce store with inventory management. But it doesn't stop there: a dedicated SEO expert chain analyzes each product's content and generates optimized meta titles and meta descriptions for the Yoast SEO plugin, enhancing visibility and ranking potential on search engines.

**Key Benefits:**
- 🔄 Automation: No more manual uploads; save time and reduce errors by syncing Google Sheets directly with WooCommerce.
- ⚡ Speed: Instantly publish multiple products with just one action.
- 🧠 Built-in SEO Intelligence: Automatically generate SEO-friendly meta titles and descriptions tailored to each product.
- 📈 Improved Search Visibility: Boost your store's traffic with optimized product listings.
- 🧩 Customizable: Easily adapt the workflow to your specific needs or integrate with other platforms.

**How It Works**
This workflow automates the creation of WooCommerce products and generates optimized SEO meta tags (title and description) using AI. Here's the step-by-step process:
1. **Data Retrieval**: The workflow starts by fetching product details (title, category, description, price, etc.) from a Google Sheets document.
2. **Product Creation**: Each product is created in WooCommerce using the retrieved data, including categories, pricing, stock details, and images (see the sketch below).
3. **AI-Powered SEO Optimization**: An AI model (Google Gemini via OpenRouter) analyzes the product details and generates SEO-optimized meta titles (≤60 chars) and meta descriptions (≤160 chars).
4. **Meta Tag Assignment**: The generated meta tags are saved back to the Google Sheets and applied to the WooCommerce product using Yoast SEO metadata.
5. **Completion Tracking**: The workflow marks completed entries in Google Sheets and sends a Telegram notification upon finishing all products.

**Set Up Steps**
Before running the workflow, ensure the following steps are completed:
1. Install the Yoast SEO plugin on WordPress and add the provided PHP code to functions.php to enable meta tag API support.
2. Enable the WooCommerce REST API in WordPress and configure the Telegram node with a valid CHAT_ID for notifications.
3. Prepare a Google Sheet with product data (columns A-I in specific formats) and share its ID in the workflow. Ensure columns B, E, and F are in text format, and column I is numeric.

Once set up, the workflow can be triggered manually or scheduled to run automatically, streamlining product creation and SEO optimization.

**Who is it useful for?**
Ideal for eCommerce managers, digital marketers, or anyone managing large product catalogs; this workflow turns your spreadsheet into a powerful product launcher.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
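The sketch below shows roughly what the product-creation call looks like against the WooCommerce REST API with the Yoast meta fields attached, assuming REST API keys and the functions.php tweak mentioned above that exposes Yoast metadata. The meta keys shown are Yoast's default post meta keys; the store URL, credentials, and parameter names are placeholders.

```typescript
// Hedged sketch of the WooCommerce product creation with Yoast SEO meta data.
// Store URL, credentials, and the exposed meta keys depend on your setup.
const STORE = "https://example.com";
const auth = "Basic " + Buffer.from("CONSUMER_KEY:CONSUMER_SECRET").toString("base64");

async function createProduct(p: {
  name: string; description: string; price: string;
  categoryId: number; imageUrl: string; stock: number;
  metaTitle: string; metaDescription: string;
}) {
  const res = await fetch(`${STORE}/wp-json/wc/v3/products`, {
    method: "POST",
    headers: { Authorization: auth, "Content-Type": "application/json" },
    body: JSON.stringify({
      name: p.name,
      description: p.description,
      regular_price: p.price,
      categories: [{ id: p.categoryId }],
      images: [{ src: p.imageUrl }],
      manage_stock: true,
      stock_quantity: p.stock,
      meta_data: [
        { key: "_yoast_wpseo_title", value: p.metaTitle },        // Yoast meta title
        { key: "_yoast_wpseo_metadesc", value: p.metaDescription }, // Yoast meta description
      ],
    }),
  });
  return res.json(); // contains the new product's id and permalink
}
```

In the workflow, one such request is issued per spreadsheet row, and the returned product ID is written back to the sheet for completion tracking.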
by Harshil Agrawal
This workflow allows you to insert and retrieve data from a table in Stackby.

- **Set node**: The Set node is used to set the values for the name and id fields for a new record. You might want to add data from an external source, for example an API or a CRM. Based on your use case, add the respective node before the Set node and configure your Set node accordingly.
- **Stackby node**: This node appends data from the previous node to a table in Stackby. Based on the values you want to add to your table, enter the column names in the Column field.
- **Stackby1 node**: This node fetches all the data that is stored in the table in Stackby.
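For clarity, a minimal sketch of the item the Set node emits is shown below; the example values are made up, and the keys must match the column names configured in the Stackby node's Column field.

```typescript
// Minimal sketch of the item shape produced by the Set node and appended by the Stackby node.
// Values are illustrative; keys must match the Column field in the Stackby node.
const newRecord = {
  name: "Jane Doe",
  id: "12345",
};
// The Stackby node appends one row per incoming item; the follow-up node ("Stackby1")
// then lists every row stored in the same table.
```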
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks customer satisfaction scores across multiple platforms and surveys to help improve customer experience and identify areas for enhancement. It saves you time by eliminating the need to manually check different feedback sources and provides comprehensive satisfaction analytics.

**Overview**
This workflow automatically scrapes customer satisfaction surveys, review platforms, and feedback forms to extract satisfaction scores and sentiment data. It uses Bright Data to access various feedback platforms without being blocked and AI to intelligently analyze satisfaction trends and identify improvement opportunities (a small aggregation sketch follows after this entry).

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping satisfaction surveys and review platforms without being blocked
- **OpenAI**: AI agent for intelligent satisfaction analysis and trend identification
- **Google Sheets**: For storing satisfaction scores and generating analytics reports

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your satisfaction tracking spreadsheet
5. Customize: Define feedback sources and satisfaction metrics you want to monitor

**Use Cases**
- **Customer Experience**: Monitor satisfaction trends across all customer touchpoints
- **Product Teams**: Identify product features that impact customer satisfaction
- **Support Teams**: Track satisfaction scores for support interactions
- **Management**: Get comprehensive satisfaction reporting for strategic decisions

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #customersatisfaction #satisfactionscores #brightdata #webscraping #customerexperience #n8nworkflow #workflow #nocode #satisfactiontracking #csat #nps #customeranalytics #feedbackanalysis #customerinsights #satisfactionmonitoring #experiencemanagement #customermetrics #satisfactionsurveys #feedbackautomation #customerfeedback #satisfactiondata #customerjourney #experienceanalytics #satisfactionreporting #customersentiment #experienceoptimization #satisfactiontrends #customervoice
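As an illustration of the kind of analytics the tracking sheet supports, the sketch below computes an average CSAT and an NPS score from collected feedback rows. The row shape is an assumption about how you might lay out the spreadsheet, not the template's fixed schema.

```typescript
// Illustrative aggregation over rows collected in the satisfaction tracking sheet.
// The FeedbackRow shape is an assumption, not the template's fixed schema.
interface FeedbackRow {
  source: string;    // e.g. "support-survey", "app-store-review"
  csat?: number;     // 1-5 satisfaction rating, if present
  npsScore?: number; // 0-10 "would you recommend" rating, if present
}

function summarize(rows: FeedbackRow[]) {
  const csat = rows.map(r => r.csat).filter((v): v is number => v !== undefined);
  const nps = rows.map(r => r.npsScore).filter((v): v is number => v !== undefined);

  const avgCsat = csat.length ? csat.reduce((a, b) => a + b, 0) / csat.length : null;
  // NPS = % promoters (9-10) minus % detractors (0-6)
  const promoters = nps.filter(v => v >= 9).length;
  const detractors = nps.filter(v => v <= 6).length;
  const npsValue = nps.length ? ((promoters - detractors) / nps.length) * 100 : null;

  return { avgCsat, nps: npsValue, sampleSize: rows.length };
}
```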
by Priya Jain
This workflow provides an OAuth 2.0 auth token refresh process for better control. Developers can utilize it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. In this template, I've used the Pipedrive API, but you can apply it to any app that requires the authorization_code grant for token access. This resolves the issue of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.

**What you need to replicate this**
- Your database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also employ a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
- A client app for your chosen application that you want to access via the API.

**After duplicating the template**
1. Add credentials to your database and connect the DB nodes in all 3 workflows.
2. Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database."
3. Open your client app and follow the Pipedrive instructions to authenticate.
4. Click on Install and test. This will save your initial refresh token and access token to the database.

Please watch the YouTube video for a detailed demonstration of the workflow.

**How it operates**
- **Workflow 1**: Captures the authorization_code, generates the access_token and refresh_token, and saves the tokens to the database.
- **Workflow 2**: Your primary workflow, which fetches or posts data to/from your application. Note the logic that includes an IF condition when an error occurs with an invalid token; this triggers the third workflow to refresh the token.
- **Workflow 3**: Handles the token refresh (see the sketch below). Remember to send the unique ID to the webhook so it can fetch the necessary tokens from your table.

Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
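For reference, here is a minimal sketch of the refresh call Workflow 3 performs against Pipedrive's OAuth token endpoint, assuming the client ID/secret come from your Pipedrive client app and the refresh token is read from your database table. Variable names are illustrative.

```typescript
// Sketch of the OAuth 2.0 refresh-token exchange against Pipedrive.
// Client ID/secret and the stored refresh token are placeholders.
async function refreshPipedriveToken(clientId: string, clientSecret: string, refreshToken: string) {
  const res = await fetch("https://oauth.pipedrive.com/oauth/token", {
    method: "POST",
    headers: {
      Authorization: "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64"),
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: new URLSearchParams({ grant_type: "refresh_token", refresh_token: refreshToken }),
  });
  if (!res.ok) throw new Error(`Token refresh failed: ${res.status}`);
  const token = await res.json();
  // Persist token.access_token, token.refresh_token, and an expiry timestamp
  // (now + token.expires_in) back to your tokens table for the next run.
  return token;
}
```

Workflow 2's IF branch on an "invalid token" error is what triggers this exchange, after which the failed request can be retried with the fresh access token.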
by Paul-François GORIAUX
This workflow acts as your personal AI-powered analyst for Meta Ads. It's pretty straightforward:

1. First, it grabs a list of Facebook Ad Library URLs you want to check out from a Google Sheet.
2. Then, it automatically scrapes the active ads from those pages.
3. Here's the cool part: it sends each ad's image and text to Google Gemini, which analyzes it like an expert marketer would (a sketch of this step follows below).
4. Finally, Gemini's full analysis (strengths, weaknesses, actionable suggestions, and a performance score) gets dropped neatly into another Google Sheet for you.

**Set up steps**
You should be ready to roll in about 5 minutes. There are no complex configurations; you just need to:
- Connect your accounts: The workflow has placeholders waiting for your credentials for Google (for Sheets and the Gemini API) and ScrapingFlash.
- Link your Google Sheets: Just point the first Google Sheets node to the sheet with your URLs, and tell the last node where you want to save the results.

All the nitty-gritty details and expressions are explained in the sticky notes inside the workflow itself!
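To give a feel for the analysis step, here is a hedged sketch of sending one ad creative (image plus text) to the Gemini API and asking for a structured critique. The model name, prompt wording, and output fields are illustrative choices, not the exact ones used in the workflow.

```typescript
// Rough sketch of the Gemini analysis step for one scraped ad.
// Model, prompt, and output field names are illustrative assumptions.
async function analyzeAd(apiKey: string, adText: string, imageBase64: string) {
  const url = `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${apiKey}`;
  const prompt =
    "You are an expert Meta Ads marketer. Analyze this ad and reply with JSON containing " +
    "strengths, weaknesses, actionable_suggestions, and a performance_score from 0 to 100.";

  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{
        parts: [
          { text: `${prompt}\n\nAd copy:\n${adText}` },
          { inline_data: { mime_type: "image/jpeg", data: imageBase64 } },
        ],
      }],
      generationConfig: { responseMimeType: "application/json" }, // ask for JSON output
    }),
  });
  const data = await res.json();
  // With JSON output requested, the first candidate's text should parse directly
  return JSON.parse(data.candidates[0].content.parts[0].text);
}
```

The parsed object maps cleanly onto the columns of the results sheet the last Google Sheets node writes to.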