by Askan
What problem does this solve?
It fetches LinkedIn profiles based on a keyword and location via Google search and stores them in an Excel file for download and in a NocoDB database. It avoids costly services and is meant to be n8n beginner friendly. It uses serpapi.com to avoid being blocked by Google Search and to make the results easier to process.

What does it do?
- Based on the criteria you enter, it searches for LinkedIn profiles
- It discards unnecessary data and turns the follower count into a real number
- The output is provided as an Excel table for download and in a NocoDB database

How does it do it?
- Based on the criteria input, it uses serpapi.com to run a Google search for the matching LinkedIn profiles
- With OpenAI.com the name of the respective company is added
- With OpenAI.com the follower count, e.g. "300+", is turned into a real number: 300
- All unnecessary metadata is discarded
- An Excel file is created as output
- The output is also stored in a nocodb.com table

Step-by-step instructions
1. Import the workflow: copy the workflow JSON from the "Template Code" section below and import it into n8n via "Import from File" or "Import from URL".
2. Set up a free account at serpapi.com and get API credentials to enable good Google search results.
3. Set up an API account at openai.com and get an API key.
4. Set up a nocodb.com account (or self-host) and get the API credentials.
5. Create the credentials for serpapi.com, openai.com and nocodb.com in n8n.
6. Set up a table in NocoDB with the fields indicated in the note above the NocoDB node.
7. Follow the instructions in the notes above the individual nodes.
8. When the workflow is finished, open the Excel node and click download if you need the Excel file.
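The template uses OpenAI for the follower-count conversion, but the underlying idea is simple enough to illustrate in an n8n Code node. A minimal sketch, assuming the raw Google snippet exposes a string such as "300+ followers" in a field named `followers` (the field name is hypothetical, not the template's exact schema):

```javascript
// n8n Code node (Run Once for All Items) - illustrative sketch only.
// Assumes each item carries a raw string like "300+ followers" in json.followers.
const out = [];
for (const item of $input.all()) {
  const raw = String(item.json.followers ?? '');
  const digits = raw.replace(/[^\d]/g, ''); // "300+ followers" -> "300", "1,200+" -> "1200"
  out.push({
    json: { ...item.json, followerCount: digits ? parseInt(digits, 10) : null },
  });
}
return out;
```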
by Jay Hartley
Disclaimer
This template only works on n8n local instances!

How it Works
This workflow allows you to receive webhooks from the public web and have your local workflow catch them, without any remote proxy. It is very useful for running quick tests without exposing your dev server. All you have to do is activate the workflow and use the public address as defined below.

Set up steps
If you use the default key-value storage, there are only three steps:
1. Install the @horka.tv/n8n-nodes-storage-kv community node.
2. Put your n8n workflow address in Local Webhook Address.
3. Activate the workflow and, from Executions, note down your public webhook token from the inputs to Get Latest Requests.

You can now use https://webhook.site/[YOUR TOKEN] as a webhook destination to receive webhook requests from the public web.
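Once the workflow is active, any HTTP client can hit the public address and your local instance picks the request up on its next poll. A minimal test sketch (run it in any environment with fetch and async support, e.g. an n8n Code node); YOUR-TOKEN is a placeholder for the token noted in step 3:

```javascript
// Send a test webhook to the public address; your local n8n fetches it afterwards.
const token = 'YOUR-TOKEN'; // the token from the Get Latest Requests inputs
await fetch(`https://webhook.site/${token}`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ event: 'test', sentAt: new Date().toISOString() }),
});
```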
by Marcel Claus-Ahrens
This automation syncs your invoice PDFs from Stripe to an (AWS) S3 bucket each month, into a folder of your choice, with the following subPath: yourFolder/invoiceYear/invoiceMonth/fileName

Fill in your credentials and settings in the nodes marked with "*". You can adjust this workflow to your needs. You can also override the year and month in the ENV* node for manual syncs. It will sync every invoice PDF whose created date is greater than the provided year and month; the day is automatically set to the first day of the desired month.

Enjoy the Workflow! ❤️ https://let-the-work-flow.com Workflow Automation & Development
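A minimal sketch of the key-building and date-cutoff logic described above, written as an n8n Code node. The field names (created, fileName) and the folder prefix are illustrative assumptions, not the workflow's exact node output:

```javascript
// n8n Code node - illustrative sketch of the S3 subPath and the created-date cutoff.
const year = 2024, month = 5;                            // values from the ENV* node
const cutoff = new Date(Date.UTC(year, month - 1, 1));   // first day of the desired month

const created = new Date($json.created * 1000);          // Stripe timestamps are unix seconds
const include = created >= cutoff;                       // only sync invoices created after the cutoff

const key = `yourFolder/${created.getUTCFullYear()}/${created.getUTCMonth() + 1}/${$json.fileName}`;
return [{ json: { include, key } }];
```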
by Zacharia Kimotho
Remember when you were doing some large research and wanted to quickly bookmark a page and save it, only to find premium options? Worry not; n8n has you covered. You can now create a simple bookmarking app straight in your browser using small scripts called bookmarklets. A bookmarklet is a bookmark stored in a web browser that contains JavaScript commands that add new features to the browser.

To create one, we need to add a short script to the bookmark bar of our browser, like below. A simple hack is to open a new tab and click on the star that appears on the right side of the address bar.

Now that we have our bookmark, it's time for the fun part. Right-click on the bookmark we just created and select the edit option. This lets you set the name you want for your bookmark and the destination URL. The URL used here will be the script that "captures" the page we want to bookmark. The code below has been used and tested to work for this example:

```javascript
javascript:(() => {
  var currentUrl = window.location.href;
  var webhookUrl = 'https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c';
  var xhr = new XMLHttpRequest();
  xhr.open('POST', webhookUrl, true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  var data = JSON.stringify({ url: currentUrl });
  xhr.send(data);
})();
```

Your bookmark should look something like this.

Now that we have this set up, we go to n8n to receive the data sent by this script. Create a new Webhook node that receives the POST request as in the workflow, and replace $yourN8nInstanceUrl with your actual n8n instance URL. This workflow can then be configured to send the data to a Notion database. Make sure the Notion database has all the required permissions before executing the workflow; otherwise the URLs will not be saved.
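If you also want to capture the page title alongside the URL, a variant of the bookmarklet using fetch could look like the sketch below. This is a hypothetical extension, not part of the tested template, so the Webhook node and Notion mapping would need the extra field added:

```javascript
javascript:(() => {
  // Variant: also send the page title, using fetch instead of XMLHttpRequest.
  fetch('https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: window.location.href, title: document.title }),
  });
})();
```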
by Mohsin Ali
1. Document Ingestion & Processing
Google Drive Trigger monitors for new files → Loop Over Items processes each file → File Info extracts metadata → Google Drive downloads the actual content → Switch routes to the appropriate extractor (PDF or TEXT) based on file type

2. Content Transformation & Chunking
Document Data node processes the extracted text → Recursive Splitter breaks the content into contextual chunks → Chunk Splitting applies intelligent segmentation while preserving document context and the relationships between chunks

3. Embedding & Storage
Basic LLM Chain processes the chunks → OpenAI Chat Model generates contextual understanding → Summarize creates document summaries → Supabase Vector Store saves embeddings with metadata → Embeddings OpenAI creates vector representations → Default Data Loader handles storage operations

4. Query Processing & Retrieval
When Clicking Execute triggers user queries → OpenAI processes and understands the question → AI Agent orchestrates hybrid search (combining vector similarity + keyword matching) → Google Gemini Chat Model generates final responses using the retrieved context → HTTP Request handles additional external data sources
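To make the chunking step concrete, here is a conceptual sketch of what a recursive character splitter with overlap does. The Recursive Splitter node in the workflow handles this for you; the chunk size, overlap, and separator list below are illustrative assumptions, not the node's exact internals:

```javascript
// Conceptual sketch of recursive character splitting with overlap (not the node's actual code).
function splitText(text, chunkSize = 1000, overlap = 200, separators = ['\n\n', '\n', ' ', '']) {
  if (text.length <= chunkSize) return [text];
  // Use the coarsest separator that actually appears in the text.
  const sep = separators.find((s) => s === '' || text.includes(s)) ?? '';
  const pieces = sep === '' ? text.split('') : text.split(sep);

  const chunks = [];
  let current = '';
  for (const piece of pieces) {
    const candidate = current ? current + sep + piece : piece;
    if (candidate.length > chunkSize && current) {
      chunks.push(current);
      // Carry a small overlap into the next chunk so context survives the boundary.
      current = current.slice(-overlap) + sep + piece;
    } else {
      current = candidate;
    }
  }
  if (current) chunks.push(current);

  // Any chunk still too large is split again with the next, finer separator.
  return chunks.flatMap((c) =>
    c.length > chunkSize ? splitText(c, chunkSize, overlap, separators.slice(1)) : [c]
  );
}

// Example: splitText(longDocumentText) -> array of ~1000-character chunks with ~200-character overlap.
```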
by Davide
This workflow streamlines your WooCommerce product creation process by integrating directly with Google Sheets. Simply input product details into your spreadsheet, and the workflow takes care of the rest, automatically creating new products on your WooCommerce store with inventory management. But it doesn't stop there: a dedicated SEO expert chain analyzes each product's content and generates optimized meta titles and meta descriptions for the Yoast SEO plugin, enhancing visibility and ranking potential on search engines.

Key Benefits:
- 🔄 Automation: No more manual uploads. Save time and reduce errors by syncing Google Sheets directly with WooCommerce.
- ⚡ Speed: Instantly publish multiple products with just one action.
- 🧠 Built-in SEO Intelligence: Automatically generate SEO-friendly meta titles and descriptions tailored to each product.
- 📈 Improved Search Visibility: Boost your store's traffic with optimized product listings.
- 🧩 Customizable: Easily adapt the workflow to your specific needs or integrate with other platforms.

How It Works
This workflow automates the creation of WooCommerce products and generates optimized SEO meta tags (title and description) using AI. Here's the step-by-step process:
1. **Data Retrieval**: The workflow starts by fetching product details (title, category, description, price, etc.) from a Google Sheets document.
2. **Product Creation**: Each product is created in WooCommerce using the retrieved data, including categories, pricing, stock details, and images.
3. **AI-Powered SEO Optimization**: An AI model (Google Gemini via OpenRouter) analyzes the product details and generates SEO-optimized meta titles (≤60 chars) and meta descriptions (≤160 chars).
4. **Meta Tag Assignment**: The generated meta tags are saved back to the Google Sheet and applied to the WooCommerce product using Yoast SEO metadata.
5. **Completion Tracking**: The workflow marks completed entries in Google Sheets and sends a Telegram notification once all products are processed.

Set Up Steps
Before running the workflow, ensure the following steps are completed:
1. **Step 1**: Install the Yoast SEO plugin on WordPress and add the provided PHP code to functions.php to enable meta tag API support.
2. **Step 2**: Enable the WooCommerce REST API in WordPress and configure the Telegram node with a valid CHAT_ID for notifications.
3. **Step 3**: Prepare a Google Sheet with product data (columns A-I in specific formats) and share its ID in the workflow. Ensure columns B, E, and F are in text format, and column I is numeric.

Once set up, the workflow can be triggered manually or scheduled to run automatically, streamlining product creation and SEO optimization. A sketch of how the meta-tag length limits can be enforced follows below.

Who is it useful for?
Ideal for eCommerce managers, digital marketers, or anyone managing large product catalogs: this workflow turns your spreadsheet into a powerful product launcher.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
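The ≤60/≤160 character limits are enforced by the AI prompt, but it can be worth guarding them in a Code node before writing back to the sheet. A minimal sketch, assuming the AI output arrives in fields named metaTitle and metaDescription (hypothetical names, not the template's exact mapping):

```javascript
// n8n Code node - trims over-long meta tags at a word boundary (illustrative only).
function clamp(text, max) {
  if (!text || text.length <= max) return text ?? '';
  const cut = text.slice(0, max);
  const lastSpace = cut.lastIndexOf(' ');
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut).trim();
}

const out = [];
for (const item of $input.all()) {
  out.push({
    json: {
      ...item.json,
      metaTitle: clamp(item.json.metaTitle, 60),              // Yoast-recommended title length
      metaDescription: clamp(item.json.metaDescription, 160), // recommended description length
    },
  });
}
return out;
```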
by Angel Menendez
Have you ever wanted to throttle Plex when connecting remotely to your server? Well, here is the script for you! The instructions to deploy are below.

You will need:
- A Plex server with Plex Pass (for webhooks)
- n8n running locally (either in Docker or via the desktop app)
- qBittorrent with the WebUI enabled

Begin by installing n8n by visiting n8n.io. You can install the desktop version or the Docker version, whichever works best for you; I'm doing this on my desktop version of n8n. Copy the code from this page into your n8n canvas. You should see the script appear before your eyes.

From there, double-click on the Webhook node at the beginning of the script. Copy both the Test and Production URLs that appear there. Now make your way to Plex and visit your settings. On the left, you should see the Webhooks option if you have Plex Pass. This will set up your triggers.

Next, visit your qBittorrent instance and enable the WebUI. Note down your username, password, and port. You will also need to know the IP of the machine that qBittorrent is running on. If you have an iPhone, you can connect to the same wireless network as your computer and use the Fing app to scan the network for the IP.

Open up the script and edit the Global Variables to reflect the values you copied. Hit Save at the top right, and then activate the script. Enjoy!!
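For context, the throttling itself goes through qBittorrent's WebUI API. A minimal sketch of the equivalent calls, based on the qBittorrent 4.1+ Web API v2 as I understand it; verify the endpoints against your version's documentation, and replace the host, port, and credentials with your own values:

```javascript
// Illustrative sketch of throttling qBittorrent via its Web API v2 (not the template's exact nodes).
const base = 'http://192.168.1.50:8080';           // IP of the qBittorrent machine and its WebUI port
const creds = new URLSearchParams({ username: 'admin', password: 'yourPassword' });

// 1. Log in; qBittorrent returns an SID cookie on success.
//    If CSRF protection is enabled you may also need a Referer header matching the WebUI host.
const login = await fetch(`${base}/api/v2/auth/login`, { method: 'POST', body: creds });
const cookie = login.headers.get('set-cookie');

// 2. Set a global upload limit in bytes per second (0 removes the limit again).
await fetch(`${base}/api/v2/transfer/setUploadLimit`, {
  method: 'POST',
  headers: { cookie },
  body: new URLSearchParams({ limit: String(1024 * 1024) }), // ~1 MiB/s while Plex streams remotely
});
```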
by Peter
Read a value by key from a local JSON file. Related workflow: WriteKey

1. Create a subfolder in your n8n home dir: /home/node/.n8n/local-files. In Docker, look at the data path and create a subfolder local-files. Set the correct ownership: chown 1000:1000 local-files.
2. Put the workflow code in a new workflow named GetKey.
3. Create another workflow with a Function Item node:

```javascript
return {
  file: '/4711.json', // 4711 should be your workflow id
  key: 'MyKey',
  default: 'Optional returned value if key is empty / not exists'
};
```

4. Pipe the Function Item node into an Execute Workflow node that calls the GetKey workflow.

It would be nice if we could someday get a shiny built-in n8n node that does the job. :)
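The GetKey workflow itself ships with the template; conceptually it does something like the sketch below (an illustration, not the template's exact nodes). Note that on self-hosted n8n, using built-in modules like fs inside a Function/Code node requires the NODE_FUNCTION_ALLOW_BUILTIN environment variable to permit it:

```javascript
// Conceptual sketch of the GetKey lookup in a Function/Code node with filesystem access.
// Requires NODE_FUNCTION_ALLOW_BUILTIN=fs (or *) on a self-hosted instance.
const fs = require('fs');

const basePath = '/home/node/.n8n/local-files';
const { file, key, default: fallback } = $json; // fields passed in by the calling workflow

let data = {};
try {
  data = JSON.parse(fs.readFileSync(basePath + file, 'utf8'));
} catch (e) {
  // A missing or unreadable file simply falls back to the default value.
}

const value = data[key];
return [{ json: { key, value: value === undefined || value === '' ? fallback : value } }];
```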
by Shahrear
📜 AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

⚙️ What This Workflow Does
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores records, sends alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

🧩 Requirements
- Google Drive OAuth2 for monitoring and downloads
- VLM Run API credentials with Execute Agent access
- Google Sheets OAuth2 for structured record storage
- Google Calendar OAuth2 for key date reminders
- Slack API credentials for team notifications
- A reachable webhook URL (for receiving parsed contract data)

⚡ Quick Setup
1. Configure Google Drive OAuth2 and create an upload folder and a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.
4. Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row; this will add new contract details as new rows. Turn on Map Each Column Manually and match each contract field (like Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
5. Configure Slack for notifications.

⚙️ How It Works
1. Monitor Contract Uploads – Watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
2. Download Contract File – Automatically downloads new contracts for AI analysis.
3. VLM Run ContractParser – Sends the file to the VLM Run Execute Agent, which extracts structured contract data, including: contract ID, title, parties (with roles), property address, effective date, termination date, rent, deposit, payment terms, and governing law.
4. Receive Contract Data – The webhook endpoint receives the structured JSON response.
5. Format Contract Data – Normalizes fields, formats dates, and prepares the record for storage (see the sketch after this section).
6. Save to Expense Database (Google Sheets) – Appends the extracted data to a master Google Sheet for centralized contract tracking.
7. Notify via Slack – Posts a concise summary to a Slack channel, showing key contract details for visibility.
8. Create Calendar Events – Automatically schedules Google Calendar events for the effective date, the termination date, and a renewal reminder (60 days before termination).

💡 Why Use This Workflow
Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
- Zero missed deadlines: automatic Google Calendar reminders keep your team on track.
- Instant team visibility: Slack notifications keep legal, finance, and operations aligned.
- End-to-end automation: no need for manual parsing, data entry, or follow-ups.

🧠 Perfect For
- Legal teams automating contract intake and tracking
- Real estate or lease management workflows
- Finance or procurement teams needing expiration alerts
- Organizations centralizing contract metadata in Sheets

🛠️ How to Customize
- Modify Extraction Fields: Edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
- Change Storage: Swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
- Customize Notifications: Send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
- Add Calendar Events: Auto-create events for reviews or payment milestones using extra date fields.
- Add Approvals or Signatures: Insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

⚠️ Community Node Disclaimer
This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
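A minimal sketch of the kind of logic the Format Contract Data step performs, assuming the parsed JSON exposes effective_date and termination_date as ISO-style date strings (the field names are assumptions, not the agent's guaranteed schema):

```javascript
// n8n Code node - illustrative date normalization and renewal-reminder calculation.
const c = $json; // parsed contract data received from the webhook

const effective = new Date(c.effective_date);
const termination = new Date(c.termination_date);

// Renewal reminder: 60 days before the termination date.
const renewalReminder = new Date(termination);
renewalReminder.setDate(renewalReminder.getDate() - 60);

const toDateString = (d) => (isNaN(d) ? null : d.toISOString().slice(0, 10));

return [{
  json: {
    ...c,
    effective_date: toDateString(effective),
    termination_date: toDateString(termination),
    renewal_reminder: toDateString(renewalReminder),
  },
}];
```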
by Davide
This workflow is designed to generate SEO-friendly content with DeepSeek R1 (or V3), publish it on WordPress, and update a Google Sheets document with the details of the created post. Below is a detailed analysis of what each node in the workflow does.

How It Works
1. Triggering the Workflow: The workflow starts with a Manual Trigger node, which is activated when the user clicks "Test workflow" in the n8n interface.
2. Fetching Data: The Get Ideas node retrieves data from a Google Sheets document. It reads a specific sheet and filters the data based on the "ID POST" column, returning the first matching row.
3. Setting the Prompt: The Set your prompt node extracts the PROMPT field from the Google Sheets data and assigns it to a variable for use in subsequent steps.
4. Generating Content: The Generate article node uses an AI model (DeepSeek) to create an SEO-friendly article based on the prompt. The article includes an introduction, 2-3 chapters, and a conclusion, formatted in HTML. The Generate title node uses the same AI model to generate a concise, SEO-optimized title for the article.
5. Publishing on WordPress: The Create post on WordPress node creates a new draft post on WordPress using the generated title and article content.
6. Generating and Uploading an Image: The Generate Image node creates a photorealistic image based on the article title using an AI model (OpenAI). The Upload image node uploads the generated image to WordPress as a media file, and the Set Image node assigns the uploaded image as the featured image for the WordPress post.
7. Updating Google Sheets: The Update Sheet node updates the original Google Sheets document with the post details, including the title, post ID, creation date, and row number.

Set Up Steps
1. Configure Google Sheets integration: Set up the Google Sheets node to connect to your Google account and specify the document ID and sheet name to read from and update.
2. Set up AI models: Configure the OpenAI nodes (for generating the article, title, and image) with the appropriate API credentials and model settings (e.g., deepseek-reasoner for text generation).
3. Configure WordPress integration: Set up the WordPress node with your WordPress site's API credentials to allow creating posts and uploading media.
4. Define the prompt and content structure: In the Set your prompt node, ensure the prompt variable is correctly mapped to the data from Google Sheets. In the Generate article and Generate title nodes, define the instructions for the AI model to generate the desired content.
5. Set up image generation: Configure the Generate Image node with the appropriate prompt and image settings (e.g., size, quality, style).
6. Configure HTTP requests for media upload: Set up the Upload image and Set Image nodes to use the WordPress REST API for uploading and assigning the featured image.
7. Map data for the Google Sheets update: In the Update Sheet node, map the relevant fields (e.g., title, post ID, date) to the appropriate columns in the Google Sheets document.
8. Test and activate the workflow: Run the workflow manually to ensure all steps execute correctly. Once verified, activate the workflow for automated execution.

Overall purpose of the workflow
This workflow automates the creation of SEO-friendly content for a WordPress blog. Starting from a prompt extracted from a Google Sheets document, it generates an article, a title, and an image, publishes the post on WordPress, and updates the Google Sheets document with the details of the created post.
This process is useful for blog managers who want to automate content creation and publishing. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
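For reference, the media upload and featured-image assignment in steps 5-6 map onto two standard WordPress REST API calls. A minimal sketch, assuming Application Password authentication; the site URL, credentials, post ID, and image data are placeholders:

```javascript
// Illustrative sketch of the calls behind "Upload image" and "Set Image" (WordPress REST API).
const site = 'https://example.com';
const auth = 'Basic ' + Buffer.from('username:application-password').toString('base64');
const postId = 123;                        // id returned by the Create post on WordPress node
const imageBuffer = Buffer.from([]);       // binary image data from the Generate Image step (placeholder)

// 1. Upload the generated image to the media library.
const media = await fetch(`${site}/wp-json/wp/v2/media`, {
  method: 'POST',
  headers: {
    Authorization: auth,
    'Content-Type': 'image/png',
    'Content-Disposition': 'attachment; filename="featured.png"',
  },
  body: imageBuffer,
}).then((r) => r.json());

// 2. Assign the uploaded media as the featured image of the draft post.
await fetch(`${site}/wp-json/wp/v2/posts/${postId}`, {
  method: 'POST',
  headers: { Authorization: auth, 'Content-Type': 'application/json' },
  body: JSON.stringify({ featured_media: media.id }),
});
```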
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically discovers and collects information about Stack Overflow user profiles for lead generation. It saves you time by eliminating the need to manually browse through developer profiles and provides a centralized database of potential leads with their technical expertise.

Overview
This workflow automatically scrapes Stack Overflow user profiles and extracts key information like developer names, locations, reputation scores, and technical tags. It uses Bright Data to access Stack Overflow without being blocked and AI to intelligently parse user data into a structured format.

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Bright Data: For scraping Stack Overflow user profiles without being blocked
- OpenAI: AI agent for intelligent data extraction and parsing
- Google Sheets: For storing and organizing lead information

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and specify the target spreadsheet
5. Customize: Adjust the Stack Overflow URL and user criteria you want to target

Use Cases
- Recruitment Teams: Find developers with specific technical skills for hiring
- Business Development: Identify potential clients or partners in the tech industry
- Sales Teams: Build targeted outreach lists for developer-focused products
- Research: Gather data on developer communities and skill distributions

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #stackoverflow #leadgeneration #brightdata #webscraping #developers #recruitment #businessdevelopment #salesleads #n8nworkflow #workflow #nocode #leadautomation #developerscraping #techtalent #userprofiles #aiautomation #datamining #prospecting #outreach #techrecruiting #developerleads #stackoverflowscraping #profilescraping #leadcollection #techcommunity #developerdatabase #automatedleads #intelligentscraping
by Simon
This n8n workflow simplifies the process of removing backgrounds from images stored in Google Drive. By leveraging the PhotoRoom API, this template enables automatic background removal, padding adjustments, and output formatting, all while storing the updated images back in a designated Google Drive folder. This workflow is very useful for companies or individuals that spend a lot of time removing the background from product images.

How it Works
1. The workflow begins with a Google Drive Trigger node that monitors a specific folder for new image uploads.
2. Upon detecting a new image, the workflow downloads the file and extracts essential metadata, such as the file size.
3. Configurations are set for background color, padding, output size, and more, which are all customizable to match specific requirements.
4. The PhotoRoom API is called to process the image by removing its background and adding padding based on the settings.
5. The processed image is saved back to Google Drive in the specified output folder with an updated name indicating the background has been removed.

Requirements
- PhotoRoom API key
- Google Drive API access

Customizing the Workflow
- Easily adjust the background color, padding, and output size using the configuration node.
- Modify the output folder path in Google Drive or replace Google Drive with another storage service if needed.
- For advanced use cases, integrate further image processing steps, such as adding captions or analyzing content using AI.
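For orientation, the core API call made by the workflow's HTTP Request node looks roughly like the sketch below. The endpoint and field names reflect PhotoRoom's background-removal (segmentation) API as I understand it; the padding and background-color options mentioned above are passed via the configuration node, so verify all parameters against PhotoRoom's current documentation before relying on this:

```javascript
// Illustrative sketch of a PhotoRoom background-removal request (not the template's exact node).
const imageBuffer = Buffer.from([]); // binary image downloaded from Google Drive (placeholder)

const form = new FormData();
form.append('image_file', new Blob([imageBuffer], { type: 'image/jpeg' }), 'product.jpg');

const response = await fetch('https://sdk.photoroom.com/v1/segment', {
  method: 'POST',
  headers: { 'x-api-key': 'YOUR_PHOTOROOM_API_KEY' },
  body: form,
});

// The API returns the cut-out image; the workflow then saves it back to Google Drive
// under a new name indicating the background has been removed.
const resultPng = Buffer.from(await response.arrayBuffer());
```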