by Yar Malik (Asfandyar)
Who’s it for

This workflow is designed for researchers, content creators, and AI agents who need to quickly scrape structured web data and capture full-page screenshots for further use. It’s especially useful for automating competitive research, news monitoring, and content curation.

How it works

The workflow uses the Firecrawl API integrated with n8n to perform web searches and return results in structured formats (Markdown and screenshots). It includes:

- A search agent that transforms natural language queries into Firecrawl-compatible search strings.
- HTTP requests to retrieve results from specific sites (e.g., YouTube, news outlets) or across the web.
- Automatic capture of full-page screenshots alongside structured text.
- Integration with the OpenAI Chat Model for enhanced query handling.

How to set up

1. Import this workflow into your n8n instance.
2. Add and configure your Firecrawl API credentials.
3. Add your OpenAI credentials for natural language query parsing.
4. Trigger the workflow via the included chat input, or modify it to run on a schedule.

Requirements

- A Firecrawl account with an active API key.
- An n8n self-hosted or cloud instance.
- An OpenAI account if you want to enhance search queries.

How to customize the workflow

- Update the search queries to focus on your preferred sites or keywords.
- Adjust the number of results with the limit parameter.
- Extend the workflow to store screenshots in Google Drive, Notion, or your database.
- Replace the chat trigger with any other event trigger (webhook, schedule, etc.).
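The search agent's job is to turn a natural-language request into a Firecrawl-compatible search string. Below is a minimal sketch of that transformation, assuming a hand-maintained map of source names to `site:` filters; both the map and the function are illustrative, not the workflow's actual agent (which uses the OpenAI Chat Model for this step):

```javascript
// Hypothetical mapping of source names to site: filters.
const SITE_FILTERS = {
  youtube: "site:youtube.com",
  reuters: "site:reuters.com",
};

// Turn a natural-language query into a site-scoped search string.
function toFirecrawlQuery(naturalQuery) {
  const lower = naturalQuery.toLowerCase();
  for (const [name, filter] of Object.entries(SITE_FILTERS)) {
    if (lower.includes(name)) {
      // Strip the source name from the query and prepend the site filter.
      const cleaned = naturalQuery
        .replace(new RegExp(name, "i"), "")
        .replace(/\s+/g, " ")
        .trim();
      return `${filter} ${cleaned}`;
    }
  }
  return naturalQuery.trim();
}

console.log(toFirecrawlQuery("latest AI agent videos on YouTube"));
```

An LLM-backed agent can handle far messier phrasing than this lookup table, but the output it hands to the Firecrawl HTTP request follows the same shape.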
by Msaid Mohamed el hadi
🔍 AI-Powered Website Prompt Executor (Apify + OpenRouter)

This workflow combines the power of Apify and OpenRouter to scrape website content and execute any custom prompt using AI. You define what you want — whether it’s extracting contact details, summarizing content, collecting job offers, or anything else — and the system intelligently processes the site to give you results.

🚀 Overview

This workflow allows you to:

- Input a URL and define a prompt.
- Scrape the specified number of pages from the website.
- Process each page’s metadata and Markdown content.
- Use AI to interpret and respond to the prompt on each page.
- Aggregate and return structured output.

🧠 How It Works

Input Example

```json
{
  "enqueue": true,
  "maxPages": 5,
  "url": "https://apify.com",
  "method": "GET",
  "prompt": "collect all contact information available on this website"
}
```

Workflow Steps

| Step | Action |
| ---- | ------ |
| 1 | Triggered by another workflow with JSON input. |
| 2 | Calls the Apify actor firescraper-ai-website-content-markdown-scraper to scrape content. |
| 3 | Loops through the scraped pages. |
| 4 | AI analyzes each page based on the input prompt. |
| 5 | Aggregates AI outputs across all pages. |
| 6 | Final AI processing step to return a clean structured result. |

🛠 Technologies Used

- **Apify** – Scrapes structured content and Markdown from websites.
- **OpenRouter** – Provides access to advanced AI models like Gemini.
- **LangChain** – Handles AI agent orchestration and prompt interpretation.

🔧 Customization

Customize the workflow via the following input fields:

- url: Starting point for scraping
- maxPages: Limit the number of pages to crawl
- prompt: Define any instruction (e.g., “summarize this website,” “extract product data,” “list all emails,” etc.)

This allows dynamic, flexible use across various use cases.
📦 Output

The workflow returns a JSON result that includes:

- Processed prompt responses from each page
- Aggregated AI insights
- A structured, machine-readable format

🧪 Example Use Cases

- 🔍 Extracting contact information from websites
- 📄 Summarizing articles or company profiles
- 🛍️ Collecting product information
- 📋 Extracting job listings or news
- 📬 Generating outreach lists from public data
- 🤖 Used as a tool within other AI agents for real-time web analysis
- 🧩 Integrated as an external tool in MCP (Model Context Protocol) servers to enhance AI capabilities

🔐 API Credentials Required

You will need:

- **Apify API token** – For running the scraper actor
- **OpenRouter API key** – For AI-powered prompt processing

Set these credentials in your environment or n8n credential manager before running.
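Steps 5 and 6 of the workflow merge the per-page AI answers before the final cleanup pass. A minimal sketch of that aggregation, assuming each page result carries a `url` and an `answer` field (both names are assumptions, not the workflow's actual schema):

```javascript
// Merge per-page AI responses into one structured object.
function aggregatePages(pageResults) {
  return {
    pagesProcessed: pageResults.length,
    // Per-page answers, kept alongside their source URLs.
    results: pageResults.map((p) => ({ url: p.url, answer: p.answer })),
    // Concatenated text that the final AI pass would clean up.
    combined: pageResults.map((p) => p.answer).join("\n"),
  };
}

const out = aggregatePages([
  { url: "https://apify.com", answer: "contact@apify.com" },
  { url: "https://apify.com/pricing", answer: "No contacts found." },
]);
console.log(out.pagesProcessed); // 2
```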
by Yuki Hirota
Automated Meeting Recording Transcription & Minutes Distribution Workflow

Managing meeting recordings manually—downloading audio, transcribing it, summarizing key points, saving documents, and notifying the team—quickly becomes repetitive and inefficient. This workflow eliminates all of those manual steps by automatically detecting new audio files uploaded to a designated Google Drive folder, converting them into high-quality transcripts using OpenAI, summarizing them into structured meeting minutes, transforming the content into a text file, uploading it back to Google Drive, and finally notifying a Chatwork room with the completed summary. What used to take hours can now be completed automatically within minutes, ensuring consistency, accuracy, and faster information sharing.

Who’s it for

This workflow is ideal for:

- Teams that need high-quality, client-ready meeting minutes generated automatically
- Project managers who require accurate summaries, decision tracking, and action items without manual effort
- Cross-functional teams handling multiple meetings and requiring structured, searchable documentation
- Organizations using Google Meet, Zoom, or Teams where recordings must be turned into polished minutes
- Anyone who wants a consistent, AI-assisted system that analyzes discussions, extracts insights, and formats them professionally

By leveraging an advanced Meeting Minutes Generation System—capable of key-point extraction, noise reduction, speaker/topic organization, and review support—this workflow ensures that every meeting is transformed into a clean, structured, and highly usable document.

How it works

1. Audio file upload triggers the workflow. When a new recording is uploaded to the designated Google Drive folder, the Google Drive Trigger immediately activates and begins processing.
2. The audio file is downloaded. The file is retrieved from Google Drive and prepared in binary format for accurate transcription.
3. AI-powered transcription. The audio is sent to OpenAI’s transcription engine, producing a complete and highly accurate transcript of the meeting.
4. Generate structured, client-ready meeting minutes. The transcript is processed by a specialized Meeting Minutes Generation System powered by a multi-step prompt. Instead of using a fixed template, the system intelligently analyzes the transcription and automatically generates a professionally structured document using the following capabilities:
   - Extraction of key points while removing irrelevant conversation
   - Organization of content by speaker, topic, and logical flow
   - Automatic construction of headings and document structure
   - Draft generation based on client-facing writing standards
   - Review-support logic that allows refinement and improved readability
   - Task-based orchestration (ingestion → key-point extraction → draft generation → review → final approval)

   Because the system dynamically determines the optimal structure, the resulting minutes adapt to the content of each meeting rather than following a rigid set of categories. If certain information cannot be derived from the transcript, the system will appropriately leave it out rather than forcing placeholder sections.
5. Convert the minutes into a file. The structured minutes are converted into a .txt or .docx document, ready for submission or archival.
6. Upload the finalized document. The completed meeting minutes are uploaded to a specific Google Drive folder and saved using a timestamped filename.
7. Notify Chatwork. A formatted notification—including the summarized content—is automatically posted to a Chatwork room, ensuring immediate team visibility.

How to set up

1. Import the workflow into your n8n environment.
2. Authenticate Google Drive and select the folder to monitor for new recordings.
3. Connect your OpenAI API keys for both transcription and structured minutes generation.
4. Specify the Google Drive folder where the finished documents should be stored.
5. Add your Chatwork API token and room ID for automated notifications.
6. Upload a sample audio file to confirm the full end-to-end pipeline works correctly.

Requirements

- n8n instance (cloud or self-hosted)
- Google Drive account with appropriate folder permissions
- OpenAI API credentials
- Chatwork API token
- Supported audio formats such as mp3, wav, and m4a

How to customize the workflow

- Modify the minutes-generation prompt to reflect your organization’s preferred format
- Add Slack, Teams, or Discord notifications in addition to Chatwork
- Route different types of meetings to different folders or templates
- Save the transcript and structured minutes separately for compliance or analysis
- Log metadata or decisions into Google Sheets or project management tools
- Store minutes in a vector database to enable semantic search across past meetings
- Attach the final document directly as a file in Chatwork
- Extend the system to support revision cycles, reviewer comments, or approval workflows
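Step 6 of the workflow saves the document under a timestamped filename. A small sketch of how such a name might be built; the `meeting-minutes-` prefix and `.txt` extension are assumptions, not the workflow's actual naming scheme:

```javascript
// Build a timestamped filename like meeting-minutes-20240105-0930.txt
function minutesFilename(date = new Date()) {
  const pad = (n) => String(n).padStart(2, "0");
  const stamp =
    `${date.getFullYear()}${pad(date.getMonth() + 1)}${pad(date.getDate())}` +
    `-${pad(date.getHours())}${pad(date.getMinutes())}`;
  return `meeting-minutes-${stamp}.txt`;
}

console.log(minutesFilename(new Date(2024, 0, 5, 9, 30)));
// meeting-minutes-20240105-0930.txt
```

In n8n this would typically live in a small Code node, or directly as an expression on the Google Drive upload node's filename field.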
by Vadim
This workflow automates the process of generating stylized product photos for e-commerce by combining real product shots with creative templates. It enables the creation of a complete set of images for an SKU from a single product photo and a set of reusable templates. The workflow uses Google Gemini (Nano Banana) for image editing and Airtable as the data source.

Example use case

An apparel brand can use this workflow to turn plain product photos (e.g., socks on a white background) into lifestyle images that match their brand aesthetic. By combining each product photo with predefined templates and reference images, the workflow generates a variety of stylized results automatically - ready for marketing or online stores.

How it works

This workflow expects the following Airtable table setup:

- **"Product Images"** - contains original product photos, one per record.
- **"Reference Images"** - contains reference images for templates, one per record.
- **"Templates"** - contains reusable generation templates. Each template includes a text prompt and up to three reference images.
- **"Jobs"** - contains batch generation jobs. Each job references multiple product images and multiple templates.
- **"Results"** - contains the generated outputs. Each result includes a generated image; references to the job, product image, and template; and a status field (pending, approved, rejected).

The workflow is triggered by a webhook that receives a job ID from Airtable. It then:

1. Fetches the job record.
2. Retrieves the associated product images and templates (each with its text prompt and reference images).
3. Downloads all required product and reference images.
4. For each product-template combination, sends these images and the prompt to Google Gemini to generate new AI-edited images.
5. Saves the generated images back into Airtable.

NOTE: A separate workflow should handle the human-in-the-loop approval process and any regeneration of rejected results.
Requirements

- Airtable Personal Access Token
- Google Gemini API key

Setup

1. Ensure all required Airtable tables exist.
2. Configure parameters in the parameters node:
   - Set the Airtable Base ID
   - Set the ID of the attachment field in the "Results" table (where the generated images will be uploaded)
3. Configure credentials for all Airtable nodes.
4. Set the Google Gemini API key for the "Generate..." nodes.
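The fan-out in step 4 pairs every product image with every template before calling Gemini. A minimal sketch of that pairing; the record shapes are assumed from the table setup above, not Airtable's actual field names:

```javascript
// Expand one job into a list of product-template generation tasks.
function buildGenerationTasks(job) {
  const tasks = [];
  for (const product of job.productImages) {
    for (const template of job.templates) {
      tasks.push({
        jobId: job.id,
        productImage: product,
        prompt: template.prompt,
        referenceImages: template.referenceImages, // up to three per template
        status: "pending", // matches the Results table's status field
      });
    }
  }
  return tasks;
}

const tasks = buildGenerationTasks({
  id: "recJob1",
  productImages: ["sock-white-bg.png"],
  templates: [
    { prompt: "Lifestyle shot on a wooden floor", referenceImages: ["ref1.png"] },
    { prompt: "Flat lay with autumn leaves", referenceImages: ["ref2.png", "ref3.png"] },
  ],
});
console.log(tasks.length); // 2 (1 product x 2 templates)
```

One product photo and N templates therefore yield N pending results per job, which is what makes a single photo enough to populate a full SKU image set.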
by Soumya Sahu
This workflow turns a Google Sheet into a fully automated content calendar for BlueSky. It handles single posts, multi-post threads, and image attachments, allowing you to manage your entire social presence from a simple spreadsheet.

Who is this for

Ideal for social media managers, content creators, and growth marketers who want to schedule content in bulk without using expensive third-party tools.

What it does

It runs on a schedule to check your Google Sheet for posts marked "Ready." It automatically handles:

- **Threading:** Links posts together if they share a Thread ID and Sequence.
- **Images:** Downloads image URLs and uploads them as blobs to BlueSky.
- **Status Updates:** Marks rows as "Posted" and saves the live URL back to your sheet.

How to set up

1. Google Sheet: Create a sheet with these columns: Content, Thread ID, Sequence (use '1' for single posts), Image URL, Scheduled Time, Status, Post Link.
   - Important: Even if it is a single post (not a thread), you must add a unique Thread ID.
   - Note: Image URL is optional.
2. Format: Set the "Scheduled Time" column type to Plain Text to prevent date errors. (A sample Google Sheet link is provided inside the workflow notes.)
3. Credentials: Enter your BlueSky Handle and App Password in the "Configuration" node.
4. Select Sheet: In both the "Get row(s)" and "Update row" nodes, select your specific Google Sheet.

🚀 The BlueSky Growth Suite

This workflow is part of a 3-part automation suite designed to help you grow on BlueSky:

- **Part 1: Post Scheduler** (This template)
- **Part 2: Analytics Tracker** (Track likes/reposts back to Sheets)
- **Part 3: Lead Magnet Bot** (Auto-DM users who reply to your posts)
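The threading behavior described above can be sketched as a small grouping-and-sorting step: group "Ready" rows by Thread ID, then order each group by Sequence so replies attach to the right parent. The column names match the sheet, but the function itself is illustrative rather than the workflow's exact code:

```javascript
// Group "Ready" rows into threads, each ordered by Sequence.
function buildThreads(rows) {
  const threads = {};
  for (const row of rows.filter((r) => r.Status === "Ready")) {
    (threads[row["Thread ID"]] ??= []).push(row);
  }
  for (const id of Object.keys(threads)) {
    threads[id].sort((a, b) => Number(a.Sequence) - Number(b.Sequence));
  }
  return threads;
}

const threads = buildThreads([
  { Content: "Part 2", "Thread ID": "t1", Sequence: "2", Status: "Ready" },
  { Content: "Part 1", "Thread ID": "t1", Sequence: "1", Status: "Ready" },
  { Content: "Solo post", "Thread ID": "t2", Sequence: "1", Status: "Posted" },
]);
console.log(threads.t1.map((r) => r.Content)); // [ 'Part 1', 'Part 2' ]
```

This also shows why a single post still needs its own unique Thread ID: a thread of length one is just a group with a single row.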
by Evoort Solutions
📈 YouTube Trend Finder Workflow using n8n & RapidAPI

Description: Easily discover trending YouTube videos by country and language using this automated n8n workflow. The flow leverages the YouTube Trend Finder API and logs insights to Google Sheets — ideal for content creators, marketers, and researchers.

🔗 Node-by-Node Explanation

| Node Name | Type | Description |
| --- | --- | --- |
| 1. On form submission | Form Trigger | Captures user input for country and language through a web form. |
| 2. Trend Finder API Request | HTTP Request | Sends a request to the YouTube Trend Finder API with the form data. |
| 3. Re format output | Code | Extracts and reshapes API response data like title, link, and tags. |
| 4. Google Sheets | Google Sheets | Appends the trending video data into a structured spreadsheet. |

🎯 Use Cases

- 🔍 Content Research: Find top-trending videos in any region or language for idea inspiration.
- 📈 Marketing Intelligence: Track video trends to tailor your video marketing strategy.
- 📰 Trend Monitoring: Journalists and analysts can quickly surface viral video topics.

✅ Benefits of this Workflow

- **No Coding Required:** Easy-to-use form interface for non-technical users.
- **Real-Time Trends:** Instantly access trending YouTube content with the **YouTube Trend Finder API**.
- **Automated Logging:** Stores data directly in Google Sheets for future analysis or sharing.
- **Customizable:** Easily modify for more inputs like video category or max results, or add filters.

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and grow your YouTube presence effortlessly!
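The "Re format output" Code node reshapes the raw API response into rows for Google Sheets. An illustrative version is below; the response shape (a `videos` array with `videoId`, `title`, and `tags`) is an assumption for the sketch, not the API's documented schema:

```javascript
// Flatten the trend API response into sheet-ready rows.
function reformatTrends(apiResponse) {
  return (apiResponse.videos || []).map((v) => ({
    title: v.title,
    link: `https://www.youtube.com/watch?v=${v.videoId}`,
    tags: (v.tags || []).join(", "),
  }));
}

const rows = reformatTrends({
  videos: [{ videoId: "abc123", title: "Demo", tags: ["demo", "test"] }],
});
console.log(rows[0].link); // https://www.youtube.com/watch?v=abc123
```

In the actual Code node, each returned object would be wrapped as `{ json: row }` so the Google Sheets node can append one spreadsheet row per item.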
by Sk developer
Active Job Scraper Workflow Using RapidAPI Jobs Search Realtime Data API

This powerful Active Job Scraper workflow uses the RapidAPI Jobs Search Realtime Data API to fetch real-time job listings from leading job boards like Indeed, LinkedIn, ZipRecruiter, and Glassdoor.

Overview

Leverage the Jobs Search Realtime Data API on RapidAPI to gather fresh job data from Indeed, LinkedIn, ZipRecruiter, and Glassdoor. This n8n workflow lets you:

- Search jobs by location, keywords, job type, and remote options across these major platforms.
- Collect detailed job information including descriptions and metadata.
- Automatically save the scraped results into Google Sheets for easy tracking and analysis.

Why Choose This Workflow?

By integrating the RapidAPI Jobs Search Realtime Data API, you can scrape job listings from the most popular job sites—Indeed, LinkedIn, ZipRecruiter, and Glassdoor—all in one place. Customize your search parameters and get results tailored to your needs.

Workflow Components

| Node | Description |
| --- | --- |
| Form Trigger | Collects input such as location, search term, job type, and remote status. |
| HTTP Request | Calls the RapidAPI Jobs Search Realtime Data API to fetch jobs from Indeed, LinkedIn, ZipRecruiter, and Glassdoor. |
| Code Node | Processes and formats the API response data. |
| Google Sheets | Appends the extracted job listings to your spreadsheet. |

🔑 How to Get an API Key from the Jobs Search Realtime Data API

Follow these steps to get your API key and start using it in your workflow:

1. Visit the API page: 👉 Click here to open Jobs Search Realtime Data API on RapidAPI.
2. Log in or sign up: Use your Google, GitHub, or email account to sign in. If you're new, complete a quick sign-up.
3. Subscribe to a pricing plan: Go to the Pricing tab on the API page, select a plan (free or paid, depending on your needs), and click Subscribe.
4. Access your API key: Navigate to the Endpoints tab, look for the X-RapidAPI-Key under Request Headers, and copy the value shown — this is your API key.
5. Use the key in your workflow: In your n8n workflow (HTTP Request node), replace

   ```
   "x-rapidapi-key": "your key"
   ```

   with:

   ```
   "x-rapidapi-key": "YOUR_ACTUAL_API_KEY"
   ```
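The Code node that processes and formats the API response might look like the following sketch. The input field names (`title`, `company`, `location`, `isRemote`, `url`) are assumptions for illustration, not the API's documented schema:

```javascript
// Map raw job objects onto the spreadsheet's column names.
function toSheetRows(jobs) {
  return jobs.map((job) => ({
    Title: job.title,
    Company: job.company,
    Location: job.location,
    Remote: job.isRemote ? "Yes" : "No",
    Link: job.url,
  }));
}

const rows = toSheetRows([
  {
    title: "Data Analyst",
    company: "Acme",
    location: "Austin, TX",
    isRemote: true,
    url: "https://example.com/job/1",
  },
]);
console.log(rows[0].Remote); // Yes
```

Keeping the column names in one place like this makes it easy to adjust the Google Sheets mapping when the sheet's headers change.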
by GYEONGJUN CHAE
Who is this for

This template is essential for Remote Operations Managers, HR Teams, and Project Leads managing distributed teams across different countries. It prevents scheduling conflicts by automatically flagging when a regional team is out of office and identifying when multiple teams are off simultaneously.

What it does

Stop manually Googling "Is it a holiday in Berlin today?" This workflow automates your team availability calendar:

- It triggers on a weekly schedule.
- It takes your team's locations (e.g., KR, MX) and a "Lookahead" range (e.g., 50 days).
- It fetches official public holidays for both the current and next year using the Nager.Date API (to ensure year-end holidays aren't missed).
- It filters the results to find only holidays occurring within your defined lookahead window.
- It compares dates across countries to identify "Shared Holidays" (dates where multiple teams are off).
- It logs these holidays into a Notion database and notifies the team via Slack, specifically highlighting if a holiday is shared.

How to set up

1. Notion: Create a database with the properties Name (Title), Date (Date), and Shared Countries (Text).
2. Slack: Connect your Slack account in the credentials.
3. Configuration:
   - Define Team Countries: Enter the 2-letter country codes (e.g., "KR", "US") in the Set node.
   - Define Days to Lookahead: Set how many days into the future you want to check (default is 50).
4. Add to Notion: Select your Database ID.

How to customize

- **Filter Logic:** The "Filter Upcoming" node handles the date logic. You can modify this to check for specific holiday types (e.g., exclude "Optional" holidays).
- **Shared Logic:** The "Find Shared Holidays" node calculates overlaps. You can adjust the JavaScript here if you want to change the format of the shared message sent to Slack.
- **Destinations:** Swap the Notion node for Google Calendar to block off time directly.
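The overlap calculation in the "Find Shared Holidays" node can be sketched as follows. The item shape follows the Nager.Date response (`date`, `name`, `countryCode`); the function itself is illustrative, not the node's exact code:

```javascript
// Index holidays by date and keep only dates where multiple countries are off.
function findSharedHolidays(holidays) {
  const byDate = {};
  for (const h of holidays) {
    (byDate[h.date] ??= []).push(h.countryCode);
  }
  return Object.entries(byDate)
    .filter(([, countries]) => countries.length > 1)
    .map(([date, countries]) => ({ date, countries }));
}

const shared = findSharedHolidays([
  { date: "2024-12-25", name: "Christmas Day", countryCode: "US" },
  { date: "2024-12-25", name: "Navidad", countryCode: "MX" },
  { date: "2024-03-01", name: "Independence Movement Day", countryCode: "KR" },
]);
console.log(shared); // [ { date: '2024-12-25', countries: [ 'US', 'MX' ] } ]
```

The resulting `countries` list is what the Slack message highlights when a holiday is shared across teams.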
by automedia
Scheduled YouTube Transcription with Duplicate Prevention

Who's It For?

This template is for advanced users, content teams, and data analysts who need a robust, automated system for capturing YouTube transcripts. It’s ideal for those who monitor multiple channels and want to ensure they only process and save each video's transcript once.

What It Does

This is an advanced, "set-it-and-forget-it" workflow that runs on a daily schedule to monitor YouTube channels for new content. It enhances the basic transcription process by connecting to your Supabase database to prevent duplicate entries. The workflow fetches all recent videos from the channels you track, filters out any that are too old, and then checks your database to see if a video's transcript has already been saved. Only brand-new videos are sent for transcription via the youtube-transcript.io API, with the final data (title, URL, full transcript, author) being saved back to your Supabase table.

Requirements

- A Supabase account with a table to store video data. This table must have a column for the source_url to enable duplicate checking.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track.

How to Set Up

1. Set Your Time Filter: In the "Max Days" node, set the number of days you want to look back for new videos (e.g., 7 for the last week).
2. Add Channel IDs: In the "Channels To Track" node, replace the example YouTube Channel IDs with the ones you want to monitor.
3. Configure API Credentials: Select the "Get Transcript from API" node. In the credentials tab, create a new "Header Auth" credential. Name it youtube-transcript-io and paste your API key into the "Value" field. The "Name" field should be x-api-key.
4. Connect Your Supabase Account: This workflow uses Supabase in two places: "Check if URL Is In Database" and "Add to Content Queue Table". You must configure your Supabase credentials in both nodes. In each node, select your target table and ensure the columns are mapped correctly.
5. Adjust the Schedule: The "Schedule Trigger" node is set to run once a day. Click it to adjust the time and frequency to your needs.
6. Activate the Workflow: Save your changes and toggle the workflow to Active.
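The duplicate check against the source_url column boils down to a set-membership filter once the existing URLs have been fetched from Supabase. A minimal sketch, assuming each video item carries a `source_url` field as described above:

```javascript
// Keep only videos whose source_url is not already in the database.
function filterNewVideos(videos, existingUrls) {
  const seen = new Set(existingUrls);
  return videos.filter((v) => !seen.has(v.source_url));
}

const fresh = filterNewVideos(
  [
    { title: "New video", source_url: "https://youtu.be/aaa" },
    { title: "Old video", source_url: "https://youtu.be/bbb" },
  ],
  ["https://youtu.be/bbb"] // URLs already present in the Supabase table
);
console.log(fresh.map((v) => v.title)); // [ 'New video' ]
```

Only the items that survive this filter are sent to the youtube-transcript.io API, which is what keeps the workflow from paying for (or storing) the same transcript twice.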
by Naveen Choudhary
Automate LinkedIn Sales Navigator contact extraction to Google Sheets

This workflow scrapes LinkedIn Sales Navigator search results and automatically saves contact details to Google Sheets with pagination support and rate limiting protection.

Who's it for

Sales teams, recruiters, and business development professionals who need to extract and organize LinkedIn contact data at scale without manual copy-pasting.

What it does

The workflow connects to a LinkedIn scraping API to fetch contact information from Sales Navigator search results. It handles pagination automatically, extracts contact details (name, title, company, location, profile URL), and appends them to a Google Sheet. Built-in rate limiting (30-60 second delays) prevents API blocks and mimics natural browsing behavior.

Requirements

- **Self-hosted n8n instance** (this workflow will NOT work on n8n Cloud due to cookie requirements and third-party API usage)
- LinkedIn Sales Navigator account
- Google Sheets account
- EditThisCookie browser extension
- API access from the creator (1 month free trial available)

How to set up

Step 1: Get API Access

Email the creator to request 1 month of free API access using the link in the workflow. You'll receive your API key within 24 hours.

Step 2: Configure API Authentication

1. Click the "Scrape LinkedIn Contacts API" node.
2. Under Authentication, select "Header Auth".
3. Create a new credential with Name: x-api-key and your received API key as the Value.
4. Save the credential.

Step 3: Extract LinkedIn Cookies

1. Install the EditThisCookie extension.
2. Navigate to LinkedIn Sales Navigator.
3. Click the cookie icon in your browser toolbar.
4. Click "Export" and copy the cookie data.
5. Paste into the cookies field in the "Set Search Parameters" node.

Step 4: Configure Your Search

In the "Set Search Parameters" node, update:

- cookies: Your exported LinkedIn cookies
- url: Your LinkedIn Sales Navigator search URL
- total_pages: Number of pages to scrape (default: 2; each page = ~25 contacts)

Step 5: Set Up Google Sheets

1. Make a copy of the template Google Sheet (or create your own with matching column headers).
2. In the "Save Contacts to Google Sheets" node, connect your Google Sheets account.
3. Select your destination spreadsheet and sheet name.

Important Security Note: Keep your LinkedIn cookies private. Never share them with others or commit them to public repositories.

Customization options

- Adjust total_pages to control how many contacts you scrape.
- Modify the delay in the "Rate Limit Delay Between Requests" node (default: 30-60 seconds, random); do not lower this, to avoid API blocks.
- Customize which contact fields to save in the Google Sheets column mapping.
- Change the search URL to target different prospect segments or filters.
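The random 30-60 second delay between page requests described above can be sketched as a one-liner; the function name is illustrative, not the node's actual code:

```javascript
// Pick a random delay between minSec and maxSec, in milliseconds.
function randomDelayMs(minSec = 30, maxSec = 60) {
  return (minSec + Math.random() * (maxSec - minSec)) * 1000;
}

const delay = randomDelayMs();
// e.g. await new Promise((resolve) => setTimeout(resolve, delay)); between pages
```

Randomizing the interval (rather than using a fixed pause) is what makes the request pattern look like natural browsing instead of a machine hitting the API on a metronome.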
by James Li
Summary

Onfleet is last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting while you focus on your customers. This workflow template listens to Onfleet driver sign-up events and automatically notifies you on Slack.

Configurations

- Update the Onfleet node with your own Onfleet credentials. To register for an Onfleet API key, visit https://onfleet.com/signup to get started.
- Update the Slack node with your own Slack credentials.
- Update the Slack channel to one that exists in your Slack workspace; the default is set to #new-driver-signup in this example, which may not apply to your workspace.
- Update the Slack message to something customized, ideally with driver information such as phone number and name.
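A customized Slack message built from the driver payload might look like the following sketch. Onfleet worker records include name and phone fields, but the template string and function are assumptions for illustration:

```javascript
// Format a Slack notification from an Onfleet driver sign-up payload.
function driverSignupMessage(driver) {
  return `:truck: New driver signed up: ${driver.name} (${driver.phone})`;
}

console.log(driverSignupMessage({ name: "Jane Doe", phone: "+15551234567" }));
```

The resulting string would go into the Slack node's message field, replacing the default text.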
by Lorena
This workflow collects images from web search on a specific query, detects labels in them, and stores this information in a Google Sheet.