by Robert Breen
This workflow pulls all tasks from your Monday.com board each day and logs them into a Google Sheet. It creates a daily snapshot of your project's progress and statuses for reporting, tracking, or analysis.

⚙️ Setup Instructions

1️⃣ Connect Monday.com API
- In Monday.com, go to Admin → API
- Copy your Personal API Token (docs: Generate Monday API Token)
- In n8n, go to Credentials → New → Monday.com API, paste your token, and save

2️⃣ Prepare Your Google Sheet
- Copy this template to your own Google Drive: Google Sheet Template
- Add your data in rows 2–100, making sure each new task row starts with Added = No
- Connect Google Sheets in n8n: go to Credentials → New → Google Sheets (OAuth2), log in with your Google account, and grant access
- In the workflow, select your Spreadsheet ID and the correct Sheet Name

🧠 How it works
- **Trigger**: Runs on click or via schedule (e.g., daily at 9 AM).
- **Get many items (Monday.com)**: Fetches all tasks and their current status.
- **Today's Date**: Adds the current date for snapshot logging.
- **Map Fields**: Normalizes task name and status.
- **Google Sheets (Append)**: Saves all tasks with status and date into your sheet for historical tracking.

📬 Contact
Need help customizing this (e.g., filtering by status, emailing daily reports, or adding charts)?
📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
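The Map Fields and snapshot-date steps can be sketched as a single transform. This is a minimal sketch, assuming the Monday.com items arrive with a `name` plus a `column_values` list containing a status column; those field names are illustrative, not the exact n8n node output.

```python
from datetime import date

def map_fields(items, snapshot_date=None):
    """Flatten Monday.com items into rows for a Google Sheets append.

    Each item is assumed to look like
    {"name": ..., "column_values": [{"id": "status", "text": ...}, ...]}.
    """
    snapshot_date = snapshot_date or date.today().isoformat()
    rows = []
    for item in items:
        # Pick the status column's display text, defaulting to empty.
        status = next(
            (c.get("text", "") for c in item.get("column_values", [])
             if c.get("id") == "status"),
            "",
        )
        rows.append({"Task": item.get("name", ""),
                     "Status": status,
                     "Date": snapshot_date})
    return rows
```

Each returned dict maps one-to-one onto a sheet row, so the append node can write them directly.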
by Jayraj Pamnani
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Workflow Description: Startup Idea Finder (n8n)

What This Workflow Does
This n8n workflow helps entrepreneurs discover startup ideas by automatically scraping top posts from multiple subreddits that often discuss unmet needs or problems (e.g., posts containing phrases like "Why is there no tool that", "I wish there was an app for", "someone should make", etc.). The workflow extracts key information from these posts and sends it to Google's Gemini 1.5 Flash-8b AI model, which analyzes the problem and suggests possible solutions or startup ideas. All relevant data and AI-generated insights are then saved to a Google Sheet for easy review and tracking.

How It Works (Step-by-Step)
1. Manual Trigger: The workflow starts with a manual trigger.
2. Reddit Scraping: It queries multiple subreddits for top posts matching specific keywords that indicate a problem or unmet need.
3. Merge & Edit Fields: The results are merged and filtered to keep only the necessary fields: title, selftext, ups, created, and url.
4. AI Analysis: The filtered post data is sent to the Gemini 1.5 Flash-8b model with a prompt asking for an explanation of the core problem, whether existing solutions exist, a new startup idea if not, the target user, and an implementation overview.
5. Google Sheets Logging: Both the original post data and the AI's output are appended as a new row in a Google Sheet for future reference.

APIs & Credentials Needed
To use this workflow, you must set up the following credentials in your n8n instance:
- Reddit API: For scraping subreddit posts.
- Google Gemini (PaLM) API: For AI-powered analysis and idea generation.
- Google Sheets API: For saving results to your spreadsheet.
Google Sheets Setup
Before running the workflow, create a Google Sheet with the following columns (in this order): title, selftext, ups, created, url, output. The workflow will automatically append new rows with the scraped post data and the AI-generated output.

Summary
This workflow is a powerful tool for anyone looking to systematically discover and analyze real-world problems discussed online, and to generate actionable startup ideas using AI. Just set up your credentials, prepare your Google Sheet, and you're ready to start finding your next big idea!
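The problem-phrase matching in the Reddit scraping step can be sketched as a simple substring filter. This is a hedged sketch: the phrase list mirrors the examples quoted in the description, and the field names `title`/`selftext` come from the fields the workflow keeps.

```python
# Problem phrases that signal an unmet need (from the examples above; extend freely).
PROBLEM_PHRASES = [
    "why is there no tool that",
    "i wish there was an app for",
    "someone should make",
]

def is_problem_post(post):
    """Return True if a post's title or body contains a problem phrase."""
    text = f"{post.get('title', '')} {post.get('selftext', '')}".lower()
    return any(phrase in text for phrase in PROBLEM_PHRASES)
```

Posts that pass this filter are the ones worth sending on to the Gemini analysis step.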
by Hirokazu Kawamoto
How it works
Send a corporate website URL via chat. The AI will investigate the company website on your behalf and return the extracted company information. Since this is set up as a conversational workflow, retrying or trying another URL is simple.

How to use
To get started, set up the credential in the Gemini node attached to the AI Agent node. You can obtain an API key from Google AI Studio. Once configured, the workflow will run when you send a corporate website URL (e.g., https://example.com/) via chat.

Customizing this workflow
You can change the settings in the Config node:
- Modify targetCompanyFields to customize which company data fields are extracted.
- Modify language to receive the results in a language other than English.
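A sketch of what the Config node's values might look like. The field names targetCompanyFields and language come from the description above, but the example values are assumptions and should be replaced with the fields you actually want extracted.

```python
# Hypothetical Config node contents; only the two key names are from the
# template itself, the values are illustrative.
config = {
    "targetCompanyFields": [
        "company_name",
        "address",
        "industry",
        "number_of_employees",
    ],
    "language": "English",
}
```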
by Cahya
This workflow allows you to generate QR codes (barcodes) in bulk from a Google Sheets file and store the generated QR images automatically in Google Drive. Each QR code contains a unique identifier (in this template, an email address) and is connected to a validation webhook. When a QR code is scanned, the webhook checks whether the ID exists and verifies its status. The system updates the spreadsheet to reflect whether the QR code has been generated and whether it has already been used.

This template demonstrates a simple but practical QR lifecycle system:
- Generate QR codes in bulk from spreadsheet data
- Store QR images in Google Drive
- Validate QR scans through a webhook
- Track generation and usage status directly in the spreadsheet

Example use cases include event check-in systems, access control, membership validation, digital ticketing, or controlled asset distribution. The workflow is designed to be modular, easy to customize, and suitable for real-world implementation.

How it works

Set Up Google Sheets
- Create a Google Sheets file with at least the following columns: email and status_qr.
- The email column acts as the unique ID; make sure there are no duplicate email values.
- The status_qr column will automatically update to QR_GENERATED or QR_USED.

Set Up Credentials
- Google Sheets credential
- Google Drive credential
- Connect your Google account, then copy the spreadsheet and folder IDs and configure them in the nodes.

Configure the QR Generator Workflow
- Confirm the spreadsheet and sheet name are correctly selected.
- Ensure the workflow checks the status_qr column before generating a QR code; only rows with an empty status_qr will generate new QR codes.
- Generated QR images are uploaded automatically to the configured Google Drive folder, and status_qr is updated to indicate that the QR has been generated.
- Run the workflow to generate QR codes in bulk.

Configure the Validation Webhook
- Set the workflow to active.
- Ensure the webhook URL matches the base URL embedded in the QR codes.

When a QR code is scanned:
1. The system extracts the email (ID).
2. It checks whether the email exists in the spreadsheet.
3. It verifies whether the QR has already been used.
4. If valid, it updates status_qr to reflect usage.

Need Help? Contact me on LinkedIn!
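The scan-validation steps above can be sketched as a small function. This is a minimal sketch, assuming the sheet is modeled as a dict from email to status_qr; the status values QR_GENERATED and QR_USED come from the template, while the result messages are illustrative.

```python
def validate_scan(sheet, email):
    """Validate a scanned QR ID against the sheet and mark it used if valid."""
    status = sheet.get(email)
    if status is None:
        return "INVALID: unknown ID"       # email not in the spreadsheet
    if status == "QR_USED":
        return "REJECTED: already used"    # QR was scanned before
    sheet[email] = "QR_USED"               # mark usage, mirroring the sheet update
    return "OK: checked in"
```

A second scan of the same code is rejected, which is what makes the system usable for one-time tickets or check-ins.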
by Ahmed Salama
Categories: Developer Automation, AI Agents, GitHub Automation, DevOps Productivity

Build an AI-Driven GitHub Pull Request Automation with n8n + MCP

Description
This workflow creates an AI-powered GitHub automation that turns raw commit history into a clean, professional pull request automatically. When triggered via MCP or another workflow, it extracts repository details, fetches all commits from a target branch, uses AI to understand the intent behind the changes, and creates a well-structured pull request with a clear title and description. The result is a reliable, no-manual-work system that standardizes pull requests and reduces review friction across teams.

Benefits
- Consistent Pull Requests: Every PR follows a clean, readable structure regardless of who triggered it.
- Zero Manual Formatting: No copy-pasting commit messages or writing descriptions by hand.
- Faster Review Cycles: Reviewers get clear context upfront, reducing back-and-forth.
- AI-Assisted Context Awareness: Commit history is summarized intelligently, not blindly concatenated.
- MCP-Ready Automation: Can be called directly by AI tools like Cursor through MCP.
How It Works
1. MCP or Workflow Trigger: Triggered via an MCP server or another n8n workflow; accepts natural language or structured input.
2. Repository Information Extraction: AI extracts the repository owner, repository name, source branch, and base branch.
3. Commit Retrieval (GitHub API): Fetches all commits for the source branch and collects commit messages as context.
4. Commit Summarization (AI): AI analyzes the commit history and generates a concise PR title and a clear bullet-point description.
5. Pull Request Creation: Creates a GitHub pull request automatically, using the correct base and head branches and inserting the AI-generated title and description.

Required Setup
- GitHub: repository access, OAuth or a personal access token, and permission to read commits and create pull requests
- AI Model: Google Gemini or a compatible LLM, connected via n8n AI nodes
- n8n: self-hosted or cloud, with HTTP access to GitHub APIs and the MCP Trigger enabled for AI tool access

Business Use Cases
- Engineering Teams: Standardize PR quality across developers and reduce cognitive load on contributors
- DevOps & Platform Teams: Enforce PR hygiene automatically and improve velocity without extra process
- Founders & Tech Leads: Maintain clean repositories without micromanagement
- Agencies & Consultants: Deliver AI-assisted GitHub automation to clients

Difficulty Level: Intermediate
Estimated Build Time: 45–75 minutes
Monthly Operating Cost: GitHub on your existing plan; AI model on a free tier or usage-based; n8n self-hosted or cloud. Typical range: $0–20/month.

Why This Workflow Works
- Commits are the most reliable source of change intent
- AI summarizes meaning, not noise
- MCP enables direct AI-to-automation execution
- GitHub remains the single source of truth

Possible Extensions
- Auto-assign reviewers based on files changed
- Add PR labels using AI classification
- Generate changelog entries automatically
- Post the PR summary to Slack or Teams
- Enforce branch naming or commit standards

Details
Nodes used in the workflow:
- MCP Trigger
- AI Agent (Repository Extraction)
- Structured Output Parser
- GitHub API (Commits)
- Summarize AI
- LLM Chain
- GitHub API (Create Pull Request)
- If
- Edit Fields (Set)
- Sticky Note
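Steps 4–5 can be sketched as building the JSON body that GitHub's create-pull-request endpoint (POST /repos/{owner}/{repo}/pulls) expects, with `title`, `head`, `base`, and `body` fields. In the workflow an LLM writes the title and description; here a trivial summarizer stands in, so treat the summarization logic as a placeholder.

```python
def build_pr_payload(commits, head, base):
    """Build the JSON body for GitHub's create-pull-request API."""
    # Placeholder for the AI step: first commit message becomes the title.
    title = commits[0] if commits else "Update"
    # Bullet-point description assembled from the commit messages.
    bullets = "\n".join(f"- {msg}" for msg in commits)
    return {
        "title": title,
        "head": head,   # source branch
        "base": base,   # target branch
        "body": f"## Changes\n{bullets}",
    }
```

The resulting dict is what the "GitHub API (Create Pull Request)" node would POST, with the AI-generated title and body swapped in for the placeholders.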
by Alex Berman
Who is this for
This template is for B2B sales teams, SDRs, growth marketers, and founders who maintain a spreadsheet of prospects and need verified contact details -- emails and mobile numbers -- without manual research.

How it works
1. Reads a list of contacts (first name, last name, company domain) from a Google Sheet.
2. Formats the contacts and submits them to the ScraperCity Email Finder API to discover business email addresses.
3. Polls until the email-finder job completes, then downloads and parses the results.
4. Submits the found emails to the ScraperCity Mobile Finder API to look up phone numbers.
5. Polls until the mobile-finder job completes, then downloads and parses the results.
6. Submits all found emails to the ScraperCity Email Validator API for deliverability and catch-all checks.
7. Polls until validation completes, merges all enriched data together, and writes the final enriched rows back to a Google Sheet.

How to set up
1. Add your ScraperCity API key as an HTTP Header Auth credential named "ScraperCity API Key".
2. Set your input Google Sheets document ID and sheet name in the "Configure Workflow" node.
3. Set your output Google Sheets document ID and sheet name in the same node.
4. Click "Execute workflow" to run.

Requirements
- ScraperCity account with Email Finder, Mobile Finder, and Email Validator products enabled.
- Google Sheets OAuth2 credential connected to n8n.
- Input sheet with columns: first_name, last_name, domain.

How to customize the workflow
- Swap Google Sheets for Airtable or a webhook trigger.
- Add a Filter node after validation to keep only emails with status "valid".
- Extend the output Set node to include additional fields from either API response.
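The poll-until-complete pattern used in steps 3, 5, and 7 can be sketched generically. This is a hedged sketch: `check_status` stands in for an HTTP request to the job-status endpoint (the exact ScraperCity endpoints and status strings are assumptions), and the interval/attempt limits are arbitrary defaults.

```python
import time

def poll_until_done(check_status, interval_seconds=10, max_attempts=30):
    """Call check_status() until it reports 'completed', 'failed', or times out."""
    for _ in range(max_attempts):
        status = check_status()
        if status == "completed":
            return True
        if status == "failed":
            return False
        time.sleep(interval_seconds)  # still running; wait before the next check
    return False  # gave up after max_attempts
```

In n8n the same pattern is built from a Wait node plus an If node looping back, but the control flow is identical.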
by SOLOVIEVA ANNA
Who this workflow is for
This template is for teams who want a lightweight "daily icebreaker" in Slack and creators who'd like to build a reusable trivia database over time. It works well for remote teams, communities, and any workspace that enjoys a quick brain teaser each day.

What this workflow does
The workflow fetches a random multiple-choice question from the Open Trivia Database (OpenTDB), posts a nicely formatted trivia message to a Slack channel, and logs the full question and answers into a Google Sheets spreadsheet. Over time, this creates a searchable "trivia archive" you can reuse for quizzes, content, or community events.

How it works
1. A Schedule Trigger runs once per day at a time you define.
2. A Set node randomly chooses a difficulty level (easy, medium, or hard).
3. A Switch node routes to the matching OpenTDB HTTP request.
4. Each branch normalizes the API response into common fields (timestamp, date, difficulty, category, question, correct, incorrect, messageTitle, messageBody).
5. A Merge node combines the three branches into a single stream.
6. Slack posts the trivia message.
7. Google Sheets appends the same data as a new row.

How to set up
- Connect your Slack OAuth2 credentials and choose a target channel.
- Connect your Google Sheets credentials and select the spreadsheet and sheet.
- Adjust the schedule (time and frequency) to match your use case.

How to customize
- Change the Slack message format (for example, add emojis or hints).
- Filter categories or difficulty levels instead of picking them fully at random.
- Add additional logging (e.g., user reactions, answer stats) in Sheets or another datastore.
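Steps 2–3 can be sketched as picking a random difficulty and building the matching OpenTDB request URL. The `api.php` query parameters (`amount`, `type`, `difficulty`) are OpenTDB's documented query string; in the workflow each difficulty has its own pre-built HTTP Request node instead.

```python
import random

def build_trivia_url(difficulty=None):
    """Return an OpenTDB URL for one random multiple-choice question."""
    if difficulty is None:
        # Mirrors the Set node's random difficulty choice.
        difficulty = random.choice(["easy", "medium", "hard"])
    return (
        "https://opentdb.com/api.php"
        f"?amount=1&type=multiple&difficulty={difficulty}"
    )
```

The JSON response contains `question`, `correct_answer`, and `incorrect_answers` fields, which map onto the `question`, `correct`, and `incorrect` columns logged to the sheet.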
by Dean Pike
Transform LinkedIn profile URLs into comprehensive enriched lead profiles, quickly and automatically. Add URLs to your sheet, run the workflow, and get fully enriched contact data: names, job titles, company details, career history, recent activity, and more – all written back to your spreadsheet.

What it does
- Reads unenriched rows from Google Sheets (detects an empty "First Name" field)
- Scrapes LinkedIn profiles via Apify (the dev_fusion~linkedin-profile-scraper actor)
- Polls for completion with smart retry logic (15-second intervals, max 20 attempts)
- Extracts comprehensive profile data:
  - Personal info (name, location, headline, bio)
  - Current role (title, description, company, industry, size, website)
  - Additional concurrent positions (for people with multiple roles)
  - Most recent previous employer
  - Last 2 LinkedIn posts with links
- Writes enriched data back to the same Google Sheet row
- Handles errors gracefully with status updates

Requirements
- Apify account + API token
- Google Sheets OAuth2 credentials
- A Google Sheet with columns: LinkedIn, First Name, Last Name, Job Position, Location, Industry, Company Name, Company URL, Company Size, LI Other Profile Information, Status, Apify ID, Add date, row_number

Setup
1. Create your Google Sheet with the required columns (or duplicate the template structure)
2. Replace YOUR_APIFY_API_KEY in three HTTP Request nodes: "Start Apify Scrape", "Check Status", and "Fetch LinkedIn Data"
3. Connect your Google Sheets OAuth2 credentials to the two Google Sheets nodes
4. Update the document ID if using your own sheet (it currently points to a specific sheet)
5. Add LinkedIn profile URLs to the "LinkedIn" column, leaving "First Name" blank
6. Run manually – the workflow processes all unenriched rows sequentially

Sample Output
Google Sheet row output from one successfully enriched lead profile via LinkedIn URL: Link to Google Sheets sample file

Next steps
Use the enriched data for sales outreach, recruiting pipelines, or lead scoring.
The "LI Other Profile Information" column contains a rich text summary ideal for AI-powered lead qualification or personalized messaging.

Tip: Process small batches (5–10 profiles) first to verify Apify results and check for rate limiting. The Apify dataset ID is stored in each row, so you can retrieve the raw JSON data later if needed for deeper analysis.
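The "reads unenriched rows" step can be sketched as a simple filter: a row qualifies when it has a LinkedIn URL but an empty "First Name" cell. This is a minimal sketch using the column names from the required sheet layout; the row dicts model what the Google Sheets node returns per row.

```python
def unenriched_rows(rows):
    """Return rows that have a LinkedIn URL but no First Name yet."""
    return [
        row for row in rows
        if row.get("LinkedIn") and not row.get("First Name", "").strip()
    ]
```

Because enrichment writes "First Name" back to the same row, rerunning the workflow naturally skips everything already processed.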
by Avkash Kakdiya
How it works
This workflow automates the job curation process by retrieving pending job search inputs from a spreadsheet, querying the JSearch API for relevant job listings, and writing the curated results back to another sheet. It is designed to streamline job discovery and reduce manual data entry.

Step-by-step

1. Trigger & Input
- The workflow starts on a defined schedule (e.g., once per day).
- It reads a row from the Job Scraper sheet where the status is marked as "Pending".
- The selected row includes fields like Position and Location, which are used to build the search query.

2. Job Search & Processing
- Sends a search request to the JSearch API using the Position and Location from the spreadsheet.
- Parses the API response and extracts individual job listings.
- Filters out empty, irrelevant, or invalid entries to ensure clean and relevant job data.

3. Output & Status Update
- Writes valid job listings to the Job Listing output sheet with fields such as job title, company name, location, and more.
- Updates the original row in the source sheet to mark it as "Scraped", ensuring it will not be processed again in future runs.

Benefits
- Reduces manual effort in job research and listing.
- Ensures only valid, structured data is stored and used.
- Prevents duplicate processing with automatic status updates.
- Simple to expand by adding more job sources or filters.
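The cleanup in step 2 can be sketched as a required-fields filter. This is a hedged sketch: the field names below (`job_title`, `employer_name`, `job_city`) are assumptions chosen to match the output columns mentioned above, not a guaranteed mirror of the JSearch response schema.

```python
# Fields a listing must have before it is written to the output sheet
# (assumed names; adjust to the actual API response).
REQUIRED_FIELDS = ("job_title", "employer_name", "job_city")

def clean_listings(listings):
    """Drop listings that are missing or blank in any required field."""
    return [
        job for job in listings
        if all(str(job.get(field, "")).strip() for field in REQUIRED_FIELDS)
    ]
```

Only the listings that survive this filter are appended to the Job Listing sheet, which keeps the output free of partial rows.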
by giangxai
1. Overview
This workflow automates the process of collecting viral content ideas, generating videos using AI services, and publishing posts to social media platforms. It uses scheduled triggers and external APIs to run the entire content pipeline with minimal manual input. Google Sheets is used as the central data layer to store ideas, track processing status, and prevent duplicate content. The workflow is designed to support continuous content production by connecting idea discovery, content generation, and publishing into a single automated system.

2. What Can This Workflow Do?
This workflow can be used for:
- Automatically collecting viral content ideas on a scheduled basis
- Generating videos from text-based ideas without manual recording or editing
- Publishing video content to social media platforms using a single workflow
- Managing content status and processing history through Google Sheets
- Running automated content pipelines for multiple niches or channels

The workflow is suitable for users who want to reduce manual work while maintaining a structured and repeatable content production process.

3. Required API Keys & Credentials
The following credentials must be configured before running the workflow:
- Apify API Token: Used to scrape and collect viral content data
- Google OAuth2: Required for Google Sheets API access
- Gemini API Key: Used by the AI Agent for content analysis and generation
- Video Generation API Key (Kie.ai, GeminiGen.ai, fal.ai, ...): Required to generate AI videos
- Blotato API Key: Used to automatically publish content to multiple social media platforms

⚠️ All API keys and credentials must be properly set up in n8n Credentials before executing the workflow.

4. High-Level Architecture
The system is divided into two main workflows:
- Get Content Viral: Collects and stores viral content ideas automatically
- Create Video and Auto Post: Transforms ideas into AI-generated videos and publishes them automatically

Google Sheets as the Central Database
Google Sheets serves as the central data layer to:
- Act as the main content database
- Prevent duplicate content processing
- Track content status throughout the entire pipeline

5. Workflow 1 – Get Content Viral

Purpose
This workflow is responsible for automatically discovering and collecting viral content ideas, then storing them in Google Sheets for further processing. It acts as the idea intake layer of the entire AI Content Factory.

Workflow Steps

Scheduled content execution
The workflow is triggered automatically on a predefined schedule (e.g., daily or multiple times per day). This allows the system to continuously collect fresh viral content ideas without manual intervention.

Run Apify Actor & Get Dataset
- Executes an Apify Actor to scrape viral content data from selected sources
- Retrieves a dataset containing raw content ideas, links, and metadata
This step is responsible for data collection and crawling.

Normalize scraped content data
- Cleans and normalizes the scraped data
- Removes unnecessary fields
- Formats the content into a standardized structure suitable for storage
This ensures consistency and data quality.

Save to Google Sheets (Viral Content)
- Appends the processed content ideas into a dedicated Google Sheet
- Each row represents a single viral content idea
Google Sheets serves as the central idea repository for downstream workflows.

Output
After the workflow completes, you will have:
- A continuously updated list of viral content ideas
- Structured data ready for video generation
- A centralized idea database accessible by other workflows
6. Workflow 2 – Create Video and Auto Post

Purpose
This workflow is responsible for transforming viral content ideas into AI-generated videos and automatically publishing them to social media platforms. It acts as the production and distribution layer of the AI Content Factory.

Workflow Steps

Scheduled video generation
The workflow runs automatically on a predefined schedule. This allows you to control how many videos are generated and published per day.

Get Viral Content (Google Sheets)
- Retrieves a single unprocessed content idea from Google Sheets
- Ensures that each idea is used only once
This step prevents duplicate content generation.

Analyze Content
- Analyzes the selected content idea
- Extracts key context, topic, and intent
This prepares structured input for the AI Agent.

AI Agent (Gemini + Structured Output Parser)
The AI Agent generates:
- Video title
- Video description
- Video generation prompt
A structured output parser is used to ensure consistent and predictable results for downstream steps.

Format AI output for video generation
- Cleans and formats the AI-generated output
- Prepares the final payload for the video generation API
This step ensures compatibility with the video generation service.

Create Video (Video Generation API)
- Sends the prepared prompt to the video generation API
- Starts the AI video rendering process
The video is generated asynchronously.

Wait → Get Video → Switch
- Wait: Pauses the workflow while the video is rendering
- Get Video: Checks the current rendering status
- Switch:
  - If Processing: wait and retry
  - If Completed: continue to publishing
  - If Failed: handle errors or retry

Create Post (Blotato)
- Publishes the generated video to selected social media platforms
- Uses the title, description, and generated video
Blotato handles multi-platform posting automatically.

Update Google Sheets
Updates the corresponding row in Google Sheets with:
- Processing status
- Published post URL
- Timestamp
This provides full visibility and tracking across the pipeline.
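The Wait → Get Video → Switch loop can be sketched as a small state machine. This is a minimal sketch: `get_status` stands in for the "Get Video" API call, the status strings mirror the three Switch branches above, and the returned route names are illustrative.

```python
def render_loop(get_status, max_checks=10):
    """Poll the video render until it completes, fails, or times out."""
    for _ in range(max_checks):
        status = get_status()
        if status == "Completed":
            return "publish"       # continue to the Blotato posting step
        if status == "Failed":
            return "handle_error"  # error branch: retry or alert
        # "Processing": fall through and check again (the Wait node's role)
    return "timeout"
```

In n8n the loop is wired as Wait → Get Video → Switch with an edge back to Wait, but the decision logic is exactly this.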
7. Output
After the workflow completes, you will have:
- An AI-generated video
- Automatically published content across social platforms
- Updated tracking data in Google Sheets

8. Notes, Limitations & Best Practices

Notes
- The overall content quality depends heavily on the quality of the scraped viral ideas, the AI prompts used for analysis and video generation, and the capabilities of the selected video generation API.
- This workflow is designed for automation, not manual fine-tuning.
- Google Sheets is used as a lightweight database and works best for small- to medium-scale content pipelines.

Limitations
- API rate limits and quotas may restrict the number of videos generated per day and the frequency of scraping and posting.
- Video generation APIs may fail occasionally due to rendering issues and may produce inconsistent results depending on prompt complexity.
- Social media platforms may impose posting limits and content moderation restrictions.

Best Practices
- Start with a low posting frequency and gradually scale up.
- Regularly review and refine your AI prompts and video formats.
- Use the Google Sheets status fields to monitor failures and prevent duplicate processing.
- Add error handling and retry logic for the video generation and posting steps.
- Periodically clean and optimize your Google Sheets data to maintain performance.

Watch the tutorial video on YouTube.
by Max Tkacz
This workflow shows you how to post a message to a Slack channel and add a file attachment. It also shows the general pattern for working with binary data in n8n (any file, such as a PDF or image). Specifically, it shows how to download a file from a URL into your workflow and then upload it to Slack.

Video walkthrough
Watch this 3-minute Loom video for a walkthrough and more context on this general pattern:
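The download-then-upload pattern can be sketched by showing how file bytes travel between nodes. This is a hedged sketch: it mirrors the general shape of an n8n item carrying binary data (base64 content plus file metadata under a `binary` key), but treat the exact field names as an approximation rather than the node's precise output.

```python
import base64

def to_binary_item(content, file_name, mime_type):
    """Wrap raw downloaded bytes the way a node hands binary data downstream."""
    return {
        "json": {},  # regular item data travels here
        "binary": {
            "data": {
                # File content is carried as base64 text between nodes.
                "data": base64.b64encode(content).decode("ascii"),
                "fileName": file_name,
                "mimeType": mime_type,
            }
        },
    }
```

A downstream node (such as the Slack upload) decodes the base64 content back into bytes and attaches the file name and MIME type when sending.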
by sven
In this video we will create a simple n8n ("nodemation") workflow to receive data via webhook, alter the data, and send it to a web server. We will be using the Webhook, Function, and HTTP Request nodes together. > YouTube Video
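The three-node pattern can be sketched with the Function node's job shown as a plain transform: take the webhook payload, alter it, and produce the body the HTTP Request node forwards. The field names below are illustrative, not from the video.

```python
def alter_webhook_data(payload):
    """Normalize an incoming webhook body before forwarding it."""
    return {
        # Tidy the incoming values (hypothetical fields for illustration).
        "name": payload.get("name", "").strip().title(),
        "email": payload.get("email", "").strip().lower(),
        "received": True,
    }
```

The Webhook node supplies `payload`, this transform is what the Function node runs, and the returned dict becomes the HTTP Request node's body.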