by Adam Janes
This workflow demonstrates a simple way to run evals on a set of test cases stored in a Google Sheet. The example comes from an info-extraction task dataset in which we tested 6 different LLMs on 18 test cases. You can see our sample data in this spreadsheet to get started. Once you have it working for our dataset, you can plug in your own test cases for different LLMs to see how it performs on your own data.

**How it works:**
- It loads test cases from Google Sheets.
- For each row in the Google Sheet, it grabs the source document and converts it to text.
- Our "LLM judge" passes each LLM's input/output to GPT-4.1, which evaluates the test case (Pass/Fail + Reason).
- It logs the outcome to a Google Sheet.
- A 0.5s pause between requests keeps the workflow within OpenAI's API rate limits.

**Set up steps:**
- Add your credentials for Google Sheets, Google Drive, and OpenRouter.
- Make a copy of the original data spreadsheet so that you can edit it yourself. Plug your version into the Update Results node to see the spreadsheet update on each run of the loop.
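For reference, here is a minimal sketch of the judge-plus-pause loop in plain JavaScript, assuming hypothetical field names (`input`, `modelOutput`) and a generic judge prompt; in the workflow itself this logic lives in the OpenAI and Wait nodes:

```javascript
// Minimal sketch of the LLM-judge loop: field names and the judge prompt
// are illustrative assumptions, not the exact ones used in the workflow.
const OPENAI_KEY = process.env.OPENAI_API_KEY;

async function judge(testCase) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENAI_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4.1",
      messages: [{
        role: "user",
        content: `Input:\n${testCase.input}\n\nModel output:\n${testCase.modelOutput}\n\n` +
                 `Reply with "Pass" or "Fail" and a one-sentence reason.`,
      }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

async function run(testCases) {
  for (const tc of testCases) {
    const verdict = await judge(tc);
    console.log(tc.id, verdict);
    // 0.5s pause between requests to stay within rate limits
    await new Promise((r) => setTimeout(r, 500));
  }
}
```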
by Henry
**Who is this for?**
This workflow is ideal for SEO specialists, web designers, and digital marketers who want to quickly draft effective landing page layouts by referencing established competitors. It suits users who need a fast, structured starting point for web design while ensuring competitive relevance.

**What problem is this workflow solving? / Use case**
Designing a high-converting landing page from scratch is time-consuming. This workflow automates the process of analyzing a competitor's website, identifying its essential sections, and producing a tailored layout, helping users save time and improve their website's effectiveness.

**What this workflow does**
The workflow fetches and analyzes your chosen competitor's landing page using web-scraping and structure-detection nodes in n8n. It identifies primary sections such as hero banners, service highlights, testimonials, and contact forms, then generates a simplified, customizable layout suitable for wireframing or initial design. A sketch of the detection idea follows below.

**Setup**
- Prepare your unique services and target audience profile for customization later.
- Gather the URL of the competitor's landing page you wish to analyze.
- Run the workflow, entering your competitor's URL when prompted.

**How to customize this workflow to your needs**
- After generating the initial layout, adapt section names and content blocks to highlight your services and brand messaging.
- Add or remove sections based on your objectives and audience insights.
- Integrate additional nodes for richer analysis, such as keyword extraction or design pattern detection, to tailor the output further.
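To make the structure-detection step concrete, here is a rough sketch of the idea in JavaScript, assuming `cheerio` for HTML parsing and purely illustrative selectors; the workflow's own nodes may detect sections differently:

```javascript
// Rough sketch of structure detection: flag common landing-page sections
// by selector. The selectors are heuristic assumptions, not the
// workflow's actual detection rules.
import * as cheerio from "cheerio";

const SECTION_HINTS = {
  hero: ["header .hero", ".hero", "section.banner"],
  services: ["#services", ".services", "section.features"],
  testimonials: [".testimonials", "#reviews", "blockquote"],
  contact: ["form", "#contact", ".contact-form"],
};

async function detectSections(url) {
  const html = await (await fetch(url)).text();
  const $ = cheerio.load(html);
  const layout = [];
  for (const [name, selectors] of Object.entries(SECTION_HINTS)) {
    if (selectors.some((sel) => $(sel).length > 0)) layout.push(name);
  }
  return layout; // e.g. ["hero", "services", "contact"]
}

detectSections("https://competitor.example.com").then(console.log);
```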
by Un tal Camilo Medina
**🤖 Telegram Bot Webhook Configuration Tool**

This workflow creates a simple web form that helps you configure Telegram bot webhooks quickly. Instead of manually constructing the Telegram API URL, this tool does it for you automatically.

**How It Works**
The workflow consists of three main steps:
1. **Form Input**: A web form collects your bot token and webhook URL.
2. **URL Construction**: Automatically builds the correct Telegram API URL.
3. **Redirect**: Takes you directly to the Telegram API to complete the configuration.

**What You Need**
- **Bot Token**: Get this from @BotFather on Telegram (format: `123456789:ABCdefGHIjklMNOpqrsTUVwxyz`)
- **Webhook URL**: Your n8n webhook endpoint (must be HTTPS)

**Setup Instructions**
1. Import this workflow into your n8n instance.
2. Activate the workflow.
3. Access the generated form URL.
4. Fill in your bot details and submit.

**Form Fields**

| Field | Description | Example |
|-------|-------------|---------|
| Bot API Token | Token from BotFather | 123456789:ABCdefGHIjklMNOpqrsTUVwxyz |
| Webhook URL | Your n8n webhook endpoint | https://your-instance.app.n8n.cloud/webhook/telegram |

**What Happens**
1. You enter your bot token and webhook URL in the form.
2. The workflow constructs this URL: `https://api.telegram.org/bot{TOKEN}/setWebhook?url={WEBHOOK_URL}`
3. You're redirected to that URL, where Telegram configures your webhook.
4. Telegram shows you a success or error message.

**Benefits**
- **No Manual URL Building**: Eliminates copy-paste errors
- **Quick Setup**: Configure webhooks in seconds
- **Privacy Focused**: No data is stored anywhere
- **Team Friendly**: Share the form URL with team members

**Common Webhook URLs**
- n8n Cloud: `https://your-instance.app.n8n.cloud/webhook/telegram-bot`
- Self-hosted: `https://your-domain.com/webhook/telegram-bot`

**Requirements**
- n8n with form trigger support
- Valid Telegram bot token
- Publicly accessible webhook URL (HTTPS required)

**Troubleshooting**
- **Invalid Token Error**: Make sure you copied the complete token from BotFather.
- **Webhook Error**: Ensure your URL is publicly accessible and uses HTTPS.
- **SSL Error**: Verify your webhook URL has a valid SSL certificate.

This tool simply automates the manual process of visiting the Telegram API URL to configure your bot's webhook. Perfect for developers who frequently set up or change Telegram bot configurations.
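The URL construction amounts to a single template string. A sketch with illustrative variable names:

```javascript
// Sketch of the URL the workflow constructs; variable names are illustrative.
const botToken = "123456789:ABCdefGHIjklMNOpqrsTUVwxyz"; // from @BotFather
const webhookUrl = "https://your-instance.app.n8n.cloud/webhook/telegram";

// Telegram's setWebhook endpoint; the webhook URL should be URL-encoded.
const setWebhookUrl =
  `https://api.telegram.org/bot${botToken}/setWebhook?url=${encodeURIComponent(webhookUrl)}`;

console.log(setWebhookUrl);
// Visiting this URL makes Telegram register the webhook and respond
// with {"ok":true,...} on success or an error description on failure.
```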
by Ranjan Dailata
**Who this is for**
Google SERP Tracker + Trends and Recommendations is an AI-powered n8n workflow that extracts Google search results via Bright Data, parses them into structured JSON using Google Gemini, and generates actionable recommendations and search trends. It outputs CSV reports and sends real-time Webhook notifications.

This workflow is ideal for:
- **SEO agencies** needing automated rank & trend tracking
- **Growth marketers** seeking daily/weekly search-based insights
- **Product teams** monitoring brand or competitor visibility
- **Market researchers** performing search behavior analysis
- **No-code builders** automating search intelligence workflows

**What problem is this workflow solving?**
Traditional tracking of search engine rankings and search trends is fragmented and manual. Analyzing SERP changes and trends requires:
- Manual extraction, or unstable scrapers
- Dealing with unstructured, cluttered HTML data
- Working without actionable insights or recommendations

This workflow solves the problem by:
- Automating real-time Google SERP data extraction using Bright Data
- Structuring unstructured search data using the Google Gemini LLM
- Generating actionable recommendations and trends
- Exporting both CSV reports automatically to disk for downstream use
- Notifying external systems via Webhook

**What this workflow does**
1. Accepts search input, zone name, and webhook notification URL.
2. Uses Bright Data to extract Google search results (see the request sketch after this list).
3. Uses Google Gemini to parse the SERP data into structured JSON.
4. Loops over the structured results to extract recommendations and trends.
5. Saves both as .csv files, for example:
   - Google_SERP_Recommendations_Response_2025-06-10T23-01-50-650Z.csv
   - Google_SERP_Trends_Response_2025-06-10T23-01-38-915Z.csv
6. Sends a Webhook with the summary or file reference.

**LLM Usage**
Google Gemini handles:
- Parsing Google Search HTML into structured JSON
- Summarizing recommendation data
- Deriving trends from the extracted SERP metadata

**Setup**
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth credential (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Add a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set input fields with the search criteria, Bright Data zone name, and Webhook notification URL.

**How to customize this workflow to your needs**
- **Input customization**: Set your target keyword/phrase in the search field, and add your webhook_notification_url for external triggers or notifications.
- **SERP source**: Extend the Bright Data search logic to include other engines such as Bing or DuckDuckGo.
- **Output format**: Edit the .csv structure in the Convert to File nodes to include or exclude specific columns.
- **LLM prompt tuning**: The Gemini prompt inside the Recommendation and Trends extractor nodes can be fine-tuned for domain-specific insight (e.g., SEO vs. eCommerce focus).
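For orientation, a sketch of the Web Unlocker call the HTTP Request node makes; the payload follows Bright Data's documented `/request` endpoint, but treat the exact parameters as assumptions to verify against your zone setup:

```javascript
// Sketch of the Bright Data Web Unlocker call behind the workflow's
// HTTP Request node; verify payload fields against Bright Data's docs.
const BRIGHTDATA_TOKEN = process.env.BRIGHTDATA_TOKEN; // Web Unlocker token

async function fetchSerp(query, zone) {
  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${BRIGHTDATA_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      zone, // your Web Unlocker zone name
      url: `https://www.google.com/search?q=${encodeURIComponent(query)}`,
      format: "raw", // raw HTML, passed on to Gemini for parsing
    }),
  });
  return res.text();
}

fetchSerp("best crm for startups", "web_unlocker1").then((html) =>
  console.log(html.slice(0, 500)),
);
```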
by Amit Mehta
**How it Works**
This workflow reads sheet details from a source Google Spreadsheet, creates a new spreadsheet, replicates the sheet structure, enriches the content by reading data, and writes it into the corresponding sheets in the new spreadsheet. The process loops over every sheet, providing an automated way to duplicate and transform structured data.

**🎯 Use Case**
- Automate duplication and data enrichment for multi-sheet Google Spreadsheets
- Replicate templates across new documents with consistent formatting
- Data-team workflows requiring repetitive, structured Google Sheets setup

**Setup Instructions**
1. Required Google Sheets: you must have a source spreadsheet with multiple sheets. The destination spreadsheet is created automatically.
2. API Credentials: **Google Sheets OAuth2** to both read and write spreadsheets, and **HTTP Request Auth** if external API headers are needed.
3. Configure Fields in Write Sheet: define appropriate columns and mapping for the destination sheet.

**🔁 Workflow Logic**
1. **Manual Trigger**: Starts the flow on user demand.
2. **Create New Spreadsheet**: Generates a blank spreadsheet.
3. **HTTP Request**: Retrieves all sheet names from the source spreadsheet.
4. **JavaScript Code**: Extracts titles and metadata from the HTTP response (see the sketch after the tables below).
5. **Loop Over Sheets**: Iterates through each sheet retrieved.
6. **Delete Default Sheet**: Removes the placeholder 'Sheet1'.
7. **Create Sheets**: Replicates each original sheet in the new document.
8. **Read Spreadsheet1**: Pulls data from the matching original sheet.
9. **Write Sheet**: Appends the data to the newly created sheets.

**🧩 Node Descriptions**

| Node Name | Description |
|-----------|-------------|
| Manual Trigger | Starts the workflow manually on user demand. |
| Create New Spreadsheet | Creates a new Google Spreadsheet for output. |
| HTTP Request | Fetches metadata from the source spreadsheet, including sheet names. |
| Code | Processes sheet metadata into a list for iteration. |
| Loop Over Items | Loops over each sheet to replicate and populate. |
| Google Sheets2 | Deletes the default 'Sheet1' from the new spreadsheet. |
| Create Sheets | Creates a new sheet matching each source sheet. |
| Read Spreadsheet1 | Reads data from the source sheet. |
| Write sheet | Writes the data into the corresponding new sheet. |

**🛠️ Customization Tips**
- Make the Google Sheet title dynamic or user-input driven
- Add filtering logic before writing data
- Append custom audit columns such as 'Timestamp' or 'Processed By'
- Enable logging or Slack alerts after each sheet is created

**📎 Required Files**

| File Name | Purpose |
|-----------|---------|
| My_workflow_4.json | Main workflow JSON file for sheet duplication and enrichment |

**🧪 Testing Tips**
- Test with a spreadsheet containing 2–3 simple sheets
- Validate that all sheets are duplicated
- Check that columns and data structure remain intact
- Watch for authentication issues in the Google Sheets nodes

**🏷 Suggested Tags & Categories**
#GoogleSheets #Automation #DataEnrichment #Workflow #Spreadsheet
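A sketch of what the Code node's extraction might look like, assuming the HTTP Request node called the Sheets API `spreadsheets.get` endpoint, whose response nests titles under `sheets[].properties`:

```javascript
// n8n Code node sketch: turn the spreadsheets.get metadata response into
// one item per sheet. Assumes the HTTP Request node called
// GET https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}.
const response = $input.first().json;

return (response.sheets || []).map((sheet) => ({
  json: {
    sheetId: sheet.properties.sheetId,
    title: sheet.properties.title,
    index: sheet.properties.index,
  },
}));
```

The Loop Over Items node can then iterate over these items, creating one sheet per title in the new spreadsheet.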
by Eric
**Why use this**
You need to delete (many) posts on a WordPress website and also delete the featured image associated with each post. This automation cuts hours of rote work down to a fraction.

**How it works**
- Set your WordPress URL in the manual trigger node.
- Set your WP post search parameters. (The WP API returns 10 posts by default; you could also set up pagination to scale this automation beyond 10 posts per execution.)
- Decide on (and build) your filter/approval process.

**What you can expect**
- This automation is set up to run on the 10 oldest pending posts, oldest first.
- If you remove the 'Filter' node from the workflow, another 10 posts will be returned from WP after each run.

**Notes on Filter/Approval**
This is arbitrary and depends on your own use case. Maybe you have an editor who needs to approve the post deletion. You might want to get approval by email, Slack message, or a ticketing system. Or maybe you just want to monitor the process and spare specific posts from deletion.

I used the Filter node to only grab the first item (itemIndex < 1), which in this case was the oldest pending post.

This could also be expanded into two separate workflows:
- One triggered when a pending post is created, which sends an approval request
- A second triggered by the approval/rejection, which either publishes or deletes the post depending on the result

This would require another HTTP request, similar to the DELETE post request, that instead publishes the post.
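For context, a sketch of the underlying WP REST API calls in plain JavaScript, assuming Basic Auth with an application password; the workflow's HTTP Request nodes do the equivalent:

```javascript
// Sketch of the WP REST calls behind the workflow. Auth method and site
// URL are assumptions; adapt to your own credential setup.
const SITE = "https://your-site.example.com";
const AUTH = "Basic " + Buffer.from("user:app-password").toString("base64");

async function wp(path, options = {}) {
  const res = await fetch(`${SITE}/wp-json/wp/v2${path}`, {
    ...options,
    headers: { Authorization: AUTH, ...options.headers },
  });
  return res.json();
}

async function deleteOldestPending() {
  // 10 oldest pending posts, oldest first (WP returns 10 per page by default)
  const posts = await wp("/posts?status=pending&orderby=date&order=asc&per_page=10");
  const oldest = posts[0]; // mirrors the Filter node's itemIndex < 1
  if (!oldest) return;

  // Delete the featured image first, then the post; force=true skips trash
  if (oldest.featured_media) {
    await wp(`/media/${oldest.featured_media}?force=true`, { method: "DELETE" });
  }
  await wp(`/posts/${oldest.id}?force=true`, { method: "DELETE" });
  console.log(`Deleted post ${oldest.id} and its featured image`);
}

deleteOldestPending();
```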
by Avkash Kakdiya
**🔁 What This Workflow Does**
This automation fetches daily AI-related articles from trusted RSS feeds, summarizes them using OpenAI (GPT), and generates a ready-to-post LinkedIn update in your writing style. It then emails the post to you every morning for review and publishing.

**High-Level Steps:**
1. Triggers every morning via Cron.
2. Fetches the latest AI news from multiple RSS sources.
3. Filters recent articles (last 24 hours).
4. Summarizes each article using OpenAI (ChatGPT).
5. Generates a LinkedIn-style post in your tone.
6. Sends the post to your Gmail for review.

**⚙️ Setup Steps**
Estimated setup time: 15–30 minutes

You'll need:
- OpenAI API key
- Gmail account connected in n8n
- RSS feed URLs (defaults are provided)

Then:
- Add your email in the Gmail node to receive daily posts.
- Add your tone/style prompt in the ChatGPT nodes (instructions inside the workflow).
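The recency filter in step 3 is simple date math. A sketch as an n8n Code node, assuming each item carries the `pubDate`/`isoDate` field the RSS Read node typically emits (verify against your feeds):

```javascript
// n8n Code node sketch: keep only items published in the last 24 hours.
// The pubDate/isoDate field names are assumptions; adjust to your feeds.
const DAY_MS = 24 * 60 * 60 * 1000;
const cutoff = Date.now() - DAY_MS;

return $input.all().filter((item) => {
  const published = new Date(item.json.pubDate || item.json.isoDate);
  return !isNaN(published) && published.getTime() >= cutoff;
});
```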
by Matthieu
**LinkedIn Profile Enrichment Automation**

**Who is this for?**
This template is perfect for sales teams, recruiters, marketing professionals, and business development specialists who need to gather comprehensive LinkedIn profile data at scale. It is ideal for lead generation teams building prospect databases, recruiters sourcing candidate information, sales professionals researching prospects, and marketing teams creating targeted outreach campaigns.

**What problem does this workflow solve?**
Manually collecting detailed information from LinkedIn profiles is incredibly time-consuming and prone to inconsistency. Visiting each profile individually to extract names, job titles, experience, education, skills, and contact details can take hours for even small prospect lists. This automation eliminates the tedious manual data entry while ensuring consistent, comprehensive profile enrichment.

**What this workflow does**
This workflow automatically enriches a list of LinkedIn profile URLs by extracting comprehensive professional data, including:
- **Personal details** (first name, last name, headline, location)
- **Professional status** (hiring status, open-to-work indicators)
- **Network metrics** (connections, followers count)
- **Work experience** (up to 5 most recent positions with company details, dates, and roles)
- **Education background** (up to 3 educational institutions with degrees and dates)
- **Skills and languages** (complete skill sets and language proficiencies)
- **Professional summary** (profile description and bio)

The enriched data is automatically organized and updated in your Google Sheets database with structured formatting for easy analysis and outreach.

**Setup**
1. Create a Ghost Genius API account and obtain your API key for cookieless LinkedIn profile scraping.
2. Configure HTTP Request credentials with Header Auth using your Ghost Genius API key.
3. Set up your Google Sheets database using the provided template with columns: URL, Firstname, Lastname, Tagline, Location, Connections, Followers, Hiring?, Open to work?, Summary, Languages, Skills, Experience 1-5, Education 1-3.
4. Configure Google Sheets OAuth2 credentials following n8n's authentication setup guide.
5. Add LinkedIn profile URLs to the first column of your Google Sheet to begin enrichment.
6. Test the workflow with a small batch before processing larger lists.

**How to customize this workflow**
- **Adjust batch processing settings** to handle larger volumes by modifying the batch size and interval timing.
- **Add data validation rules** to filter out incomplete or invalid profiles before processing.
- **Integrate with CRM systems** like HubSpot or Salesforce to automatically sync enriched data.
- **Set up automated scheduling** to regularly re-enrich profiles and capture profile updates.
- **Add email notifications** to alert when enrichment batches complete or encounter errors.
- **Customize data mapping** to include additional profile fields or reorganize the output structure.
- **Add duplicate detection** to prevent re-processing the same profiles multiple times.
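As a purely hypothetical sketch of the enrichment step: the Ghost Genius endpoint, parameter names, and response fields below are assumptions for illustration only; consult the provider's documentation for the real contract.

```javascript
// Hypothetical sketch: endpoint path and response field names are
// illustrative assumptions, not Ghost Genius's documented API.
const API_KEY = process.env.GHOST_GENIUS_API_KEY;

async function enrichProfile(profileUrl) {
  const res = await fetch(
    `https://api.ghostgenius.example/profile?url=${encodeURIComponent(profileUrl)}`, // hypothetical endpoint
    { headers: { Authorization: `Bearer ${API_KEY}` } },
  );
  const profile = await res.json();

  // Flatten into one row matching the Google Sheet columns above
  return {
    URL: profileUrl,
    Firstname: profile.first_name, // field names are assumptions
    Lastname: profile.last_name,
    Tagline: profile.headline,
    Location: profile.location,
    Connections: profile.connections,
    Followers: profile.followers,
  };
}

enrichProfile("https://www.linkedin.com/in/example").then(console.log);
```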
by Stefan
Automate LinkedIn engagement without sounding like a bot. This workflow:
- 🌍 Detects language & tone (German / English)
- 👍 Chooses the right reaction (like / celebrate / support …)
- 🗣 Generates a personalised comment in your voice and mentions the author
- 📲 Optional Telegram review – approve ✅ or regenerate ❌ before posting
- 💸 Runs on cost-efficient GPT-4o mini or Claude 3.5 Haiku
- ☁️ Publishes the comment + reaction via the Unipile API

**Setup (≈ 15–30 min)**
1. **Unipile** – connect LinkedIn, copy `account_id` and `dsn`, then create an Access Token (X-API-KEY).
2. **Telegram (optional)** – create a bot and add a credential named YOUR TELEGRAM ACCOUNT.
3. **OpenAI / Anthropic** – add your API key and keep one LLM node (delete the other).
4. Open the "Defining guardrails" node and replace the credential placeholders.
5. (Optional) Tweak `role`, `comment_length`, and `openers_example_1-3` for your brand voice.

**Security**: no live keys included – all secrets are placeholders.

**Best for**: solopreneurs, marketing teams, personal-branding consultants.
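A hypothetical sketch of the final publish step: the `X-API-KEY` header and the `dsn`/`account_id` values come from the setup above, but the endpoint path and body fields are assumptions to verify against the Unipile docs.

```javascript
// Hypothetical sketch of publishing a comment via Unipile. The endpoint
// path and body fields are assumptions; check the Unipile API reference.
const DSN = "your-dsn.unipile.com";          // from Unipile setup
const API_KEY = process.env.UNIPILE_API_KEY; // Access Token (X-API-KEY)
const ACCOUNT_ID = "your-linkedin-account-id";

async function publishComment(postId, text) {
  const res = await fetch(`https://${DSN}/api/v1/posts/${postId}/comments`, { // hypothetical path
    method: "POST",
    headers: { "X-API-KEY": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ account_id: ACCOUNT_ID, text }),
  });
  return res.json();
}

publishComment("urn:li:activity:123", "Great point, thanks for sharing!").then(console.log);
```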
by Alex
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How It Works**
This template orchestrates a multi-step workflow that constructs a comprehensive four-zone automation matrix (Green, Yellow, Red, and White) grounded in the Human Agency Scale (HAS). When a user sends a job title via Telegram, the workflow routes both text and voice messages appropriately: voice messages are transcribed via OpenAI's Whisper, while text inputs bypass transcription. Both streams merge into a single data flow.

The AI Agent node, powered by GPT-4.1 mini, analyzes the user's profession and core tasks. It also leverages live context by calling the Tavily search tool, ensuring the analysis incorporates up-to-date information. After the evaluation, the workflow formats the completed matrix, with detailed task examples and rationales for each zone, and returns it to the user via Telegram.

**Setup Instructions**
1. Create an OpenAI credential in n8n (model: GPT-4.1 mini).
2. Add a Tavily credential with your API key (a free plan is available).
3. Configure a Telegram Bot credential with your bot's API token.
4. Import this JSON as a new workflow in n8n and map credentials in each node.
5. Activate the workflow, test it by sending sample job titles, and adjust node timeouts and webhook settings as needed.

**Requirements**
- n8n v1.0.0 or higher
- Active OpenAI API key (GPT-4.1 mini access)
- Tavily API key for web context search
- Telegram Bot token with a correctly configured webhook
- Stable internet connectivity

**Audience & Problem**
This template is designed for consultants, HR professionals, and analysts who need a scalable, standardized approach to evaluate which routine tasks in a given profession can be automated, which require human oversight, and which should remain manual to preserve strategic judgment, creativity, and expertise.
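The text/voice routing reduces to checking which field the Telegram update carries. A sketch as an n8n Code node (the actual workflow does this branching with its own routing node):

```javascript
// Sketch of the routing decision. Telegram updates carry either
// message.text or message.voice.file_id, per the Telegram Bot API.
const update = $input.first().json;
const message = update.message || {};

if (message.voice) {
  // Voice branch: hand the file_id to a download + Whisper transcription step
  return [{ json: { route: "voice", fileId: message.voice.file_id } }];
}

// Text branch: the job title goes straight to the AI Agent node
return [{ json: { route: "text", jobTitle: (message.text || "").trim() } }];
```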
by Emmanuel Bernard
This workflow illustrates how to use Perplexity AI in your n8n workflows. Perplexity is an AI-powered answer engine that provides accurate, trusted, real-time answers to any question.

**Credentials Setup**
1. Go to the Perplexity dashboard, purchase some credits, and create an API key: https://www.perplexity.ai/settings/api
2. In the Perplexity Request node, use Generic Credentials with Header Auth. For the name, use "Authorization"; for the value, use "Bearer pplx-e4...59ea" (your Perplexity API key).

**AI Model**
Sonar Pro is currently Perplexity's top model. If you want to use a different one, check this page: https://docs.perplexity.ai/guides/model-cards
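For reference, a sketch of the request the node sends; Perplexity's endpoint is OpenAI-compatible, and `sonar-pro` matches the model named above (check the model-cards page for current names):

```javascript
// Sketch of the Perplexity chat-completions call the HTTP Request node
// makes, using the same Authorization header described above.
const PPLX_KEY = process.env.PERPLEXITY_API_KEY; // e.g. "pplx-e4...59ea"

async function ask(question) {
  const res = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${PPLX_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "sonar-pro",
      messages: [{ role: "user", content: question }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

ask("What changed in the EU AI Act this month?").then(console.log);
```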
by Yaron Been
Automated monitoring system that tracks startup activities, funding events, and company updates in real time, providing valuable market intelligence.

**🚀 What It Does**
- Real-time monitoring of startup activities
- Funding alerts and updates
- Competitor tracking
- Industry trend analysis
- Customizable watchlists

**🎯 Perfect For**
- Venture capitalists
- Startup founders
- Business development teams
- Market researchers
- Investment analysts

**⚙️ Key Benefits**
✅ Stay ahead of market movements
✅ Never miss important funding rounds
✅ Track competitor activities
✅ Identify emerging trends
✅ Save hours of manual research

**🔧 What You Need**
- Crunchbase API access
- n8n instance
- Notification preferences (email/Slack/Teams)

**📊 Data Points Tracked**
- New funding rounds
- Company updates
- Leadership changes
- Product launches
- Market expansions

**🛠️ Setup & Support**
Quick setup: deploy in 20 minutes with our step-by-step configuration guide.
📺 Watch Tutorial · 💼 Get Expert Support · 📧 Direct Help

Stay informed about the startup ecosystem with automated monitoring and alerts. Make data-driven decisions with timely, relevant information.