by Mary Newhauser
# Build a Weekly AI Trend Alerter with arXiv and Weaviate

Ditch the endless scroll for AI trends. Meet Archi, your personal AI research assistant who hits you up once a week with everything you need to know. 🧑🏽‍🔬

This workflow scrapes AI and machine learning article abstracts from arXiv, enriches them with topic categories using an LLM, and embeds them in a Weaviate vector store. The vector store is then used as a tool for agentic RAG to write a concise, easy-to-read summary of the week in AI research. The final output is a short, weekly email sent to the address of your choice that summarizes key AI research trends and future research directions, with links directly to the most interesting and impactful arXiv papers of the week.

## Who it's for

This workflow is for anyone who can't keep up with all the latest AI advances. Coding skills are not required.

## How it works

This is a contiguous workflow that can be summarized in two main parts: a data pipeline that fetches and embeds articles in Weaviate, and an agentic workflow that generates a weekly email summary.

**Part 1: Automatically fetch newly published articles on a weekly basis**
- Fetch article abstracts (and metadata) from arXiv's free API (see the sketch at the end of this description)
- Pre-process abstract data
- Enrich each article with a primary topic, secondary topics, and estimated potential impact of the research using an LLM
- Post-process data
- Insert data and embeddings into Weaviate

**Part 2: Use an AI Agent and Weaviate to generate a weekly summary email**
- Add Weaviate as a Tool to an AI Agent node
- Query Weaviate, agentically, to generate a report on the most important research trends of the week
- Post-process data
- Send the summary via email

## Prerequisites

- An existing Weaviate cluster. You can view instructions for setting up a local cluster with Docker here or a Weaviate Cloud cluster here.
- API keys to generate embeddings and power chat models. We use a combination of OpenRouter and OpenAI models. Feel free to switch out the models as you like.
- An email address with SMTP privileges. This is the address the email will come from. In this demo we use a personal Gmail address. You can create a new credential to link an SMTP account using these instructions.
- A self-hosted n8n instance. See this video for how to get set up in just three minutes.

## How to run the workflow

1. Go through the prerequisites: create a Weaviate cluster (local or cloud), download self-hosted n8n, set up SMTP privileges for your email account, and add your API keys and other credentials.
2. Select the embedding and chat models you'd like to use.
3. Enter the email addresses you want to send the email from and to.
4. Let it rip.

## Workflow output

The output for this workflow is a weekly email that summarizes key research trends and future research directions based on AI and ML papers published on arXiv. Here's an example of a summary email:

> Hey there,
>
> Here's a quick rundown of the key trends in Machine Learning research from the past week.
>
> **Key Research Trends This Week**
>
> This week saw significant advancements in retrieval-augmented systems, foundation models for specialized domains, and techniques balancing efficiency with performance.
>
> - **Advanced RAG Architectures**: Researchers are developing sophisticated RAG frameworks that go beyond simple document retrieval, with AdaPCR introducing passage combination retrieval and UrbanMind proposing a framework for urban intelligence with multilevel optimization.
> - **Foundation Models for Tabular Data**: Real-TabPFN shows that targeted continued pre-training on real-world datasets can significantly boost the performance of foundation models for tabular data, outperforming models trained on broader, potentially noisier datasets.
> - **Efficiency-Focused Techniques**: Researchers are developing resourceful methods that maintain performance without expensive computations, like logit reweighting for topic-focused summarization and strategic querying for privacy-preserving personalization.
>
> **Future Research Directions**
>
> Based on current trends, we expect to see the following developments in the near future:
>
> - **Explainable RAG Systems**: Following the source attribution work in RAG systems, we can expect more research into making complex retrieval systems transparent and explainable for users.
> - **Cross-Domain and Cross-Modal Fusion**: The promising performance of vision-language and code-specialized LLMs in retrieval tasks points toward unified retrievers capable of handling text, code, images, and multimodal content.
> - **Data-Centric Synthetic Generation**: As shown by work on synthetic relational tabular data, we'll likely see more sophisticated approaches to generating high-quality synthetic data for pre-training foundation models in specialized domains.
>
> This week highlights how researchers are making AI more efficient, explainable, and applicable to specialized domains. Look out for more developments in RAG systems, tabular foundation models, and privacy-preserving AI techniques in the coming weeks.
>
> Until next week,
> Archi

## Want to make it better?

Feel free to tweak, build on, or completely reconfigure this workflow. If you come up with something cool, let us know and we might just share it with our community! 💚
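For the technically curious, here's a minimal sketch of what the Part 1 fetch step looks like against arXiv's free API. The category, result count, and regex-based parsing are illustrative assumptions; the actual workflow node may query different categories and parse the Atom feed with a proper XML parser.

```ts
// Minimal sketch of the "fetch from arXiv" step (Part 1).
// Assumes the public arXiv Atom API; the category and parameters below
// are illustrative, not the template's exact configuration.
const query = new URLSearchParams({
  search_query: "cat:cs.LG",   // machine learning category (assumption)
  sortBy: "submittedDate",
  sortOrder: "descending",
  max_results: "50",
});

const res = await fetch(`http://export.arxiv.org/api/query?${query}`);
const atomXml = await res.text();

// The API returns Atom XML; a real pipeline would use an XML parser.
const abstracts = [...atomXml.matchAll(/<summary>([\s\S]*?)<\/summary>/g)]
  .map((m) => m[1].trim());

console.log(`Fetched ${abstracts.length} abstracts for enrichment`);
```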
by VipinW
# Apply to jobs automatically from Google Sheets with status tracking

## Who's it for

Job seekers who want to streamline their application process, save time on repetitive tasks, and never miss following up on applications. Perfect for anyone managing multiple job applications across different platforms.

## What it does

This workflow automatically applies to jobs from a Google Sheet, tracks application status, and keeps you updated with notifications. It handles the entire application lifecycle from submission to status monitoring.

**Key features:**
- Reads job listings from Google Sheets with filtering by priority and status (see the filter sketch at the end of this description)
- Automatically applies to jobs on LinkedIn, Indeed, and other platforms
- Updates application status in real-time
- Checks application status every 2 days and notifies you of changes
- Sends email notifications for successful applications and status updates
- Prevents duplicate applications and manages rate limiting

## How it works

The workflow runs on two main schedules:

**Daily Application Process (9 AM, weekdays):**
1. Reads your job list from Google Sheets
2. Filters for jobs marked as "Not Applied" with Medium/High priority
3. Processes each job individually to prevent rate limiting
4. Applies to jobs using platform-specific APIs (LinkedIn, Indeed, etc.)
5. Updates the sheet with application status and reference ID
6. Sends a confirmation email for each application

**Status Monitoring (every 2 days at 10 AM):**
1. Checks all jobs with "Applied" status
2. Queries job platforms for application status updates
3. Updates the sheet if the status has changed
4. Sends notification emails for status changes (interviews, rejections, etc.)

## Requirements

- Google account with Google Sheets access
- Gmail account for notifications
- Resume stored online (Google Drive, Dropbox, etc.)
- API access to job platforms (LinkedIn, Indeed) - optional for the basic version
- n8n instance (self-hosted or cloud)

## How to set up

### Step 1: Create Your Job Tracking Sheet

Create a Google Sheet with these exact column headers:

| Job_ID | Company | Position | Status | Applied_Date | Last_Checked | Application_ID | Notes | Job_URL | Priority |
|--------|---------|----------|--------|--------------|--------------|----------------|-------|---------|----------|
| JOB001 | Google | Software Engineer | Not Applied | | | | | https://careers.google.com/jobs/123 | High |
| JOB002 | Microsoft | Product Manager | Not Applied | | | | | https://careers.microsoft.com/jobs/456 | Medium |

**Column explanations:**
- **Job_ID**: Unique identifier (JOB001, JOB002, etc.)
- **Company**: Company name
- **Position**: Job title
- **Status**: Not Applied, Applied, Under Review, Interview Scheduled, Rejected, Offer
- **Applied_Date**: Auto-filled when the application is submitted
- **Last_Checked**: Auto-updated during status checks
- **Application_ID**: Platform reference ID (auto-generated)
- **Notes**: Additional information or application notes
- **Job_URL**: Direct link to the job posting
- **Priority**: High, Medium, Low (Low-priority jobs are skipped)

### Step 2: Configure Google Sheets Access

1. In n8n, go to Credentials → Add Credential
2. Select Google Sheets OAuth2 API
3. Follow the OAuth setup process to authorize n8n
4. Test the connection with your job tracking sheet

### Step 3: Set Up Gmail Notifications

1. Add another credential for Gmail OAuth2 API
2. Authorize n8n to send emails from your Gmail account
3. Test by sending a sample email

### Step 4: Update Workflow Configuration

In the "Set Configuration" node, update these values:
- **spreadsheetId**: Your Google Sheet ID (found in the URL)
- **resumeUrl**: Direct link to your resume (make sure it's publicly accessible)
- **yourEmail**: Your email address for notifications
- **coverLetterTemplate**: Customize your cover letter template

### Step 5: Customize Application Logic

**For the basic version (no API access):** The workflow includes placeholder HTTP requests that you can replace with actual job platform integrations.

**For the advanced version (with API access):**
- Replace LinkedIn/Indeed HTTP nodes with actual API calls
- Add your API credentials to n8n's credential store
- Update the platform detection logic for additional job boards

### Step 6: Test and Activate

1. Add 1-2 test jobs to your sheet with "Not Applied" status
2. Run the workflow manually to test
3. Check that the sheet gets updated and you receive notifications
4. Activate the workflow to run automatically

## How to customize the workflow

### Adding New Job Platforms

1. **Update Platform Detection**: Modify the "Check Platform Type" node to recognize new job board URLs
2. **Add New Application Node**: Create HTTP request nodes for new platforms
3. **Update Status Checking**: Add status check logic for the new platform

### Customizing Application Strategy

- **Rate Limiting**: Add "Wait" nodes between applications (recommended: 5-10 minutes)
- **Application Timing**: Modify the cron schedule to apply during optimal hours
- **Priority Filtering**: Adjust the filter conditions to match your criteria
- **Multiple Resumes**: Use conditional logic to select different resumes based on job type

### Enhanced Notifications

- **Slack Integration**: Replace Gmail nodes with Slack for team notifications
- **Discord Webhooks**: Send updates to Discord channels
- **SMS Notifications**: Use Twilio for urgent status updates
- **Dashboard Updates**: Connect to Notion, Airtable, or other productivity tools

### Advanced Features

- **AI-Powered Personalization**: Use OpenAI to generate custom cover letters
- **Job Scoring**: Implement scoring logic based on job requirements vs. your skills
- **Interview Scheduling**: Auto-schedule interviews when status changes
- **Follow-up Automation**: Send follow-up emails after specific time periods

## Important Notes

### Platform Compliance

- Always respect rate limits to avoid being blocked
- Follow each platform's Terms of Service
- Use official APIs when available instead of web scraping
- Don't spam job boards with excessive applications

### Data Privacy

- Store credentials securely using n8n's credential store
- Don't hardcode API keys or personal information in nodes
- Regularly review and clean up old application data
- Ensure your resume link is secure but accessible

### Quality Control

- Start with a small number of jobs to test the workflow
- Review application success rates and adjust strategy
- Monitor for errors and set up proper error handling
- Keep your job list updated and remove expired postings

This workflow transforms job searching from a manual, time-consuming process into an automated system that maximizes your application efficiency while maintaining quality and compliance.
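As referenced in the key features list, here's a rough sketch of the daily filter logic, for example as an n8n Code node. The row type mirrors the tracking sheet's columns; the sample rows are hypothetical.

```ts
// A rough sketch of the daily filter step: jobs marked "Not Applied"
// with Medium or High priority. Column names match the tracking sheet.
type JobRow = {
  Job_ID: string;
  Company: string;
  Position: string;
  Status: string;
  Priority: "High" | "Medium" | "Low";
  Job_URL: string;
};

function jobsToApply(rows: JobRow[]): JobRow[] {
  return rows.filter(
    (row) =>
      row.Status === "Not Applied" &&
      (row.Priority === "High" || row.Priority === "Medium") // Low is skipped
  );
}

const rows: JobRow[] = [
  { Job_ID: "JOB001", Company: "Google", Position: "Software Engineer",
    Status: "Not Applied", Priority: "High",
    Job_URL: "https://careers.google.com/jobs/123" },
  { Job_ID: "JOB003", Company: "Acme", Position: "Designer",
    Status: "Applied", Priority: "High", Job_URL: "https://example.com" },
];

console.log(jobsToApply(rows).map((j) => j.Job_ID)); // ["JOB001"]
```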
by Lucas Peyrin
## How it works

Ever wonder how to make your workflows smarter? How to handle different types of data in different ways? This template is a hands-on tutorial that teaches you the three most fundamental nodes for controlling the flow of your automations: Merge, IF, and Switch.

To make it easy to understand, we use a simple package sorting center analogy:
- **Data Items** are packages on a conveyor belt.
- **The Merge Node** is where multiple conveyor belts combine into one.
- **The IF Node** is a simple sorting gate with two paths (e.g., "Fragile" or "Not Fragile").
- **The Switch Node** is an advanced sorting machine that routes packages to many different destinations.

This workflow takes you on a step-by-step journey through the sorting center:
1. **Creating Packages**: Three different "packages" (two letters and one parcel) are created using Set nodes.
2. **Merging**: The first Merge node combines all three packages onto a single conveyor belt so they can be processed together.
3. **Simple Sorting**: An IF node checks if a package is fragile. If true, it's sent down one path; if false, it's sent down another.
4. **Re-Grouping**: After being processed separately, another Merge node brings the packages back together. This "Split > Process > Merge" pattern is a critical concept in n8n!
5. **Advanced Sorting**: A Switch node inspects each package's destination and routes it to the correct output (London, New York, Tokyo, or a Default bin).

By the end, you'll see how all packages have been correctly sorted, and you'll have a solid understanding of how to build intelligent, branching logic in your own workflows. (For a code analogy of the IF and Switch logic, see the sketch below.)

## Set up steps

Setup time: 0 minutes! This template is a self-contained tutorial and requires zero setup. There are no credentials or external services to configure.

1. Simply click the "Execute Workflow" button.
2. Follow the flow from left to right, clicking on each node to see its output and reading the detailed sticky notes to understand what's happening at each stage.
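As mentioned above, here is a rough code analogy (not part of the template itself) for how the IF and Switch nodes route items; the package shape and city names mirror the tutorial's example.

```ts
// A code analogy for the IF and Switch nodes in the sorting-center tutorial.
type Pkg = { id: string; fragile: boolean; destination: string };

const belt: Pkg[] = [
  { id: "letter-1", fragile: false, destination: "London" },
  { id: "letter-2", fragile: true, destination: "Tokyo" },
  { id: "parcel-1", fragile: true, destination: "Oslo" },
];

for (const pkg of belt) {
  // IF node: one condition, exactly two output paths.
  const path = pkg.fragile ? "fragile handling" : "standard handling";

  // Switch node: one value, many outputs, plus a fallback ("Default bin").
  let bin: string;
  switch (pkg.destination) {
    case "London":   bin = "Output 1"; break;
    case "New York": bin = "Output 2"; break;
    case "Tokyo":    bin = "Output 3"; break;
    default:         bin = "Default bin";
  }
  console.log(`${pkg.id}: ${path} -> ${bin}`);
}
```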
by Amit Mehta
## How it Works

This workflow automates the collection and analysis of YouTube comments from a video and sends a summary report via email, using Google Sheets, the YouTube API, OpenAI (GPT-4o), and Gmail.

Whether you're a content creator, brand manager, or social media analyst, this workflow helps you automate sentiment analysis and receive insights directly in your inbox — all triggered from a simple spreadsheet.

## 🎯 Use Case

Ideal for:
- **YouTubers** monitoring audience sentiment
- **Marketing teams** analyzing campaign feedback
- **Community managers** summarizing engagement

## Setup Instructions

### 1. Upload the Spreadsheet

- File name: Youtube_Video
- Sheet structure:

| ID | Video Title | YouTube Video ID | Status |
|----|-------------|------------------|--------|

- Add video IDs and set their Status as Pending

### 2. Configure Google Sheets Nodes

Connect your Google account to:
- Pick Video IDs from Google Sheet
- Update Status on Google Sheet

### 3. Add API Credentials

- **YouTube API Key** → for comment + video scraping nodes (see the sketch at the end of this description)
- **OpenAI API Key** → for analyzing comments
- **Gmail Account** → for sending the summary email

### 4. Activate the Workflow

Once live, the workflow will:
1. Watch for new or updated rows in the spreadsheet
2. Scrape comments using the YouTube API
3. Analyze sentiment and key themes via GPT-4o
4. Send a formatted HTML email with the summary
5. Update the spreadsheet status to Mail sent

## 🔁 Workflow Logic

1. **Trigger**: New/updated row in Google Sheet
2. **Retrieve**: YouTube video metadata + comments
3. **Analyze**: Comments using GPT-4o
4. **Email**: Summary report via Gmail
5. **Update**: Spreadsheet status to Mail sent

## 🧩 Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Pick Video IDs from Google Sheet | Watches the spreadsheet and retrieves pending video IDs |
| If | Checks whether status is 'Pending' |
| Limit | Restricts the number of processed rows |
| Set Video Details | Prepares video info (e.g., title, channel) |
| Get YouTube Video Details | Fetches metadata (title, channel, etc.) |
| Get YouTube Video Comments | Pulls top-level comments using the YouTube API |
| Prepare Comments Data | Formats comment text for OpenAI |
| AI Agent | Summarizes comments using OpenAI's GPT-4o |
| Prepare HTML for Email | Converts the summary into HTML for the email body |
| Gmail Account Configuration | Sends the email report via Gmail |
| Update Status on Google Sheet | Marks the row as 'Mail sent' |

## 🛠️ Customization Tips

- Change the AI prompt for tone, length, or custom metrics
- Send results to Slack or Telegram instead of Gmail
- Export summaries to Notion, Airtable, or PDF
- Schedule it daily/weekly for recurring analysis

## 📒 Suggested Sticky Notes for Workflow

| Node/Section | Sticky Note Content |
|--------------|---------------------|
| Pick Video IDs from Google Sheet | "Triggers on new YouTube videos in your spreadsheet" |
| AI Agent | "Uses OpenAI to generate an analysis summary – customize prompt as needed" |
| Gmail | "Sends summary report – you can update subject, recipients, or style" |
| Update Status | "Marks video as processed to avoid duplicate runs" |

## 📎 Required Files

| File Name | Purpose |
|-----------|---------|
| Youtube_Video | Google Sheet to hold YouTube video IDs and status |
| Youtube_Comment_Scraper.json | Main n8n workflow export for this automation |

## 🧪 Testing Tips

1. Add one test video with a valid YouTube video ID and status = Pending
2. Monitor the workflow logs to confirm API responses
3. Confirm summary delivery in your inbox
4. Verify that the status updates in the sheet

## 🏷 Suggested Tags & Categories

#YouTube #OpenAI #Automation #Marketing #Email #Analytics
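For reference, here's a minimal sketch of what the comment-scraping node does, assuming the public YouTube Data API v3 commentThreads endpoint; the API key and video ID are placeholders, and the real node may paginate through more results.

```ts
// Minimal sketch of the comment-scraping step using the YouTube Data
// API v3 commentThreads endpoint. Key and video ID are placeholders.
const params = new URLSearchParams({
  part: "snippet",
  videoId: "VIDEO_ID",   // from the Google Sheet row
  maxResults: "100",
  key: "YOUTUBE_API_KEY",
});

const res = await fetch(
  `https://www.googleapis.com/youtube/v3/commentThreads?${params}`
);
const data = await res.json();

// Flatten top-level comments into plain text for the OpenAI prompt.
const comments: string[] = data.items.map(
  (item: any) => item.snippet.topLevelComment.snippet.textOriginal
);
console.log(`Prepared ${comments.length} comments for analysis`);
```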
by Jimleuk
This n8n template lets you summarize individual team member activity on MS Teams for the past week and generates a report.

For remote teams, chat is a crucial communication tool to ensure work gets done, but with so many conversations happening at once and in multiple threads, ideas, information and decisions usually live in the moment and get lost just as quickly - and are altogether forgotten by the weekend! Using this template, this doesn't have to be the case. Have AI crawl through last week's activity, summarize all messages and replies and generate a casual and snappy report to bring the team back into focus for the current week. A project manager's dream!

## How it works

- A scheduled trigger is set to run every Monday at 6am to gather all team channel messages within the last week.
- Messages are grouped by user (see the sketch at the end of this description).
- AI analyses the raw messages and replies to pull out interesting observations and highlights. These are referred to as the individual reports.
- All individual reports are then combined and summarized together into what becomes the team weekly report. This allows understanding of group and similar activities.
- Finally, the team weekly report is posted back to the channel. The timing is important as it should be the first message of the week, ready for the team to glance over coffee.

## How to use

- Works best per project and where most of the comms happen on a single channel. Avoid combining channels; instead, duplicate this workflow for additional channels.
- You may need to filter for specific team members if you want specific team updates.
- Customise the report to suit your organisation, team or channel. You may prefer to be more formal if clients or external stakeholders are also present.

## Requirements

- MS Teams for chat platform
- OpenAI for LLM

## Customising this workflow

- If the Teams channel is busy enough already, consider posting the final report to email.
- Pull in project metrics to include in your report. As extra context, it may be interesting to tie the messages to production performance.
- Use an AI Agent to query a knowledgebase or tickets relevant to the messages. This may be useful for attaching links or references to add context.
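As referenced above, here's a minimal sketch of the group-by-user step, for example inside an n8n Code node. The message shape is a simplifying assumption; the Microsoft Graph API returns much richer objects.

```ts
// Minimal sketch of grouping channel messages by user. Each user's
// bucket later becomes the context for that user's individual AI report.
type TeamsMessage = { from: string; text: string; replies: string[] };

const messages: TeamsMessage[] = [
  { from: "alice", text: "Shipped the API fix", replies: ["Nice!"] },
  { from: "bob", text: "Design review at 3pm", replies: [] },
  { from: "alice", text: "Writing the release notes", replies: [] },
];

const byUser = new Map<string, string[]>();
for (const msg of messages) {
  const bucket = byUser.get(msg.from) ?? [];
  bucket.push(msg.text, ...msg.replies);
  byUser.set(msg.from, bucket);
}

for (const [user, texts] of byUser) {
  console.log(`${user}: ${texts.length} messages/replies to summarize`);
}
```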
by Chris Carr
# Split Test Agent Prompts with Supabase and OpenAI

## Use Case

Oftentimes, it's useful to test different settings for a large language model in production against various metrics. Split testing is a good method for doing this.

## What it Does

This workflow randomly assigns chat sessions to one of two prompts: the baseline and the alternative. The agent will use the same prompt for all interactions in that chat session.

## How it Works

- When a message arrives, a table containing the session ID and which prompt to use is checked to see if the chat session already exists.
- If it does not, the session ID is added to the table and a prompt is randomly assigned (see the sketch at the end of this description).
- These values are then used to generate a response.

## Setup

1. Create a table in Supabase called split_test_sessions. It needs to have the following columns: session_id (text) and show_alternative (bool).
2. Add your Supabase, OpenAI, and PostgreSQL credentials.
3. Modify the Define Path Values node to set the baseline and alternative prompt values.
4. Activate the workflow and test by sending messages through n8n's inbuilt chat.
5. Experiment with different chat sessions to see both prompts in action.

## Next Steps

- Modify the workflow to test different LLM settings such as temperature.
- Add a method to measure the efficacy of the two alternative prompts.
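As referenced above, here's a minimal sketch of the session-assignment logic using supabase-js; the project URL, key, and error handling are simplified placeholders, and the table and column names match the setup steps.

```ts
// Minimal sketch of sticky prompt assignment per chat session.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient("https://YOUR_PROJECT.supabase.co", "ANON_KEY");

async function getPromptVariant(sessionId: string): Promise<boolean> {
  // Check whether this chat session was already assigned a variant.
  const { data } = await supabase
    .from("split_test_sessions")
    .select("show_alternative")
    .eq("session_id", sessionId)
    .maybeSingle();

  if (data) return data.show_alternative;

  // New session: flip a fair coin and persist the assignment so every
  // later message in this session uses the same prompt.
  const showAlternative = Math.random() < 0.5;
  await supabase
    .from("split_test_sessions")
    .insert({ session_id: sessionId, show_alternative: showAlternative });
  return showAlternative;
}
```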
by Dhruv from Saleshandy
This n8n template captures every "Request a Demo" booking in Calendly, uses OpenAI to score and qualify leads in real time, routes them into the correct Saleshandy sequence, and logs all data in Google Sheets for full GTM visibility.

Use cases include:
- Empowering SDR teams to focus on high-value demos
- Providing growth marketers with reliable funnel metrics
- Automating triage for B2B AE teams overwhelmed by demo requests

## Good to know

- OpenAI GPT-4 calls cost based on token usage; you can expect ~1,200 tokens per lead.
- The Calendly API rate-limits at 180 requests/min; consider batching if volume spikes.
- Google Sheets writes are single-threaded; high-volume users may opt for Airtable or BigQuery.

## How it works

1. **Capture** – A Webhook node listens for every new "Request a Demo" form submission in Calendly.
2. **Score** – An AI Agent node sends job title, company size, domain quality, and custom questions to OpenAI; it returns a 1–10 score plus a label (Qualified/Semi-qualified/Unqualified).
3. **Verify meeting** – An HTTP Request node confirms via the Calendly API that a slot was actually scheduled.
4. **Route** – A Switch node selects the appropriate Saleshandy sequence ID (Qualified, Nurture, Disqualify).
5. **Send** – HTTP Request nodes add each prospect to the chosen Saleshandy sequence.
6. **Log** – Google Sheets nodes write to three tabs (Qualified, Semi-qualified, Unqualified) with lead data, score, routing path, and timestamp.

## Prerequisites

- n8n workspace
- Accounts & API credentials for:
  - Calendly
  - OpenAI (GPT-4 or GPT-3.5)
  - Google Sheets
  - Saleshandy

## Step-by-Step Setup

### 1. Import the n8n Template

Upload the JSON file into your n8n workspace.

### 2. Add Required Credentials

In n8n → Credentials, add:
- Calendly: Personal Access Token (PAT)
- OpenAI: API Key
- Google Sheets: OAuth2 connection
- Saleshandy: API Key

### 3. Calendly Setup

1. Go to the Calendly Webhook Docs.
2. Create a Routing Form in Calendly.
3. Generate your access token.
4. Use Postman or any API client to:
   - Make a POST request to create a webhook subscription (see the sketch at the end of this description).
   - Use your n8n webhook URL in the url field.
   - Add your Authorization token and extract the Organization ID.
5. Paste the webhook URL into the Calendly Routing Form.

### 4. Set Your Saleshandy Sequences

1. In n8n, locate the Set: Sequence IDs node.
2. Replace the placeholder text with your actual Qualified, Semi-qualified, and Unqualified Saleshandy sequence step IDs.

### 5. Configure Google Sheets

1. Create a spreadsheet with the following tabs: Qualified, Semi-qualified, Unqualified.
2. In n8n, connect the three Google Sheets nodes to this file.

## Customising this workflow

- **Adjust scoring logic** – Modify the OpenAI prompt in the AI Agent node to weight ARR, industry, or headcount differently.
- **Refine thresholds** – Change the Switch node rules for score ranges (e.g., Qualified ≥8, Semi-qualified 5–7).
- **Swap destinations** – Edit HTTP Request nodes to integrate with your CRM or email platform instead of Saleshandy.
- **Enhance logging** – Replace Google Sheets with Airtable, BigQuery, or another analytics store.
- **Add notifications** – Insert Slack or Microsoft Teams nodes after routing to alert reps instantly.
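As referenced in step 3, here's a minimal sketch of the webhook-subscription request, assuming Calendly's v2 REST API; the event list, organization URI, and n8n webhook URL are placeholders for your own values.

```ts
// Minimal sketch of creating the Calendly webhook subscription (step 3).
const res = await fetch("https://api.calendly.com/webhook_subscriptions", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_CALENDLY_PAT",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    url: "https://your-n8n-instance/webhook/demo-requests", // n8n Webhook node URL
    events: ["invitee.created"], // fires when a demo slot is booked
    organization: "https://api.calendly.com/organizations/YOUR_ORG_ID",
    scope: "organization",
  }),
});

console.log(res.status, await res.json());
```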
by David Olusola
# 📊 Google Sheets MCP Workflow – AI Meets Spreadsheets! 😄

## ✨ What It Does

This n8n workflow lets you chat with your spreadsheets using AI + MCP! From reading and updating data to creating sheets, it's your smart assistant for Google Sheets 📈🤖

## 🚀 Cool Features

- 💬 Natural language commands (e.g. "Add a new lead: John Doe")
- ✏️ Full CRUD (Create, Read, Update, Delete)
- 🧠 AI-powered analysis & smart workflows
- 🗂️ Multi-sheet support
- 🔗 Works with ChatGPT, Claude, and more (via MCP)

## 💡 Use Cases

- Data Tasks: "Update status to 'Done' in row 3"
- Sheet Ops: "Create a 'Marketing 2024' sheet"
- Business Flows: "Summarize top sales from Q2"

## 🛠️ Quick Setup

1. **Import Workflow into n8n**
   - Copy the JSON
   - In n8n → Import JSON → Paste & Save ✅
2. **Connect Google Sheets**
   - Create a project in Google Cloud
   - Enable the Sheets & Drive APIs
   - Create OAuth2 credentials
   - In n8n → Add Google Sheets OAuth2 credential → Connect 🔐
3. **Add Your Credentials**
   - Get your credential ID
   - Open each Google Sheets node → Update with your new credential ID
4. **Link to AI (Optional 😊)**
   - The MCP webhook is pre-set
   - Plug it into your AI tool (like ChatGPT)
   - Send a test command → Watch the magic happen ✨

## ✅ Test It Out

Try these fun commands:
- 🆕 "Add entry: Jane Doe, janed@example.com"
- 🔍 "Read data from Sales 2024"
- 🧹 "Clear data from A1:C5"
- ➕ "Create sheet 'Budget 2025'"
- ❌ "Delete sheet 'Test'"

## 🧠 MCP Command List (AI-Callable Functions)

These are the tasks the AI can perform via MCP (the sketch at the end of this description shows how one of them maps to the Sheets API):
- Add a new entry to a sheet
- Read data from a sheet
- Update a row in a sheet
- Delete a row from a sheet
- Create a new sheet
- Delete an existing sheet
- Clear data from a specific range
- Summarize data from a sheet using AI

## ⚙️ Tips & Fixes

- **OAuth2 Errors?**
  - Re-authenticate and check scopes
  - Confirm the redirect URI is exact
- **Permissions?**
  - The spreadsheet must be shared with edit access
  - Use service accounts for production
- **Webhook Not Firing?**
  - Double-check the URL
  - Trigger it manually to test
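As referenced in the command list, here's a minimal sketch of what the "add a new entry" command does under the hood, assuming the Google Sheets v4 REST API; in the workflow this is handled by the Google Sheets node rather than a raw HTTP call, and the IDs, tab name, and token below are placeholders.

```ts
// Minimal sketch of appending a row, as the "add entry" command would.
const spreadsheetId = "YOUR_SPREADSHEET_ID";
const range = "Sales 2024!A:B"; // tab + columns to append into

const res = await fetch(
  `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}/values/` +
    `${encodeURIComponent(range)}:append?valueInputOption=USER_ENTERED`,
  {
    method: "POST",
    headers: {
      Authorization: "Bearer OAUTH2_ACCESS_TOKEN",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      values: [["Jane Doe", "janed@example.com"]], // one appended row
    }),
  }
);
console.log(res.status, await res.json());
```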
by Zacharia Kimotho
This workflow makes it easier to keep track of the stock market: it scrapes financial data on a schedule and sends you an email summarizing the daily highlights, key insights and trends.

## Setup Guide

1. **Schedule Trigger**: Define the schedule (days, times, intervals).
2. **Sample stock data**: Replace the sample stock data with your desired stock list (ticker, name, etc.) in JSON format.
3. **Split Out**: Split out the fields to get a clean list of the stocks to monitor.
4. **set keyword node**: Extracts the stock ticker from each item and sets it to the keyword property.
5. **Financial Times scraper**: Triggers the Bright Data Datasets API to scrape financial data (see the sketch at the end of this description). Set the node as below:
   - Method: POST
   - URL: https://api.brightdata.com/datasets/v3/trigger
   - Query Parameters:
     - dataset_id: Replace with your Bright Data dataset ID.
     - include_errors: true
     - type: discover_new
     - discover_by: keyword
   - Headers:
     - Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
   - Body (JSON): ={{ $('set keyword').all().map(item => item.json) }}
   - Execute Once: Checked.
6. **Get progress node**: Checks whether the Bright Data scraping job is complete or still running. Setup:
   - URL: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
   - Headers:
     - Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
7. **Get snapshot + data**: Retrieves the scraped data from the Bright Data API. Pass the request as:
   - URL: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
   - Query Parameters:
     - format: json
   - Headers:
     - Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
8. **Aggregate**: Combines the data from each stock item into a single object.
9. **Update to sheet**: Adds all items to this sheet. Make a copy of the sheet before you map the data.
10. **create summary node**: Generates a summary of the scraped stock data using the Google Gemini AI model and notifies you via Gmail. Setup:
    - Prompt Type: define
    - Text: Customize the prompt to define the AI's role, input format, tasks, output format (HTML email), and constraints.
11. **Google Sheets**: Appends the scraped data to a Google Sheet. This should be set to automap so it adjusts to the results found in the request.

## Important Notes

- Remember to replace placeholder values (API keys, dataset IDs, email addresses, Google Sheet IDs) with your actual values.
- Review and customize the AI prompt in the "create summary" node to achieve the desired email summary output.
- Consider adding error handling for a more robust workflow.
- Monitor API usage to avoid rate limits.
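As referenced in step 5, here's a minimal sketch of the trigger → poll → fetch sequence that the three Bright Data HTTP Request nodes perform; the dataset ID, API key, status values, and polling interval are placeholder assumptions.

```ts
// Minimal sketch of the Bright Data trigger -> poll -> fetch sequence.
const BASE = "https://api.brightdata.com/datasets/v3";
const headers = { Authorization: "Bearer YOUR_BRIGHTDATA_API_KEY" };

async function scrape(keywords: { keyword: string }[]) {
  // 1. Trigger the scrape (the "Financial Times scraper" node).
  const trigger = await fetch(
    `${BASE}/trigger?dataset_id=YOUR_DATASET_ID&include_errors=true` +
      `&type=discover_new&discover_by=keyword`,
    {
      method: "POST",
      headers: { ...headers, "Content-Type": "application/json" },
      body: JSON.stringify(keywords),
    }
  );
  const { snapshot_id } = await trigger.json();

  // 2. Poll until the job finishes (the "Get progress" node).
  let status = "running";
  while (status === "running") {
    await new Promise((r) => setTimeout(r, 30_000)); // wait 30s between checks
    const progress = await fetch(`${BASE}/progress/${snapshot_id}`, { headers });
    status = (await progress.json()).status;
  }

  // 3. Download the results (the "Get snapshot + data" node).
  const snapshot = await fetch(`${BASE}/snapshot/${snapshot_id}?format=json`, {
    headers,
  });
  return snapshot.json();
}

scrape([{ keyword: "AAPL" }, { keyword: "MSFT" }]).then((rows) =>
  console.log(`Scraped ${rows.length} records`)
);
```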
by Julian Ivanov
## How it works

- This workflow automates the transformation of standard product images into professional product photography featuring human models.
- It uses AI to analyze product images, create tailored photography prompts, and generate high-quality enhanced versions.

## Set up steps

1. You'll need an OpenAI API key and access to gpt-image-1 (verify your organization).
2. Set up a Google Sheets spreadsheet with columns: Image-URL, Prompt, Output.
3. Create a Google Drive folder to store the generated images.

## Requirements

- OpenAI API access (for image generation and analysis)
- Google Sheets and Google Drive accounts
- Basic product images (URLs) as input
- The spreadsheet must contain a column named "Image-URL" with links to the product images

This workflow automatically:
1. Reads product image URLs from your Google Sheet
2. Downloads the images for processing
3. Analyzes each image to understand what product it contains
4. Creates specialized photography prompts ensuring each product is shown with a human model
5. Generates professional product photography using OpenAI's image generation capabilities (see the sketch at the end of this description)
6. Uploads results to Google Drive and updates your spreadsheet with links

**Extra:** You can also use the included simple image generation workflow to directly create images via prompt without product image input. This option lets you quickly generate images through the OpenAI API using just text prompts.
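As referenced above, here's a minimal sketch of the text-to-image call, assuming OpenAI's images endpoint with the gpt-image-1 model; the prompt is illustrative, and in the main workflow it is built from the image-analysis step rather than written by hand.

```ts
// Minimal sketch of the image generation step with gpt-image-1.
const res = await fetch("https://api.openai.com/v1/images/generations", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_OPENAI_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-image-1",
    prompt:
      "Professional studio product photo of a ceramic mug, " +
      "held by a smiling model, soft natural lighting",
    size: "1024x1024",
  }),
});

const data = await res.json();
// gpt-image-1 returns base64 image data to decode and upload to Drive.
const imageB64: string = data.data[0].b64_json;
console.log(`Generated image, ${imageB64.length} base64 chars`);
```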
by n8n Team
This workflow digests mentions of n8n on Reddit and sends a single email or Slack summary each week. We use OpenAI to classify whether a specific Reddit post is really about n8n or not, and then summarise it into a bullet-point sentence.

## How it works

1. Get posts from Reddit that might be about n8n;
2. Filter for the most relevant posts (posted in the last 7 days, more than 5 upvotes, and original content; see the sketch at the end of this description);
3. Check if the post is actually about n8n;
4. If it is, categorise it with OpenAI.

## Bear in mind

The workflow only considers the first 500 characters of each Reddit post. So if n8n is mentioned after this point, the post won't register as being about n8n.io.

## Next steps

- Improve the OpenAI Summary node prompt to return cleaner summaries;
- Extend to more platforms/sources - e.g. it would be really cool to monitor larger Slack communities in this way;
- Do some classification on the type of user to highlight users likely to be in our ICP;
- Separate out a list of data sources (Reddit, Twitter, Slack, Discord, etc.), extract messages from there and have them go to a sub-workflow for classification and summarisation.
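As referenced in the filter step, here's a minimal sketch of the relevance filter; the post shape mirrors a few fields of Reddit's listing API, and the thresholds come straight from the description above.

```ts
// Minimal sketch of the relevance filter and the 500-character cutoff.
type RedditPost = {
  title: string;
  selftext: string;
  ups: number;
  is_original_content: boolean;
  created_utc: number; // Unix seconds
};

const SEVEN_DAYS = 7 * 24 * 60 * 60;

function isRelevant(post: RedditPost, nowUtc: number): boolean {
  return (
    nowUtc - post.created_utc <= SEVEN_DAYS &&
    post.ups > 5 &&
    post.is_original_content
  );
}

// Only the first 500 characters are sent for classification, which is
// why late mentions of n8n can be missed.
function classificationInput(post: RedditPost): string {
  return `${post.title}\n${post.selftext}`.slice(0, 500);
}
```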
by M Shehroz Sajjad
## What problem does it solve?

Manual candidate screening is time-consuming and inconsistent. This workflow automates initial interviews, providing 24/7 availability, consistent questioning, and objective assessments for every candidate.

## Who is it for?

- HR teams handling high-volume recruiting
- Small businesses without dedicated recruiters
- Companies scaling their hiring process
- Remote-first organizations needing asynchronous screening

## What this workflow does

Creates AI interviewers from job descriptions that conduct natural conversations with candidates via BeyondPresence Agents. Automatically analyzes interviews and saves structured assessments to Google Sheets.

## Setup

1. Copy the template sheet: BeyondPresence HR Interview System Template
2. Add credentials:
   - BeyondPresence API Key
   - OpenAI API
   - Google Sheets
3. Configure the webhook in the BeyondPresence dashboard: https://[your-n8n-instance]/webhook/beyondpresence-hr-interviews
4. Paste the job description and run setup
5. Share the generated link with candidates

## How it works

1. **Agent Creation**: Converts the job description into a conversational AI interviewer
2. **Interview Conduct**: Candidates chat naturally with the AI via the shared link
3. **Webhook Trigger**: Completed interviews are sent to n8n
4. **AI Analysis**: OpenAI evaluates responses against the job requirements (see the sketch at the end of this description)
5. **Results Storage**: Assessments are saved to Google Sheets with scores and recommendations

## Resources

- Google Sheets Template
- BeyondPresence Documentation
- Webhook Setup Guide

## Example Use Case

A tech startup screens 200 applicants for an engineering role. It creates an AI interviewer in 2 minutes and sends the link to all candidates. Within 24 hours it receives structured assessments identifying the top 20 candidates for human interviews, reducing initial screening time from 2 weeks to 2 days.
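As referenced in the AI Analysis step, here's a minimal sketch of the assessment call, assuming OpenAI's chat completions API; the transcript variable, rubric wording, and output shape are illustrative, not the template's exact prompt.

```ts
// Minimal sketch of scoring an interview transcript with OpenAI.
const transcript = "TRANSCRIPT_FROM_WEBHOOK";     // placeholder
const jobRequirements = "JOB_DESCRIPTION_TEXT";    // placeholder

const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: "Bearer YOUR_OPENAI_API_KEY",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content:
          "You are an HR analyst. Score the candidate 1-10 against the " +
          "job requirements and return JSON: {score, strengths, recommendation}.",
      },
      {
        role: "user",
        content: `Requirements:\n${jobRequirements}\n\nTranscript:\n${transcript}`,
      },
    ],
    response_format: { type: "json_object" },
  }),
});

const assessment = JSON.parse((await res.json()).choices[0].message.content);
console.log(assessment.score, assessment.recommendation); // row for Sheets
```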