by Rahul Joshi
**Description**
Process new resumes from Google Drive, extract structured candidate data with AI, save the results to Google Sheets, and auto-create a ClickUp hiring task. You get a centralized, searchable candidate database and instant task kickoff, with no manual data entry.

**🚀 What This Template Does**
- Watches a Google Drive folder for new resume PDFs and triggers the workflow. 📂
- Downloads the file and converts the PDF to clean, readable text. 📄
- Analyzes the resume text with an AI Resume Analyzer to extract structured candidate info (name, email, phone, experience, skills, education). 🤖
- Cleans and validates the AI's JSON output for reliability. 🧹
- Appends or updates a candidate row in Google Sheets and creates a ClickUp hiring task. ✅

**Key Benefits**
- Save hours with end-to-end, hands-off resume processing. ⏱️
- Never miss a candidate: every upload triggers automatically. 🔔
- Keep a single source of truth in Sheets, always up to date. 📊
- Kickstart hiring instantly with auto-created ClickUp tasks. 🗂
- Works with varied resume formats thanks to AI extraction. 🧠

**Features**
- Google Drive "Watch for New Resumes" trigger (runs every minute). ⏲
- PDF-to-text extraction optimized for text-based PDFs. 📘
- AI-powered resume parsing into standardized JSON fields. 🧩
- JSON cleanup and validation for safe storage. 🧰
- Google Sheets append-or-update for a central candidate database. 📑
- ClickUp task creation with candidate-specific titles and assignment. 🎯

**Requirements**
- n8n instance (cloud or self-hosted); n8n version 1.106.3 or higher recommended. 🔧
- Google Drive access to a dedicated resumes folder (PDF resumes recommended). 📂
- Google Sheets credential with edit access to the candidate database sheet. 📈
- ClickUp workspace/project access to create hiring tasks. 📌
- AI service credentials for the Resume Analyzer step (add them in n8n Credentials). 🤖

**Target Audience**
- HR and Talent Acquisition teams needing faster screening. 👥
- Recruiters and staffing agencies handling high volumes. 🏢
- Startups and ops teams standardizing candidate intake. 🚀
- No-code/low-code builders automating hiring workflows. 🧩

**Step-by-Step Setup Instructions**
1. Connect Google Drive, Google Sheets, ClickUp, and your AI service in n8n Credentials. 🔐
2. Set the Google Drive "watched" folder (e.g., Resume_store). 📁
3. Import the workflow, assign credentials to all nodes, and map your Sheets columns. 🗂️
4. Adjust the ClickUp task details (title pattern, assignee, list). 📝
5. Run once with a sample PDF to test, then enable scheduling (every 1 minute). ▶️
6. Optionally rename the email/task nodes for clarity (e.g., "Create Hiring Task in ClickUp"). ✍️
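The "clean and validate the AI JSON output" step is typically a small Code node. Here is a minimal JavaScript sketch, assuming the AI returns the six fields listed above; the template's exact schema and error handling may differ:

```javascript
// Hypothetical sketch of the JSON cleanup/validation step.
// Field names (name, email, ...) are assumptions based on the description above.
function cleanAiJson(raw) {
  // AI models often wrap JSON in markdown fences; strip them first.
  const stripped = raw.replace(/```(?:json)?/g, '').trim();
  let data;
  try {
    data = JSON.parse(stripped);
  } catch (e) {
    return { valid: false, error: 'Unparseable JSON from AI' };
  }
  // Require the core candidate fields before writing to Sheets.
  const required = ['name', 'email', 'phone', 'experience', 'skills', 'education'];
  const missing = required.filter((f) => !(f in data));
  if (missing.length > 0) {
    return { valid: false, error: 'Missing fields: ' + missing.join(', ') };
  }
  return { valid: true, data };
}
```

A guard like this keeps a malformed AI response from producing a broken row in the candidate sheet.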
by DevCode Journey
**Who is this for?**
This n8n workflow is designed for investors, financial analysts, automated trading system developers, and finance enthusiasts who need daily, comprehensive, data-driven insights into specific stock symbols. It is ideal for anyone who wants to automate the complex process of combining technical indicators, news sentiment, professional analyst ratings, and social media buzz into a single, actionable recommendation. The system acts as a 24/7 automated "analyst" for portfolio monitoring.

**What this Workflow Does**
This workflow runs a daily, multi-faceted analysis of a target stock. It gathers all relevant data (price history, news, ratings, social posts) and processes it through specialized Code nodes to calculate technical indicators (SMA, RSI), generate a price prediction (linear regression), and perform sentiment analysis on news and social media. Finally, it uses a weighted model to synthesize all of this into a single Buy/Sell/Hold recommendation and delivers a detailed report via Telegram.

**Key Features**
- **Daily Scheduling**: Automatically triggers the analysis every day at a specified time (e.g., 9:00 AM).
- **Multi-Factor Analysis**: Combines four weighted domains (Technical, News Sentiment, Analyst Ratings, and Social Sentiment), plus a price prediction, for a holistic view.
- **Technical Indicator Calculation**: Calculates SMA (20, 50, 200) and RSI (14-day), and identifies support/resistance levels.
- **Price Prediction**: Uses simple linear regression to forecast the 7-day price trend and generate an initial recommendation.
- **Sentiment Analysis**: Custom Code nodes perform keyword-based sentiment analysis on news articles and social media posts.
- **Composite Recommendation**: A weighted model combines all analysis scores (35% Technical, 25% News, 25% Analyst, 15% Social) into a final recommendation, confidence score, and summary.
- **Automated Alerting**: Delivers a fully formatted, easily readable Markdown report via Telegram.
**Requirements**
- **API Configuration node**: A preliminary node (implied by the expression references) containing:
  - The target stockSymbols (e.g., TSLA, AAPL).
  - The telegramChatId for receiving the report.
  - API keys for the data sources (e.g., a financial data API for prices/news/ratings and a social media API).
- **Telegram credentials**: For the Telegram node that sends the final message.
- **Financial data source nodes**: Preceding nodes (not fully visible in this template) must fetch:
  - Historical price data (required for SMA/RSI/regression).
  - Recent news headlines and summaries.
  - Recent analyst ratings.
  - Social media data (e.g., from Twitter/StockTwits).
- **n8n instance**: Self-hosted or cloud-based n8n installation.

**How to Use: Step-by-Step Setup**
1. **Configure scheduling**: Open the "Daily Stock Check" node and set the interval rule to the hour you want the report to run (e.g., 9:00 AM).
2. **Configure the stock symbol and Telegram**: In the (implied) "API Configuration" node, set the stockSymbols you wish to track and the target telegramChatId where the report will be delivered. Ensure your Telegram credentials are set up in n8n.
3. **Verify the data-fetching nodes**: Make sure the nodes feeding "Analyze Stock Trends," "Analyze News Sentiment," "Process Analyst Ratings," and "Analyze Social Sentiment" are configured to fetch the required historical price, news, ratings, and social data.
4. **Adjust analysis weights (advanced)**: To change the importance of different factors, edit the WEIGHTS object inside the "Generate Comprehensive Recommendation" Code node. Default weights: Technical (0.35), News (0.25), Analyst (0.25), Social (0.15).
5. **Test the workflow**: Execute it manually to confirm that all Code nodes process the incoming data correctly and that "Send Telegram Alert" delivers the final, formatted message.

**Workflow Components**
The workflow is structured into three main phases: data processing, recommendation synthesis, and reporting.

**1. Data Processing and Indicator Calculation**

| Node Name | Type | Key Functionality |
| :--- | :--- | :--- |
| Daily Stock Check | Schedule Trigger | Initiates the entire workflow daily at the set time. |
| Analyze Stock Trends | Code | Calculates technical indicators: SMA (20, 50, 200), RSI (14-day), volume trend, and support/resistance levels. |
| Predict Future Trends | Code | Performs simple linear regression on historical prices to determine the slope and predict the price 7 days ahead. |
| Analyze News Sentiment | Code | Performs keyword-based sentiment analysis on news headlines and summaries to categorize overall sentiment (positive/negative/neutral) and assign a score. |
| Process Analyst Ratings | Code | Aggregates analyst recommendations (Buy/Hold/Sell) to calculate a consensus rating and average price target. |
| Analyze Social Sentiment | Code | Performs keyword-based sentiment analysis on social media data to determine community mood and trending hashtags. |

**2. Recommendation Synthesis**

| Node Name | Type | Description |
| :--- | :--- | :--- |
| Combine All Analysis | Merge | Consolidates the outputs from the four analysis branches (Technical, News, Analyst, Social) into a single data item. |
| Generate Comprehensive Recommendation | Code | The core logic. Calculates a weighted composite score (from -100 to 100) from all four inputs, generating the final STRONG BUY/BUY/HOLD/SELL/STRONG SELL recommendation and a numerical confidence score. |

**3. Reporting and Alerting**

| Node Name | Type | Description |
| :--- | :--- | :--- |
| Format Telegram Message | Set | Constructs the final detailed report message using Markdown formatting, pulling data from all preceding analysis nodes into a clear, structured report. |
| Send Telegram Alert | Telegram | Sends the fully formatted analysis report to the pre-configured Telegram chat ID. |

**🙋 For Help & Community**
- 👾 Discord: n8n channel
- 🌐 Website: devcodejourney.com
- 🔗 LinkedIn: Connect with Shakil
- 📱 WhatsApp Channel: Join Now
- 💬 Direct Chat: Message Now
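The weighted synthesis in "Generate Comprehensive Recommendation" can be sketched in plain JavaScript, similar to what an n8n Code node would run. The weights below match the documented defaults; the label thresholds are assumptions, since the template does not state its exact cutoffs:

```javascript
// Documented default weights (Technical 0.35, News 0.25, Analyst 0.25, Social 0.15).
const WEIGHTS = { technical: 0.35, news: 0.25, analyst: 0.25, social: 0.15 };

function recommend(scores) {
  // Each input score is expected on a -100..100 scale, as described above.
  const composite =
    scores.technical * WEIGHTS.technical +
    scores.news * WEIGHTS.news +
    scores.analyst * WEIGHTS.analyst +
    scores.social * WEIGHTS.social;
  // Cutoffs below are illustrative assumptions, not the template's values.
  let label;
  if (composite >= 60) label = 'STRONG BUY';
  else if (composite >= 20) label = 'BUY';
  else if (composite > -20) label = 'HOLD';
  else if (composite > -60) label = 'SELL';
  else label = 'STRONG SELL';
  return { composite: Math.round(composite), label };
}
```

Editing the WEIGHTS object (step 4 of the setup) shifts how much each domain moves the composite score.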
by usamaahmed
🚀 HR Resume Screening Workflow — Smart Hiring on Autopilot 🤖

**🎯 Overview:**
This workflow builds an AI-powered resume screening system inside n8n. It begins with Gmail and Form triggers that capture incoming resumes, then uploads each file to Google Drive for storage. The resume is downloaded and converted into plain text, where two branches run in parallel: one extracts structured contact details, and the other uses an AI agent to summarize education, job history, and skills while assigning a suitability score. A cleanup step normalizes the data before merging both outputs, and the final candidate record is saved into Google Sheets and Airtable, giving recruiters a centralized dashboard to identify top talent quickly and consistently.

**🔑 Prerequisites:**
To run this workflow successfully, you'll need:
- **Gmail OAuth** → to read incoming resumes.
- **Google Drive OAuth** → to upload and download resume files.
- **Google Sheets OAuth** → to save structured candidate records.
- **Airtable Personal Access Token** → for dashboards and record-keeping.
- **OpenAI / OpenRouter API Key** → to run the AI summarizer and evaluator.

**⚙️ Setup Instructions:**
1. **Import the Workflow**: Clone or import the workflow into your n8n instance.
2. **Add Credentials**: Go to n8n → Credentials and connect Gmail, Google Drive, Google Sheets, Airtable, and OpenRouter/OpenAI.
3. **Configure Key Nodes**:
   - Gmail Trigger → Update filters.q with the job title you are hiring for (e.g., "Senior Software Engineer").
   - Google Drive Upload → Set the folderId where resumes will be stored.
   - Google Sheets Node → Link to your HR spreadsheet (e.g., "Candidates 2025").
   - Airtable Node → Select the correct base and table schema for candidate records.
4. **Test the Workflow**: Send a test resume (via email or form) and check Google Sheets and Airtable for structured candidate data.
5. **Go Live**: Enable the workflow. It will now run continuously and process new resumes as they arrive.
**📊 End-to-End Workflow Walkthrough:**

**🟢 Section 1 – Entry & Intake**
Nodes:
- 📧 Gmail Trigger → Polls the inbox every minute, captures job application emails, and downloads resume attachments (CV0, CV1, …).
- 🌐 Form Trigger → Alternate entry for resumes submitted via a careers page or job portal.

✅ Quick Understanding: Think of this section as the front desk of recruitment - resumes arrive by email or online form, and the system immediately grabs them for processing.

**📂 Section 2 – File Management**
Nodes:
- ☁️ Upload File (Google Drive) → Saves the incoming resume into a structured Google Drive folder, naming it after the applicant.
- ⬇️ Download File (Google Drive) → Retrieves the stored resume file for further processing.
- 🔎 Extract from File → Converts the resume (PDF/DOC) into plain text so the AI and extractors can work with it.

✅ Quick Understanding: This is your digital filing room. Every resume is safely stored, then converted into a readable format for the hiring system.

**🤖 Section 3 – AI Processing (Parallel Analysis)**
Nodes:
- 🧾 Information Extractor → Pulls structured contact information (candidate name, email, and phone number) using regex validation and schema rules.
- 🤖 AI Agent (LangChain + OpenRouter) → Reads the full CV and outputs:
  - 🎓 Educational qualifications
  - 💼 Job history
  - 🛠 Skills set
  - 📊 Candidate evaluation score (1–10)
  - 📝 Justification for the score

✅ Quick Understanding: Imagine two assistants working in parallel: one quickly extracts basic contact info, while the other deeply reviews the CV and gives an evaluation.

**🛠️ Section 4 – Data Cleanup & Merging**
Nodes:
- ✏️ Edit Fields → Standardizes the AI Agent's output into a consistent field (output).
- 🛠 Code (JS Parsing & Cleanup) → Converts the AI's free-text summary into clean JSON fields (education, jobHistory, skills, score, justification).
- 🔗 Merge → Combines the structured contact info with the AI's evaluation into a single candidate record.

✅ Quick Understanding: This is the data cleaning and reporting team, making sure all details are neat, structured, and merged into one complete candidate profile.

**📊 Section 5 – Persistence & Dashboards**
Nodes:
- 📑 Google Sheets (Append Row) → Saves candidate details into a Google Sheet for quick team access.
- 🗄 Airtable (Create Record) → Stores the same structured data in Airtable, enabling dashboards, analytics, and ATS-like tracking.

✅ Quick Understanding: Think of this as your HR dashboard and database. Every candidate record is logged in both Google Sheets and Airtable, ready for filtering, reporting, or further action.

**📊 Workflow Overview Table:**

| Section | Key Roles / Nodes | Model / Service | Purpose | Benefit |
| --- | --- | --- | --- | --- |
| 📥 Entry & Intake | Gmail Trigger, Form Trigger | Gmail API / Webhook | Capture resumes from email or forms | Resumes collected instantly from multiple sources |
| 📂 File Management | Google Drive Upload, Google Drive Download, Extract from File | Google Drive + n8n Extract | Store resumes & convert to plain text | Centralized storage + text extraction for processing |
| 🤖 AI Processing | Information Extractor, AI Agent (LangChain + OpenRouter) | Regex + OpenRouter AI (gpt-oss-20b, free) | Extract contact info + AI CV analysis | Candidate details + score + justification generated automatically |
| 🛠 Data Cleanup & Merge | Edit Fields, Code (JS Parsing & Cleanup), Merge | n8n native + regex parsing | Standardize and merge outputs | Clean, structured JSON record with all candidate info |
| 📊 Persistence Layer | Google Sheets Append Row, Airtable Create Record | Google Sheets + Airtable APIs | Store structured candidate data | HR dashboards & ATS-ready records for easy review and analytics |
| 🔄 Execution Flow | All connected | Gmail + Drive + Sheets + Airtable + AI | End-to-end automation | Automated resume → structured record → recruiter dashboards |

**📂 Workflow Output Overview:**
Each candidate's data is standardized into the following fields: Candidate Name, Candidate Email, Contact Number, Educational Qualifications, Job History, Skills Set, AI Score (1–10), Justification.

**📊 Benefits of This Workflow at a Glance:**
- ⏱️ **Lightning-Fast Screening** → Processes hundreds of resumes in minutes instead of hours.
- 🤖 **AI-Powered Evaluation** → Automatically summarizes candidate education, work history, and skills, and gives a suitability score (1–10) with justification.
- 📂 **Centralized Storage** → Every resume is securely saved in Google Drive for easy access and record-keeping.
- 📊 **Data-Ready Outputs** → Structured candidate profiles go straight into Google Sheets and Airtable, ready for dashboards and analytics.
- ✅ **Consistency & Fairness** → Standardized AI scoring ensures every candidate is evaluated on the same criteria, reducing human bias.
- 🛠️ **Flexible Intake** → Works with both Gmail (email applications) and Form submissions (job portals or career pages).
- 🚀 **Recruiter Productivity Boost** → Frees HR teams from manual extraction and data entry so they can focus on interviewing and hiring the best talent.

**👉 Practical HR Use Case:**
"Screen resumes for a Senior Software Engineer role and shortlist top candidates."
1. Gmail Trigger → Captures incoming job applications with CVs attached.
2. Google Drive → Stores resumes for record-keeping.
3. Extract from File → Converts CVs into plain text.
4. Information Extractor → Pulls candidate name, email, and phone number.
5. AI Agent → Summarizes education, job history, and skills, and assigns a suitability score (1–10).
6. Code & Merge → Cleans and combines outputs into a structured candidate profile.
7. Google Sheets → Logs candidate data for quick HR review.
8. Airtable → Builds dashboards to filter and identify top-scoring candidates.

✅ Result: HR instantly sees structured candidate records, filters by score, and focuses interviews on the best talent.
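The "Code (JS Parsing & Cleanup)" node described in Section 4 can be sketched as follows. This assumes the AI Agent emits labelled lines such as `Score: 8`; the template's actual output format may differ, so treat this as an illustration of the technique rather than the exact node:

```javascript
// Hypothetical parser for the AI Agent's free-text summary.
// Assumes one "Label: value" pair per line in the AI output.
function parseEvaluation(text) {
  const grab = (label) => {
    // Capture everything after "Label:" up to the end of that line.
    const m = text.match(new RegExp(label + ':\\s*(.+)', 'i'));
    return m ? m[1].trim() : '';
  };
  return {
    education: grab('Education'),
    jobHistory: grab('Job History'),
    skills: grab('Skills'),
    score: parseInt(grab('Score'), 10) || 0,
    justification: grab('Justification'),
  };
}
```

The Merge node then combines this object with the Information Extractor's contact fields into one candidate record.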
by Cheng Siong Chin
**How It Works**
This workflow automates property registration verification, fraud detection, and blockchain-based compliance tracking. It systematically assesses fraud risk, validates transactions, ensures data immutability through cryptographic hashing, and records property records on the blockchain. The workflow ingests property registration data, applies GPT-4-driven fraud analysis with risk scoring, and verifies transaction legitimacy against regulatory and contractual criteria. It then generates cryptographic hashes for property and lease records, validates compliance requirements using AI-based analysis, queries the blockchain for verification, logs transactions on-chain, stores audit records in structured sheets, and securely archives all supporting documentation. Designed for real estate firms, legal practices, and property management companies, it enables transparent verification, fraud mitigation, and tamper-resistant compliance tracking across the property lifecycle.

**Setup Steps**
1. Configure the property data source and set up OpenAI GPT-4 for fraud detection and compliance analysis.
2. Connect blockchain network credentials and configure the hash generation parameters.
3. Set up Google Sheets for audit logging and configure the blockchain verification queries.
4. Define fraud risk thresholds, compliance criteria, and transaction validation rules.

**Prerequisites**
- Property registration data source
- OpenAI API key
- Blockchain network access

**Use Cases**
- Real estate firms automating fraud checks on property transactions

**Customization**
Adjust the fraud detection criteria and risk thresholds, or modify the blockchain network selection.

**Benefits**
Eliminates manual fraud detection; prevents title fraud and forgery.
by Lucas Walter
**Overview & Setup**
This n8n template demonstrates how to automatically generate authentic user-generated content (UGC) style marketing videos for eCommerce products using AI. Simply upload a product image, and the workflow creates multiple realistic influencer-style video ads complete with scripts, personas, and video generation.

**Use cases**
- Generate multiple UGC video variations for A/B testing
- Create authentic-looking product demonstrations for social media
- Produce influencer-style content without hiring creators
- Quickly test different marketing angles for new products
- Scale video content creation for eCommerce catalogs

**Good to know**
- Sora 2 video generation takes approximately 2-3 minutes per 12-second video.
- Each video generation costs approximately $0.50-1.00 USD (check OpenAI pricing for current rates).
- The workflow generates multiple video variations from a single product image.
- Videos are automatically uploaded to Google Drive upon completion.
- Generated videos are in 720x1280 (9:16) format, optimized for social media.

**How it works**
1. **Product Analysis**: OpenAI's vision API analyzes the uploaded product image to understand its features, benefits, and target audience.
2. **Persona Creation**: The system generates a detailed profile of the ideal influencer/creator who would authentically promote this product.
3. **Script Generation**: Gemini 2.5 Pro creates multiple authentic UGC video scripts (12 seconds each) with frame-by-frame breakdowns, natural dialogue, and camera movements.
4. **Frame Generation**: For each script, Gemini generates a custom first frame that adapts the product image to match the UGC aesthetic and aspect ratio.
5. **Video Production**: The Sora 2 API generates the actual video using the script and the custom first frame as reference.
6. **Status Monitoring**: The workflow polls the video generation status every 15 seconds until completion.
7. **Upload & Storage**: Completed videos are automatically uploaded to Google Drive with organized naming.

**How to use**
1. Click the form trigger URL to access the submission form.
2. Upload your product image (works best with clean product shots on white/neutral backgrounds).
3. Enter the product name.
4. Submit the form and wait for the workflow to complete.
5. Find your generated UGC videos in the specified Google Drive folder. Each run produces multiple video variations you can test.

**Requirements**
- **OpenAI API** account with Sora 2 access for video generation and GPT-4 Vision
- **Google Gemini API** account for script generation and image adaptation
- **Google Drive** account for video storage
- Sufficient API credits for video generation (budget accordingly)

**Customizing this workflow**
- Adjust the video duration in the generate_video node (currently set to 12 seconds).
- Modify the persona prompt in the analyze_product node to target different audience demographics.
- Change the script style in the set_build_video_prompts node for different UGC aesthetics (excited discovery, casual recommendation, etc.).
- Update the Google Drive folder in the upload_video node to organize videos by campaign.
- Add additional processing nodes for video editing, subtitle generation, or thumbnail creation.
- Modify the aspect ratio in the resize_image node for different platforms (1:1 for Instagram feed, 16:9 for YouTube, etc.).
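The status-monitoring step (polling every 15 seconds) can be sketched as a simple loop. In the template this is built from Wait and If nodes rather than one function; `checkStatus` and its response shape below are placeholders, not the real Sora 2 API:

```javascript
// Conceptual polling loop: check status, wait 15 s, repeat until done or timed out.
// `checkStatus` stands in for whatever HTTP call fetches the generation job status.
async function pollUntilDone(checkStatus, intervalMs = 15000, maxAttempts = 40) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status, videoUrl } = await checkStatus();
    if (status === 'completed') return videoUrl; // done: hand back the result URL
    if (status === 'failed') throw new Error('Video generation failed');
    await new Promise((resolve) => setTimeout(resolve, intervalMs)); // wait before retrying
  }
  throw new Error('Timed out waiting for video generation');
}
```

Capping the attempts keeps a stuck job from polling forever; 40 attempts at 15 s covers ten minutes, well beyond the typical 2-3 minute generation time.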
by Lakindu Siriwardana
🔧 Automated Video Generator (n8n Workflow)

**🚀 Features**
- End-to-end video creation from a user idea or transcript
- AI-powered scriptwriting using LLMs (e.g., DeepSeek via OpenRouter)
- Voiceover generation with customizable TTS voices
- Image scene generation using generative models such as together.ai
- Clip creation and concatenation into a full video
- Dynamic caption generation with styling options
- Google Drive and Sheets integration for asset storage and progress tracking

**⚙️ How It Works**
1. **User submits a form** with: the main topic or transcript, desired duration, TTS voice, visual style (e.g., Pixar, Lego, Cyberpunk), and image generation provider.
2. **AI generates a script**: a catchy title, description, hook, full script, and CTA, using a language model.
3. **Text-to-speech (TTS)**: The script is turned into audio using the selected voice, with timestamped captions generated.
4. **Scene segmentation**: The script is split into 5-6 second segments for visual storyboarding.
5. **Image prompt creation**: Each scene is converted into an image prompt in the selected style (e.g., "anime close-up of a racing car").
6. **Image generation**: Prompts are sent to together.ai or fal.ai to generate the scenes.
7. **Clip creation**: Each image is turned into a short video clip (Ken Burns-style zoom) based on the script timing.
8. **Video assembly**: All clips are concatenated into a single video, and captions are overlaid using the earlier timestamps.
9. **Final output**: The video is uploaded to Google Drive and Telegram, and the links are saved in Google Sheets.

**🛠 Initial Setup**

**🗣️ 1. Set Up the TTS Voice (Text-to-Speech)**
Run your TTS server locally using Docker.

**🧰 2. Set Up the NCA Toolkit**
The nca-toolkit is a custom video/image processing backend used via HTTP APIs:
- http://host.docker.internal:9090/v1/image/transform/video
- http://host.docker.internal:9090/v1/video/concatenate
- http://host.docker.internal:9090/v1/ffmpeg/compose

Steps:
1. Clone or build the nca-toolkit container (if it's a private tool), ensuring it exposes port 9090. It should support endpoints for:
   - Image to video (zoom effect)
   - Video concatenation
   - Audio + video merging
   - Caption overlay via FFmpeg
2. Run it locally with Docker: `docker run -d -p 9090:80 your-nca-toolkit-image`

**🧠 3. Set Up together.ai (Image Generation)**
(Optional: you can use the ChatGPT API instead.) This handles image generation using models like FLUX.1-schnell.

Steps:
1. Create an account at https://www.together.ai
2. Generate your API key
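A workflow node calling the concatenation endpoint listed above might build its request like this. The payload shape is an assumption inferred from the endpoint name; check the toolkit's own documentation for the actual request schema:

```javascript
// Hypothetical request builder for the nca-toolkit concatenation endpoint.
// The `video_urls` payload field is an assumption, not a confirmed schema.
function buildConcatenateRequest(clipUrls, base = 'http://host.docker.internal:9090') {
  return {
    url: base + '/v1/video/concatenate',
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ video_urls: clipUrls.map((u) => ({ video_url: u })) }),
  };
}
```

In n8n, the same request would normally be configured in an HTTP Request node rather than in code; the sketch just makes the shape explicit.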
by Abdulrahman Alhalabi
NGO TPM Request Management System

**Benefits**

*For Beneficiaries:*
- **24/7 Accessibility** - Submit requests anytime via the familiar Telegram interface
- **Language Flexibility** - Communicate in Arabic through text or voice messages
- **Instant Acknowledgment** - Receive immediate confirmation that requests are logged
- **No Technical Barriers** - Works on basic smartphones without special apps

*For TPM Teams:*
- **Centralized Tracking** - All requests automatically logged with timestamps and user details
- **Smart Prioritization** - AI categorizes issues by urgency and type for efficient response
- **Action Guidance** - Specific recommended actions generated for each request type
- **Performance Analytics** - Track response patterns and common issues over time

*For NGO Operations:*
- **Cost Reduction** - Automated intake reduces manual processing overhead
- **Data Quality** - Standardized categorization ensures consistent reporting
- **Audit Trail** - Complete record of all beneficiary interactions for compliance
- **Scalability** - Handle high volumes without proportional staff increases

**How it Works**
1. **Multi-Input Reception** - Accepts both text messages and voice recordings via Telegram
2. **Voice Transcription** - Uses OpenAI Whisper to convert Arabic voice messages to text
3. **AI Categorization** - GPT-4 analyzes requests and categorizes issues (aid distribution, logistics, etc.)
4. **Action Planning** - AI generates specific recommended actions for the TPM team in Arabic
5. **Data Logging** - Records all requests, categories, and actions in Google Sheets with user details
6. **Confirmation Feedback** - Sends an acknowledgment message back to users via Telegram

**Set up Steps** (setup time: ~20 minutes)
1. **Create a Telegram Bot** - Get a bot token from @BotFather and configure the webhook
2. **Configure APIs** - Set up OpenAI (transcription + chat) and Google Sheets credentials
3. **Customize AI Prompts** - Adjust the system messages for your NGO's specific operations
4. **Set Up the Spreadsheet** - Link Google Sheets for request tracking and reporting
5. **Test the Workflow** - Verify both the text and voice message processing paths

Detailed Arabic language configuration and TPM-specific categorization examples are included as sticky notes within the workflow.

**What You'll Need:**
- Telegram Bot Token (free from @BotFather)
- OpenAI API key (Whisper + GPT-4)
- Google Sheets API credentials
- Google Spreadsheet for logging requests
- Sample Arabic text/voice messages for testing

**Key Features:**
- Dual input support (text + voice messages)
- Arabic language processing and responses
- Structured data extraction (category + recommended action)
- Complete audit trail with user information
- Real-time confirmation messaging
- TPM team-specific workflow optimization
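Before logging to Google Sheets, it helps to validate the AI's categorization against a fixed taxonomy so the spreadsheet stays consistent. A minimal sketch, assuming an example category list (adapt it to your NGO's actual TPM taxonomy):

```javascript
// Illustrative validation of the AI's structured output before the Sheets row is written.
// The category list is an example, not the template's canonical taxonomy.
const CATEGORIES = ['aid_distribution', 'logistics', 'protection', 'feedback', 'other'];

function normalizeRequest(aiOutput) {
  // Fall back to 'other' if the model invents a category outside the taxonomy.
  const category = CATEGORIES.includes(aiOutput.category) ? aiOutput.category : 'other';
  return {
    category,
    recommendedAction: (aiOutput.recommendedAction || '').trim(),
    receivedAt: new Date().toISOString(),
  };
}
```

The fallback keeps "Data Quality" intact: every logged row carries a category your reports already know how to group.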
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks regional sentiment across social media and news outlets, giving you a real-time pulse on how people in a specific area feel about your brand or topic.

**Overview**
The automation queries Twitter, Reddit, and major news APIs filtered by geolocation. Bright Data handles location-specific scraping where APIs are limited. OpenAI performs sentiment and keyword extraction, aggregating scores into a daily report stored in Google Sheets and visualized in Data Studio.

**Tools Used**
- **n8n** – Coordinates all steps
- **Bright Data** – Collects geo-targeted data beyond API limits
- **OpenAI** – Runs sentiment analysis and topic modeling
- **Google Sheets** – Houses cleaned sentiment metrics
- **Data Studio / Looker** – Optional dashboard for visualization

**How to Install**
1. **Import the workflow** into n8n with the provided .json file.
2. **Configure Bright Data** credentials.
3. **Set up your OpenAI** API key.
4. **Connect Google Sheets** and create a destination spreadsheet.
5. **Customize regions and keywords** in the Start node.

**Use Cases**
- **Brand Monitoring**: Measure public opinion in target markets.
- **Political Campaigns**: Gauge voter sentiment by district.
- **Market Entry**: Understand regional attitudes before launching.
- **Crisis Management**: Detect negative spikes early.

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #sentimentanalysis #geolocation #brightdata #openai #sociallistening #n8nworkflow #nocode #brandmonitoring
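The daily aggregation step described in the overview (per-item sentiment scores rolled up into one regional number) could look like this. The item shape and the -1..1 score scale are assumptions about the AI step's output:

```javascript
// Sketch of averaging per-post sentiment scores (-1..1) into a daily regional score.
// The item shape ({ score }) is an assumption, not the template's exact schema.
function aggregateDaily(items) {
  if (items.length === 0) return { avgScore: 0, count: 0, label: 'neutral' };
  const sum = items.reduce((acc, it) => acc + it.score, 0);
  const avg = sum / items.length;
  // Illustrative cutoffs for the report label.
  const label = avg > 0.2 ? 'positive' : avg < -0.2 ? 'negative' : 'neutral';
  return { avgScore: Number(avg.toFixed(2)), count: items.length, label };
}
```

One such aggregate per region per day is what lands in the Google Sheets row that Data Studio visualizes.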
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically scrapes local business directories (Yelp, Yellow Pages, Google Maps, etc.) to build a structured database of prospects. Stop copying listings by hand and get fresh leads delivered straight to Google Sheets.

**Overview**
Using Bright Data, the automation fetches business names, contact details, ratings, and categories for a given city or ZIP code. OpenAI cleans and normalizes the data, while duplicate detection ensures each business appears only once. The result is emailed as a CSV and stored in Sheets for easy filtering.

**Tools Used**
- **n8n** – Workflow orchestration
- **Bright Data** – Handles large-scale directory scraping
- **OpenAI** – Performs entity cleanup and deduplication
- **Google Sheets** – Houses the resulting lead list
- **Gmail** – Sends the CSV file to your inbox

**How to Install**
1. **Import the workflow**: Load the .json file into n8n.
2. **Configure Bright Data**: Add your credentials.
3. **Set up OpenAI**: Enter your API key.
4. **Connect Google Sheets & Gmail**: Authorize both integrations.
5. **Customize locations & categories**: Adjust the parameters in the Start node.

**Use Cases**
- **Local Lead Generation**: Build outreach lists for agencies or SaaS.
- **Market Research**: Analyze the density of businesses in a region.
- **Franchise Expansion**: Identify potential partners within a territory.
- **Startup Sales**: Discover SMBs that match your ICP.

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #webscraping #localbusiness #brightdata #leadgeneration #n8nworkflow #nocode #businessdirectories #openai
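The duplicate-detection step mentioned in the overview typically keys each listing on a normalized name plus phone number, so minor formatting differences between directories do not create duplicate rows. A minimal sketch with illustrative field names:

```javascript
// Sketch of deduplicating scraped listings by a normalized name + phone key.
// Field names (name, phone) are illustrative; adapt to your scraper's output.
function dedupeListings(listings) {
  const seen = new Set();
  return listings.filter((biz) => {
    const key =
      (biz.name || '').toLowerCase().replace(/[^a-z0-9]/g, '') + // drop case/punctuation
      '|' +
      (biz.phone || '').replace(/\D/g, ''); // keep digits only
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```

This way "Acme Co." from Yelp and "ACME CO" from Yellow Pages collapse into one lead before the CSV is built.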
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically discovers and collects information about events and attendee data from event platforms. It saves you time by eliminating the need to manually browse event listings, and it provides a centralized database of event information including categories, venues, and attendee feedback.

**Overview**
This workflow automatically scrapes event data from 10times.com and other event platforms to extract categories, featured events, attendee feedback, and venue information. It uses Bright Data to access event websites without being blocked, and AI to intelligently parse event data into a structured format for storage in Google Sheets.

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping event websites without being blocked
- **OpenAI**: AI agent for intelligent event data extraction and parsing
- **Google Sheets**: For storing and organizing event information

**How to Install**
1. **Import the workflow**: Download the .json file and import it into your n8n instance.
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node.
3. **Set up OpenAI**: Configure your OpenAI API credentials.
4. **Configure Google Sheets**: Connect your Google Sheets account and specify the target spreadsheet.
5. **Customize**: Adjust the event platform URLs and the event criteria you want to monitor.

**Use Cases**
- **Event Planners**: Monitor competing events and industry trends
- **Marketing Teams**: Identify events for sponsorship and networking opportunities
- **Business Development**: Find relevant events for lead generation and partnerships
- **Market Research**: Track event attendance patterns and industry insights

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #events #eventdiscovery #brightdata #webscraping #eventplanning #eventscraping #attendeedata #eventmarketing #n8nworkflow #workflow #nocode #eventautomation #eventmonitoring #eventresearch #10times #eventintelligence #venuedata #eventfeedback #eventtracking #eventcalendar #eventanalytics #businessevents #eventorganizer #eventtech #eventindustry #eventcollection #networkingevents #conferencedata
by Yaron Been
This workflow automatically analyzes website conversion funnels to identify optimization opportunities and track user journey performance. It saves you time by eliminating the need to manually analyze funnel metrics and provides detailed insights into conversion bottlenecks and areas for improvement.

## Overview

This workflow automatically scrapes website pages to analyze funnel elements, including CTAs, tracking scripts, page structure, and conversion paths. It uses Bright Data to access websites without restrictions and AI to intelligently extract funnel data, identify conversion elements, and provide optimization recommendations.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping website pages without being blocked
- **OpenAI**: AI agent for intelligent funnel analysis and optimization insights
- **Google Sheets**: For storing funnel analysis data and recommendations

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI**: Configure your OpenAI API credentials
4. **Configure Google Sheets**: Connect your Google Sheets account and set up your funnel analysis spreadsheet
5. **Customize**: Define target website URLs and funnel analysis parameters

## Use Cases

- **Conversion Optimization**: Identify and fix conversion funnel bottlenecks
- **UX Analysis**: Analyze user experience and journey optimization opportunities
- **Competitor Research**: Study competitor funnel strategies and implementations
- **A/B Testing**: Monitor funnel performance changes over time

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #funnelanalysis #conversionoptimization #brightdata #webscraping #uxanalysis #n8nworkflow #workflow #nocode #websiteanalysis #funneloptimization #conversiontracking #userjourney #websiteoptimization #cro #digitalmarketing #funnelanalyzer #websiteperformance #conversionanalytics #uxresearch #websitemetrics #funnelmonitoring #performanceanalysis #websiteinsights #conversionfunnel #userexperience #websiteaudit #funneltracking #optimizationanalysis
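One funnel signal the analysis step can compute from scraped HTML is a simple count of call-to-action elements per page. The sketch below is a deliberate simplification (regex matching instead of real HTML parsing, and an assumed keyword list); in the template, the AI agent does the actual extraction.

```javascript
// Count <a> and <button> elements whose text looks like a call to action.
// The keyword list is an illustrative assumption, not the template's logic.
function countCtas(html) {
  const ctaWords = /(sign up|get started|buy now|subscribe|start free trial)/i;
  // Match paired tags; \1 back-references the opening tag name.
  const elements = html.match(/<(a|button)\b[^>]*>[\s\S]*?<\/\1>/gi) || [];
  return elements.filter((el) => ctaWords.test(el)).length;
}

const page = `
  <a href="/pricing">Pricing</a>
  <a href="/signup">Sign up</a>
  <button>Start free trial</button>
`;
console.log(countCtas(page)); // 2
```

A per-page CTA count like this, written to one Sheets column, makes it easy to spot funnel pages with no clear next action.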
by Yaron Been
## Description

This workflow automatically collects and organizes research papers from academic databases and journals into Google Sheets. It helps researchers and students save time by eliminating manual searches across multiple academic sources and centralizing research materials.

## Overview

This workflow automatically scrapes research papers from academic databases and journals, then organizes them in Google Sheets. It uses Bright Data to access academic sources and extracts key information such as titles, authors, abstracts, and citations.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow.
- **Bright Data**: For scraping academic websites and research databases without getting blocked.
- **Google Sheets**: For organizing and storing research paper data.

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance.
2. **Configure Bright Data**: Add your Bright Data credentials to the Bright Data node.
3. **Connect Google Sheets**: Authenticate your Google account.
4. **Customize**: Specify research topics, journals, or authors to track.

## Use Cases

- **Academic Researchers**: Stay updated on new papers in your field.
- **Students**: Collect research for literature reviews and dissertations.
- **Research Teams**: Collaborate on literature databases.

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #research #academicpapers #brightdata #googlesheets #researchpapers #academicresearch #literaturesearch #scholarlyarticles #n8nworkflow #workflow #nocode #researchautomation #academicscraping #researchtools #papertracking #academicjournals #researchdatabase #literaturereview #academicwriting #datascraping #researchorganization #scholarlyresearch #citationmanagement #academicproductivity
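Because the same paper often appears in several academic sources, it helps to normalize each record and derive a stable key before writing to Sheets, so an append-or-update step can match existing rows instead of duplicating them. The record shape below is a hypothetical sketch, not the template's actual schema.

```javascript
// Build a flat, deduplicatable row from one scraped paper record.
// Field names (doi, title, authors, year) are illustrative assumptions.
function paperRow(paper) {
  const authors = (paper.authors || []).join(', ');
  // Prefer the DOI as a stable key; fall back to a whitespace-normalized title.
  const key = (paper.doi || paper.title || '')
    .toLowerCase()
    .replace(/\s+/g, ' ')
    .trim();
  return { key, title: paper.title, authors, year: paper.year || '' };
}

const row = paperRow({
  title: 'Attention Is All You Need',
  authors: ['Vaswani', 'Shazeer'],
  year: 2017,
});
console.log(row.authors); // "Vaswani, Shazeer"
```

Storing `key` in its own column lets the Google Sheets node's matching-column option treat re-scraped papers as updates rather than new rows.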