by Ranjan Dailata
Who this is for

The LinkedIn Profile Extract and JSON Resume Builder is a powerful workflow that scrapes professional profile data from LinkedIn using Bright Data's infrastructure, then transforms that data into a clean, structured JSON resume using Google Gemini. The workflow is ideal for automating resume parsing, candidate profiling, or integration into recruiting platforms.

This workflow is tailored for:

- HR professionals & recruiters automating resume screening
- Talent acquisition platforms enriching candidate profiles
- Developers & AI builders creating resume-parsing AI pipelines
- Data scientists working on labor market analytics
- Growth hackers profiling prospects via public data

What problem is this workflow solving?

Parsing resumes or LinkedIn profiles into machine-readable formats is often a manual, error-prone process. Most scraping tools either fail due to anti-bot protections or return unstructured HTML that's hard to work with. This workflow solves that by:

- Using Bright Data's Web Unlocker for reliable, CAPTCHA-free LinkedIn scraping
- Extracting clean text and structured profile data via the Google Gemini LLM
- Automatically generating a standards-compliant JSON Resume and skills list
- Sending the resume to webhooks or storing it for downstream usage

What this workflow does

- Accepts a LinkedIn profile URL and required metadata (Bright Data zone, webhook)
- Scrapes the LinkedIn profile using Bright Data Web Unlocker
- Extracts clean content and skills using the Google Gemini LLM
- Builds a JSON-formatted resume following the JSON Resume schema
- Sends the JSON resume via webhook notification
- Persists the output by saving the file to disk

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the LinkedIn profile URL, the Bright Data zone, and the webhook notification URL. For testing purposes, you can obtain a webhook URL using https://webhook.site/

How to customize this workflow to your needs

- Add language translation: Insert a translation LLM node to support multilingual profiles.
- Generate PDF resumes: Convert JSON to formatted PDF resumes using an HTML-to-PDF module.
- Push to ATS or CRM: Add integration nodes to pipe data into applicant tracking systems (ATS), CRMs, or databases.
- Use alternative LLMs: Swap Gemini for OpenAI or Anthropic Claude if preferred.
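For reference, the request the scraping step sends looks roughly like the sketch below (written in n8n Code-node style). The endpoint and body fields follow Bright Data's documented request API, but the zone name and profile URL are placeholders; treat this as a sketch to verify against your own account, not the template's exact node configuration.

```javascript
// Hedged sketch of a Bright Data Web Unlocker request.
// <WEB_UNLOCKER_TOKEN> is the same value used in the Header Auth credential.
const response = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer <WEB_UNLOCKER_TOKEN>',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'web_unlocker1', // placeholder: your Web Unlocker zone name
    url: 'https://www.linkedin.com/in/some-profile/', // placeholder profile URL
    format: 'raw', // return raw HTML for Gemini to parse
  }),
});
const html = await response.text(); // passed on to the Gemini extraction step
```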
by Jimleuk
This n8n template demonstrates how to build your own Qdrant MCP server to extend its functionality beyond that of the official implementation. This n8n implementation exposes other cool API features from Qdrant such as the facet search, grouped search, and recommendation APIs. With this, we can build an easily customisable and maintainable Qdrant MCP server for business intelligence.

This MCP example is based on an official MCP reference implementation, which can be found here - https://github.com/qdrant/mcp-server-qdrant

How it works

- An MCP server trigger is used and connected to 5 custom workflow tools. We're using custom workflow tools as there are quite a few nodes required for each task.
- We use a mix of n8n-supported Qdrant nodes for simple operations such as inserting documents and similarity search, and the HTTP Request node to hit the Qdrant API directly for facet search, group search, and recommendations.
- We use "Edit Fields" and "Aggregate" nodes to return suitable responses to the MCP client.

How to use

- This Qdrant MCP server allows any compatible MCP client to manage a Qdrant collection by supporting select and create operations. You will need to have a collection available before you can use this server. Use the prerequisite manual steps to get started!
- Connect your MCP client by following the n8n guidelines here - https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
- Try the following queries in your MCP client:
  - "Can you help me list the available companies in the collection?"
  - "What do customers say about product deliveries from company X?"
  - "What do customers of company X and company Y say about product ease of use?"

Requirements

- Qdrant for the vector store. This can be a cloud-hosted instance or one you self-host internally.
- An MCP client or agent, such as Claude Desktop - https://claude.ai/download

Customising this workflow

- Depending on what queries you'll receive, adjust the tool inputs to make it easier for the agent to set the right parameters.
- Not interested in reviews? The techniques shared in this template can be used for other types of collections.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
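For the operations not covered by the built-in Qdrant nodes, the HTTP Request node hits Qdrant's REST API directly. The sketch below shows roughly what a facet-search call could look like; the endpoint path follows Qdrant's facet-counts API, while the collection name ("reviews") and payload key ("metadata.company") are illustrative assumptions to adjust for your own schema.

```javascript
// Hedged sketch of a direct Qdrant facet-counts request.
const res = await fetch('http://localhost:6333/collections/reviews/facet', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'api-key': '<QDRANT_API_KEY>', // omit for an unauthenticated local instance
  },
  body: JSON.stringify({
    key: 'metadata.company', // payload field to aggregate distinct values of
    limit: 50,               // max number of distinct values returned
  }),
});
const { result } = await res.json(); // result.hits = [{ value, count }, ...]
```

A call of this shape is what backs queries like "list the available companies in the collection" above.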
by Akash Kankariya
🚀 Discover trending and viral YouTube videos easily with this powerful n8n automation! This workflow helps you perform bulk research on YouTube videos related to any search term, analyzing engagement data like views, likes, comments, and channel statistics — all in one streamlined process.

✨ Perfect for:

- Content creators wanting to find viral video ideas
- Marketers analyzing competitor content
- YouTubers optimizing their content strategy

How It Works 🎯

1️⃣ Input Your Search Term — Simply enter any keyword or topic you want to research.
2️⃣ Select Video Format — Choose between short, medium, or long videos.
3️⃣ Choose Number of Videos — Define how many videos to analyze in bulk.
4️⃣ Automatic Data Fetch — The workflow grabs video IDs, then fetches detailed video data and channel statistics from the YouTube API.
5️⃣ Performance Scoring — Videos are scored based on engagement rates with easy-to-understand labels like 🚀 HOLY HELL (viral) or 💀 Dead.
6️⃣ Export to Google Sheets — All data, including thumbnails and video URLs, is appended to your Google Sheet for comprehensive review and easy sharing.

Setup Instructions 🛠️

Google API Key
- Get your YouTube Data API key from the Google Developers Console.
- Add it securely in the n8n credentials manager (do not hardcode).

Google Sheets Setup
- Create a Google Sheet to store your results (a template link is provided).
- Share the sheet with the Google account used in n8n.
- Update the workflow with your sheet's Document ID and Sheet Name if needed.

Run the Workflow
- Trigger the form webhook via browser or POST call.
- Enter the search term, format, and number of videos.
- Let it process and check your Google Sheet for insights!

Features ✨

- Bulk fetches the latest and top-viewed YouTube videos.
- Intelligent video performance scoring with emojis for quick insights 🔥🎬.
- Organizes data into Google Sheets with thumbnail previews 🖼️.
- Easy-to-customize search parameters via an intuitive form.
- Fully automated, no manual API calls needed.

Get Started Today! 🌟

Boost your YouTube content strategy and stay ahead with this powerful viral video research automation! Try it now on your n8n instance and tap into the world of viral content like a pro 🎥💡
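To illustrate step 5️⃣, here is a minimal sketch of how an engagement-based scoring step can be written in an n8n Code node. The thresholds, field names, and formula are assumptions for demonstration; the template's actual scoring logic may use different numbers.

```javascript
// Hedged sketch of a performance-scoring Code node.
return $input.all().map(({ json: video }) => {
  const views = Number(video.viewCount) || 0;
  const likes = Number(video.likeCount) || 0;
  const comments = Number(video.commentCount) || 0;
  const subs = Math.max(Number(video.subscriberCount) || 1, 1);

  const viralRatio = views / subs; // views relative to channel size
  const engagementRate = (likes + comments) / Math.max(views, 1);

  let label = '💀 Dead'; // illustrative thresholds below
  if (viralRatio > 10 && engagementRate > 0.05) label = '🚀 HOLY HELL';
  else if (viralRatio > 3) label = '🔥 Hot';
  else if (viralRatio > 1) label = '👍 Decent';

  return { json: { ...video, viralRatio, engagementRate, label } };
});
```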
by Daniel Shashko
Note: This template is for self-hosted n8n instances only.

You can use this workflow to fully automate website content monitoring and change detection on a weekly basis—even when there’s no native node for scraping or structured comparison. It uses an AI-powered scraper and structured data extraction, and integrates Google Sheets, Drive, Docs, and email for seamless tracking and reporting.

Main Use Cases

- Monitor and report changes to websites (e.g., pricing, content, headings, FAQs) over time
- Automate web audits, compliance checks, or competitive benchmarking
- Generate detailed change logs and share them automatically with stakeholders

How it works

The workflow operates as a scheduled process, organized into these stages:

1. Initialization & Configuration: Triggers weekly (or manually) and initializes key variables: Google Drive folder, spreadsheet IDs, notification emails, and test mode.
2. Input Retrieval: Reads the list of URLs to be monitored from a Google Sheet.
3. Web Scraping & Structuring: For each URL, an AI agent uses Bright Data's scrape_as_markdown tool to extract the full web page content. The workflow then parses this content into well-structured JSON, capturing elements like metadata, headings, pricing, navigation, calls to action, contacts, banners, and FAQs.
4. Saving Current Week’s Results: The structured JSON is saved to Google Drive as the current week’s snapshot for each monitored URL. The Google Sheet is updated with file references for traceability.
5. Comparison with Previous Snapshot: If a prior week’s file exists, it is downloaded and parsed. The workflow compares the current and previous JSON snapshots, detecting and categorizing all substantive content changes (e.g., new/updated plans, FAQ edits, contact info modifications); see the sketch after this section. Optionally, in test mode, mock changes are introduced for demo and validation purposes.
6. Change Report Generation & Delivery: A rich Markdown-formatted changelog is generated, summarizing the detected changes, and then converted to HTML. The changelog is uploaded to Google Docs and linked back to the tracking sheet. An HTML email with the full report and relevant links is sent to recipients.

Summary Flow:

- Schedule/workflow trigger → initialize variables
- Read URL list from spreadsheet
- For each URL: scrape & structure as JSON; save to Drive, update tracking sheet
- If a previous week exists: download & parse previous; compare, generate changelog
- Convert to HTML, save to Docs, update Sheet
- Email results

Benefits:

- Fully automated website change tracking with end-to-end reporting
- Adaptable and extensible for any set of monitored pages and content types
- Easy integration with Google Workspace tools for collaboration and storage
- Minimal manual intervention required after initial setup
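Stage 5's snapshot comparison can be pictured as a recursive diff over the two structured JSON files. The sketch below is a minimal illustration, assuming both snapshots share the shape produced by the AI agent; the field names and change categories are placeholders, not the template's exact logic.

```javascript
// Hedged sketch of a week-over-week JSON snapshot diff (n8n Code-node style).
function diffSnapshots(prev, curr, path = '') {
  const changes = [];
  const keys = new Set([...Object.keys(prev ?? {}), ...Object.keys(curr ?? {})]);
  for (const key of keys) {
    const p = prev?.[key];
    const c = curr?.[key];
    const here = path ? `${path}.${key}` : key;
    if (p === undefined) changes.push({ type: 'added', path: here, value: c });
    else if (c === undefined) changes.push({ type: 'removed', path: here, value: p });
    else if (p && c && typeof p === 'object' && typeof c === 'object')
      changes.push(...diffSnapshots(p, c, here)); // recurse into nested sections
    else if (p !== c) changes.push({ type: 'updated', path: here, from: p, to: c });
  }
  return changes;
}

// Assumes upstream nodes placed both snapshots on the incoming item.
return [{ json: { changes: diffSnapshots($json.previous, $json.current) } }];
```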
by Blockia Labs
Time Logging on Clockify Using Slack

How it works

This workflow simplifies time tracking for teams and agencies by integrating Slack with Clockify. It enables users to log, update, or delete time entries directly within Slack, leveraging an AI-powered assistant for seamless and conversational interactions.

Key features include:

- **Effortless Time Logging**: Create and manage time entries in Clockify without leaving Slack.
- **AI-Powered Assistant**: Get step-by-step guidance to ensure accurate and efficient time logging.
- **Project and Client Management**: Retrieve project and client information from Clockify effortlessly.
- **Overlap Prevention**: Avoid overlapping entries with built-in time validation.
- **Automated Descriptions**: Generate ethical, grammatically correct descriptions for time logs.

Set up steps

1. Prepare your integrations
- Ensure you have active accounts for both Slack and Clockify.
- Generate your Clockify API credentials for integration.

2. Import the workflow
- Download and import the workflow template into your n8n instance.
- Configure the workflow to connect with your Slack and Clockify accounts.

3. Configure the workflow
- Add your Clockify API credentials in the workflow settings.
- Set up the Slack Trigger to listen for app mentions or specific commands.

4. Test the workflow
- Use Slack to create a time entry and verify it in Clockify.
- Test updating and deleting existing entries to ensure smooth functionality.
- Check for any overlapping time logs or incorrect data entries.

Why use this workflow?

- **Efficiency**: Eliminate the need to switch between tools for time tracking.
- **Accuracy**: AI-driven validation ensures error-free entries.
- **Automation**: Simplify repetitive tasks like updating or deleting time logs.
- **Proactive Guidance**: Conversational assistant ensures smooth operations.
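Under the hood, logging an entry comes down to a Clockify REST call like the one sketched below. The endpoint shape follows Clockify's public API (api.clockify.me/api/v1); the workspace and project IDs and the timestamps are placeholders, and the template may route this through dedicated nodes rather than a raw request.

```javascript
// Hedged sketch of creating a Clockify time entry.
const res = await fetch(
  'https://api.clockify.me/api/v1/workspaces/<WORKSPACE_ID>/time-entries',
  {
    method: 'POST',
    headers: {
      'X-Api-Key': '<CLOCKIFY_API_KEY>',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      start: '2024-05-01T09:00:00Z', // checked against existing entries to prevent overlap
      end: '2024-05-01T10:30:00Z',
      projectId: '<PROJECT_ID>',
      description: 'Implemented Slack integration for time tracking',
    }),
  }
);
const entry = await res.json(); // returned to the Slack assistant as confirmation
```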
by InfyOm Technologies
✅ What problem does this workflow solve?

If you're using a self-hosted n8n instance, there's no built-in version history or undo for your workflows. If a workflow is accidentally modified or deleted, there's no way to roll back. This backup workflow solves that problem by automatically syncing your workflows to Google Drive, giving you version control and peace of mind.

⚙️ What does this workflow do?

- ⏱ Runs on a set schedule (e.g., daily or every 12 hours).
- 🔍 Fetches all workflows from your self-hosted n8n instance.
- 🧠 Detects changes to avoid duplicate backups.
- 📁 Creates a dedicated folder for each workflow in Google Drive.
- 💾 Uploads new or updated workflow files in JSON format.
- 🗃️ Keeps backup history organized by date.
- 🔄 Allows for easy restore by importing backed-up JSON into n8n.

🔧 Setup Instructions

1. Google Drive Setup
- Connect your Google Drive account using the Google Drive node in n8n.
- Choose or create a root folder (e.g., n8n-workflow-backups) where backups will be stored.

2. n8n API Credentials
- Generate a Personal Access Token from your self-hosted n8n instance: go to Settings → API in your n8n dashboard.
- Copy the token and use it in the HTTP Request node headers as: Authorization: Bearer <your_token>

3. Schedule the Workflow
- Use the Cron node to schedule this workflow to run at your desired frequency (e.g., once a day or every 12 hours).

🧠 How it Works

Step-by-step flow:

1. Scheduled Trigger: The workflow begins on a timed schedule using the Cron node.
2. Fetch All Workflows: Uses the n8n API (/workflows) to retrieve a list of all existing workflows.
3. Loop Through Workflows: For each workflow, a folder is created in Google Drive using the workflow name, and the workflow’s last updated timestamp is checked against Google Drive backups.
4. Smart Change Detection: If the workflow has changed since the last backup, a new .json file is uploaded to the corresponding folder. The file is named with the last updated date of the workflow (YYYY-MM-DD-HH-mm-ss.json) to maintain a versioned history. If no change is detected, the workflow is skipped.

🗂 Google Drive Folder Organization

Backups are neatly organized by workflow and version:

```
/n8n-workflow-backups/
├── google-drive-backup-KqhdMBHIyAaE7p7v/
│   ├── 2025-07-15-13-03-32.json
│   ├── 2025-07-14-03-08-12.json
├── resume-video-avatar-KqhdMBHIyAaE8p8vr/
│   ├── 2025-07-15-23-05-52.json
```

Each folder is named after the workflow's name + id and contains timestamped versions.

🔧 Customization Options

- 📅 Change Backup Frequency: Adjust the Cron node to run backups daily, weekly, or even hourly based on your needs.
- 📤 Use a Different Storage Provider: You can swap out Google Drive for Dropbox, S3, or another cloud provider with minimal changes.
- 🧪 Add Workflow Filtering: Only back up workflows that are active or match specific tags by filtering results from the n8n API.

♻️ How to Restore a Workflow from Backup

1. Go to the Google Drive backup folder for the workflow you want to restore.
2. Download the desired .json file (based on the date).
3. Open your self-hosted n8n instance.
4. Click Import Workflow from the sidebar menu.
5. Upload the JSON file to restore the workflow.

> You can choose to overwrite an existing workflow or import it as a new one.

👤 Who can use this?

This template is ideal for:

- 🧑‍💻 Developers running self-hosted n8n
- 🏢 Teams managing large workflow libraries
- 🔐 Anyone needing workflow versioning, rollback, or disaster recovery
- 💾 Productivity enthusiasts looking for automated backups

📣 Tip

Consider enabling version history in Google Drive so you get even more fine-grained backup recovery options on top of what this workflow provides!

🚀 Ready to use?

Just plug in your n8n token, connect Google Drive, and schedule your backups. Your workflows are now protected!
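For orientation, the "Fetch All Workflows" step boils down to a call like the sketch below, using the Authorization header described in the setup. The base URL is a placeholder, and note that some n8n versions expect an X-N8N-API-KEY header instead of a Bearer token, so verify against your instance's API settings.

```javascript
// Hedged sketch of fetching workflows and deriving the versioned file name.
const res = await fetch('https://your-n8n-host/api/v1/workflows', {
  headers: { Authorization: 'Bearer <your_token>' },
});
const { data: workflows } = await res.json();

// updatedAt drives both change detection and the YYYY-MM-DD-HH-mm-ss.json naming.
const fileName =
  new Date(workflows[0].updatedAt)
    .toISOString()           // e.g. "2025-07-15T13:03:32.000Z"
    .replace(/[T:]/g, '-')   // "2025-07-15-13-03-32.000Z"
    .slice(0, 19) + '.json'; // "2025-07-15-13-03-32.json"
```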
by vinci-king-01
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works

This workflow automatically monitors competitor prices, analyzes market demand, and optimizes product pricing in real-time for maximum profitability using advanced AI algorithms.

Key Steps

1. Hourly Trigger - Runs automatically every hour for real-time price optimization and competitive response.
2. Multi-Platform Competitor Monitoring - Uses AI-powered scrapers to track prices from Amazon, Best Buy, Walmart, and Target.
3. Market Demand Analysis - Analyzes Google Trends data to understand search volume trends and seasonal patterns.
4. Customer Sentiment Analysis - Reviews customer feedback to assess price sensitivity and value perception.
5. AI Pricing Optimization - Calculates optimal prices using weighted factors including competitor positioning, demand indicators, and inventory levels (see the sketch after this section).
6. Automated Price Updates - Directly updates e-commerce platform prices when significant opportunities are identified.
7. Comprehensive Analytics - Logs all pricing decisions and revenue projections to Google Sheets for performance tracking.

Set up steps

Setup time: 15-20 minutes

1. Configure ScrapeGraphAI credentials - Add your ScrapeGraphAI API key for AI-powered competitor and market analysis.
2. Set up e-commerce API connection - Connect your e-commerce platform API for automated price updates.
3. Configure Google Sheets - Set up Google Sheets connections for pricing history and revenue analytics logging.
4. Set up Slack notifications - Connect your Slack workspace for real-time pricing alerts and team updates.
5. Customize product catalog - Modify the product configuration with your actual products, costs, and pricing constraints.
6. Adjust monitoring frequency - Change the trigger timing based on your business needs (hourly, daily, etc.).
7. Configure competitor platforms - Update competitor URLs and selectors for your target market.

What you get

- **Real-time price optimization** with 15-25% potential revenue increase through intelligent pricing
- **Competitive intelligence** with automated monitoring of major e-commerce platforms
- **Market demand insights** with seasonal and trend-based pricing adjustments
- **Customer sentiment analysis** to understand price sensitivity and value perception
- **Automated price updates** when significant opportunities are identified (>2% change, >70% confidence)
- **Comprehensive analytics** with pricing history, revenue projections, and performance tracking
- **Team notifications** with detailed market analysis and pricing recommendations
- **Margin protection** with intelligent constraints to maintain profitability
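As a rough illustration of step 5 (AI Pricing Optimization), the sketch below shows one way a weighted pricing calculation with margin protection could look. The weights, field names, and bounds are assumptions for demonstration; the template's actual model and confidence scoring will differ.

```javascript
// Hedged sketch of a weighted price-optimization function.
function optimalPrice({ cost, currentPrice, competitorAvg, demandIndex, inventoryRatio }) {
  // Weighted blend: anchor to competitors, nudge by demand and stock level.
  const target =
    0.5 * competitorAvg +
    0.3 * currentPrice * (1 + 0.1 * demandIndex) +           // demandIndex in [-1, 1]
    0.2 * currentPrice * (1 + 0.1 * (0.5 - inventoryRatio)); // low stock => higher price

  const floor = cost * 1.15; // margin protection: never price below cost + 15%
  const price = Math.max(target, floor);
  const changePct = (price - currentPrice) / currentPrice;

  // Mirror the template's ">2% change" gate before updating the platform.
  return Math.abs(changePct) > 0.02 ? { price: +price.toFixed(2), changePct } : null;
}
```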
by Custom Workflows AI
Introduction

The "High-Level Service Page SEO Blueprint Report" workflow is a powerful, AI-driven solution designed to generate comprehensive SEO content strategies for service-based businesses. By analyzing competitor websites and user intent, this workflow creates a detailed blueprint that outlines the optimal structure, content, and conversion elements for a service page.

The workflow leverages the JINA Reader API to extract content from competitor websites and uses Google Gemini AI to perform deep analysis across multiple dimensions: competitor content structure, user intent, strategic opportunities, and conversion optimization. The final output is a professionally formatted Markdown document that provides actionable guidance for creating a high-performing service page that satisfies both user needs and search engine requirements.

This workflow eliminates the time-consuming process of manually analyzing competitors and developing content strategies, providing a data-driven foundation for service page creation that would typically require hours of expert analysis.

Who is this for?

This workflow is designed for digital marketers, SEO specialists, content strategists, and web developers who need to create or optimize service pages for businesses. It's particularly valuable for marketing agencies and freelancers who regularly develop content strategies for clients across various industries.

Users should have a basic understanding of SEO concepts, content marketing, and website structure. While technical SEO knowledge is beneficial, the workflow is designed to provide comprehensive guidance even for those with intermediate-level expertise. The ideal user is someone who wants to streamline their content planning process and ensure their service pages are built on data-driven insights rather than guesswork.

What problem is this workflow solving?

Creating effective service pages that rank well in search engines while converting visitors is a complex challenge that typically requires extensive competitive research, content planning, and conversion optimization expertise. This workflow addresses several key pain points:

1. Time-consuming competitor analysis: Manually analyzing multiple competitor websites to identify content patterns, heading structures, and meta tag strategies can take hours.
2. Difficulty identifying content gaps: Determining what topics competitors are missing that could provide a competitive advantage requires deep analysis and industry knowledge.
3. Balancing SEO and conversion elements: Creating content that satisfies both search engines and user needs while driving conversions is a delicate balance that many struggle to achieve.
4. Lack of a structured approach: Many content creators work without a comprehensive blueprint, leading to inconsistent results and missed opportunities.
5. Difficulty translating analysis into actionable recommendations: Even when analysis is performed, turning those insights into a concrete content plan can be challenging.

This workflow automates these processes, providing a structured, data-driven approach to service page creation that saves hours of research and planning time.

What this workflow does

Overview

The workflow takes a list of competitor URLs and a target keyword as input, then performs a multi-stage analysis to generate a comprehensive service page blueprint.
It extracts and analyzes competitor content, evaluates user intent, identifies strategic opportunities, and creates detailed recommendations for page structure, content, and conversion elements. The final output is a professionally formatted Markdown document that serves as a complete roadmap for creating an effective service page.

Process

1. Data Collection: The workflow begins with a form that collects essential information: competitor URLs, target keyword, services offered, brand name, and whether the page is a homepage.
2. Competitor Content Extraction: The workflow processes each competitor URL, using the JINA Reader API to extract the HTML content from each site.
3. Content Structure Analysis: For each competitor site, the workflow extracts and analyzes heading structures, meta tags, schema markup, and recurring phrases (n-grams).
4. Competitor Analysis Report: The AI synthesizes the competitive data to identify patterns in meta titles/descriptions, common outline sections, key heading concepts, and structural elements.
5. User Intent Analysis: The workflow analyzes the target keyword to determine primary and secondary user intents, user personas, and their position in the buyer's journey.
6. Gap Analysis: The AI identifies content overlaps ("table stakes"), content gaps (opportunities), SEO keyword priorities, and potential UX/conversion advantages.
7. Page Outline Generation: Based on the previous analyses, the workflow creates an optimal page structure with H1, H2s, H3s, and potentially H4s, with justifications for each section.
8. UX & Conversion Recommendations: The workflow adds detailed recommendations for calls-to-action, trust signals, copywriting tone, visual elements, and risk reversal strategies.
9. Final Blueprint Creation: All analyses and recommendations are compiled into a comprehensive, well-structured Markdown document that serves as a complete service page blueprint.

Setup

1. Download or import the "High-Level Service Page SEO Blueprint Report" workflow JSON file into your n8n instance.
2. Create a JINA Reader API key by visiting https://jina.ai/api-dashboard/key-manager. You can claim a free API key that allows up to 1 million tokens.
3. Set up Google Gemini (PaLM) credentials by following the guide at https://docs.n8n.io/integrations/builtin/credentials/googleai/#using-geminipalm-api-key.
4. Update the "Edit Fields" node with your JINA Reader API key; adjust the "Waiting Time" to 20 seconds if using the free Google Gemini API tier (which limits you to 5 requests per minute); and optionally change the Gemini model if needed.
5. Activate the workflow and start the form trigger.
6. Complete the form with: competitors (up to 5 direct competitor URLs), target keyword (the query related to your service), services offered (details of your complete service offerings), brand name (your company name), and whether the page is a homepage.
7. After processing, download the generated .txt file, which contains the blueprint in Markdown format.

How to customize this workflow to your needs

- Adjust AI parameters: Modify the temperature settings in the Google Gemini Chat Model nodes to control creativity vs. precision in the AI outputs.
- Customize extraction logic: Edit the "Extract HTML Elements" code node to focus on specific HTML elements that are most relevant to your industry or content type.
- Modify analysis prompts: Customize the prompts in the various analysis nodes to focus on specific aspects of SEO or content strategy that are most important for your use case.
- Add industry-specific guidance: Enhance the prompts with industry-specific instructions or examples to make the output more relevant to particular sectors.
- Integrate with content management systems: Extend the workflow to automatically send the blueprint to content management systems, project management tools, or document storage platforms.
- Add competitor scoring: Implement a scoring system to evaluate and rank competitors based on specific criteria relevant to your strategy.
- Expand the analysis: Add additional analysis nodes to evaluate other aspects of competitor websites, such as page speed, mobile-friendliness, or backlink profiles.
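As a point of reference for the competitor-content extraction step, fetching a page through the JINA Reader API looks roughly like the sketch below. The r.jina.ai prefix pattern and Bearer auth follow JINA's public docs; the X-Return-Format header and competitor URL are illustrative assumptions.

```javascript
// Hedged sketch of pulling competitor HTML via JINA Reader.
const competitorUrl = 'https://example-competitor.com/service-page'; // placeholder
const res = await fetch(`https://r.jina.ai/${competitorUrl}`, {
  headers: {
    Authorization: 'Bearer <JINA_READER_API_KEY>',
    'X-Return-Format': 'html', // keep markup so headings/meta/schema can be extracted
  },
});
const html = await res.text(); // feeds the "Extract HTML Elements" code node
```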
by Yaron Been
Transform chaotic support requests into organized, actionable insights automatically. This intelligent workflow captures support tickets from forms, uses AI to categorize and analyze sentiment, stores everything in organized databases, and delivers comprehensive analytics reports to your team - eliminating manual sorting while providing valuable business intelligence.

🚀 What It Does

- Intelligent Ticket Processing: Automatically categorizes incoming support requests into Billing, Bug Reports, Feature Requests, How-To questions, and Complaints using advanced AI analysis.
- Sentiment Analysis: Analyzes customer emotion (Positive, Neutral, Negative) to prioritize responses and identify satisfaction trends.
- Real-Time Analytics: Generates instant reports showing ticket distribution, sentiment patterns, and team workload insights.
- Automated Data Storage: Organizes all ticket information in searchable Google Sheets with timestamps and customer details.
- Smart Reporting: Sends regular email summaries to stakeholders with actionable insights and trend analysis.

🎯 Key Benefits

✅ Save 10+ Hours Weekly: Eliminate manual ticket sorting and categorization
✅ Improve Response Times: Prioritize tickets based on category and sentiment
✅ Boost Customer Satisfaction: Never miss urgent issues or complaints
✅ Track Performance: Monitor support trends and team effectiveness
✅ Scale Operations: Handle increasing ticket volume without additional staff
✅ Data-Driven Decisions: Make informed improvements based on real patterns

🏢 Perfect For

Customer Support Teams
- SaaS companies managing user inquiries and bug reports
- E-commerce stores handling order and product questions
- Service businesses organizing client communications
- Startups scaling support operations efficiently

Business Applications
- **Help Desk Management**: Organize and prioritize incoming support requests
- **Customer Success**: Monitor satisfaction levels and identify improvement areas
- **Product Development**: Track feature requests and bug report patterns
- **Team Management**: Distribute workload based on ticket categories and urgency

⚙️ What's Included

- Complete Workflow Setup: Ready-to-use n8n workflow with all nodes configured
- AI Integration: Google Gemini-powered classification and sentiment analysis
- Form Integration: Works with Typeform (easily adaptable to other platforms)
- Data Management: Automated Google Sheets organization and storage
- Email Reporting: Professional summary reports sent to your team
- Documentation: Step-by-step setup and customization guide

🔧 Technical Requirements

- **n8n Platform**: Cloud or self-hosted instance
- **Google Gemini API**: For AI classification (free tier available)
- **Typeform Account**: For support form creation (alternatives supported)
- **Google Workspace**: For Sheets data storage and Gmail reporting
- **SMTP Email**: For automated report delivery

📊 Sample Output

Daily Support Summary Email:

```
📧 Support Ticket Summary - March 15, 2024

📊 TICKET BREAKDOWN:
• Billing: 12 tickets (30%)
• Bug Report: 8 tickets (20%)
• Feature Request: 6 tickets (15%)
• How-To: 10 tickets (25%)
• Complaint: 4 tickets (10%)

😊 SENTIMENT ANALYSIS:
• Positive: 8 tickets (20%)
• Neutral: 22 tickets (55%)
• Negative: 10 tickets (25%)

⚡ PRIORITY ACTIONS:
• 4 complaints requiring immediate attention
• 3 billing issues escalated to finance team
• 6 feature requests for product backlog review
```

🎨 Customization Options

- Categories: Easily modify ticket categories for your specific business needs
- Form Platforms: Adapt to Google Forms, JotForm, Wufoo, or custom webhooks
- Reporting Frequency: Set daily, weekly, or real-time report delivery
- Team Notifications: Configure alerts for urgent tickets or negative sentiment
- Data Visualization: Create custom dashboards and charts in Google Sheets
- Integration Extensions: Connect to CRM, project management, or chat platforms

🔄 How It Works

1. Customer submits support request via your form
2. AI analyzes message content and assigns category + sentiment
3. Data is automatically stored in organized Google Sheets
4. System generates real-time analytics on all historical tickets
5. Professional report is emailed to your support team
6. Team can prioritize responses based on urgency and sentiment

💡 Use Case Examples

- SaaS Company: Automatically route billing questions to finance, bugs to development, and feature requests to the product team
- E-commerce Store: Prioritize shipping complaints, categorize product questions, and track customer satisfaction trends
- Consulting Firm: Organize client requests by service type, monitor project-related issues, and ensure timely responses
- Healthcare Practice: Sort appointment requests, billing inquiries, and medical questions while maintaining HIPAA compliance

📈 Expected Results

- **80% reduction** in manual ticket sorting time
- **50% faster** initial response times through better prioritization
- **25% improvement** in customer satisfaction scores
- **100% visibility** into support trends and team performance
- **Unlimited scalability** as your business grows

📞 Get Help & Learn More

🎥 Free Video Tutorials
- YouTube Channel: https://www.youtube.com/@YaronBeen/videos

💼 Professional Support
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Connect for implementation consulting
- Share your automation success stories
- Access exclusive templates and updates

📧 Direct Support
- Email: Yaron@nofluff.online
- Technical setup assistance
- Custom workflow modifications
- Integration with existing systems
- Response within 24 hours

🏆 Why Choose This Workflow

- Proven Results: Successfully deployed across 100+ businesses worldwide
- Expert Created: Built by an automation specialist with 10+ years of experience
- Continuously Updated: Regular improvements and new features added
- Money-Back Guarantee: Full refund if not satisfied within 30 days
- Lifetime Support: Ongoing help and updates included with purchase
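To make the classification step concrete, here is a minimal sketch of the prompt-and-parse pattern an n8n Code node could use around the Gemini call. The prompt wording, field names, and JSON contract are assumptions; the template's actual Gemini prompt may differ.

```javascript
// Hedged sketch of building the classification prompt and parsing the result.
const prompt = `Classify this support ticket.
Categories: Billing, Bug Report, Feature Request, How-To, Complaint.
Sentiment: Positive, Neutral, Negative.
Reply with JSON only: {"category": "...", "sentiment": "..."}

Ticket: """${$json.message}"""`;

// Assumes a prior Gemini node placed its text output in $json.geminiResponse.
const { category, sentiment } = JSON.parse($json.geminiResponse);
return [{ json: { ...$json, category, sentiment, processedAt: new Date().toISOString() } }];
```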
by Kanaka Kishore Kandregula
Daily Magento 2 Stock Check Automation

It identifies SKUs with low inventory per source and sends daily alerts via:

- 📬 Gmail (HTML email)
- 💬 Slack (formatted text message)

This automation empowers store owners and operations teams to stay ahead of inventory issues by proactively monitoring stock levels across all Magento 2 sources. By receiving early alerts for low-stock products, businesses can restock before items sell out—ensuring continuous product availability, reducing missed sales opportunities, and maintaining customer trust. Avoiding stockouts not only protects your brand reputation but also keeps your store competitive by preventing customers from turning to competitors due to unavailable items. Timely restocking leads to higher fulfillment rates, improved customer satisfaction, and ultimately, stronger revenue and long-term loyalty.

✅ Features:

- Filters out configurable, virtual, and downloadable products
- Uses Magento 2 MSI stock per source
- Customizable thresholds (default: ≤10 overall or ≤5 per source)
- HTML-formatted email report
- Slack notification with a code-formatted summary
- Runs daily via Cron (08:50 AM)
- No third-party modules needed
- One-time setup

🔑 Credentials Used

- HTTP Request (Magento 2 REST API using Bearer Token)
- Gmail (OAuth2)
- Slack (OAuth2 or Webhook)

📊 Tags

Magento, Inventory, MSI, Stock Alert, Ecommerce, Slack, Gmail, Automation

📂 Category

E-commerce → Magento 2 (Adobe Commerce)

👤 Author

Kanaka Kishore Kandregula
Certified Magento 2 Developer
https://gravatar.com/kmyprojects
https://www.linkedin.com/in/kanakakishore
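For context, the per-source stock data comes from Magento's MSI REST API; a call like the sketch below is one plausible shape for it. The /V1/inventory/source-items endpoint is part of core Magento 2 MSI, while the page size and filtering shown are illustrative, and the workflow's actual request and threshold logic may differ.

```javascript
// Hedged sketch of querying MSI source items and applying the ≤5-per-source threshold.
const res = await fetch(
  'https://your-store.com/rest/V1/inventory/source-items?searchCriteria[pageSize]=500',
  { headers: { Authorization: 'Bearer <MAGENTO_API_TOKEN>' } }
);
const { items } = await res.json(); // [{ sku, source_code, quantity, status }, ...]

// status === 1 means "in stock"; the ≤10 overall check would sum quantity across sources.
const lowStockPerSource = items.filter(i => i.status === 1 && i.quantity <= 5);
```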
by scrapeless official
AI-Powered Web Data Pipeline with n8n

How It Works

This n8n workflow builds an AI-powered web data pipeline that automates the entire process of:

- **Extraction**
- **Structuring**
- **Vectorization**
- **Storage**

It integrates multiple advanced tools to transform messy web pages into clean, searchable vector databases.

Integrated Tools

- **Scrapeless**: Bypasses JavaScript-heavy websites and anti-bot protections to reliably extract HTML content.
- **Claude AI**: Uses LLMs to analyze unstructured HTML and generate clean, structured JSON data.
- **Ollama Embeddings**: Generates local vector embeddings from structured text using the all-minilm model.
- **Qdrant Vector DB**: Stores semantic vector data for fast and meaningful search capabilities.
- **Webhook Notifications**: Sends real-time updates when workflows complete or errors occur.

From messy webpages to structured vector data — this pipeline is perfect for building intelligent agents, knowledge bases, or research automation tools.

Setup Steps

1. Install n8n

> Requires Node.js v18 / v20 / v22

```bash
npm install -g n8n
n8n
```

After installation, access the n8n interface via: http://localhost:5678

2. Set Up Scrapeless

- Register at Scrapeless
- Copy your API token
- Paste the token into the HTTP Request node labeled "Scrapeless Web Request"

3. Set Up Claude API (Anthropic)

- Sign up at the Anthropic Console
- Generate your Claude API key
- Add the API key to the following nodes: Claude Extractor, AI Data Checker, Claude AI Agent

4. Install and Run Ollama

macOS:

```bash
brew install ollama
```

Linux:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Windows: download the installer from https://ollama.com

Start the Ollama server:

```bash
ollama serve
```

Pull the embedding model:

```bash
ollama pull all-minilm
```

5. Install Qdrant (via Docker)

```bash
docker pull qdrant/qdrant
docker run -d \
  --name qdrant-server \
  -p 6333:6333 -p 6334:6334 \
  -v $(pwd)/qdrant_storage:/qdrant/storage \
  qdrant/qdrant
```

Test if Qdrant is running:

```bash
curl http://localhost:6333/healthz
```

6. Configure the n8n Workflow

- Modify the trigger (manual or scheduled)
- Input your target URLs and collection name in the designated nodes
- Paste all required API tokens / keys into their corresponding nodes
- Ensure your Qdrant and Ollama services are running

Ideal Use Cases

- Custom AI Chatbots
- Private Search Engines
- Research Tools
- Internal Knowledge Bases
- Content Monitoring Pipelines
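To make the vectorization and storage stages concrete, the sketch below embeds a structured text snippet with Ollama's all-minilm model and upserts it into Qdrant. Both endpoints use the services' local defaults as installed above; the collection name ("web_data"), point ID, and payload are illustrative assumptions, not the workflow's exact node configuration.

```javascript
// Hedged sketch of the embed-then-upsert step.
const text = 'Acme Corp, pricing: $49/mo, free tier available'; // placeholder structured text
const embedRes = await fetch('http://localhost:11434/api/embeddings', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ model: 'all-minilm', prompt: text }),
});
const { embedding } = await embedRes.json();

await fetch('http://localhost:6333/collections/web_data/points?wait=true', {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    points: [{ id: 1, vector: embedding, payload: { text, source: 'scrapeless' } }],
  }),
});
```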
by Chad McGreanor
Overview

This workflow automates LinkedIn posts using OpenAI. The prompts are stored in the workflow and can be customized to fit your needs. The workflow uses a combination of a Schedule Trigger, some code that determines what day of the week it is (no posting Friday - Sunday), a prompts node to set your OpenAI prompts, and a random selection of a prompt so that you are not generating content that looks repetitive. We send that all to the OpenAI API, select a random time, have the final LinkedIn post sent to your Telegram for approval, and once approved, wait for the correct time slot and then post to your LinkedIn account using the LinkedIn node.

How it works:

1. Run or schedule the workflow in n8n. The automation can be triggered manually or on a custom schedule (excluding weekends if needed). You should customize the prompts in the Prompt node to suit your needs.
2. A random LinkedIn post prompt is selected. Pre-written prompts are rotated to keep content fresh and non-repetitive.
3. OpenAI generates the LinkedIn post. The prompt is sent to OpenAI via API, and the result is returned in clean, ready-to-use form.
4. You receive the draft via Telegram. The post is sent to Telegram for quick approval or review.
5. The post is scheduled or published via the LinkedIn connector. Once approved, the workflow delays until the target time, then sends the content to LinkedIn.

What's needed:

An OpenAI API key, a LinkedIn account, and a Telegram account. For Telegram you will need to configure the bot service.

Step-by-Step: Telegram Approval for Your Workflow

A. Set Up a Telegram Bot
1. Open Telegram and search for "@BotFather".
2. Start a chat and type /newbot to create a bot.
3. Give your bot a name and a unique username (e.g., YourApprovalBot).
4. Copy the API token that BotFather gives you.

B. Add Your Bot to a Private Chat (with You)
1. Find your bot in Telegram, click "Start" to activate it.
2. Send a test message (like "hello") so the chat is created.

C. Get Your User ID
1. Search for "userinfobot" or use sites like userinfobot in Telegram.
2. Type /start and it will reply with your Telegram user ID.

OpenAI powers the LinkedIn post creation

Add your OpenAI API key:
1. Log in to your OpenAI Platform account: https://platform.openai.com/.
2. Go to API keys and create a new secret key.
3. In n8n, create a new "OpenAI API" credential, paste your API key, and give it a name.
4. Apply the credential to the OpenAI Message node.

Connect your LinkedIn account to the LinkedIn node, then select your account from the LinkedIn dropdown box.
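The weekday gate mentioned in the overview can be as simple as the Code-node sketch below; the output field name is an assumption, and the template's own code may differ.

```javascript
// Hedged sketch of the "no posting Friday - Sunday" check.
const day = new Date().getDay();          // 0 = Sunday ... 6 = Saturday
const shouldPost = day >= 1 && day <= 4;  // allow Monday through Thursday only
return [{ json: { shouldPost } }];
```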