by Darien Kindlund
Do you consistently forget to set a Default Error Workflow when creating new workflows? Then this helper workflow is for you! When activated, this helper workflow will:

- Scan ALL other workflows every 4 hours
- Make sure ALL workflows have a default error workflow set (based on the Workflow ID you provide)

This helper will SKIP OVER any workflows that have the `default_error:false` tag set. Make sure your default error workflow itself has the `default_error:false` tag, so that you don't end up with recursive loops during errors.

**Setup Nodes:**

1. Once imported, edit the **Set Vars** node with your `default_error_workflow_id` value. If you want to change the `default_error:false` tag to some other tag name, you can do so here as well.
2. Update the **Set Default Error Workflow** node with your PostgreSQL credentials to access the n8n database.
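For reference, the skip-and-update decision could look like the sketch below inside an n8n Code node. "Set Vars" matches the node named above; the `settings.errorWorkflow` and `tags` field names are assumptions based on n8n's standard workflow schema, so adapt them to your database layout.

```typescript
// Hedged sketch of the skip/update decision, written as an n8n Code node.
// "Set Vars" matches the node above; settings.errorWorkflow and tags are
// assumptions based on n8n's standard workflow schema.
const defaultErrorWorkflowId = $('Set Vars').first().json.default_error_workflow_id;
const skipTag = 'default_error:false';

return $input.all()
  .filter((item) => {
    const wf = item.json;
    const tagNames = (wf.tags ?? []).map((t) => t.name ?? t);
    if (tagNames.includes(skipTag)) return false; // honor the opt-out tag
    return (wf.settings ?? {}).errorWorkflow !== defaultErrorWorkflowId;
  })
  .map((item) => ({
    json: { id: item.json.id, errorWorkflow: defaultErrorWorkflowId }, // rows to update
  }));
```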
by InfraNodus
**Using knowledge graphs instead of RAG vector stores**

This workflow creates an AI chatbot agent that has access to several knowledge bases at the same time (used as "experts"). These knowledge bases are provided by InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows.

The advantages of using GraphRAG instead of standard vector stores for knowledge are:

- Easy and quick to set up (no complex data import workflows needed)
- A knowledge graph has a holistic view of your knowledge base
- Better retrieval of relations between the document chunks = higher-quality responses

**How it works**

This template uses the n8n AI Agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt. Here's a step-by-step description:

1. The user submits a question using the AI chatbot (the n8n chat interface in this case, which can be accessed via a URL or embedded into any website).
2. The AI Agent node checks a list of tools it has access to. Each tool has a description of its knowledge, auto-generated by InfraNodus.
3. The AI agent decides which tool should be used to generate a response. It may reformulate the user's query to be more suitable for the expert.
4. The query is then sent to the InfraNodus HTTP node endpoint, which queries the graph that corresponds to that expert (see the hedged request sketch below).
5. Each InfraNodus GraphRAG expert provides a rich response that takes the whole context into account, returning an answer from each expert (graph) along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.
6. The n8n AI Agent node integrates the responses received from the experts to produce the final answer.
7. The final answer is sent back to the user's chat (or a webhook endpoint).

**How to use**

You need an InfraNodus GraphRAG API account and key to use this workflow.

1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body `name` field. Keep other settings intact, or learn more about them at the InfraNodus access points page.
5. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.

**Requirements**

- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key

**Customizing this workflow**

You can use this same workflow with a Telegram bot, so you can interact with it using Telegram. There are many more customizations available. Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20174217658396-Using-InfraNodus-Knowledge-Graphs-as-Experts-for-AI-Chatbot-Agents-in-n8n

Also check out the video tutorial with a demo.
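The per-expert HTTP call could look roughly like this. Only the Bearer authorization and the body `name` field (the graph name) come from the description above; the endpoint path and the other body fields are assumptions to verify against the InfraNodus access points page.

```typescript
// Hedged sketch of one "expert" HTTP Request call. The endpoint path and
// the prompt body field are assumptions; only Bearer auth and the "name"
// field (the graph name) are confirmed by the template description.
const userQuery = 'What does the knowledge base say about X?';

const res = await fetch('https://infranodus.com/api/v1/graphAndStatements', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.INFRANODUS_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    name: 'my_expert_graph', // the knowledge graph acting as this expert
    prompt: userQuery,       // the (possibly reformulated) user question
  }),
});
const expertAnswer = await res.json(); // response plus supporting statements
```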
by Billy Christi
**Who is this for?**

This workflow is perfect for:

- Companies that manage invoices through Google Drive
- Business owners who want to minimize manual data entry and maximize accuracy
- Accounting teams and finance departments seeking to automate invoice processing

**What problem is this workflow solving?**

Processing invoices manually is time-consuming, error-prone, and inconsistent. This workflow solves those issues by:

- **Automating invoice processing** from detection to data extraction to storage
- **Improving accuracy** by using AI to extract key invoice data fields reliably
- **Reducing human workload** while maintaining compliance and consistency

**What this workflow does**

This workflow creates a fully automated invoice processing system by:

1. Monitoring a Google Drive folder for new PDF invoices in real time
2. Downloading the PDF files and extracting their content using OCR technology
3. Using AI (OpenAI) to parse and extract key invoice fields such as invoice number, date, total amount, vendor name, itemized details, tax, and category
4. Validating the extracted data to ensure compliance with a structured JSON schema (see the illustrative schema below)
5. Storing structured data in Google Sheets for easy access, review, and reporting

Key features:

- AI-powered extraction handles both text-based and scanned PDF invoices
- Provides a structured, searchable invoice database in Google Sheets
- Configured to run as frequently as the user needs, ensuring timely processing

**Setup**

1. Copy the Google Sheet template here: PDF Invoice Parser – Google Sheet Template
2. Connect your Google Drive account to the Drive Trigger and File Download nodes
3. Add your OpenAI API key in the AI Parser node
4. Link the Google Sheet in the final storage node
5. Drop a test invoice PDF into the monitored Drive folder

Required credentials:

- OpenAI API Key
- Google Drive Credentials
- Google Sheets Credentials

**How to customize this workflow to your needs**

- **Modify the polling interval** (default: every minute) for higher or lower frequency.
- **Integrate with your accounting software** by adding nodes (e.g., QuickBooks, Xero).
- **Use an alternative LLM** such as Gemini or Claude.
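For orientation, the structured output the AI parser is validated against might be shaped like the interface below. The field names are assumptions derived from the fields listed above, not the template's actual schema.

```typescript
// Illustrative shape only: field names are assumptions based on the fields
// listed in the description, not the template's actual JSON schema.
interface InvoiceLineItem {
  description: string;
  quantity: number;
  unitPrice: number;
}

interface ParsedInvoice {
  invoiceNumber: string;
  invoiceDate: string;       // e.g. "2024-05-31" (ISO 8601)
  vendorName: string;
  totalAmount: number;
  tax: number;
  category: string;          // e.g. "Office Supplies"
  lineItems: InvoiceLineItem[];
}
```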
by Yaron Been
This cutting-edge n8n automation is a powerful market research tool designed to continuously monitor and capture User-Generated Content (UGC) opportunities on Fiverr. By intelligently scraping, parsing, and logging gig data, this workflow provides:

- **Automated Market Scanning**: Daily scrapes of Fiverr UGC gigs, real-time market intelligence, and consistent, hands-off data collection
- **Intelligent Data Extraction**: Parses complex HTML structures, captures key gig details, and transforms unstructured web data into actionable insights
- **Seamless Data Logging**: Automatic Google Sheets integration, comprehensive gig marketplace tracking, and historical data preservation

**Key Benefits**

- **Full Automation**: Continuous market research
- **Smart Filtering**: Detailed UGC gig insights
- **Instant Reporting**: Real-time market trends
- **Time-Saving**: Eliminate manual research

**Workflow Architecture**

- **Stage 1: Automated Triggering**: Scheduled daily gig discovery with configurable run intervals for always-on market intelligence
- **Stage 2: Web Scraping**: An HTTP Request node fetches Fiverr search results with dynamic headers to bypass potential scraping restrictions, targeting UGC-specific gigs
- **Stage 3: Data Extraction**: HTML parsing extracts critical gig information: gig prices, seller names, gig titles, and direct gig URLs
- **Stage 4: Data Logging**: Google Sheets integration stores the data automatically in a spreadsheet-ready format, building a comprehensive historical gig database

**Potential Use Cases**

- **Content Creators**: Market rate research
- **Freelance Platforms**: Competitive intelligence
- **Marketing Agencies**: UGC trend analysis
- **Recruitment Specialists**: Talent pool mapping
- **Business Strategists**: Market opportunity identification

**Setup Requirements**

- Fiverr search configuration: targeted search keywords and specific UGC categories
- Web scraping preparation: user-agent rotation strategy, potential proxy configuration, robust error handling
- Google Sheets setup: connected Google account, prepared spreadsheet, appropriate sharing permissions
- n8n installation: cloud or self-hosted instance; import the workflow configuration and configure API credentials

**Future Enhancement Suggestions**

- AI-powered gig trend analysis
- Advanced data visualization
- Real-time price change alerts
- Machine learning market predictions
- Multi-platform gig tracking

**Ethical Considerations**

- Respect Fiverr's Terms of Service
- Implement responsible scraping practices
- Avoid overwhelming target websites
- Use data for legitimate research purposes

**Technical Recommendations**

- Implement exponential backoff for requests (see the sketch below)
- Use randomized delays between scrapes
- Maintain flexible CSS selector strategies
- Consider rate limiting and IP rotation

**Connect With Me**

Ready to unlock market insights?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your market research with intelligent, automated workflows!
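A minimal sketch of the exponential-backoff-with-jitter pattern recommended above, as it might appear in an n8n Code node; the URL and User-Agent values are illustrative placeholders.

```typescript
// Minimal sketch of exponential backoff with jitter for polite scraping.
// The URL and User-Agent are illustrative placeholders.
async function fetchWithBackoff(url, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const res = await fetch(url, {
      headers: { 'User-Agent': 'Mozilla/5.0 (compatible; market-research-bot)' },
    });
    if (res.ok) return res.text();
    // Back off 1s, 2s, 4s, ... plus up to 500 ms of random jitter.
    const delayMs = 1000 * 2 ** attempt + Math.random() * 500;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`Giving up on ${url} after ${maxRetries} attempts`);
}
```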
by nero
**How it works**

A Webhook node receives your request, the Create task node submits it to the selected Nero AI service, and the Query task status node polls until processing is complete, after which the result is returned in the webhook response (a hedged create-then-poll sketch appears after the use cases below).

**How to use**

1. Create an account and apply for an API key at https://ai.nero.com/ai-api?utm_source=n8n-base-workflow.
2. Fill your key into the Create task and Query task status nodes.
3. Select an AI service and modify the Create task node parameters; the API doc: https://ai.nero.com/ai-api/docs.
4. Execute the workflow so that the webhook starts listening.
5. Make a test request with Postman or another tool, using the test URL from the Webhook node.
6. You will receive the output in the webhook response.

**Our API doc**

Please create an account to access our API docs: https://ai.nero.com/ai-api/docs.

**Use cases**

- **Large Scale Printing**: Upscale images into ultra-sharp, billboard-ready masterpieces with 300+ DPI and billions of pixels.
- **Game Assets Compression**: Improve your game performance with AI image compression: faster, better, and lossless.
- **E-commerce Image Editing**: Remove and replace your product image backgrounds, create virtual showrooms.
- **Photo Retouching**: Remove and reduce grain and noise from images.
- **Face Animation**: Transform static images into dynamic facial-expression videos or GIFs with our cutting-edge Face Animation API.
- **Photo Restoration**: Our AI-driven Photo Restoration API offers advanced scratch removal, face enhancement, and image upscaling.
- **Colorize Photo**: Transform black & white images into vivid colors.
- **Avatar Generator**: Turn your selfie into custom avatars with different styles and backgrounds.
- **Website Compression**: Speed up your website, compress your images in bulk.
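The create-then-poll pattern the two nodes implement might look like this. Every endpoint path and response field name here is an assumption; the real contract is at https://ai.nero.com/ai-api/docs.

```typescript
// Hedged sketch of create-then-poll. The base URL, header name, and the
// taskId/state/result fields are all assumptions; consult the Nero AI docs.
const BASE = 'https://api.nero.com';              // assumption: see the docs
const headers = {
  'x-neroai-api-key': process.env.NERO_API_KEY,   // assumption: header name
  'Content-Type': 'application/json',
};

async function runNeroTask(payload) {
  const created = await fetch(`${BASE}/tasks`, {
    method: 'POST',
    headers,
    body: JSON.stringify(payload),
  }).then((r) => r.json());

  // Poll every 3 seconds until the task finishes.
  for (;;) {
    const status = await fetch(`${BASE}/tasks/${created.taskId}`, { headers })
      .then((r) => r.json());
    if (status.state === 'done') return status.result;
    if (status.state === 'failed') throw new Error('Nero AI task failed');
    await new Promise((resolve) => setTimeout(resolve, 3000));
  }
}
```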
by Ahmed Alnaqa
**Who is this template for?**

This workflow template is designed for content creators, researchers, educators, and professionals who need quick, accurate summaries of YouTube videos. It's ideal for those looking to save time, extract key insights, or repurpose video content into concise formats for reports, studies, or social media.

**What does it do?**

The workflow automates the process of summarizing YouTube videos by extracting the transcript, analyzing the content, and generating a concise summary. It leverages AI tools to ensure accuracy and relevance, making it easier to digest lengthy videos in seconds.

**Why is it useful?**

This template saves hours of manual effort by automating video summarization, enabling users to focus on analyzing or sharing insights rather than watching entire videos. It's particularly useful for staying updated with trends, conducting research, or creating content efficiently.

**How does it work?**

The workflow integrates with a YouTube Transcript API powered by an Apify Actor to fetch video transcripts, processes the text using AI-powered summarization tools, and delivers a clear, concise summary.

**Setup Instructions**

You need an Apify account and an API key to connect with the Actor. Follow the steps below:

1. Create a free account.
2. Choose the appropriate Actor from the Apify search.
3. Under the Integration tab, click on "Use API endpoints."
4. Select the API that best suits your needs.
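Calling a transcript Actor synchronously could look like the sketch below. `run-sync-get-dataset-items` is a standard Apify endpoint, but the Actor ID and its `videoUrl` input field are placeholders; copy the real values from the Actor's "Use API endpoints" tab.

```typescript
// Hedged sketch: run an Apify Actor synchronously and read its dataset.
// The Actor ID and the videoUrl input field are placeholders; the
// run-sync-get-dataset-items endpoint is Apify's standard sync API.
const ACTOR_ID = 'someuser~youtube-transcript-scraper'; // placeholder

const res = await fetch(
  `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${process.env.APIFY_TOKEN}`,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ videoUrl: 'https://www.youtube.com/watch?v=dQw4w9WgXcQ' }),
  },
);
const transcriptItems = await res.json(); // one item per transcript segment
```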
by Bright Data
**Glassdoor Job Finder: Bright Data Scraping + Keyword-Based Automation**

A comprehensive n8n automation that scrapes Glassdoor job listings using Bright Data's web scraping service based on user-defined keywords, location, and country parameters, then automatically stores the results in Google Sheets.

**Overview**

This workflow provides an automated job search solution that extracts job listings from Glassdoor using form-based inputs and stores organized results in Google Sheets. Perfect for recruiters, job seekers, market research, and competitive analysis.

Workflow description: Automates Glassdoor job searches using Bright Data's web scraping capabilities. Users submit keywords, location, and country via a form trigger. The workflow scrapes job listings, extracts company details, ratings, and locations, then automatically stores organized results in Google Sheets for easy analysis and tracking.

**Key Features**

- **Form-Based Input**: Simple web form for job type, location, and country
- **Glassdoor Integration**: Uses Bright Data's Glassdoor dataset for accurate job data
- **Smart Data Processing**: Automatically extracts key job information
- **Google Sheets Storage**: Organized data storage with automatic updates
- **Status Monitoring**: Built-in progress tracking and retry logic
- **Fast & Reliable**: Professional scraping with error handling
- **Keyword Flexibility**: Search any job type with location filters
- **Structured Output**: Clean, organized job listing data

**What This Workflow Does**

Input:

- **Job Keywords**: Job title or role (e.g., "Software Engineer", "Marketing Manager")
- **Location**: City or region for job search
- **Country**: Target country for job listings

Processing: form submission, data scraping via Bright Data, status monitoring, data extraction, data processing, sheet update (see the trigger/poll/fetch sketch at the end of this description).

Output data points:

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title from listing | Senior Software Engineer |
| Company Name | Employer name | Google Inc. |
| Location | Job location | San Francisco, CA |
| Rating | Company rating score | 4.5 |
| Job Link | Direct URL to listing | https://glassdoor.com/job/... |

**Setup Instructions**

Prerequisites:

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Glassdoor scraping dataset access
- 5–10 minutes for setup

Step 1: Import the Workflow

1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows > + Add workflow > Import from JSON
3. Paste the JSON and click Import

Step 2: Configure Bright Data

1. Set up Bright Data credentials in n8n
2. Ensure access to dataset: gd_lpfbbndm1xnopbrcr0
3. Update API tokens in the "Scrape Job Data", "Check Delivery Status of Snap ID", and "Getting Job Lists" nodes

Step 3: Configure Google Sheets Integration

1. Create a new Google Sheet (e.g., "Glassdoor Job Tracker")
2. Set up Google Sheets OAuth2 credentials in n8n
3. Prepare columns: A: Job Title, B: Company Name, C: Location, D: Rating, E: Job Link

Step 4: Update Workflow Settings

1. Update the "Update Job List" node with your Sheet ID and credentials
2. Test the form trigger and webhook URL

Step 5: Test & Activate

1. Submit test data (e.g., "Software Engineer" in "New York")
2. Activate the workflow
3. Verify Google Sheet updates and field extraction

**Usage Guide**

Submitting job searches: navigate to your workflow's webhook URL, fill in the search job type, location, and country, then submit the form.

Reading the results: real-time job listing data, company ratings and reviews, direct job posting links, location-specific results, and processing timestamps.

**Customization Options**

- **More Data Points**: Add job descriptions, salary, company size, etc.
- **Search Parameters**: Add filters for salary, experience, remote work
- **Data Processing**: Add validation, deduplication, formatting

**Troubleshooting**

- **Bright Data connection failed**: Check API credentials and dataset access
- **No job data extracted**: Validate search terms and location format
- **Google Sheets permission denied**: Re-authenticate and check sharing
- **Form submission failed**: Check webhook URL and form config
- **Workflow execution failed**: Check logs, add retry logic

Advanced troubleshooting: check execution logs in n8n, test individual nodes, verify data formats, monitor rate limits, add error handling.

**Use Cases & Examples**

- **Recruitment Pipeline**: Track job postings, build a talent database
- **Market Research**: Analyze job trends, hiring patterns
- **Career Development**: Monitor opportunities, salary trends
- **Competitive Intelligence**: Track competitor hiring activity

**Advanced Configuration**

- **Batch Processing**: Accept multiple keywords, loop logic, delays
- **Search History**: Track trends, compare results over time
- **External Tools**: Integrate with CRM, Slack, databases, BI tools

**Performance & Limits**

- **Single search**: 2–5 minutes
- **Data accuracy**: 95%+
- **Success rate**: 90%+
- **Concurrent searches**: 1–3 (depends on plan)
- **Daily capacity**: 50–200 searches
- **Memory**: ~50 MB per execution
- **API calls**: 3 Bright Data + 1 Google Sheets per search

**Support & Community**

- **n8n Community Forum**: community.n8n.io
- **Documentation**: docs.n8n.io
- **Bright Data Support**: Via your dashboard
- **GitHub Issues**: Report bugs and features

Contributing: share improvements, report issues, create variations, document best practices.

Need help? Check the full documentation or visit the n8n Community for support and workflow examples.
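The three Bright Data nodes follow a trigger-then-poll-then-fetch pattern, roughly like this sketch. The endpoint paths reflect Bright Data's datasets API as commonly documented, but verify them and the response field names (`snapshot_id`, `status`) in your Bright Data dashboard; the dataset ID is the one given in the setup steps above.

```typescript
// Hedged sketch of the Bright Data trigger/poll/fetch pattern. Paths follow
// Bright Data's datasets v3 API as commonly documented; verify field names
// (snapshot_id, status) against your dashboard before relying on them.
const API = 'https://api.brightdata.com/datasets/v3';
const headers = { Authorization: `Bearer ${process.env.BRIGHTDATA_TOKEN}` };

async function scrapeGlassdoor(keyword, location, country) {
  // 1. Trigger a collection run against the Glassdoor dataset.
  const trigger = await fetch(`${API}/trigger?dataset_id=gd_lpfbbndm1xnopbrcr0`, {
    method: 'POST',
    headers: { ...headers, 'Content-Type': 'application/json' },
    body: JSON.stringify([{ keyword, location, country }]),
  }).then((r) => r.json());

  // 2. Poll the snapshot until it is ready.
  for (;;) {
    const progress = await fetch(`${API}/progress/${trigger.snapshot_id}`, { headers })
      .then((r) => r.json());
    if (progress.status === 'ready') break;
    await new Promise((resolve) => setTimeout(resolve, 10000)); // retry in 10 s
  }

  // 3. Fetch the finished snapshot as JSON records.
  return fetch(`${API}/snapshot/${trigger.snapshot_id}?format=json`, { headers })
    .then((r) => r.json());
}
```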
by Alex Kim
**Automate Video Creation with Luma AI Dream Machine and Airtable (Part 1)**

**Description**

This workflow automates video creation using Luma AI Dream Machine and n8n. It generates dynamic videos based on custom prompts, random camera motion, and predefined settings, then stores the video and thumbnail URLs in Airtable for easy access and tracking. This automation makes it easy to create high-quality videos at scale with minimal effort.

Resources: Airtable Base Template, Tutorial Video

**Setup**

1. Luma AI Setup

- Create an account with Luma AI.
- Generate an API key from Luma AI for authentication.
- Ensure the API key has permission to create and manage video requests.

2. Airtable Setup

Create an Airtable base with the following fields:

- **Generation ID**: To match incoming webhook data.
- **Status**: Workflow status (e.g., "Done").
- **Video URL**: Stores the generated video URL.
- **Thumbnail URL**: Stores the thumbnail URL.
- **Prompt**: The video prompt used in the request.
- **Aspect Ratio**: Defines the video format (e.g., 9:16).
- **Duration**: Length of the video.

Use the Airtable template linked above to simplify setup.

3. n8n Setup

- Install n8n (local or cloud).
- Set up Luma AI and Airtable credentials in n8n.
- Import the workflow and customize the settings based on your needs.

**How It Works**

1. Global Settings Configuration

The Set node defines key settings such as:

- **Prompt**: Example: "A crocheted parrot in a crocheted pirate outfit swinging on a crocheted perch."
- **Aspect Ratio**: Example: "9:16"
- **Loop**: Example: "true"
- **Duration**: Example: "5 seconds"
- **Cluster ID**: Used to group related videos for easy tracking.
- **Callback URL**: Used for the Webhook workflow in Part 2.

2. Random Camera Motion

The Code node randomly selects a camera motion (e.g., Zoom In, Pan Left, Crane Up) to create dynamic and visually engaging videos.

3. API Request to Luma AI

The HTTP Request node sends a POST request to Luma AI's API with the following parameters (see the hedged sketch below):

- Prompt: uses the defined global settings.
- Aspect Ratio: matches the target platform (e.g., TikTok or YouTube).
- Duration: length of the video.
- Loop: determines if the video should loop.
- Callback URL: receives a POST response when the video is complete.

4. Capture API Response

Luma AI sends a POST response to the callback URL once video generation is complete. The response includes:

- Video URL: direct link to the video.
- Thumbnail URL: link to the video thumbnail.
- Generation ID: used to match the record in Airtable.

5. Store in Airtable

The Airtable node updates the record with the video and thumbnail URLs. The **Generation ID** is crucial for matching future webhook responses to the correct video record.

**Why This Workflow is Useful**

- Automates high-quality video creation
- Reduces manual effort by handling prompt generation and API calls
- Random camera motion makes videos more dynamic
- Ensures organized tracking with Airtable
- Scalable: ideal for automating large-scale content creation

**Next Steps**

- **Part 2**: Handling webhook responses and updating Airtable automatically.
- **Future Enhancements**: Adding more camera motions, multi-platform support, and automated video editing.
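The random-motion pick plus the generation request could look roughly like this. The endpoint and snake_case body fields follow Luma's published Dream Machine API, but treat them as assumptions to verify against the current docs; the callback URL is a placeholder for the Part 2 webhook.

```typescript
// Hedged sketch: pick a random camera motion, then request a generation.
// The endpoint and body field names follow Luma's Dream Machine API as
// published; verify against the current docs before relying on them.
const motions = ['Zoom In', 'Zoom Out', 'Pan Left', 'Pan Right', 'Crane Up'];
const cameraMotion = motions[Math.floor(Math.random() * motions.length)];

const res = await fetch('https://api.lumalabs.ai/dream-machine/v1/generations', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.LUMA_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: `A crocheted parrot in a crocheted pirate outfit swinging on a crocheted perch. Camera motion: ${cameraMotion}`,
    aspect_ratio: '9:16',
    loop: true,
    callback_url: 'https://your-n8n-host/webhook/luma-callback', // Part 2 webhook
  }),
});
const { id: generationId } = await res.json(); // store in Airtable for matching
```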
by Joseph LePage
This n8n workflow demonstrates multiple ways to harness DeepSeek's AI models in your automation pipeline!

**Core Features**

Multiple integration methods:

- Local deployment using Ollama for DeepSeek-R1
- Direct API integration with DeepSeek Chat V3
- Conversational agent with memory buffer
- HTTP request implementation with both raw and JSON formats

Model options:

- DeepSeek Chat V3 for general conversation
- DeepSeek-R1 for advanced reasoning
- Memory-enabled agent for persistent context

**Quick Setup**

API configuration:

- Base URL: https://api.deepseek.com
- Get your API key from platform.deepseek.com/api_keys

Local setup:

- Install Ollama for local deployment
- Set up DeepSeek-R1 via Ollama
- Configure local credentials in n8n

**Implementation Details**

Conversational agent:

- Window Buffer Memory for context
- Customizable system messages
- Built-in error handling with retries

API endpoints:

- Chat completions for the V3 and R1 models
- OpenAI-compatible API format (see the sketch below)
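Because the API is OpenAI-compatible, the raw HTTP variant reduces to a standard chat-completions call. This minimal sketch uses `deepseek-chat` for V3; swap in `deepseek-reasoner` to target R1.

```typescript
// Minimal sketch of the raw HTTP variant against DeepSeek's
// OpenAI-compatible endpoint. Use model "deepseek-chat" for V3 or
// "deepseek-reasoner" for R1.
const res = await fetch('https://api.deepseek.com/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'deepseek-chat',
    messages: [{ role: 'user', content: 'Summarize what n8n does in one sentence.' }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```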
by Artur
**What this workflow does**

1. **Monitors Google Drive**: The workflow triggers whenever a new CSV file is uploaded.
2. **Uses AI to identify PII columns**: The OpenAI node analyzes the data and identifies PII-containing columns (e.g., name, email, phone).
3. **Removes PII**: The workflow filters out these columns from the dataset (see the sketch below).
4. **Uploads the cleaned file**: The sanitized file is renamed and re-uploaded to Google Drive, ensuring the original data remains intact.

**How to customize this workflow to your needs**

- **Adjust PII identification**: Modify the prompt in the OpenAI node to align with your specific data compliance requirements.
- **Include/exclude file types**: Adjust the Google Drive Trigger settings to monitor specific file types (e.g., CSV only).
- **Output destination**: Change the folder in Google Drive where the sanitized file is uploaded.

**Setup**

Prerequisites:

- A Google Drive account.
- An OpenAI API key.

Workflow configuration:

1. Configure the Google Drive Trigger to monitor a folder for new files.
2. Configure the OpenAI node to connect with your API key.
3. Set the Google Drive upload folder to a different location than the trigger folder to prevent workflow loops.
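The column-dropping step could be expressed as the Code-node sketch below; the hard-coded column list stands in for whatever the OpenAI node actually returns.

```typescript
// Illustrative sketch: drop the columns the OpenAI node flagged as PII.
// The piiColumns list stands in for the model's actual output.
const piiColumns = ['name', 'email', 'phone'];

return $input.all().map((item) => {
  const cleaned = Object.fromEntries(
    Object.entries(item.json).filter(([column]) => !piiColumns.includes(column)),
  );
  return { json: cleaned }; // same row, PII columns removed
});
```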
by PollupAI
**LinkedIn Profile Enrichment Workflow**

**Who is this for?**

This workflow is ideal for recruiters, sales professionals, and marketing teams who need to enrich LinkedIn profiles with additional data for lead generation, talent sourcing, or market research.

**What problem is this workflow solving?**

Manually gathering detailed LinkedIn profile information can be time-consuming and prone to errors. This workflow automates the process of enriching profile data from LinkedIn, saving time and ensuring accuracy.

**What this workflow does**

1. **Input**: Reads LinkedIn profile URLs from a Google Sheet.
2. **Validation**: Filters out already enriched profiles to avoid redundant processing.
3. **Data Enrichment**: Uses RapidAPI's Fresh LinkedIn Profile Data API to retrieve detailed profile information (see the hedged request sketch below).
4. **Output**: Updates the Google Sheet with enriched profile data, appending new information efficiently.

**Setup**

1. **Google Sheet**: Create a sheet with a column named `linkedin_url` and populate it with the profile URLs to enrich.
2. **RapidAPI Account**: Sign up at RapidAPI and subscribe to the Fresh LinkedIn Profile Data API.
3. **API Integration**: Replace the `x-rapidapi-key` and `x-rapidapi-host` values with your credentials from RapidAPI.
4. **Run the Workflow**: Trigger the workflow and monitor the updates to your Google Sheet.

**How to customize this workflow**

- **Filter Criteria**: Modify the filter step to include additional conditions for processing profiles.
- **API Configuration**: Adjust API parameters to retrieve specific fields or extend usage.
- **Output Format**: Customize how the enriched data is appended to the Google Sheet (e.g., format, column mappings).
- **Error Handling**: Add steps to handle API rate limits or missing data for smoother automation.

This workflow streamlines LinkedIn profile enrichment, making it faster and more effective for data-driven decision-making.
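The enrichment request could look like this. The `x-rapidapi-key` and `x-rapidapi-host` headers come from the setup steps above; the host follows RapidAPI's naming convention for this API, but the exact path and query parameter are assumptions, so copy the real values from the API's RapidAPI playground.

```typescript
// Hedged sketch of the enrichment request. The host follows RapidAPI's
// naming convention for this API, but the exact path and query parameter
// are assumptions; copy the real values from the RapidAPI playground.
const profileUrl = 'https://www.linkedin.com/in/example/';

const res = await fetch(
  `https://fresh-linkedin-profile-data.p.rapidapi.com/get-linkedin-profile?linkedin_url=${encodeURIComponent(profileUrl)}`,
  {
    headers: {
      'x-rapidapi-key': process.env.RAPIDAPI_KEY,
      'x-rapidapi-host': 'fresh-linkedin-profile-data.p.rapidapi.com',
    },
  },
);
const enriched = await res.json(); // fields to append to the Google Sheet
```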
by Damian Karzon
This workflow randomly selects recipes from a Mealie instance (optionally from a specific category) and then creates a meal plan in Mealie with those recipes.

**How it works**

1. The workflow has a scheduled trigger (set to run weekly on a Friday).
2. A Config node sets a few properties to configure the workflow.
3. A call to the Mealie API gets the list of recipes.
4. The Code node holds most of the logic: it loops through the number of recipes defined in the Config node and randomly selects a recipe from the list, making sure not to double up any recipes (see the sketch below).
5. Once all the recipes are selected, it calls the Mealie API to set up the meal plan on the days.

**Setup**

1. Add your Mealie API token as a credential and set it on the HTTP Request nodes.
2. Set the schedule trigger to run when you like.
3. Update the Config node with the config you want:
   - `numberOfRecipes`: number of recipes to populate for the meal plan
   - `offsetPlanDays`: number of days in the future to start the plan (0 starts it today, 1 tomorrow, etc.)
   - `mealieCategoryId`: the id of the category you want to pull recipes from (defaults to selecting from all recipes)
   - `mealieBaseUrl`: the base URL of your Mealie instance
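The no-duplicates selection could look like this Code-node sketch. The Config node name matches the description above, while the recipe field names (such as `id`) are assumptions about the Mealie API response shape.

```typescript
// Sketch of the Code node's selection logic: draw numberOfRecipes distinct
// recipes at random and assign them to consecutive days. The recipe "id"
// field is an assumption about the Mealie API response shape.
const { numberOfRecipes, offsetPlanDays } = $('Config').first().json;
const pool = $input.all().map((item) => item.json);

const plan = [];
for (let day = 0; day < numberOfRecipes && pool.length > 0; day++) {
  const idx = Math.floor(Math.random() * pool.length);
  const [recipe] = pool.splice(idx, 1); // remove from the pool: no doubles
  const date = new Date();
  date.setDate(date.getDate() + offsetPlanDays + day);
  plan.push({ json: { date: date.toISOString().slice(0, 10), recipeId: recipe.id } });
}
return plan; // one item per planned day, ready for the Mealie API call
```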