by Harshil Agrawal
Note: This workflow uses the internal API, which is not official, so this workflow might break in the future. The workflow executes every night at 23:59. You can configure a different time in the Cron node. Configure the GitHub nodes with your username, repo name, and the file path. In the HTTP Request nodes (making a request to localhost:5678), create Basic Auth credentials with your n8n instance username and password.
by Airtop
About The ICP Person Scoring Automation Sorting through lists of potential leads manually to determine who's truly worth your sales team's time isn't just tedious; it's incredibly inefficient. Without proper qualification, your team might spend hours pursuing prospects who aren't the right fit for your product, while ideal customers slip through the cracks. How to Automate Identifying Your Ideal Customers With this automation, you'll learn how to automatically score and prioritize leads using data extracted directly from LinkedIn profiles via Airtop's built-in integration with n8n. By the end, you'll have a fully automated workflow that analyzes prospects and calculates an Ideal Customer Profile (ICP) score, helping your sales team focus on high-potential opportunities. What You'll Need A free Airtop API key A copy of this Google Sheets template Understanding the Process This automation transforms how you qualify and prioritize leads by extracting real-time, accurate information directly from LinkedIn profiles. Unlike static databases that quickly become outdated, this workflow taps into the most current professional information available. The workflow in this template: Uses Airtop to extract comprehensive LinkedIn profile data Analyzes the data to calculate an ICP score based on AI interest, technical depth, and seniority Updates your Google Sheet with the enriched data and the ICP score Person ICP Scoring Workflow Our person-focused workflow evaluates individual LinkedIn profiles to determine how well they match your ideal customer profile by: Extracting data for each individual Analyzing their profile to determine seniority and technical depth The system then automatically calculates an ICP score based on the following criteria (a sketch of this point mapping appears at the end of this section): AI Interest: beginner-5 pts, intermediate-10 pts, advanced-25 pts, expert-35 pts Technical Depth: basic-5 pts, intermediate-15 pts, advanced-25 pts, expert-35 pts Seniority Level: junior-5 pts, mid-level-15 pts, senior-25 pts, executive-30 pts Setting Up Your Automation Here's how to get started: Configure your connections Connect your Google Sheets account Add your Airtop API key (obtained from the Airtop dashboard) Set up your Google Sheet Ensure your Google Sheet has the necessary columns for input data and result fields; at a minimum, the Linkedin_URL_Person and ICP_Score_Person columns must exist Configure the Airtop module Set up the Airtop module to use the appropriate LinkedIn extraction prompt Use our provided prompt that extracts individual profile data Customization Options While our templates work out of the box, you might want to customize them for your specific needs: Modify the ICP scoring criteria: Adjust the point values or add additional criteria specific to your business Add notification triggers: Set up Slack or email notifications for high-value leads that exceed a certain ICP threshold Implement batch processing: Modify the workflow to process leads in batches to optimize performance Add conditional logic: Create different scoring models for different industries or product lines Integrate with your CRM: Connect this automation to your preferred CRM so the details are added automatically for you Real-World Applications Here's how businesses are using this automation: AI Sales Platform: A B2B AI company could implement this workflow to process its trade show lead list. Within hours, it can identify the top 50 prospects based on ICP score.
SaaS Analytics Tool: A SaaS company could implement LinkedIn enrichment to identify which companies fit best. The automation processes weekly leads and categorizes them into high, medium, and low priority tiers, allowing their sales team to focus on the most promising opportunities first. Best Practices To get the most out of this automation: Review and refine your ICP criteria quarterly: What constitutes an ideal customer may evolve as your product and market develop Create tiered follow-up processes: Develop different outreach strategies based on ICP score ranges Perform regular data validation: Periodically check the accuracy of the automated scoring against your actual sales results What's Next? Now that you've automated your ICP scoring with LinkedIn data, you might be interested in: Setting up automated outreach sequences based on ICP score thresholds Creating custom reporting dashboards to track conversion rates by ICP segment Expanding your scoring model to include additional data sources Implementing lead assignment automation based on ICP scores Happy automating!
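For reference, here is a minimal sketch of the point mapping described above, written the way it might look in an n8n Code node. The field names (ai_interest, technical_depth, seniority) are assumptions for illustration, not the template's actual column names; adjust them to match your own Google Sheet.

```typescript
// Minimal sketch of the ICP point mapping described above.
// Field names are assumptions; align them with your sheet's columns.
const AI_INTEREST: Record<string, number> = { beginner: 5, intermediate: 10, advanced: 25, expert: 35 };
const TECH_DEPTH: Record<string, number> = { basic: 5, intermediate: 15, advanced: 25, expert: 35 };
const SENIORITY: Record<string, number> = { junior: 5, "mid-level": 15, senior: 25, executive: 30 };

function icpScore(p: { ai_interest: string; technical_depth: string; seniority: string }): number {
  // Unknown values score 0 rather than throwing, so partial profiles still rank.
  return (AI_INTEREST[p.ai_interest] ?? 0)
    + (TECH_DEPTH[p.technical_depth] ?? 0)
    + (SENIORITY[p.seniority] ?? 0);
}

// Example: an advanced-AI, expert-depth executive scores 25 + 35 + 30 = 90.
console.log(icpScore({ ai_interest: "advanced", technical_depth: "expert", seniority: "executive" }));
```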
by Solomon
Based on Jonathan's work. Check out his templates. This workflow backs up your credentials to GitHub. It uses a CLI command to export all credentials (typically something like `n8n export:credentials --all --decrypted`). It then loops over the data and checks GitHub to see whether a file already exists for the credential's ID. Once checked, it will: update the file on GitHub if it exists; create a new file if it doesn't exist; skip it if the content is the same. Config Options: repo.owner - GitHub owner; repo.name - GitHub repository name; repo.path - path within the GitHub repository. ⚠ The credentials are exported fully decrypted. Make sure you store them safely, or tweak the CLI command to export them encrypted. Check out my other templates 👉 https://n8n.io/creators/solomon/
by Harshil Agrawal
This workflow sends position updates of the ISS to a queue every minute using the AWS SQS node. Cron node: The Cron node triggers the workflow every minute. HTTP Request node: This node makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow. Set node: We use the Set node to ensure that only the data we set in this node gets passed on to the next nodes in the workflow. AWS SQS: This node sends the data from the previous node to the iss-position queue. If you created your queue with a different name, you can use that queue instead.
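For reference, a minimal sketch of what the HTTP Request and Set steps accomplish together: fetch the position, then keep only the fields worth enqueueing. It uses the API's current-position endpoint and assumes the response exposes latitude, longitude, and timestamp; check the wheretheiss.at docs for the exact shape and any required query parameters.

```typescript
// Sketch of the fetch-and-trim step. Assumes the response includes
// latitude, longitude, and timestamp fields.
async function fetchIssPosition() {
  const res = await fetch("https://api.wheretheiss.at/v1/satellites/25544");
  const data = await res.json() as { latitude: number; longitude: number; timestamp: number };
  // The Set node equivalent: pass along only these three fields to SQS.
  return { latitude: data.latitude, longitude: data.longitude, timestamp: data.timestamp };
}

fetchIssPosition().then((pos) => console.log(JSON.stringify(pos)));
```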
by Daniel Ng
Auto Backup n8n Workflows to Google Drive Imagine the sinking feeling: hours, weeks, or even months of meticulous work building your n8n workflows, suddenly gone. A server crash, an accidental deletion, data corruption, or an unexpected platform issue – and all your automated processes vanish. Without a reliable backup system, you're facing a complete rebuild from scratch, a scenario that's not just frustrating but can be catastrophic for business operations. Furthermore, consider the daunting task of migrating your n8n instance to a new host or server. Manually exporting each workflow, one by one, then painstakingly importing them into the new environment is not only incredibly time-consuming, especially if you have tens or hundreds of workflows, but also highly prone to errors and omissions. You need a systematic, automated solution. This workflow provides a robust solution for automatically backing up all your n8n workflows to Google Drive on a schedule (hourly by default). It creates a uniquely named folder for each backup instance, incorporating the date and hour, and then systematically uploads each workflow as an individual JSON file. To manage storage space, the workflow also includes a cleanup mechanism that deletes backup folders older than a user-defined retention period (7 days by default). Ideally, this backup workflow should be used in conjunction with a restore solution like our "Restore Workflows from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps. Feature highlights Triggers on a schedule (hourly by default). Creates an n8n_backup_YYYY-MM-DD_HH folder in Google Drive. Fetches all n8n workflows. Saves each workflow as a JSON file to the new folder. Deletes backup folders older than the 'Coverage Period' (7 days by default). Who is this for? This template is designed for: n8n Administrators and Developers: Who need a reliable, automated system to safeguard their workflows against accidental loss, corruption, or system issues. Proactive n8n Users: Who want to maintain a version history of their workflows, enabling easy rollback to previous configurations if necessary. Organizations: Seeking to implement disaster recovery and data integrity practices for their n8n automation infrastructure. What problem is this workflow solving? / use case This workflow directly addresses these critical risks and challenges by: Automating Backups: Eliminates the manual effort and inconsistency of ad-hoc backups, ensuring your workflows are regularly and reliably saved. Preventing Data Loss: Safeguards your valuable automation assets against unforeseen disasters by creating secure, versioned copies in Google Drive. Facilitating Migration & Recovery: Provides the foundational backups needed for a smoother, more systematic migration or a full disaster recovery, allowing you to restore your operations efficiently. Version Control: By storing scheduled backups (hourly by default), it allows you to access and restore previous versions of your workflows, offering an undo capability for significant changes or corruptions. Storage Management: Automatically removes old backups based on a configurable retention period, preventing excessive use of Google Drive storage while keeping a relevant history. What this workflow does Scheduled Trigger: Runs automatically every hour.
Timestamping: Fetches the current date and hour to create a unique name for the backup folder. Folder Creation: Creates a new folder in a specified Google Drive location. The folder is named in the format: n8n_backup_YYYY-MM-DD_HH. Workflow Retrieval: Connects to your n8n instance via its API and fetches a list of all existing workflows. Individual Backup: Processes each workflow one by one: Converts the workflow data to a binary JSON file. Uploads the JSON file (named after the workflow) to the hourly backup folder in Google Drive. Includes a short wait step between uploads to respect potential API rate limits. Old Backup Deletion: Calculates a cut-off date based on the "Coverage Period" set in the "Settings" node (e.g., 7 days prior to the current date; see the sketch at the end of this section). Searches Google Drive for backup folders (matching the naming convention) that are older than this cut-off date. Deletes these identified old backup folders to free up storage space. Step-by-step setup Import Template: Upload the provided JSON file into your n8n instance. Configure Credentials: Google Drive Nodes: You will need to create or select existing Google Drive OAuth2 API credentials for these nodes. n8n Node (the node that fetches workflows): Configure n8n API credentials to allow the workflow to access your instance's workflow data. Specify Google Drive Backup Location: Open the "Google Drive Backup Folder Every Hour" node. Under the "Drive ID" parameter, select your Drive from the list or provide its ID. Under the "Folder ID" parameter, select or input the ID of the parent folder in Google Drive where you want the n8n_backup_YYYY-MM-DD_HH folders to be created (e.g., a general "n8n_Backups" folder). Set Backup Retention Period: Open the "Settings" node. Modify the value for "Coverage Period" (default is 7). This number represents the number of days backups should be kept before being deleted. Activate Workflow: Toggle the "Active" switch for the workflow in your n8n dashboard. How to customize this workflow to your needs Backup Frequency: Adjust the "Rule" in the "Schedule Trigger" node to change the backup interval (e.g., daily, specific times). Folder/File Naming: Modify the expressions in the "Parameters" tab of the "Google Drive Backup Folder Every Hour" node (for the folder name) or the "Google Drive Upload Workflows" node (for the file name) if you require a different naming convention. Targeted Backups: To back up only specific workflows, insert a "Filter" node after the "n8n" node to filter workflows based on criteria like name, tags, or ID before they reach the "Move Binary Data" node. Wait Time: The "Wait" node is set to 3 seconds between uploads. If you have a very large number of workflows or encounter rate limiting, you might adjust this duration. Error Workflow: The workflow is pre-configured with an "Error Workflow" setting. Ensure this error workflow exists in your n8n instance, or update the setting to point to your preferred error handling workflow. This can be used to send notifications on failure. Important Considerations Resource Usage: While the workflow includes a wait step between individual workflow uploads to minimize load, backing up an extremely large number of workflows could still consume resources on your n8n instance and make many API calls to Google Drive. Monitor performance if you have thousands of workflows.
Testing Restore Process: Regularly test restoring a few workflows from your Google Drive backups using the companion "Restore All n8n Workflows from Google Drive" template or a manual import. This verifies the integrity of your backups and ensures you can recover when needed. Workflow Modifications: If you modify this backup workflow (e.g., change the folder naming convention), ensure your restore process or workflow is also updated to match these changes.
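For reference, here is a minimal sketch of the two date computations this workflow relies on: the hourly folder name and the retention cut-off. This is illustrative logic, not the template's exact node code.

```typescript
// Sketch of the folder-name and retention-cutoff logic described above.
const COVERAGE_PERIOD_DAYS = 7; // the "Coverage Period" from the Settings node

const now = new Date();
const pad = (n: number) => String(n).padStart(2, "0");

// Folder name in the format n8n_backup_YYYY-MM-DD_HH
const folderName = `n8n_backup_${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}_${pad(now.getHours())}`;

// Any backup folder whose encoded date is older than this cut-off gets deleted.
const cutoff = new Date(now.getTime() - COVERAGE_PERIOD_DAYS * 24 * 60 * 60 * 1000);

console.log(folderName, cutoff.toISOString());
```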
by Md. Nazmul Islam
AI-Powered MCQ Quiz Generator from YouTube Videos Transform any YouTube video into an interactive MCQ quiz automatically! This workflow uses Google Gemini AI to analyze video content and generate comprehensive multiple-choice questions with automatic grading - perfect for educators, trainers, and content creators. Who is this for? This workflow is perfect for: Educators creating quizzes from educational YouTube content Corporate Trainers developing assessments from training videos Content Creators engaging their audience with interactive quizzes Students testing their knowledge on video lectures Online Course Creators building assessments from video content Features AI Video Analysis: Google Gemini 2.5 Flash analyzes entire YouTube videos (up to 50 minutes) Dynamic Question Generation: Creates up to 90 MCQ questions with 3 options each Automatic Form Creation: Generates Google Forms with quiz functionality Smart Grading: Built-in correct answer identification and scoring Error Handling: Robust error management with user feedback How It Works User Input via n8n Web Form: Form Name (Quiz Title) Email Address YouTube Video URL Number of Questions (1-90) AI Processing Pipeline: Google Gemini analyzes the YouTube video content The AI extracts key concepts and generates relevant questions A structured output parser formats the questions into JSON Google Forms Integration: Automatically creates a new Google Form Adds all generated questions with multiple-choice options Configures quiz settings with correct answers and scoring Completion & Access: The user receives a direct link to the generated quiz The form is ready for immediate use or sharing Video Demo: See this YouTube video to explore how it works. Set Up Steps Import the Workflow Create a new workflow in n8n Import the JSON file by clicking the "three dots" (upper right corner) > "Import from file..." Configure Google Gemini API Get your API key from Google AI Studio In the “HTTP Request to Gemini” node, replace “API_KEY” in the URL with your API key Create a "Google Gemini (PaLM) API" credential in n8n Add your API key to the credential Connect the credential to the "Google Gemini Chat Model" node Set Up Google Forms Integration Enable the Google Forms API in Google Cloud Console Create a "Google OAuth2 API" credential in n8n Authorize the credential with Forms permissions Connect the credential to both HTTP Request nodes (the “Create a Google Form” node and the “Create MCQ Quizzes” node) Configure Form Trigger The workflow includes a built-in form trigger No additional setup needed - the form URL will be generated automatically Customize form fields if needed in the “Input YouTube URL" node Test the Workflow Activate the workflow Submit the form to generate a test quiz Verify the Google Form is created successfully Pre-requisites Necessary Accounts: Google Account (for Forms API access) Google AI Studio Account (for Gemini API access) n8n Instance (cloud or self-hosted) API Access: Google Forms API enabled Google Drive API enabled Google Generative AI API access Valid API keys and OAuth credentials n8n Requirements: n8n version 1.95.2 or higher LangChain nodes package installed Internet access for API calls Customization Guidance Question Generation Prompts: Modify the prompt in the "Set Prompt and model" node for different question styles Adjust difficulty levels or focus areas Change the question format (True/False, fill-in-the-blanks, etc.)
Form Customization: Update form title and description templates Add additional input fields (difficulty level, subject area) Customize success/error messages Advanced Features You Can Add: Email Notifications: Send quiz links via email Analytics Integration: Track quiz performance and completion rates Multi-language Support: Generate quizzes in different languages Question Bank Storage: Save generated questions to a database Batch Processing: Generate multiple quizzes from a YouTube playlist Error Handling Enhancements: Add retry logic for API failures Implement fallback question generation Create detailed error logging Technical Specifications Video Length: Up to 50 minutes supported Question Limit: 1-90 questions per quiz Processing Time: 2-10 minutes depending on video length Supported Formats: YouTube videos (public and unlisted) Output Format: Google Forms with automatic grading Limitations & Considerations The YouTube video must be publicly accessible or unlisted Processing time increases with video length and question count API rate limits may apply for high-volume usage Some complex visual content may not be fully analyzed Ready to Transform Videos into Quizzes? This workflow streamlines the entire process from video analysis to quiz deployment. Perfect for educators and trainers looking to create engaging assessments from video content quickly and efficiently.
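To make the Forms integration step concrete, here is a hedged sketch of how one parsed question could be mapped into a Google Forms API batchUpdate createItem request. The ParsedQuestion shape is an assumption about the structured output parser's JSON; the request shape follows the public Forms API v1, and note that grading only takes effect once the form's quiz settings have been enabled (which the template's form-creation step handles).

```typescript
// Assumed shape of one question from the structured output parser.
interface ParsedQuestion {
  question: string;
  options: string[];      // 3 choices per the template
  correctAnswer: string;  // must match one of the options
}

// Maps a parsed question to a Forms API v1 batchUpdate "createItem" request.
function toCreateItemRequest(q: ParsedQuestion, index: number) {
  return {
    createItem: {
      item: {
        title: q.question,
        questionItem: {
          question: {
            required: true,
            grading: {
              pointValue: 1,
              correctAnswers: { answers: [{ value: q.correctAnswer }] },
            },
            choiceQuestion: {
              type: "RADIO",
              options: q.options.map((value) => ({ value })),
            },
          },
        },
      },
      location: { index },
    },
  };
}
```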
by Harshil Agrawal
This workflow demonstrates the use of the $runIndex expression and how it can be used to avoid an infinite loop. The workflow creates 5 Tweets with the content 'Hello from n8n!'. You can adapt this workflow by replacing the Twitter node with any other node(s) and updating the condition in the IF node.
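For illustration, the loop-breaking condition in the IF node typically looks like {{ $runIndex < 5 }}. The equivalent logic, given that $runIndex starts at 0 and increments on each run of the node:

```typescript
// Equivalent of the IF node condition {{ $runIndex < 5 }}.
// $runIndex is 0 on the node's first execution and increments each run,
// so the true branch (create a Tweet and loop back) fires exactly 5 times,
// after which the false branch ends the workflow instead of looping forever.
function shouldLoop(runIndex: number): boolean {
  return runIndex < 5;
}
```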
by Jakkrapat Ampring
Main Use Case This workflow enables automated, AI-assisted replies to users messaging a LINE Official Account, while storing and referencing chat history from Google Sheets to maintain context. Ideal for businesses or support teams that want to provide smart, personalized customer interactions using AI with memory. How It Works (Step-by-Step) Connect to the LINE Official Account API A Webhook listens for incoming messages from users on LINE. When a message is received, it triggers the workflow. Prepare the Data An Edit Fields module structures the incoming data (e.g. extracts the user ID and message content). This ensures the data is clean and usable downstream. Retrieve Chat History The user’s previous conversations are fetched from a Google Sheet. This ensures the AI has memory and can continue conversations contextually. Prepare Prompt The retrieved chat history is combined with the new message to form a complete prompt for the AI. Example format: “User previously said X. Now they said Y. How should we respond?” AI Agent: Google Gemini The formatted prompt is passed to an AI Agent (Google Gemini Chat Model). The AI generates a response based on the message + history. Tools used: Chat Model, Memory, Tool, and Output Parser for accurate replies. Split & Clean History The conversation history is split into smaller chunks for cleaning and storage. This ensures the Google Sheet remains readable and manageable over time. Save Chat History The cleaned new message and AI reply are saved to Google Sheets. This updates the chat history for future context. Send Reply to LINE The AI-generated reply is sent back to the user via a POST HTTP Request to the LINE Messaging API (see the sketch below). How to Set Up Prerequisites: LINE Official Account Google Sheet to store chat history Google Gemini API or an AI agent with context memory An automation platform (n8n) Step-by-Step: Create a Webhook on LINE: Set the webhook URL to your automation service. Enable webhook events. Design Your Google Sheet: Create a sheet with columns: User ID, Timestamp, Message, AI Reply. Set Up Modules in the Automation Platform: Webhook: receives user messages. Edit Fields: extracts the user ID and message. Google Sheets Read: fetches message history. Prompt Composer: formats the prompt using past history + the new message. AI Agent: connects to Google Gemini for smart replies. Split & Clean: cleans and chunks history if needed. Google Sheets Write: saves the updated conversation. HTTP Request: sends the reply to LINE via the Messaging API. Test Your Workflow: Send a message from LINE. Watch the full loop: receive → process → AI → store → reply. Deploy & Monitor: Ensure error handling is in place (e.g., for blank messages or failed API calls). Regularly check your Google Sheets for storage limits. (If limits are reached, you can increase the number of history rows.) 📦 Benefits Maintains context in conversations Personalized, AI-driven responses Easy history tracking via Google Sheets Fully automated and scalable
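As a reference for the final step, here is a minimal sketch of the reply call to the LINE Messaging API. CHANNEL_ACCESS_TOKEN stands in for your own channel credential; the replyToken comes from the webhook event that triggered the run.

```typescript
// Sketch of the "Send Reply to LINE" HTTP Request step.
async function replyToLine(replyToken: string, text: string) {
  await fetch("https://api.line.me/v2/bot/message/reply", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Placeholder: supply your LINE channel access token as a credential.
      Authorization: `Bearer ${process.env.CHANNEL_ACCESS_TOKEN}`,
    },
    body: JSON.stringify({
      replyToken, // from the incoming webhook event
      messages: [{ type: "text", text }], // the AI-generated reply
    }),
  });
}
```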
by PiAPI
What does the workflow do? This workflow is designed to generate high-quality short videos. It primarily uses the GPT-4o-mini (unofficial), Midjourney (unofficial), and Kling (unofficial) APIs from PiAPI, plus the Creatomate API, and is aimed mainly at content creators, social media bloggers, and short-form video creators. Through this short video workflow, users can quickly validate their creative ideas and focus more on enhancing the quality of their video concepts. Who is the workflow for? Social Media Influencers: produce content videos based on inspiration efficiently. Vloggers: generate vlogs based on inspiration. Educational Creators: explain specific topics via animated short videos or demonstrate a specific imagined scenario to students for enhanced educational impact. Advertising Agencies: generate short videos based on specific products. AI Tool Developers: automatically generate product demo videos. Step-by-step Instructions Fill in the X-API-Key of your PiAPI account in the Basic Params node. Fill in the scenario for the image and video prompt. Set up a video template on Creatomate and make an API call in the final node with the core and processing modules provided by Creatomate. Before full video generation, you can first use basic assets in Creatomate for a prototype demo, then integrate with n8n after verifying the expected results. Fill in your Creatomate account settings following the image guideline. Click Test Workflow and wait for the generation to finish (typically 10-20 minutes). In this workflow, we've established a basic structure for image-to-video generation with subtitle integration. You can further enhance it by adding music nodes using either PiAPI's audio models or your preferred music solution. All video elements will ultimately be composited through Creatomate. For best practice, please refer to PiAPI's official API documentation or Creatomate's API documentation to explore more use cases. Use Case Params Settings style: a children’s book cover, ages 6-10. --s 500 --sref 4028286908 --niji 6 character: A gentle girl and a fluffy rabbit explore a sunlit forest together, playing by a sparkling stream situational_keywords: Butterflies flutter around them as golden sunlight filters through green leaves. Warm and peaceful atmosphere Output Video
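For illustration, assembling the final image prompt from the three params above might look like the sketch below. The concatenation order is an assumption, so adapt it to your own template; Midjourney flags such as --s, --sref, and --niji belong at the end of the prompt, which is why the style string (which carries them) goes last.

```typescript
// Sketch: assembling the image prompt from the Use Case Params above.
// The order of the pieces is an assumption, not the template's exact logic.
const style = "a children's book cover, ages 6-10. --s 500 --sref 4028286908 --niji 6";
const character = "A gentle girl and a fluffy rabbit explore a sunlit forest together, playing by a sparkling stream";
const situationalKeywords = "Butterflies flutter around them as golden sunlight filters through green leaves. Warm and peaceful atmosphere";

// Style (with its trailing Midjourney flags) goes last in the prompt.
const imagePrompt = `${character}. ${situationalKeywords}. ${style}`;
console.log(imagePrompt);
```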
by Preston Zeller
How It Works This workflow automates the entire property lead generation process in a few simple steps: Property Search: Connects to BatchData's Property Search API with customizable parameters (location, property type, value range, equity percentage, etc.) Lead Filtering & Scoring: Processes results to identify the most promising leads based on criteria like absentee ownership, years owned, equity percentage, and tax status. Each property receives a lead score to prioritize follow-up (see the sketch below). Skip Tracing: Automatically retrieves owner contact information (phone, email, mailing address) for each qualified property. Data Formatting: Structures all property and owner data into a clean, organized format ready for your systems. Multi-Channel Output: Generates an Excel spreadsheet with all lead details Pushes leads directly to your CRM (configurable for HubSpot, Salesforce, etc.) Sends a summary email with the spreadsheet attached The workflow can run on a daily schedule or be triggered manually as needed. All parameters are easily configurable through dedicated nodes, requiring no coding knowledge. Who's It For This workflow is perfect for: Real Estate Investors looking to find off-market properties with motivated sellers Real Estate Agents who want to generate listing leads from distressed or high-equity properties Investment Companies that need regular lead flow for acquisitions Real Estate Marketers who run targeted campaigns to property owners Wholesalers seeking to build a pipeline of potential deals Property Service Providers (roof repair, renovation contractors, etc.) who target specific property types Anyone who needs reliable, consistent lead generation for real estate without the manual work of searching, filtering, and organizing property data will benefit from this automation. About BatchData BatchData is a comprehensive property data provider that offers access to nationwide property information, owner details, and skip tracing services. Key features include: Extensive Database: Covers 150+ million properties across all 50 states Rich Property Data: Includes ownership information, tax records, sales history, valuation estimates, equity positions, and more Skip Tracing Services: Provides owner contact information including phone numbers, email addresses, and mailing addresses Distressed Property Indicators: Flags for pre-foreclosure, tax delinquency, vacancy, and other motivation factors RESTful API: Professional API for programmatic access to all property data services Regular Updates: Continuously refreshed data for accurate information BatchData's services are designed for real estate professionals who need reliable property and owner information to power their marketing and acquisition strategies. Their API-first approach makes it ideal for workflow automation tools like n8n.
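To illustrate the kind of lead scoring described above, here is a hedged sketch. The weights and field names are assumptions for illustration only; the workflow's actual scoring node may weigh different criteria and use different values.

```typescript
// Illustrative lead-scoring sketch based on the criteria named above.
// Weights and field names are assumptions, not the template's exact logic.
interface PropertyLead {
  absenteeOwner: boolean;
  yearsOwned: number;
  equityPercent: number;
  taxDelinquent: boolean;
}

function leadScore(p: PropertyLead): number {
  let score = 0;
  if (p.absenteeOwner) score += 25;        // owner doesn't live at the property
  if (p.yearsOwned >= 10) score += 20;     // long tenure often means high equity
  if (p.equityPercent >= 50) score += 30;  // strong equity position
  if (p.taxDelinquent) score += 25;        // possible motivated seller
  return score;                            // 0-100; higher scores get follow-up first
}
```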
by Dataki
This workflow allows you to easily evaluate and compare the outputs of two language models (LLMs) before choosing one for production. In the chat interface, both model outputs are shown side by side. Their responses are also logged into a Google Sheet, where they can be evaluated manually or automatically using a more advanced model. Use Case You're developing an AI agent, and since LLMs are non-deterministic, you want to determine which one performs best for your specific use case. This template is designed to help you compare them effectively. How It Works The user sends a message to the chat interface. The input is duplicated and sent to two different LLMs. Each model processes the same prompt independently, using its own memory context. Their answers, along with the user input and previous context, are logged to Google Sheets. You can review, compare, and evaluate the model outputs manually (or automate it later). In the chat, both responses are also shown one after the other for direct comparison. How To Use It Copy this Google Sheets template (File > Make a Copy). Set up your System Prompt and Tools in the AI Agent node to suit your use case. Start chatting! Each message will trigger both models and log their responses to the spreadsheet. Note: This version is set up for two models. If you want to compare more, you’ll need to extend the workflow logic and update the sheet. About Models You can use OpenRouter or Vertex AI to test models across providers. If you're using a node for a specific provider, like OpenAI, you can compare different models from that provider (e.g., gpt-4.1 vs gpt-4.1-mini). Evaluation in Google Sheets This is ideal for teams, allowing non-technical stakeholders (not just data scientists) to evaluate responses based on real-world needs. Advanced users can automate this evaluation using a more capable model (like o3 from OpenAI), but note that this will increase token usage and cost. Token Considerations Since each input is processed by two different models, the workflow will consume more tokens overall. Keep an eye on usage, especially if working with longer prompts or running multiple evaluations, as this can impact cost.
by Alfonso Corretti
Gmail to Vector Embeddings with PGVector and Ollama Who is this for? Everyone! Did you dream of asking an AI "what hotel did I stay in for holidays last summer?" or "what were my marks last semester like?". Dream no more, as vector similarity searches and this workflow are the foundations to make it possible (as long as the information appears in your e-mails 😅). 100% local This workflow is designed to use locally hosted open-source software: Ollama as the LLM provider, nomic-embed-text as the embeddings model, and pgvector as the vector database engine, on top of Postgres. But... how?! First, specify the date you created your Gmail account, then manually run the workflow to bulk-read all your e-mail in monthly batches. Your database is now populated! From then on, querying the vector database is the task of other workflows. Activate the workflow so that new e-mail is continuously added by the Gmail Trigger upon receiving it. Structured AND Vectorized This workflow stores your e-mail activity in two ways: in a structured table, and in a vector embeddings table. The information in both can be correlated by Gmail's message id, which is stored in the vectors table as the metadata property emails_metadata.id. That way consumers can benefit from both worlds! ✨ Vector similarity searches enable semantic searches, while structured queries can retrieve more factual data like the message id, its date, or who it came from. A sketch of such a correlated query follows below. Other useful templates My template Chat with Your Email History using Telegram, Mistral and Pgvector for RAG is a ready-made solution to consume this workflow. You may also pair this workflow with my other template, Email Assistant: Convert Natural Language to SQL Queries with Phi4-mini and PostgreSQL, and you'll enable RAG workflows that use both structured and vectorized databases. Customizations The e-mail provider could be changed, but then you'd have to identify an alternative id field; Message-ID would be a more standard option. There are a few opinionated choices as to what metadata to store, but those shouldn't need adjustments.
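Here is a hedged sketch of such a correlated query, using the pg client and pgvector's cosine-distance operator. Every table and column name in it (emails, emails_vectors, gmail_id, embedding, text) is an assumption for illustration; match them to what your own Postgres/pgvector setup actually created, including the exact JSON path to the stored message id.

```typescript
import { Client } from "pg";

// Correlate vector hits back to the structured table via the Gmail message
// id stored in the vectors table's metadata. All identifiers are assumptions.
async function findRelatedEmails(queryEmbedding: number[]) {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();

  // <=> is pgvector's cosine-distance operator; lower means more similar.
  const { rows } = await client.query(
    `SELECT e.gmail_id, e.sender, e.received_at, v.text
       FROM emails_vectors v
       JOIN emails e ON e.gmail_id = v.metadata #>> '{emails_metadata,id}'
      ORDER BY v.embedding <=> $1::vector
      LIMIT 5`,
    [`[${queryEmbedding.join(",")}]`] // pgvector accepts a "[x,y,...]" literal
  );

  await client.end();
  return rows;
}
```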