by Dart
# Task-based Assignee billing via Time Tracking

This workflow automates billing by scanning a target Dartboard on a schedule, aggregating time logs from completed tasks, cross-referencing assignee rates in Google Sheets, calculating total pay, and updating the sheet with final billable hours and amounts.

## Who's it for

Individuals, agencies, companies, and project managers automating payroll or client invoicing from task data.

## How to set up

1. Link your Dart and Google accounts.
2. Replace the dummy ID in the List tasks node with your actual target Dartboard ID.
3. Set your preferred run frequency (e.g., Weekly).
4. Create a Google Sheet with these exact headers: Name, HourlyRate, TotalHours, TotalPay, DateCalculated.
5. Connect the Sheet nodes to your file.
6. Pre-fill Name (matching Dart Assignees exactly) and HourlyRate in your Google Spreadsheet.
7. Optional: add a Status column as the last header in the sheet to track whether a bill is paid or pending.

## Customizing the workflow

- Choose the AI model used for time tracking and assignee scanning.
- Use your own Google Sheets account and target spreadsheet document.
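The core pay calculation (TotalPay = HourlyRate × TotalHours) is simple enough to sketch as an n8n Code-node style function. The input row shape below is an assumption; only the field names are taken from the sheet headers above.

```javascript
// Sketch of the pay-calculation step, as it might run in an n8n Code node.
// Input: one row per assignee with aggregated hours and the rate pulled
// from the Google Sheet. Output matches the sheet's header columns.
function calculatePay(rows) {
  return rows.map((row) => ({
    Name: row.Name,
    HourlyRate: row.HourlyRate,
    TotalHours: row.TotalHours,
    // Round to 2 decimals for currency amounts
    TotalPay: Math.round(row.HourlyRate * row.TotalHours * 100) / 100,
    DateCalculated: new Date().toISOString().slice(0, 10),
  }));
}
```

For example, an assignee at 50/hour with 7.5 logged hours comes out at a TotalPay of 375.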
by Ryo Sayama
## Who is this for

This workflow is built for tutoring schools, cram schools, and independent teachers who want to offer students 24/7 Q&A support via LINE — Japan's most widely used messaging app. No coding knowledge is required to set it up.

## What this workflow does

When a student sends a question on LINE, an AI Agent (powered by Gemini) reads the conversation history stored in Google Sheets, searches your teaching materials in Google Drive, and replies with a clear explanation directly in LINE. If the agent cannot find a confident answer — for example, the topic is not covered in the materials or the question is too complex — it automatically sends an escalation alert to a designated Slack channel so a teacher can follow up personally. The agent also logs every question and its resolution status to Google Sheets, giving teachers a running record of each student's weak points and question frequency.

## How to set up

1. Connect your LINE Messaging API credentials in n8n.
2. Create a Google Sheet with columns: student ID, timestamp, question, answer, resolved (yes/no).
3. Upload your teaching materials (PDF or text files) to a Google Drive folder.
4. Set the Drive folder ID and Sheet ID in the Set Fields node.
5. Add your Slack webhook URL for teacher escalation notifications.
6. Activate the workflow and share the LINE bot link with students.

## Requirements

- LINE Messaging API channel (free tier works)
- Google Drive folder with study materials (PDF or .txt)
- Google Sheets spreadsheet for conversation history and progress tracking
- Slack incoming webhook URL
- Gemini API key (free tier available)

## How to customize

You can swap Google Drive for a different file source, adjust the Gemini prompt in the AI Agent node to change the explanation style (e.g., simpler language for younger students), or add more Slack channels to route questions by subject. The escalation threshold can be tuned by editing the condition in the If node.
by Ryo Sayama
## Who is this for

Anyone who wants a fun and practical AI chatbot on LINE. Great for people who enjoy getting advice from multiple angles — whether they face work stress, personal dilemmas, or everyday decisions.

## What this workflow does

When a user sends a text message to the LINE bot, the workflow:

1. Parses the incoming LINE Webhook event
2. Passes the message to Google Gemini via a Basic LLM Chain
3. Gemini replies as three distinct personas in a single structured response
4. The advice is logged to Google Sheets for history tracking
5. A Flex Message carousel is sent back to the user — one card per persona, each color-coded

The three personas:

- 🔮 Fortune Teller — mystical, fate-driven advice
- 💼 Business Coach — logical, action-oriented guidance
- 😊 Best Friend — casual, empathetic encouragement

## How to set up

1. Create a LINE Messaging API channel and copy the Channel Access Token
2. Set your n8n webhook URL as the LINE Webhook URL
3. Create a Google Sheets spreadsheet with a sheet named advice_history and these headers in row 1: Timestamp, User ID, Message, Fortune Teller, Business Coach, Best Friend
4. Open the Set config node and paste your LINE token and Sheet ID
5. Connect your Google Gemini credential to the Google Gemini Chat Model node
6. Connect your Google Sheets credential to the Save advice to Sheets node
7. Activate the workflow and send a message to your LINE bot

## Requirements

- LINE Messaging API channel (free)
- Google Gemini API key (free tier available at aistudio.google.com)
- Google Sheets (any Google account)

## How to customize

- Change the three personas in the Generate advice with Gemini prompt to fit your use case (e.g. therapist, investor, comedian)
- Adjust the Flex Message card colors in Send Flex Message to LINE
- Add extra columns to Google Sheets to track additional metadata
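The one-card-per-persona carousel described above maps onto LINE's Flex Message JSON (a carousel of bubbles). A minimal sketch of the builder follows; the hex colors and the advice field names are illustrative assumptions, not taken from the workflow.

```javascript
// Build a LINE Flex Message carousel with one color-coded bubble per persona.
// The persona keys and header colors here are illustrative assumptions.
function buildAdviceCarousel(advice) {
  const personas = [
    { key: 'fortuneTeller', title: '🔮 Fortune Teller', color: '#7B2FBE' },
    { key: 'businessCoach', title: '💼 Business Coach', color: '#1E6FD9' },
    { key: 'bestFriend', title: '😊 Best Friend', color: '#2EA44F' },
  ];
  return {
    type: 'flex',
    altText: 'Advice from three personas',
    contents: {
      type: 'carousel',
      contents: personas.map((p) => ({
        type: 'bubble',
        header: {
          type: 'box', layout: 'vertical', backgroundColor: p.color,
          contents: [{ type: 'text', text: p.title, color: '#FFFFFF', weight: 'bold' }],
        },
        body: {
          type: 'box', layout: 'vertical',
          contents: [{ type: 'text', text: advice[p.key], wrap: true }],
        },
      })),
    },
  };
}
```

The resulting object can be posted as a message to the LINE Messaging API reply endpoint with the Channel Access Token from setup step 1.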
by Msaid Mohamed el hadi
# 🧠 Browsing History Automation Analyzer – Automation Toolkit (Google Sheets + AI)

This n8n workflow analyzes your browsing history to identify opportunities for automation. It reads history from a Google Sheet, groups visits by domain, filters out irrelevant entries, and uses AI to recommend what can be automated — including how and why.

## 📌 What It Does

- 📄 Reads your browsing history from Google Sheets
- 🌐 Groups history by domain
- 🚫 Filters out common non-actionable domains (e.g., YouTube, Google)
- 🤖 Uses AI to analyze whether your activity on each site is automatable
- 💡 Provides suggestions including what to automate, how to do it, and which tools to use
- 📝 Saves results into a new tab in the same Google Sheet
- 🔍 Searches for n8n workflow templates related to the suggested automation

## 📊 Demo Sheet

Input and output are handled via the following Google Sheet:

📎 Spreadsheet: View on Google Sheets

- **Sheet: history** → Input browsing history
- **Sheet: automations** → Output AI automation suggestions

## 🧠 AI Analysis Logic

The AI agent receives each domain's browsing history and responds with:

- domain: The website domain
- automatable: true/false
- what_to_automate: Specific actions that can be automated
- reason: Why it's suitable (or not) for automation
- tool: Suggested automation tool (e.g., n8n, Apify)
- automation_rating: High, Medium, Low, or Not Automatable
- n8n_template: Relevant automation template (if found)

## 🔧 Technologies Used

| Tool | Purpose |
|------|---------|
| n8n | Workflow automation |
| LangChain AI Agent | AI-based analysis |
| Google Sheets Node | Input/output data handling |
| OpenRouter (LLM) | Language model for intelligent reasoning |
| JavaScript Code Node | Grouping and formatting logic |
| Filter Node | Remove unwanted domains |
| HTTP Request Node | Search n8n.io templates |

## 💻 Chrome History Export

You can use this Chrome extension to export your browsing history in a format compatible with the workflow:

🔗 Export Chrome History Extension

## 📧 Want Personalized Automation Advice?

If you'd like personalized automation recommendations based on your browsing history, just like what this workflow provides, feel free to contact me directly:

> 📩 msaidwolfltd@gmail.com

I'll help you discover what tasks you can automate to save time and boost productivity.

## 🚀 Example Use Cases

- Automate daily logins to dashboards
- Auto-fill forms on repetitive websites
- Schedule data exports from web portals
- Trigger reminders based on recurring visits
- Discover opportunities for scraping and integration

## 📜 License

This workflow is provided as-is for educational and personal use. For commercial or customized use, contact the author.
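The grouping and filtering handled by the JavaScript Code and Filter nodes above could be sketched like this. The input row shape (`url` and `title` fields) is an assumption about how the history sheet is laid out.

```javascript
// Group browsing-history rows by domain and drop non-actionable domains,
// mirroring the Code-node grouping and Filter-node steps described above.
const SKIP = new Set(['www.youtube.com', 'www.google.com']);

function groupByDomain(rows) {
  const groups = {};
  for (const row of rows) {
    const domain = new URL(row.url).hostname;
    if (SKIP.has(domain)) continue; // non-actionable domain
    (groups[domain] ??= []).push(row.title);
  }
  return groups;
}
```

Each resulting domain group is then handed to the AI agent as one analysis unit.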
by Ertay Kaya
# Generate and post Apple App Store review replies with Anthropic Claude, Google Drive and App Store Connect API

This workflow empowers app developers and community management teams by automating the generation and posting of responses to user reviews on the Apple App Store. Designed to streamline the engagement process, it drastically reduces the manual workload on community managers by integrating AI-driven responses with necessary human oversight. By leveraging n8n's workflow automation capabilities, this solution eliminates the need for costly third-party platforms like AppFollow or Appbot, making it a cost-effective and efficient alternative.

## Pre-requisites

- Google Drive & Google Sheets access: to store and manage review spreadsheets.
- App Store Connect API credentials: to fetch and respond to app reviews.
- LLM credentials (e.g., Anthropic): required for generating responses.
- Slack account

## Workflow steps

1. **Initialise and trigger workflow**: The process begins daily at 10 AM through a scheduled trigger.
2. **Fetch application data**: Uses a data table (Apple App Store apps) to retrieve a list of applications with their app ID and name, essential for identifying review sources.
3. **Collect App Store reviews**: Retrieves the previous day's reviews from the App Store Connect API based on the app data and stores them in Google Sheets for further processing.
4. **Generate AI responses**: The AI model generates initial responses based on review content. Responses are structured and stored along with the reviews in a Google Spreadsheet located in a Google Drive folder called ToReview. A Slack message is sent with the URL of the file.
5. **Human review & modification**: Community managers review and refine the AI-generated responses. Reviewed spreadsheets are moved to the ToSubmit Google Drive folder by the editor.
6. **Post verified responses**: The workflow triggers again at 5 PM to access the reviewed spreadsheets in the ToSubmit folder and posts the human-verified responses back to the respective reviews on the Apple App Store using the App Store Connect API. Logs are maintained, recording each response's success or failure.
7. **Archive processed spreadsheets**: After posting the responses, the workflow moves the processed files from ToSubmit to a different folder called Archived.
by Luan Correia
## 🔍 Overview

This template uses Firecrawl's /search API to perform AI-powered web scraping and screenshots — no code required. Just type natural language prompts, and an AI Agent will convert them into precise Firecrawl queries.

## ⚙️ Setup

1. Get your Firecrawl API key from https://firecrawl.dev
2. Add it to n8n using HTTP Header Auth:
   - Key: Authorization
   - Value: Bearer YOUR_API_KEY

## 🚀 What It Does

- Turns natural language into smart search queries
- Scrapes web data and captures full-page screenshots
- Returns titles, links, content, and images

## 💡 Example

Input:

> Find AI automation pages on YouTube (exclude Shorts)

Result:

```json
{
  "query": "intitle:AI automation site:youtube.com -shorts",
  "limit": 5
}
```
by Oneclick AI Squad
This n8n workflow automates AWS S3 bucket and file operations (create, delete, upload, download, copy, list) by parsing simple email commands and sending back success or error confirmations.

## Good to know

- The workflow processes email requests via a Start Workflow (GET Request) node.
- Data extraction from emails identifies S3 operation commands.
- Error handling is included for invalid or missing email data.
- Responses are sent via email for each action performed.

## How it works

1. **Start Workflow (GET Request)** - Captures incoming email requests.
2. **Extract Data from Email** - Parses email content to extract S3 operation commands.
3. **Check Task Type** - Validates the type of task (e.g., create bucket, delete file).
4. **Create a Bucket** - Creates a new S3 bucket.
5. **Delete a Bucket** - Deletes an existing S3 bucket.
6. **Copy a File** - Copies a file within S3.
7. **Delete a File** - Deletes a file from S3.
8. **Download a File** - Downloads a file from S3.
9. **Upload a File** - Uploads a file to S3.
10. **Get Many Files** - Lists multiple files in a bucket.
11. **Check Success or Fail** - Determines the outcome of the operation.
12. **Send Success Email** - Sends a success confirmation email.
13. **Send Failed Email** - Sends a failure notification email.

## How to use

1. Import the workflow into n8n.
2. Configure the Start Workflow (GET Request) node to receive email commands.
3. Test the workflow with sample email commands (e.g., "create bucket: my-bucket", "upload file: document.pdf").
4. Monitor email responses and adjust command parsing if needed.

Example email for testing: List files from the bucket json-test in Mumbai region.

## Requirements

- AWS S3 credentials configured in n8n.
- Email service integration (e.g., SMTP settings).
- n8n environment with workflow execution permissions.

## Customizing this workflow

- Adjust the Extract Data from Email node to support additional command formats.
- Modify the Send Success Email or Send Failed Email nodes to customize messages.
- Update the S3 nodes to include additional bucket or file attributes.
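The Extract Data from Email step can be sketched as a small parser for the `verb: argument` format used in the sample commands above. The exact command set and the helper name are assumptions for illustration.

```javascript
// Parse simple email commands like "create bucket: my-bucket" or
// "upload file: document.pdf" into a task type plus its target.
// The command table below is an illustrative assumption.
const COMMANDS = {
  'create bucket': 'createBucket',
  'delete bucket': 'deleteBucket',
  'upload file': 'uploadFile',
  'delete file': 'deleteFile',
  'download file': 'downloadFile',
  'copy file': 'copyFile',
};

function parseCommand(text) {
  const match = text.trim().match(/^([A-Za-z ]+):\s*(.+)$/);
  if (!match) return { error: 'Unrecognized command format' };
  const task = COMMANDS[match[1].trim().toLowerCase()];
  if (!task) return { error: 'Unknown task: ' + match[1].trim() };
  // Keep the target's original case (bucket names and filenames matter)
  return { task, target: match[2].trim() };
}
```

The returned `task` value would drive the Check Task Type branch, while unparseable input routes to the Send Failed Email path.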
by Matthew
# Multi-Channel Cold Email Generator (LinkedIn + Website Fallback)

## Description

This workflow automates the generation of hyper-personalized cold emails. It intelligently switches between two data sources: LinkedIn activity and the company website. If the lead has recent LinkedIn posts, the AI generates an icebreaker referencing their specific thoughts or news. If no posts are found, the workflow falls back to scraping their company website and generating an angle based on their business proposition.

## How it works

1. **Fetch data**: Pulls a list of leads from a Google Sheet.
2. **Scrape LinkedIn**: Uses Apify to attempt to scrape recent posts for the lead.
3. **Conditional logic**:
   - Path A (posts found): Aggregates the posts, analyzes the context using GPT-4, and writes an email referencing the content.
   - Path B (no posts): Scrapes the URL provided in companyWebsite, converts the HTML to Markdown, analyzes the company value prop, and writes an email based on that.
4. **Save results**: Writes the generated Icebreaker, Intro, and CompanyType back to the original Google Sheet.

## Requirements

- **n8n**: Self-hosted or Cloud.
- **Google Sheets account**: A sheet containing columns for email_final, linkedin_url, and companyWebsite.
- **Apify account**: You must have the LinkedIn Scraper actor (ID: A3cAPGpwBEG8RJwse or similar) configured and an API token.
- **OpenAI API key**: Access to the GPT-4 model is recommended for best quality.

## Setup instructions

1. **Import the JSON**: Copy the provided JSON template and paste it into your n8n canvas.
2. **Configure credentials**: Set up your Google Sheets and OpenAI credentials in n8n.
3. **Apify token**: Locate the Apify LinkedIn Scraper node (HTTP Request). In Header Parameters > Authorization, replace YOUR_APIFY_API_TOKEN with your actual Apify Bearer token.
4. **Google Sheet configuration**: Open the Fetch Leads node and select your Sheet and Workbook. Then open both Update Row nodes (there are two: one for the Website branch, one for the LinkedIn branch) and ensure they point to the same Sheet/Workbook.
5. **Customize AI prompts**: Open the two Write Email Copy nodes. In the system prompt, look for [YOUR_BUSINESS_TYPE] and [YOUR_COMPANY_NAME]. Replace these with your actual business details to ensure the AI generates relevant outreach.

## Customization

- **Model selection**: You can switch the OpenAI model to gpt-3.5-turbo to save costs, though the quality of the icebreakers may decrease.
- **Output columns**: The workflow currently outputs Icebreaker, intro, and companyType. You can modify the Update Row nodes to map these to different column headers in your sheet if needed.
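The conditional logic between the two paths boils down to a check on the scraper output. A sketch of that routing decision follows; the field names (`linkedinPosts`, `companyWebsite`) are assumptions about the item shape at that point in the workflow.

```javascript
// Route a lead down Path A (LinkedIn) when the Apify scraper returned
// posts, otherwise fall back to Path B (company website).
function chooseBranch(lead) {
  const posts = lead.linkedinPosts ?? [];
  return posts.length > 0
    ? { branch: 'linkedin', context: posts.map((p) => p.text).join('\n\n') }
    : { branch: 'website', context: lead.companyWebsite };
}
```

The `context` string is what each branch's Write Email Copy prompt would receive: aggregated post text on Path A, the site URL (later scraped and converted to Markdown) on Path B.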
by osama goda
## What this workflow does

This AI agent researches any product across e-commerce marketplaces and generates a full market analysis report from a single chat message. Tell it what you're looking for, your budget, and optionally the region — it handles the rest.

The agent will:

- Search across marketplaces (Amazon, Noon, Jumia, AliExpress, and more)
- Scrape the top product pages for real pricing, ratings, and reviews
- Analyze the data and return a structured report including a market overview, top products with links, buying insights, common complaints, and a recommendation with market gap analysis for sellers

## Key features

- **Multi-marketplace support**: Amazon (.com, .eg, .sa, .ae), Noon, Jumia, AliExpress, eBay
- **Regional awareness**: Automatically detects Egypt, Saudi Arabia, UAE, or defaults to global search
- **Bilingual**: Works in English and Arabic (Egyptian dialect)
- **Currency-aware**: Uses EGP, SAR, AED, or USD based on user input
- **Market gap analysis**: Identifies selling opportunities, not just buying recommendations

## Example prompts

- "Research the best wireless earbuds under $30"
- "ابحثلي عن أحسن ماكينة قهوة في مصر أقل من 2500 جنيه" ("Find me the best coffee machine in Egypt for under 2,500 EGP")
- "Find the best robot vacuum on Amazon under $200"
- "ابحثلي عن أحسن powerbank في السعودية أقل من 100 ريال" ("Find me the best power bank in Saudi Arabia for under 100 SAR")

## How it works

1. The user sends a product research request via the chat trigger.
2. The AI Agent plans a search strategy based on product, budget, and region.
3. Firecrawl Search finds relevant product and review pages across the web.
4. Firecrawl Scrape extracts detailed product data from the top results.
5. The AI analyzes everything and generates a structured report with actionable insights.

## Set up steps (takes 2 minutes)

1. **Firecrawl API key**: Sign up free at firecrawl.dev and add your API key to both Firecrawl nodes.
2. **LLM API key**: Add your OpenAI or OpenRouter API key to the Chat Model node. Recommended model: any model with a large context window (e.g., Gemini 2.5 Flash, GPT-4o).
3. **Start researching**: Click "Open Chat" and type your product research query.

## Who is this for

- E-commerce sellers doing product research
- Dropshippers finding winning products
- Affiliate marketers comparing products
- Anyone who wants to shop smarter

## Built with

- n8n AI Agent node
- Firecrawl (Search + Scrape)
- Compatible with OpenAI, OpenRouter, Google Gemini, and other LLM providers
by phil
This workflow automates the search and extraction of hotel data from Booking.com. Triggered by a chat message, it uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels. The final output is a clean, formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation options.

## Who's it for

This template is ideal for:

- **Event planners**: Quickly identify and compare hotel options for conferences, meetings, or group travel.
- **Travel agents**: Efficiently research and provide clients with a curated list of accommodations based on their specified destination.
- **Business travelers**: Instantly find and assess hotel availability and pricing for upcoming trips.
- **Individuals**: Streamline the hotel search process for personal vacations or short-term stays.

## How it works

1. The workflow is triggered by a chat message containing a city name from an n8n chat application.
2. It uses Bright Data to initiate a web scraping job on Booking.com for the specified city.
3. The workflow continuously checks the status of the scraping job. Once the data is ready, it downloads the snapshot.
4. The extracted data is passed to a custom AI agent powered by OpenRouter. This agent uses a calculator tool to convert prices and an instruction prompt to refine and format the raw data.
5. The final output is a well-presented list of hotels, ready for display in the chat application.

## How to set up

1. **Bright Data credentials**: Sign up for a Bright Data account and create a Web Scraper dataset. In n8n, create new Bright Data API credentials and copy your API key.
2. **OpenRouter credentials**: Create an account on OpenRouter and get your API key. In n8n, create new OpenRouter API credentials and paste your key.
3. **Chat Trigger node**: Configure the "When chat message received" node and copy the production webhook URL to integrate with your preferred chat platform.

## Requirements

- An active n8n instance.
- A Bright Data account with a Web Scraper dataset.
- An OpenRouter account with API access.

## How to customize this workflow

- **Search parameters**: The "Initiate batch extraction from URL" node can be modified to change search criteria, such as check-in/check-out dates, number of adults and children, or property type.
- **Output format**: Edit the "Human Friendly Results" node's system message to change the format of the final report. You can modify the prompt to generate a JSON object, a CSV, or a different text format.
- **Price conversion**: The "Calculator" tool can be adjusted to perform different mathematical operations or currency conversions by modifying the AI agent's prompt.

Phil | Inforeole | LinkedIn 🇫🇷. Contact us to automate your processes.
by Robin Geuens
## Overview

Use this workflow to create SEO-friendly outlines based on articles that rank well in Google. Enter a keyword, and the workflow scrapes the top results, extracts their content, analyzes it with AI, and builds a MECE (mutually exclusive, collectively exhaustive) outline. It's useful for content creators and SEO specialists who want relevant, well-structured content.

## How it works

1. Accepts a keyword submitted through a form.
2. Uses SerpAPI to get the top Google results for a chosen country.
3. Collects the top five URLs. We use five because we expect some to fail at the scraping stage.
4. Scrapes each URL separately, using the first three articles to fit the AI model's context window.
5. Extracts the main text from the page body and converts the HTML to Markdown to get rid of tags and attributes.
6. Combines the cleaned text into a single list for AI processing.
7. Analyzes the content with an AI language model to find common topics and headings.
8. Generates an SEO-focused outline based on the most frequent topics.

## Setup steps

1. Sign up for a SerpAPI account (free tier available).
2. Create an OpenAI account and get an API key.
3. Set up your credentials within n8n.
4. Run the workflow and enter your keyword in the form. The workflow will generate an SEO-friendly outline for your content.

## Improvement ideas

- Add another LLM step to turn the outline into an article.
- Use the Google Docs API to add the outline to a Google Doc.
- Enrich the outline with data from Perplexity or Tavily.
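The extract-and-clean step can be illustrated with a naive tag stripper. A real run would rely on n8n's HTML and Markdown nodes for this, so treat the helper below as a sketch only, with an assumed function name.

```javascript
// Naive extraction of readable text from a scraped page: keep the <body>,
// drop script/style blocks, then strip the remaining tags. n8n's built-in
// HTML and Markdown nodes do this more robustly in the actual workflow.
function extractMainText(html) {
  const body = (html.match(/<body[^>]*>([\s\S]*?)<\/body>/i) || [, html])[1];
  return body
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, ' ') // remove code/style blocks
    .replace(/<[^>]+>/g, ' ')                        // strip remaining tags
    .replace(/\s+/g, ' ')                            // collapse whitespace
    .trim();
}
```

The cleaned strings from the first three articles would then be combined into the single list handed to the AI model.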
by Luka Zivkovic
## Description

A production-ready authentication workflow implementing secure user registration, login, token verification, and refresh token mechanisms. Perfect for adding authentication to any application without needing a separate auth service. Get started with n8n now!

## What it does

This template provides a complete authentication backend using n8n workflows and Data Tables:

- **User registration**: Creates accounts with secure password hashing (SHA-512 + unique salts)
- **Login system**: Generates access tokens (15 min) and refresh tokens (7 days) using JWT
- **Token verification**: Validates access tokens for protected endpoints
- **Token refresh**: Issues new access tokens without requiring re-login
- **Security features**: HMAC-SHA256 signatures, hashed refresh tokens in the database, protection against rainbow table attacks

## Why use this template

- **No external services**: Everything runs in n8n - no Auth0, Firebase, or third-party dependencies
- **Production-ready security**: Industry-standard JWT implementation with proper token lifecycle management
- **Easy integration**: Simple REST API endpoints that work with any frontend framework
- **Fully customizable**: Adjust token lifespans, add custom user fields, implement your own business logic
- **Well-documented**: Extensive inline notes explain every security decision and implementation detail

## How to set up

### Prerequisites

- n8n instance (cloud or self-hosted)
- n8n Data Tables feature enabled

### Setup steps

1. **Create Data Tables**:
   - users table: id, email, username, password_hash, refresh_token
   - refresh_tokens table: id, user_id, token_hash, expires_at
2. **Generate secret keys**: Run this command to generate a random secret, and generate two different secrets for ACCESS_SECRET and REFRESH_SECRET:

   ```shell
   node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
   ```

3. **Configure secrets**: Update the three "SET ACCESS AND REFRESH SECRET" nodes with your generated keys, or migrate to n8n Variables for better security (instructions in the workflow notes).
4. **Connect Data Tables**: Open each Data Table node and select your created tables from the dropdown.
5. **Activate the workflow**: Save and activate the workflow, then note your webhook URLs.

## API endpoints

- **Register**: POST /webhook/register-user
  Request body: `{ "email": "user@example.com", "username": "username", "password": "password123" }`
- **Login**: POST /webhook/login
  Request body: `{ "email": "user@example.com", "password": "password123" }`
  Returns: `{ "accessToken": "...", "refreshToken": "...", "user": {...} }`
- **Verify token**: POST /webhook/verify-token
  Request body: `{ "access_token": "your_access_token" }`
- **Refresh**: POST /webhook/refresh
  Request body: `{ "refresh_token": "your_refresh_token" }`

## Frontend integration example (Vue.js/React)

Login flow:

```javascript
const response = await fetch('https://your-n8n.app/webhook/login', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ email, password })
});
const { accessToken, refreshToken } = await response.json();
localStorage.setItem('accessToken', accessToken);
```

Make authenticated requests:

```javascript
const data = await fetch('https://your-api.com/protected', {
  headers: { 'Authorization': `Bearer ${accessToken}` }
});
```

## Key features

- **Secure password storage**: Never stores plain-text passwords; uses SHA-512 with unique salts
- **Two-token system**: Short-lived access tokens (security) + long-lived refresh tokens (convenience)
- **Database token revocation**: Refresh tokens can be revoked for logout-all-devices functionality
- **Duplicate prevention**: Checks username and email availability before account creation
- **Error handling**: Generic error messages prevent information leakage
- **Extensive documentation**: 30+ sticky notes explain every security decision

## Use cases

- SaaS applications needing user authentication
- Mobile app backends
- Internal tools requiring access control
- MVP/prototype authentication without third-party costs
- Learning JWT and auth system architecture

## Customization

- **Token lifespan**: Modify expiration times in the "Create JWT Payload" nodes
- **User fields**: Add custom fields to registration and the user profile
- **Password rules**: Update validation in the "Validate Registration Request" node
- **Token rotation**: Implement refresh token rotation for enhanced security (notes included)

## Security notes

⚠️ Important:

- Change the default secret keys before production use
- Use HTTPS for all webhook endpoints
- Store secrets in n8n Variables (not hardcoded)
- Regularly rotate secret keys in production
- Consider rate limiting for login endpoints

## Support & documentation

The workflow includes comprehensive documentation:

- Complete authentication flow overview
- Security explanations for every decision
- Troubleshooting guide
- Setup instructions
- FAQ section with common issues

Perfect for developers who want full control over their authentication system without the complexity of managing separate auth infrastructure. Get started with n8n now!

Tags: authentication, jwt, login, security, user-management, tokens, password-hashing, api, backend