by Alexandru Burca
Automated multilingual article publishing from RSS feeds to WordPress using ACF

# Who's it for
This workflow is built for news publishers, media organizations, and content aggregators who need to automatically:
- pull articles from RSS feeds
- rewrite them into original text
- translate them into multiple languages
- generate a featured image
- publish everything directly to WordPress

It is ideal for multilingual news portals, editorial teams with limited resources, and businesses that want to automate high-volume content production.

# How it works
The workflow monitors a selected RSS feed at regular intervals and retrieves new article links. It scrapes each article's HTML and uses AI to extract structured text: the title, the full content, and a short summary. The text is then rewritten into an original article tailored to your target audience's language and country context. Next, the workflow translates the rewritten article into any number of additional languages while preserving the formatting. It also generates a unique AI-based featured image, uploads it to WordPress, assembles multilingual ACF fields, and publishes the final post with the correct metadata (a hedged sketch of the publishing request follows this entry).

# How to set up
Insert your RSS feed URL, add your OpenAI and Replicate API keys, configure your WordPress API credential, and ensure the ACF fields on your site match the workflow's naming structure.

# Requirements
- WordPress with REST API enabled
- ACF WP Plugin installed
- OpenAI API key
- Replicate API key
- Firebase API key

# How to customize the workflow
Adjust the RSS source, modify the default language and list of translated languages, change the rewriting style or country context, refine the image generation prompt, or remap ACF fields to match your WordPress layout.
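The sketch below illustrates the final publishing call the workflow makes against the WordPress REST API. It assumes WordPress application passwords for authentication and that your ACF fields are exposed to the REST API ("Show in REST API" enabled); the field names in the `acf` object and the base URL are placeholders, not the template's exact configuration.

```typescript
// Minimal sketch of publishing a post with ACF fields via the WordPress REST API.
// Assumptions: application-password auth, ACF fields exposed in REST, and
// hypothetical field names in the acf payload.
const WP_BASE = "https://example.com"; // placeholder: your WordPress URL

interface MultilingualPost {
  title: string;
  content: string;
  featuredMediaId: number;          // media ID returned by the image upload step
  acf: Record<string, string>;      // multilingual ACF fields assembled earlier
}

async function publishPost(post: MultilingualPost): Promise<void> {
  const auth = Buffer.from("wp-user:application-password").toString("base64");

  const res = await fetch(`${WP_BASE}/wp-json/wp/v2/posts`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify({
      title: post.title,
      content: post.content,
      status: "publish",
      featured_media: post.featuredMediaId,
      acf: post.acf, // e.g. { title_en: "...", title_fr: "...", content_fr: "..." }
    }),
  });

  if (!res.ok) throw new Error(`WordPress returned ${res.status}`);
  console.log("Published post", (await res.json()).id);
}
```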
by Port IO
Complete security workflow from vulnerability detection to automated remediation, with severity-based routing and full organizational context from Port's catalog. This template provides end-to-end lifecycle management including automatic Jira ticket creation with appropriate priority, AI-powered remediation planning, and Claude Code-triggered fixes for critical vulnerabilities. The full guide is available here.

## How it works
The n8n workflow orchestrates the following steps:
- **Webhook trigger**: Receives vulnerability alerts from security scanners (Snyk, Wiz, SonarQube, etc.) via POST request.
- **Port context enrichment**: Uses Port's n8n node to query your software catalog for service metadata, ownership, environment, SLA requirements, and dependencies related to the vulnerability.
- **AI remediation planning**: OpenAI analyzes the vulnerability with Port context and generates a remediation plan, determining whether automated fixing is possible.
- **Severity-based routing**: Routes vulnerabilities through different paths based on severity level:
  - Critical: Jira ticket (Highest priority) → check if auto-fixable → trigger Claude Code fix → Slack alert with fix status
  - High: Jira ticket (High priority) → Slack notification to team channel
  - Medium/Low: Jira ticket only, for tracking
- **Jira integration**: Creates tickets with full context, including vulnerability details, affected service information from Port, and AI-generated remediation steps.
- **Claude Code remediation**: For auto-fixable critical vulnerabilities, triggers Claude Code via a Port action to create a pull request with the security patch, referencing the Jira ticket.
- **Slack notifications**: Sends contextual alerts to the appropriate team channel (retrieved from Port) with the Jira ticket reference and remediation status.

## Prerequisites
- You have a Port account and have completed the onboarding process.
- Services and repositories are cataloged in Port with ownership information.
- Your security scanner (Snyk, Wiz, SonarQube) can send webhooks.
- You have a working n8n instance (Cloud or self-hosted) with Port's n8n custom node installed.
- Jira Cloud account with appropriate project permissions.
- Slack workspace with bot permissions to post messages.
- OpenAI API key for remediation planning.

## Setup
1. Register for free on Port.io if you haven't already.
2. Create the Context Retriever Agent in Port following the guide.
3. Import the workflow and configure credentials (Port, Jira, Slack, OpenAI, Bearer Auth).
4. Select your Jira project in each Jira node (Critical, High, Medium/Low).
5. Update default-organization/repository with your default repository for Claude Code fixes.
6. Point your security scanner webhook to the workflow URL.
7. Test with a sample vulnerability payload (a hedged example follows this entry).

⚠️ This template is intended for Self-Hosted instances only.
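For step 7, the snippet below shows one possible test payload and mirrors the severity-based routing the workflow's branching nodes perform. The field names ("severity", "service", "cve") are illustrative assumptions; real Snyk, Wiz, or SonarQube webhook payloads use their own schemas.

```typescript
// Illustrative test payload and routing logic; not the scanners' actual schemas.
interface VulnerabilityAlert {
  cve: string;
  severity: "critical" | "high" | "medium" | "low";
  service: string;      // identifier the Port catalog lookup can resolve
  description: string;
}

const sample: VulnerabilityAlert = {
  cve: "CVE-2024-0001",
  severity: "critical",
  service: "payments-api",
  description: "Example finding used to exercise the routing branches.",
};

// Mirrors the severity-based routing described above.
function route(alert: VulnerabilityAlert): string[] {
  switch (alert.severity) {
    case "critical":
      return ["jira:highest", "claude-code-fix-if-auto-fixable", "slack-alert"];
    case "high":
      return ["jira:high", "slack-notification"];
    default:
      return ["jira:tracking-only"];
  }
}

console.log(route(sample)); // ["jira:highest", "claude-code-fix-if-auto-fixable", "slack-alert"]
```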
by MUHAMMAD SHAHEER
## What It Does
Build your own AI Chatbot that listens, thinks, searches, and speaks — all inside n8n. This template combines Groq AI, a LangChain Agent, SerpAPI, and StreamElements TTS to create a chatbot that:
- Understands natural language input
- Searches the web for real-time answers
- Remembers previous messages (context memory)
- Replies with a realistic AI voice

## How It Works
1. **Chat Trigger**: The workflow activates whenever a new message is received.
2. **Groq AI Agent**: Processes user input, performs reasoning, and integrates with SerpAPI for live web results.
3. **Memory Node**: Keeps the chat context for a natural conversation flow.
4. **TTS Node**: Converts AI responses into realistic voice replies using the StreamElements API.

## Setup Steps
1. Connect your Groq, SerpAPI, and StreamElements credentials (no coding required).
2. Customize the chatbot behavior directly inside n8n.
3. Deploy instantly and chat via webhook or UI widget.

## Use Cases
- Voice-enabled customer-support bots
- AI chat widgets for websites
- Personal assistants that talk and search the web
by 1Shot API
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Monetize Your Private LLM Models with x402 and Ollama
Self-hosting custom LLMs is becoming more popular and easier with turn-key inferencing tools like Ollama. With Ollama you can host your own proprietary models for customers in a private cloud or on your own hardware. But monetizing custom-trained, proprietary models is still a challenge, requiring integrations with payment processors like Stripe, which don't support micropayments for on-demand API consumption. With this free workflow you can quickly monetize your proprietary LLM models with the x402 payment scheme in n8n with 1Shot API.

## Setup Steps
1. Authenticate the 1Shot API node against your 1Shot API business account.
2. Point the 1Shot API simulate and execute nodes at the x402-compatible payment token you'd like to receive as payment.
3. Configure the Ollama n8n node in the workflow (with optional authentication) to forward to your Ollama API endpoint, and let users query it through an n8n webhook endpoint while paying you directly in your preferred stablecoin (like USDC).

Through x402, users and AI agents can pay per inference, with no overhead wasted on centralized payment processors.

## Walkthrough Tutorial
Check out the YouTube tutorial for this workflow to see the full end-to-end process.
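As a rough sketch of the forwarding step, once the x402 payment is verified the user's prompt can be relayed to a local Ollama instance with a single HTTP call. It assumes Ollama is listening on its default port (11434); "my-private-model" is a placeholder for your own model tag, not part of this template.

```typescript
// Relay a paid inference request to Ollama's generate endpoint (default port 11434).
async function runInference(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "my-private-model", // placeholder model tag
      prompt,
      stream: false,             // return one JSON response instead of a stream
    }),
  });

  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.response;          // non-streaming responses carry the generated text here
}

runInference("Summarize the x402 payment scheme in one sentence.")
  .then(console.log)
  .catch(console.error);
```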
by Daniel Rosehill
This workflow provides a way to capture detailed AI prompts using a voice note transcription service and then passes them on for completion to an AI agent. To preserve outputs in a knowledge management system, the AI response and the prompt are combined into one document that is created in a Nuclino collection (note: the Nuclino step is configured manually with an HTTP Request node).

## How it works
1. A webhook receives voice note data from Voicenotes.com containing the title and transcript.
2. The transcript is extracted and sent to an AI Agent powered by OpenRouter's Claude Sonnet model.
3. The AI generates a structured response in markdown format with Summary, Prompt, and Response sections.
4. The original prompt and AI response are merged and prepared for multiple outputs.
5. A Nuclino document is created via HTTP Request with the structured content.
6. A Slack notification is sent with the prompt, response, and Nuclino note URL.
7. Both the original prompt and AI response are archived in NocoDB for future reference.

## How to use
- The webhook trigger can be configured to receive data from Voicenotes.com or any service that provides title and transcript data.
- Replace the manual trigger with webhook, form, or other triggers as needed.
- Customize the AI system message to change response format and behavior.
- Configure Nuclino workspace and collection IDs for proper document organization.

## Requirements
- **OpenRouter account** for AI model access (Claude Sonnet)
- **Nuclino account** and API token for document creation
- **Slack workspace** with bot permissions for notifications
- **NocoDB instance** for archiving (optional)
- **Voicenotes.com account** for voice input (or alternative webhook source)

## Customising this workflow
- **AI Models**: Switch between different OpenRouter models by changing the model parameter.
- **Response Format**: Modify the AI Agent system message to change output structure.
- **Documentation Platforms**: Replace the Nuclino HTTP Request with other documentation APIs.
- **Notification Channels**: Add multiple Slack channels or other notification services.
- **Archive Storage**: Replace NocoDB with other database solutions.
- **Input Sources**: Adapt the webhook to accept data from different voice note or transcription services.

## Nuclino API
The Nuclino API is documented here. A hedged sketch of the document-creation request is shown below.
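The following sketch approximates the manual Nuclino HTTP Request step. The endpoint and body fields (workspaceId, title, content) are assumptions based on Nuclino's v0 REST API; verify them against the linked documentation and replace the placeholder IDs with your own values.

```typescript
// Create a Nuclino item holding the combined Prompt + Response markdown.
// Field names and auth scheme are assumptions to check against Nuclino's docs.
const NUCLINO_API_KEY = process.env.NUCLINO_API_KEY ?? "";

async function createNuclinoDoc(title: string, markdown: string): Promise<void> {
  const res = await fetch("https://api.nuclino.com/v0/items", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: NUCLINO_API_KEY, // assumption: raw API key in the Authorization header
    },
    body: JSON.stringify({
      workspaceId: "your-workspace-id", // placeholder: your workspace or collection ID
      title,                            // e.g. the voice note's title
      content: markdown,                // combined Prompt + Response markdown
    }),
  });

  if (!res.ok) throw new Error(`Nuclino returned ${res.status}`);
}
```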
by swathi
## The problem
Ever attend a networking event and find yourself taking screenshots of people's LinkedIn? Sounds counter-intuitive because you are connecting on LinkedIn. But you find it hard to keep track of everyone you've met. You also don't want to miss diligently updating your CRM with details and insights.

## The solution
There's no need for yet another app. Continue taking screenshots. Just share them on a 2-field-only Google Form: screenshot + your quick notes about the person. Create a shortcut to the Google Form link on your phone homescreen. Voila! You have app-like access without the need for an app. Once you submit with just these 2 pieces of info, AI parses the image AND crafts a follow-up message. Within minutes! Just open your spreadsheet to have all that information consolidated - automatically - for your review. Promote yourself from do-er to manager.

## Who should use it?
Anyone really. If you find yourself meeting people but want to be more meticulous or efficient staying on top, use this.

## How to set it up
Time: ~10 minutes end-to-end.
1. Import the provided workflow JSON in n8n.
2. Connect credentials: Google Drive (read), Google Sheets (write), OpenAI.
3. Configure key information: Google Sheets and relevant columns.
4. Configure OpenAI models based on your cost/efficiency requirements.
5. Confirm column headers in your Sheet match the variables (or update the variables).
6. Test with one screenshot.

Pro-tip: Add that Google Form link as a shortcut on your phone's home screen. Get app-like convenience without downloading yet another app.
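The "AI parses the image" step could look roughly like the sketch below, which sends the screenshot to OpenAI's vision-capable chat API. The model name and prompt wording are illustrative choices, not the template's exact configuration.

```typescript
// Extract contact details and draft a follow-up from a LinkedIn screenshot.
// Model and prompt are assumptions; swap in whatever the workflow is configured with.
const OPENAI_API_KEY = process.env.OPENAI_API_KEY ?? "";

async function parseLinkedInScreenshot(imageBase64: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text: "Extract the person's name, headline, and company from this LinkedIn screenshot, then draft a short follow-up message.",
            },
            {
              type: "image_url",
              image_url: { url: `data:image/png;base64,${imageBase64}` },
            },
          ],
        },
      ],
    }),
  });

  const data = await res.json();
  return data.choices[0].message.content; // structured details plus follow-up draft
}
```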
by Msaid Mohamed el hadi
# 🧠 Browsing History Automation Analyzer – Automation Toolkit (Google Sheets + AI)
This n8n workflow analyzes your browsing history to identify opportunities for automation. It reads history from a Google Sheet, groups visits by domain, filters out irrelevant entries, and uses AI to recommend what can be automated — including how and why.

## 📌 What It Does
- 📄 Reads your browsing history from Google Sheets
- 🌐 Groups history by domain
- 🚫 Filters out common non-actionable domains (e.g., YouTube, Google)
- 🤖 Uses AI to analyze whether your activity on each site is automatable
- 💡 Provides suggestions including what to automate, how to do it, and which tools to use
- 📝 Saves results into a new tab in the same Google Sheet
- 🔍 Searches for n8n workflow templates related to the suggested automation

## 📊 Demo Sheet
Input and output are handled via the following Google Sheet:
📎 Spreadsheet: View on Google Sheets
- Sheet: **history** → Input browsing history
- Sheet: **automations** → Output AI automation suggestions

## 🧠 AI Analysis Logic
The AI agent receives each domain's browsing history and responds with:
- domain: The website domain
- automatable: true/false
- what_to_automate: Specific actions that can be automated
- reason: Why it's suitable (or not) for automation
- tool: Suggested automation tool (e.g., n8n, Apify)
- automation_rating: High, Medium, Low, or Not Automatable
- n8n_template: Relevant automation template (if found)

## 🔧 Technologies Used
| Tool | Purpose |
|------|---------|
| n8n | Workflow automation |
| LangChain AI Agent | AI-based analysis |
| Google Sheets Node | Input/output data handling |
| OpenRouter (LLM) | Language model for intelligent reasoning |
| JavaScript Code Node | Grouping and formatting logic |
| Filter Node | Remove unwanted domains |
| HTTP Request Node | Search n8n.io templates |

## 💻 Chrome History Export
You can use this Chrome extension to export your browsing history in a format compatible with the workflow:
🔗 Export Chrome History Extension

## 📧 Want Personalized Automation Advice?
If you'd like personalized automation recommendations based on your browsing history—just like what this workflow provides—feel free to contact me directly:
> 📩 msaidwolfltd@gmail.com

I'll help you discover what tasks you can automate to save time and boost productivity.

## 🚀 Example Use Cases
- Automate daily logins to dashboards
- Auto-fill forms on repetitive websites
- Schedule data exports from web portals
- Trigger reminders based on recurring visits
- Discover opportunities for scraping and integration

## 📜 License
This workflow is provided as-is for educational and personal use. For commercial or customized use, contact the author.
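A small sketch of what the JavaScript Code Node's grouping logic could look like: history rows are bucketed by domain so the AI agent analyzes one site at a time. The row shape (url, title, visitTime) is an assumption about the exported history format, not the workflow's exact schema.

```typescript
// Group browsing-history rows by domain; skip rows with malformed URLs.
interface HistoryRow {
  url: string;
  title: string;
  visitTime: string;
}

function groupByDomain(rows: HistoryRow[]): Record<string, HistoryRow[]> {
  const groups: Record<string, HistoryRow[]> = {};
  for (const row of rows) {
    let domain: string;
    try {
      domain = new URL(row.url).hostname.replace(/^www\./, "");
    } catch {
      continue; // skip malformed URLs
    }
    (groups[domain] ??= []).push(row);
  }
  return groups;
}

const sample: HistoryRow[] = [
  { url: "https://www.example.com/login", title: "Login", visitTime: "2024-01-01T09:00:00Z" },
  { url: "https://example.com/dashboard", title: "Dashboard", visitTime: "2024-01-01T09:01:00Z" },
];

console.log(groupByDomain(sample)); // { "example.com": [ ...two visits... ] }
```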
by Dart
## Task-based Assignee billing via Time Tracking
This workflow automates billing by scanning a target Dartboard on a schedule, aggregating time logs from completed tasks, cross-referencing assignee rates in Google Sheets, calculating total pay, and updating the sheet with final billable hours and amounts.

## Who's it for
Individuals, agencies, companies, and project managers automating payroll or client invoicing from task data.

## How to set up
1. Link your Dart and Google accounts.
2. Replace the dummy ID in the List tasks node with your actual target Dartboard ID.
3. Set your preferred run frequency (e.g., weekly).
4. Create a Google Sheet with these exact headers: Name, HourlyRate, TotalHours, TotalPay, DateCalculated. Connect the Sheet nodes to your file.
5. Pre-fill Name (matching Dart Assignees exactly) and HourlyRate in your Google Spreadsheet.
6. Optional: Add a final column in the sheet as a Status header to track whether the bill is paid or pending.

## Customizing the workflow
- Choose your AI model for the AI time tracking and assignee scanner.
- Use your own Google Sheets account and target spreadsheet document.
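The pay calculation reduces to summing hours per assignee and multiplying by the HourlyRate pre-filled in the sheet, as in the sketch below. The task and rate shapes are illustrative, not the workflow's actual node output.

```typescript
// Aggregate completed-task hours per assignee and compute the billable amount.
interface CompletedTask {
  assignee: string;
  hoursLogged: number;
}

function calculatePay(
  tasks: CompletedTask[],
  hourlyRates: Record<string, number>,
): Array<{ Name: string; TotalHours: number; TotalPay: number; DateCalculated: string }> {
  const totals: Record<string, number> = {};
  for (const task of tasks) {
    totals[task.assignee] = (totals[task.assignee] ?? 0) + task.hoursLogged;
  }
  return Object.entries(totals).map(([name, hours]) => ({
    Name: name,                                 // must match the Dart assignee exactly
    TotalHours: hours,
    TotalPay: hours * (hourlyRates[name] ?? 0), // unknown assignees bill at 0
    DateCalculated: new Date().toISOString().slice(0, 10),
  }));
}

const rows = calculatePay(
  [{ assignee: "Jane Doe", hoursLogged: 3.5 }, { assignee: "Jane Doe", hoursLogged: 1.5 }],
  { "Jane Doe": 60 },
);
console.log(rows); // [{ Name: "Jane Doe", TotalHours: 5, TotalPay: 300, ... }]
```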
by Nima Salimi
## Overview
This n8n workflow automatically fetches the Forex Factory calendar for yesterday using RapidAPI, then saves the data to a connected Google Sheet and sends Telegram alerts for high- and medium-impact events. It runs daily on a schedule, collecting key fields such as currency, time, impact, and market indicators, and organizes them for easy tracking and analysis. Perfect for forex traders and analysts who need quick access to reliable market data from the previous day's events.

## ✅ Tasks
- ⏰ Runs automatically every day
- 🌐 Fetches yesterday's Forex Factory calendar via RapidAPI
- 🧾 Collects key data fields: year, date, time, currency, impact, actual, forecast, previous
- 📊 Saves all records to Google Sheets for tracking and analysis
- 🚨 Sends Telegram alerts for high- and medium-impact events
- ⚙️ Keeps your market data updated and organized with no manual work required

## 🛠 How to Use
1. 📄 **Create a Google Spreadsheet**: Create a new spreadsheet in Google Sheets and add two sheets: High Impact and Low Impact. Connect it to your Google Sheets nodes in n8n.
2. 🌐 **Find the API on RapidAPI**: Go to RapidAPI and search for Forex Factory Scraper. Subscribe to the API to get your access key.
3. 🔑 **Connect RapidAPI to n8n**: In your HTTP Request node, add your RapidAPI authentication header to the request (a hedged example of the typical RapidAPI headers follows this entry).
4. 💬 **Add Your Telegram Chat ID**: In the Telegram node, paste your Chat ID to receive daily alerts for high-impact news.
5. 🕒 **Activate the Workflow**: Enable the Schedule Trigger to run daily. The workflow will automatically fetch yesterday's Forex Factory calendar, save it to Google Sheets, and send Telegram notifications.
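For step 3, RapidAPI-hosted endpoints are conventionally authenticated with X-RapidAPI-Key and X-RapidAPI-Host headers. The host value and URL path below are placeholders, not the actual Forex Factory Scraper endpoint; copy the real values from the API's page on RapidAPI.

```typescript
// Typical RapidAPI authentication headers on an HTTP request (placeholder endpoint).
const RAPIDAPI_KEY = process.env.RAPIDAPI_KEY ?? "";

async function fetchCalendar(): Promise<unknown> {
  const res = await fetch("https://forex-factory-scraper.p.rapidapi.com/calendar", {
    headers: {
      "X-RapidAPI-Key": RAPIDAPI_KEY,
      "X-RapidAPI-Host": "forex-factory-scraper.p.rapidapi.com", // placeholder host
    },
  });
  if (!res.ok) throw new Error(`RapidAPI returned ${res.status}`);
  return res.json();
}
```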
by phil
This workflow automates the search and extraction of hotel data from Booking.com. Triggered by a chat message, it uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels. The final output is a clean and formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation options.

## Who's it for
This template is ideal for:
- **Event Planners:** Quickly identify and compare hotel options for conferences, meetings, or group travel.
- **Travel Agents:** Efficiently research and provide clients with a curated list of accommodations based on their specified destination.
- **Business Travelers:** Instantly find and assess hotel availability and pricing for upcoming trips.
- **Individuals:** Streamline the hotel search process for personal vacations or short-term stays.

## How it works
1. The workflow is triggered by a chat message containing a city name from an n8n chat application.
2. It uses Bright Data to initiate a web scraping job on Booking.com for the specified city.
3. The workflow continuously checks the status of the scraping job. Once the data is ready, it downloads the snapshot (a hedged sketch of this polling step follows this entry).
4. The extracted data is then passed to a custom AI agent powered by OpenRouter. This AI agent uses a calculator tool to convert prices and an instruction prompt to refine and format the raw data.
5. The final output is a well-presented list of hotels, ready for display in the chat application.

## How to set up
1. **Bright Data credentials**: Sign up for a Bright Data account and create a Web Scraper dataset. In n8n, create new Bright Data API credentials and copy your API key.
2. **OpenRouter credentials**: Create an account on OpenRouter and get your API key. In n8n, create new OpenRouter API credentials and paste your key.
3. **Chat Trigger node**: Configure the "When chat message received" node. Copy the production webhook URL to integrate with your preferred chat platform.

## Requirements
- An active n8n instance.
- A Bright Data account with a Web Scraper dataset.
- An OpenRouter account with API access.

## How to customize this workflow
- **Search Parameters:** The "Initiate batch extraction from URL" node can be modified to change search criteria, such as check-in/check-out dates, number of adults and children, or property type.
- **Output Format:** Edit the "Human Friendly Results" node's system message to change the format of the final report. You can modify the prompt to generate a JSON object, a CSV, or a different text format.
- **Price Conversion:** The "Calculator" tool can be adjusted to perform different mathematical operations or currency conversions by modifying the AI agent's prompt.

Phil | Inforeole | LinkedIn 🇫🇷 Contact us to automate your processes.
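The status-check and snapshot-download step could look roughly like the sketch below. The endpoint paths follow Bright Data's datasets API as commonly used, but they are assumptions here; confirm them against Bright Data's documentation before relying on them.

```typescript
// Poll a Bright Data scraping job until it is ready, then download the snapshot.
// Endpoint paths and the "ready" status value are assumptions to verify.
const BRIGHTDATA_TOKEN = process.env.BRIGHTDATA_TOKEN ?? "";
const headers = { Authorization: `Bearer ${BRIGHTDATA_TOKEN}` };

async function waitForSnapshot(snapshotId: string): Promise<unknown[]> {
  // Poll until the scraping job reports completion.
  for (;;) {
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`,
      { headers },
    ).then((r) => r.json());

    if (progress.status === "ready") break;                       // assumed status value
    await new Promise((resolve) => setTimeout(resolve, 30_000));  // wait 30s between checks
  }

  // Download the finished snapshot as JSON records (one per hotel).
  const res = await fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshotId}?format=json`,
    { headers },
  );
  return res.json();
}
```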
by Robin Geuens
## Overview
Use this workflow to create SEO-friendly outlines based on articles that do well in Google. Enter a keyword, and the workflow scrapes the top results, scrapes the content, analyzes it with AI, and builds a MECE (mutually exclusive, collectively exhaustive) outline. It's useful for content creators and SEO specialists who want relevant, well-structured content.

## How it works
1. Accepts a keyword submitted through a form.
2. Uses SerpAPI to get top Google results for a chosen country.
3. Collects the top five URLs. We use five because we expect some to fail at the scraping stage.
4. Scrapes each URL separately.
5. Uses the first three articles to fit the AI model's context window.
6. Extracts the main text from the page body.
7. Converts HTML to Markdown to get rid of tags and attributes.
8. Combines the cleaned text into a single list for AI processing.
9. Analyzes the content with an AI language model to find common topics and headings.
10. Generates an SEO-focused outline based on the most frequent topics.

## Setup steps
1. Sign up for a SerpAPI account (free tier available).
2. Create an OpenAI account and get an API key.
3. Set up your credentials within n8n.
4. Run the workflow and enter your keyword in the form. The workflow will generate an SEO-friendly outline for your content.

## Improvement ideas
- Add another LLM to turn the outline into an article.
- Use the Google Docs API to add the outline to a Google Doc.
- Enrich the outline with data from Perplexity or Tavily.
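Steps 6 and 7 boil down to pulling the page body out of the HTML and stripping tags by converting to Markdown. n8n has a built-in Markdown node for this; the sketch below shows equivalent logic using the "turndown" package (npm install turndown) and a naive body extraction, both illustrative choices rather than the template's exact nodes.

```typescript
// Extract the <body>, drop scripts/styles, and convert the rest to Markdown.
import TurndownService from "turndown";

function htmlToCleanMarkdown(html: string): string {
  // Keep only the body so <head> metadata and navigation chrome drop out.
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  const body = bodyMatch ? bodyMatch[1] : html;

  const withoutScripts = body.replace(/<(script|style)[\s\S]*?<\/\1>/gi, "");
  return new TurndownService({ headingStyle: "atx" }).turndown(withoutScripts);
}

const sample = "<html><body><h2>Pricing</h2><p>Plans start at <b>$10</b>.</p></body></html>";
console.log(htmlToCleanMarkdown(sample));
// ## Pricing
// Plans start at **$10**.
```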
by Luka Zivkovic
## Description
A production-ready authentication workflow implementing secure user registration, login, token verification, and refresh token mechanisms. Perfect for adding authentication to any application without needing a separate auth service. Get started with n8n now!

## What it does
This template provides a complete authentication backend using n8n workflows and Data Tables:
- **User Registration**: Creates accounts with secure password hashing (SHA-512 + unique salts)
- **Login System**: Generates access tokens (15 min) and refresh tokens (7 days) using JWT
- **Token Verification**: Validates access tokens for protected endpoints
- **Token Refresh**: Issues new access tokens without requiring re-login
- **Security Features**: HMAC-SHA256 signatures, hashed refresh tokens in the database, protection against rainbow table attacks

## Why use this template
- **No external services**: Everything runs in n8n - no Auth0, Firebase, or third-party dependencies
- **Production-ready security**: Industry-standard JWT implementation with proper token lifecycle management
- **Easy integration**: Simple REST API endpoints that work with any frontend framework
- **Fully customizable**: Adjust token lifespans, add custom user fields, implement your own business logic
- **Well-documented**: Extensive inline notes explain every security decision and implementation detail

## How to set up

### Prerequisites
- n8n instance (cloud or self-hosted)
- n8n Data Tables feature enabled

### Setup Steps
1. **Create Data Tables**:
   - users table: id, email, username, password_hash, refresh_token
   - refresh_tokens table: id, user_id, token_hash, expires_at
2. **Generate Secret Keys**: Run this command to generate a random secret: node -e "console.log(require('crypto').randomBytes(32).toString('hex'))" Generate two different secrets for ACCESS_SECRET and REFRESH_SECRET.
3. **Configure Secrets**: Update the three "SET ACCESS AND REFRESH SECRET" nodes with your generated keys, or migrate to n8n Variables for better security (instructions in workflow notes).
4. **Connect Data Tables**: Open each Data Table node and select your created tables from the dropdown.
5. **Activate Workflow**: Save and activate the workflow, then note your webhook URLs.

## API Endpoints
- Register: POST /webhook/register-user — Request body: { "email": "user@example.com", "username": "username", "password": "password123" }
- Login: POST /webhook/login — Request body: { "email": "user@example.com", "password": "password123" } Returns: { "accessToken": "...", "refreshToken": "...", "user": {...} }
- Verify Token: POST /webhook/verify-token — Request body: { "access_token": "your_access_token" }
- Refresh: POST /webhook/refresh — Request body: { "refresh_token": "your_refresh_token" }

## Frontend Integration Example (Vue.js/React)
Login flow:

```javascript
const response = await fetch('https://your-n8n.app/webhook/login', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ email, password })
});
const { accessToken, refreshToken } = await response.json();
localStorage.setItem('accessToken', accessToken);
```

Make authenticated requests:

```javascript
const data = await fetch('https://your-api.com/protected', {
  headers: { 'Authorization': `Bearer ${accessToken}` }
});
```

## Key Features
- **Secure Password Storage**: Never stores plain text passwords; uses SHA-512 with unique salts (see the sketch after this entry)
- **Two-Token System**: Short-lived access tokens (security) + long-lived refresh tokens (convenience)
- **Database Token Revocation**: Refresh tokens can be revoked for logout-all-devices functionality
- **Duplicate Prevention**: Checks username and email availability before account creation
- **Error Handling**: Generic error messages prevent information leakage
- **Extensive Documentation**: 30+ sticky notes explain every security decision

## Use Cases
- SaaS applications needing user authentication
- Mobile app backends
- Internal tools requiring access control
- MVP/prototype authentication without third-party costs
- Learning JWT and auth system architecture

## Customization
- **Token Lifespan**: Modify expiration times in the "Create JWT Payload" nodes
- **User Fields**: Add custom fields to registration and the user profile
- **Password Rules**: Update validation in the "Validate Registration Request" node
- **Token Rotation**: Implement refresh token rotation for enhanced security (notes included)

## Security Notes
⚠️ Important:
- Change the default secret keys before production use
- Use HTTPS for all webhook endpoints
- Store secrets in n8n Variables (not hardcoded)
- Regularly rotate secret keys in production
- Consider rate limiting for login endpoints

## Support & Documentation
The workflow includes comprehensive documentation:
- Complete authentication flow overview
- Security explanations for every decision
- Troubleshooting guide
- Setup instructions
- FAQ section with common issues

Perfect for developers who want full control over their authentication system without the complexity of managing separate auth infrastructure. Get Started with n8n now!

Tags: authentication, jwt, login, security, user-management, tokens, password-hashing, api, backend
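For readers who want to see the two cryptographic pieces described above in code, here is a compact sketch of salted SHA-512 password hashing and an HMAC-SHA256-signed JWT using only Node's built-in crypto module. It mirrors the approach the workflow describes but is not a copy of its Code nodes; the claim names and lifetimes are illustrative.

```typescript
// Salted SHA-512 hashing plus a minimal HS256 JWT, using node:crypto only.
import { createHash, createHmac, randomBytes } from "node:crypto";

// Password hashing: a unique salt per user defeats rainbow tables.
function hashPassword(password: string): { salt: string; hash: string } {
  const salt = randomBytes(16).toString("hex");
  const hash = createHash("sha512").update(salt + password).digest("hex");
  return { salt, hash };
}

// Minimal JWT: base64url(header).base64url(payload).HMAC-SHA256 signature.
function signJwt(payload: object, secret: string, expiresInSec = 900): string {
  const b64 = (obj: object) =>
    Buffer.from(JSON.stringify(obj)).toString("base64url");
  const header = b64({ alg: "HS256", typ: "JWT" });
  const body = b64({ ...payload, exp: Math.floor(Date.now() / 1000) + expiresInSec });
  const signature = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${signature}`;
}

const { salt, hash } = hashPassword("password123");
console.log({ salt, hash });
console.log(signJwt({ sub: "user-id-1" }, process.env.ACCESS_SECRET ?? "dev-secret"));
```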