by Julian Kaiser
## Turn Your Reading Habit into a Content Creation Engine

This workflow is built for one core purpose: to maximize the return on your reading time. It turns your passive consumption of articles and highlights into an active system for generating original content and rediscovering valuable ideas you may have forgotten.

## Why This Workflow is Valuable

- **End Writer's Block Before It Starts:** This workflow is your personal content strategist. Instead of staring at a blank page, you'll start your week with a list of AI-generated content ideas, from LinkedIn posts and blog articles to strategic insights, all based on the topics you're already deeply engaged with. It finds the hidden connections between articles and suggests novel angles for your next piece.
- **Rescue Your Insights from the Digital Abyss:** Readwise is fantastic for capturing highlights, but the best ones can get lost over time. This workflow acts as your personal curator, automatically excavating the most impactful quotes and notes from your recent reading. It doesn't just show them to you; it contextualizes them within the week's key themes, giving them new life and relevance.
- **Create an Intellectual Flywheel:** By systematically analyzing your reading, generating content ideas, and saving those insights back into your "second brain," you create a powerful feedback loop. Your reading informs your content, and the process of creating content deepens your understanding, making every reading session more valuable than the last.

## How it works

This workflow automates the process of generating a "Weekly Reading Insights" summary based on your activity in Readwise.

1. **Trigger:** Runs manually or on a weekly schedule.
2. **Fetch Data:** Fetches all articles and highlights you've updated in the last 7 days from your Readwise account.
3. **Filter & Match:** Filters for articles that you've read more than 10% of, then finds all the corresponding highlights for those articles (see the sketch at the end of this section).
4. **Generate Insights:** Constructs a detailed prompt with your reading data and sends it to an AI model (via OpenRouter) to create a structured analysis of your reading patterns, key themes, and content ideas.
5. **Save to Readwise:** Finally, takes the AI-generated markdown, converts it to HTML, and saves it back to your Readwise account as a new article titled "Weekly Reading Insights".

## Set up steps

**Estimated set up time:** 5-10 minutes.

1. **Readwise credentials:** Authenticate the two HTTP Request nodes and the two Fetch nodes with your Readwise API token (obtained from the Reader API), set up as Header Auth.
2. **AI model credentials:** Add your OpenRouter API key to the OpenRouter Chat Model node. You can swap this for any other AI model if you prefer.
3. **Customize the prompt:** Open the Prepare Prompt Code node to adjust the persona, questions, and desired output format. This is where you can tailor the AI's analysis to your specific needs.
4. **Adjust the schedule:** Modify the Monday - 09:00 Schedule Trigger to run on your preferred day and time.
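For reference, the Filter & Match step amounts to a threshold filter plus a join between the two Readwise responses. A minimal Code-node sketch, assuming hypothetical upstream node names (`Fetch Articles`, `Fetch Highlights`) and Readwise-style field names (`reading_progress`, `book_id`); adjust these to the actual JSON your account returns:

```javascript
// n8n Code node (JavaScript): keep articles read past 10% and attach
// their highlights. Node and field names below are assumptions.
const articles = $('Fetch Articles').all().map(i => i.json);
const highlights = $('Fetch Highlights').all().map(i => i.json);

// Articles with more than 10% reading progress
const read = articles.filter(a => (a.reading_progress ?? 0) > 0.1);

// Attach each article's highlights by matching IDs
return read.map(article => ({
  json: {
    ...article,
    highlights: highlights.filter(h => h.book_id === article.id),
  },
}));
```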
by Țugui Dragoș
## How it works

This workflow fetches articles from any RSS feed, processes them with an AI model (DeepSeek), and sends only the most relevant alerts directly to Slack.

- Normalizes and deduplicates RSS items (see the sketch at the end of this section)
- Extracts article text and cleans HTML
- Summarizes and classifies with AI (sentiment + flags)
- Filters out irrelevant news
- Sends real-time alerts to your Slack channel

## Setup steps

1. Add your Slack Bot Token (via Slack API)
2. Add your DeepSeek API Key
3. Import this workflow into n8n
4. Deploy and start receiving smart news alerts in Slack

## Use case

Perfect for tracking AI, startups, finance, and tech industry news without the noise.
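A minimal sketch of the normalize-and-deduplicate step as an n8n Code node, keying on whichever stable identifier the feed provides (the exact field the template keys on may differ):

```javascript
// n8n Code node (JavaScript): normalize RSS items and drop duplicates
// within the current batch. Deduplicating across runs would additionally
// need workflow static data or an external store.
const seen = new Set();
const output = [];

for (const item of $input.all()) {
  const { title, link, guid, pubDate } = item.json;
  const key = guid || link || title;   // prefer the most stable identifier
  if (seen.has(key)) continue;         // skip items already seen in this batch
  seen.add(key);
  output.push({ json: { title: (title || '').trim(), link, pubDate } });
}

return output;
```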
by Abrar Sami
## How it works

1. Fetches a blog post's HTML from your blog URL using an HTTP Request node
2. Extracts readable content using Cheerio (Code node), as sketched at the end of this section
3. Saves the raw blog text to Airtable
4. Translates the content to a language of your choice using Google Translate
5. Updates the same Airtable record with the translated version in a different column

## Set up steps

- **Estimated setup time:** 15–20 minutes (includes connecting Airtable and Google Translate credentials)
- You'll need an Airtable base with HTML and TRANSLATED fields, or use this pre-made base: Airtable Template
- Simply add your blog post URL inside the HTTP Request node
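A minimal sketch of the Cheerio extraction step, assuming the HTTP Request node's HTML lands in a `data` field and that your blog wraps its content in an `article` element (adjust the selectors to your markup). On self-hosted n8n, requiring Cheerio in a Code node needs the `NODE_FUNCTION_ALLOW_EXTERNAL=cheerio` environment variable:

```javascript
// n8n Code node (JavaScript): extract readable text from the fetched HTML.
// `$page` names the Cheerio instance because `$` is n8n's own helper.
const cheerio = require('cheerio');

const html = $input.first().json.data;        // assumed HTTP Request output field
const $page = cheerio.load(html);

$page('script, style, nav, footer').remove(); // drop non-content elements

// Prefer a semantic container, fall back to the whole body
const text = ($page('article').text() || $page('body').text())
  .replace(/\s+/g, ' ')
  .trim();

return [{ json: { text } }];
```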
by Harshil Agrawal
This workflow allows you to create transcription jobs for all your audio and video files stored in AWS S3.

- **AWS S3:** This node retrieves all the files from an S3 bucket you specify.
- **AWS Transcribe:** This node creates a transcription job for each file returned by the previous node.
by Jonathan
**Task:** Merge two datasets into one based on matching rules.

**Why:** A powerful capability of n8n is to easily branch out the workflow in order to process different datasets. Even more powerful is the ability to join them back together with SQL-like joining logic.

**Main use cases:**

- Appending data sets
- Keep only new items
- Keep only existing items

A sketch of how these three modes map onto join logic follows below.
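Conceptually, the three modes correspond to SQL-style joins. A minimal standalone JavaScript illustration with made-up sample data, matching on a shared `id` field:

```javascript
// Two branches of a workflow, rejoined by a common key (`id`).
const left  = [{ id: 1, name: 'Ada' }, { id: 2, name: 'Grace' }];
const right = [{ id: 2, email: 'g@example.com' }, { id: 3, email: 'a@example.com' }];

const rightIds = new Set(right.map(r => r.id));

// Append: concatenate both datasets
const appended = [...left, ...right];

// Keep only new items: left rows with no match in right (anti join)
const onlyNew = left.filter(l => !rightIds.has(l.id));

// Keep only existing items: left rows that do match (semi join)
const existing = left.filter(l => rightIds.has(l.id));

console.log({ appended, onlyNew, existing });
```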
by jan
Shows how it is possible to use the data of a command-line tool which returns JSON.

The example shows how:

- to bring the data into the flow
- to use the data directly in a node

Note that the Execute Command node is not available on n8n Cloud.
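As an illustration, a Code node placed after an Execute Command node can parse the tool's JSON output into regular workflow items. A minimal sketch; `some-tool --output json` is a hypothetical command, and `stdout` is the field where the Execute Command node puts the captured output:

```javascript
// n8n Code node (JavaScript): turn a CLI's JSON output into workflow items.
const stdout = $input.first().json.stdout;

// Assumes the command printed a JSON value, e.g. `some-tool --output json`
const parsed = JSON.parse(stdout);

// Wrap a single object, or fan an array out into one item per element
return (Array.isArray(parsed) ? parsed : [parsed]).map(entry => ({ json: entry }));
```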
by Kumar Shivam
The AI-powered MIS Agent is an intelligent, automated system built using n8n that streamlines email-based data collection and document organization for businesses. It classifies incoming emails, extracts and processes attachments or Drive links, and routes them to the correct destination folders in Google Drive. Additionally, it provides advanced file operations like cleaning, merging, joining, and transforming data.

## Advantages

- 📥 **Automated Email and File Management:** Detects and processes emails containing attachments or Drive links, ensuring seamless classification and routing of business-critical files.
- 🧠 **AI-Based Classification:** Uses LLMs (like GPT-4o Mini) to classify emails into categories such as Daily Sales, Customer Info, and Address based on their content.
- 📂 **Smart File Routing and Upload:** Recognizes whether a file is a direct attachment or a Google Drive link, extracts the file ID if necessary, and uploads it to predefined folders.
- 📊 **Powerful Data Operations:** Supports operations like append, join, group by, aggregation, and standardization of data directly from spreadsheets using Python and Pandas within the workflow.
- 🔁 **Scheduled and Triggered Automation:** Supports scheduled runs and real-time email triggers, making it highly reliable and timely.
- 🔧 **Fully Modular and Scalable:** Easily expandable with more logic, new folders, or different workflows. Clean architecture and annotations make maintenance simple.

## How It Works

1. **Email Trigger:** The system uses a Gmail trigger to monitor incoming emails with specific labels or attachments.
2. **Classification:** An LLM-based text classifier identifies the purpose of the email (e.g., sales data, address list, customer details).
3. **Conditional Logic:** Regex-based conditions check if the email contains Google Drive links or attachments (see the sketch after the "Ideal For" list).
4. **File Handling:** If it's a Drive link, it extracts the file ID and copies the file to the correct folder; if it's an attachment, it uploads the file directly.
5. **Scheduled Data Management:** Periodically moves or logs files from predefined folders using a schedule trigger.
6. **Data Cleaning and Processing:** Performs data cleaning and transformation tasks like replacing missing values, standardizing formats, and joining datasets based on criteria provided by the user.
7. **Final Output:** Cleaned and processed files are saved in designated folders, with their public links shared back through the system.

## Set Up Steps

1. **Configure nodes:**
   - Gmail Trigger: Detects relevant incoming emails.
   - Text Classifier: Uses an OpenAI model to categorize email content.
   - Regex Conditions: Determine whether a link or attachment is present.
   - Google Drive Operations: Upload or copy files to categorized folders.
   - Python Nodes: Handle data manipulation using Pandas.
   - Google Sheets Nodes: Extract, clean, and write structured data.
   - LLM-based Chat Models: Extract and apply cleaning configurations.
2. **Connect nodes:** Seamlessly connect Gmail inputs, classification, file processing, and data logic. Output links or processed files are uploaded back to Drive, ready to share.
3. **Credentials:** Ensure OAuth credentials for Gmail, Google Drive, and OpenAI are correctly set.

## Ideal For

- Sales & CRM teams managing large volumes of email-based reports.
- Data teams needing structured pipelines from unstructured email inputs.
- Businesses looking to automate classification, storage, and transformation of routine data.
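A minimal sketch of the Drive-link detection described under Conditional Logic and File Handling, as an n8n Code node. The regex covers the two common Drive URL shapes; the `text` field holding the email body is an assumption:

```javascript
// n8n Code node (JavaScript): detect a Google Drive link in an email
// body and extract the file ID for the copy-to-folder step.
const body = $input.first().json.text || '';   // assumed email body field

const match =
  body.match(/drive\.google\.com\/file\/d\/([\w-]+)/) ||
  body.match(/drive\.google\.com\/open\?id=([\w-]+)/);

return [{
  json: {
    hasDriveLink: Boolean(match),
    fileId: match ? match[1] : null,   // null means: handle as attachment instead
  },
}];
```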
## Testing and Additional Customization

If you want to test this bot's capabilities before purchasing the workflow, ask me by mail at kumar.shivam19oce@gmail.com. I will share the chat URL and the links to the associated Google Drives so you can see the results; once you are satisfied, we are good to go. I have kept the price at just $1 for testing purposes because of the paid OpenAI usage.

If any customization is needed, like charts or other requests such as adding databases, feel free to let me know and I can do it accordingly. This is the first version; I will come with more advancements based on requests and responses. Use it and let me know at kumar.shivam19oce@gmail.com.
by Harshil Agrawal
This workflow allows you to receive updates when a new contact is added in Autopilot and add them to a base in Airtable.

- **Autopilot Trigger node:** Triggers the workflow when a new contact is added in Autopilot.
- **Set node:** Ensures that only the data we set in this node gets passed on to the next nodes in the workflow.
- **Airtable node:** Stores the data coming from the previous node in a table in Airtable.
by Nicholas Lewanowicz
A goal for 2022 is to write down one thing I do each day. This workflow will automatically remind you on Telegram to write something you did yesterday. Optionally, you can enable the second workflow, which allows you to reply to the message and have it recorded in a Google Sheet.

Note: Make sure to configure your Telegram credentials!
by Jacob @ vwork Digital
This workflow helps small business owners using Wave Apps to easily access the Wave Accounting API using n8n.

In this example, the workflow is triggered by a new payout from Stripe. It then logs the transaction as a journal entry in Wave Accounting, helping you automate your accounting without needing to pay for expensive subscriptions!

## What this workflow template does

This workflow triggers when a payout is paid in Stripe and sends a GraphQL request to the Wave Accounting API, recording the payout as a journal entry automatically. The benefits of this workflow are that it keeps your books up to date instantaneously and ensures accuracy.

## How to set up

1. Set up your Stripe credential in n8n using the native Stripe nodes.
2. Follow this guide to get your Wave Apps authorization token.
3. Set up the node with the correct header auth: `{"Authorization": "Bearer TOKEN-HERE"}`
4. Get your account IDs from Wave.
5. The payload uses GraphQL, so update it according to what you are trying to achieve; the current example creates a journal entry (see the request sketch at the end of this section).

## Tips

**Getting Wave account IDs**

It is easiest to use network logs to obtain the right account IDs from Wave, especially if you have created custom accounts (default Wave account IDs can be obtained via the API).

1. Go to Wave and make a new dummy transaction using the accounts you want to use in n8n. Have the network logs open while you do this.
2. Search the network logs for the name of the account you are trying to use. You should see account IDs in the log data.

**Sales tax**

- This example uses sales tax baked into the total Stripe payout amount (5%).
- You can find the sales tax account IDs the same way as you found the Wave account IDs, using network logs.

**Use AI to help you with your API call**

Ask Claude or ChatGPT to draft your ideal payload.
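For orientation, a standalone Node.js sketch of the kind of request the workflow's HTTP node sends. The endpoint and mutation name follow Wave's public API documentation as best I recall, so verify them against the current schema before relying on this; the token and the `input` variables are placeholders you must fill in with your business and account IDs:

```javascript
// Node 18+ (ESM): global fetch and top-level await are available.
// Sending a GraphQL mutation to the Wave API with a Bearer token.
const query = `
  mutation ($input: MoneyTransactionCreateInput!) {
    moneyTransactionCreate(input: $input) {
      didSucceed
      inputErrors { message }
    }
  }
`;

const response = await fetch('https://gql.waveapps.com/graphql/public', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer TOKEN-HERE',   // your Wave authorization token
    'Content-Type': 'application/json',
  },
  // `input` must describe the journal entry: business ID plus line items
  // that use the account IDs collected from the network logs.
  body: JSON.stringify({ query, variables: { input: { /* ... */ } } }),
});

console.log(await response.json());
```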
by Konstantin
## How it works

This workflow powers an intelligent, conversational AI bot for VK that can understand and respond to both text and voice messages. The bot uses an AI agent with built-in memory, allowing it to remember the conversation history for each unique user (or each chat) and answer follow-up questions. It's a complete solution for creating an engaging, automated assistant within your VK group.

## Step-by-step

1. **VK Webhook (Trigger):** The workflow starts when the Webhook node receives a new message from your VK group.
2. **Duplicate Filtering:** The data immediately passes through the Filter Dubles node, which checks for the x-retry-counter header. This is a crucial step to prevent processing duplicate retry requests sent by the VK API.
3. **Voice or Text Routing:** A Voice/Text (Switch) node checks if the message contains text (message.text) or a voice attachment (audio_message.link_mp3).
4. **Voice Transcription:** If it's a voice note, the Get URL (HTTP Request) node downloads the audio file. The file is then passed to the Transcribe (OpenAI) node, which uses the Whisper model to convert the audio to text.
5. **Input Unification:** Both the original text (from the 'Text' path) and the newly transcribed text (from the 'Voice' path) are routed to the Set Prompt node. This node standardizes the input into a single prompt variable (see the sketch at the end of this section).
6. **AI Agent Processing:** The prompt variable is passed to the AI Agent. This agent is powered by an OpenAI Chat Model and uses Simple Memory to retain conversation history, using the VK peer_id as the sessionKey. This allows it to maintain a separate history for both private messages and group chats.
7. **Response Generation:** The successful AI response is passed to the Send to VK (HTTP Request) node, which sends the message back to the user.
8. **Error Handling:** The AI Agent node has error handling enabled (onError). If it fails, the flow is redirected to the Error (HTTP Request) node, which sends a fallback message to the user.

## Set up steps

Estimated set up time: 10 minutes

1. Add your OpenAI credentials to the OpenAI Chat Model and Transcribe nodes.
2. Add your VK group's API Bearer Token credentials to the two HTTP Request nodes named Send to VK and Error.
3. **Webhook Setup (Important!):** This is a two-stage process: confirmation and operation. Copy the Production Webhook URL from the Webhook node.
   - **Stage A: Confirm Address (one-time)**
     1. In the Webhook node settings, set Response Mode to On Received.
     2. In Options -> Response Data, temporarily paste the confirmation string that VK provides.
     3. Activate the workflow (toggle "Active" in the top-right).
     4. Paste the URL into your VK group's Callback API settings (Management -> API -> Callback API) and click "Confirm".
   - **Stage B: Operational Mode (permanent)**
     1. Return to the Webhook node and set Response Mode to Immediate.
     2. In Options -> Response Data, type the word ok (lowercase).
     3. Save and reactivate the workflow. The bot is now live.
4. (Optional) Customize the system prompt in the AI Agent node to define your bot's name and personality.
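A minimal sketch of the Input Unification step as a Code node: pick whichever of the two paths produced text, and key the agent's memory by peer_id. The VK payload path and the Transcribe node's output field are assumptions to check against your actual data:

```javascript
// n8n Code node (JavaScript): unify text and voice input into one prompt.
const msg = $('Webhook').first().json.body.object.message; // VK Callback API payload (assumed path)
const transcript = $input.first().json.text;               // Whisper transcription (assumed field)

return [{
  json: {
    prompt: msg.text || transcript,    // original text if present, else the transcription
    sessionKey: String(msg.peer_id),   // separate memory per user or chat
  },
}];
```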
by vinci-king-01
## How it works

This workflow automatically discovers and analyzes backlinks for any website, providing comprehensive SEO insights and competitive intelligence using AI-powered analysis.

### Key Steps

1. **Website Input** - Accepts target URLs via webhook or manual input for backlink analysis.
2. **Backlink Discovery** - Scrapes and crawls the web to find all backlinks pointing to the target website.
3. **AI-Powered Analysis** - Uses GPT-4 to analyze backlink quality, relevance, and SEO impact.
4. **Data Processing & Categorization** - Cleans, validates, and automatically categorizes backlinks by type, authority, and relevance.
5. **Database Storage** - Saves processed backlink data to a PostgreSQL database for ongoing analysis and reporting.
6. **API Response** - Returns a structured summary with backlink counts, domain authority scores, and SEO insights.

## Set up steps

Setup time: 8-12 minutes

1. **Configure OpenAI credentials** - Add your OpenAI API key for AI-powered backlink analysis.
2. **Set up PostgreSQL database** - Connect your PostgreSQL database and create the required table structure.
3. **Configure webhook endpoint** - The workflow provides an /analyze-backlinks endpoint for URL submissions.
4. **Customize analysis parameters** - Modify the AI prompt to include your preferred SEO metrics and analysis criteria.
5. **Test the workflow** - Submit a sample website URL to verify the backlink discovery and analysis process.
6. **Set up the database table** - Ensure your PostgreSQL database has a backlinks table with appropriate columns.

## Features

- **Comprehensive backlink discovery:** Finds all backlinks pointing to target websites
- **AI-powered analysis:** GPT-4 analyzes backlink quality, relevance, and SEO impact
- **Automatic categorization:** Backlinks categorized by type (dofollow/nofollow), authority level, and relevance (see the sketch at the end of this section)
- **Data validation:** Cleans and validates backlink data with error handling
- **Database storage:** PostgreSQL integration for data persistence and historical tracking
- **API responses:** Clean JSON responses with backlink summaries and SEO insights
- **Competitive intelligence:** Analyzes competitor backlink profiles and identifies link-building opportunities
- **Authority scoring:** Calculates domain authority and page authority metrics for each backlink
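As one example of the categorization step, a Code node can classify each discovered link as dofollow or nofollow from its anchor tag's `rel` attribute. The input field names (`html`, `sourceUrl`) are assumptions about what the discovery step emits:

```javascript
// n8n Code node (JavaScript): tag each backlink as dofollow or nofollow.
return $input.all().map(item => {
  const { html, sourceUrl } = item.json;     // assumed fields per discovered link
  // Extract the rel attribute from the anchor tag, if any
  const rel = (html.match(/rel=["']([^"']*)["']/i) || [])[1] || '';
  // rel values containing nofollow/ugc/sponsored pass no SEO authority
  const type = /\b(nofollow|ugc|sponsored)\b/i.test(rel) ? 'nofollow' : 'dofollow';
  return { json: { sourceUrl, rel, type } };
});
```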