by Adam Bertram
An AI-powered chat assistant that analyzes Azure virtual machine activity and generates detailed timeline reports showing VM state changes, performance metrics, and operational events over time.

## How It Works

The workflow starts with a chat trigger that accepts user queries about Azure VM analysis. A Google Gemini AI agent processes these requests and uses six specialized tools to gather comprehensive VM data from Azure APIs. The agent queries resource groups, retrieves VM configurations and instance views, pulls performance metrics (CPU, network, disk I/O), and collects activity log events. It then analyzes this data to create timeline reports showing what happened to VMs during specified periods, defaulting to the last 90 days unless the user specifies otherwise.

## Prerequisites

To use this template, you'll need:

- n8n instance (cloud or self-hosted)
- Azure subscription with virtual machines
- Microsoft Azure Monitor OAuth2 API credentials
- Google Gemini API credentials
- Proper Azure permissions to read VM data and activity logs

## Setup Instructions

1. Import the template into n8n.
2. Configure credentials:
   - Add Microsoft Azure Monitor OAuth2 API credentials with read permissions for VMs and activity logs
   - Add Google Gemini API credentials
3. Update workflow parameters:
   - Open the "Set Common Variables" node
   - Replace <your azure subscription id here> with your actual Azure subscription ID
4. Configure triggers:
   - The chat trigger will automatically generate a webhook URL for receiving chat messages
   - No additional trigger configuration needed
5. Test the setup to ensure it works.

## Security Considerations

Use minimum required Azure permissions (Reader role on subscription or resource groups). Store API credentials securely in the n8n credential store. The Azure Monitor API has rate limits, so avoid excessive concurrent requests. Chat sessions use session-based memory that persists during conversations but doesn't retain data between separate chat sessions.

## Extending the Template

You can add more Azure monitoring tools like disk metrics, network security group logs, or Application Insights data. The AI agent can be enhanced with additional tools for Azure cost analysis, security recommendations, or automated remediation actions. You could also integrate with alerting systems or export reports to external storage or reporting platforms.
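For reference, here is a rough sketch (in plain JavaScript) of the kind of Azure Monitor REST call the agent's metrics tool makes. The resource group, VM name, metric list, and timespan are illustrative, and the access token is assumed to come from the Azure Monitor OAuth2 credential.

```javascript
// Minimal sketch of an Azure Monitor metrics request (illustrative values).
const subscriptionId = "<your azure subscription id here>"; // from Set Common Variables
const resourceId =
  `/subscriptions/${subscriptionId}/resourceGroups/my-rg` +
  `/providers/Microsoft.Compute/virtualMachines/my-vm`;

const params = new URLSearchParams({
  "api-version": "2018-01-01",
  metricnames: "Percentage CPU,Network In Total,Disk Read Bytes",
  timespan: "2024-01-01T00:00:00Z/2024-03-31T00:00:00Z", // ~90-day default window
  interval: "P1D",
  aggregation: "Average",
});

// accessToken is assumed to be supplied by the OAuth2 credential
const res = await fetch(
  `https://management.azure.com${resourceId}/providers/Microsoft.Insights/metrics?${params}`,
  { headers: { Authorization: `Bearer ${accessToken}` } }
);
const metrics = await res.json(); // time series the agent turns into a timeline
```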
by CustomJS
## n8n Workflow: Invoice PDF Generator

This n8n workflow captures invoice data and generates a PDF invoice, ready to be sent or saved. It uses a webhook to trigger the process, preprocesses the invoice data, and converts it to a PDF using HTML and custom styling. It relies on the @custom-js/n8n-nodes-pdf-toolkit community node.

## Features

- **Webhook Trigger**: Receives incoming data, including invoice details.
- **Preprocessing**: Transforms the invoice data into HTML format.
- **HTML to PDF Conversion**: Converts the preprocessed HTML into a styled PDF document.
- **Response**: Sends the generated PDF back to the webhook response.

## Notice

Community nodes can only be installed on self-hosted instances of n8n.

## Requirements

- **Self-hosted** n8n instance
- A CustomJS API key for PDF generation
- **Invoice data** for PDF generation

## Workflow Steps

1. **Webhook Trigger**: Accepts incoming data (e.g., invoice number, recipient details, itemized list). This data is passed to the next node for processing.
2. **Set Data Node**: Configures initial values for the invoice, including the recipient, sender, invoice number, and the items on the invoice. The invoice details include information like description, unit price, and quantity.
3. **Preprocess Node**: Processes the raw data to format it correctly for HTML. This includes splitting addresses and converting the items into an HTML table format (see the sketch at the end of this section).
4. **HTML to PDF Conversion**: Converts the generated HTML into a PDF document. The HTML includes a header, a detailed invoice table, and a footer with contact information.
5. **Respond to Webhook**: Returns the generated PDF as a response to the initial webhook request.

## Setup Guide

### 1. Configure CustomJS API

- Sign up at CustomJS.
- Retrieve your API key from the profile page.
- Add your API key as n8n credentials.

### 2. Design Workflow

- **Create a Webhook**: Set up a webhook to trigger the workflow when invoice data is received.
- **Prepare Data**: Ensure the incoming request contains fields like "Invoice No", "Bill To", "From", and "Details" (a list of items with price and quantity).
- **Customize the HTML**: The HTML template for the invoice includes custom styling to give the invoice a professional look.
- **Convert to PDF**: The HTML to PDF node is configured with the data generated by the preprocessing step to convert the invoice HTML to PDF format.

## Example Invoice Data

```json
{
  "Invoice No": "1",
  "Bill To": "John Doe\n1234 Elm St, Apt 567\nCity, Country, 12345",
  "From": "ABC Corporation\n789 Business Ave\nCity, Country, 67890",
  "Details": [
    { "description": "Web Hosting", "price": 150, "qty": 2 },
    { "description": "Domain", "price": 15, "qty": 5 }
  ],
  "Email": "support@mycompany.com"
}
```

## Result

The generated invoice is returned as a PDF file.
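To illustrate the Preprocess step, here is a minimal sketch of Code-node logic that splits addresses and turns the Details array into HTML table rows. It assumes the field names from the example payload above; the actual template's markup will differ.

```javascript
// Sketch: transform the webhook payload into HTML fragments for the invoice.
// Field names ("Bill To", "From", "Details") follow the example payload.
const data = $json;

// Multi-line addresses become <br>-separated HTML
const billTo = data["Bill To"].split("\n").join("<br>");
const from = data["From"].split("\n").join("<br>");

// Itemized rows plus a running total
let total = 0;
const rows = data["Details"].map((item) => {
  const lineTotal = item.price * item.qty;
  total += lineTotal;
  return `<tr><td>${item.description}</td><td>${item.price}</td>` +
         `<td>${item.qty}</td><td>${lineTotal}</td></tr>`;
}).join("");

return { json: { invoiceNo: data["Invoice No"], billTo, from, rows, total } };
```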
by Oneclick AI Squad
This automated n8n workflow monitors ingredient price changes from external APIs or manual sources, analyzes historical trends, and provides smart buying recommendations. The system tracks price fluctuations in a PostgreSQL database, generates actionable insights, and sends alerts via email and Slack to help restaurants optimize their purchasing decisions.

## What is Price Trend Analysis?

Price trend analysis uses historical price data to identify patterns and predict optimal buying opportunities. The system analyzes price movements over time and generates recommendations on when to buy ingredients based on current trends and historical patterns.

## Good to Know

- Price data accuracy depends on the reliability of external API sources
- Historical data improves recommendation accuracy over time (recommended minimum 30 days)
- PostgreSQL database provides robust data storage and complex trend analysis capabilities
- Real-time alerts help capture optimal buying opportunities
- Dashboard provides visual insights into price trends and recommendations

## How It Works

1. **Daily Price Check** - Triggers the workflow daily to monitor price changes
2. **Fetch API Prices** - Retrieves the latest prices from an external ingredient pricing API
3. **Setup Database** - Ensures database tables are ready before inserting new data
4. **Store Price Data** - Saves current prices to the PostgreSQL database for tracking
5. **Calculate Trends** - Analyzes historical prices to detect patterns and price movements
6. **Generate Recommendations** - Suggests actions based on price trends (buy/wait/stock up)
7. **Store Recommendations** - Saves recommendations for future reporting
8. **Get Dashboard Data** - Gathers necessary data for dashboard generation
9. **Generate Dashboard HTML** - Builds an HTML dashboard to visualize insights
10. **Send Email Report** - Emails the dashboard report to stakeholders
11. **Send Slack Alert** - Sends key alerts or recommendations to Slack channels

## Database Structure

The workflow uses PostgreSQL with two main tables.

**price_history** - Historical price tracking with columns:

- id (Primary Key)
- ingredient (VARCHAR 100) - Name of the ingredient
- price (DECIMAL 10,2) - Current price value
- unit (VARCHAR 50) - Unit of measurement (kg, lbs, etc.)
- supplier (VARCHAR 100) - Source supplier name
- timestamp (TIMESTAMP) - When the price was recorded
- created_at (TIMESTAMP) - Record creation time

**buying_recommendations** - AI-generated buying suggestions with columns:

- id (Primary Key)
- ingredient (VARCHAR 100) - Ingredient name
- current_price (DECIMAL 10,2) - Latest price
- price_change_percent (DECIMAL 5,2) - Percentage change from previous price
- trend (VARCHAR 20) - Price trend direction (INCREASING/DECREASING/STABLE)
- recommendation (VARCHAR 50) - Buying action (BUY_NOW/WAIT/STOCK_UP)
- urgency (VARCHAR 20) - Urgency level (HIGH/MEDIUM/LOW)
- reason (TEXT) - Explanation for the recommendation
- generated_at (TIMESTAMP) - When the recommendation was created

## Price Trend Analysis

The system analyzes historical price data over the last 30 days to calculate percentage changes, identify trends (INCREASING/DECREASING/STABLE), and generate actionable buying recommendations based on price patterns and movement history. A sketch of this logic appears below.
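As a rough illustration of the Calculate Trends and Generate Recommendations steps, the sketch below computes a percentage change over the analysis window and maps it to a trend and action. The ±5% thresholds and the trend-to-recommendation mapping are illustrative assumptions, not the template's exact logic.

```javascript
// Sketch: derive trend + recommendation from price_history rows
// (oldest-first) for one ingredient. Thresholds are illustrative.
function analyzePrices(rows) {
  const current = rows[rows.length - 1].price;
  const oldest = rows[0].price;
  const changePercent = ((current - oldest) / oldest) * 100;

  let trend = "STABLE";
  if (changePercent > 5) trend = "INCREASING";
  else if (changePercent < -5) trend = "DECREASING";

  // One plausible policy: buy ahead of rising prices, wait out falling ones
  const recommendation =
    trend === "INCREASING" ? "STOCK_UP" :
    trend === "DECREASING" ? "WAIT" : "BUY_NOW";

  return {
    current_price: current,
    price_change_percent: Number(changePercent.toFixed(2)),
    trend,
    recommendation,
  };
}
```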
## How to Use

1. Import the workflow into n8n
2. Configure PostgreSQL database connection credentials
3. Set up external ingredient pricing API access
4. Configure email credentials for dashboard reports
5. Set up Slack webhook or bot credentials for alerts
6. Run the Setup Database node to create required tables and indexes
7. Test with sample ingredient data to verify price tracking and recommendations
8. Adjust trend analysis parameters based on your purchasing patterns
9. Monitor recommendations and refine thresholds based on actual buying decisions

## Requirements

- PostgreSQL database access
- External ingredient pricing API credentials
- Email service credentials (Gmail, SMTP, etc.)
- Slack webhook URL or bot credentials
- Historical price data for initial trend analysis

## Customizing This Workflow

Modify the Calculate Trends node to adjust the analysis period (currently 30 days) or add seasonal adjustments. Customize the recommendation logic to match your restaurant's buying patterns, budget constraints, or supplier agreements. Add additional data sources like weather forecasts or market reports for more sophisticated predictions.
by ivn
## About

This workflow automates the transcription of YouTube videos by processing a video URL provided via a chat message. Designed for users who need quick access to video content in text form, this workflow ensures a seamless experience for transcribing videos on demand, regardless of the topic.

## Who is this for?

This workflow is designed for individuals who need quick and accurate transcriptions of YouTube videos without watching them in full. It is particularly useful for:

- Students who need text-based notes from educational videos.
- Researchers looking to extract information from lectures or discussions.
- Professionals who prefer reading over watching videos.
- Casual users who want an efficient way to summarize video content.

## What problem is this workflow solving?

Manually transcribing YouTube videos is time-consuming and prone to errors. Watching long videos just to extract key information is inefficient. This workflow automates transcription, allowing users to quickly convert video content into text. Use cases include:

- Summarizing lectures or webinars.
- Extracting insights from interviews and discussions.
- Creating searchable text from video content.
- Generating reference material without watching entire videos.

## What This Workflow Does

This workflow automates the transcription of YouTube videos by:

1. **Accepting Input**: Users provide a YouTube video URL through a chat message.
2. **Processing the Video**: It utilizes an external transcription service to retrieve the full transcript of the YouTube video from the provided URL.
3. **Enhancing Output**: An AI model (OpenAI) refines the transcription for accuracy and readability.
4. **Delivering Results**: The final text transcript is returned to the user via the chat interface.

## Setup

1. **Install n8n**: Ensure you have n8n installed and running.
2. **Import the Workflow**: Copy the JSON workflow file into your n8n instance.
3. **Configure API Keys**: Set up your Supadata API key for transcription, and configure the OpenAI API key for additional processing.
4. **Run the Workflow**: Provide a YouTube video URL and receive a transcription in response.

## How to customize this workflow to your needs

The workflow is flexible and can be tailored to suit specific requirements. Here are some customization ideas:

- **Language Support**: Adjust the transcription language in both the HTTP Request and OpenAI nodes to support transcriptions in different languages (e.g., French, German).
- **Integrate with Other Services**: Store transcriptions in a database, send them via email, or connect with a document management system.
- **Notification**: Add a notification node (e.g., email or Slack) to alert you when the transcription is complete, especially for long videos.
- **Quality Check**: Integrate an additional AI step to summarize or highlight key points in the transcript for quicker insights.

This workflow is designed to be scalable, efficient, and adaptable to various transcription needs.

## Limitations

- **Video Length**: Very long videos may not have a complete transcription due to constraints in processing capacity or service limitations.
- **Transcription Dependency**: The accuracy of the transcription relies entirely on the presence of video captions or subtitles. If a video lacks these, no transcription will be generated.
- **Access Restrictions**: Private or restricted YouTube videos may not be accessible for transcription due to permission limitations.
- **Processing Time**: The time required to process a video can vary significantly, especially for longer videos, depending on the transcription service and server resources.
- **Regional Restrictions**: Some YouTube videos may have geographic or regional access limitations, which could prevent the workflow from retrieving the content for transcription.
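For the "Processing the Video" step, the HTTP Request node call looks roughly like the sketch below, written here as plain JavaScript. The endpoint path and parameters are an assumption; verify them against the Supadata documentation.

```javascript
// Sketch of the transcript request; verify path/params against Supadata docs.
const videoUrl = $json.chatInput.trim(); // URL taken from the chat message

const res = await fetch(
  `https://api.supadata.ai/v1/youtube/transcript?url=${encodeURIComponent(videoUrl)}&lang=en`,
  { headers: { "x-api-key": process.env.SUPADATA_API_KEY } }
);
const transcript = await res.json(); // passed on to the OpenAI refinement step

return { json: { transcript } };
```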
by Oneclick AI Squad
In this guide, we’ll walk you through setting up an AI-driven workflow that automatically processes highly-rated food photos from a Google Sheet, generates AI-powered captions, shares them to Pinterest, and updates the sheet to reflect the posts. Ready to automate your food photo sharing? Let’s dive in!

## What’s the Goal?

- Automatically detect and process highly-rated food photos (4 stars or above) from a Google Sheet.
- Use AI to generate engaging and relevant captions.
- Share the photos with captions to Pinterest via the Pinterest API.
- Update the Google Sheet to mark photos as posted.
- Enable scheduled automation for consistent posting.

By the end, you’ll have a self-running system that shares your best food photos effortlessly.

## Why Does It Matter?

Manual photo sharing is time-consuming and inconsistent. Here’s why this workflow is a game changer:

- **Zero Human Error**: AI ensures consistent captions and posting accuracy.
- **Time-Saving Automation**: Automatically handle photo sharing, boosting efficiency.
- **Scheduled Posting**: Maintain a regular presence on Pinterest without manual effort.
- **Focus on Creativity**: Free your team from repetitive posting tasks.

Think of it as your tireless social media assistant that keeps your Pinterest feed vibrant.

## How It Works

Here’s the step-by-step magic behind the automation:

1. **Trigger the Workflow**: The Daily Post Scheduler node initiates the workflow at a scheduled time (e.g., once daily) to check for new food photos to post.
2. **Fetch Food Photos from Sheet**: Retrieve rows from the Google Sheet that contain food photo metadata like image URLs, ratings, and status.
3. **Filter 4+ Star Dishes**: Keep only those food entries with high ratings (4 stars or above) and unposted status (see the sketch after the setup notes below).
4. **AI Caption Generator**: Use AI (e.g., GPT/OpenAI) to create engaging and relevant captions for the selected food photos.
5. **Upload to Pinterest**: Automatically post the food photo with the generated caption to Pinterest via the Pinterest API.
6. **Mark as Posted in Sheet**: Update the Google Sheet to reflect that the photo has been successfully shared.

## How to Use the Workflow

Importing a workflow in n8n is a straightforward process that allows you to use pre-built workflows to save time. Below is a step-by-step guide to importing the Automated Food Photo Sharing workflow in n8n.

### Steps to Import a Workflow in n8n

1. **Obtain the Workflow JSON**
   - **Source the Workflow**: Workflows are shared as JSON files or code snippets, e.g., from the n8n community, a colleague, or exported from another n8n instance.
   - **Format**: Ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text.
2. **Access the n8n Workflow Editor**
   - Log in to n8n (via n8n Cloud or a self-hosted instance).
   - Navigate to the Workflows tab in the n8n dashboard.
   - Click Add Workflow to create a blank workflow.
3. **Import the Workflow**
   - Option 1, import via JSON code (clipboard): Click the three dots (⋯) in the top-right corner, select Import from Clipboard, paste the JSON code into the text box, and click Import to load the workflow.
   - Option 2, import via JSON file: Click the three dots (⋯) in the top-right corner, select Import from File, choose the .json file from your computer, and click Open to import.

## Setup Notes

- **Google Sheet Columns**: Ensure your Google Sheet includes the following columns: Image URL, Rating (numeric, e.g., 1-5), Feedback (text), Pin Title, Pin Description, Destination URL, Board ID, and Status (e.g., "Pending" or "Posted").
- **Google Sheets Credentials**: Configure OAuth2 settings in the Fetch Food Photos node with your Google Sheet ID and credentials.
- **AI Model**: Set up the AI Caption Generator node with OpenAI credentials (e.g., API key).
- **Pinterest API**: Authorize the Upload to Pinterest node with Pinterest API credentials (e.g., a Bearer Token) and obtain the Board ID.
- **Scheduling**: Adjust the Daily Post Scheduler node to your preferred posting time (e.g., daily at 9 AM).
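Here is a minimal sketch of what the Filter 4+ Star Dishes step can look like in an n8n Code node, assuming the sheet columns listed above (Rating, Status, Image URL).

```javascript
// Keep only highly-rated, not-yet-posted photos from the sheet rows.
const items = $input.all();

return items.filter((item) => {
  const rating = Number(item.json["Rating"]);
  const status = String(item.json["Status"] || "").toLowerCase();
  return rating >= 4 && status !== "posted" && item.json["Image URL"];
});
```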
by Dvir Sharon
# 🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Bright Data

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts competitor data from Google search results for specific keywords and target countries, automatically saving structured data to Google Sheets for competitive analysis and market research.

## 📋 Overview

This workflow provides a professional competitor analysis solution that identifies ranking websites for specific search terms across different countries. Perfect for SEO research, competitive intelligence, market analysis, and content strategy planning. The system uses Bright Data's SERP API for accurate search result extraction and advanced HTML parsing for detailed competitor information.

## Who is this for?

- SEO professionals conducting competitive analysis
- Digital marketers researching market landscapes
- Business analysts studying competitor positioning
- Content strategists analyzing competitor content approaches
- Market researchers tracking competitive intelligence across regions

## What problem is this workflow solving?

- Extracting competitor data from Google search results
- Processing multiple keywords across different countries
- Organizing results in a structured, analyzable format
- Eliminating manual copy-paste work
- Ensuring consistent data collection methodology

## What this workflow does

1. **Manual Trigger**: Starts the workflow execution
2. **Get Keywords from Sheet**: Fetches keywords and target countries from Google Sheets
3. **URL Encode Keywords**: Converts keywords to a URL-safe format
4. **Process Keywords in Batches**: Handles multiple keywords sequentially
5. **Fetch Google Search Results**: Uses the Bright Data SERP API to scrape HTML
6. **Extract Competitor Data from HTML**: Parses HTML to extract competitor details
7. **Save Competitor Results to Sheet**: Stores structured data in Google Sheets
8. **Wait to Avoid Rate Limits**: Implements 30-second delays between requests

## Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Keyword | Original search term | digital marketing services |
| Target Country | Geographic target | US |
| websiteName | Domain/company name | hubspot |
| websiteUrl | Complete website URL | https://www.hubspot.com/marketing |
| websiteTitle | Page title from search results | Digital Marketing Software & Tools |
| websiteDescription | Meta description/snippet | Grow your business with HubSpot's digital marketing tools... |

## ⚙️ Setup

### Prerequisites

- n8n instance (self-hosted)
- Google account with Sheets access
- Bright Data account with SERP API access

### Google Sheet Structure

This workflow utilizes two Google Sheets: one for input keywords and one for outputting competitor data.

**Input Sheet: "Keywords"**

This sheet should contain the keywords and target countries for your search queries.

| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The search term you want to analyze. | digital marketing |
| Country | Text | The 2-letter ISO country code for the target region of the search (e.g., US, GB, DE). | US |

**Output Sheet: "Competitor Results"**

This sheet will be populated automatically by the workflow with the extracted competitor data.
| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The original search term used for the query. | digital marketing services |
| Target Country | Text | The 2-letter ISO country code of the search results. | US |
| websiteName | Text | The name of the website or domain found in the search results. | hubspot |
| websiteUrl | URL | The full URL of the website or page found in the search results. | https://www.hubspot.com/marketing |
| websiteTitle | Text | The title of the page as displayed in the Google search results. | Digital Marketing Software & Tools |
| websiteDescription | Text | The meta description or snippet text displayed under the title in search results. | Grow your business with HubSpot's digital marketing tools... |

### Step-by-Step Setup

1. **Import the Workflow**: Copy JSON → n8n → Workflows → + Add → Import from JSON
2. **Configure Bright Data Credentials**: Credential type: HTTP Header Auth; header name: Authorization; header value: Bearer YOUR_API_TOKEN
3. **Configure Google Sheets**: Create the two Google Sheets described above, one named "Keywords" (for input) and one named "Competitor Results" (for output), then set up Google Sheets OAuth2 credentials within n8n.
4. **Update Workflow Settings**: Replace the placeholders YOUR_GOOGLE_SHEET_ID (for both input and output sheets) and YOUR_BRIGHTDATA_CREDENTIAL_ID, and ensure the correct sheet/tab names are selected in the Google Sheets nodes.
5. **Test & Activate**: Add test data to your "Keywords" sheet → execute the workflow → verify output in your "Competitor Results" sheet.

## 🛠 How to Customize

- **Add More Data Points**: Modify the JavaScript code in the "Extract Competitor Data from HTML" node to parse and extract additional information from the HTML (see the sketch below).
- **Custom Filtering**: Implement logic to exclude specific domains, filter results by title length, or apply other criteria.
- **Expand Geographic Coverage**: Add more 2-letter ISO country codes to the Bright Data SERP API call to broaden your competitive analysis.
- **Batch Processing**: Adjust the settings in the "Process Keywords in Batches" node to optimize for your Bright Data plan and desired execution speed.
- **Rate Limiting**: Modify the "Wait" node (default: 30 seconds) to increase or decrease the delay between requests based on API limits or performance needs.

## 📊 Use Cases & Examples

- **SEO Competitive Analysis**: Identify top-ranking competitors for your target keywords and analyze their strategies.
- **Market Entry Research**: Understand the competitive landscape in new geographic regions before expanding.
- **Content Strategy Planning**: Analyze competitor page titles and meta descriptions for inspiration and to identify content gaps.
- **International Market Research**: Compare search engine results and competitor positioning across different countries.

## 📈 Performance & Limits

- **Single Keyword**: 30–60 seconds per keyword.
- **Batch of 10 Keywords**: Typically takes 5–10 minutes.
- **Large Lists (50+ Keywords)**: Expect execution times of 30–60 minutes or more, depending on batching and rate limits.
- **Success Rate**: Generally 95%+ for data extraction.
- **Data Accuracy**: Typically 98%+ for extracted fields.
- **API Calls**: 1 Bright Data SERP API call per keyword, plus multiple Google Sheets writes per execution.
- **Rate Limit**: A 30-second delay between requests is recommended to prevent exceeding API limits.
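As a starting point for customizing the "Extract Competitor Data from HTML" node, the sketch below pulls organic-result URLs and titles out of raw SERP HTML. Google's markup changes frequently, so the regex is illustrative only; the production node's parsing will be more robust.

```javascript
// Crude pass over Bright Data's raw SERP HTML; the pattern is illustrative.
const html = $json.data; // HTML string returned by the SERP API call

const results = [];
const blockRe = /<a href="(https?:\/\/[^"]+)"[^>]*>[\s\S]*?<h3[^>]*>([\s\S]*?)<\/h3>/g;
let match;
while ((match = blockRe.exec(html)) !== null) {
  const url = match[1];
  results.push({
    json: {
      websiteUrl: url,
      websiteName: new URL(url).hostname.replace(/^www\./, "").split(".")[0],
      websiteTitle: match[2].replace(/<[^>]+>/g, "").trim(),
    },
  });
}
return results;
```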
## 🧰 Troubleshooting

- **Bright Data API error**: Double-check your API token, ensure you have sufficient credits, and confirm SERP API access is enabled on your Bright Data account.
- **No keywords found**: Verify the Google Sheet ID and ensure the column headers in your "Keywords" sheet precisely match the specifications (e.g., "Keyword", "Country").
- **Google Sheets permission denied**: Re-authenticate your Google Sheets credentials within n8n and check that the correct sharing settings are applied to your sheets.
- **No results extracted**: Review the JavaScript parsing logic in the "Extract Competitor Data from HTML" node. Also, verify the validity of your keywords and target countries.
- **Loop not processing all**: Check the batch settings in the "Process Keywords in Batches" node and ensure all connections within the loop are correctly configured.

## 🤝 Support & Community

- **n8n Forum**: <https://community.n8n.io>
- **n8n Docs**: <https://docs.n8n.io>
- **Bright Data Support**: Access support directly via your Bright Data dashboard.
- **GitHub Issues**: Report any bugs or suggest new features on the n8n GitHub repository.

## 🎯 Final Notes

This workflow provides a comprehensive foundation for competitor research and market analysis. Customize it to fit your specific industry needs and competitive intelligence requirements. Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
by Laura Piraux
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

# Build an AI agent for Notion (with Notion official MCP server)

## Use case

This template empowers Notion power-users to build their own AI assistant, deeply integrated with their workspace. It solves the constant problem of copy-pasting and context-switching between a separate AI chat and Notion by creating a direct, conversational bridge. Now you can interact with an intelligent agent that can create, retrieve, and update your Notion databases and pages on your behalf, turning your workspace into a truly dynamic productivity hub.

## How it works

When you send a message via the chat interface, the workflow passes it to your chosen AI model. The model, connected to the official Notion tool server, analyzes your request to see if it can be fulfilled by one of its available Notion actions. If it matches a tool, the workflow executes the command using the Notion API, such as creating a new page or searching a database, and the AI then confirms in the chat that the action is complete.

## Setup

Prerequisite: this template is for self-hosted n8n instances only, as it requires a community node.

1. Copy this workflow into your self-hosted n8n instance.
2. Install the required community node (n8n-nodes-mcp).
3. Add your credentials for your chosen AI model and the Notion MCP server.
4. Test the workflow by chatting with your new Notion assistant.

## How to adjust it to your needs

You can use any AI model you want and even compare different AI models easily. You can also start from this template and provide other tools to your AI agent to build more powerful workflows.
by Yaron Been
## Workflow Overview

This advanced n8n automation is a powerful channel research and intelligence gathering tool designed to transform raw YouTube channel data into actionable insights. By intelligently connecting multiple APIs and data sources, this workflow:

- **Discovers Channel Metrics**: Automatically retrieves channel statistics, captures detailed performance indicators, and provides comprehensive channel intelligence.
- **Performs Deep Analysis**: Extracts recent video performance data, calculates engagement metrics, and aggregates view count insights.
- **Uncovers Contact Information**: Attempts to retrieve public email addresses, providing direct outreach opportunities and enhancing lead generation capabilities.
- **Seamless Data Logging**: Automatically updates Google Sheets, maintains a live intelligence dashboard, and preserves historical channel data.

## Key Benefits

- 🤖 **Full Automation**: Continuous channel intelligence gathering
- 💡 **Smart Analysis**: Comprehensive performance insights
- 📊 **Real-Time Tracking**: Always-updated channel metrics
- 🔍 **Lead Generation**: Direct contact information extraction

## Workflow Architecture

### 🔹 Stage 1: Channel Identification

- **Google Sheets Trigger**: Detects new channel additions
- **YouTube Data API**: Fetches channel statistics
- **Comprehensive Metric Collection**: Subscriber count, total view metrics, channel overview

### 🔹 Stage 2: Video Performance Analysis

- **Recent Video Retrieval**: Fetches the 5 latest uploads
- **View Count Aggregation**: Calculates total recent views and provides an engagement snapshot
- **Performance Insights**: Measures content effectiveness

### 🔹 Stage 3: Contact Discovery

- **SerpAPI Integration**: Attempts email extraction
- **Public Contact Information**: Retrieves available email addresses to support outreach and networking

### 🔹 Stage 4: Data Compilation

- **Intelligent Data Formatting**
- **Google Sheets Update**
- **Live Intelligence Dashboard**

## Potential Use Cases

- **Marketing Teams**: Influencer research
- **Sales Professionals**: Lead qualification
- **Content Strategists**: Competitive analysis
- **Recruitment Specialists**: Talent scouting
- **Business Development**: Partnership identification

## Setup Requirements

- **YouTube Data API**: Google Cloud API credentials, configured API access
- **SerpAPI Account**: API key for email extraction, web scraping permissions
- **Google Sheets**: Connected Google account, prepared tracking spreadsheet, appropriate sharing settings
- **n8n Installation**: Cloud or self-hosted instance, workflow configuration, API credential management

## Future Enhancement Suggestions

- 🤖 AI-powered channel scoring
- 📊 Advanced trend analysis
- 🔔 Automated alert system
- 🌐 Multi-platform channel tracking
- 🧠 Machine learning insights generation

## Technical Considerations

- Implement robust error handling
- Use exponential backoff for API calls (a sketch follows below)
- Maintain flexible data extraction strategies
- Ensure compliance with platform terms of service

## Ethical Guidelines

- Respect content creator privacy
- Use data for legitimate research
- Maintain transparent data collection practices
- Provide opt-out mechanisms

## Connect With Me

Ready to unlock YouTube channel insights?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your channel research with intelligent, automated workflows!
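As noted under Technical Considerations, calls to the YouTube Data API and SerpAPI should retry with exponential backoff. Here is a minimal sketch of that pattern; the retry count and delays are illustrative.

```javascript
// Retry a request with exponential backoff + jitter on 429/5xx responses.
async function fetchWithBackoff(url, options = {}, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const res = await fetch(url, options);
    if (res.ok) return res.json();
    // Give up immediately on non-retryable client errors
    if (res.status !== 429 && res.status < 500) {
      throw new Error(`Request failed: ${res.status}`);
    }
    const delay = 1000 * 2 ** attempt + Math.random() * 250; // 1s, 2s, 4s, ...
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
  throw new Error("Retries exhausted");
}
```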
by Yosua Surojo
## Who it's for

This workflow is for anyone who wants to build an automated, AI-enhanced reading list. It is ideal for:

- Knowledge workers and researchers who collect and organize articles
- Students managing study materials
- Productivity hackers who use Telegram and Notion for personal knowledge management
- Anyone using the AI-Enhanced Knowledge Base Tracker Notion Template

## How it works

This workflow takes any article link sent to your Telegram bot and automatically:

1. Parses the article into a clean title and body
2. Uses OpenAI to generate a 1–2 sentence highlight and topic tag
3. Saves it into your Notion database
4. Sends a confirmation message with the highlight and Notion link back to Telegram

Main steps:

1. **Telegram Trigger** - Listens for an incoming message containing an article link.
2. **Fetch Article Title & Content** - Calls the article-parser-api deployed on Vercel to fetch and parse the article content into structured JSON (title and content); a sketch of this call appears below.
3. **Generate Highlight + Tag (AI Agent)** - Processes the parsed content to generate Highlight and Type tag values.
4. **Structured Metadata for Notion** - Adjusts the extracted data before saving it to Notion.
5. **Save Article to Notion Database** - Inserts the article and generated metadata into your Notion knowledge base.
6. **Confirm Save via Telegram** - Sends a confirmation message and the Notion page link back to the Telegram bot chat after the entry is created.

## Setup

1. Create and connect your API credentials: Telegram bot, OpenAI API key, Notion integration.
2. Deploy the article parser: use this repo: article-parser-api, and deploy it to Vercel or any serverless environment.
3. Link your Notion database: duplicate the AI-Enhanced Knowledge Base Tracker, copy the database URL, and connect it in the Notion node.
4. Test your workflow: click Execute workflow, send an article link to your Telegram bot, and once verified, activate the workflow so it runs automatically.

## Requirements

- Telegram bot token
- OpenAI API key
- Notion integration and shared database
- A deployed article parser (e.g., article-parser-api)

## Optional customization

- Edit the AI Agent prompt to change the tone or tagging style
- Add filtering or additional fields in the Edit Fields node
- Trigger from other sources (e.g., Slack or Email)
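The "Fetch Article Title & Content" step boils down to one HTTP call to your deployed parser. The route and query parameter below are assumptions for illustration; match them to the actual API in the article-parser-api repo.

```javascript
// Sketch of the parser call; route/params are assumptions, check the repo.
const articleUrl = $json.message.text.trim(); // link from the Telegram message

const res = await fetch(
  `https://your-parser.vercel.app/api/parse?url=${encodeURIComponent(articleUrl)}`
);
const { title, content } = await res.json(); // structured JSON for the AI Agent

return { json: { title, content, sourceUrl: articleUrl } };
```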
by Sherlockes
## What this template is made for

I have a personal Telegram channel with a bot in it where I save interesting links that I want to keep or read later. The idea is that n8n reads the new links added to this channel and sends them, through the corresponding API, to the Hoarder and Readeck installations.

## How it works

1. Since the server where n8n runs is not always on, a Schedule Trigger checks every so often whether there is any new content in the Telegram channel where I store the links. This request is made with an HTTP Request node and the Telegram API.
2. Next, a Code node filters out everything that is not a hyperlink (see the sketch below).
3. At this point, the flow splits in two so that parallel, similar processes run for Hoarder and Readeck.
4. The corresponding API is accessed to get a list of all the links already saved in each service.
5. A Code node filters the list of hyperlinks previously obtained from Telegram so that only those that are not already saved in the service continue.
6. Finally, another HTTP Request node uses the service API to save the link in the corresponding service.

## Configuration instructions

The template makes use of environment variables that I have declared in the n8n docker-compose.yml file through an external .env file. These are the variables I use:

```
# Telegram Bot Token Sherlink
TG_SHERLINK_BOT_TOKEN=XXXXXXXX:XXXXXXXXXXXXXXXX
# Id of the Telegram channel Sherlink
TG_SHERLINK_ID=-XXXXXXXXXXXXX
# Readeck server
READECK_SERVER=http://readeck.midomain.com
READECK_API_KEY=xxxxxxxxxxxxx
# Hoarder server
HOARDER_SERVER=http://hoarder.midomain.com
HOARDER_API_KEY=xxxxxxxxxxxxxx
```

Created in n8n version 1.85.4.
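The hyperlink filter in step 2 can be as small as the sketch below. It assumes the Telegram getUpdates response from the preceding HTTP Request node and keeps only deduplicated URLs.

```javascript
// Extract and deduplicate links from Telegram getUpdates results.
const updates = $json.result || [];

const links = updates
  .map((u) => u.channel_post?.text || "")
  .flatMap((text) => text.match(/https?:\/\/\S+/g) || []);

return [...new Set(links)].map((url) => ({ json: { url } }));
```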
by Jimleuk
Note: This template only works for self-hosted n8n.

This n8n template demonstrates how to use the Langchain Code node to track token usage and cost for every LLM call. This is useful if your templates handle multiple clients or customers and you need a cheap and easy way to capture how much of your AI credits they are using.

## How it works

- In our mock AI service, we're offering a data conversion API to convert Resume PDFs into JSON documents.
- A form trigger is used to allow for PDF upload, and the file is parsed using the Extract from File node.
- An Edit Fields node is used to capture additional variables to send to our log.
- Next, we use the Information Extractor node to organise the Resume data into the given JSON schema.
- The LLM subnode attached to the Information Extractor is a custom one we've built using the Langchain Code node.
- With our custom LLM subnode, we're able to capture the usage metadata using lifecycle hooks (a sketch of this pattern follows below).
- We've also attached a Google Sheet tool to our LLM subnode, allowing us to send our usage metadata to a Google Sheet.
- Finally, we demonstrate how you can aggregate from the Google Sheet to understand how many AI tokens, and what costs, your clients are liable for.

Check out the example Client Usage Log: https://docs.google.com/spreadsheets/d/1AR5mrxz2S6PjAKVM0edNG-YVEc6zKL7aUxHxVcffnlw/edit?usp=sharing

## How to use

- **SELF-HOSTED N8N ONLY**: the Langchain Code node is only available in the self-hosted version of n8n. It is not available in n8n cloud.
- The LLM subnode can only be attached to non-"AI agent" nodes: Basic LLM node, Information Extractor, Question & Answer Chain, Sentiment Analysis, Summarization Chain, and Text Classifier.

## Requirements

- Self-hosted version of n8n
- OpenAI for the LLM
- Google Sheets to store usage metadata

## Customising this template

- Bring the custom LLM subnode into your own templates! In many cases, it can be a drop-in replacement for the regular OpenAI subnode.
- Not using Google Sheets? Try other databases or an HTTP call to pipe the data into your CRM.
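The lifecycle-hook idea is easiest to see in isolation. In LangChain's JS API, a `handleLLMEnd` callback receives the provider's token counts, which you can turn into a cost figure and forward to the Google Sheet; the per-token prices below are illustrative, not the template's exact values.

```javascript
// Sketch: capture token usage via a LangChain callback (the mechanism
// the custom subnode relies on). Prices are illustrative placeholders.
const callbacks = [
  {
    handleLLMEnd: async (output) => {
      const usage = output.llmOutput?.tokenUsage ?? {};
      const { promptTokens = 0, completionTokens = 0 } = usage;
      // Illustrative per-1M-token prices; swap in your model's real rates
      const cost = (promptTokens * 0.15 + completionTokens * 0.6) / 1e6;
      // In the template, this row is appended to the Client Usage Log sheet
      console.log({ promptTokens, completionTokens, cost });
    },
  },
];
```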
by Adam Janes
## How it works

1. The automation loads rows from a Google Sheet of leads that you want to contact.
2. It makes a Google search via Apify for LinkedIn links based on the lead's first name, last name, and company.
3. Another Apify actor fetches the right LinkedIn profile based on the first profile that is returned.
4. The same process is done for the company that the lead works for, giving extra context. If the lead has a current company listed on their LinkedIn, we use that URL to do the lookup rather than doing a separate Google search.
5. A call is made to OpenRouter to get an LLM to generate an email based on a prompt designed for personalized outreach (see the sketch below).
6. An email is sent via a Gmail node.

## Set up steps

1. Connect your Google Sheets and Gmail accounts to use these APIs.
2. Make an account with Apify and enter your credentials.
3. Set your details in the "Set My Data" node to customize the workflow around your company and value proposition.
4. I would recommend changing the prompt in the "Generate Personalized Email" node to match the tone of voice that you want your agent to have. You can change the guidelines to, e.g., control whether the agent introduces itself, and add more examples in the style you want to improve the output.
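For the OpenRouter step, the request is a standard OpenAI-compatible chat completion. The sketch below shows the general shape; the model name, field names, and prompt are illustrative, not the template's exact configuration.

```javascript
// Sketch of the call behind "Generate Personalized Email".
// OpenRouter exposes an OpenAI-compatible chat endpoint.
const lead = $json; // profile + company context from the Apify steps

const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-4o-mini", // any OpenRouter model ID works here
    messages: [
      { role: "system", content: "You write short, personalized outreach emails." },
      {
        role: "user",
        content: `Write an outreach email to ${lead.firstName} at ${lead.company}.\n` +
                 `Their LinkedIn summary: ${lead.summary}`,
      },
    ],
  }),
});
const { choices } = await res.json();

return { json: { email: choices[0].message.content } };
```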