by Yaron Been
# 🎤 Audio-to-Insights: Auto Meeting Summarizer

Transform your meeting recordings into actionable insights automatically. This powerful n8n workflow monitors your Google Drive for new audio files, transcribes them using OpenAI's Whisper, generates intelligent summaries with ChatGPT, and logs everything in Google Sheets - all without lifting a finger.

## 🔄 How It Works

This workflow operates as a seamless 6-step automation pipeline:

**Step 1: Smart Detection** - The workflow continuously monitors a designated Google Drive folder (polling every minute) for newly uploaded audio files.

**Step 2: Secure Download** - When a new audio file is detected, the system automatically downloads it from Google Drive for processing.

**Step 3: AI Transcription** - OpenAI's Whisper converts your audio recording into an accurate text transcription, supporting multiple audio formats.

**Step 4: Intelligent Summarization** - ChatGPT processes the transcript using a specialized prompt that extracts:
- Key discussion points and decisions
- Action items with assigned persons and deadlines
- Priority levels and follow-up tasks
- Clean, professional formatting

**Step 5: Timestamp Generation** - The system automatically adds the current date and formats it consistently for tracking purposes (a minimal Code node sketch follows the setup steps below).

**Step 6: Automated Logging** - The final summary is appended to your Google Sheets document with the date, creating a searchable archive of all meeting insights.

## ⚙️ Setup Steps

### Prerequisites

Before setting up the workflow, ensure you have:
- An active Google Drive account
- An OpenAI API key with credits
- Google Sheets access
- An n8n instance (cloud or self-hosted)

### Configuration Steps

#### 1. Credential Setup
- **Google Drive OAuth2**: Required for folder monitoring and file downloads
- **OpenAI API Key**: Needed for both transcription (Whisper) and summarization (ChatGPT)
- **Google Sheets OAuth2**: Essential for writing summaries to your spreadsheet

#### 2. Google Drive Configuration
- Create a dedicated folder in Google Drive for meeting recordings
- Copy the folder ID from the URL (the long string after /folders/)
- Update the folderToWatch parameter in the workflow

#### 3. Google Sheets Preparation
- Create a new Google Sheet or use an existing one
- Ensure it has the columns Date and Meeting Summary
- Copy the spreadsheet ID from the URL
- Update the documentId parameter in the workflow

#### 4. Audio Requirements
- **Supported Formats**: MP3, WAV, M4A, MP4
- **Recommended Size**: Under 100MB for optimal processing
- **Language**: Optimized for English (customizable for other languages)
- **Quality**: Clear audio produces better transcriptions

#### 5. Workflow Activation
- Import the workflow JSON into your n8n instance
- Configure all credential connections
- Test with a sample audio file
- Activate the workflow trigger
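For reference, Steps 5 and 6 can be handled by a small Code node that builds the row written to Google Sheets. The snippet below is a minimal, hypothetical sketch: it assumes the ChatGPT summary arrives in a field called `text` and that your sheet uses the Date and Meeting Summary columns described above, so adjust the field names to match your actual nodes.

```javascript
// Hypothetical Code node for the timestamp step (assumes one incoming item
// whose `text` field holds the ChatGPT summary - rename to match your setup).
const summary = $input.first().json.text ?? '';

// Format the current date consistently, e.g. "2024-03-15".
const formattedDate = new Date().toISOString().split('T')[0];

// Return a single item shaped for the Google Sheets "append" operation.
return [
  {
    json: {
      Date: formattedDate,
      'Meeting Summary': summary,
    },
  },
];
```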
## 🚀 Use Cases

### Project Management
- **Team Standup Summaries**: Convert daily standups into actionable task lists
- **Sprint Retrospectives**: Extract improvement points and action items
- **Stakeholder Updates**: Generate concise reports for leadership

### Sales & Customer Success
- **Discovery Call Notes**: Capture prospect pain points and requirements
- **Demo Follow-ups**: Track questions, objections, and next steps
- **Customer Check-ins**: Monitor satisfaction and expansion opportunities

### Consulting & Professional Services
- **Client Strategy Sessions**: Document recommendations and implementation plans
- **Requirements Gathering**: Organize complex project specifications
- **Progress Reviews**: Track deliverables and milestone achievements

### HR & Training
- **Interview Debriefs**: Standardize candidate evaluation notes
- **Training Sessions**: Create searchable knowledge bases
- **Performance Reviews**: Document development plans and goals

### Research & Development
- **Brainstorming Sessions**: Capture innovative ideas and concepts
- **Technical Reviews**: Log decisions and architectural choices
- **User Research**: Organize feedback and insights systematically

## 💡 Advanced Customization Options

### Enhanced Summarization
Modify the ChatGPT prompt to focus on specific elements:
- Add speaker identification for multi-person meetings
- Include sentiment analysis for customer calls
- Generate department-specific summaries (technical, sales, legal)
- Extract financial figures and metrics automatically

### Integration Expansions
- **Slack Integration**: Auto-post summaries to relevant channels
- **Email Notifications**: Send summaries to meeting participants
- **CRM Updates**: Push action items directly to Salesforce/HubSpot
- **Calendar Integration**: Schedule follow-up meetings based on action items

### Quality Improvements
- **Audio Preprocessing**: Add noise reduction before transcription
- **Multi-language Support**: Configure for international teams
- **Custom Templates**: Create industry-specific summary formats
- **Approval Workflows**: Add human review before final storage

## 🛠️ Troubleshooting & Best Practices

### Common Issues
- **Large File Processing**: Split recordings over 100MB into smaller segments
- **Poor Audio Quality**: Use noise reduction tools before uploading
- **API Rate Limits**: Implement delay nodes for high-volume usage
- **Formatting Issues**: Adjust ChatGPT prompts for consistent output

### Optimization Tips
- Upload files in supported formats only
- Ensure a stable internet connection for cloud processing
- Monitor OpenAI API usage and costs
- Regularly back up your Google Sheets data
- Test workflow changes with sample files first

## 📊 Expected Outputs

Sample summary format:

Meeting Summary - March 15, 2024

**Key Discussion Points:**
- Q1 budget review and allocation decisions
- New product launch timeline and milestones
- Team restructuring and role assignments

**Action Items:**
- John: Finalize budget proposal by March 20th (High Priority)
- Sarah: Schedule product demo sessions for March 25th
- Team: Submit org chart feedback by March 18th

**Decisions Made:**
- Approved additional marketing budget of $50K
- Delayed product launch to April 15th for quality assurance
- Promoted Lisa to Senior Developer role

## 📞 Questions & Support

For any questions, customizations, or technical support regarding this workflow:

### 📧 Email Support
- **Primary Contact**: Yaron@nofluff.online
- **Response Time**: Within 24 hours on business days
- **Best For**: Setup questions, customization requests, troubleshooting
### 🎥 Learning Resources
- **YouTube Channel**: https://www.youtube.com/@YaronBeen/videos
- Step-by-step setup tutorials
- Advanced customization guides
- Workflow optimization tips

### 🔗 Professional Network
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- Connect for ongoing support
- Share your workflow success stories
- Get updates on new automation ideas

### 💡 What to Include in Your Support Request
- Describe your specific use case
- Share any error messages or logs
- Mention your n8n version and setup type
- Include sample audio file characteristics (if relevant)

Ready to transform your meeting chaos into organized insights? Download the workflow and start automating your meeting summaries today!
by Airtop
# Extracting Comments from an X Post

## Use Case

Engaging with conversations on X (formerly Twitter) is critical for brands and individuals monitoring sentiment, leads, or emerging trends. Manually collecting comments is time-consuming; this automation enables scalable extraction of comment data to inform your outreach or analysis.

## What This Automation Does

This automation extracts comments from a specified X post, with the following input parameters:

- **airtop_profile**: The name of your Airtop Profile connected to X.
- **x_post_url**: The URL of the X post to extract comments from.
- **max_number_of_comments**: The maximum number of comments to retrieve.

## How It Works

1. Takes input via a form or another workflow.
2. Normalizes the input values.
3. Creates a new browser session using Airtop.
4. Navigates to the provided X post.
5. Uses a prompt to extract up to the specified number of comments, returning:
   - Author name
   - Author profile URL
   - Comment text

## Setup Requirements

- Airtop API Key - free to generate.
- An Airtop Profile connected to X (requires a one-time login).

## Next Steps

- **Pair with X Monitoring**: Use this with the X monitoring automation to detect relevant posts and extract discussion context automatically.
- **Feed into Analytics**: Combine with summarization or sentiment analysis tools to understand audience response at scale.
- **Export for CRM/BI**: Pipe the structured comment data into your CRM or business intelligence stack for lead tracking or reporting.

Read more about Extracting Comments from X Posts.
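To illustrate the "normalizes the input values" step, here is a hedged sketch of a Code node that trims the incoming parameters and clamps the comment count. The field names mirror the inputs listed above; the defaults and limits are assumptions, so adjust them to your trigger's output.

```javascript
// Hypothetical "normalize inputs" Code node: trims the values coming from the
// form / calling workflow and clamps the comment count to a sane range.
const input = $input.first().json;

const xPostUrl = String(input.x_post_url ?? '').trim();
const airtopProfile = String(input.airtop_profile ?? '').trim();

// Default to 10 comments and never request more than 100 (arbitrary cap).
const maxComments = Math.min(
  Math.max(parseInt(input.max_number_of_comments, 10) || 10, 1),
  100
);

return [
  {
    json: {
      airtop_profile: airtopProfile,
      x_post_url: xPostUrl,
      max_number_of_comments: maxComments,
    },
  },
];
```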
by Yang
## Who is this for?

This workflow is perfect for marketers, SEO specialists, product teams, and competitive analysts who want to monitor and summarize public reviews of their competitors. It's especially helpful for small teams who want fast insights from Google reviews without spending hours manually reading and sorting them.

## What problem is this workflow solving?

Manually going through competitor reviews is time-consuming and repetitive. You risk missing patterns or insights, and it's hard to share summaries with your team quickly. This workflow automatically scrapes reviews from Google and generates a structured summary of pain points and positive feedback. That way, you can focus on strategy instead of sorting through dozens of reviews.

## What this workflow does

This automation watches for new competitor entries in a Google Sheet, then:

1. Uses Dumpling AI to scrape the latest Google reviews (up to 20) for each business.
2. Splits and cleans the reviews for analysis.
3. Sends them to GPT-4o, which summarizes the most common complaints and praises.
4. Saves the structured result back to the same Google Sheet.

You'll instantly get an overview of what people are saying about any competitor.

## Setup

### Google Sheet Setup
- Create a Google Sheet with at least one column: Business
- Add names or search queries for the competitors you want to analyze
- Optional: Add columns for Summary of Reviews and Pain Points

### Connect Dumpling AI
- Sign up at Dumpling AI
- Create an agent using the get-google-reviews endpoint
- Copy your agent key
- Use it in the HTTP Request node in this workflow

### OpenAI Setup
- Use your API key with GPT-4o access
- The prompt is already structured to generate grouped summaries from reviews

### Run the Workflow
- Trigger it manually or schedule it
- Make sure your Google Sheets, OpenAI, and Dumpling AI connections are active

## How to customize this workflow to your needs

- Expand the number of reviews retrieved by changing the Dumpling AI agent config
- Replace Google Sheets with Airtable if you want more robust data views
- Add more fields like star ratings or review dates in your agent for richer analysis
- Change the GPT prompt to highlight emotional tone, urgency, or feature mentions

## 🧠 Node Details

- **Google Sheets Trigger**: Watches for new competitor names
- **HTTP Request (Dumpling AI)**: Scrapes 20 recent reviews from Google
- **SplitOut Node**: Breaks the review array into individual items
- **Code Node**: Extracts and combines review text (see the sketch below)
- **Edit Fields Node**: Structures the review content before GPT
- **GPT-4o Node**: Analyzes and summarizes top pain points and praise
- **Google Sheets Output**: Saves the summary back to the same sheet

## Dependencies

- Dumpling AI account and review scraping agent setup
- OpenAI API key with GPT-4o access
- Google Sheets OAuth2 credentials
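As a minimal illustration of the Code node step, the sketch below joins the individual review items back into one text block for GPT-4o. It assumes each incoming item exposes the review body in a `text` (or `review`) field; Dumpling AI's actual field names may differ, so adapt it to your agent's response shape.

```javascript
// Hypothetical "combine reviews" Code node. Each incoming item is one review
// (produced by the SplitOut node); the review body is assumed to live in `text`.
const reviews = $input.all()
  .map(item => (item.json.text ?? item.json.review ?? '').toString().trim())
  .filter(text => text.length > 0);

// Hand GPT-4o one consolidated block plus the review count for context.
return [
  {
    json: {
      reviewCount: reviews.length,
      combinedReviews: reviews
        .map((text, i) => `Review ${i + 1}: ${text}`)
        .join('\n\n'),
    },
  },
];
```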
by Angel Menendez
# Enhance Query Resolution with the Knowledge Base Tool!

Our KB Tool - Confluence KB is crafted to seamlessly integrate into the IT Ops AI SlackBot Workflow, enhancing the IT support process by enabling sophisticated search and response capabilities via Slack.

## Workflow Functionality

- **Receive Queries**: Directly accepts user queries from the main workflow, initiating a dynamic search process.
- **AI-Powered Query Transformation**: Utilizes OpenAI's models or local AI to refine user queries into searchable keywords that are most likely to retrieve relevant information from the Knowledge Base.
- **Confluence Integration**: Executes searches within Confluence using the refined keywords to find the most applicable articles and information.
- **Deliver Accurate Responses**: Gathers essential details from the Confluence results, including article titles, links, and summaries, preparing them to be sent back to the parent workflow for the final user response.

To view a demo video of this workflow in action, click here.

## Quick Setup Guide

1. Ensure correct configurations are set for the OpenAI and Confluence API integrations.
2. Customize the query transformation logic to match your specific Knowledge Base structure and improve search accuracy.

Need help? Dive into our Documentation or get support from the Community Forum!

Deploy this tool to provide precise and informative responses, significantly boosting the efficiency and reliability of your IT support workflow.
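As an illustration of the Confluence search step, the hedged sketch below queries Confluence Cloud's REST search endpoint with the AI-refined keywords and keeps only the title and link for the SlackBot response. The base URL, credentials, and CQL are placeholders, and Server/Data Center installs use different paths and authentication, so treat this as a starting point rather than the template's exact HTTP Request configuration.

```javascript
// Rough sketch of a keyword search against Confluence Cloud's REST API.
// BASE_URL, EMAIL, and API_TOKEN are placeholders you must replace.
const BASE_URL = 'https://your-domain.atlassian.net/wiki';
const EMAIL = 'bot@example.com';
const API_TOKEN = 'your-api-token';

async function searchConfluence(keywords) {
  // CQL full-text search on page content, limited to a handful of results.
  const cql = `type=page AND text ~ "${keywords}"`;
  const url = `${BASE_URL}/rest/api/content/search?cql=${encodeURIComponent(cql)}&limit=5`;

  const response = await fetch(url, {
    headers: {
      Authorization: 'Basic ' + Buffer.from(`${EMAIL}:${API_TOKEN}`).toString('base64'),
      Accept: 'application/json',
    },
  });
  const data = await response.json();

  // Keep only what the SlackBot needs: title and a link back to the article.
  return (data.results ?? []).map(page => ({
    title: page.title,
    link: `${BASE_URL}${page._links?.webui ?? ''}`,
  }));
}

searchConfluence('vpn access request').then(console.log);
```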
by please-open.it
## Intro

This workflow requires the user to authenticate with an OpenID Connect provider before the webhook can be called. If the user is not authenticated, it starts a login process using the Authorization Code flow with PKCE (https://datatracker.ietf.org/doc/html/rfc7636), a standard way to authenticate users with OpenID Connect. After the user logs in, the webhook is refreshed and reads the user's token from a cookie. With this token, the user's details are requested through the userinfo endpoint on the identity provider.

## How to set up with Keycloak

Keycloak is an open source identity and access management solution. Feel free to get a demo realm at https://please-open.it or get your own Keycloak server up and running.

1. After creating a realm, go to "Realm Settings" and click on "OpenID Endpoint Configuration".
2. Retrieve the authorization_endpoint, token_endpoint, and userinfo_endpoint values, and set those variables in the "Set variables" node.
3. In Keycloak, create a new client (name it as you want). Disable client authentication and check only "standard flow".
4. At the third step, put the webhook URL in "Valid redirect URIs" and fill "Web origins" with a "+".

You're done: open the webhook and it asks you to authenticate.

## Usage

### User information

The userinfo node returns this structure describing the logged-in user:

    [
      {
        "sub": "73a6543f-f420-4fa6-9811-209e903c348b",
        "email_verified": true,
        "preferred_username": "mathieu.passenaud@please-open.it",
        "email": "mathieu.passenaud@please-open.it"
      }
    ]

You can use this information in your workflow for custom operations.

### API calls

The "code" node returns a cookie named "n8n-custom-auth", which holds the access_token returned by the identity provider. This access_token can be used to call APIs connected to the same identity provider (for example, the userinfo API is called with this token).

Example: ask a user to log in with their Google account, then call an API (Gmail, Drive, ...) with their own token.

## How it works

We published a blog post about this flow, how it works, and how you can use it: https://blog.please-open.it/n8n-openid-client/
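For readers unfamiliar with PKCE, the sketch below shows the code_verifier / code_challenge pair that the login step relies on: per RFC 7636, the challenge is the Base64URL-encoded SHA-256 hash of the verifier. This is illustrative Node.js, not a node taken from the workflow itself.

```javascript
// Minimal PKCE helper (RFC 7636): generate a code_verifier and derive the
// S256 code_challenge sent in the authorization request.
const crypto = require('node:crypto');

function base64url(buffer) {
  return buffer
    .toString('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

// 32 random bytes -> a 43-character verifier, within the 43-128 char range the RFC requires.
const codeVerifier = base64url(crypto.randomBytes(32));

// code_challenge = BASE64URL(SHA256(code_verifier)), sent with code_challenge_method=S256.
const codeChallenge = base64url(crypto.createHash('sha256').update(codeVerifier).digest());

console.log({ codeVerifier, codeChallenge });
// The authorization URL includes:
//   ?response_type=code&code_challenge=<codeChallenge>&code_challenge_method=S256&...
// and the token request sends code_verifier=<codeVerifier> so the provider can verify it.
```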
by Aitor | 1Node
## Who is this for?

This template is designed for anyone who wants to integrate MCP with their AI Agents using Airtable. Whether you're a developer, a data analyst, or an automation enthusiast, if you're looking to leverage the power of MCP and Airtable in your n8n workflows, this template is for you.

## What problem is this workflow solving?

This template caters to MCP beginners seeking a hands-on example and to developers looking to integrate the Airtable MCP service. When integrating MCP with Airtable, manually updating AI Agents after changes to Airtable data on the MCP Server is time-consuming and error-prone. This template automates the process, enabling the AI Agent to instantly recognize changes made to Airtable on the MCP Server. In data management, for example, it ensures that record updates or additions in Airtable are automatically detected by the AI Agent. With detailed steps, it simplifies the integration process for all users.

## What this workflow does

This workflow focuses on integrating MCP with Airtable within n8n. Specifically, it allows you to build an MCP Server and Client using Airtable nodes in n8n. Any changes made to the Airtable Base/Table on the MCP Server are automatically recognized by the MCP Client in the workflow. This means that you can make changes to your Airtable (such as adding, deleting, or modifying records) on the MCP Server, and the MCP Client in the n8n workflow will immediately detect these changes without any manual intervention.

## Setup

### Requirements
- An active n8n account.
- Access to the Airtable API.
- A sample base and rows in Airtable that you can use for testing.
- An API key from your preferred LLM to power the AI agent.

### Step-by-step guide
1. **Create a new workflow in n8n**: Log in to your n8n account and create a new workflow.
2. **Add Airtable nodes**: Search for and add the Airtable nodes that you wish the MCP Client to have access to.
3. **Set up the MCP Server and Client**: Use the appropriate nodes in n8n to set up the MCP Server and Client, and connect the Airtable nodes to the MCP nodes as required.
4. **Activate and test the workflow**: Once all credentials have been updated and the table data synced, talk to the chat trigger and try adding rows, deleting rows, or finding and updating cells.

## How to customize this workflow to your needs

- **Modify the triggers:** Change the conditions under which the MCP Client detects changes. For example, you can set it to detect changes only in specific fields or based on certain record values in Airtable.
- **Integrate with other services:** Add more nodes to the workflow to integrate with other services, such as sending notifications to Slack or triggering further actions based on the detected Airtable changes.

## Need help?

Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
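Before wiring the MCP nodes, it can help to confirm your Airtable API access with a quick request. The sketch below lists a few records from a base via Airtable's REST API; the token, base ID, and table name are placeholders you must replace, and the n8n Airtable node performs this call for you once credentials are configured.

```javascript
// Quick Airtable API sanity check (placeholders: token, base ID, table name).
// Airtable's REST API lists records at /v0/<baseId>/<tableName>.
const AIRTABLE_TOKEN = 'pat_xxx'; // personal access token
const BASE_ID = 'appXXXXXXXXXXXXXX';
const TABLE_NAME = 'Table 1';

async function listRecords() {
  const url = `https://api.airtable.com/v0/${BASE_ID}/${encodeURIComponent(TABLE_NAME)}?maxRecords=5`;
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${AIRTABLE_TOKEN}` },
  });
  const data = await response.json();

  // Each record has an id and a fields object keyed by column name.
  for (const record of data.records ?? []) {
    console.log(record.id, record.fields);
  }
}

listRecords();
```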
by Aurélien P.
# 📈 Daily Crypto Market Summary Bot (Binance to Telegram)

This workflow fetches 24h price change data from Binance for selected crypto pairs (BTC/USDC, ETH/USDC, SOL/USDC) every hour using a cron schedule. It performs in-depth analysis, including volatility, volume, bid-ask spread, momentum, and market comparison, then formats a detailed market summary. The final report is sent to a Telegram chat using HTML formatting, highlighting top gainers, losers, and key metrics in a clean, readable layout.

## 🔑 Key Features
- ⏱ Runs every hour (cron: 5 * * * *)
- 🔍 Filters and analyzes major coins: BTC, ETH, SOL
- 📊 Calculates market metrics:
  - Volatility
  - Bid-ask spread
  - Momentum
  - Estimated market cap
  - Market average comparison
- 📈 Highlights gainers, losers, and top coins by volume
- ✂️ Splits messages to fit Telegram's 4096-character limit
- 💬 Sends output in rich HTML format to a Telegram group or chat

## 🎯 Use Cases
- ✅ Crypto traders wanting hourly performance insights
- ✅ Telegram groups needing automated market updates
- ✅ Analysts monitoring key coin metrics in real time
- ✅ Bot developers creating crypto dashboards or alerts

## 🛠 Technical Details
- **Data Source:** Binance 24hr ticker API (/api/v3/ticker/24hr)
- **Coins Monitored:** BTCUSDC, ETHUSDC, SOLUSDC (can be expanded)
- **Metrics Calculated:**
  - Price change percentage
  - Volatility (high vs low price)
  - Bid-ask spread %
  - Momentum (vs weighted average)
  - Estimated market cap
  - Number of trades
  - Market average movement
- **Message Format:**
  - HTML with emojis, bold styling, and section headings
  - Auto-split messages when exceeding Telegram's 4096-char limit
- **Error Handling:**
  - Retry on HTTP failure (up to 5 times with a 5s delay)
  - Message length checked and split for Telegram compatibility

## ⚙️ Setup Requirements
- **Telegram Bot Token** - Create a bot via @BotFather on Telegram
- **Chat ID** - Use a personal ID or group chat ID (add the bot to the group)
- **n8n Instance** - Either cloud or self-hosted
- (Optional) Modify relevantSymbols in the Function node to track different coins

## 🧠 Notes
This workflow is highly customizable: feel free to modify the analytics, tracked pairs, or formatting. It is a great base for alerting systems or crypto dashboards. A sample metric-calculation sketch follows the example output below.

## 📷 Example Output (Telegram)

📊 Crypto Market Summary — 2025-04-20 14:05:05 UTC

🌐 Market Overview (BTC, ETH, SOL)
- Average Change: -1.54%
- 24h Volume: $850,358,765.46
- Most Volatile: SOLUSDC (4.53%)
- Most Liquid: BTCUSDC (0.0000% spread)

💹 Top by Volume
- ETHUSDC: $403,860,356.75 | -1.640%
- SOLUSDC: $279,241,338.60 | -1.706%
- BTCUSDC: $167,257,070.12 | -1.261%

📉 Losers

SOLUSDC 🔻
- Change: -1.71% (24h)
- 💰 Current: $137.10
- 📊 Range: $135.82 - $141.97
- 📈 Volatility: 4.53%
- 🔄 Volume: 2.01M | $279,241,338.60
- ⚖️ Bid-Ask Spread: 0.0073%
- ⬇️ vs Market Avg: -0.17%
- 🔽 Momentum: -1.42%
- 🔢 Trades: 366,119

ETHUSDC 🔻
- Change: -1.64% (24h)
- 💰 Current: $1,577.42
- 📊 Range: $1,565.60 - $1,631.98
- 📈 Volatility: 4.24%
- 🔄 Volume: 252.11K | $403,860,356.75
- ⚖️ Bid-Ask Spread: 0.0044%
- ⬇️ vs Market Avg: -0.10%
- 🔽 Momentum: -1.53%
- 🔢 Trades: 596,801

BTCUSDC 🔻
- Change: -1.26% (24h)
- 💰 Current: $84,336.65
- 📊 Range: $83,963.35 - $85,634.50
- 📈 Volatility: 1.99%
- 🔄 Volume: 1.97K | $167,257,070.12
- ⚖️ Bid-Ask Spread: 0.0000%
- ⭐ vs Market Avg: 0.27%
- 🔽 Momentum: -0.68%
- 🔢 Trades: 124,202
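To show the kind of math behind the metrics above, here is a hedged sketch that fetches Binance's 24hr ticker for one pair and derives volatility, spread, and momentum. The formulas are reasonable interpretations of the metric names, not necessarily the exact ones used in the workflow's Function node.

```javascript
// Sketch: pull the 24hr ticker for one symbol and derive a few of the metrics
// listed above. The exact formulas in the workflow's Function node may differ.
async function summarizeSymbol(symbol = 'BTCUSDC') {
  const res = await fetch(`https://api.binance.com/api/v3/ticker/24hr?symbol=${symbol}`);
  const t = await res.json();

  const last = parseFloat(t.lastPrice);
  const high = parseFloat(t.highPrice);
  const low = parseFloat(t.lowPrice);
  const bid = parseFloat(t.bidPrice);
  const ask = parseFloat(t.askPrice);
  const weightedAvg = parseFloat(t.weightedAvgPrice);

  return {
    symbol,
    changePct: parseFloat(t.priceChangePercent),              // 24h price change %
    volatilityPct: ((high - low) / low) * 100,                // high vs low range
    spreadPct: ((ask - bid) / ask) * 100,                     // bid-ask spread
    momentumPct: ((last - weightedAvg) / weightedAvg) * 100,  // last vs weighted average
    quoteVolume: parseFloat(t.quoteVolume),                   // 24h volume in quote currency
    trades: t.count,                                          // number of trades
  };
}

summarizeSymbol('SOLUSDC').then(console.log);
```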
by Corentin Ribeyre
This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template.

## How it works

This workflow can be divided into five steps:

1. The workflow initiates with a manual trigger (On clicking 'execute').
2. It reads your Google Sheet file.
3. It converts your file to an array.
4. It connects to your Icypeas account.
5. It performs an HTTP request to verify the emails.

## Set up steps

- You will need a formatted Google Sheet file with email addresses.
- You will need a working Icypeas account to run the workflow and get your API Key, API Secret, and User ID.
- You will need email addresses to verify.
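As a rough illustration of the "convert to an array" step, the sketch below collects the email column from the sheet rows into a single array item, ready to be sent in the verification request. It assumes the column is named Email; rename it to match your sheet.

```javascript
// Hypothetical Code node: gather the "Email" column from the Google Sheets rows
// into one array for the Icypeas verification request.
const emails = $input.all()
  .map(item => (item.json.Email ?? '').toString().trim())
  .filter(email => email.includes('@'));

return [
  {
    json: { emails },
  },
];
```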
by Nskha
# n8n Creators Template: Creator Profile Stats Updater

This n8n workflow template is designed to automate the process of updating a creator's profile statistics, including total workflows, complex workflows, approved workflows, pending workflows, total nodes, and total views. It utilizes various nodes to fetch data, process it, and update an SVG file hosted on GitHub to reflect the latest stats.

## Workflow Overview
1. **Schedule Trigger**: Triggers the workflow execution at specified intervals.
2. **Config**: Sets up configuration details like the creator username and colors for text, icons, border, and card.
3. **Get Workflows**: Fetches workflows associated with the creator from the n8n API.
4. **Workflows Data**: Processes the fetched data to calculate various statistics.
5. **Get User**: Fetches user details from the n8n API.
6. **Download Image**: Downloads the creator's profile image.
7. **Extract From File**: Extracts binary data from the downloaded image file.
8. **SVG**: Generates an SVG file with updated stats and a visual representation.
9. **GitHub**: Commits the updated SVG file to the specified GitHub repository.
10. **Final**: Prepares the final data set for further processing or output.
11. **Sticky Note**: Provides a visual note or reminder within the workflow editor.

## Embed & Live Preview

Since the output is an .SVG file, you can host it anywhere and treat it like a normal image, so you can embed it in any site, forum, or page that supports posting images (for example, with a standard markdown image tag pointing at the raw file in your repository), or serve it through a CDN with caching.

## Setup Instructions
1. **GitHub Credentials**: Ensure you have GitHub credentials set up in your n8n instance to allow the workflow to commit changes to your repository.
2. **Configure Trigger**: Adjust the Schedule Trigger node to set the desired execution intervals for the workflow.
3. **Set Configuration**: Customize the Config node with your GitHub username and preferred aesthetic options for the SVG.
4. **Deploy Workflow**: Import the workflow into your n8n instance and deploy it.

## Customization Options
- **Text and Icon Colors**: Customize the colors used in the SVG by modifying the respective fields in the Config node.
- **Profile Image Size**: Adjust the image size in the Download Image node URL if needed.
- **Commit Messages**: Modify the commit messages in the GitHub nodes to suit your version control conventions (the template uses the $now function to include the current time in the message, which always produces a different commit message).

## Requirements
- n8n (self-hosted or cloud version compatible with 2024 releases and up)
- A GitHub account and repository
- Basic understanding of n8n workflow configuration

## Support and Contributions

For support, please refer to the n8n community forum or the official n8n documentation. Contributions are welcome: you're allowed to reuse this workflow and reshare it with edits (such as a new design or colors) under your name.
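To give a sense of the "Workflows Data" step, here is a hedged sketch of how the statistics could be tallied from a list of workflows. The field names (nodes, views, approval state) and the "complex workflow" threshold are assumptions for illustration and are not guaranteed to match the actual n8n API payload or the template's logic.

```javascript
// Hypothetical "Workflows Data" Code node: tally profile statistics from a list
// of creator workflows. Field names here are assumptions about the API response.
const workflows = $input.all().map(item => item.json);

const stats = {
  totalWorkflows: workflows.length,
  // Treat anything with more than 10 nodes as "complex" (arbitrary threshold).
  complexWorkflows: workflows.filter(w => (w.nodes?.length ?? 0) > 10).length,
  approvedWorkflows: workflows.filter(w => w.approved === true).length,
  pendingWorkflows: workflows.filter(w => w.approved !== true).length,
  totalNodes: workflows.reduce((sum, w) => sum + (w.nodes?.length ?? 0), 0),
  totalViews: workflows.reduce((sum, w) => sum + (w.views ?? 0), 0),
};

return [{ json: stats }];
```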
by Hiroshi
## What this workflow does

This workflow in n8n demonstrates how to send a message in Lark using a Lark bot. It begins with a manual trigger and then retrieves the necessary Lark token via a POST request. The token is used to authenticate and send a message to a specific chat using the Lark API. The Input node provides the required app_id, app_secret, chat_id, and message content. After obtaining the token, the message is sent with the Lark API's message/v4/send/ endpoint.

## Who this is for

This n8n workflow is ideal for organizations, teams, and developers who need to automate message sending within Lark, especially those managing notifications, alerts, or team reminders. It can help users reduce manual messaging tasks by leveraging a Lark bot to deliver messages at specific intervals or based on particular conditions, enhancing team communication and responsiveness.

## Setup

1. Fill the Input node with your values.
2. Replace the bearer token in the Send Message node with your token.

Author: Hiroshi
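The two HTTP calls behind this workflow can be sketched as follows: first obtain a tenant access token with the app_id and app_secret, then post a text message to the chat via the message/v4/send/ endpoint mentioned above. The host shown (open.larksuite.com) and the exact payload shapes are assumptions based on Lark's public APIs; Feishu (China) deployments use open.feishu.cn, and your Input node values replace the placeholders.

```javascript
// Sketch of the two Lark API calls (placeholders for app_id, app_secret, chat_id).
// International Lark uses open.larksuite.com; Feishu uses open.feishu.cn.
const BASE = 'https://open.larksuite.com/open-apis';

async function sendLarkMessage(appId, appSecret, chatId, text) {
  // 1) Get a tenant access token for the bot.
  const tokenRes = await fetch(`${BASE}/auth/v3/tenant_access_token/internal`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ app_id: appId, app_secret: appSecret }),
  });
  const { tenant_access_token } = await tokenRes.json();

  // 2) Send a text message to the chat via the message/v4/send/ endpoint.
  const sendRes = await fetch(`${BASE}/message/v4/send/`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${tenant_access_token}`,
    },
    body: JSON.stringify({ chat_id: chatId, msg_type: 'text', content: { text } }),
  });
  return sendRes.json();
}

sendLarkMessage('cli_xxx', 'secret_xxx', 'oc_xxx', 'Hello from n8n!').then(console.log);
```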
by Yaron Been
Automated pipeline to collect and analyze investor data from Crunchbase, tracking investment patterns, funding history, and portfolio companies for market analysis and lead generation.

## 🚀 What It Does
- **Investor Profiling**: Collects comprehensive data on investors and VC firms
- **Investment Pattern Analysis**: Tracks funding history and investment preferences
- **Portfolio Monitoring**: Keeps tabs on investor portfolios and new investments
- **Data Enrichment**: Enhances raw data with additional context and metrics

## 🎯 Perfect For
- Startup founders seeking investors
- Market research analysts
- Investment professionals
- Business development teams
- Competitive intelligence

## ⚙️ Key Benefits
- ✅ Comprehensive investor profiles
- ✅ Real-time investment tracking
- ✅ Market trend analysis
- ✅ Data-driven investment decisions
- ✅ Time-saving automation

## 🔧 What You Need
- Crunchbase API access
- n8n instance
- Storage solution (database or spreadsheet)

## 📊 Data Points Collected
- Investor/firm details
- Investment history
- Portfolio companies
- Funding rounds participated in
- Investment focus areas
- Contact information (when available)

## 🛠️ Setup & Support
- **Quick Setup**: Deploy in 30 minutes with our step-by-step configuration guide
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Transform your investor research with automated data collection and analysis. Spend less time gathering data and more time making strategic decisions.
by Jihene
# AI-Agent Code Review for GitHub Pull Requests

## Description

This n8n workflow automates the process of reviewing code changes in GitHub pull requests using an OpenAI-powered agent. It connects to your GitHub repo, extracts modified files, analyzes diffs, and uses an AI agent to generate a code review based on your internal code best practices (fed from a Google Sheet). It ends by posting the review as a comment on the PR and tagging it with a visual label like ✅ Reviewed by AI.

## 🔧 What It Does
- Triggered on PR creation
- Extracts code diffs from the PR
- Formats and feeds them into an OpenAI prompt
- Enriches the prompt using a Google Sheet of Swift best practices
- Posts an AI-generated review as a comment on the PR
- Applies a PR label to visually mark reviewed PRs

## ✅ Prerequisites

Before deploying this workflow, ensure you have the following:
- An n8n instance (self-hosted or cloud)
- A GitHub repository with PR activity
- An **OpenAI API Key** for GPT-4o, GPT-4-turbo, or GPT-3.5
- A **GitHub OAuth App** (or PAT) connected to n8n to post comments and access PR diffs
- (Optional) Google Sheets API credentials if using the code best practices lookup node

## ⚙️ Setup Instructions

### 1. Import the Workflow
- In n8n, click on Workflows → Import from file or JSON
- Paste or upload the JSON code of this template

### 2. Configure Triggers and Connections

🔁 **GitHub Trigger**
- Node: PR Trigger
- **Repository**: Select the GitHub repo(s) to monitor
- **Events**: Set to pull_request
- **Auth**: Use GitHub OAuth2 credentials

📥 **HTTP Request Node**: Get file's Diffs from PR
- No authentication needed; it uses a dynamic path from the trigger (see the example request at the end of this section)

🧠 **OpenAI Model**
- Node: OpenAI Chat Model
- **Model**: Select gpt-4o, gpt-4-turbo, or gpt-3.5-turbo
- **Credential**: Provide your OpenAI API Key

🧑‍💻 **Code Review Agent**
- Node: Code Review Agent
- Connected to OpenAI and optionally to tools like Google Sheets

💬 **GitHub Comment Poster**
- Uses the GitHub API to post review comments back on the PR
- Node: GitHub Robot
- Credential: Use the agent's GitHub account (OAuth or PAT)
- Repo: Pick your own GitHub repository

🏷️ **PR Labeler (optional)**
- Adds the label ReviewedByAI after a successful comment
- Node: Add Label to PR
- Label: You can customize the label text of your own tag

📊 **Google Sheet Best Practices config (optional)**
- Connects to a Google Sheet for coding guideline lookups; you can replace Google Sheets with another tool or database
- First, prepare your best practices list with a clear description and good/bad code examples
- Add all the best practices to your Google Sheet
- Configure the Code Best Practices node in the template:
  - Credential: Use your Google Sheets account via OAuth2
  - URL: Add your Google Sheet document URL
  - Sheet: Add the name of the best practices sheet
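For context on the "Get file's Diffs from PR" step, the sketch below shows one way to pull a pull request's diff from the GitHub REST API by requesting the diff media type. It is illustrative Node.js with placeholder owner/repo/token values, not the template's exact HTTP Request node configuration, which derives the PR number dynamically from the pull_request trigger payload.

```javascript
// Sketch: fetch the raw diff of a pull request from the GitHub REST API.
// OWNER, REPO, and the token are placeholders for your own values.
const GITHUB_TOKEN = 'ghp_xxx';
const OWNER = 'your-org';
const REPO = 'your-repo';

async function getPullRequestDiff(prNumber) {
  const res = await fetch(`https://api.github.com/repos/${OWNER}/${REPO}/pulls/${prNumber}`, {
    headers: {
      // Requesting the diff media type returns the PR as a unified diff.
      Accept: 'application/vnd.github.diff',
      Authorization: `Bearer ${GITHUB_TOKEN}`,
    },
  });
  return res.text();
}

// Print the first 500 characters of the diff for PR #42 (example number).
getPullRequestDiff(42).then(diff => console.log(diff.slice(0, 500)));
```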