by Oneclick AI Squad
In this guide, we'll walk you through setting up an AI-driven workflow that automatically processes highly-rated food photos from a Google Sheet, generates AI-powered captions, shares them to Pinterest, and updates the sheet to reflect the posts. Ready to automate your food photo sharing? Let's dive in!

## What's the Goal?

- Automatically detect and process highly-rated food photos (4 stars or above) from a Google Sheet.
- Use AI to generate engaging and relevant captions.
- Share the photos with captions to Pinterest via the Pinterest API.
- Update the Google Sheet to mark photos as posted.
- Enable scheduled automation for consistent posting.

By the end, you'll have a self-running system that shares your best food photos effortlessly.

## Why Does It Matter?

Manual photo sharing is time-consuming and inconsistent. Here's why this workflow is a game changer:

- **Zero Human Error**: AI ensures consistent captions and posting accuracy.
- **Time-Saving Automation**: Automatically handle photo sharing, boosting efficiency.
- **Scheduled Posting**: Maintain a regular presence on Pinterest without manual effort.
- **Focus on Creativity**: Free your team from repetitive posting tasks.

Think of it as your tireless social media assistant that keeps your Pinterest feed vibrant.

## How It Works

Here's the step-by-step magic behind the automation:

**Step 1: Trigger the Workflow**
- Detect new photos to post using the Daily Post Scheduler node (e.g., once daily).
- Initiate the workflow at a scheduled time to check for new food photos.

**Step 2: Fetch Food Photos from Sheet**
- Retrieve rows from the Google Sheet that contain food photo metadata such as image URLs, ratings, and status.

**Step 3: Filter 4+ Star Dishes**
- Filter only those food entries with high ratings (4 stars or above) and an unposted status (a minimal filtering sketch appears after the Setup Notes).

**Step 4: AI Caption Generator**
- Use AI (e.g., GPT/OpenAI) to create engaging and relevant captions for the selected food photos.

**Step 5: Upload to Pinterest**
- Automatically post the food photo with the generated caption to Pinterest via the Pinterest API.

**Step 6: Mark as Posted in Sheet**
- Update the Google Sheet to reflect that the photo has been successfully shared.

## How to Use the Workflow?

Importing a workflow in n8n is a straightforward process that lets you use pre-built workflows to save time. Below is a step-by-step guide to importing the Automated Food Photo Sharing workflow in n8n.

### Steps to Import a Workflow in n8n

1. **Obtain the Workflow JSON**
   - Source the workflow: workflows are shared as JSON files or code snippets, e.g., from the n8n community, a colleague, or exported from another n8n instance.
   - Format: ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text.
2. **Access the n8n Workflow Editor**
   - Log in to n8n (via n8n Cloud or a self-hosted instance).
   - Navigate to the Workflows tab in the n8n dashboard.
   - Click Add Workflow to create a blank workflow.
3. **Import the Workflow**
   - Option 1: Import via JSON Code (Clipboard):
     - Click the three dots (⋯) in the top-right corner to open the menu.
     - Select Import from Clipboard.
     - Paste the JSON code into the text box.
     - Click Import to load the workflow.
   - Option 2: Import via JSON File:
     - Click the three dots (⋯) in the top-right corner.
     - Select Import from File.
     - Choose the .json file from your computer.
     - Click Open to import.

## Setup Notes

- **Google Sheet Columns**: Ensure your Google Sheet includes the following columns: Image URL, Rating (numeric, e.g., 1-5), Feedback (text), Pin Title, Pin Description, Destination URL, Board ID, and Status (e.g., "Pending" or "Posted").
- **Google Sheets Credentials**: Configure OAuth2 settings in the Fetch Food Photos node with your Google Sheet ID and credentials.
- **AI Model**: Set up the AI Caption Generator node with OpenAI credentials (e.g., API key).
- **Pinterest API**: Authorize the Upload to Pinterest node with Pinterest API credentials (e.g., Bearer Token) and obtain the Board ID.
- **Scheduling**: Adjust the Daily Post Scheduler node to your preferred posting time (e.g., daily at 9 AM).
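For reference, here is a minimal sketch of what the Step 3 filtering could look like in an n8n Code node. It assumes the sheet columns described in the Setup Notes (Rating, Status); treat the field names as placeholders for your own setup, not the template's exact code.

```javascript
// Hypothetical n8n Code node: keep only 4+ star photos that haven't been posted yet.
// Assumes each incoming item carries the sheet columns described in the Setup Notes.
return $input.all().filter((item) => {
  const rating = Number(item.json.Rating);
  const status = String(item.json.Status || '').toLowerCase();
  return rating >= 4 && status !== 'posted';
});
```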
by Dvir Sharon
# 🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Bright Data

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts competitor data from Google search results for specific keywords and target countries, automatically saving structured data to Google Sheets for competitive analysis and market research.

## 📋 Overview

This workflow provides a professional competitor analysis solution that identifies ranking websites for specific search terms across different countries. Perfect for SEO research, competitive intelligence, market analysis, and content strategy planning. The system uses Bright Data's SERP API for accurate search result extraction and advanced HTML parsing for detailed competitor information.

### Who is this for?

- SEO professionals conducting competitive analysis
- Digital marketers researching market landscapes
- Business analysts studying competitor positioning
- Content strategists analyzing competitor content approaches
- Market researchers tracking competitive intelligence across regions

### What problem is this workflow solving?

- Extracting competitor data from Google search results
- Processing multiple keywords across different countries
- Organizing results in a structured, analyzable format
- Eliminating manual copy-paste work
- Ensuring consistent data collection methodology

### What this workflow does

1. **Manual Trigger**: Starts the workflow execution
2. **Get Keywords from Sheet**: Fetches keywords and target countries from Google Sheets
3. **URL Encode Keywords**: Converts keywords to URL-safe format
4. **Process Keywords in Batches**: Handles multiple keywords sequentially
5. **Fetch Google Search Results**: Uses the Bright Data SERP API to scrape HTML
6. **Extract Competitor Data from HTML**: Parses HTML to extract competitor details
7. **Save Competitor Results to Sheet**: Stores structured data in Google Sheets
8. **Wait to Avoid Rate Limits**: Implements 30-second delays between requests

### Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Keyword | Original search term | digital marketing services |
| Target Country | Geographic target | US |
| websiteName | Domain/company name | hubspot |
| websiteUrl | Complete website URL | https://www.hubspot.com/marketing |
| websiteTitle | Page title from search results | Digital Marketing Software & Tools |
| websiteDescription | Meta description/snippet | Grow your business with HubSpot's digital marketing tools... |

## ⚙️ Setup

### Prerequisites

- n8n instance (self-hosted)
- Google account with Sheets access
- Bright Data account with SERP API access

### Google Sheet Structure

This workflow uses two Google Sheets: one for input keywords and one for outputting competitor data.

**Input Sheet: "Keywords"**

This sheet should contain the keywords and target countries for your search queries.

| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The search term you want to analyze. | digital marketing |
| Country | Text | The 2-letter ISO country code for the target region of the search (e.g., US, GB, DE). | US |

**Output Sheet: "Competitor Results"**

This sheet will be populated automatically by the workflow with the extracted competitor data.
| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The original search term used for the query. | digital marketing services |
| Target Country | Text | The 2-letter ISO country code of the search results. | US |
| websiteName | Text | The name of the website or domain found in the search results. | hubspot |
| websiteUrl | URL | The full URL of the website or page found in the search results. | https://www.hubspot.com/marketing |
| websiteTitle | Text | The title of the page as displayed in the Google search results. | Digital Marketing Software & Tools |
| websiteDescription | Text | The meta description or snippet text displayed under the title in search results. | Grow your business with HubSpot's digital marketing tools... |

### Step-by-Step Setup

1. **Import the Workflow**: Copy JSON → n8n → Workflows → + Add → Import from JSON
2. **Configure Bright Data Credentials**:
   - Credential Type: HTTP Header Auth
   - Header Name: Authorization
   - Header Value: Bearer YOUR_API_TOKEN
3. **Configure Google Sheets**:
   - Create two new Google Sheets as described above: one named "Keywords" (for input) and one named "Competitor Results" (for output).
   - Set up Google Sheets OAuth2 credentials within n8n.
4. **Update Workflow Settings**:
   - Replace placeholders: YOUR_GOOGLE_SHEET_ID (for both input and output sheets), YOUR_BRIGHTDATA_CREDENTIAL_ID.
   - Ensure the correct sheet/tab names are selected in the Google Sheets nodes.
5. **Test & Activate**: Add test data to your "Keywords" sheet → Execute workflow → Verify output in your "Competitor Results" sheet.

## 🛠 How to Customize

- **Add More Data Points**: Modify the JavaScript code in the "Extract Competitor Data from HTML" node to parse and extract additional information from the HTML (see the parsing sketch below).
- **Custom Filtering**: Implement logic to exclude specific domains, filter results by title length, or apply other criteria.
- **Expand Geographic Coverage**: Add more 2-letter ISO country codes to the Bright Data SERP API call to broaden your competitive analysis.
- **Batch Processing**: Adjust the settings in the "Process Keywords in Batches" node to optimize for your Bright Data plan and desired execution speed.
- **Rate Limiting**: Modify the "Wait" node (default: 30 seconds) to increase or decrease the delay between requests based on API limits or performance needs.

## 📊 Use Cases & Examples

- **SEO Competitive Analysis**: Identify top-ranking competitors for your target keywords and analyze their strategies.
- **Market Entry Research**: Understand the competitive landscape in new geographic regions before expanding.
- **Content Strategy Planning**: Analyze competitor page titles and meta descriptions for inspiration and to identify content gaps.
- **International Market Research**: Compare search engine results and competitor positioning across different countries.

## 📈 Performance & Limits

- **Single Keyword**: 30–60 seconds per keyword.
- **Batch of 10 Keywords**: Typically takes 5–10 minutes.
- **Large Lists (50+ Keywords)**: Expect execution times of 30–60 minutes or more, depending on batching and rate limits.
- **Success Rate**: Generally 95%+ for data extraction.
- **Data Accuracy**: Typically 98%+ for extracted fields.
- **API Calls**: 1 Bright Data SERP API call per keyword, plus multiple Google Sheets writes per execution.
- **Rate Limit**: A 30-second delay between requests is recommended to prevent exceeding API limits.
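For orientation, here is a minimal sketch of the kind of parsing the "Extract Competitor Data from HTML" node performs. The regex and field derivations are illustrative assumptions, not the template's exact code; Google's SERP markup changes often, so treat this purely as a starting point.

```javascript
// Hypothetical n8n Code node sketch: pull organic results out of raw SERP HTML.
// Assumes the previous node returns the page HTML in item.json.data.
const html = $input.first().json.data || '';
const results = [];

// Illustrative pattern: an anchor tag whose content includes an <h3> title,
// as in classic Google SERP markup for organic results.
const pattern = /<a href="(https?:\/\/[^"]+)"[^>]*>.*?<h3[^>]*>(.*?)<\/h3>/gs;
let match;
while ((match = pattern.exec(html)) !== null) {
  const url = match[1];
  results.push({
    websiteName: new URL(url).hostname.replace(/^www\./, '').split('.')[0],
    websiteUrl: url,
    websiteTitle: match[2].replace(/<[^>]+>/g, ''), // strip nested tags from the title
  });
}

return results.map((r) => ({ json: r }));
```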
## 🧰 Troubleshooting

- **Bright Data API error**: Double-check your API token, ensure you have sufficient credits, and confirm SERP API access is enabled on your Bright Data account.
- **No keywords found**: Verify the Google Sheet ID and ensure the column headers in your "Keywords" sheet precisely match the specifications (e.g., "Keyword", "Country").
- **Google Sheets permission denied**: Re-authenticate your Google Sheets credentials within n8n and check that the correct sharing settings are applied to your sheets.
- **No results extracted**: Review the JavaScript parsing logic in the "Extract Competitor Data from HTML" node. Also, verify the validity of your keywords and target countries.
- **Loop not processing all keywords**: Check the batch settings in the "Process Keywords in Batches" node and ensure all connections within the loop are correctly configured.

## 🤝 Support & Community

- **n8n Forum**: https://community.n8n.io
- **n8n Docs**: https://docs.n8n.io
- **Bright Data Support**: Access support directly via your Bright Data dashboard.
- **GitHub Issues**: Report bugs or suggest new features on the n8n GitHub repository.

## 🎯 Final Notes

This workflow provides a comprehensive foundation for competitor research and market analysis. Customize it to fit your specific industry needs and competitive intelligence requirements.

Please note that this template uses Community Nodes. Make sure you understand the risks before using community nodes.
by Laura Piraux
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

# Build an AI agent for Notion (with the official Notion MCP server)

## Use case

This template empowers Notion power-users to build their own AI assistant, deeply integrated with their workspace. It solves the constant problem of copy-pasting and context-switching between a separate AI chat and Notion by creating a direct, conversational bridge. Now you can interact with an intelligent agent that can create, retrieve, and update your Notion databases and pages on your behalf, turning your workspace into a truly dynamic productivity hub.

## How it works

When you send a message via the chat interface, the workflow passes it to your chosen AI model. The model, connected to the official Notion tool server, analyzes your request to see if it can be fulfilled by one of its available Notion actions. If it matches a tool, the workflow executes the command using the Notion API (like creating a new page or searching a database), and the AI then confirms in the chat that the action is complete.

## Setup

Prerequisite: This template is for self-hosted n8n instances only, as it requires a community node.

1. Copy this workflow into your self-hosted n8n instance.
2. Install the required community node (n8n-nodes-mcp).
3. Add your credentials for your chosen AI model and the Notion MCP server.
4. Test the workflow by chatting with your new Notion assistant.

## How to adjust it to your needs

- Use whichever AI model you want, and easily compare different AI models.
- Start from this template and provide additional tools to your AI agent to build more powerful workflows.
by Yaron Been
## Workflow Overview

This advanced n8n automation is a powerful channel research and intelligence gathering tool designed to transform raw YouTube channel data into actionable insights. By intelligently connecting multiple APIs and data sources, this workflow:

- **Discovers Channel Metrics**: Automatically retrieves channel statistics, captures detailed performance indicators, and provides comprehensive channel intelligence
- **Performs Deep Analysis**: Extracts recent video performance data, calculates engagement metrics, and aggregates view count insights
- **Uncovers Contact Information**: Attempts to retrieve public email addresses, provides direct outreach opportunities, and enhances lead generation capabilities
- **Seamless Data Logging**: Automatically updates Google Sheets, maintains a live intelligence dashboard, and preserves historical channel data

## Key Benefits

- 🤖 **Full Automation**: Continuous channel intelligence gathering
- 💡 **Smart Analysis**: Comprehensive performance insights
- 📊 **Real-Time Tracking**: Always-updated channel metrics
- 🔍 **Lead Generation**: Direct contact information extraction

## Workflow Architecture

### 🔹 Stage 1: Channel Identification

- **Google Sheets Trigger**: Detects new channel additions
- **YouTube Data API**: Fetches channel statistics
- **Comprehensive Metric Collection**: Subscriber count, total view metrics, channel overview

### 🔹 Stage 2: Video Performance Analysis

- **Recent Video Retrieval**: Fetches the 5 latest uploads
- **View Count Aggregation**: Calculates total recent views and provides an engagement snapshot (a minimal sketch appears at the end of this description)
- **Performance Insights**: Measures content effectiveness

### 🔹 Stage 3: Contact Discovery

- **SerpAPI Integration**: Attempts email extraction
- **Public Contact Information**: Retrieves available email addresses to support outreach and networking

### 🔹 Stage 4: Data Compilation

- **Intelligent Data Formatting**
- **Google Sheets Update**
- **Live Intelligence Dashboard**

## Potential Use Cases

- **Marketing Teams**: Influencer research
- **Sales Professionals**: Lead qualification
- **Content Strategists**: Competitive analysis
- **Recruitment Specialists**: Talent scouting
- **Business Development**: Partnership identification

## Setup Requirements

- **YouTube Data API**: Google Cloud API credentials, configured API access
- **SerpAPI Account**: API key for email extraction, web scraping permissions
- **Google Sheets**: Connected Google account, prepared tracking spreadsheet, appropriate sharing settings
- **n8n Installation**: Cloud or self-hosted instance, workflow configuration, API credential management

## Future Enhancement Suggestions

- 🤖 AI-powered channel scoring
- 📊 Advanced trend analysis
- 🔔 Automated alert system
- 🌐 Multi-platform channel tracking
- 🧠 Machine learning insights generation

## Technical Considerations

- Implement robust error handling
- Use exponential backoff for API calls
- Maintain flexible data extraction strategies
- Ensure compliance with platform terms of service

## Ethical Guidelines

- Respect content creator privacy
- Use data for legitimate research
- Maintain transparent data collection practices
- Provide opt-out mechanisms

## Connect With Me

Ready to unlock YouTube channel insights?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your channel research with intelligent, automated workflows!
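To make Stage 2 concrete, here is a minimal sketch of the view-count aggregation as it might look in an n8n Code node. It assumes the preceding node returns one item per video with a statistics.viewCount field, as the YouTube Data API's videos.list endpoint does; field names may differ in your setup.

```javascript
// Hypothetical n8n Code node: total and average views across the 5 latest uploads.
// Assumes each incoming item is one video with YouTube Data API statistics attached.
const videos = $input.all();
const totalViews = videos.reduce(
  (sum, item) => sum + Number(item.json.statistics?.viewCount || 0),
  0
);

return [
  {
    json: {
      videosAnalyzed: videos.length,
      totalRecentViews: totalViews,
      averageViews: videos.length ? Math.round(totalViews / videos.length) : 0,
    },
  },
];
```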
by Sherlockes
## What this template is made for

I have a personal Telegram channel with a bot in it where I save interesting links that I want to keep or read later. The idea is that n8n takes care of reading the new links added to this channel and sends them, through the corresponding API, to the Hoarder and Readeck installations.

## How it works

- Since the server where n8n runs is not always on, a Schedule Trigger checks every so often whether there is any new content in the Telegram channel where I store the links. This request is made with an HTTP Request node and the Telegram API.
- Next, a Code node filters out everything that is not a hyperlink (a sketch of this step appears at the end of this description).
- At this point, the flow splits into two so that parallel, similar processes run for Hoarder and Readeck:
  - The corresponding API is called to get a list of all the links already saved in that service.
  - A Code node filters the list of hyperlinks previously obtained from Telegram so that only those not already saved in the service continue.
  - Finally, another HTTP Request node uses the service's API to save the link in the corresponding service.

## Configuration instructions

The template uses environment variables that I have declared in the n8n docker-compose.yml file through an external .env file. These are the variables I use:

```
# Telegram Bot Token Sherlink
TG_SHERLINK_BOT_TOKEN=XXXXXXXX:XXXXXXXXXXXXXXXX
# Telegram Channel Id Sherlink
TG_SHERLINK_ID=-XXXXXXXXXXXXX
# Readeck server
READECK_SERVER=http://readeck.midomain.com
READECK_API_KEY=xxxxxxxxxxxxx
# Hoarder server
HOARDER_SERVER=http://hoarder.midomain.com
HOARDER_API_KEY=xxxxxxxxxxxxxx
```

Created with n8n version 1.85.4.
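As a reference, here is a minimal sketch of the hyperlink-filtering Code node described above. It assumes the Telegram Bot API's getUpdates response shape (channel_post messages with entities of type url or text_link); adjust it to however your HTTP Request node returns the data.

```javascript
// Hypothetical n8n Code node: keep only URLs found in new Telegram channel posts.
// Assumes the previous node returned the raw getUpdates payload in item.json.
const updates = $input.first().json.result || [];
const links = [];

for (const update of updates) {
  const post = update.channel_post;
  if (!post || !post.entities) continue;
  for (const entity of post.entities) {
    if (entity.type === 'url') {
      // Plain URLs are substrings of the message text.
      links.push(post.text.substring(entity.offset, entity.offset + entity.length));
    } else if (entity.type === 'text_link') {
      links.push(entity.url);
    }
  }
}

return links.map((url) => ({ json: { url } }));
```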
by LukaszB
# Crypto Price Alert – n8n Workflow

A simple and effective crypto alert system for anyone who wants to stay up to date with coin price changes without refreshing charts all day.

This workflow checks the current price of your chosen cryptocurrency (via CoinGecko) and sends you an alert on Discord if it goes above or below your target range. It's lightweight, easy to set up, and runs on autopilot.

## What the Workflow Does

- Checks the live price of a selected coin using the CoinGecko API.
- Compares it to the max/min prices you define manually.
- Decides whether the price is too high or too low.
- Sends an alert message to Discord depending on the result.

## How It Works

1. The flow is triggered manually or on a schedule (your choice).
2. It pulls the current price of the coin you set.
3. It compares that price with your min and max values.
4. It sends a "high" or "low" message to your Discord webhook.

## Setup Steps

1. Enter your coin ID and price thresholds in the "Set Low and High" node.
2. Paste your Discord webhook URLs in the "Message High" and "Message Low" nodes.
3. Optional: adjust the schedule trigger to run every X minutes/hours.
4. Run once manually to test; it takes under 1 minute.

Full instructions and config tips are in sticky notes inside the workflow.
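For reference, here is a minimal sketch of the price check and comparison, assuming CoinGecko's public simple/price endpoint and illustrative thresholds; the actual template splits this logic across its HTTP Request, Set, and IF nodes.

```javascript
// Hypothetical n8n Code node sketch of the whole check, for illustration only.
// CoinGecko's simple/price endpoint returns e.g. { "bitcoin": { "usd": 67000 } }.
const coinId = 'bitcoin';   // set your coin ID here
const minPrice = 60000;     // illustrative lower threshold
const maxPrice = 70000;     // illustrative upper threshold

const res = await fetch(
  `https://api.coingecko.com/api/v3/simple/price?ids=${coinId}&vs_currencies=usd`
);
const data = await res.json();
const price = data[coinId].usd;

let alert = null;
if (price > maxPrice) alert = `🚀 ${coinId} is above ${maxPrice}: ${price} USD`;
if (price < minPrice) alert = `📉 ${coinId} is below ${minPrice}: ${price} USD`;

return [{ json: { price, alert } }];
```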
by Not Another Marketer
# Your Landing Page is Leaking Sales. Here's How to Fix It in Seconds

Visitors land on your page. But instead of converting, they bounce. Why? Something's broken. Something's missing. But what?

- ❌ Is your CTA too weak?
- ❌ Is your messaging unclear?
- ❌ Is your design creating friction?

You know something is off, but you don't know what. What if you could get an instant, expert-level report on exactly what to fix?

This workflow runs an AI analysis of your landing page and delivers a CRO audit, so you can optimize your landing page.

## Who is This For?

- **SaaS Founders & Startups**: Stop leaving money on the table. Make every visitor count.
- **Marketers & Growth Experts**: Turn landing pages into high-converting assets.
- **E-commerce & Lead Gen Businesses**: More conversions = more revenue.

## How It Works

1. Paste your URL.
2. Get an instant roast + fix list.
3. Implement the changes and watch conversions jump.

The workflow scrapes the URL you input, gets the HTML source code of the landing page, and sends it to an OpenAI AI Agent. The Agent makes a deep analysis, roasts the landing page, and provides 10 Conversion Rate Optimization tips to improve your landing page.

## Setup Guide

- You will need OpenAI credentials with an API key to run the workflow.
- The workflow uses the OpenAI o1 model to deliver the best results. It costs between $0.20 and $0.30 per run.
- You can adjust the prompt to your liking in the AI Agent parameters.
- Once the workflow has completed, select Logs to get a readable version of the output.
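Here is a minimal sketch of the scraping step, assuming a plain HTTP fetch of the page; the template itself uses an HTTP Request node, and the truncation limit below is an illustrative guard against oversized prompts, not part of the original workflow.

```javascript
// Hypothetical n8n Code node: fetch the landing page HTML before handing it to the agent.
const url = $input.first().json.url; // the URL submitted by the user

const res = await fetch(url, {
  headers: { 'User-Agent': 'Mozilla/5.0 (compatible; n8n-cro-audit)' },
});
const html = await res.text();

// Illustrative safeguard: trim very large pages so the prompt stays within model limits.
const MAX_CHARS = 150000;
return [{ json: { url, html: html.slice(0, MAX_CHARS) } }];
```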
by DanielV
This workflow is designed to translate SRT subtitle files from one language to another using Google Translate. The workflow follows these main steps:

1. Accept an SRT file upload and target language selection
2. Extract and parse the SRT file content
3. Split the content into translatable segments
4. Translate each segment using Google Translate
5. Reassemble the translated content into a proper SRT format
6. Return the translated file to the user

You'll need a Google Cloud Console account to access the Translate API.

## Who is this for?

This workflow is designed for content creators, video editors, translators, and anyone who needs to translate subtitle files (.srt) from one language to another. It's particularly useful for those working with international content, educational materials, or preparing videos for global audiences.

## What problem does this workflow solve?

Translating subtitle files manually is time-consuming and error-prone. Professional translation services can be expensive, especially for multiple videos or long content. This workflow automates the translation process while maintaining the proper SRT format, including timestamps and subtitle numbering.

## Setup

1. Set up Google Translate credentials:
   - Create a Google Cloud project and enable the Google Translate API
   - Create OAuth credentials and configure them in the Google Translate node
2. Customize language options:
   - The default workflow includes English (EN) and Japanese (JP) options
   - Add more language options by editing the dropdown field in the "Receive SRT File to Translate" node
   - Use standard language codes that Google Translate supports
3. Add more languages:
   - Edit the form trigger node to include additional language options in the dropdown
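For reference, here is a minimal sketch of how SRT content can be split into translatable segments while keeping the numbering and timestamps intact. This is an illustrative assumption about the parsing step, not the template's exact code.

```javascript
// Hypothetical n8n Code node: parse raw SRT text into one item per subtitle block.
// SRT blocks are separated by blank lines: an index, a timestamp line, then text lines.
const srt = $input.first().json.data; // raw file contents from the previous node

const blocks = srt.replace(/\r\n/g, '\n').trim().split(/\n\n+/);
const segments = blocks.map((block) => {
  const lines = block.split('\n');
  return {
    index: lines[0],                 // subtitle number, preserved as-is
    timing: lines[1],                // e.g. 00:00:01,000 --> 00:00:04,000
    text: lines.slice(2).join('\n'), // only this part gets translated
  };
});

return segments.map((s) => ({ json: s }));
```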
by Jimleuk
Mistral OCR is a super convenient way to parse and extract data from multi-page PDFs or single images using AI. What makes it special and sets it apart from the competition is that Mistral OCR also performs document page splitting and markdown conversion. This reduces the dependencies required for document parsing workflows, where you might otherwise need tools like StirlingPDF.

Read the official documentation on the Mistral OCR API here: https://docs.mistral.ai/capabilities/document/#tag/ocr/operation/ocr_v1_ocr_post

## How it works

- To access Mistral OCR, you'll need to call the Mistral Cloud API via the HTTP Request node.
- Mistral OCR accepts only 2 file types: PDF and image. Here, we make 2 different requests to the Mistral OCR API, one to parse a bank statement PDF and one to parse a screenshot of a bank statement, extracting the tables from each.
- Next, we explore a more secure method of uploading documents to the Mistral OCR API by using Mistral's cloud storage. In example 2, we first store a copy of our documents in Mistral Cloud and then generate a signed URL to retrieve the file before sending it to Mistral OCR. This ensures the file is not publicly accessible and protects it from unauthorised access.
- Finally, another way to use Mistral OCR is via document understanding. This allows you to ask questions about the document rather than extract its contents. In example 3, I demonstrate this use case by asking Mistral Small to tell me how many deposits are shown in the bank statement.

## How to use

Ensure your documents are either publicly accessible for Mistral OCR or upload them to Mistral Cloud. Alternatively, signed URLs from AWS S3 or Cloudflare R2 should also work.

## Requirements

- Mistral Cloud account and API key. You'll also need credit on your account to use Mistral OCR.

## Customising the workflow

- Mistral OCR also works for images such as charts and diagrams, so try using it on financial reports.
- Mistral OCR is even cheaper with batching enabled. This returns your results within 24 hours but at half the price per page.
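As a rough sketch, a raw call to the OCR endpoint might look like the following, based on the request shape in the documentation linked above; double-check the model name and field names against the current docs before relying on them.

```javascript
// Hypothetical sketch of the HTTP Request node's equivalent call to Mistral OCR.
// The endpoint and body shape follow the linked docs; verify before use.
const res = await fetch('https://api.mistral.ai/v1/ocr', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.MISTRAL_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'mistral-ocr-latest',
    document: {
      type: 'document_url',
      document_url: 'https://example.com/bank-statement.pdf', // public or signed URL
    },
  }),
});

const result = await res.json();
// Each page comes back with its content converted to markdown.
console.log(result.pages?.map((p) => p.markdown).join('\n\n'));
```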
by Richard Uren
This template extracts all customers from Shopify using GraphQL and the Shopify Admin API, and syncs them into a Baserow table.

## Setup Notes

- Update the endpoint in the GraphQL node to reflect your Shopify store.
- In Baserow, create a Shopify database with a customer table.
- Create columns in the Baserow customer table for first_name, last_name, and email.

Inserting takes about 1 second per row.
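For reference, a customers query against the Shopify Admin GraphQL API could look something like this minimal sketch; the page size and field selection are illustrative, not necessarily the template's exact query.

```graphql
# Hypothetical query for the GraphQL node: page through customers 50 at a time.
query ($cursor: String) {
  customers(first: 50, after: $cursor) {
    pageInfo {
      hasNextPage
      endCursor
    }
    edges {
      node {
        firstName
        lastName
        email
      }
    }
  }
}
```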
by Jimleuk
This template is for self-hosted n8n instances only.

This n8n template demonstrates how to build a simple SQLite MCP server to perform local database operations as well as use it for business intelligence.

This MCP example is based off an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/sqlite

## How it works

- An MCP Server Trigger is used and connected to 5 tools: 2 Code node tools and 3 Custom Workflow tools.
- The 2 Code node tools use the sqlite3 library and are simple read-only queries, so the Code node tool can be used as-is (see the sketch at the end of this description).
- The 3 Custom Workflow tools are used for select, insert and update queries, as these are operations which require a bit more discretion. Whilst it may be easier to let the agent run raw SQL queries, it's a little safer to allow only parameters instead. The Custom Workflow tool lets us define this restricted schema for tool input, which we use to construct the SQL statement ourselves.
- All 3 Custom Workflow tools trigger the same Execute Workflow trigger in this very template, which has a switch to route the operation to the correct handler.
- Finally, we use our Code nodes to handle the select, insert and update operations. The responses are then sent back to the MCP client.

## How to use

This SQLite MCP server allows any compatible MCP client to manage a SQLite database by supporting select, create and update operations. You will need to have a SQLite database available before you can use this server.

1. Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
2. Try the following queries in your MCP client:
   - "Please create a table to store business insights and add the following..."
   - "What business insights do we have on current retail trends?"
   - "Who has contributed the most business insights in the past week?"

## Requirements

- SQLite for the database.
- MCP client or agent for usage, such as Claude Desktop: https://claude.ai/download

## Customising this workflow

- If the scope of schemas or tables is too open, try restricting it so the MCP server serves a specific purpose for business operations, e.g. confine the querying and editing to HR-only tables before providing access to people in that department.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
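As an illustration of those read-only Code node tools, a minimal sketch might look like the following. It assumes the sqlite3 package is installed and whitelisted via NODE_FUNCTION_ALLOW_EXTERNAL on your self-hosted instance, and the database path is hypothetical.

```javascript
// Hypothetical read-only Code node tool: list the tables in the database.
// Assumes sqlite3 is installed and whitelisted via NODE_FUNCTION_ALLOW_EXTERNAL.
const sqlite3 = require('sqlite3');

const db = new sqlite3.Database('/data/business.db'); // hypothetical path

const rows = await new Promise((resolve, reject) => {
  db.all(
    "SELECT name FROM sqlite_master WHERE type = 'table'",
    (err, result) => (err ? reject(err) : resolve(result))
  );
});
db.close();

return rows.map((row) => ({ json: row }));
```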
by Jimleuk
This n8n template demonstrates how to build your own Github MCP server, personalised to your organisation's repositories, issues and pull requests.

This n8n implementation, though not as fully featured as the official MCP server offered by Github, allows you to control precisely what access and/or functionality is granted to users, which can make MCP use simpler and, in some cases, more secure. The use-case in this template is simply to view and comment on issues within a specific repository, but it can be extended to meet the needs of your team.

This MCP example is based off an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/github

## How it works

- An MCP Server Trigger is used and connected to 3 Custom Workflow tools. We're using Custom Workflow tools because quite a few nodes are required for each task.
- Behind these tools are regular Github nodes, preconfigured with credentials and the targeted repository.
- The "Get Issue Comments" and "Create Issue Comment" tools depend on obtaining an issue number first. The agent should call the "Get Latest Issues" tool for this.

## How to use

This Github MCP server allows any compatible MCP client to view and comment on Github issues. You will need to have a Github account and repository access available before you can use this server.

1. Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop
2. Try the following queries in your MCP client:
   - "Can you get me the latest issues about MCP?"
   - "What is the current progress on Issue 12345?"
   - "Please can you add a comment to Issue 12345 that they should try installing the latest version and see if that works?"

## Requirements

- Github account and repository access. The repository need not be your own, but you'll still need to ensure you have the correct permissions.
- MCP client or agent for usage, such as Claude Desktop: https://claude.ai/download

## Customising this workflow

- Extend this template to interact with pull requests or workflows within your own company's Github repositories. Alternatively, pull in metrics and generate reports for programme managers.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!