by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

📁 Google Drive MCP Workflow – AI-Powered File Management Automation 🚀

🧠 Overview
A secure and intelligent n8n workflow that connects with Google Drive via MCP (Model Context Protocol). Ideal for AI agent tasks, compliance-driven storage, and document automation.

🌟 Key Features

🔒 Built-In Safety
- Backs up files before edits (timestamped)
- Supports rollback using file history
- Validates file size, type, and permissions

📁 Smart Organization
- Automatically converts file types (PDF, DOCX, etc.)
- Moves files to structured folders
- Auto-archives old files based on age or rules

🔄 MCP Integration
- Accepts standardized JSON via webhook
- Real-time execution for AI agents
- Fully customizable input (action, fileId, format, etc.)

✅ AI-Callable MCP Actions
These are the commands AI agents can perform via MCP:
- Download a file (with optional format conversion)
- Upload a new file to Google Drive
- Copy a file for backup
- Move a file to a specific folder
- Archive old or inactive files
- Organize documents into folders
- Convert files to a new format (PDF, DOCX, etc.)
- Retrieve and review file history for rollback

📝 Example Input
```json
{
  "action": "download",
  "fileId": "abc123",
  "folderPath": "/projects/clientA",
  "convertFormat": "pdf"
}
```

🔐 Security & Performance
- OAuth2-secured access to the Google Drive API
- No sensitive data stored in transit
- Real-time audit logs and alerts
- Batch-friendly with built-in rate limiting

📌 Ideal For
- Businesses automating file management
- AI agents retrieving, sorting, converting, or archiving files
- Compliance teams needing file versioning and backups

⚙️ Requirements
- n8n + Google Drive API v3
- MCP server + webhook integration
- Google OAuth2 credentials
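For illustration, a minimal sketch of how an AI agent (or any HTTP client) could submit one of these MCP actions to the workflow's webhook. The webhook URL is a placeholder, and the payload simply mirrors the example input above:

```python
import requests

# Placeholder webhook URL — replace with the production URL of your n8n MCP webhook
WEBHOOK_URL = "https://your-n8n-instance/webhook/google-drive-mcp"

# Same shape as the example input above: action, fileId, folderPath, convertFormat
payload = {
    "action": "download",
    "fileId": "abc123",
    "folderPath": "/projects/clientA",
    "convertFormat": "pdf",
}

# The workflow validates the request, backs up the file if needed, and performs the action
response = requests.post(WEBHOOK_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```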
by Harshil Agrawal
This workflow sends position updates of the ISS every minute to a topic in MQTT using the MQTT node.

- Cron node: Triggers the workflow every minute.
- HTTP Request node: Makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
- Set node: Ensures that only the data set in this node gets passed on to the next nodes in the workflow.
- MQTT node: Sends the data from the previous node to the iss-position topic. If you have created a topic with a different name, you can use that topic instead.
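Outside n8n, the same fetch-and-publish step could look like this minimal Python sketch, using the requests and paho-mqtt libraries. The broker hostname is a placeholder, and the single-satellite endpoint of the same API is used here because it returns the current position as one JSON object:

```python
import json
import requests
import paho.mqtt.publish as publish

# Fetch the current ISS position from the Where The ISS At API
resp = requests.get("https://api.wheretheiss.at/v1/satellites/25544", timeout=10)
resp.raise_for_status()
position = resp.json()

# Keep only the fields we want to pass on, like the Set node does
message = {
    "name": position.get("name"),
    "latitude": position.get("latitude"),
    "longitude": position.get("longitude"),
    "timestamp": position.get("timestamp"),
}

# Publish to the iss-position topic; replace the hostname with your MQTT broker
publish.single("iss-position", payload=json.dumps(message), hostname="mqtt.example.com", port=1883)
```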
by Shahrear
📜 AI-Powered Contract Management Pipeline (Google Drive + VLM Run + Sheets + Calendar + Slack)

⚙️ What This Workflow Does
This workflow automatically extracts, organizes, and tracks legal contract details from documents uploaded to Google Drive. Using VLM Run's Execute Agent, it parses key metadata such as contract ID, parties, dates, and terms, then stores, alerts, and schedules reminders through Google Sheets, Calendar, and Slack.

🧩 Requirements
- **Google Drive OAuth2** for monitoring and downloads
- **VLM Run API credentials** with Execute Agent access
- **Google Sheets OAuth2** for structured record storage
- **Google Calendar OAuth2** for key date reminders
- **Slack API credentials** for team notifications
- A reachable webhook URL (for receiving parsed contract data)

⚡ Quick Setup
1. Configure Google Drive OAuth2 and create an upload folder plus a folder for saving extracted images.
2. Install the verified VLM Run node by searching for VLM Run in the node list, then click Install. Once installed, you can start using it in your workflows.
3. Add VLM Run API credentials for document parsing.
4. Configure Google Sheets and Calendar. For Google Sheets, pick your spreadsheet from the document list (e.g., test), then select the sheet inside it (e.g., Sheet1). Set the operation to Append Row — this will add new contract details as new rows. Turn on Map Each Column Manually and match each contract field (such as Contract ID, Title, Parties, Effective Date, Termination Date) to its corresponding column in your Google Sheet.
5. Configure Slack for notifications.

⚙️ How It Works
1. Monitor Contract Uploads – Watches a target Google Drive folder for new file uploads (PDFs, images, or scans).
2. Download Contract File – Automatically downloads new contracts for AI analysis.
3. VLM Run ContractParser – Sends the file to the VLM Run Execute Agent, which extracts structured contract data, including: contract ID, title, parties (with roles), property address, effective date, termination date, rent, deposit, payment terms, and governing law.
4. Receive Contract Data – The webhook endpoint receives the structured JSON response (an illustrative example of this payload appears at the end of this section).
5. Format Contract Data – Normalizes fields, formats dates, and prepares the record for storage.
6. Save to Expense Database (Google Sheets) – Appends extracted data to a master Google Sheet for centralized contract tracking.
7. Notify via Slack – Posts a concise summary to a Slack channel, showing key contract details for visibility.
8. Create Calendar Events – Automatically schedules Google Calendar events for the effective date, the termination date, and a renewal reminder (60 days before termination).

💡 Why Use This Workflow
Manual contract management is error-prone and time-consuming: key details like renewal dates, payment terms, or termination clauses often get lost in email threads or folders. This workflow ensures:
- **Zero missed deadlines** - automatic Google Calendar reminders keep your team on track.
- **Instant team visibility** - Slack notifications keep legal, finance, and operations aligned.
- **End-to-end automation** - no need for manual parsing, data entry, or follow-ups.

🧠 Perfect For
- Legal teams automating contract intake and tracking
- Real estate or lease management workflows
- Finance or procurement teams needing expiration alerts
- Organizations centralizing contract metadata in Sheets

🛠️ How to Customize
- Modify Extraction Fields: Edit the VLM Run Execute Agent schema to add fields like contract value, payment schedule, department, or contact email.
- Change Storage: Swap Google Sheets for Airtable, Notion, or BigQuery if you manage large datasets or need relational tracking.
- Customize Notifications: Send Slack alerts only for high-value or expiring contracts, and tag relevant teams (e.g., @legal, @finance).
- Add Calendar Events: Auto-create events for reviews or payment milestones using extra date fields.
- Add Approvals or Signatures: Insert a Google Form or Slack approval step, or trigger DocuSign for e-signature automation.

⚠️ Community Node Disclaimer
This workflow uses community nodes (VLM Run) that may need additional permissions and custom setup.
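For reference, a purely hypothetical example of the structured contract data the webhook might receive. The field names follow the extraction list above, but the exact schema depends on how your VLM Run Execute Agent is configured, and all values below are made up for illustration:

```python
# Hypothetical parsed contract payload received by the webhook (illustrative values only)
contract = {
    "contract_id": "LEASE-2024-0042",
    "title": "Commercial Lease Agreement",
    "parties": [
        {"name": "Acme Properties LLC", "role": "Landlord"},
        {"name": "Globex Corp", "role": "Tenant"},
    ],
    "property_address": "123 Main Street, Springfield",
    "effective_date": "2024-07-01",
    "termination_date": "2026-06-30",
    "rent": "2,500 USD/month",
    "deposit": "5,000 USD",
    "payment_terms": "Due on the 1st of each month",
    "governing_law": "State of Illinois",
}

# Downstream steps format the dates, append a row to Google Sheets,
# post a Slack summary, and create calendar events for the key dates.
```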
by Robert Breen
Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing—eliminating manual copy-paste and speeding up reporting. (A reference sketch of the resulting two-tab workbook appears after the contact details below.)

Who's it for
- Teams that publish recurring reports as Excel with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

How it works
1. Manual Trigger starts the run.
2. Code nodes emit JSON rows for each table (e.g., People, Locations).
3. Convert to File nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your names).
4. Merge combines both binaries into a single Excel workbook with multiple tabs.
5. Google Sheets (optional) appends the JSON rows to a live spreadsheet for collaboration.

Setup (only 2 connections)

1️⃣ Connect Google Sheets (OAuth2)
- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
- In the workflow's Google Sheets node, select your Spreadsheet and Worksheet
- https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

2️⃣ Connect Google Drive (OAuth2)
- In n8n → Credentials → New → Google Drive (OAuth2)
- Sign in with the Google account that will store your Excel outputs and allow access
- In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

Customize the workflow
- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

Best practices (per template guidelines)
- **Rename nodes** to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow Sticky Note at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

Sticky Note (copy-paste)
> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

Troubleshooting
- *Missing columns / wrong order:* Use **Manual mapping** in the Google Sheets node
- *Binary not found:* Ensure each **Convert to File** node's binaryPropertyName matches what **Merge** expects
- *Permissions errors:* Re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder

📬 Contact
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
📧 rbreen@ynteractive.com 🔗 https://www.linkedin.com/in/robert-breen-29429625/ 🌐 https://ynteractive.com
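For reference only, a minimal Python sketch (using pandas and openpyxl outside n8n) that produces the same kind of two-tab workbook this workflow assembles via the Code → Convert to File → Merge pattern. The People/Locations rows are illustrative sample data, not the workflow's actual arrays:

```python
import pandas as pd

# Illustrative sample rows, standing in for what the two Code nodes emit
people = pd.DataFrame([
    {"name": "Alice", "role": "Analyst"},
    {"name": "Bob", "role": "Manager"},
])
locations = pd.DataFrame([
    {"city": "Berlin", "country": "Germany"},
    {"city": "Austin", "country": "USA"},
])

# Write both tables into one workbook, each on its own tab
# (the equivalent of merging two Excel binaries into a single .xlsx)
with pd.ExcelWriter("report.xlsx", engine="openpyxl") as writer:
    people.to_excel(writer, sheet_name="People", index=False)
    locations.to_excel(writer, sheet_name="Locations", index=False)
```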
by Raquel Giugliano
This workflow automates currency rate uploads into SAP Business One via the Service Layer, using flexible input sources such as JSON (API), SQL Server, Google Sheets, or manual values. It leverages logic branching, AI validation, and logging for complete control and traceability.

++⚙️ HOW IT WORKS++

🔹 1. Receive Data via Webhook
The workflow listens on the endpoint /formulario-datos via HTTP POST. The request body should include:
- origen: one of JSON, SQL, GoogleSheets, or Manual
Depending on the value, the flow branches accordingly.

🔹 2. Authenticate with SAP Business One
A POST request is sent to SAP B1's Login endpoint. A session cookie (B1SESSION) is retrieved and used in all subsequent API calls.

🔹 3. Switch by Origin
The flow branches into four processing paths based on origen:
- JSON: The payload is normalized using OpenAI to extract an array of rates. Each rate is sent to SAP individually after parsing.
- SQL: The SQL query provided in the payload is executed on a connected Microsoft SQL Server. The results are checked by AI to validate the date format. If valid, rates are sent to SAP.
- GoogleSheets: Rates are pulled from a connected spreadsheet. Each entry is sent to SAP in sequence.
- Manual: Uses currency, rate, and rateDate directly from the webhook payload and sends the result directly to SAP.

🔹 4. AI-Powered Enhancements (Optional but enabled)
- Normalize JSON: Uses OpenAI (LangChain node) to convert any messy structure into a uniform array under the key rate.
- Date Formatting: Another OpenAI call ensures RateDate is in yyyyMMdd format (required by SAP), converting from ISO, timestamp, or other formats.

🔹 5. Send to SAP Business One (Service Layer)
All paths send a POST request to /SBOBobService_SetCurrencyRate with a payload such as:
```json
{
  "Currency": "USD",
  "Rate": "0.92",
  "RateDate": "20250612"
}
```

🔹 6. Log Results
All success/failure results are appended to a Google Sheets log (LOGS_N8N). The log includes method, URL, sent payload, status code, and message.

++🛠 SETUP STEPS++

1️⃣ Create Required Credentials
Go to Credentials > + New Credential and configure:
- SAP Business One (Service Layer): Type: HTTP Request Auth or Token. Base URL: https://<your-host>:50000/b1s/v1/. Provide Username, Password, and CompanyDB via variables or fields.
- Google Sheets: OAuth2 connection to a Google account with access.
- Microsoft SQL Server: SQL login credentials and host.
- OpenAI: API key with access to models like GPT-4o.

2️⃣ Environment Variables (Recommended)
Set these variables in n8n → Settings → Variables:
SAP_URL=https://<host>:50000/b1s/v1/
SAP_USER=your_username
SAP_PASSWORD=your_password
SAP_COMPANY_DB=your_companyDB

3️⃣ Prepare Google Sheets
- Sheet 1: RATE (for loading the data). Columns: Currency, Rate, RateDate
- Sheet 2: LOGS_N8N (for saving the logs, success or failure). Columns: workflow, method, url, json, status_code, message

4️⃣ Activate and Test
Deploy the webhook and grab the URL.

++✅ BONUS++
- Built-in AI assistance for input validation and structure
- Logs all results for compliance and audit
- Flexible integration paths: perfect for hybrid or transitional systems
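As an illustration of steps 2 and 5 outside n8n, a minimal Python sketch that logs in to the Service Layer and posts one rate. The host, credentials, and company DB are placeholders (use the same values as the SAP_* variables above), and TLS verification is disabled only for the sketch:

```python
import requests

# Placeholders — mirror the SAP_* environment variables described above
BASE_URL = "https://your-host:50000/b1s/v1"
LOGIN_PAYLOAD = {"CompanyDB": "your_companyDB", "UserName": "your_username", "Password": "your_password"}

session = requests.Session()
session.verify = False  # many SAP B1 installs use self-signed certs; adjust for your TLS setup

# Step 2: authenticate; the B1SESSION cookie is stored on the session automatically
login = session.post(f"{BASE_URL}/Login", json=LOGIN_PAYLOAD)
login.raise_for_status()

# Step 5: post a single currency rate (RateDate must be yyyyMMdd)
rate = {"Currency": "USD", "Rate": "0.92", "RateDate": "20250612"}
resp = session.post(f"{BASE_URL}/SBOBobService_SetCurrencyRate", json=rate)
resp.raise_for_status()
print(resp.status_code)
```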
by Md. Nazmul Islam
AI-Powered MCQ Quiz Generator from YouTube Videos

Transform any YouTube video into an interactive MCQ quiz automatically! This workflow uses Google Gemini AI to analyze video content and generate comprehensive multiple-choice questions with automatic grading - perfect for educators, trainers, and content creators.

Who Is This For
This workflow is perfect for:
- **Educators** creating quizzes from educational YouTube content
- **Corporate Trainers** developing assessments from training videos
- **Content Creators** engaging their audience with interactive quizzes
- **Students** testing their knowledge on video lectures
- **Online Course Creators** building assessments from video content

Features
- **AI Video Analysis**: Google Gemini 2.5 Flash analyzes entire YouTube videos (up to 50 minutes)
- **Dynamic Question Generation**: Creates up to 90 MCQ questions with 3 options each
- **Automatic Form Creation**: Generates Google Forms with quiz functionality
- **Smart Grading**: Built-in correct-answer identification and scoring
- **Error Handling**: Robust error management with user feedback

How It Works
1. User Input via n8n Web Form: Form Name (quiz title), email address, YouTube video URL, and number of questions (1-90).
2. AI Processing Pipeline: Google Gemini analyzes the YouTube video content, the AI extracts key concepts and generates relevant questions, and a structured output parser formats the questions into JSON (an illustrative example of this format appears at the end of this section).
3. Google Forms Integration: Automatically creates a new Google Form, adds all generated questions with multiple-choice options, and configures quiz settings with correct answers and scoring.
4. Completion & Access: The user receives a direct link to the generated quiz, ready for immediate use or sharing.

Video Demo: See this YouTube video to explore how it works.

Set Up Steps
1. Import the Workflow
   - Create a new workflow in n8n
   - Import the JSON file by clicking the three dots (upper right corner) > "Import from file..."
2. Configure Google Gemini API
   - Get your API key from Google AI Studio
   - In the "HTTP Request to Gemini" node, replace "API_KEY" in the URL with your API key
   - Create a "Google Gemini (PaLM) API" credential in n8n and add your API key to it
   - Connect the credential to the "Google Gemini Chat Model" node
3. Set Up Google Forms Integration
   - Enable the Google Forms API in Google Cloud Console
   - Create a "Google OAuth2 API" credential in n8n and authorize it with Forms permissions
   - Connect the credential to both HTTP Request nodes (the "Create a Google Form" node and the "Create MCQ Quizzes" node)
4. Configure Form Trigger
   - The workflow includes a built-in form trigger; no additional setup is needed, as the form URL is generated automatically
   - Customize the form fields if needed in the "Input YouTube URL" node
5. Test the Workflow
   - Activate the workflow
   - Submit the form to generate a test quiz
   - Verify the Google Form is created successfully

Pre-requisites
- **Necessary Accounts**: Google Account (for Forms API access), Google AI Studio account (for Gemini API access), n8n instance (cloud or self-hosted)
- **API Access**: Google Forms API enabled, Google Drive API enabled, Google Generative AI API access, valid API keys and OAuth credentials
- **n8n Requirements**: n8n version 1.95.2 or higher, LangChain nodes package installed, internet access for API calls

Customization Guidance
Question Generation Prompts:
- Modify the prompt in the "Set Prompt and model" node for different question styles
- Adjust difficulty levels or focus areas
- Change the question format (True/False, fill-in-the-blanks, etc.)
Form Customization:
- Update the form title and description templates
- Add additional input fields (difficulty level, subject area)
- Customize success/error messages

Advanced Features You Can Add:
- Email Notifications: Send quiz links via email
- Analytics Integration: Track quiz performance and completion rates
- Multi-language Support: Generate quizzes in different languages
- Question Bank Storage: Save generated questions to a database
- Batch Processing: Generate multiple quizzes from a YouTube playlist

Error Handling Enhancements:
- Add retry logic for API failures
- Implement fallback question generation
- Create detailed error logging

Technical Specifications
- **Video Length**: Up to 50 minutes supported
- **Question Limit**: 1-90 questions per quiz
- **Processing Time**: 2-10 minutes depending on video length
- **Supported Formats**: YouTube videos (public and unlisted)
- **Output Format**: Google Forms with automatic grading

Limitations & Considerations
- The YouTube video must be publicly accessible or unlisted
- Processing time increases with video length and question count
- API rate limits may apply for high-volume usage
- Some complex visual content may not be fully analyzed

Ready to Transform Videos into Quizzes?
This workflow streamlines the entire process from video analysis to quiz deployment. Perfect for educators and trainers looking to create engaging assessments from video content quickly and efficiently.
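For orientation, a hypothetical example of the parsed question format the structured output parser could produce before the questions are written into the Google Form. The exact field names depend on the parser schema configured in the workflow, so treat this purely as an illustration:

```python
# Hypothetical parsed output for one generated question (three options, one marked correct)
question = {
    "question": "What is the main topic covered in the first section of the video?",
    "options": [
        "Setting up the development environment",
        "Deploying to production",
        "Writing unit tests",
    ],
    "correct_answer": "Setting up the development environment",
    "points": 1,
}
```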
by CustomJS
This n8n template shows how to extract selected pages from a generated PDF with the PDF Toolkit by www.customjs.space.

@custom-js/n8n-nodes-pdf-toolkit

Notice
Community nodes can only be installed on self-hosted instances of n8n.

What this workflow does
- **Downloads** each PDF using an HTTP Request.
- **Extracts** pages from the PDF file as needed.

Requirements
- **Self-hosted** n8n instance
- **CustomJS API key** for extracting PDF pages
- **PDF files** whose pages you want to extract

Workflow Steps
1. Manual Trigger: Runs with user interaction.
2. Download PDF File: Pass URLs for the PDF files to process.
3. Extract Pages from PDF: Extracts the selected pages from the downloaded PDF.

Usage
Get an API key from CustomJS:
- Sign up on the CustomJS platform.
- Navigate to your profile page.
- Press the "Show" button to get your API key.
Set credentials for the CustomJS API in n8n:
- Copy and paste the API key generated from CustomJS.
Design the workflow:
- A Manual Trigger starts the workflow.
- HTTP Request nodes download the PDF files.
- Extract Pages from PDF extracts the pages you need.
You can replace the logic for triggering and returning results. For example, you can trigger this workflow by calling a webhook and return the result as the webhook response. Simply replace the Manual Trigger and Write to Disk nodes.

Perfect for
- Pulling out specific pages from PDF files.
- Splitting a PDF file into multiple parts.
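The community node handles all of this inside n8n; purely as a point of comparison, here is a minimal Python sketch of the same download-and-extract idea using requests and pypdf. The URL and page indices are placeholders:

```python
import io
import requests
from pypdf import PdfReader, PdfWriter

# Placeholder URL — any publicly reachable PDF works for testing
pdf_bytes = requests.get("https://example.com/report.pdf", timeout=30).content

reader = PdfReader(io.BytesIO(pdf_bytes))
writer = PdfWriter()

# Keep only selected pages (0-based indices), e.g. the first and third pages
for index in [0, 2]:
    writer.add_page(reader.pages[index])

with open("extracted.pdf", "wb") as output:
    writer.write(output)
```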
by Miquel Colomer
Do you want to avoid bounces in your email marketing campaigns? This workflow verifies emails using the uProc.io email verifier.

You need to add your uProc credentials (the Email and API Key found in the Integration section) to n8n. The "Create Email Item" node can be replaced by any other supported service that provides an email value, like Mailchimp, Calendly, MySQL, or Typeform.

The "uProc" node returns a status per checked email (deliverable, undeliverable, spamtrap, softbounce, ...). The "If" node checks whether the "deliverable" status exists. If it is not present, you can mark the email as invalid to discard bounces. If the "deliverable" status is present, you can use the email in your email marketing campaigns.

If you need detailed indicators for any email, you can use the tool "Communication" > "Check Email Exists (Extended)" to get advanced information.
by bswlife
Disclaimer
The Execute Command node is only supported on self-hosted (local) instances of n8n.

Introduction
KOKORO TTS - Kokoro TTS is a compact yet powerful text-to-speech model, currently available on Hugging Face and GitHub. Despite its modest size—trained on less than 100 hours of audio—it delivers impressive results, consistently topping the TTS leaderboard on Hugging Face. Unlike larger systems, Kokoro TTS offers the advantage of running locally, even on devices without GPUs, making it accessible for a wide range of users.

Who will benefit from this integration?
This will be useful for video bloggers and TikTokers, and it will also enable the creation of a free voice chatbot. Currently, TTS models are mostly paid, but this integration allows for fully free voice generation. The possibilities are limited only by your imagination.

Note
Unfortunately, we can't interact with the KOKORO API via a browser URL (GET/POST), but we can run a Python script through n8n and pass any variables to it. The tutorial uses the D drive, but you can adapt the paths, including to the C drive.

Step 1
You need to have Python installed (link). Also, download and extract the portable version of KOKORO from GitHub. Create a file named voicegen.py with the following code in the KOKORO folder (C:\KOKORO). As you can see, the output path is D:\output.mp3.

```python
import sys
import shutil
from gradio_client import Client

# Set UTF-8 encoding for stdout
sys.stdout.reconfigure(encoding='utf-8')

# Get arguments from the command line
text = sys.argv[1]          # First argument: input text
voice = sys.argv[2]         # Second argument: voice
speed = float(sys.argv[3])  # Third argument: speed (converted to float)

print(f"Received text: {text}")
print(f"Voice: {voice}")
print(f"Speed: {speed}")

# Connect to the local Gradio server
client = Client("http://localhost:7860/")

# Generate speech using the API
result = client.predict(
    text=text,
    voice=voice,
    speed=speed,
    api_name="/generate_speech"
)

# Define the output path
output_path = r"D:\output.mp3"

# Move the generated file
shutil.move(result[1], output_path)

# Print the output path
print(output_path)
```

Step 2
Go to n8n and create the following workflow.

Step 3
Edit Fields node:
```json
{
  "voice": "af_sarah",
  "text": "Hello world!"
}
```

Step 4
We'll need an Execute Command node with the command:
python C:\KOKORO\voicegen.py "{{ $json.text }}" "{{ $json.voice }}" 1

Step 5
The script is already working, but to listen to the result, you can connect a Binary module pointing to the generated MP3 file at D:/output.mp3.

Step 6
Click "Test workflow" and enjoy the result. There are more voices and accents than in ChatGPT, plus it's free.

P.S. If you want, there is a detailed tutorial on my blog.
by Omar Akoudad
This n8n workflow helps eCommerce businesses (especially in the Cash on Delivery space) send real-time order events to the Meta (Facebook) Conversions API, ensuring accurate event tracking and better ad attribution.

Features
- **Webhook Listener**: Accepts incoming order data (name, phone, IP, user-agent, etc.) via HTTP POST/GET.
- **Data Normalization**: Cleans and formats first_name, last_name, phone, and event_time according to Facebook's strict specs.
- **Data Hashing**: Securely hashes sensitive user data (SHA256), as required by Meta.
- **Full Custom Data Support**: Pass order value, currency, and more.

Ideal For
- Shopify, WooCommerce, and custom stores (Laravel, Node, etc.)
- Businesses using Meta Ads that need high-quality server-side tracking
- Teams without access to full dev resources, but using n8n for automation

How It Works
1. Receive Order from your store via webhook or API.
2. Format & Normalize fields to match Facebook's expected structure.
3. Hash Sensitive Fields using SHA256 (name, phone, email).
4. Send to Facebook via the Conversions API endpoint.

Requirements
- A Meta Business Manager account with Conversions API access
- Your Access Token and Pixel ID set up in n8n credentials
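To make the normalization and hashing steps concrete, a minimal Python sketch of the same idea: trim and lowercase the fields, SHA-256 hash them, then post the event to the Conversions API. The pixel ID, access token, API version, and order values are placeholders, and the field handling is a simplified illustration rather than the workflow's exact code:

```python
import hashlib
import time
import requests

PIXEL_ID = "YOUR_PIXEL_ID"          # placeholder
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

def sha256(value: str) -> str:
    # Meta expects normalized (trimmed, lowercased) values hashed with SHA-256
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

# Illustrative order data, as it might arrive from the store webhook
order = {"first_name": "Jane", "last_name": "Doe", "phone": "212600000000",
         "ip": "203.0.113.7", "user_agent": "Mozilla/5.0", "value": 49.9, "currency": "MAD"}

event = {
    "data": [{
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "action_source": "website",
        "user_data": {
            "fn": [sha256(order["first_name"])],
            "ln": [sha256(order["last_name"])],
            "ph": [sha256(order["phone"])],          # digits only, with country code
            "client_ip_address": order["ip"],        # not hashed
            "client_user_agent": order["user_agent"],  # not hashed
        },
        "custom_data": {"currency": order["currency"], "value": order["value"]},
    }]
}

resp = requests.post(
    f"https://graph.facebook.com/v19.0/{PIXEL_ID}/events",
    params={"access_token": ACCESS_TOKEN},
    json=event,
    timeout=30,
)
print(resp.status_code, resp.json())
```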
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks customer satisfaction scores across multiple platforms and surveys to help improve customer experience and identify areas for enhancement. It saves you time by eliminating the need to manually check different feedback sources and provides comprehensive satisfaction analytics.

Overview
This workflow automatically scrapes customer satisfaction surveys, review platforms, and feedback forms to extract satisfaction scores and sentiment data. It uses Bright Data to access various feedback platforms without being blocked and AI to intelligently analyze satisfaction trends and identify improvement opportunities.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping satisfaction surveys and review platforms without being blocked
- **OpenAI**: AI agent for intelligent satisfaction analysis and trend identification
- **Google Sheets**: For storing satisfaction scores and generating analytics reports

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your satisfaction tracking spreadsheet
5. Customize: Define the feedback sources and satisfaction metrics you want to monitor

Use Cases
- **Customer Experience**: Monitor satisfaction trends across all customer touchpoints
- **Product Teams**: Identify product features that impact customer satisfaction
- **Support Teams**: Track satisfaction scores for support interactions
- **Management**: Get comprehensive satisfaction reporting for strategic decisions

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #customersatisfaction #satisfactionscores #brightdata #webscraping #customerexperience #n8nworkflow #workflow #nocode #satisfactiontracking #csat #nps #customeranalytics #feedbackanalysis #customerinsights #satisfactionmonitoring #experiencemanagement #customermetrics #satisfactionsurveys #feedbackautomation #customerfeedback #satisfactiondata #customerjourney #experienceanalytics #satisfactionreporting #customersentiment #experienceoptimization #satisfactiontrends #customervoice
by Nskha
Overview
The [n8n] YouTube Channel Advanced RSS Feeds Generator workflow facilitates the generation of various RSS feed formats for YouTube channels without requiring API access or administrative permissions. It utilizes third-party services to extract data, making it extremely user-friendly and accessible.

Key Use Cases and Benefits
- **Content Aggregation**: Easily gather and syndicate content from any public YouTube channel.
- **No API Key Required**: Avoid the complexities and limitations of Google's API.
- **Multiple Formats**: Supports ATOM, JSON, MRSS, Plaintext, Sfeed, and direct YouTube XML feeds.
- **Flexibility**: Input can be a YouTube channel or video URL, ID, or username.

Services/APIs Utilized
This workflow integrates with:
- **commentpicker.com**: For retrieving YouTube channel IDs.
- **rss-bridge.org**: To generate various RSS formats.

Configuration Instructions
1. Start the Workflow: Activate the workflow in your n8n instance.
2. Input Details: Enter the YouTube channel or video URL, ID, or username via the provided form trigger.
3. Run the Workflow: Execute the workflow to receive links to 13 different RSS feeds, including community and video content feeds.

Additional Notes
- **Customization**: You can modify the RSS feed formats or integrate additional services as needed.

Support and Contributions
For support, questions, or contributions, please visit the n8n community forum or the GitHub repository. We welcome contributions from the community!
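As a small illustration of the "direct YouTube XML feed" output this workflow links to, a minimal Python sketch that builds and fetches the standard public feed URL for a channel ID. The channel ID below is a placeholder; in the workflow itself the ID is resolved for you via commentpicker.com:

```python
import requests

# Placeholder channel ID — the workflow resolves this from a URL, ID, or username
channel_id = "UCxxxxxxxxxxxxxxxxxxxxxx"

# YouTube publishes a public Atom/XML feed per channel; no API key is needed
feed_url = f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

resp = requests.get(feed_url, timeout=15)
resp.raise_for_status()
print(resp.text[:500])  # first part of the XML feed
```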