by Aaron Smusz
This workflow automatically manages Adobe Acrobat Sign signatures, responding with "intent" to Acrobat Sign webhooks.

Prerequisites
- An Adobe Acrobat Sign account and a configured Sign webhook
- Basic knowledge of JavaScript

Nodes
- Webhook node triggers the workflow on new sign intents on a document
- Respond to Webhook node sets the response headers
- Function node processes the data returned by the previous node (sketched below)
- Set node sets the required values
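For orientation, here is a minimal sketch of what the Function node's logic might look like. The payload field names below follow Acrobat Sign's documented webhook event shape but should be treated as assumptions and checked against the events your webhook actually subscribes to. Note that Acrobat Sign webhook verification expects the `X-AdobeSign-ClientId` request header to be echoed back in the response, which is what the Respond to Webhook node's header settings handle.

```javascript
// Minimal sketch of the Function node, assuming Acrobat Sign's standard
// webhook payload shape (event, agreement.id, agreement.status).
// Field names are assumptions -- verify against your webhook scope.
const body = $json.body || $json;

return [{
  json: {
    event: body.event,                 // e.g. "AGREEMENT_ACTION_COMPLETED"
    agreementId: body.agreement?.id,
    agreementName: body.agreement?.name,
    status: body.agreement?.status,
    receivedAt: new Date().toISOString(),
  },
}];
```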
by Shelly-Ann Davy
AI Contact Enrichment

📋 Template Description

Overview
Automatically enhance and enrich contact data using AI to fill in missing information, generate insights, and create detailed buyer personas. Supports multiple AI providers (OpenAI, Anthropic, etc.) with automatic logging to Supabase.

Description
This workflow transforms incomplete contact records into rich, actionable profiles. By leveraging AI, it can infer job roles, company information, likely pain points, communication preferences, and buying motivations from minimal input data. Perfect for sales and marketing teams looking to improve data quality and personalize outreach.

Key Benefits:
- **Smart Data Completion**: Fill in missing contact fields using AI inference
- **Buyer Persona Generation**: Create detailed profiles from basic information
- **Universal AI Support**: Works with OpenAI, Anthropic Claude, or custom providers
- **CRM Enhancement**: Automatically enrich contacts as they enter your system
- **Lead Qualification**: Assess lead quality and fit based on enriched data
- **Personalization Engine**: Generate insights for tailored outreach
- **Data Quality**: Maintain clean, complete contact records

Use Cases:
- Sales prospecting and lead enrichment
- Marketing persona development
- CRM data cleansing and completion
- Account-based marketing (ABM) research
- Lead scoring and qualification
- Personalized email campaign preparation
- Contact segmentation and targeting

⚙️ Setup Instructions

Prerequisites
- n8n instance (cloud or self-hosted)
- AI provider account (OpenAI, Anthropic, or custom)
- Supabase account with database access

Step 1: Configure Environment Variables
Add these to your n8n environment settings:

```
AI_PROVIDER=openai        # or 'anthropic', 'custom'
AI_API_KEY=your_api_key_here
AI_MODEL=gpt-3.5-turbo    # or 'gpt-4', 'claude-3-sonnet-20240229'
AI_ENDPOINT=              # Only for custom providers
```

Recommended Models:
- **Cost-effective**: gpt-3.5-turbo (fast, affordable, good for basic enrichment)
- **High-quality**: gpt-4 or claude-3-sonnet-20240229 (better inference, deeper insights)
- **Premium**: claude-3-opus-20240229 (best for complex persona generation)

How to set environment variables:
- **n8n Cloud**: Go to Settings → Environment Variables
- **Self-hosted**: Add to your .env file or docker-compose configuration

Step 2: Set Up Supabase Database
Create the logging table in your Supabase database:

```sql
CREATE TABLE workflow_logs (
  id BIGSERIAL PRIMARY KEY,
  workflow_name TEXT NOT NULL,
  data JSONB NOT NULL,
  ai_response JSONB NOT NULL,
  created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

CREATE INDEX idx_workflow_logs_created_at ON workflow_logs(created_at);
CREATE INDEX idx_workflow_logs_workflow_name ON workflow_logs(workflow_name);

-- Optional: Create a view for enriched contacts
CREATE VIEW enriched_contacts AS
SELECT
  id,
  data->>'email' AS email,
  data->>'name' AS name,
  data->>'company' AS company,
  ai_response AS enrichment_data,
  created_at
FROM workflow_logs
WHERE workflow_name = 'AI Contact Enrichment'
ORDER BY created_at DESC;
```

To run this SQL:
1. Open your Supabase project dashboard
2. Go to the SQL Editor
3. Paste the SQL above and click "Run"

Step 3: Configure Supabase Credentials in n8n
1. Go to Settings → Credentials
2. Click Add Credential → Supabase API
3. Enter your Supabase URL and API key (found in Project Settings → API)
4. Name it Supabase API
5. Click Save

Step 4: Activate the Webhook
1. Import this workflow into n8n
2. Click the Activate toggle in the top-right corner
3. Click on the "Webhook Trigger" node
4. Copy the Production URL (this is your webhook endpoint)
5. Save this URL for integration with your applications
Step 5: Test the Workflow
Send a test POST request to the webhook:

```bash
curl -X POST https://your-n8n-instance.com/webhook/contact-enrichment \
  -H "Content-Type: application/json" \
  -d '{
    "email": "john.doe@acmecorp.com",
    "name": "John Doe",
    "company": "Acme Corporation",
    "linkedin_url": "https://linkedin.com/in/johndoe"
  }'
```

Successful Response:

```json
{
  "success": true,
  "workflow": "AI Contact Enrichment",
  "timestamp": "2025-01-14T12:00:00.000Z"
}
```

📥 Expected Payload Format
The webhook accepts JSON with basic contact information:

Minimal Input

```json
{
  "email": "string (required or name required)",
  "name": "string (required or email required)"
}
```

Recommended Input

```json
{
  "email": "string",
  "name": "string",
  "company": "string",
  "job_title": "string",
  "linkedin_url": "string",
  "phone": "string",
  "location": "string",
  "website": "string"
}
```

Complete Input Example

```json
{
  "email": "sarah.chen@techstartup.io",
  "name": "Sarah Chen",
  "company": "TechStartup Inc.",
  "job_title": "VP of Marketing",
  "linkedin_url": "https://linkedin.com/in/sarahchen",
  "phone": "+1-555-0123",
  "location": "San Francisco, CA",
  "website": "https://techstartup.io",
  "industry": "B2B SaaS",
  "company_size": "50-200 employees",
  "notes": "Met at SaaS conference 2024"
}
```

Field Guidelines:
- At minimum, provide either email or name
- More input fields = better AI enrichment quality
- Include linkedin_url for best results
- company helps with firmographic enrichment
- Any additional context improves accuracy

🔄 Workflow Flow
1. Webhook Trigger: Receives basic contact information from your application, form, or CRM
2. Process Data: Adds a unique ID and timestamp to the incoming data
3. Prepare AI Request: Configures AI provider settings from environment variables
4. Call AI API: Sends contact data to the AI with an enrichment prompt
5. Save to Supabase: Archives original data and enrichment results
6. Format Response: Returns a success confirmation

🎯 Customization Tips

Enhance AI Prompts for Better Enrichment
Modify the "Prepare AI Request" node to customize enrichment:

```javascript
// Enhanced prompt for contact enrichment
const systemPrompt = `You are an expert sales intelligence analyst.
Analyze the provided contact information and generate a comprehensive enrichment including:

INFERRED DETAILS: Fill in missing information based on available data
- Full job title and seniority level
- Department and reporting structure
- Years of experience (estimated)
- Professional background

COMPANY INSIGHTS: If company name provided
- Industry and sub-industry
- Company size and revenue (estimated)
- Key products/services
- Recent news or developments

BUYER PERSONA: Create a detailed profile
- Primary responsibilities
- Likely pain points and challenges
- Key priorities and goals
- Decision-making authority
- Budget influence level

ENGAGEMENT STRATEGY: Provide outreach recommendations
- Best communication channels
- Optimal outreach timing
- Key talking points
- Personalization suggestions
- Content interests

LEAD SCORE: Rate 1-10 based on:
- Fit for product/service (specify your ICP)
- Seniority and decision power
- Company size and maturity
- Engagement potential

Return as structured JSON with clear sections.`;

const userMessage = `Contact Information:\n${JSON.stringify($json.data, null, 2)}`;

const aiConfig = {
  provider: $env.AI_PROVIDER || 'openai',
  apiKey: $env.AI_API_KEY,
  model: $env.AI_MODEL || 'gpt-3.5-turbo',
  endpoint: $env.AI_ENDPOINT,
  messages: [
    { role: 'system', content: systemPrompt },
    { role: 'user', content: userMessage }
  ]
};

return { json: { aiConfig, data: $json } };
```

Add External Data Sources
Enhance enrichment with third-party APIs. After the "Process Data" node, add:
- Clearbit/Hunter.io Node: Get verified company data
- LinkedIn API: Pull professional information
- Company Database: Query internal customer data
- Web Scraping: Extract data from company websites

Then merge all data before AI enrichment for best results.

Connect to Your CRM
Auto-update contacts after enrichment:

Salesforce Integration:

```javascript
// Add after "Call AI API" node
// Update Salesforce contact with enriched data
const enrichedData = JSON.parse($json.ai_response);

return {
  json: {
    contactId: $json.data.salesforce_id,
    updates: {
      Description: enrichedData.buyer_persona,
      Custom_Score__c: enrichedData.lead_score,
      Pain_Points__c: enrichedData.pain_points
    }
  }
};
```

HubSpot Integration:
- Add a HubSpot node to update contact properties
- Map enriched fields to custom HubSpot properties

Pipedrive Integration:
- Use the Pipedrive node to update person records
- Add custom fields for AI insights

Implement Lead Scoring
Add scoring logic after enrichment:

```javascript
// Calculate lead score based on enrichment
const enrichment = JSON.parse($json.ai_response);
let score = 0;

// Job title scoring
if (enrichment.seniority === 'C-Level') score += 30;
else if (enrichment.seniority === 'VP/Director') score += 20;
else if (enrichment.seniority === 'Manager') score += 10;

// Company size scoring
if (enrichment.company_size === 'Enterprise') score += 25;
else if (enrichment.company_size === 'Mid-Market') score += 15;

// Decision authority scoring
if (enrichment.decision_authority === 'High') score += 25;
else if (enrichment.decision_authority === 'Medium') score += 15;

// Budget influence
if (enrichment.budget_influence === 'Direct') score += 20;

return { json: { ...enrichment, lead_score: score } };
```

Add Compliance Checks
Insert before AI processing:

```javascript
// Check for opt-out or compliance flags
const email = $json.email.toLowerCase();

// Check against suppression list
const suppressedDomains = ['competitor.com', 'spam.com'];
const domain = email.split('@')[1];
if (suppressedDomains.includes(domain)) {
  throw new Error('Contact on suppression list');
}

// Verify email format
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (!emailRegex.test(email)) {
  throw new Error('Invalid email format');
}

return { json: $json };
```

Batch Enrichment
Process multiple contacts:
1. Add a Spreadsheet File trigger instead of the webhook
2. Add a Split In Batches node (process 10-20 at a time)
3. Run enrichment for each contact
4. Combine results and export to CSV

🛠️ Troubleshooting

Common Issues

Issue: "Enrichment is too generic"
- **Solution**: Provide more input data (company, job title, LinkedIn); use GPT-4 or Claude models for better inference; enhance the system prompt with specific instructions

Issue: "AI_API_KEY is undefined"
- **Solution**: Ensure environment variables are set correctly; verify variable names match exactly (case-sensitive)

Issue: "Enrichment contradicts actual data"
- **Solution**: The AI makes inferences, so always validate critical information; add a validation step to check enriched data against known facts; use external APIs for verification

Issue: "Too slow for real-time use"
- **Solution**: Implement a queue system for async processing; use faster models (gpt-3.5-turbo) for speed; process in batches during off-peak hours

Issue: "Supabase credentials not found"
- **Solution**: Check the credential name matches exactly: "Supabase API"; verify the Supabase URL and API key are correct

Debugging Tips
- Test with known contacts first to validate accuracy
- Compare AI enrichment against actual data
- Check execution logs for API errors
- Start with a minimal prompt, then enhance gradually
- Use "Execute Node" to test individual steps

📊 Analyzing Enriched Data
Query and analyze your enriched contacts (note that the enriched_contacts view exposes ai_response under the alias enrichment_data):

```sql
-- Get all enriched contacts
SELECT * FROM enriched_contacts ORDER BY created_at DESC;

-- Find high-value leads (assuming scoring implemented)
SELECT email, name, company, enrichment_data->>'lead_score' AS score
FROM enriched_contacts
WHERE (enrichment_data->>'lead_score')::int > 70
ORDER BY (enrichment_data->>'lead_score')::int DESC;

-- Analyze enrichment by company
SELECT
  data->>'company' AS company,
  COUNT(*) AS contact_count,
  AVG((ai_response->>'lead_score')::int) AS avg_score
FROM workflow_logs
WHERE workflow_name = 'AI Contact Enrichment'
  AND ai_response->>'lead_score' IS NOT NULL
GROUP BY data->>'company'
ORDER BY contact_count DESC;

-- Find contacts needing follow-up
SELECT email, name, enrichment_data->>'engagement_strategy' AS strategy, created_at
FROM enriched_contacts
WHERE created_at > NOW() - INTERVAL '7 days'
ORDER BY created_at DESC;
```

Export Enriched Data

```sql
-- Export to CSV
COPY (
  SELECT
    data->>'email' AS email,
    data->>'name' AS name,
    data->>'company' AS company,
    ai_response->>'job_title' AS enriched_title,
    ai_response->>'seniority' AS seniority,
    ai_response->>'lead_score' AS score
  FROM workflow_logs
  WHERE workflow_name = 'AI Contact Enrichment'
) TO '/tmp/enriched_contacts.csv' WITH CSV HEADER;
```

📈 Integration Ideas

Form Integration
Automatically enrich new leads from forms:
- **Typeform**: Trigger on form submission
- **Google Forms**: Use Google Sheets trigger
- **Calendly**: Enrich after meeting booking
- **Webflow Forms**: Webhook trigger from form

CRM Integration
Real-time enrichment as contacts enter the CRM:
- **Salesforce**: Trigger on new lead/contact creation
- **HubSpot**: Enrich on form submission or import
- **Pipedrive**: Auto-enrich new persons
- **Close**: Webhook on lead creation

Email Tools
Enhance cold outreach campaigns:
- **Instantly.ai**: Enrich before campaign launch
- **Lemlist**: Generate personalization variables
- **Apollo.io**: Supplement with AI insights
- **Mailshake**: Enrich prospect lists

Marketing Automation
Power ABM and segmentation:
- **Marketo**: Enrich leads for scoring
- **Pardot**: Enhance prospect profiles
- **ActiveCampaign**: Personalization data
- **Klaviyo**: E-commerce customer insights

Slack Integration
Team notifications and collaboration:
- Send enrichment summaries to the sales channel
- Notify reps of high-value leads
- Share persona insights with marketing
- Alert on key account contacts

🔒 Security & Compliance Best Practices

Data Protection
- Encrypt Sensitive Data: Use environment variables for all credentials
- Access Control: Limit webhook access with authentication
- Data Retention: Set automatic deletion policies in Supabase
- Audit Logging: Track all enrichment activities

Privacy Compliance
- GDPR Compliance: Get consent before enriching personal data; allow contacts to request data deletion; document the legal basis for processing
- CCPA Compliance: Honor do-not-sell requests
- Data Minimization: Only enrich necessary fields
- Right to Access: Allow contacts to view enriched data

AI Ethics
- Bias Awareness: Review AI inferences for bias
- Accuracy Validation: Verify critical information
- Transparency: Disclose use of AI enrichment
- Human Oversight: Review before critical decisions

💡 Best Practices

Input Data Quality
- **Always include email or full name** as an anchor point
- **Add LinkedIn URLs** for 50% better accuracy
- **Provide company name** for firmographic insights
- **Include any known details**: more data = better results

Prompt Engineering
- **Be specific** about your ideal customer profile (ICP)
- **Request structured output** (JSON format)
- **Define scoring criteria** that match your business
- **Ask for actionable insights**, not just descriptions

Post-Enrichment Workflow
- **Always validate** critical information before use
- **Review AI inferences** for accuracy and bias
- **Update CRM promptly** to maintain data freshness
- **Track enrichment ROI** (conversion rates, time saved)

Performance Optimization
- **Batch process** during off-peak hours
- **Use appropriate models** (gpt-3.5 for speed, gpt-4 for quality)
- **Cache common enrichments** to reduce API costs
- **Set rate limits** to avoid API throttling

🏷️ Tags
sales-automation, lead-enrichment, ai-automation, crm-integration, data-enrichment, contact-intelligence, buyer-personas, lead-scoring, webhook, supabase, openai, anthropic, b2b-sales

📝 License
This workflow template is provided as-is for use with n8n.

🤝 Support
For questions or issues:
- n8n Community Forum: https://community.n8n.io
- n8n Documentation: https://docs.n8n.io

🌟 Example Output

Input:

```json
{
  "email": "mike.johnson@cloudtech.com",
  "name": "Mike Johnson",
  "company": "CloudTech Solutions",
  "job_title": "Director of IT"
}
```

AI-Generated Enrichment:

```json
{
  "full_title": "Director of Information Technology",
  "seniority": "Director",
  "department": "Technology/IT",
  "experience_years": "10-15",
  "company_insights": {
    "industry": "Cloud Computing",
    "size": "Mid-Market (100-500)",
    "revenue_estimate": "$10M-$50M"
  },
  "buyer_persona": {
    "responsibilities": ["Infrastructure management", "Vendor selection", "Security oversight"],
    "pain_points": ["Legacy system migration", "Cost optimization", "Security compliance"],
    "priorities": ["Scalability", "Cost reduction", "Team efficiency"]
  },
  "engagement_strategy": {
    "best_channels": ["Email", "LinkedIn"],
    "timing": "Tuesday-Thursday, 9-11 AM",
    "talking_points": ["ROI and cost savings", "Security features", "Ease of implementation"],
    "personalization": "Reference cloud migration challenges"
  },
  "lead_score": 75
}
```

🔄 Version History
- **v1.0.0** (2025-01-14): Initial release with universal AI provider support
by Colleen Brady
Who is this for?
This workflow is built for anyone who works with YouTube content, whether you're:
- A learner looking to understand a video's key points
- A content creator repurposing video material
- A YouTube manager looking to update titles and descriptions
- A social media strategist searching for the most shareable clips

Don't just ask questions about what's said. Find out what's going on in a video too.

Video Overview: https://www.youtube.com/watch?v=Ovg_KfKxnC8

What problem does this solve?
YouTube videos hold valuable insights, but watching and processing them manually takes time. This workflow automates:
- **Quick content extraction**: Summarize key ideas without watching full videos
- **Visual analysis**: Understand what's happening beyond spoken words
- **Clip discovery**: Identify the best moments for social sharing

How the workflow works
This n8n-powered automation:
- Uses Google's Gemini 1.5 Flash AI for intelligent video analysis
- Provides multiple content analysis templates tailored to different needs

What makes this workflow powerful?
The easiest place to start is by requesting a summary or transcript. From there, you can refine the prompts to match your specific use case and the type of video content you're working with.

But what's even more amazing? You can ask questions about what's happening in the video and get detailed insights about the people, objects, and scenes. It's jaw-dropping.

This workflow is versatile: the actions adapt based on the values set. That means you can use a single workflow to:
- Extract transcripts
- Generate an extended YouTube description
- Write a summary blog post

You can also modify the trigger based on how you want to run the workflow: use a webhook, connect it to an event in Airtable, or leave it as-is for on-demand use. The output can then be sent anywhere: Notion, Airtable, CMS platforms, or even just stored for reference.

How to set it up
1. Connect your Google API key
2. Paste a YouTube video URL
3. Select an analysis method
4. Run the workflow and get structured results

Analysis Templates
- **Basic & Timestamped Transcripts**: Extract spoken content
- **Summaries**: Get concise takeaways
- **Visual Scene Analysis**: Detect objects, settings, and people
- **Clip Finder**: Locate shareable moments
- **Actionable Insights**: Extract practical information

Customization Options
- Modify templates to fit your needs
- Connect with external platforms
- Adjust formatting preferences

Advanced Configuration
This workflow is designed for use with gemini-1.5-flash. In the future, you can update the flow to work with different models or even modify the HTTP Request node to define which API endpoint should be used (see the sketch at the end of this section). It has also been designed so you can use this flow on its own or add it to a new or existing workflow.

This workflow helps you get the most out of YouTube content, quickly and efficiently.
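For readers who want to adapt the HTTP Request node, here is a rough sketch of the kind of Gemini API call the workflow makes. The endpoint shape follows Google's generativelanguage REST API, and the ability to reference a public YouTube URL via file_data is a documented Gemini feature, but treat the exact request shape and the GEMINI_API_KEY variable name as assumptions to verify against current Gemini docs.

```javascript
// Hypothetical sketch of the Gemini call behind the HTTP Request node.
const API_KEY = process.env.GEMINI_API_KEY; // assumed env var name
const url =
  'https://generativelanguage.googleapis.com/v1beta/models/' +
  `gemini-1.5-flash:generateContent?key=${API_KEY}`;

const body = {
  contents: [{
    parts: [
      // Gemini can reference a public YouTube video directly via file_uri
      { file_data: { file_uri: 'https://www.youtube.com/watch?v=Ovg_KfKxnC8' } },
      { text: 'Summarize this video and list its key visual scenes with timestamps.' },
    ],
  }],
};

const res = await fetch(url, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(body),
});
const data = await res.json();
console.log(data.candidates?.[0]?.content?.parts?.[0]?.text);
```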
by James Li
Summary
Onfleet is last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting while you focus on your customers. This workflow template loads a spreadsheet from your local storage and automatically creates Onfleet tasks on a one-time basis when the workflow is triggered. You can use this workflow as a task importer.

Configurations
- Update the Read Binary File node with the absolute file path to the local spreadsheet of interest
- Update the Onfleet node with your own Onfleet credentials; to register for an Onfleet API key, visit https://onfleet.com/signup to get started
- You can easily change how the Onfleet task is created by mapping additional data from the spreadsheet (see the sketch below)
- For import templates, visit Onfleet Support to learn more 👍
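As a starting point for that mapping, here is a hypothetical sketch of turning a spreadsheet row into an Onfleet task body. The column names (Address, Name, Phone, Notes) are placeholders for your sheet, and while the destination/recipients shape follows Onfleet's task-creation API, check their docs for the fields your account requires.

```javascript
// Hypothetical mapping from a spreadsheet row to an Onfleet task.
const row = $json;

return [{
  json: {
    destination: {
      address: { unparsed: row.Address }, // Onfleet can geocode a raw address string
    },
    recipients: [{
      name: row.Name,
      phone: row.Phone,
    }],
    notes: row.Notes || '',
  },
}];
```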
by Angel Menendez
This workflow is triggered by a parent workflow initiated via a Slack shortcut. Upon activation, it collects input from a modal window in Slack and initiates a vulnerability scan using the Qualys API.

Key Features
- **Trigger**: Launched by a parent workflow through a Slack shortcut with modal input.
- **API Integration**: Utilizes the Qualys API for vulnerability scanning.
- **Data Conversion**: Converts XML scan results to JSON for further processing.
- **Loop Mechanism**: Continuously checks the scan status until completion (see the sketch below).
- **Slack Notifications**: Posts the scan summary and detailed results to a specified Slack channel.

Workflow Nodes
1. Start VM Scan in Qualys: Initiates the scan with specified parameters.
2. Convert XML to JSON: Converts the scan results from XML format to JSON.
3. Fetch Scan Results: Retrieves scan results from Qualys.
4. Check if Scan Finished: Verifies whether the scan is complete.
5. Loop Mechanism: Handles the repetitive checking of the scan status.
6. Slack Notifications: Posts updates and results to Slack.

Relevant Links
- Qualys API Documentation
- Qualys Platform Documentation
- Parent workflow link
- Link to Report Generator Subworkflow
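For illustration, a sketch of what the "Check if Scan Finished" logic might look like in a Function node. The status path below is an assumption about how the XML-to-JSON conversion of Qualys' scan list response comes out; adjust it to the actual structure your conversion node produces.

```javascript
// Hypothetical "Check if Scan Finished" logic; field paths are assumptions.
const scan = $json.SCAN_LIST_OUTPUT?.RESPONSE?.SCAN_LIST?.SCAN ?? {};
const status = scan.STATUS?.STATE; // e.g. 'Running', 'Finished', 'Error'

return [{
  json: {
    ...$json,
    scanFinished: status === 'Finished',
    scanFailed: ['Error', 'Canceled'].includes(status),
  },
}];
```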
by Francis Njenga
AI Content Generator Workflow

Introduction
This workflow automates the process of creating high-quality articles using AI, organizing them in Google Drive, and tracking their progress in Google Sheets. It's perfect for marketers, bloggers, and businesses looking to streamline content creation. With minimal setup, you can have a fully operational system to generate, save, and manage your articles in one cohesive workflow.

How It Works
1. Collect Inputs: Users fill out a form with details like article title, keywords, and instructions.
2. Generate Content: AI creates an outline and writes the article based on user inputs.
3. Organize Files: Saves the outline and final article in Google Drive for easy access.
4. Track Progress: Updates Google Sheets with links to the generated content for tracking.

Set Up Steps
- **Time Required**: Approximately 15–20 minutes to connect all integrations and test the workflow.
- **Steps**:
  1. Connect Google Drive and Google Sheets: Authorize access to store files and update the spreadsheet.
  2. Set Up OpenAI Integration: Add your OpenAI API key for generating the outline and article content.
  3. Customize the Form: Modify the form fields to match the details you want to collect for each article.
  4. Test the Workflow: Run the workflow with sample inputs to ensure everything works smoothly.

This workflow not only simplifies article creation but also sets a foundation for expanding into additional automations, like posting to social media platforms. A sketch of the prompt-building step is shown below.
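A minimal sketch of how the outline prompt might be assembled before the OpenAI call. The form field names (title, keywords, instructions) are assumptions; match them to your actual form fields.

```javascript
// Hypothetical prompt-building step in an n8n Code node.
const { title, keywords, instructions } = $json;

const messages = [
  { role: 'system', content: 'You are a professional content strategist and writer.' },
  {
    role: 'user',
    content:
      `Create a detailed outline for an article titled "${title}". ` +
      `Target keywords: ${keywords}. Additional instructions: ${instructions}`,
  },
];

// A second OpenAI call later expands the returned outline into the full
// article, which the workflow then saves to Google Drive.
return [{ json: { messages } }];
```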
by Askan
The news site of Colt, a telecom company, does not offer an RSS feed, so web scraping is the method of choice to extract and process the news. The goal is to get only the newest posts, a summary of each post, and their respective (technical) keywords. Note that the news site offers the links to each news post, but not the individual news items, so we first collect the links and dates of each post before extracting the newest ones. The result is sent to a SQL database, in this case a NocoDB database. This process runs each week through a cron job.

Requirements:
- Basic understanding of CSS selectors and how to find them via the browser (usually: right click → inspect)
- ChatGPT API account (a normal ChatGPT account is not sufficient)
- A NocoDB database (of course, you may choose any type of output target)

Assumptions:
- The CSS selectors work on the news site
- Each post has a date with its own CSS selector, meaning the date is not part of the news content

Warnings:
- Not every site likes to be scraped, especially not at high frequency
- Each website is structured differently; the workflow may need several adaptations (the selector-extraction sketch below shows where)
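The sketch below shows the kind of selector-based extraction step the workflow relies on, written as a cheerio snippet for an n8n Code node. The selectors ('.news-item', 'a', '.date') are placeholders to replace after inspecting the actual page, and requiring cheerio assumes your n8n instance allows external modules (NODE_FUNCTION_ALLOW_EXTERNAL).

```javascript
// Hypothetical selector-extraction step; selectors are placeholders.
const cheerio = require('cheerio');

const $ = cheerio.load($json.data); // HTML from the previous HTTP Request node
const posts = [];

$('.news-item').each((_, el) => {
  posts.push({
    url: $(el).find('a').attr('href'),
    date: $(el).find('.date').text().trim(),
  });
});

// Keep only posts newer than one week, matching the weekly cron schedule
const oneWeekAgo = Date.now() - 7 * 24 * 60 * 60 * 1000;
const newest = posts.filter(p => new Date(p.date).getTime() > oneWeekAgo);

return newest.map(p => ({ json: p }));
```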
by Miquel Colomer
This workflow is useful if you have lots of tasks running daily. The MySQL node (or whichever database n8n uses to save its data; it could be Mongo, Postgres, ...) removes old entries from the execution_entity table, which contains the history of executed workflows. If a task executes every minute, 1,440 rows will be created every day (60 minutes x 24 hours) per task, so the table grows quickly. The SQL query deletes entries older than 30 days, using the stoppedAt column as the reference for date calculations. You only have to set up the MySQL connection properly and configure the cron trigger to execute once per day during a low-traffic hour. A sketch of the cleanup query is shown below.
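The workflow's exact statement isn't reproduced here, but a plausible version of the cleanup query looks like this, assuming n8n's default execution_entity schema. Note that quoting of the camelCase column differs per database (backticks in MySQL, double quotes in Postgres).

```sql
-- Plausible 30-day cleanup query for MySQL, using stoppedAt as the
-- date reference, per the description above.
DELETE FROM execution_entity
WHERE `stoppedAt` < DATE_SUB(NOW(), INTERVAL 30 DAY);
```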
by Simon Mayerhofer
This workflow allows you to batch update/insert Airtable rows in groups of 10, significantly reducing the number of API calls and increasing performance (a sketch of the underlying API call is shown at the end of this section).

🚀 How It Works
1. Copy the 3 Nodes
   Copy the three nodes inside the red note box into your workflow.
2. Set Your Fields
   In the Set Fields node, define the fields you want to update.
   ➤ Only include fields that match column names in your Airtable table.
   ➤ Make sure the field names are spelled exactly as they appear in Airtable.
   ➤ Make sure the field types are set correctly: number columns in Airtable need a number-type field.
3. Configure the Airtable Batch Node
   - Enter your Airtable Base ID: the part with app... in the URL: airtable\.com / app8pqOLeka1Cglwg / tblnXZOdy8VtkAAJD / ...
   - Enter your Airtable Table ID: the part with tbl... in the URL: airtable\.com / app8pqOLeka1Cglwg / tblXXZOdy8VtkAAJD / ...
4. Set Matching Fields (fieldsToMergeOn)
   Provide a string array that tells Airtable how to match existing rows. Examples:
   - Match by one field (e.g. TaskID): {{["TaskID"]}}
   - Match by multiple fields (e.g. firstname and lastname): {{["firstname", "lastname"]}}
5. Choose the Mode (mode field)
   Available options:
   - upsert: Update if a record exists, otherwise insert a new one
   - insert: Always insert as new records
   - update: Only update existing records (you must provide a field named id)
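For context, this is roughly the request the batch node sends under the hood, following Airtable's Web API for batch upserts (which accepts up to 10 records per call via performUpsert). The base/table IDs, field names, and AIRTABLE_TOKEN variable are placeholders.

```javascript
// Sketch of an Airtable batch upsert request (max 10 records per call).
const url = 'https://api.airtable.com/v0/app8pqOLeka1Cglwg/tblXXZOdy8VtkAAJD';

const body = {
  performUpsert: { fieldsToMergeOn: ['TaskID'] },
  records: $input.all().slice(0, 10).map(item => ({
    fields: {
      TaskID: item.json.TaskID,
      Status: item.json.Status,
    },
  })),
};

// PATCH with performUpsert updates matching rows and inserts the rest
const res = await fetch(url, {
  method: 'PATCH',
  headers: {
    Authorization: `Bearer ${process.env.AIRTABLE_TOKEN}`, // assumed env var
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(body),
});
return [{ json: await res.json() }];
```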
by Jimleuk
Are you a popular tech startup accelerator (named after a particular higher-order function) overwhelmed with 1000s of pitch decks on a daily basis? Wish you could filter through them quickly using AI, but the decks are unparseable through conventional means? Then you're in luck!

This n8n template uses multimodal LLMs to parse and extract valuable data from even the most overly designed pitch decks in quick fashion. Not only that, it'll also create the foundations of a RAG chatbot at the end, so you or your colleagues can drill down into the details if needed. With this template, you'll scale your capacity to find interesting companies you'd otherwise miss!

Requires n8n v1.62.1+

How It Works
- Airtable is used as the pitch deck database and PDF decks are downloaded from it.
- An AI vision model is used to transcribe each page of the pitch deck into markdown (see the sketch at the end of this section).
- An Information Extractor is used to generate a report from the transcribed markdown and update the required information back into the pitch deck database.
- The transcribed markdown is also uploaded to a vector store to build an AI chatbot which can be used to ask questions about the pitch deck.

Check out the sample Airtable here: https://airtable.com/appCkqc2jc3MoVqDO/shrS21vGqlnqzzNUc

How To Use
- This template depends on the availability of the Airtable: make a duplicate of the Airtable (link above) and its columns before running the workflow.
- When a new pitch deck is received, enter the company name into the Name column and upload the PDF into the File column. Leave all other columns blank.
- If you have the Airtable trigger active, the execution should start immediately once the file is uploaded. Otherwise, click the manual test trigger to start the workflow.
- When manually triggered, all "new" pitch decks will be handled by the workflow as separate executions.

Requirements
- OpenAI for LLM
- Airtable for database and interface
- Qdrant for vector store

Customising This Workflow
- Extend this starter template by adding more AI agents to validate claims made in the pitch deck, e.g. LinkedIn profiles, page visits, reviews, etc.
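A sketch of the per-page vision transcription step, using OpenAI's chat completions image input. It assumes each PDF page has already been rendered to a base64 PNG; $json.pageImage and the gpt-4o-mini model choice are placeholders, not necessarily what the template ships with.

```javascript
// Hypothetical per-page vision transcription call.
const res = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-4o-mini',
    messages: [{
      role: 'user',
      content: [
        { type: 'text', text: 'Transcribe this pitch deck page to markdown. Describe charts and images in brackets.' },
        { type: 'image_url', image_url: { url: `data:image/png;base64,${$json.pageImage}` } },
      ],
    }],
  }),
});
const data = await res.json();
return [{ json: { markdown: data.choices[0].message.content } }];
```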
by Jan Oberhauser
Simple API which queries the received country code via GraphQL and returns it.

Example URL: https://n8n.example.com/webhook/1/webhook/webhook?code=DE

1. Receives the country code from an incoming HTTP request
2. Reads data via GraphQL (sketched below)
3. Converts the data to JSON
4. Constructs the return string
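A minimal sketch of the GraphQL lookup. The public Countries GraphQL API (https://countries.trevorblades.com) is an assumed endpoint here; swap in whichever GraphQL service your GraphQL node actually points at.

```javascript
// Hypothetical country lookup by code.
const code = 'DE'; // taken from the webhook's ?code= query parameter

const res = await fetch('https://countries.trevorblades.com/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: `query ($code: ID!) { country(code: $code) { name emoji } }`,
    variables: { code },
  }),
});

const { data } = await res.json();
// e.g. "The country with the code DE is Germany"
console.log(`The country with the code ${code} is ${data.country.name} ${data.country.emoji}`);
```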
by jason
This workflow gathers data every minute from the GitHub (https://github.com), Docker (https://www.docker.com/), npm (https://www.npmjs.com/), and Product Hunt (https://www.producthunt.com/) website APIs and displays selected information on a Smashing (https://smashing.github.io/) dashboard. For convenience, the dashboard piece can be easily downloaded as a Docker container (https://hub.docker.com/r/tephlon/n8n_dashboard) and installed into your Docker environment. A sketch of the widget push is shown below.
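Smashing widgets accept updates over an HTTP push API; here is a sketch of pushing one metric to the dashboard. The host, widget id ('github_stars'), and auth token are placeholders to match your dashboard's config.

```javascript
// Hypothetical push of a GitHub metric to a Smashing widget.
const res = await fetch('http://localhost:3030/widgets/github_stars', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    auth_token: 'YOUR_AUTH_TOKEN',        // from Smashing's config
    current: $json.stargazers_count,      // e.g. from the GitHub repo API response
  }),
});
return [{ json: { status: res.status } }];
```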