by Artem Boiko
Convert a Revit model to Excel and parse it into structured items ready for downstream ETL. This minimal template runs a local RvtExporter.exe, checks for success, derives the expected `*_rvt.xlsx` filename, reads it from disk, and parses it into data items in n8n.

**What it does**
- Setup: define `path_to_revit_converter` and `revit_file`.
- Run converter: execute `"<converter>" "<revit_file>"` (writes `*_rvt.xlsx` next to the RVT).
- Check success: branch on the converter's error output.
- Read Excel: compute `<revit_file>` → `*_rvt.xlsx` and read it from disk.
- Parse: convert the workbook into structured items (rows → items).

**Prerequisites**
- **Windows** host (local executable and filesystem paths).
- DDC Revit toolkit installed: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`.
- A local `.rvt` you can read; the converter will write `*_rvt.xlsx` next to it.

**How to use**
1. Import this JSON into n8n.
2. Open "Setup – Define file paths" and set:
   - `path_to_revit_converter`: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`
   - `revit_file`: e.g., `C:\Sample_Projects\your_project.rvt`
3. Run the Manual Trigger. On success, the flow will read `*_rvt.xlsx` and emit parsed items.

**Outputs**
- On disk: `<YourProject>_rvt.xlsx` (created by the converter).
- In n8n: parsed rows as items, ready for Transform/Load phases.

**Notes & tips**
- If your converter writes the Excel file to a different folder or file name, update the "Success – Create Excel filename" node to point to the actual path (a filename-derivation sketch follows this entry).
- Ensure write permissions in the project folder and avoid non-ASCII characters in paths when possible.
- This template is purposefully minimal (Extract-only). Chain it with your own Transform/Load steps.

**Categories**: Data Extraction · Files & Storage · ETL · CAD/BIM
**Tags**: cad-bim, revit, ifc, dwg, extract, xlsx, etl
**Author**: DataDrivenConstruction.io, info@datadrivenconstruction.io

**Consulting and Training**: We work with leading construction, engineering, and consulting agencies and technology firms around the world to help them implement open-data principles, automate CAD/BIM processing, and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

**Docs & Issues**: Full Readme on GitHub
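Since the whole Extract phase boils down to two operations, here is a minimal Python sketch of what they do outside n8n. It assumes RvtExporter.exe takes the RVT path as its only argument and signals failure through a non-zero exit code; verify both against your converter build before relying on this.

```python
import subprocess
from pathlib import Path

# Paths as defined in the "Setup – Define file paths" node (adjust to your machine).
path_to_revit_converter = r"C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe"
revit_file = r"C:\Sample_Projects\your_project.rvt"

# Run the converter; it writes <project>_rvt.xlsx next to the RVT file.
result = subprocess.run(
    [path_to_revit_converter, revit_file],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    raise RuntimeError(f"Converter failed: {result.stderr}")

# Derive the expected workbook name the same way the
# "Success – Create Excel filename" node does: your_project.rvt -> your_project_rvt.xlsx
rvt = Path(revit_file)
excel_path = rvt.with_name(f"{rvt.stem}_rvt.xlsx")
print(f"Expecting workbook at: {excel_path}")
```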
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

**Introduction**
This template listens for new events in Google Calendar and automatically creates or updates participant profiles in KlickTipp. Every attendee's email address (and other available details) is mapped directly into your KlickTipp account without manual steps. This ensures your contact list always reflects the latest event registrations in real time. Perfect for coaches, consultants, and event organizers who want to capture RSVP information instantly and keep their marketing lists up to date. With each new attendee synced, you can launch reminder emails, follow-up campaigns, or trigger automated onboarding sequences.

**How it works**
This template keeps your KlickTipp list in sync with Google Calendar across the full event lifecycle. For each attendee, the workflow optionally filters out internal domains, then writes event details into KlickTipp custom fields. It watches your calendar for:
- *Event Created* → creates/updates each attendee as a KlickTipp contact and adds the *event created/updated* tag.
- *Event Cancelled* → tags attendees with *event canceled*.
- *Event Updated* → routes attendees by *responseStatus* and tags them (see the mapping sketch after this entry):
  - accepted → *event confirmed*
  - declined → *event declined*
  - tentative → *event considered*

**Setup Instructions**

KlickTipp preparation. Prepare custom fields:
- Google Calendar | event summary, data type: "Single line"
- Google Calendar | event description, data type: "Single line"
- Google Calendar | event location, data type: "Single line"
- Google Calendar | event start datetime, data type: "Datetime"
- Google Calendar | event end datetime, data type: "Datetime"

Prepare tags:
- Google Calendar | event created/updated
- Google Calendar | event canceled
- Google Calendar | event declined
- Google Calendar | event confirmed
- Google Calendar | event considered

**Credential Configuration**
- Connect your Google Calendar account using the Client ID and Client Secret from Google Cloud.
- Authenticate your KlickTipp connection with username/password credentials (API access required).

**Customization**
- Recommended poll frequency: every 1–5 minutes for near real-time updates.
- Adjust to your local timezone if necessary.
- Each trigger works independently, allowing partial deployments if only certain event types are needed.
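To make the *Event Updated* routing concrete, here is a small illustrative Python sketch of the status-to-tag mapping the workflow applies. The tag names mirror the setup list above; the function itself is hypothetical and only mirrors what the n8n routing nodes do.

```python
# Hypothetical illustration of the "Event Updated" routing: Google Calendar
# reports each attendee's responseStatus, and the workflow applies the
# corresponding KlickTipp tag prepared during setup.
STATUS_TO_TAG = {
    "accepted": "Google Calendar | event confirmed",
    "declined": "Google Calendar | event declined",
    "tentative": "Google Calendar | event considered",
}

def tag_for_attendee(response_status: str) -> str | None:
    """Return the KlickTipp tag for an attendee, or None if the status is unmapped."""
    return STATUS_TO_TAG.get(response_status)

print(tag_for_attendee("tentative"))  # -> "Google Calendar | event considered"
```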
by Adrian Kendall
**Summary**
This is a minimal template that shows how to integrate n8n and Home Assistant for event-based triggering, using the AppDaemon add-on to call a Webhook node.

Problem solved: there is no Home Assistant trigger node in n8n. You can poll the Home Assistant API on a schedule, but a more efficient workaround is to use the AppDaemon add-on to create a listener app within Home Assistant. When the listener detects an event it is subscribed to, it calls an n8n webhook, passing the event data to the workflow. AppDaemon runs Python code.

The template contains a sticky note with a code example (repeated below) for an AppDaemon app; the code contains annotated instructions on configuration.

**Steps**

Install the AppDaemon add-on:

A. Open Home Assistant. In your Home Assistant UI, go to Settings → Add-ons (or Supervisor → Add-on Store, depending on your version).
B. Search and install. In the Add-on Store, search for "AppDaemon 4". Click on the result and then the Install button to start the installation.
C. Start the add-on. Once installed, open the AppDaemon 4 add-on page and click Start to launch AppDaemon. (Optional but recommended) Enable the Start on boot and Watchdog options so AppDaemon starts automatically and restarts if it crashes.
D. Verify the installation. Check the logs in the AppDaemon add-on page to ensure it's running without issues. There is no need to set access tokens or the Home Assistant URL manually; the add-on is pre-configured to connect to your Home Assistant.
E. Configure AppDaemon. After installation, a directory named appdaemon will appear inside your Home Assistant config directory (/config/appdaemon/). Inside, you'll find a file called appdaemon.yaml. For most uses, the default configuration is fine, but you can customize it as needed.

Create the AppDaemon app:

F. Prepare your apps directory. Inside /config/appdaemon/, locate or create an apps folder. Path: /config/appdaemon/apps/.
G. Create a Python app. Inside the apps folder, create a new Python file (example: n8n_WebHook.py). Open the file in an editor and paste the example code below into it.

```python
import appdaemon.plugins.hass.hassapi as hass
import requests


class EventTon8nWebhook(hass.Hass):
    """
    AppDaemon app that listens for Home Assistant events
    and forwards them to an n8n webhook.
    """

    def initialize(self):
        """Initialize the event listener and configure webhook settings."""
        # EDIT: Replace 'your_event_name' with the actual event you want to listen for.
        # Common HA events: 'state_changed', 'call_service', 'automation_triggered', etc.
        self.target_event = self.args.get('target_event', 'your_event_name')

        # EDIT: Set your n8n webhook URL in apps.yaml or replace the default here.
        self.webhook_url = self.args.get('webhook_url', 'n8n_webhook_url')

        # EDIT: Optional - timeout for webhook requests (seconds).
        self.webhook_timeout = self.args.get('webhook_timeout', 10)

        # EDIT: Optional - enable/disable SSL verification.
        self.verify_ssl = self.args.get('verify_ssl', True)

        # Set up the event listener.
        self.listen_event(self.event_handler, self.target_event)

        self.log(f"Event listener initialized for event: {self.target_event}")
        self.log(f"Webhook URL configured: {self.webhook_url}")

    def event_handler(self, event_name, data, kwargs):
        """
        Handle the triggered event and forward it to the n8n webhook.

        Args:
            event_name (str): Name of the triggered event
            data (dict): Event data from Home Assistant
            kwargs (dict): Additional keyword arguments from the event
        """
        try:
            # Prepare the payload for the n8n webhook.
            payload = {
                'event_name': event_name,
                'event_data': data,
                'event_kwargs': kwargs,
                'timestamp': self.datetime().isoformat(),
                'source': 'home_assistant_appdaemon'
            }

            self.log(f"Received event '{event_name}' - forwarding to n8n")
            self.log(f"Event data: {data}")

            # Send to the n8n webhook.
            self.send_to_n8n(payload)

        except Exception as e:
            self.log(f"Error handling event {event_name}: {str(e)}", level="ERROR")

    def send_to_n8n(self, payload):
        """
        Send a payload to the n8n webhook.

        Args:
            payload (dict): Data to send to n8n
        """
        try:
            headers = {
                'Content-Type': 'application/json',
                # EDIT: The header name and value must match the header-auth
                # credential configured on the n8n Webhook node.
                'CredName': 'credValue',
            }

            response = requests.post(
                self.webhook_url,
                json=payload,
                headers=headers,
                timeout=self.webhook_timeout,
                verify=self.verify_ssl
            )
            response.raise_for_status()

            self.log(f"Successfully sent event to n8n webhook. Status: {response.status_code}")

            # EDIT: Optional - log the response from n8n for debugging.
            if response.text:
                self.log(f"n8n response: {response.text}")

        except requests.exceptions.Timeout:
            self.log(f"Timeout sending to n8n webhook after {self.webhook_timeout}s", level="ERROR")
        except requests.exceptions.RequestException as e:
            self.log(f"Error sending to n8n webhook: {str(e)}", level="ERROR")
        except Exception as e:
            self.log(f"Unexpected error sending to n8n: {str(e)}", level="ERROR")
```

H. Register your app. In the same apps folder, locate or create a file named apps.yaml and add an entry to register your app:

```yaml
EventTon8nWebhook:
  module: n8n_WebHook
  class: EventTon8nWebhook
```

module must match your Python filename (without the .py extension), and class must match the class name inside your Python file, i.e. EventTon8nWebhook.

I. Reload AppDaemon. In Home Assistant, return to the AppDaemon 4 add-on page and click Restart. Watch the logs; you should see a log entry from your app confirming it has initialised and is running.

Set up n8n:

J. In your workflow, create a Webhook trigger node as the first node, or use the template as your starting point. Ensure the webhook URL (production or test) is correctly copied into the Python code. If you are using authentication, create a credential for the Webhook node; the example uses header auth, which naturally must match what is in the code. For security, it is better to put the credentials in a secrets.yaml file rather than hard-coding them as in this demo. Execute the workflow, or activate it if using the production webhook.
Either wait for an event in Home Assistant or trigger one manually from the Developer Tools page, Events tab (a REST sketch for firing a test event follows below).

K. Finally, develop your workflow to process the received event per your use case. See the Home Assistant docs on events for details of the event types you can subscribe to: Home Assistant Events
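If you would rather script the test than click through Developer Tools, the sketch below fires a test event through Home Assistant's documented REST API (POST /api/events/<event_type>). The URL, token, and event name are placeholders; it assumes you have created a long-lived access token in your HA user profile.

```python
import requests

# Placeholders - replace with your Home Assistant URL, a long-lived access
# token (created under your HA user profile), and the event name your
# AppDaemon app listens for.
HA_URL = "http://homeassistant.local:8123"
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
EVENT_NAME = "your_event_name"

# Fire a test event via the Home Assistant REST API; AppDaemon should pick it
# up and forward it to the n8n webhook.
response = requests.post(
    f"{HA_URL}/api/events/{EVENT_NAME}",
    headers={
        "Authorization": f"Bearer {HA_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"test": True, "note": "fired from a script instead of the UI"},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"message": "Event your_event_name fired."}
```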
by Paul
Danelfin MCP Server

**About Danelfin**
Danelfin is an AI-powered stock analytics platform that helps investors find the best stocks and optimize their portfolios, using Explainable Artificial Intelligence to support smart, data-driven investment decisions. The platform provides API solutions for developers, analysts, and fintech firms that want to integrate predictive stock data into their own applications.

**Key Features**
- **AI Stock Picker**: advanced algorithms to identify high-potential stocks
- **Portfolio Optimization**: data-driven portfolio management and optimization tools
- **Explainable AI**: transparent insights that show the reasoning behind recommendations
- **Predictive Analytics**: AI Scores and forecasting capabilities
- **Multi-Asset Coverage**: analysis of stocks and ETFs across various markets

**MCP Server Endpoints**
This Danelfin MCP (Model Context Protocol) server exposes three main endpoints that provide comprehensive market analysis capabilities (a request sketch follows at the end of this entry):

🏆 **/ranking Endpoint** (GET: https://apirest.dan...)
The ranking endpoint provides AI-powered stock rankings and ratings. This tool allows users to:
- Access ranked lists of stocks based on AI-generated scores
- Retrieve performance rankings across different timeframes
- Get comparative analysis of stock performance
- Access predictive AI scores for investment decision-making

🏭 **/sectors Endpoint** (GET: https://apirest.dan...)
The sectors endpoint provides analysis for US market stocks, including AI ratings, prices, stock charts, and technical, fundamental, and sentiment analysis organized by market sector. Features include:
- Sector-wise market analysis and performance metrics
- AI ratings and price data for sector classifications
- Technical and fundamental analysis by sector
- Sentiment analysis across different market sectors
- Comparative sector performance insights

🏢 **/industries Endpoint** (GET: https://apirest.dan...)
The industries endpoint offers granular industry-level analysis within market sectors:
- Industry-specific stock analysis and ratings
- Performance metrics at the industry classification level
- AI-powered insights for specific industry verticals
- Comparative industry analysis within sectors
- Industry trend analysis and forecasting

**Integration Benefits**
This MCP server enables seamless integration of Danelfin's AI-powered stock analysis capabilities into applications and workflows. Users can:
- **Access Real-time Data**: get up-to-date stock rankings, sector performance, and industry analysis
- **Leverage AI Insights**: use explainable AI recommendations for investment decisions
- **Streamline Research**: automate stock research workflows with comprehensive market data
- **Enhance Decision-Making**: make data-driven investment choices with predictive analytics
- **Scale Analysis**: process large-scale market analysis across multiple dimensions

**Use Cases**
- **Investment Research**: comprehensive stock analysis and ranking for portfolio managers
- **Algorithmic Trading**: integration of AI scores into trading algorithms and strategies
- **Financial Advisory**: enhanced client advisory services with AI-powered insights
- **Risk Management**: sector and industry analysis for portfolio risk assessment
- **Market Analysis**: real-time market intelligence for institutional investors

The Danelfin MCP server bridges the gap between advanced AI stock analytics and practical investment applications, providing developers and financial professionals with powerful tools for data-driven market analysis.
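For orientation, here is a hedged Python sketch of how a client might call the three endpoints. Because this listing truncates the base URL (https://apirest.dan...), both the base URL and the auth header below are placeholders rather than Danelfin's confirmed API surface; consult the official API documentation for the real values.

```python
import requests

# Placeholders only: the base URL is truncated in this listing and the auth
# scheme is assumed - verify both against Danelfin's API docs.
DANELFIN_BASE_URL = "https://apirest.example"  # replace with the real base URL
API_KEY = "YOUR_DANELFIN_API_KEY"

def get_endpoint(path: str, **params) -> dict:
    """GET one of the three MCP-exposed endpoints: /ranking, /sectors, /industries."""
    response = requests.get(
        f"{DANELFIN_BASE_URL}{path}",
        headers={"api-key": API_KEY},  # assumed header name - verify with the docs
        params=params,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Example: pull an AI-score ranking, then sector-level analysis.
ranking = get_endpoint("/ranking")
sectors = get_endpoint("/sectors")
```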
by Kai S. Huxmann
How to secure GET Webhooks?

**What are webhooks?**
Webhooks are special URLs that instantly trigger workflows when they receive an incoming HTTP request (like GET or POST). They're perfect for connecting external tools to n8n in real time.

🔐 **Why webhooks should be protected**
Unprotected webhooks are publicly accessible on the internet — anyone with the link can trigger your workflow. This can lead to spam, unwanted requests, or even data loss.

✅ **Best practice: use built-in authentication**
n8n provides native authentication options for Webhook nodes:
- Basic Auth
- Header Auth
- JWT Auth
These methods are highly recommended if supported by your external app or service. You can find them in the "Authentication" dropdown of the Webhook node.

🛠️ **When to use THIS setup**
Sometimes external tools don't support custom headers or advanced auth methods — for example:
- A button click in Google Sheets
- A link shared via email or chat with a trusted partner
- IoT devices or basic web apps
In those cases, you can protect a webhook by adding a secret query parameter (e.g. ?secret=abc123xyz456...) and validating it with an IF node at the start of your workflow. This way, only requests carrying the secret can trigger the core elements of your workflow. It's a simple yet powerful way to secure GET-based workflows. Only use it if better methods aren't available. (A sketch for generating and checking such a secret follows below.)
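As a companion to the IF-node check, here is a small Python sketch for generating a strong secret and comparing it in constant time. In n8n itself the comparison lives in the IF node (for instance, matching the incoming query parameter that the Webhook node exposes against your stored secret); the snippet below is illustrative only, not part of the template.

```python
import hmac
import secrets

# Generate a strong, URL-safe secret once and paste it into your webhook link,
# e.g. https://your-n8n/webhook/my-flow?secret=<value>.
print(secrets.token_urlsafe(32))

EXPECTED_SECRET = "abc123xyz456"  # placeholder - use the generated value instead

def is_authorized(incoming_secret: str) -> bool:
    """Constant-time comparison, which avoids leaking the secret via timing."""
    return hmac.compare_digest(incoming_secret or "", EXPECTED_SECRET)

# The IF node performs the same check conceptually: compare the incoming
# query parameter against your stored secret and only continue on a match.
assert is_authorized("abc123xyz456")
assert not is_authorized("wrong")
```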
by Robert Breen
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n workflow provides automated Power BI dataset refresh capabilities with built-in refresh history monitoring. It triggers a dataset refresh in Power BI and simultaneously checks the refresh history to track data update status. This is perfect for data analysts, business intelligence teams, or anyone who needs to automate Power BI dataset refreshes and monitor their success rates without manual intervention.

**Key Features**
- Automated Power BI dataset refresh triggering
- Simultaneous refresh history monitoring
- Manual trigger for on-demand execution
- Real-time status tracking
- Integration with the Power BI REST API
- Support for workspace and personal datasets

**Step-by-Step Implementation Guide**

**Prerequisites**
Before setting up this workflow, you'll need:
- An n8n instance (cloud or self-hosted)
- A Power BI account (Pro or Premium)
- A Microsoft Azure App Registration for API access
- A Power BI dataset that you want to refresh

**Step 1: Set Up Azure App Registration**
1. Go to the Azure Portal
2. Navigate to Azure Active Directory → App registrations
3. Click New registration
4. Configure your app:
   - Name: n8n-powerbi-integration
   - Supported account types: Accounts in this organizational directory only
   - Redirect URI: https://your-n8n-instance.com/rest/oauth2-credential/callback
5. Click Register
6. Note down the Application (client) ID and Directory (tenant) ID

**Step 2: Configure App Permissions**
1. In your app registration, go to API permissions
2. Click Add a permission → Power BI Service
3. Select Delegated permissions and add:
   - Dataset.ReadWrite.All
   - Dataset.Read.All
   - Workspace.Read.All
4. Click Grant admin consent for your organization

**Step 3: Create Client Secret**
1. In your app registration, go to Certificates & secrets
2. Click New client secret
3. Add a description: n8n-powerbi-secret
4. Set an expiration (recommended: 24 months)
5. Click Add and copy the secret value immediately

**Step 4: Configure Power BI API Credentials in n8n**
1. In n8n, go to Credentials → Add Credential → Power BI OAuth2 API
2. Configure as follows:
   - Client ID: your Application (client) ID from Step 1
   - Client Secret: your client secret from Step 3
   - Scope: https://analysis.windows.net/powerbi/api/.default
3. Save and test the connection

**Step 6: Import and Configure the Workflow**
1. Copy the provided workflow JSON
2. In n8n, click Import from File or Import from Clipboard
3. Paste the workflow JSON
4. Configure each node as detailed below

**Node Configuration Details**

When clicking 'Execute workflow' (Manual Trigger)
- Type: Manual Trigger
- Purpose: allows manual execution of the workflow
- No configuration needed

Refresh Datasource (Power BI)
- Resource: dataset
- Operation: refresh
- Group ID: me (for personal workspace) or your workspace ID
- Dataset ID: your Power BI dataset ID (from Step 5)
- Credentials: select your "Power BI account"

Check Refresh History (Power BI)
- Resource: dataset
- Operation: getRefreshHistory
- Group ID: me (for personal workspace) or your workspace ID
- Dataset ID: your Power BI dataset ID (same as above)
- Top: 10 (number of recent refresh records to retrieve)
- Credentials: select your "Power BI account"

(A sketch of the underlying REST calls these nodes wrap appears at the end of this entry.)

**Workflow Flow Summary**
1. Manual Trigger → user initiates workflow execution
2. Parallel execution:
   - Refresh Dataset → triggers the Power BI dataset refresh
   - Get History → retrieves recent refresh history
3. Results → both operations complete simultaneously, providing:
   - Confirmation of refresh initiation
   - Historical context of previous refreshes

**Contact Information**
Robert A Ynteractive
For support, customization, or questions about this workflow:
📧 Email: rbreen@ynteractive.com
🌐 Website: https://ynteractive.com/
💼 LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/

Need help implementing this workflow or want custom automation solutions? Get in touch for professional n8n consulting and workflow development services.
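For reference, here is a sketch of the two documented Power BI REST calls these nodes wrap. The token acquisition shown uses the app-only client-credentials flow as a simplification; the workflow itself authenticates with delegated OAuth2 through the n8n credential, and app-only access additionally requires service principals to be enabled in your Power BI tenant settings. All-caps values are placeholders from Steps 1–3.

```python
import requests

# Placeholders - fill in from your Azure app registration and dataset.
TENANT_ID = "YOUR_TENANT_ID"
CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"
DATASET_ID = "YOUR_DATASET_ID"

# Acquire an access token (app-only client-credentials flow; see the caveat above).
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    },
    timeout=30,
)
token_response.raise_for_status()
headers = {"Authorization": f"Bearer {token_response.json()['access_token']}"}

base = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"

# "Refresh Datasource": kick off a refresh (the API answers 202 Accepted).
requests.post(f"{base}/refreshes", headers=headers, timeout=30).raise_for_status()

# "Check Refresh History": fetch the 10 most recent refresh records.
history = requests.get(f"{base}/refreshes", headers=headers,
                       params={"$top": 10}, timeout=30)
history.raise_for_status()
for refresh in history.json().get("value", []):
    print(refresh.get("status"), refresh.get("startTime"))
```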