by Artem Boiko
Convert a Revit model to Excel and parse it into structured items ready for downstream ETL. This minimal template runs a local RvtExporter.exe, checks for success, derives the expected `*_rvt.xlsx` filename, reads the file from disk, and parses it into data items in n8n.

**What it does**
- Setup: define `path_to_revit_converter` and `revit_file`.
- Run converter: execute RvtExporter.exe (`"<converter>" "<revit_file>"`); it writes `*_rvt.xlsx` next to the RVT.
- Check success: branch on the converter's error output.
- Read Excel: compute `<revit_file>` → `*_rvt.xlsx` and read it from disk.
- Parse: convert the workbook into structured items (rows → items).

**Prerequisites**
- Windows host (local executable and filesystem paths).
- DDC Revit toolkit installed: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`.
- A local `.rvt` you can read; the converter will write `*_rvt.xlsx` next to it.

**How to use**
1. Import this JSON into n8n.
2. Open "Setup – Define file paths" and set:
   - `path_to_revit_converter`: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`
   - `revit_file`: e.g. `C:\Sample_Projects\your_project.rvt`
3. Run the Manual Trigger. On success, the flow reads `*_rvt.xlsx` and emits parsed items.

**Outputs**
- On disk: `<YourProject>_rvt.xlsx` (created by the converter).
- In n8n: parsed rows as items, ready for Transform/Load phases.

**Notes & tips**
- If your converter writes the Excel file to a different folder or file name, update the "Success – Create Excel filename" node to point to the actual path.
- Ensure write permissions in the project folder and avoid non-ASCII characters in paths where possible.
- This template is purposefully minimal (Extract-only). Chain it with your own Transform/Load steps.
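The filename derivation in the "Success – Create Excel filename" step can be pictured as a small Code-node-style snippet. This is a sketch, assuming only the convention stated above (`<project>.rvt` → `<project>_rvt.xlsx`, written next to the RVT); the function name is illustrative:

```javascript
// Derive the expected Excel output path from the Revit file path,
// following the converter's convention: <project>.rvt -> <project>_rvt.xlsx
function deriveXlsxPath(revitFile) {
  if (!/\.rvt$/i.test(revitFile)) {
    throw new Error(`Not a .rvt path: ${revitFile}`);
  }
  return revitFile.replace(/\.rvt$/i, '_rvt.xlsx');
}

// Example: C:\Sample_Projects\your_project.rvt -> C:\Sample_Projects\your_project_rvt.xlsx
const xlsxPath = deriveXlsxPath('C:\\Sample_Projects\\your_project.rvt');
```

In the workflow, the resulting path feeds the Read-from-disk step; if your converter writes elsewhere, this is the one place to adjust.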
Categories: Data Extraction · Files & Storage · ETL · CAD/BIM
Tags: cad-bim, revit, ifc, dwg, extract, xlsx, etl
Author: DataDrivenConstruction.io (info@datadrivenconstruction.io)

**Consulting and Training**
We work with leading construction, engineering, consulting agencies and technology firms around the world to help them implement open data principles, automate CAD/BIM processing and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Docs & Issues: Full Readme on GitHub
by Adrian Kendall
**Summary**
A minimal template showing how to integrate n8n with Home Assistant for event-based triggering, using the AppDaemon add-on to call a Webhook node.

**Problem Solved**
There is no Home Assistant trigger node in n8n. You can poll the Home Assistant API on a schedule, but a more efficient workaround is to use the AppDaemon add-on to create a listener app within Home Assistant. When the listener detects the event it is subscribed to, the AppDaemon app calls an n8n webhook, passing the event data to the workflow. AppDaemon runs Python code. The template contains a sticky note with a code example (repeated below) for an AppDaemon app; the code contains annotated configuration instructions.

**Steps: Install the AppDaemon Add-on**
A. Open Home Assistant. In your Home Assistant UI, go to Settings → Add-ons (or Supervisor → Add-on Store, depending on your version).
B. Search and install. In the Add-on Store, search for "AppDaemon 4". Click on the result and then the Install button to start the installation.
C. Start the add-on. Once installed, open the AppDaemon 4 add-on page and click Start to launch AppDaemon. (Optional but recommended) Enable the Start on boot and Watchdog options so AppDaemon starts automatically and restarts if it crashes.
D. Verify the installation. Check the logs in the AppDaemon add-on page to ensure it is running without issues. There is no need to set access tokens or the Home Assistant URL manually; the add-on is pre-configured to connect to your Home Assistant.
E. Configure AppDaemon. After installation, a directory named `appdaemon` will appear inside your Home Assistant config directory (`/config/appdaemon/`). Inside, you'll find a file called `appdaemon.yaml`. For most uses the default configuration is fine, but you can customize it as needed.

**Create the AppDaemon App**
F. Prepare your apps directory. Inside `/config/appdaemon/`, locate or create an `apps` folder.
Path: `/config/appdaemon/apps/`
G. Create a Python app. Inside the `apps` folder, create a new Python file (example: `n8n_WebHook.py`). Open the file in an editor and paste the example code into it:

```python
import appdaemon.plugins.hass.hassapi as hass
import requests


class EventTon8nWebhook(hass.Hass):
    """AppDaemon app that listens for Home Assistant events and forwards them to an n8n webhook."""

    def initialize(self):
        """Initialize the event listener and configure webhook settings."""
        # EDIT: Replace 'your_event_name' with the actual event you want to listen for.
        # Common HA events: 'state_changed', 'call_service', 'automation_triggered', etc.
        self.target_event = self.args.get('target_event', 'your_event_name')
        # EDIT: Set your n8n webhook URL in apps.yaml or replace the default here.
        self.webhook_url = self.args.get('webhook_url', 'n8n_webhook_url')
        # EDIT: Optional - set the timeout for webhook requests (seconds).
        self.webhook_timeout = self.args.get('webhook_timeout', 10)
        # EDIT: Optional - enable/disable SSL verification.
        self.verify_ssl = self.args.get('verify_ssl', True)

        # Set up the event listener.
        self.listen_event(self.event_handler, self.target_event)
        self.log(f"Event listener initialized for event: {self.target_event}")
        self.log(f"Webhook URL configured: {self.webhook_url}")

    def event_handler(self, event_name, data, kwargs):
        """Handle the triggered event and forward it to the n8n webhook.

        Args:
            event_name (str): Name of the triggered event
            data (dict): Event data from Home Assistant
            kwargs (dict): Additional keyword arguments from the event
        """
        try:
            # Prepare the payload for the n8n webhook.
            payload = {
                'event_name': event_name,
                'event_data': data,
                'event_kwargs': kwargs,
                'timestamp': self.datetime().isoformat(),
                'source': 'home_assistant_appdaemon',
            }
            self.log(f"Received event '{event_name}' - forwarding to n8n")
            self.log(f"Event data: {data}")
            # Send to the n8n webhook.
            self.send_to_n8n(payload)
        except Exception as e:
            self.log(f"Error handling event {event_name}: {str(e)}", level="ERROR")

    def send_to_n8n(self, payload):
        """Send the payload to the n8n webhook.

        Args:
            payload (dict): Data to send to n8n
        """
        try:
            headers = {
                'Content-Type': 'application/json',
                # EDIT: The header-auth name and value below must match the
                # credential configured on the n8n Webhook node.
                'CredName': 'credValue',
            }
            response = requests.post(
                self.webhook_url,
                json=payload,
                headers=headers,
                timeout=self.webhook_timeout,
                verify=self.verify_ssl,
            )
            response.raise_for_status()
            self.log(f"Successfully sent event to n8n webhook. Status: {response.status_code}")
            # EDIT: Optional - log the response from n8n for debugging.
            if response.text:
                self.log(f"n8n response: {response.text}")
        except requests.exceptions.Timeout:
            self.log(f"Timeout sending to n8n webhook after {self.webhook_timeout}s", level="ERROR")
        except requests.exceptions.RequestException as e:
            self.log(f"Error sending to n8n webhook: {str(e)}", level="ERROR")
        except Exception as e:
            self.log(f"Unexpected error sending to n8n: {str(e)}", level="ERROR")
```

H. Register your app. In the same `apps` folder, locate or create a file named `apps.yaml` and add an entry to register your app:

```yaml
EventTon8nWebhook:
  module: n8n_WebHook
  class: EventTon8nWebhook
```

`module` must match your Python filename (without the `.py` extension); `class` must match the class name inside your Python file, i.e. `EventTon8nWebhook`.
I. Reload AppDaemon. In Home Assistant, return to the AppDaemon 4 add-on page and click Restart. Watch the logs; you should see a log entry from your app confirming it has initialised and is running.

**Set Up n8n**
J. In your workflow, create a Webhook trigger node as the first node, or use the template as your starting point. Ensure the webhook URL (production or test) is copied correctly into the Python code. If you are using authentication, you need to create a credential for the Webhook node. The example uses header auth; naturally, the header name and value must match what is in the code.
For security, it is better to put the credentials in a `secrets.yaml` file rather than hard-coding them as in this demo. Execute the workflow, or activate it if you are using the production webhook. Either wait for an event in Home Assistant, or trigger one manually from the Developer Tools page, Events tab.
K. Finally, develop your workflow to process the received event for your use case. See the Home Assistant docs on events for details of the event types you can subscribe to: Home Assistant Events
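As a sketch of the secrets approach (assuming AppDaemon's standard `secrets` support; the secret name and URL below are placeholders): point `appdaemon.yaml` at your secrets file, keep the value there, and reference it from `apps.yaml` with `!secret`:

```yaml
# appdaemon.yaml (excerpt): tell AppDaemon where the secrets file lives
secrets: /config/secrets.yaml

# /config/secrets.yaml: the actual value (keep this file out of version control)
n8n_webhook_url: "https://your-n8n-host/webhook/your-webhook-id"

# apps.yaml: pass the secret into the app via args
EventTon8nWebhook:
  module: n8n_WebHook
  class: EventTon8nWebhook
  webhook_url: !secret n8n_webhook_url
```

The header-auth value used in `send_to_n8n` could be passed in via `self.args` the same way instead of being hard-coded.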
by Paul
**Danelfin MCP Server**

**About Danelfin**
Danelfin is an AI-powered stock analytics platform that helps investors find the best stocks and optimize their portfolios with explainable AI insights for smarter, data-driven investment decisions. The platform provides API solutions for developers, analysts, and fintech firms wanting to integrate predictive stock data into their own applications.

**Key Features**
- **AI Stock Picker**: Advanced algorithms to identify high-potential stocks
- **Portfolio Optimization**: Data-driven portfolio management and optimization tools
- **Explainable AI**: Transparent AI insights that show the reasoning behind recommendations
- **Predictive Analytics**: AI Scores and forecasting capabilities
- **Multi-Asset Coverage**: Analysis of stocks and ETFs across various markets

**MCP Server Endpoints**
This Danelfin MCP (Model Context Protocol) server exposes three main endpoints that provide comprehensive market analysis capabilities:

🏆 **/ranking endpoint** (GET: https://apirest.dan...)
The ranking endpoint provides AI-powered stock rankings and ratings. This tool allows users to:
- Access ranked lists of stocks based on AI-generated scores
- Retrieve performance rankings across different timeframes
- Get comparative analysis of stock performance
- Access predictive AI scores for investment decision-making

🏭 **/sectors endpoint** (GET: https://apirest.dan...)
The sectors endpoint provides analysis for US market stocks, including AI ratings, prices, stock charts, and technical, fundamental, and sentiment analysis organized by market sector. Features include:
- Sector-wise market analysis and performance metrics
- AI ratings and price data for sector classifications
- Technical and fundamental analysis by sector
- Sentiment analysis across different market sectors
- Comparative sector performance insights

🏢 **/industries endpoint** (GET: https://apirest.dan...)
The industries endpoint offers granular industry-level analysis within market sectors:
- Industry-specific stock analysis and ratings
- Performance metrics at the industry classification level
- AI-powered insights for specific industry verticals
- Comparative industry analysis within sectors
- Industry trend analysis and forecasting

**Integration Benefits**
This MCP server enables seamless integration of Danelfin's AI-powered stock analysis capabilities into applications and workflows. Users can:
- **Access real-time data**: Get up-to-date stock rankings, sector performance, and industry analysis
- **Leverage AI insights**: Utilize explainable AI recommendations for investment decisions
- **Streamline research**: Automate stock research workflows with comprehensive market data
- **Enhance decision-making**: Make data-driven investment choices with predictive analytics
- **Scale analysis**: Process large-scale market analysis across multiple dimensions

**Use Cases**
- **Investment Research**: Comprehensive stock analysis and ranking for portfolio managers
- **Algorithmic Trading**: Integration of AI scores into trading algorithms and strategies
- **Financial Advisory**: Enhanced client advisory services with AI-powered insights
- **Risk Management**: Sector and industry analysis for portfolio risk assessment
- **Market Analysis**: Real-time market intelligence for institutional investors

The Danelfin MCP server bridges the gap between advanced AI stock analytics and practical investment applications, providing developers and financial professionals with powerful tools for data-driven market analysis.
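A minimal sketch of how a client might address the three endpoints from code. The base URL is truncated in this listing, so it is left as a parameter; the `danelfinUrl` helper and the query parameter shown are illustrative, and authentication depends on your Danelfin API plan:

```javascript
// Build a GET URL for one of the three documented endpoints:
// /ranking, /sectors, /industries.
function danelfinUrl(baseUrl, endpoint, params = {}) {
  const allowed = ['ranking', 'sectors', 'industries'];
  if (!allowed.includes(endpoint)) {
    throw new Error(`Unknown endpoint: ${endpoint}`);
  }
  const query = new URLSearchParams(params).toString();
  return `${baseUrl.replace(/\/+$/, '')}/${endpoint}${query ? `?${query}` : ''}`;
}

// Example with a placeholder base URL (set this to the real API host):
const url = danelfinUrl('https://apirest.example', 'ranking', { date: '2024-01-02' });
// The URL can then be fetched with your HTTP client of choice, plus auth headers.
```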
by Harshil Agrawal
You can use the Function node to create multiple JSON items from an array.
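For instance, if an incoming item holds an array under `json.data` (the field name is illustrative), a Function node can map each array element to its own item. The sample `items` array below stands in for the input a real Function node receives:

```javascript
// Incoming n8n items: a single item whose json.data holds an array.
const items = [{ json: { data: [{ id: 1 }, { id: 2 }, { id: 3 }] } }];

// Wrap each array element in n8n's { json: ... } item shape,
// turning one input item into three output items.
const newItems = items[0].json.data.map((entry) => ({ json: entry }));

// Inside a Function node you would simply end with:
// return newItems;
```

Downstream nodes then run once per element instead of once per original item.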
by Artem Boiko
Turn a `.rvt` project into open, analysis-ready data (XLSX + optional DAE/PDF) using RvtExporter.exe from the DDC Revit toolkit. This n8n template provides a Form UI to set paths and flags, then runs the exporter with safe defaults. It converts CAD/BIM files with customizable export modes (basic: 309 categories, standard: 724 categories, complete: all 1209 categories) and optional outputs such as bounding boxes, Revit schedules, or PDF drawings.

**What it does**
- Launches RvtExporter.exe with your chosen Revit file and export mode
- Optional feature flags: bbox, schedule, sheets2pdf, -no-xlsx, -no-collada
- (Optional) Output folder; auto-generates `_rvt.xlsx` and `_rvt.dae` names

**Prerequisites**
- Windows host (the exporter is a Windows executable)
- DDC Revit toolkit installed (path like: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`)
- A local `.rvt` file you can read

**How to use**
1. Import this JSON workflow into n8n.
2. Open the Form UI trigger (ensure pop-ups are allowed).
3. Fill in:
   - Path to Revit Converter: `C:\DDC_Converter_Revit\datadrivenlibs\RvtExporter.exe`
   - Revit File Path: e.g. `C:\Sample_Projects\your_project.rvt`
   - Export Mode: basic | standard | complete | custom
   - Conversion Options: multi-select flags (optional)
   - (optional) output_folder: e.g. `C:\DDC_Output`
4. Click Submit; the workflow runs Run Revit Exporter.

**Consulting and Training**
We work with leading construction, engineering, consulting agencies and technology firms around the world to help them implement open data principles, automate CAD/BIM processing and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Author: DataDrivenConstruction.io (info@datadrivenconstruction.io)
Docs & Issues: Full Readme on GitHub
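The Run Revit Exporter step can be pictured as assembling an argument list from the form fields. A sketch: the mode names and flag spellings come from the list above, but how RvtExporter.exe expects them combined on the command line is an assumption, and `buildExporterArgs` is an illustrative helper:

```javascript
// Assemble the exporter invocation from the Form UI fields (sketch).
function buildExporterArgs(converterPath, revitFile, mode, flags = [], outputFolder = null) {
  const modes = ['basic', 'standard', 'complete', 'custom'];
  if (!modes.includes(mode)) {
    throw new Error(`Unknown export mode: ${mode}`);
  }
  // Converter first, then the file, the mode, and any optional flags.
  const args = [converterPath, revitFile, mode, ...flags];
  if (outputFolder) args.push(outputFolder);
  return args;
}

const args = buildExporterArgs(
  'C:\\DDC_Converter_Revit\\datadrivenlibs\\RvtExporter.exe',
  'C:\\Sample_Projects\\your_project.rvt',
  'standard',
  ['bbox', 'sheets2pdf']
);
// A quoted join of `args` would feed an n8n Execute Command node.
```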