by Yaron Been

This workflow automatically identifies and qualifies Instagram influencers based on your marketing criteria. It saves you hours of manual research by automatically filtering profiles that meet specific engagement, follower, and verification requirements, then storing qualified leads directly in Google Sheets.

## Overview

This workflow uses Bright Data to scrape Instagram profile data, then applies smart filters to identify high-quality influencers or brand accounts. Only profiles that meet all your criteria (verified status, follower count, engagement rate, and account type) are saved to your lead database, keeping your list clean and actionable.

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping Instagram profile data without restrictions
- **Google Sheets**: For storing qualified influencer leads and profile data

## How to Install

1. **Import the Workflow**: Download the .json file and import it into your n8n instance
2. **Configure Bright Data**: Add your Bright Data credentials to the Instagram scraping node
3. **Configure Google Sheets**: Connect your Google Sheets account and copy the template spreadsheet
4. **Customize Filters**: Adjust the criteria (followers, engagement rate, etc.) to match your needs
5. **Run**: Simply paste any Instagram profile URL and execute the workflow

## Use Cases

- **Influencer Marketing**: Build a database of qualified influencers for campaigns
- **Brand Partnerships**: Identify potential brand collaboration opportunities
- **Competitor Analysis**: Track competitor accounts and their engagement metrics
- **Lead Generation**: Find business accounts in your niche for B2B outreach
- **Market Research**: Analyze account types and engagement patterns in your industry

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #influencermarketing #instagram #brightdata #webscraping #leadgeneration #n8nworkflow #workflow #nocode #instagrammarketing #influenceroutreach #socialmediastrategy #brandpartnerships #marketingautomation #instagramanalytics #influencerdatabase #contentcreators #digitalmarketing #socialmediatools #influencerresearch #instagramleads #marketingtools #influenceridentification #instagramscraping #leadqualification #influencerengagement #brandcollaboration #instagramautomation #marketingdatabase
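The qualification filter can be sketched as an n8n Function-node snippet like the one below. The profile field names and the threshold values are assumptions for illustration; map them to the actual Bright Data output and your own campaign criteria.

```javascript
// Hypothetical profile fields and example thresholds - adjust both to
// match the Bright Data scraper output and your marketing criteria.
const criteria = {
  requireVerified: true,
  minFollowers: 10000,
  minEngagementRate: 0.02, // 2%
  accountTypes: ['business', 'creator'],
};

function qualifies(profile, c) {
  // Engagement rate = average interactions per post divided by followers.
  const engagementRate =
    profile.followers > 0
      ? (profile.avgLikes + profile.avgComments) / profile.followers
      : 0;
  return (
    (!c.requireVerified || profile.isVerified) &&
    profile.followers >= c.minFollowers &&
    engagementRate >= c.minEngagementRate &&
    c.accountTypes.includes(profile.accountType)
  );
}
```

A profile failing any single check is dropped, which is what keeps only fully qualified leads in the Google Sheet.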
by Vigh Sandor

# PDF Digital Signature API with PAdES Compliance

Sign PDF documents with legally-compliant digital signatures using X.509 certificates. Supports multiple PAdES signature levels (B, T, LT, LTA) with optional visible stamps.

## What this workflow does

This workflow creates a professional PDF signing service that:

- Accepts PDF files via a webhook API
- Signs documents using X.509 certificates (PFX format)
- Returns cryptographically signed PDFs compliant with EU eIDAS standards
- Supports both visible and invisible signatures
- Provides multi-language landing pages for easy testing

Perfect for contracts, invoices, legal documents, and any PDF requiring digital authentication.

## Use Cases

- **Legal Document Signing**: Sign contracts and agreements with legally-binding digital signatures
- **Invoice Authentication**: Add cryptographic signatures to invoices for validation
- **Regulatory Compliance**: Meet EU eIDAS and other digital signature requirements
- **Document Archival**: Create long-term valid signatures for permanent storage
- **Automated Signing Pipeline**: Integrate PDF signing into your existing workflows

## How it Works

### Workflow Process

1. **File Upload**: Receives the PDF, certificate (PFX), and password via webhook
2. **Dependency Check**: Automatically installs Java and the signing tool if needed
3. **Certificate Processing**: Extracts the certificate and private key from the PFX
4. **Signature Selection**: Routes to the appropriate signing method based on level
5. **PDF Signing**: Signs the document using the open-pdf-sign tool
6. **Response**: Returns the signed PDF and cleans up temporary files

### Signature Levels Explained

Choose the signature level based on your needs:

- **BASELINE-B** (Basic, 2-3 seconds): Fastest option; short-term validity (months). Best for testing and internal documents.
- **BASELINE-T** (Timestamp, 3-5 seconds), **recommended**: Includes a trusted timestamp; medium-term validity (years). Best for contracts, invoices, and business documents.
- **BASELINE-LT** (Long-Term, 5-10 seconds): Includes revocation information; long-term validity (decades). Best for banking, healthcare, and government.
- **BASELINE-LTA** (Archival, 8-12 seconds): Maximum compliance level; permanent validity. Best for critical legal documents.

### Visible vs Invisible Signatures

**Invisible** (default):
- No visual mark on the document
- Preserves the original appearance
- Signature lives in the document metadata

**Visible**:
- Shows a signature stamp on the PDF
- Includes a logo and signature details
- More reassuring for recipients
- Add isVisible=true and logoFile to the request

## Customization

### Change Signature Level

Modify the signLevel parameter in your request:

- B - Basic
- T - Timestamp (default)
- LT - Long-term
- LTA - Archival

### Customize Visible Signature

Upload a logo and add customization parameters to the signing command nodes:

- `--hint "Digitally Signed"` - custom text
- `--page 2` - sign on page 2
- `--label-signee "Signed by"` - custom label
- `--label-timestamp "Date"` - custom timestamp label
- `--no-hint` - hide the hint row
- `--signature-reason "Contract Approval"` - reason text

### Adjust File Paths

Modify these nodes to change temporary file locations:

- Write Files : PDF - PDF storage path
- Write Files : PFX - certificate storage path
- Write Files : LOGO - logo storage path

### Add Authentication

For production use, add authentication before the webhook:

- Insert an HTTP Request node to validate the API key
- Add rate limiting
- Log signature operations

## Technical Details

### What Gets Installed

The workflow automatically installs:

- OpenJDK 11 JRE (Java runtime)
- curl (for downloading)
- open-pdf-sign v0.3.0 (signing tool)

### Certificate Processing

Uses OpenSSL to extract:

- the X.509 certificate chain (.pem)
- the private key (.pem)

All files use timestamped names to prevent conflicts.

### Security Features

- Automatic cleanup of sensitive files after each request
- No persistent storage of certificates or keys
- HTTPS recommended for production
- Supports password-protected certificates

### Standards Compliance

Implements the ETSI EN 319 142 PAdES standards:

- EU eIDAS regulation compliant
- Validates in Adobe Acrobat Reader
- Verifiable at the EU DSS demo webapp

## FAQ

**Q: Where do I get certificates?**
A: For testing, use free certificates from Codegic. For production, purchase from DigiCert, GlobalSign, or Sectigo.

**Q: What PDF sizes are supported?**
A: Up to 50MB by default. Adjust the n8n configuration for larger files.

**Q: Can I sign multiple PDFs at once?**
A: Call the API once per PDF, or modify the workflow to accept multiple files.

**Q: Will signatures work in Adobe Reader?**
A: Yes, if using certificates from trusted CAs. Self-signed certificates will show warnings.

**Q: How do I verify signed PDFs?**
A: Open the PDF in Adobe Acrobat Reader and check the signature panel, or use the EU DSS validation webapp.

**Q: Can I use this commercially?**
A: Yes, the workflow is free for personal and commercial use.

## Support

- **Documentation**: See the workflow sticky notes for detailed information
- **Tool Source**: open-pdf-sign on GitHub
- **Standards**: ETSI PAdES specifications
- **Community**: n8n Community Forum

License: Free for personal and commercial use
Dependencies: OpenJDK 11, OpenSSL, curl, open-pdf-sign v0.3.0 (Apache 2.0)
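A client call to the signing webhook can be sketched as below. The signLevel, isVisible, and logoFile parameters come from the workflow description above; the endpoint path and the remaining form-field names (pdfFile, pfxFile, password) are assumptions to adapt to your own webhook configuration.

```javascript
// Builds a request descriptor for the signing webhook. The URL and the
// pdfFile/pfxFile/password field names are hypothetical; signLevel,
// isVisible and logoFile are the documented workflow parameters.
function buildSignRequest({ pdf, pfx, password, signLevel = 'T', isVisible = false, logo = null }) {
  const fields = { pdfFile: pdf, pfxFile: pfx, password, signLevel };
  if (isVisible) {
    fields.isVisible = 'true';
    if (logo) fields.logoFile = logo; // logo only matters for visible stamps
  }
  return {
    method: 'POST',
    url: 'https://your-n8n/webhook/sign-pdf', // hypothetical webhook path
    fields, // sent as multipart/form-data
  };
}
```

Defaulting signLevel to T matches the recommendation above: timestamped signatures cover most contract and invoice scenarios without the extra latency of LT/LTA.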
by WeblineIndia

# 🌡 IoT Sensor Data Cleaner + InfluxDB Logger (n8n | Webhook | Function | InfluxDB)

This workflow accepts raw sensor data from IoT devices via webhook, applies basic cleaning and transformation logic, and writes the cleaned data to an InfluxDB instance for time-series tracking. Perfect for renewable energy sites, smart farms, and environmental monitoring setups using dashboards like Grafana or Chronograf.

## ⚡ Quick Implementation Steps

1. Import the workflow JSON into your n8n instance.
2. Edit the Set Config node to include your InfluxDB credentials and measurement name.
3. Use the webhook URL (/webhook/sensor-data) in your IoT device or form to send sensor data.
4. Start monitoring your data directly in InfluxDB!

## 🎯 Who's It For

- IoT developers and integrators.
- Renewable energy and environmental monitoring teams.
- Data engineers working with time-series data.
- Smart agriculture and utility automation platforms.

## 🛠 Requirements

| Tool | Purpose |
|------|---------|
| n8n Instance | For automation |
| InfluxDB (v1 or v2) | To store time-series sensor data |
| IoT Device or Platform | To POST sensor data |
| Function Node | To filter and transform data |

## 🧠 What It Does

- Accepts JSON-formatted sensor data via HTTP POST.
- Validates the data (removes invalid or noisy readings).
- Applies transformation (rounding, timestamp formatting).
- Pushes the cleaned data to InfluxDB for real-time visualization.

## 🧩 Workflow Components

- **Webhook Node**: Exposes an HTTP endpoint to receive sensor data.
- **Function Node**: Filters out-of-range values, formats the timestamp, rounds data.
- **Set Node**: Stores configurable values like the InfluxDB host, user/pass, and measurement name.
- **InfluxDB Node**: Writes valid records into the specified database bucket.

## 🔧 How To Set Up - Step-by-Step

1. **Import Workflow**: Upload the provided .json file into your n8n workspace.
2. **Edit Configuration Node**: Update the InfluxDB connection info in the Set Config node:
   - influxDbHost, influxDbDatabase, influxDbUsername, influxDbPassword
   - measurement: what you want to name the data set (e.g., sensor_readings)
3. **Send Data to Webhook**: Webhook URL: https://your-n8n/webhook/sensor-data

   Example payload:

   ```json
   {
     "temperature": 78.3,
     "humidity": 44.2,
     "voltage": 395.7,
     "timestamp": "2024-06-01T12:00:00Z"
   }
   ```

4. **View in InfluxDB**: Log in to your InfluxDB/Grafana dashboard and query the new measurement.

## ✨ How To Customize

| Customization | Method |
|---------------|--------|
| Add more fields (e.g., wind_speed) | Update the Function & InfluxDB nodes |
| Add field/unit conversion | Use math in the Function node |
| Send email alerts on anomalies | Add an IF → Email branch after the Function node |
| Store in parallel in Google Sheets | Add a Google Sheets node for hybrid logging |

## ➕ Add-ons (Advanced)

| Add-on | Description |
|--------|-------------|
| 📊 Grafana Integration | Real-time charts using InfluxDB |
| 📧 Email on Faulty Data | Notify if voltage < 0 or temperature too high |
| 🧠 AI Filtering | Add OpenAI or TensorFlow for anomaly detection |
| 🗃 Dual Logging | Save data to both InfluxDB and BigQuery/Sheets |

## 📈 Use Case Examples

- A remote solar inverter sends temperature and voltage via webhook.
- An environmental sensor hub logs humidity and air-quality data every minute.
- A smart greenhouse logs climate-control sensor metrics.
- Edge IoT devices periodically report health and diagnostics remotely.

## 🧯 Troubleshooting Guide

| Issue | Cause | Solution |
|-------|-------|----------|
| No data logged in InfluxDB | Invalid credentials or DB name | Recheck the InfluxDB values in the config |
| Webhook not triggered | Wrong method or endpoint | Confirm it is a POST to /webhook/sensor-data |
| Data gets filtered | Readings outside the valid range | Check the logic in the Function node |
| Data not appearing in dashboard | Influx write format error | Inspect the InfluxDB log and field names |

## 📞 Need Assistance?

Need help integrating this workflow into your energy monitoring system, or need InfluxDB dashboards built for you?

👉 Contact WeblineIndia | Experts in workflow automation and time-series analytics.
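The Function node's validate/round/format step described above can be sketched as follows. The valid ranges below are illustrative assumptions; tune them to your sensors.

```javascript
// Cleans one raw sensor reading: drops out-of-range or non-numeric
// values, rounds to one decimal, and normalises the timestamp.
function cleanReading(raw) {
  // Assumed plausible ranges - adjust per deployment.
  const ranges = {
    temperature: [-40, 125], // °C
    humidity: [0, 100],      // %
    voltage: [0, 1000],      // V
  };
  const cleaned = {};
  for (const [field, [min, max]] of Object.entries(ranges)) {
    const value = Number(raw[field]);
    if (!Number.isFinite(value) || value < min || value > max) {
      return null; // reject the whole reading as invalid/noisy
    }
    cleaned[field] = Math.round(value * 10) / 10; // round to 1 decimal
  }
  // Normalise the timestamp to ISO 8601.
  cleaned.timestamp = new Date(raw.timestamp || Date.now()).toISOString();
  return cleaned;
}
```

Returning null for a bad reading lets the downstream InfluxDB node skip it entirely, which is the "data gets filtered" behaviour mentioned in the troubleshooting table.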
by WeblineIndia

# GitHub PR Deep-Link & Routing Validator (ExecuteCommand + GitHub Comment)

## 🚀 Quick-Start TL;DR

1. Import the workflow JSON into n8n (Cloud or self-hosted).
2. Create a GitHub Personal Access Token with repo:public_repo (or repo) scope and add it to n8n credentials.
3. Open the "CONFIG - Variables" node and tweak:
   - manifestPath - path to your deep-link manifest (AndroidManifest.xml, Info.plist, etc.).
   - scriptPath - helper script that boots the emulator & checks each route.
4. Enable the workflow. Every push to a PR branch triggers validation and posts a Markdown pass/fail matrix back to the PR.

## What It Does

This workflow delivers an automated, CI-friendly smoke test of every deep link defined in your mobile app. On each push to an open GitHub PR, it:

1. Clones the PR branch.
2. Runs a lightweight validation script (provided) that spins up an emulator/simulator, attempts to open each declared URI, and records OK/FAIL.
3. Generates a Markdown table summarizing the results.
4. Comments that table on the PR, letting reviewers spot broken schemes at a glance.

## Who's It For

- Mobile teams maintaining Android or iOS deep-link manifests.
- CI engineers who need a simple, language-agnostic check they can publish to each PR.
- OSS maintainers wanting a template-library-ready n8n recipe.

## Requirements

| Requirement | Notes |
|-------------|-------|
| n8n Cloud / CE | Works everywhere; self-hosted users need Docker with Android / Xcode if validating on-runner. |
| GitHub Personal Access Token | Used for posting PR comments. |
| Emulator-capable runner | Local dev hardware or a CI image that can run adb / xcrun simctl. |

## How It Works

1. **GitHub Trigger** fires on pull_request → synchronize (i.e., each push to the PR branch).
2. **Set (CONFIG - Variables)** centralises the repo URL, manifest path, script path, timeout, and comment mode.
3. **ExecuteCommand** clones the repo and calls the validation script.
4. **Function** converts the CLI's CSV output into a Markdown table.
5. **GitHub node** posts (or appends) the results as a comment on the PR.
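The CSV-to-Markdown conversion performed by the Function step can be sketched like this (the script emits one `uri,status` pair per line):

```javascript
// Turns the validation script's "uri,status" CSV lines into the
// Markdown pass/fail matrix that gets commented on the PR.
function csvToMarkdown(csv) {
  const rows = csv
    .trim()
    .split('\n')
    .map((line) => line.split(','));
  return [
    '| Deep link | Result |',
    '| --- | --- |',
    ...rows.map(([uri, status]) => `| ${uri} | ${status === 'OK' ? '✅ OK' : '❌ FAIL'} |`),
  ].join('\n');
}
```

The emoji markers make failures easy to spot when skimming the PR conversation.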
## How To Set Up

1. **Auth**: In n8n, add a GitHub credential with your PAT, named "GitHub Personal Access Token".
2. **Import**: Settings → Import workflow, and paste the JSON above.
3. **Edit Config**: Double-click CONFIG - Variables and change any default values.
4. **Validation Script**: Commit scripts/validate_deeplinks.sh into your repo (see the sample below).
5. **Enable** the workflow. Push to any PR branch and watch the comment appear.

### Sample validate_deeplinks.sh

```bash
#!/usr/bin/env bash
set -e

while getopts "m:" opt; do
  case $opt in
    m) MANIFEST="$OPTARG" ;;
  esac
done

echo "⇨ Parsing deep links from $MANIFEST"

# rudimentary parser - replace with something smarter for XML/Plist
grep -oE 'http[s]?://[^" ]+' "$MANIFEST" | while read -r uri; do
  if adb shell am start -W -a android.intent.action.VIEW -d "$uri" >/dev/null 2>&1; then
    echo "$uri,OK"
  else
    echo "$uri,FAIL"
  fi
done
```

## How To Customise

- **Multiple manifests**: duplicate the Execute-Command step or extend the script to accept a list.
- **Replace-latest comment**: switch commentMode to replace-latest and update the GitHub node to search for the newest bot comment before editing.
- **Status checks instead of comments**: call the GitHub → "Create Commit Status" endpoint.

## Add-Ons

| Add-On | Idea |
|--------|------|
| Multi-platform sweep | Loop over Android + iOS manifests and aggregate results. |
| Slack/Teams alert | Push failures into your chat of choice via an Incoming-Webhook node. |
| Parallel device grid | Trigger multiple emulators (API 19 → 34) to catch OS-specific issues. |

## Use Case Examples

- Ensure new features don't break existing URI schemes before merge.
- Catch mistyped hosts/paths introduced by junior devs.
- Baseline check on dependency bumps (e.g., upgrading Navigation libraries).
- Validate white-label builds that override path segments.
- An automated QA gate that blocks merge if any link fails.

(…and many more!)

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|---------------|----------|
| Workflow hangs at "Execute - Validate" | Emulator image isn't installed | Pre-install the SDK & start the emulator in a startup script |
| PR comment missing | Token lacks repo scope | Regenerate the PAT with proper scopes |
| All links marked FAIL | Manifest path incorrect | Update manifestPath in CONFIG |
| Command node hits timeout | Huge manifest / slow CI | Increase timeoutSecs in CONFIG |

## Need a Hand? 🤝

Stuck, or want to extend this with multi-platform coverage? WeblineIndia's automation experts can help. Drop us a note to fine-tune or scale out your n8n workflows - fast.
by Jenny

# Evaluate Hybrid Search on Legal Dataset

*This is the second part of "Hybrid Search with Qdrant & n8n, Legal AI". The first part, "Indexing", covers preparing and uploading the dataset to Qdrant.*

## Overview

This pipeline demonstrates how to perform Hybrid Search on a Qdrant collection using questions and text chunks (containing answers) from the LegalQAEval dataset (isaacus). On a small subset of questions, it shows:

- How to set up hybrid retrieval in Qdrant with:
  - BM25-based keyword retrieval;
  - mxbai-embed-large-v1 semantic retrieval;
  - Reciprocal Rank Fusion (RRF), a simple zero-shot fusion of the two searches;
- How to run a basic evaluation:
  - Calculate hits@1 — the percentage of evaluation questions where the top-1 retrieved text chunk contains the correct answer.

After running this pipeline, you will have a quality estimate of a simple hybrid retrieval setup. From there, you can reuse Qdrant's Query Points node to build a legal RAG chatbot.

## Embedding Inference

By default, this pipeline uses Qdrant Cloud Inference to convert questions to embeddings. You can also use an external embedding provider (e.g. OpenAI). In that case, minimally update the pipeline, similar to the adjustments shown in Part 1: Indexing.

## Prerequisites

- The completed Part 1 pipeline, "Hybrid Search with Qdrant & n8n, Legal AI: Indexing", and the collection created in it;
- All the requirements of the Part 1 pipeline.

## Hybrid Search

The example here is a basic hybrid query. You can extend/enhance it with:

- Reranking strategies;
- Different fusion techniques;
- Score boosting based on metadata;
- ...

More details: Hybrid Queries in Qdrant.

P.S. To ask Qdrant-related retrieval questions, join the Qdrant Discord. Star the Qdrant n8n community node repo <3
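The hits@1 metric described above reduces to one containment check per question. A minimal sketch, assuming each evaluation item pairs the gold answer with the top-1 retrieved chunk:

```javascript
// hits@1: the fraction of questions whose top-1 retrieved chunk
// contains the gold answer span.
function hitsAtOne(results) {
  // results: [{ answer: '...', topChunk: '...' }, ...]
  if (results.length === 0) return 0;
  const hits = results.filter((r) => r.topChunk.includes(r.answer)).length;
  return hits / results.length;
}
```

Exact substring containment is the simplest criterion; a production evaluation might normalise whitespace and casing before comparing.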
by PDF Vector

*This workflow contains community nodes that are only compatible with the self-hosted version of n8n.*

# Build Citation Networks from Research Papers

Automatically build and visualize citation networks by fetching papers and their references. Discover influential works and research trends in any field.

## Workflow Features

- Start with seed papers (DOIs, PubMed IDs, etc.)
- Fetch cited and citing papers recursively
- Build network graph data
- Export to visualization tools (Gephi, Cytoscape)
- Identify key papers and research clusters

## Process Flow

1. **Input**: Seed paper identifiers
2. **Fetch Papers**: Get paper details and references
3. **Expand Network**: Fetch cited papers (configurable depth)
4. **Build Graph**: Create nodes and edges
5. **Analyze**: Calculate metrics (centrality, clusters)
6. **Export**: Generate visualization-ready data

## Applications

- Research trend analysis
- Finding seminal papers in a field
- Grant proposal background research
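The Build Graph step can be sketched as below. The paper shape (id/title/references) is an assumption to adapt to whatever fields the fetch step returns; the resulting node and edge lists map directly onto what Gephi and Cytoscape importers expect.

```javascript
// Papers become nodes; each reference to another fetched paper
// becomes a directed citation edge.
function buildGraph(papers) {
  // papers: [{ id, title, references: [id, ...] }, ...] (assumed shape)
  const known = new Set(papers.map((p) => p.id));
  const nodes = papers.map((p) => ({ id: p.id, label: p.title }));
  const edges = [];
  for (const p of papers) {
    for (const ref of p.references || []) {
      // Only link to papers already fetched; unresolved references are
      // candidates for the next expansion pass.
      if (known.has(ref)) edges.push({ source: p.id, target: ref });
    }
  }
  return { nodes, edges };
}
```

Running this after each expansion pass lets the configurable-depth loop grow the network incrementally while keeping the graph internally consistent.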